A low-power and high-quality implementation of the discrete cosine transformation
NASA Astrophysics Data System (ADS)
Heyne, B.; Götze, J.
2007-06-01
In this paper a computationally efficient and high-quality preserving DCT architecture is presented. It is obtained by optimizing the Loeffler DCT based on the Cordic algorithm. The computational complexity is reduced from 11 multiply and 29 add operations (Loeffler DCT) to 38 add and 16 shift operations (which is similar to the complexity of the binDCT). The experimental results show that the proposed DCT algorithm not only reduces the computational complexity significantly, but also retains the good transformation quality of the Loeffler DCT. Therefore, the proposed Cordic based Loeffler DCT is especially suited for low-power and high-quality CODECs in battery-based systems.
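To make the shift-and-add idea concrete, the following Python sketch shows a Cordic plane rotation, the primitive that replaces explicit multipliers in rotation-based DCT factorizations such as Loeffler's. It illustrates only the generic Cordic algorithm, not the authors' fixed-point architecture; in hardware the multiplications by 2**-i become pure bit shifts, and the iteration count and final scaling are assumptions.

```python
import math

def cordic_rotate(x, y, angle, iterations=16):
    """Rotate (x, y) by `angle` radians with the circular-mode Cordic iteration.
    Each step uses only additions and scalings by 2**-i (bit shifts in fixed point)."""
    atans = [math.atan(2.0 ** -i) for i in range(iterations)]   # elementary angles
    gain = 1.0
    for i in range(iterations):
        gain *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))          # aggregate Cordic gain
    z = angle
    for i in range(iterations):
        d = 1.0 if z >= 0.0 else -1.0                           # rotate toward z = 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atans[i]
    return x * gain, y * gain

# The iteration converges to an ordinary rotation: rotate (1, 0) by 30 degrees.
print(cordic_rotate(1.0, 0.0, math.pi / 6))          # ~ (0.866, 0.500)
print(math.cos(math.pi / 6), math.sin(math.pi / 6))  # reference values
```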
Evaluating Computational Gene Ontology Annotations.
Škunca, Nives; Roberts, Richard J; Steffen, Martin
2017-01-01
Two avenues to understanding gene function are complementary and often overlapping: experimental work and computational prediction. While experimental annotation generally produces high-quality annotations, it is low throughput. Conversely, computational annotations have broad coverage, but the quality of annotations may be variable, and therefore evaluating the quality of computational annotations is a critical concern. In this chapter, we provide an overview of strategies to evaluate the quality of computational annotations. First, we discuss why evaluating quality in this setting is not trivial. We highlight the various issues that threaten to bias the evaluation of computational annotations, most of which stem from the incompleteness of biological databases. Second, we discuss solutions that address these issues, for example, targeted selection of new experimental annotations and leveraging the existing experimental annotations.
A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS
Fine-scale Computational Fluid Dynamics (CFD) simulation of pollutant concentrations within roadway and building microenvironments is feasible using high performance computing. Unlike currently used regulatory air quality models, fine-scale CFD simulations are able to account rig...
Andersen, Johan H; Fallentin, Nils; Thomsen, Jane F; Mikkelsen, Sigurd
2011-05-12
To summarize systematic reviews that 1) assessed the evidence for causal relationships between computer work and the occurrence of carpal tunnel syndrome (CTS) or upper extremity musculoskeletal disorders (UEMSDs), or 2) reported on intervention studies among computer users and/or office workers. PubMed, Embase, CINAHL and Web of Science were searched for reviews published between 1999 and 2010. Additional publications were provided by content area experts. The primary author extracted all data using a purpose-built form, while two of the authors evaluated the quality of the reviews using recommended standard criteria from AMSTAR; disagreements were resolved by discussion. The quality of evidence syntheses in the included reviews was assessed qualitatively for each outcome and for the interventions. Altogether, 1,349 review titles were identified, 47 reviews were retrieved for full-text relevance assessment, and 17 reviews were finally included as being relevant and of sufficient quality. The degrees of focus and rigorousness of these 17 reviews were highly variable. Three reviews on risk factors for carpal tunnel syndrome were rated moderate to high quality, 8 reviews on risk factors for UEMSDs ranged from low to moderate/high quality, and 6 reviews on intervention studies were of moderate to high quality. The quality of the evidence for computer use as a risk factor for CTS was insufficient, while the evidence for computer use and UEMSDs was moderate regarding pain complaints and limited for specific musculoskeletal disorders. From the reviews on intervention studies no strong evidence-based recommendations could be given. Computer use is associated with pain complaints, but it is still not very clear if this association is causal. The evidence for specific disorders or diseases is limited. No effective interventions have yet been documented.
Using management information systems to enhance health care quality assurance.
Rosser, L H; Kleiner, B H
1995-01-01
Examines how computers and quality assurance are being used to improve the quality of health care delivery. Traditional quality assurance methods have been limited in their ability to effectively manage the high volume of data generated by the health care process. Computers on the other hand are able to handle large volumes of data as well as monitor patient care activities in both the acute care and ambulatory care settings. Discusses the use of computers to collect and analyse patient data so that changes and problems can be identified. In addition, computer models for reminding physicians to order appropriate preventive health measures for their patients are presented. Concludes that the use of computers to augment quality improvement is essential if the quality of patient care and health promotion are to be improved.
[Diagnostic possibilities of digital volume tomography].
Lemkamp, Michael; Filippi, Andreas; Berndt, Dorothea; Lambrecht, J Thomas
2006-01-01
Cone beam computed tomography allows high-quality 3D imaging of cranio-facial structures. Although detail resolution is increased, x-ray exposure is reduced compared with classical computed tomography. The volume is analysed in three orthogonal planes, which can be rotated independently without quality loss. Cone beam computed tomography seems to be a less expensive alternative to classical computed tomography with lower x-ray exposure.
Andersen, Johan H.; Fallentin, Nils; Thomsen, Jane F.; Mikkelsen, Sigurd
2011-01-01
Background To summarize systematic reviews that 1) assessed the evidence for causal relationships between computer work and the occurrence of carpal tunnel syndrome (CTS) or upper extremity musculoskeletal disorders (UEMSDs), or 2) reported on intervention studies among computer users and/or office workers. Methodology/Principal Findings PubMed, Embase, CINAHL and Web of Science were searched for reviews published between 1999 and 2010. Additional publications were provided by content area experts. The primary author extracted all data using a purpose-built form, while two of the authors evaluated the quality of the reviews using recommended standard criteria from AMSTAR; disagreements were resolved by discussion. The quality of evidence syntheses in the included reviews was assessed qualitatively for each outcome and for the interventions. Altogether, 1,349 review titles were identified, 47 reviews were retrieved for full-text relevance assessment, and 17 reviews were finally included as being relevant and of sufficient quality. The degrees of focus and rigorousness of these 17 reviews were highly variable. Three reviews on risk factors for carpal tunnel syndrome were rated moderate to high quality, 8 reviews on risk factors for UEMSDs ranged from low to moderate/high quality, and 6 reviews on intervention studies were of moderate to high quality. The quality of the evidence for computer use as a risk factor for CTS was insufficient, while the evidence for computer use and UEMSDs was moderate regarding pain complaints and limited for specific musculoskeletal disorders. From the reviews on intervention studies no strong evidence-based recommendations could be given. Conclusions/Significance Computer use is associated with pain complaints, but it is still not very clear if this association is causal. The evidence for specific disorders or diseases is limited. No effective interventions have yet been documented. PMID:21589875
Computer program CDCID: an automated quality control program using CDC update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singer, G.L.; Aguilar, F.
1984-04-01
A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.
NASA Astrophysics Data System (ADS)
Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong
2016-12-01
We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method while achieving superior image quality to interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high-density objects. For comparison, prior images generated by the total-variation minimization (TVM) algorithm, as a realization of the fully iterative approach, were also utilized as intermediate images. Simulation and real experimental results show that PDART reaches an acceptable prior-image quality far more quickly. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher-quality images than a conventional in-painting method, and the results were comparable to the fully iterative MAR that uses high-quality TVM prior images.
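The image-based artifact subtraction workflow summarized above (reconstruct, segment the metal, forward-project a prior image, replace the metal trace in the sinogram, reconstruct again) can be illustrated with a toy example. The sketch below uses scikit-image's radon/iradon on an invented phantom, and a crude constant-valued prior stands in for the PDART (or TVM) prior; the phantom, thresholds, and prior values are all assumptions.

```python
import numpy as np
from skimage.transform import radon, iradon

# Toy prior-image-based metal-artifact reduction:
# 1) reconstruct, 2) segment metal, 3) forward-project a prior image,
# 4) replace the metal trace in the sinogram, 5) reconstruct the corrected image.
phantom = np.zeros((128, 128))
phantom[40:90, 40:90] = 1.0            # soft-tissue block (arbitrary values)
phantom[60:68, 60:68] = 8.0            # high-density metal insert

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sino = radon(phantom, theta=theta)

recon = iradon(sino, theta=theta)
metal_mask = recon > 4.0                                   # crude metal segmentation
metal_trace = radon(metal_mask.astype(float), theta=theta) > 0.5

# A constant soft-tissue value stands in for the PDART/TVM prior of the paper.
prior = np.where(metal_mask, 1.0, np.clip(recon, 0.0, None))
prior_sino = radon(prior, theta=theta)

corrected_sino = np.where(metal_trace, prior_sino, sino)   # in-paint the metal trace
corrected = iradon(corrected_sino, theta=theta)
corrected[metal_mask] = recon[metal_mask]                  # reinsert the metal
print(corrected.shape)
```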
Assessing product image quality for online shopping
NASA Astrophysics Data System (ADS)
Goswami, Anjan; Chung, Sung H.; Chittar, Naren; Islam, Atiq
2012-01-01
Assessing product-image quality is important in the context of online shopping. A high-quality image that conveys more information about a product can boost the buyer's confidence and attract more attention. However, the notion of image quality for product images is not the same as in other domains: the perceived quality of product images depends not only on various photographic quality features but also on high-level features such as the clarity of the foreground or the goodness of the background. In this paper, we define a notion of product-image quality based on such features. We conduct a crowd-sourced experiment to collect user judgments on thousands of eBay's images. We formulate a multi-class classification problem for modeling image quality by classifying images into good, fair, and poor quality based on the guided perceptual notions from the judges. We also conduct experiments with regression using average crowd-sourced human judgments as the target. We compute a pseudo-regression score as the expected average of the predicted classes and also compute a score from the regression technique. We design many experiments with various sampling and voting schemes on the crowd-sourced data and construct various experimental image-quality models. Most of our models have reasonable accuracies (greater than or equal to 70%) on the test data set. We observe that our computed image-quality score has a high (0.66) rank correlation with the average votes from the crowd-sourced human judgments.
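As a small illustration of the pseudo-regression score described above (the expected average of the predicted classes), the sketch below converts class-membership probabilities into a score and checks its rank correlation against average votes. The class encoding, probabilities, and vote values are hypothetical, not eBay data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical class encoding: 0 = poor, 1 = fair, 2 = good.
class_values = np.array([0.0, 1.0, 2.0])

# Predicted class-membership probabilities for three images (rows sum to 1).
probs = np.array([
    [0.10, 0.30, 0.60],
    [0.70, 0.20, 0.10],
    [0.25, 0.50, 0.25],
])

# Pseudo-regression score: the expected value of the predicted class.
pseudo_scores = probs @ class_values
print(pseudo_scores)                       # [1.5, 0.4, 1.0]

# Rank correlation against (made-up) average crowd-sourced votes.
avg_votes = np.array([1.6, 0.5, 1.1])
rho, _ = spearmanr(pseudo_scores, avg_votes)
print(rho)                                 # 1.0 for this toy example
```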
Real-time computer treatment of THz passive device images with the high image quality
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Trofimov, Vladislav V.
2012-06-01
We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not limited to one passive THz device: it can be applied to any such device and to active THz imaging systems as well. We applied the code to the processing of images captured by four passive THz imaging devices manufactured by different companies. It should be stressed that processing images produced by different companies usually requires different spatial filters. The current version of the code processes more than one image per second for a THz image with more than 5000 pixels and 24-bit number representation, and processing a single THz image produces about 20 output images simultaneously, corresponding to various spatial filters. The code allows the number of pixels in processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel image-processing algorithms. We developed original spatial filters that make it possible to see objects smaller than 2 cm in imagery produced by passive THz devices that captured objects hidden under opaque clothes. For images with high noise we developed an approach that suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of a liquid explosive, an ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate, and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for detecting hidden objects and are a very promising solution to the security problem.
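The device-specific spatial filtering described above can be mimicked with generic filters. This sketch applies a few standard spatial filters from scipy.ndimage to a synthetic noisy image and compares residual error; the filter bank and noise model are generic stand-ins, not the authors' filters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter, uniform_filter

rng = np.random.default_rng(0)

# Synthetic "THz image": a warm rectangular object on a noisy background.
clean = np.zeros((96, 96))
clean[30:60, 40:70] = 1.0
noisy = clean + rng.normal(0.0, 0.4, clean.shape)

# A bank of generic spatial filters; a deployed system would tune one per device.
filters = {
    "gaussian": lambda img: gaussian_filter(img, sigma=1.5),
    "median":   lambda img: median_filter(img, size=5),
    "uniform":  lambda img: uniform_filter(img, size=5),
}

for name, f in filters.items():
    rmse = np.sqrt(np.mean((f(noisy) - clean) ** 2))
    print(f"{name:8s} RMSE vs. clean image: {rmse:.3f}")
```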
Quality user support: Supporting quality users
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woolley, T.C.
1994-12-31
During the past decade, fundamental changes have occurred in technical computing in the oil industry. Technical computing systems have moved from local, fragmented quantity to global, integrated quality. The compute power available to the average geoscientist at his desktop has grown exponentially. Technical computing applications have increased in integration and complexity. At the same time, there has been a significant change in the work force due to the pressures of restructuring, and the increased focus on international opportunities. The profile of the user of technical computing resources has changed. Users are generally more mature, knowledgeable, and team oriented than their predecessors. In the 1990s, computer literacy is a requirement. This paper describes the steps taken by Oryx Energy Company to address the problems and opportunities created by the explosive growth in computing power and needs, coupled with the contraction of the business. A successful user support strategy will be described. Characteristics of the program include: (1) Client driven support; (2) Empowerment of highly skilled professionals to fill the support role; (3) Routine and ongoing modification to the support plan; (4) Utilization of the support assignment to create highly trained advocates on the line; (5) Integration of the support role to the reservoir management team. Results of the plan include a highly trained work force, stakeholder teams that include support personnel, and global support from a centralized support organization.
NASA Astrophysics Data System (ADS)
Shaw, Amelia R.; Smith Sawyer, Heather; LeBoeuf, Eugene J.; McDonald, Mark P.; Hadjerioua, Boualem
2017-11-01
Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. The reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
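A minimal sketch of the surrogate-in-the-loop pattern described above: a placeholder function stands in for the artificial neural network that emulates CE-QUAL-W2, and a simple genetic algorithm maximizes generation subject to a dissolved-oxygen limit through a penalty term. The surrogate, limits, and GA parameters are illustrative assumptions, not the study's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in surrogate for the trained ANN that emulates CE-QUAL-W2:
# maps 24 hourly release fractions to (power value, minimum downstream DO).
# Both relationships below are illustrative placeholders, not the paper's model.
def surrogate(release):
    power = np.sum(release)                      # more release -> more generation
    min_do = 8.0 - 3.0 * np.max(release)         # heavy releases depress DO
    return power, min_do

def fitness(release, do_limit=5.0, penalty=100.0):
    power, min_do = surrogate(release)
    return power - penalty * max(0.0, do_limit - min_do)

# Minimal genetic algorithm: truncation selection, blend crossover, Gaussian mutation.
pop = rng.uniform(0.0, 1.0, size=(40, 24))
for generation in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]                 # keep the best half
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(0, len(parents), 2)]
        child = 0.5 * (a + b) + rng.normal(0.0, 0.05, 24)   # crossover + mutation
        children.append(np.clip(child, 0.0, 1.0))
    pop = np.array(children)

best = max(pop, key=fitness)
print(fitness(best), surrogate(best))
```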
Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.; ...
2017-10-24
Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
Kleinman, L; Leidy, N K; Crawley, J; Bonomi, A; Schoenfeld, P
2001-02-01
Although most health-related quality of life questionnaires are self-administered by means of paper and pencil, new technologies for automated computer administration are becoming more readily available. Novel methods of instrument administration must be assessed for score equivalence in addition to consistency in reliability and validity. The present study compared the psychometric characteristics (score equivalence and structure, internal consistency, and reproducibility reliability and construct validity) of the Quality of Life in Reflux And Dyspepsia (QOLRAD) questionnaire when self-administered by means of paper and pencil versus touch-screen computer. The influence of age, education, and prior experience with computers on score equivalence was also examined. This crossover trial randomized 134 patients with gastroesophageal reflux disease to 1 of 2 groups: paper-and-pencil questionnaire administration followed by computer administration or computer administration followed by use of paper and pencil. To minimize learning effects and respondent fatigue, administrations were scheduled 3 days apart. A random sample of 32 patients participated in a 1-week reproducibility evaluation of the computer-administered QOLRAD. QOLRAD scores were equivalent across the 2 methods of administration regardless of subject age, education, and prior computer use. Internal consistency levels were very high (alpha = 0.93-0.99). Interscale correlations were strong and generally consistent across methods (r = 0.7-0.87). Correlations between the QOLRAD and Short Form 36 (SF-36) were high, with no significant differences by method. Test-retest reliability of the computer-administered QOLRAD was also very high (ICC = 0.93-0.96). Results of the present study suggest that the QOLRAD is reliable and valid when self-administered by means of computer touch-screen or paper and pencil.
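For readers less familiar with the statistics reported above, this sketch computes Cronbach's alpha for internal consistency and a simple cross-mode correlation of total scores as a crude equivalence check. The item scores are simulated, not QOLRAD data, and the 25-item scale length is an arbitrary assumption.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var_sum / total_var)

rng = np.random.default_rng(3)
true_score = rng.normal(4.0, 1.0, size=(120, 1))
paper = true_score + rng.normal(0.0, 0.4, size=(120, 25))      # simulated 25-item scale
computer = true_score + rng.normal(0.0, 0.4, size=(120, 25))   # same scale, other mode

print(round(cronbach_alpha(paper), 2))                          # internal consistency
paper_total, comp_total = paper.mean(axis=1), computer.mean(axis=1)
print(round(np.corrcoef(paper_total, comp_total)[0, 1], 2))     # cross-mode score agreement
```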
Public library computer training for older adults to access high-quality Internet health information
Xie, Bo; Bugg, Julie M.
2010-01-01
An innovative experiment to develop and evaluate a public library computer training program to teach older adults to access and use high-quality Internet health information involved a productive collaboration among public libraries, the National Institute on Aging and the National Library of Medicine of the National Institutes of Health (NIH), and a Library and Information Science (LIS) academic program at a state university. One hundred and thirty-one older adults aged 54–89 participated in the study between September 2007 and July 2008. Key findings include: a) participants had overwhelmingly positive perceptions of the training program; b) after learning about two NIH websites (http://nihseniorhealth.gov and http://medlineplus.gov) from the training, many participants started using these online resources to find high quality health and medical information and, further, to guide their decision-making regarding a health- or medically-related matter; and c) computer anxiety significantly decreased (p < .001) while computer interest and efficacy significantly increased (p = .001 and p < .001, respectively) from pre- to post-training, suggesting statistically significant improvements in computer attitudes between pre- and post-training. The findings have implications for public libraries, LIS academic programs, and other organizations interested in providing similar programs in their communities. PMID:20161649
A Study of Quality of Service Communication for High-Speed Packet-Switching Computer Sub-Networks
NASA Technical Reports Server (NTRS)
Cui, Zhenqian
1999-01-01
With the development of high-speed networking technology, computer networks, including local-area networks (LANs), wide-area networks (WANs) and the Internet, are extending their traditional roles of carrying computer data. They are being used for Internet telephony, multimedia applications such as conferencing and video on demand, distributed simulations, and other real-time applications. LANs are even used for distributed real-time process control and computing as a cost-effective approach. Differing from traditional data transfer, these new classes of high-speed network applications (video, audio, real-time process control, and others) are delay sensitive. The usefulness of data depends not only on the correctness of received data, but also the time that data are received. In other words, these new classes of applications require networks to provide guaranteed services or quality of service (QoS). Quality of service can be defined by a set of parameters and reflects a user's expectation about the underlying network's behavior. Traditionally, distinct services are provided by different kinds of networks. Voice services are provided by telephone networks, video services are provided by cable networks, and data transfer services are provided by computer networks. A single network providing different services is called an integrated-services network.
beta-Aminoalcohols as Potential Reactivators of Aged Sarin-/Soman-Inhibited Acetylcholinesterase
2017-02-08
This approach includes high-quality quantum mechanical/molecular mechanical calculations, providing reliable reactivation steps and energetics. [The remainder of the indexed excerpt is truncated report front matter listing author affiliations: I. V. Khavrutskii and A. Wallqvist, Department of Defense Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced ...]
Data Processing Aspects of MEDLARS
Austin, Charles J.
1964-01-01
The speed and volume requirements of MEDLARS necessitate the use of high-speed data processing equipment, including paper-tape typewriters, a digital computer, and a special device for producing photo-composed output. Input to the system is of three types: variable source data, including citations from the literature and search requests; changes to such master files as the medical subject headings list and the journal record file; and operating instructions such as computer programs and procedures for machine operators. MEDLARS builds two major stores of data on magnetic tape. The Processed Citation File includes bibliographic citations in expanded form for high-quality printing at periodic intervals. The Compressed Citation File is a coded, time-sequential citation store which is used for high-speed searching against demand request input. Major design considerations include converting variable-length, alphanumeric data to mechanical form quickly and accurately; serial searching by the computer within a reasonable period of time; high-speed printing that must be of graphic quality; and efficient maintenance of various complex computer files. PMID:14119287
ERIC Educational Resources Information Center
Cheung, Yin Ling
2016-01-01
Much research has been conducted to investigate the quality of writing and high-level revisions in word processing-assisted and pen-and-paper writing modes. Studies that address cognitive aspects, such as experience and comfort with computers, by which students compose essays during writing assessments have remained relatively unexplored. To fill…
An image compression algorithm for a high-resolution digital still camera
NASA Technical Reports Server (NTRS)
Nerheim, Rosalee
1989-01-01
The Electronic Still Camera (ESC) project will provide for the capture and transmission of high-quality images without the use of film. The image quality will be superior to video and will approach the quality of 35mm film. The camera, which will have the same general shape and handling as a 35mm camera, will be able to send images to earth in near real-time. Images will be stored in computer memory (RAM) in removable cartridges readable by a computer. To save storage space, the image will be compressed and reconstructed at the time of viewing. Both lossless and lossy image compression algorithms are studied, described, and compared.
ERIC Educational Resources Information Center
Chou, Pao-Nan; Chang, Chi-Cheng
2011-01-01
This study examines the effects of reflection category and reflection quality on learning outcomes during the Web-based portfolio assessment process. Experimental subjects consist of forty-five eighth-grade students in a "Computer Application" course. Through the Web-based portfolio assessment system, these students write reflection, and join…
Optical Computers and Space Technology
NASA Technical Reports Server (NTRS)
Abdeldayem, Hossin A.; Frazier, Donald O.; Penn, Benjamin; Paley, Mark S.; Witherow, William K.; Banks, Curtis; Hicks, Rosilen; Shields, Angela
1995-01-01
The rapidly increasing demand for greater speed and efficiency on the information superhighway requires significant improvements over conventional electronic logic circuits. Optical interconnections and optical integrated circuits are strong candidates to overcome the severe limitations that conventional electronic logic circuits impose on the growth in speed and complexity of present-day computation. This new optical technology has increased the demand for high-quality optical materials. NASA's recent involvement in processing optical materials in space has demonstrated that a new and unique class of high-quality optical materials is processable in a microgravity environment. Microgravity processing can induce improved order in these materials and could have a significant impact on the development of optical computers. We discuss NASA's role in processing these materials and report on some of the associated nonlinear optical properties that are quite useful for optical computer technology.
Single-electron random-number generator (RNG) for highly secure ubiquitous computing applications
NASA Astrophysics Data System (ADS)
Uchida, Ken; Tanamoto, Tetsufumi; Fujita, Shinobu
2007-11-01
Since the security of all modern cryptographic techniques relies on unpredictable and irreproducible digital keys generated by random-number generators (RNGs), the realization of high-quality RNG is essential for secure communications. In this report, a new RNG, which utilizes single-electron phenomena, is proposed. A room-temperature operating silicon single-electron transistor (SET) having nearby an electron pocket is used as a high-quality, ultra-small RNG. In the proposed RNG, stochastic single-electron capture/emission processes to/from the electron pocket are detected with high sensitivity by the SET, and result in giant random telegraphic signals (GRTS) on the SET current. It is experimentally demonstrated that the single-electron RNG generates extremely high-quality random digital sequences at room temperature, in spite of its simple configuration. Because of its small-size and low-power properties, the single-electron RNG is promising as a key nanoelectronic device for future ubiquitous computing systems with highly secure mobile communication capabilities.
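To show how a random telegraph signal like the one described above can become a digital random bit stream, the sketch below thresholds a simulated two-level trace and applies von Neumann debiasing. The signal model and the debiasing step are generic assumptions, not the device's actual read-out chain.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stand-in for the measured SET current: a two-level random
# telegraph signal (electron captured / emitted) plus measurement noise.
levels = rng.integers(0, 2, size=4000)           # hypothetical capture/emission states
current = 1.0 * levels + rng.normal(0.0, 0.05, size=levels.size)

# Threshold the trace into raw bits, then apply von Neumann debiasing
# (a generic post-processing step, not necessarily what the device uses).
raw_bits = (current > 0.5).astype(int)
pairs = raw_bits[: len(raw_bits) // 2 * 2].reshape(-1, 2)
debiased = [a for a, b in pairs if a != b]

print(len(debiased), np.mean(debiased))          # roughly unbiased 0/1 stream
```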
NASA Astrophysics Data System (ADS)
Bhardwaj, Jyotirmoy; Gupta, Karunesh K.; Gupta, Rajiv
2018-02-01
New concepts and techniques are replacing traditional methods of measuring water quality parameters. This paper introduces a cyber-physical system (CPS) approach for water quality assessment in a distribution network. Cyber-physical systems with embedded sensors, processors, and actuators can be designed to sense and interact with the water environment. The proposed CPS comprises a sensing framework integrated with five different water-quality-parameter sensor nodes and a soft computing framework for computational modelling. The soft computing framework uses Python for the user interface and fuzzy logic for decision making. Introducing multiple sensors into a water distribution network generates a large number of data matrices that are often highly complex, difficult to interpret, and hard to use for effective decision making. The proposed framework therefore also aims to reduce the complexity of the sensor data matrices and to support decision making by water engineers through soft computing. The goal of this research is to provide a simple and efficient method to detect the presence of contamination in a water distribution network using a CPS.
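A minimal sketch of the kind of fuzzy decision making the soft computing framework could perform: two hypothetical sensor readings are fuzzified with triangular membership functions, combined by two toy rules, and defuzzified into a contamination-risk score. The membership ranges, rules, and weights are assumptions, not the paper's rule base.

```python
# Triangular membership helper.
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def contamination_risk(turbidity_ntu, chlorine_mg_l):
    # Fuzzify the two (hypothetical) sensor readings.
    turb_high = tri(turbidity_ntu, 1.0, 5.0, 9.0)
    chlor_low = tri(chlorine_mg_l, -0.1, 0.0, 0.5)
    # Two toy rules combined with min (AND):
    # R1: IF turbidity high AND chlorine low THEN risk high
    # R2: IF turbidity high THEN risk medium
    risk_high = min(turb_high, chlor_low)
    risk_medium = turb_high
    # Crude defuzzification: weighted average of the rule activations.
    num = 0.9 * risk_high + 0.5 * risk_medium
    den = risk_high + risk_medium
    return num / den if den > 0 else 0.0

print(contamination_risk(turbidity_ntu=6.0, chlorine_mg_l=0.1))   # ~0.7 on a 0-1 scale
```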
Treating child and adolescent anxiety effectively: Overview of systematic reviews.
Bennett, Kathryn; Manassis, Katharina; Duda, Stephanie; Bagnell, Alexa; Bernstein, Gail A; Garland, E Jane; Miller, Lynn D; Newton, Amanda; Thabane, Lehana; Wilansky, Pamela
2016-12-01
We conducted an overview of systematic reviews about child and adolescent anxiety treatment options (psychosocial; medication; combination; web/computer-based treatment) to support evidence informed decision-making. Three questions were addressed: (i) Is the treatment more effective than passive controls? (ii) Is there evidence that the treatment is superior to or non-inferior to (i.e., as good as) active controls? (iii) What is the quality of evidence for the treatment? Pre-specified inclusion criteria identified high quality systematic reviews (2000-2015) reporting treatment effects on anxiety diagnosis and symptom severity. Evidence quality (EQ) was rated using Oxford evidence levels [EQ1 (highest); EQ5 (lowest)]. Twenty-two of 39 eligible reviews were high quality (AMSTAR score≥3/5). CBT (individual or group, with or without parents) was more effective than passive controls (EQ1). CBT effects compared to active controls were mixed (EQ1). SSRI/SNRI were more effective than placebo (EQ1) but comparative effectiveness remains uncertain. EQ for combination therapy could not be determined. RCTs of web/computer-based interventions showed mixed results (EQ1). CBM/ABM was not more efficacious than active controls (EQ1). No other interventions could be rated. High quality RCTs support treatment with CBT and medication. Findings for combination and web/computer-based treatment are encouraging but further RCTs are required. Head-to-head comparisons of active treatment options are needed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Study of Image Qualities From 6D Robot-Based CBCT Imaging System of Small Animal Irradiator.
Sharma, Sunil; Narayanasamy, Ganesh; Clarkson, Richard; Chao, Ming; Moros, Eduardo G; Zhang, Xin; Yan, Yulong; Boerma, Marjan; Paudel, Nava; Morrill, Steven; Corry, Peter; Griffin, Robert J
2017-01-01
To assess the quality of cone beam computed tomography images obtained by a robotic arm-based and image-guided small animal conformal radiation therapy device. The small animal conformal radiation therapy device is equipped with a 40 to 225 kV X-ray tube mounted on a custom-made gantry, a 1024 × 1024 pixel flat-panel detector (200 μm resolution), and a programmable 6-degrees-of-freedom robot for cone beam computed tomography imaging and conformal delivery of radiation doses. A series of 2-dimensional radiographic projection images were recorded in cone beam mode by placing and rotating microcomputed tomography phantoms on the "palm" of the robotic arm. Reconstructed images were studied for image quality (spatial resolution, image uniformity, computed tomography number linearity, voxel noise, and artifacts). Geometric accuracy was measured to be 2%, corresponding to 0.7 mm accuracy on a Shelley microcomputed tomography QA phantom. Qualitative resolution of reconstructed axial computed tomography slices using the resolution coils was within 200 μm. Quantitative spatial resolution was found to be 3.16 lp/mm. Uniformity of the system was measured within 34 Hounsfield units on a QRM microcomputed tomography water phantom. Computed tomography numbers measured using the linearity plate were linear with material density (R² > 0.995). Cone beam computed tomography images of the QRM multidisk phantom had minimal artifacts. Results showed that the small animal conformal radiation therapy device is capable of producing high-quality cone beam computed tomography images for precise and conformal small animal dose delivery. With its high-caliber imaging capabilities, the small animal conformal radiation therapy device is a powerful tool for small animal research.
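As a small example of one QA check listed above, the sketch below fits measured CT numbers against insert density and reports the coefficient of determination. The density and Hounsfield values are invented for illustration; they are not measurements from the Shelley or QRM phantoms.

```python
import numpy as np

# Hypothetical insert densities (g/cm^3) and measured CT numbers (HU) from a linearity plate.
density = np.array([0.0, 0.25, 0.50, 0.75, 1.10, 1.50])
hu      = np.array([-1000, -745, -490, -240, 110, 520])

slope, intercept = np.polyfit(density, hu, 1)
pred = slope * density + intercept
r2 = 1 - np.sum((hu - pred) ** 2) / np.sum((hu - hu.mean()) ** 2)
print(round(r2, 4))   # a well-behaved system should give R^2 close to 1
```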
Computer method for identification of boiler transfer functions
NASA Technical Reports Server (NTRS)
Miles, J. H.
1972-01-01
Iterative computer aided procedure was developed which provides for identification of boiler transfer functions using frequency response data. Method uses frequency response data to obtain satisfactory transfer function for both high and low vapor exit quality data.
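The frequency-response identification idea can be sketched as a complex least-squares fit: synthetic response data are generated from an assumed first-order transfer function and its gain and time constant are recovered. The model order, noise level, and data are placeholders, not the boiler data or the iterative procedure of the report.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic frequency-response data from an assumed first-order model
# G(s) = K / (tau*s + 1); real boiler frequency-response data would replace this.
rng = np.random.default_rng(0)
w = np.logspace(-2, 1, 40)                                  # rad/s
K_true, tau_true = 2.0, 5.0
G_meas = K_true / (1j * w * tau_true + 1)
G_meas = G_meas + rng.normal(0, 0.01, w.size) + 1j * rng.normal(0, 0.01, w.size)

def residuals(params):
    K, tau = params
    G_model = K / (1j * w * tau + 1)
    err = G_model - G_meas
    return np.concatenate([err.real, err.imag])             # fit real and imaginary parts

fit = least_squares(residuals, x0=[1.0, 1.0])
print(fit.x)                                                # ~ [2.0, 5.0]
```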
GPUs: An Emerging Platform for General-Purpose Computation
2007-08-01
[Indexed excerpt is not a readable abstract: it contains fragments of a GPU programming-environment comparison table (PeakStream licensing terms) and truncated reference entries (folding.stanford.edu; Fan, Z.; Qiu, F.; Kaufman, A.; Yoakum-Stover, S., GPU Cluster for High Performance Computing, ACM/IEEE ...; Goodnight, N.; Wang, R.; Humphreys, G., Computation on Programmable Graphics Hardware, IEEE Computer Graphics and ...).]
Full-color large-scaled computer-generated holograms using RGB color filters.
Tsuchiyama, Yasuhiro; Matsushima, Kyoji
2017-02-06
A technique using RGB color filters is proposed for creating high-quality full-color computer-generated holograms (CGHs). The fringe of these CGHs is composed of more than a billion pixels. The CGHs reconstruct full-parallax three-dimensional color images with a deep sensation of depth caused by natural motion parallax. The simulation technique as well as the principle and challenges of high-quality full-color reconstruction are presented to address the design of filter properties suitable for large-scaled CGHs. Optical reconstructions of actual fabricated full-color CGHs are demonstrated in order to verify the proposed techniques.
Measuring the Impact of High Quality Instant Feedback on Learning
ERIC Educational Resources Information Center
Nutbrown, Stephen; Higgins, Colin; Beesley, Su
2016-01-01
This paper examines the impact of a novel assessment technique that has been used to improve the feedback given to second year Computer Science students at the University of Nottingham. Criteria for effective, high quality feedback are discussed. An automated marking system (The Marker's Apprentice--TMA) produces instant feedback in synergy with…
Teaching Quality Object-Oriented Programming
ERIC Educational Resources Information Center
Feldman, Yishai A.
2005-01-01
Computer science students need to learn how to write high-quality software. An important methodology for achieving quality is design-by-contract, in which code is developed together with its specification, which is given as class invariants and method pre- and postconditions. This paper describes practical experience in teaching design-by-contract…
NASA Astrophysics Data System (ADS)
Nakatsuji, Noriaki; Matsushima, Kyoji
2017-03-01
Full-parallax high-definition CGHs composed of more than a billion pixels have so far been created only by the polygon-based method because of its high performance. Recently, however, GPUs have made it possible to generate CGHs much faster with the point-cloud method. In this paper, we measure the computation time of object fields for full-parallax high-definition CGHs, which are composed of 4 billion pixels and reconstruct the same scene, using the point-cloud method on a GPU and the polygon-based method on a CPU. In addition, we compare the optical and simulated reconstructions of the CGHs created by these techniques to verify the image quality.
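For context on the point-cloud approach compared above, this sketch computes a tiny object field as a superposition of spherical waves from a few point sources and forms a fringe pattern with an off-axis plane reference wave. The wavelength, pixel pitch, hologram size, reference angle, and point positions are illustrative assumptions; full-scale CGHs of this kind use billions of pixels.

```python
import numpy as np

wavelength = 532e-9                    # assumed green laser [m]
k = 2 * np.pi / wavelength
pitch = 8e-6                           # assumed hologram pixel pitch [m]
N = 512                                # tiny hologram for illustration

# Hypothetical object points: (x, y, z, amplitude); z is the distance to the hologram plane.
points = [(0.0, 0.0, 0.05, 1.0),
          (1.0e-3, -0.5e-3, 0.06, 0.8)]

coords = (np.arange(N) - N / 2) * pitch
X, Y = np.meshgrid(coords, coords)

# Object field: superposition of spherical waves emitted by every point source.
field = np.zeros((N, N), dtype=complex)
for px, py, pz, amp in points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    field += amp * np.exp(1j * k * r) / r

# Interference with an off-axis plane reference wave gives the (amplitude) fringe pattern.
reference = np.exp(1j * k * np.sin(np.deg2rad(1.0)) * X)
fringe = np.abs(field + reference) ** 2
print(fringe.shape, fringe.mean())
```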
ERIC Educational Resources Information Center
Debuse, Justin C. W.; Lawley, Meredith
2016-01-01
Providing students with high quality feedback is important and can be achieved using computer-based systems. While student and educator perspectives of such systems have been investigated, a comprehensive multidisciplinary study has not yet been undertaken. This study examines student and educator perspectives of a computer-based assessment and…
Creating Printed Materials for Mathematics with a Macintosh Computer.
ERIC Educational Resources Information Center
Mahler, Philip
This document gives instructions on how to use a Macintosh computer to create printed materials for mathematics. A Macintosh computer, Microsoft Word, an object-oriented (Draw-type) art program, and a function-graphing program are capable of producing high-quality printed instructional materials for mathematics. Word 5.1 has an equation editor…
Fully Convolutional Architecture for Low-Dose CT Image Noise Reduction
NASA Astrophysics Data System (ADS)
Badretale, S.; Shaker, F.; Babyn, P.; Alirezaie, J.
2017-10-01
One of the critical topics in medical low-dose computed tomography (CT) imaging is how best to maintain image quality. As image quality decreases with lower X-ray radiation dose, improving it is extremely important and challenging. We propose a novel approach to denoising low-dose CT images. Our algorithm directly learns an end-to-end mapping from low-dose CT images to denoised images that approximate their normal-dose counterparts. The method is based on a deep convolutional neural network with rectified linear units. By learning low-level to high-level features from a low-dose image, the proposed algorithm is capable of creating a high-quality denoised image. We demonstrate the superiority of our technique by comparing the results with two other state-of-the-art methods in terms of peak signal-to-noise ratio, root mean square error, and a structural similarity index.
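A minimal PyTorch sketch of a fully convolutional denoiser trained on an end-to-end mapping from low-dose to normal-dose patches, in the spirit of the approach described above. The layer count, feature widths, and random tensors are placeholders, not the authors' network or data.

```python
import torch
import torch.nn as nn

# Minimal fully convolutional denoiser; widths and depth are illustrative only.
class DenoiseCNN(nn.Module):
    def __init__(self, channels=1, features=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, features, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, features, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, channels, kernel_size=3, padding=1),
        )

    def forward(self, x):
        # Predict the clean image directly from the low-dose input.
        return self.net(x)

model = DenoiseCNN()
low_dose = torch.randn(1, 1, 64, 64)          # stand-in for a low-dose CT patch
normal_dose = torch.randn(1, 1, 64, 64)       # stand-in for the matching normal-dose patch
loss = nn.MSELoss()(model(low_dose), normal_dose)
loss.backward()                               # one training step would follow an optimizer update
print(float(loss))
```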
NASA Technical Reports Server (NTRS)
Scott, D. W.
1994-01-01
This report describes efforts to use digital motion video compression technology to develop a highly portable device that would convert 1990-91 era IBM-compatible and/or Macintosh notebook computers into full-color, motion-video-capable multimedia training systems. An architecture was conceived that would permit direct conversion of existing laser-disk-based multimedia courses with little or no reauthoring. The project did not physically demonstrate certain critical video keying techniques, but their implementation should be feasible. This investigation of digital motion video has spawned two significant spaceflight projects at MSFC: one to downlink multiple high-quality video signals from Spacelab, and the other to uplink videoconference-quality video in real time and high-quality video off-line, plus investigate interactive, multimedia-based techniques for enhancing onboard science operations. Other airborne or spaceborne spinoffs are possible.
NASA Astrophysics Data System (ADS)
Wang, Yuan; Chen, Zhidong; Sang, Xinzhu; Li, Hui; Zhao, Linmin
2018-03-01
Holographic displays can provide the complete optical wave field of a three-dimensional (3D) scene, including depth perception. However, producing traditional computer-generated holograms (CGHs) often takes a long computation time, even without more complex and photorealistic rendering. The backward ray-tracing technique is able to render photorealistic, high-quality images and noticeably reduces the computation time thanks to its high degree of parallelism. Here, a high-efficiency photorealistic computer-generated hologram method based on the ray-tracing technique is presented. Rays are launched and traced in parallel under different illuminations and circumstances. Experimental results demonstrate the effectiveness of the proposed method. Compared with the traditional point-cloud CGH, the computation time is decreased to 24 s to reconstruct a 3D object of 100 × 100 rays with continuous depth change.
Statistical process control based chart for information systems security
NASA Astrophysics Data System (ADS)
Khan, Mansoor S.; Cui, Lirong
2015-07-01
Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions. We put forward the concept of statistical process control (SPC) for detecting intrusions in computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA) type quality-monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from previous versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We compare the proposed scheme with existing EWMA schemes and the p chart; finally, we provide some recommendations for future work.
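The sketch below implements a textbook EWMA monitoring statistic with its standard time-varying control limits on simulated event-rate data containing a shift. It illustrates the general EWMA chart rather than the specific one-parameter scheme proposed in the article, and the in-control estimation window is an assumption.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=3.0):
    """EWMA monitoring statistic with standard control limits.
    Textbook EWMA for illustration, not the exact scheme proposed in the article."""
    x = np.asarray(x, dtype=float)
    mu0, sigma = x[:50].mean(), x[:50].std(ddof=1)   # in-control estimates (assumed window)
    z = np.empty_like(x)
    z[0] = lam * x[0] + (1 - lam) * mu0
    for i in range(1, len(x)):
        z[i] = lam * x[i] + (1 - lam) * z[i - 1]
    t = np.arange(1, len(x) + 1)
    width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    return z, mu0 + width, mu0 - width

# Simulated per-minute counts of suspicious connection attempts, shifting upward at t = 80.
rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(10, 2, 80), rng.normal(14, 2, 20)])
z, ucl, lcl = ewma_chart(data)
print(np.where((z > ucl) | (z < lcl))[0])   # indices flagged as out of control
```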
NASA Technical Reports Server (NTRS)
Carpentier, R. P.; Pietrzyk, J. P.; Beyer, R. R.; Kalafut, J. S.
1976-01-01
Computer-designed sensor, consisting of single-stage electrostatically-focused, triode image intensifier, provides high quality imaging characterized by exceptionally low geometric distortion, low shading, and high center-and-corner modulation transfer function.
Norbäck, D; Nordström, K
2008-08-01
The effects of ventilation in computer classrooms were studied with university students (n = 355) in a blinded study, 31% were women and 3.8% had asthma. Two classrooms had a higher air exchange (4.1-5.2 ac/h); two others had a lower air exchange (2.3-2.6 ac/h). After 1 week, ventilation conditions were shifted. The students reported environmental perceptions during the last hour. Room temperature, RH, CO2, PM10 and ultra-fine particles were measured simultaneously. Mean CO2 was 1185 ppm at lower and 922 ppm at higher air exchange. Mean temperature was 23.2 degrees C at lower and 22.1 degrees C at higher air exchange. After mutual adjustment (temperature, RH, CO2, air exchange), measured temperature was associated with a perception of higher temperature (P < 0.001), lower air movement (P < 0.001), and poorer air quality (P < 0.001). Higher air exchange was associated with a perception of lower temperature (P < 0.001), higher air movement (P = 0.001), and better air quality (P < 0.001). In the longitudinal analysis (n = 83), increased air exchange caused a perception of lower temperature (P = 0.002), higher air movement (P < 0.001), better air quality (P = 0.001), and less odor (P = 0.02). In conclusion, computer classrooms have CO2 levels above 1000 ppm and temperatures above 22 degrees C. Increased ventilation from 7 l/s per person to 10-13 l/s per person can improve thermal comfort and air quality. Computer classrooms are crowded indoor environments with a high thermal load from both students and computer equipment. It is important to control room temperature either by air conditioning, sun shields, or sufficiently high ventilation flow. A high ventilation flow is also crucial to achieving good perceived air quality. Personal ventilation flow should be at least 10 l/s. Possible loss of learning ability due to poor indoor air quality in university buildings deserves more attention.
ERIC Educational Resources Information Center
McKissick, Bethany R.; Diegelmann, Karen M.; Parker, Sarah
2017-01-01
Providing high-quality special education services in rural settings has a variety of challenges such as geographic isolation and a lack of resources. One particularly challenging aspect of rural special education is providing general curriculum access. Computer-assisted instruction is one way to provide high-quality specialized instruction that…
Gardner, William; Morton, Suzanne; Byron, Sepheen C; Tinoco, Aldo; Canan, Benjamin D; Leonhart, Karen; Kong, Vivian; Scholle, Sarah Hudson
2014-01-01
Objective To determine whether quality measures based on computer-extracted EHR data can reproduce findings based on data manually extracted by reviewers. Data Sources We studied 12 measures of care indicated for adolescent well-care visits for 597 patients in three pediatric health systems. Study Design Observational study. Data Collection/Extraction Methods Manual reviewers collected quality data from the EHR. Site personnel programmed their EHR systems to extract the same data from structured fields in the EHR according to national health IT standards. Principal Findings Overall performance measured via computer-extracted data was 21.9 percent, compared with 53.2 percent for manual data. Agreement measures were high for immunizations. Otherwise, agreement between computer extraction and manual review was modest (Kappa = 0.36) because computer-extracted data frequently missed care events (sensitivity = 39.5 percent). Measure validity varied by health care domain and setting. A limitation of our findings is that we studied only three domains and three sites. Conclusions The accuracy of computer-extracted EHR quality reporting depends on the use of structured data fields, with the highest agreement found for measures and in the setting that had the greatest concentration of structured fields. We need to improve documentation of care, data extraction, and adaptation of EHR systems to practice workflow. PMID:24471935
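The agreement statistics quoted above (sensitivity with manual review as the reference, and Cohen's kappa correcting raw agreement for chance) can be reproduced on toy data as in the sketch below; the per-patient indicator flags are invented for illustration.

```python
import numpy as np

# Hypothetical per-patient indicator flags (1 = care event documented).
manual   = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])
computer = np.array([1, 0, 0, 1, 0, 0, 1, 0, 0, 1])

tp = np.sum((manual == 1) & (computer == 1))
fn = np.sum((manual == 1) & (computer == 0))
sensitivity = tp / (tp + fn)                       # computer extraction vs. manual review

# Cohen's kappa: observed agreement corrected for chance agreement.
po = np.mean(manual == computer)
pe = manual.mean() * computer.mean() + (1 - manual.mean()) * (1 - computer.mean())
kappa = (po - pe) / (1 - pe)
print(round(sensitivity, 2), round(kappa, 2))
```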
Pygmalion in Media-Based Learning: Effects of Quality Expectancies on Learning Outcomes
ERIC Educational Resources Information Center
Fries, Stefan; Horz, Holger; Haimerl, Charlotte
2006-01-01
Two studies investigated how quality expectations affect students' outcomes of media-based learning. Experiment 1 (N=62) demonstrated that students expecting a high-end computer-based training programme learned most, whereas students expecting a programme of ambiguous quality learned least and students having no expectations performed in between.…
NASA Astrophysics Data System (ADS)
Salha, A. A.; Stevens, D. K.
2013-12-01
This study presents the numerical application and statistical development of Stream Water Quality Modeling (SWQM) as a tool to investigate, manage, and research the transport and fate of water pollutants in the Lower Bear River, Box Elder County, Utah. The segment under study is the Bear River from Cutler Dam to its confluence with the Malad River (Subbasin HUC 16010204). Water quality problems arise primarily from high phosphorus and total suspended sediment concentrations caused by five permitted point-source discharges and a complex network of canals and ducts of varying sizes and carrying capacities that transport water (for farming and agricultural uses) from the Bear River and then back to it. The Utah Department of Environmental Quality (DEQ) has designated the entire reach of the Bear River between Cutler Reservoir and the Great Salt Lake as impaired. Stream water quality modeling requires specification of an appropriate model structure and process formulation according to the nature of the study area and the purpose of the investigation. The current model is i) one-dimensional (1D), ii) numerical, iii) unsteady, iv) mechanistic, v) dynamic, and vi) spatial (distributed). The basic principle of the study is the use of mass balance equations and numerical methods (a Fickian advection-dispersion approach) to solve the related partial differential equations. Because model error decreases and sensitivity increases as a model becomes more complex, both i) uncertainty (in parameters, data input, and model structure) and ii) model complexity will be under investigation. Watershed data (water quality parameters together with stream flow, seasonal variations, surrounding landscape, stream temperature, and point/nonpoint sources) were obtained mainly using HydroDesktop, a free and open-source GIS-enabled desktop application to find, download, visualize, and analyze time series of water and climate data registered with the CUAHSI Hydrologic Information System. Processing, validity assessment, and distribution of the time-series data were explored using the GNU R language (a statistical computing and graphics environment). Equations for physical, chemical, and biological processes were written in FORTRAN (High Performance Fortran) in order to compute and solve their hyperbolic and parabolic systems. Post-analysis of the results was conducted using the GNU R language. High-performance computing (HPC) will be introduced to expedite the solution of complex computational processes using parallel programming. It is expected that the model will assess nonpoint-source and specific point-source data to understand pollutants' causes, transfer, dispersion, and concentration at different locations along the Bear River. Investigating the impact of reducing or removing nonpoint nutrient loading on Bear River water quality management could also be addressed. Keywords: computer modeling; numerical solutions; sensitivity analysis; uncertainty analysis; ecosystem processes; high-performance computing; water quality.
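As a concrete example of the Fickian advection-dispersion mass balance mentioned above, here is an explicit finite-difference solution of the 1-D equation with first-order decay. The reach length, velocity, dispersion, and decay values are illustrative, not calibrated Bear River parameters, and the study's own implementation is in FORTRAN rather than Python.

```python
import numpy as np

# Explicit upwind/central finite-difference sketch of the 1-D advection-dispersion
# equation  dC/dt = -u dC/dx + D d2C/dx2 - k C  (all values below are illustrative).
L_reach, nx = 10_000.0, 200              # reach length [m], number of grid cells
dx = L_reach / nx
u, D, k = 0.3, 5.0, 1e-5                 # velocity [m/s], dispersion [m2/s], decay [1/s]
dt = 0.5 * min(dx / u, dx**2 / (2 * D))  # stable step for the explicit scheme

C = np.zeros(nx)
C[0] = 1.0                               # constant upstream concentration (point source)

for _ in range(5000):
    adv = -u * (C[1:-1] - C[:-2]) / dx                   # upwind advection
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2    # central dispersion
    C[1:-1] += dt * (adv + disp - k * C[1:-1])
    C[-1] = C[-2]                                        # zero-gradient outflow boundary

print(C[::40].round(3))                  # concentration profile along the reach
```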
Computer Courses in Higher-Education: Improving Learning by Screencast Technology
ERIC Educational Resources Information Center
Ghilay, Yaron; Ghilay, Ruth
2015-01-01
The aim of the study was to find out a method designated to improve the learning of computer courses by adding Screencast technology. The intention was to measure the influence of high-quality clips produced by Screencast technology, on the learning process of computer courses. It was required to find out the characteristics (pedagogical and…
A Review of Resources for Evaluating K-12 Computer Science Education Programs
ERIC Educational Resources Information Center
Randolph, Justus J.; Hartikainen, Elina
2004-01-01
Since computer science education is a key to preparing students for a technologically-oriented future, it makes sense to have high quality resources for conducting summative and formative evaluation of those programs. This paper describes the results of a critical analysis of the resources for evaluating K-12 computer science education projects.…
Luckman, Matthew; Hans, Didier; Cortez, Natalia; Nishiyama, Kyle K; Agarawal, Sanchita; Zhang, Chengchen; Nikkel, Lucas; Iyer, Sapna; Fusaro, Maria; Guo, Edward X; McMahon, Donald J; Shane, Elizabeth; Nickolas, Thomas L
2017-04-03
Studies using high-resolution peripheral quantitative computed tomography showed progressive abnormalities in cortical and trabecular microarchitecture and biomechanical competence over the first year after kidney transplantation. However, high-resolution peripheral computed tomography is a research tool lacking wide availability. In contrast, the trabecular bone score is a novel and widely available tool that uses gray-scale variograms of the spine image from dual-energy x-ray absorptiometry to assess trabecular quality. There are no studies assessing whether trabecular bone score characterizes bone quality in kidney transplant recipients. Between 2009 and 2010, we conducted a study to assess changes in peripheral skeletal microarchitecture, measured by high-resolution peripheral computed tomography, during the first year after transplantation in 47 patients managed with early corticosteroid-withdrawal immunosuppression. All adult first-time transplant candidates were eligible. Patients underwent imaging with high-resolution peripheral computed tomography and dual-energy x-ray absorptiometry pretransplantation and 3, 6, and 12 months post-transplantation. We now test if, during the first year after transplantation, trabecular bone score assesses the evolution of bone microarchitecture and biomechanical competence as determined by high-resolution peripheral computed tomography. At baseline and follow-up, among the 72% and 78%, respectively, of patients having normal bone mineral density by dual-energy x-ray absorptiometry, 53% and 50%, respectively, were classified by trabecular bone score as having high fracture risk. At baseline, trabecular bone score correlated with spine, hip, and ultradistal radius bone mineral density by dual-energy x-ray absorptiometry and cortical area, density, thickness, and porosity; trabecular density, thickness, separation, and heterogeneity; and stiffness and failure load by high-resolution peripheral computed tomography. Longitudinally, each percentage increase in trabecular bone score was associated with increases in trabecular number (0.35%±1.4%); decreases in trabecular thickness (-0.45%±0.15%), separation (-0.40%±0.15%), and network heterogeneity (-0.48%±0.20%); and increases in failure load (0.22%±0.09%) by high-resolution peripheral computed tomography (all P <0.05). Trabecular bone score may be a useful method to assess and monitor bone quality and strength and classify fracture risk in kidney transplant recipients. Copyright © 2017 by the American Society of Nephrology.
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.; Aganin, Alexei
2000-01-01
The transonic nozzle transmission problem and the open rotor noise radiation problem are solved computationally. Both are multiple-length-scale problems. For efficient and accurate numerical simulation, the multiple-size-mesh multiple-time-step Dispersion-Relation-Preserving scheme is used to calculate the time-periodic solution. To ensure an accurate solution, high-quality numerical boundary conditions are also needed. For the nozzle problem, a set of nonhomogeneous outflow boundary conditions is required. The nonhomogeneous boundary conditions not only generate the incoming sound waves but also, at the same time, allow the reflected acoustic waves and entropy waves, if present, to exit the computational domain without reflection. For the open rotor problem, there is an apparent singularity at the axis of rotation. An analytic extension approach is developed to provide a high-quality axis boundary treatment.
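For readers unfamiliar with such schemes, the sketch below shows the general form of a wide-stencil spatial derivative of the kind a Dispersion-Relation-Preserving scheme uses; the coefficients here are the standard fourth-order central-difference weights, used only as a stand-in for the optimized DRP coefficients, and the grid and test function are made up for the example.

# Sketch of a wide-stencil spatial derivative on a periodic grid. The weights
# below are ordinary 4th-order central-difference coefficients; an actual DRP
# scheme instead optimizes its 7-point coefficients in wavenumber space.
import numpy as np

def spatial_derivative(u, dx):
    """Approximate du/dx on a periodic grid with a 5-point central stencil."""
    c1, c2 = 8.0 / 12.0, -1.0 / 12.0
    return (c1 * (np.roll(u, -1) - np.roll(u, 1))
            + c2 * (np.roll(u, -2) - np.roll(u, 2))) / dx

x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
u = np.sin(x)
du = spatial_derivative(u, x[1] - x[0])
print(np.max(np.abs(du - np.cos(x))))  # small discretization error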
Digital multicolor printing: state of the art and future challenges
NASA Astrophysics Data System (ADS)
Kipphan, Helmut
1995-04-01
During the last 5 years, digital techniques have become extremely important in the graphic arts industry. All sections in the production flow for producing multicolor printed products - prepress, printing and postpress - are influenced by digitalization, in an evolutionary and revolutionary way. New equipment and network techniques bring all the sections closer together. The focus is on high-quality multicolor printing combined with high productivity. Conventional offset printing technology is compared with the leading nonimpact printing technologies, and computer-to-press is contrasted with computer-to-print techniques. The newest available digital multicolor presses are described - the direct imaging offset printing press from HEIDELBERG with its new laser imaging technique, as well as the INDIGO and XEIKON presses based on electrophotography. On the basis of technical specifications, economic calculations and print quality, it is shown that each technique has its own market segments. An outlook is given for future computer-to-press techniques and the potential of nonimpact printing technologies for advanced high-speed multicolor computer-to-print equipment. Synergy effects between NIP technologies and conventional printing technologies, in both directions, make innovative new products possible, for example hybrid printing systems. It is also shown that there is potential for improving print quality with NIP technologies, based on special screening algorithms and a higher number of grey levels per pixel. As an intermediate step in the digitalization of the production flow, but also as an economical solution, computer-to-plate equipment is described. For producing printed products in a totally digital way, digital color proofing as well as color management systems are needed. The newest high-tech equipment using NIP technologies for producing proofs is explained. All in all it is shown that the state of the art in digital multicolor printing has reached a very high level in technology, productivity and quality, but that there is still room for improvements and innovations. Manufacturers of equipment and producers of printed products can take part in a successful evolution; changes, chances and challenges must be recognized and considered for future-oriented activities and investments.
Williams, Peter Huw; de Lusignan, Simon
2006-01-01
The Royal College of Physicians (RCP) have produced guidelines for stroke management in primary care; this guidance is taken to be the gold standard for the care of people with stroke. UK general practitioners now have a quality-based contract which includes a Quality and Outcomes Framework (QOF). This consists of financially remunerated 'quality points' for specific disease areas, including stroke. Achievement of these quality points is measured by extracting a limited list of computer codes from practice computer systems. The aim was to investigate whether a high stroke quality score is associated with adherence to RCP guidelines, by examining the computer and written medical records of all patients with a diagnosis of stroke in two general practices - one in southwest London, one in Surrey - with a combined practice population of over 20 000. Both practices had a similar age-sex profile and prevalence of stroke. One practice scored 93.5% (29/31) of the available stroke quality points. The other practice achieved 73.4% (22.75/31), and only did better in one stroke quality target. However, the practice scoring fewer quality points had much better adherence to RCP guidance: 96% of patients were assessed in secondary care compared with 79% (P=0.001); 64% of stroke patients were seen the same day, compared with 44%; 56% received rehabilitation compared with 37%. Higher quality points did not reflect better adherence to RCP guidance. This audit highlights a gap between relatively simplistic measures of quality in the QOF, dependent on the recording of a narrow range of computer codes, and the actual standard of care being delivered. Research is needed to see whether this finding is generalisable and how the Quality and Outcomes Framework might be better aligned with delivering best practice.
NASA Astrophysics Data System (ADS)
Engwirda, Darren
2017-06-01
An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.
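As a rough illustration of the Delaunay-Voronoi duality on the sphere that such staggered grids rely on, the following sketch triangulates a random point set and takes the spherical Voronoi diagram as its polygonal dual using SciPy; it is not the Frontal-Delaunay refinement or the grid optimisation described above, and the point count is arbitrary.

# Minimal sketch of the staggered Delaunay/Voronoi pairing on the unit sphere.
# Random generating points stand in for the adaptively placed points produced
# by Frontal-Delaunay refinement.
import numpy as np
from scipy.spatial import SphericalVoronoi, ConvexHull

rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # project onto the unit sphere

tri = ConvexHull(pts)            # hull of points on a sphere = spherical Delaunay triangulation
vor = SphericalVoronoi(pts, radius=1.0)
vor.sort_vertices_of_regions()   # order each polygon's vertices (useful for areas/plots)

print("triangles:", len(tri.simplices), "voronoi cells:", len(vor.regions))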
Cutting Costs on Computer Forms.
ERIC Educational Resources Information Center
Rupp, Robert V., Jr.
1989-01-01
Using the experience of Ford Motor Company, Oscar Mayer, and IBM, this article shows that companies are enjoying high-quality product performance and substantially lower costs by converting from premium white bond computer stock forms to blended bond forms. School administrators are advised to do likewise. (MLH)
Evaluation of a continuous-rotation, high-speed scanning protocol for micro-computed tomography.
Kerl, Hans Ulrich; Isaza, Cristina T; Boll, Hanne; Schambach, Sebastian J; Nolte, Ingo S; Groden, Christoph; Brockmann, Marc A
2011-01-01
Micro-computed tomography is used frequently in preclinical in vivo research. Limiting factors are radiation dose and long scan times. The purpose of the study was to compare a standard step-and-shoot to a continuous-rotation, high-speed scanning protocol. Micro-computed tomography of a lead grid phantom and a rat femur was performed using a step-and-shoot and a continuous-rotation protocol. Detail discriminability and image quality were assessed by 3 radiologists. The signal-to-noise ratio and the modulation transfer function were calculated, and volumetric analyses of the femur were performed. The radiation dose of the scan protocols was measured using thermoluminescence dosimeters. The 40-second continuous-rotation protocol allowed a detail discriminability comparable to the step-and-shoot protocol at significantly lower radiation doses. No marked differences in volumetric or qualitative analyses were observed. Continuous-rotation micro-computed tomography significantly reduces scanning time and radiation dose without relevantly reducing image quality compared with a normal step-and-shoot protocol.
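One common way to quantify the signal-to-noise ratio reported in such comparisons is the mean value in a uniform region of interest divided by the standard deviation of the noise; the sketch below computes that ratio on a synthetic image, with ROI coordinates chosen arbitrarily for the example.

# Illustrative region-of-interest SNR estimate of the kind used to compare the
# step-and-shoot and continuous-rotation protocols. The image and ROIs are
# synthetic placeholders.
import numpy as np

def roi_snr(image, signal_roi, background_roi):
    sig = image[signal_roi].mean()
    noise = image[background_roi].std(ddof=1)
    return sig / noise

image = np.random.default_rng(1).normal(loc=100.0, scale=5.0, size=(256, 256))
snr = roi_snr(image,
              signal_roi=(slice(100, 140), slice(100, 140)),
              background_roi=(slice(10, 50), slice(10, 50)))
print(f"SNR = {snr:.1f}")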
Root, Jenny R; Stevenson, Bradley S; Davis, Luann Ley; Geddes-Hall, Jennifer; Test, David W
2017-02-01
Computer-assisted instruction (CAI) is growing in popularity and has demonstrated positive effects for students with disabilities, including those with autism spectrum disorder (ASD). In this review, criteria for group experimental and single case studies were used to determine quality (Horner et al., Exceptional Children 71:165-179, 2005; Gersten et al., Exceptional Children 71:149-164, 2005; National Technical Assistance Center on Transition Center 2015). Included studies of high and adequate quality were further analyzed in terms of content, context, and specific instructional practices. Based on the NTACT criteria, this systematic review has established CAI as an evidence-based practice for teaching academics to students with ASD with support from 10 single-case and two group design studies of high or adequate quality. Suggestions for future research and implications for practice are discussed.
ERIC Educational Resources Information Center
Crowe, Suzy; Penney, Elaine
This book is the first volume in the "Kids and Computers" series, a series of books designed to help adults easily use high-quality, developmentally appropriate software with children. After reviewing the basics of selected software packages (how to start the program, stop the program, move around, and use special keys) several ideas and…
Kazakauskaite, Egle; Husmann, Lars; Stehli, Julia; Fuchs, Tobias; Fiechter, Michael; Klaeser, Bernd; Ghadri, Jelena R; Gebhard, Catherine; Gaemperli, Oliver; Kaufmann, Philipp A
2013-02-01
A new generation of high-definition computed tomography (HDCT) 64-slice devices, complemented by a new iterative image reconstruction algorithm (adaptive statistical iterative reconstruction), offers substantially higher resolution compared to standard-definition CT (SDCT) scanners. As higher resolution confers higher noise, we have compared image quality and radiation dose of coronary computed tomography angiography (CCTA) from HDCT versus SDCT. Consecutive patients (n = 93) underwent HDCT, and were compared to 93 patients who had previously undergone CCTA with SDCT matched for heart rate (HR), HR variability and body mass index (BMI). Tube voltage and current were adapted to the patient's BMI, using identical protocols in both groups. The image quality of all CCTA scans was evaluated by two independent readers in all coronary segments using a 4-point scale (1, excellent image quality; 2, blurring of the vessel wall; 3, image with artefacts but evaluative; 4, non-evaluative). Effective radiation dose was calculated from the dose-length product (DLP) multiplied by a conversion factor (0.014 mSv/mGy × cm). The mean image quality score from HDCT versus SDCT was comparable (2.02 ± 0.68 vs. 2.00 ± 0.76). Mean effective radiation dose did not significantly differ between HDCT (1.7 ± 0.6 mSv, range 1.0-3.7 mSv) and SDCT (1.9 ± 0.8 mSv, range 0.8-5.5 mSv; P = n.s.). HDCT scanners allow low-dose 64-slice CCTA scanning with higher resolution than SDCT while maintaining image quality and an equally low radiation dose. Whether this will translate into higher accuracy of HDCT for CAD detection remains to be evaluated.
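The dose calculation mentioned above is a single multiplication. As a worked illustration (the DLP value below is made up for the example, not taken from the study):

E\ [\mathrm{mSv}] \;=\; k \cdot \mathrm{DLP}\ [\mathrm{mGy\,cm}], \qquad k = 0.014\ \mathrm{mSv\,mGy^{-1}\,cm^{-1}}

so a scan with a DLP of about 120 mGy cm corresponds to E ≈ 0.014 × 120 ≈ 1.7 mSv, of the same order as the mean HDCT dose reported.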
ERIC Educational Resources Information Center
Vekli, Gülsah Sezen; Çimer, Atilla
2017-01-01
This study investigated development of students' scientific argumentation levels in the applications made with Problem-Based Computer-Aided Material (PBCAM) designed about Human Endocrine System. The case study method was used: The study group was formed of 43 students in the 11th grade of the science high school in Rize. Human Endocrine System…
Kiely, Daniel J; Stephanson, Kirk; Ross, Sue
2011-10-01
Low-cost laparoscopic box trainers built using home computers and webcams may provide residents with a useful tool for practice at home. This study set out to evaluate the image quality of low-cost laparoscopic box trainers compared with a commercially available model. Five low-cost laparoscopic box trainers including the components listed were compared in random order to one commercially available box trainer: A (high-definition USB 2.0 webcam, PC laptop), B (Firewire webcam, Mac laptop), C (high-definition USB 2.0 webcam, Mac laptop), D (standard USB webcam, PC desktop), E (Firewire webcam, PC desktop), and F (the TRLCD03 3-DMEd Standard Minimally Invasive Training System). Participants observed still image quality and performed a peg transfer task using each box trainer. Participants rated still image quality, image quality with motion, and whether the box trainer had sufficient image quality to be useful for training. Sixteen residents in obstetrics and gynecology took part in the study. The box trainers showing no statistically significant difference from the commercially available model were A, B, C, D, and E for still image quality; A for image quality with motion; and A and B for usefulness of the simulator based on image quality. The cost of the box trainers A-E is approximately $100 to $160 each, not including a computer or laparoscopic instruments. Laparoscopic box trainers built from a high-definition USB 2.0 webcam with a PC (box trainer A) or from a Firewire webcam with a Mac (box trainer B) provide image quality comparable with a commercial standard.
Image quality improvement in cone-beam CT using the super-resolution technique.
Oyama, Asuka; Kumagai, Shinobu; Arai, Norikazu; Takata, Takeshi; Saikawa, Yusuke; Shiraishi, Kenshiro; Kobayashi, Takenori; Kotoku, Jun'ichi
2018-04-05
This study was conducted to improve cone-beam computed tomography (CBCT) image quality using the super-resolution technique, a method of inferring a high-resolution image from a low-resolution image. This technique is used with two matrices, so-called dictionaries, constructed respectively from high-resolution and low-resolution image bases. For this study, a CBCT image, as a low-resolution image, is represented as a linear combination of atoms, the image bases in the low-resolution dictionary. The corresponding super-resolution image was inferred by multiplying the coefficients and the high-resolution dictionary atoms extracted from planning CT images. To evaluate the proposed method, we computed the root mean square error (RMSE) and structural similarity (SSIM). The resulting RMSE and SSIM between the super-resolution images and the planning CT images were, respectively, as much as 0.81 and 1.29 times better than those obtained without using the super-resolution technique. We used super-resolution technique to improve the CBCT image quality.
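A minimal sketch of the dictionary-pair reconstruction step described above is given below, assuming paired low- and high-resolution dictionaries; the dictionaries, patch sizes, and sparsity level are random placeholders rather than the ones learned from planning CT images in the study.

# Sparse-coding super-resolution of one patch: encode the low-resolution patch
# over the low-resolution dictionary, then reuse the same coefficients with the
# paired high-resolution dictionary. Dictionaries here are random placeholders.
import numpy as np
from sklearn.decomposition import sparse_encode

rng = np.random.default_rng(0)
n_atoms, lr_dim, hr_dim = 256, 36, 144          # e.g. 6x6 LR patches -> 12x12 HR patches
D_lr = rng.normal(size=(n_atoms, lr_dim))       # low-resolution dictionary (atoms as rows)
D_lr /= np.linalg.norm(D_lr, axis=1, keepdims=True)
D_hr = rng.normal(size=(n_atoms, hr_dim))       # paired high-resolution dictionary

lr_patch = rng.normal(size=(1, lr_dim))         # one low-resolution input patch
alpha = sparse_encode(lr_patch, D_lr, algorithm="omp", n_nonzero_coefs=5)
hr_patch = alpha @ D_hr                         # inferred high-resolution patch
print(hr_patch.shape)                           # (1, 144)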
ERIC Educational Resources Information Center
Capshaw, Norman Clark
2008-01-01
The disruptive technologies of the Internet and computers are changing our world in myriad ways. These technologies are also increasingly being employed in higher education but to what effect? Are the effects on higher education quality measurable, and if so, what is the effect on the traditional gap between high-income and low- to middle-income…
NASA Technical Reports Server (NTRS)
Buckner, J. D.; Council, H. W.; Edwards, T. R.
1974-01-01
Description of the hardware and software implementing the system of time-lapse reproduction of images through interactive graphics (TRIIG). The system produces a quality hard copy of processed images in a fast and inexpensive manner. This capability allows for optimal development of processing software through the rapid viewing of many image frames in an interactive mode. Three critical optical devices are used to reproduce an image: an Optronics photo reader/writer, the Adage Graphics Terminal, and Polaroid Type 57 high speed film. Typical sources of digitized images are observation satellites, such as ERTS or Mariner, computer coupled electron microscopes for high-magnification studies, or computer coupled X-ray devices for medical research.
Dynamic stability and handling qualities tests on a highly augmented, statically unstable airplane
NASA Technical Reports Server (NTRS)
Gera, Joseph; Bosworth, John T.
1987-01-01
Novel flight test and analysis techniques in the flight dynamics and handling qualities area are described. These techniques were utilized at NASA Ames-Dryden during the initial flight envelope clearance of the X-29A aircraft. It is shown that the open-loop frequency response of an aircraft with highly relaxed static stability can be successfully computed on the ground from telemetry data. Postflight closed-loop frequency response data were obtained from pilot-generated frequency sweeps and it is found that the current handling quality requirements for high-maneuverability aircraft are generally applicable to the X-29A.
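One standard way such an open-loop frequency response can be estimated from telemetry is the cross-spectral estimate H(f) = Pxy(f)/Pxx(f) between the sweep input and the measured response; the sketch below applies it to synthetic signals and is not the X-29A analysis itself (the sample rate, sweep range, and stand-in dynamics are assumptions).

# Frequency-response estimate from a frequency sweep, using synthetic data.
import numpy as np
from scipy import signal

fs = 100.0                                    # assumed telemetry sample rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)
u = signal.chirp(t, f0=0.1, f1=5.0, t1=60.0)  # pilot-style frequency sweep input
sys = signal.TransferFunction([4.0], [1.0, 1.2, 4.0])   # placeholder "aircraft" dynamics
_, y, _ = signal.lsim(sys, U=u, T=t)
y += 0.05 * np.random.default_rng(2).normal(size=y.shape)  # measurement noise

f, Pxy = signal.csd(u, y, fs=fs, nperseg=1024)
_, Pxx = signal.welch(u, fs=fs, nperseg=1024)
H = Pxy / Pxx                                 # complex frequency-response estimate
gain_db = 20.0 * np.log10(np.abs(H))
phase_deg = np.degrees(np.angle(H))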
Advanced Computational Methods for High-accuracy Refinement of Protein Low-quality Models
NASA Astrophysics Data System (ADS)
Zang, Tianwu
Predicting the 3-dimensional structure of a protein has been a major interest in modern computational biology. While many successful methods can generate models within 3-5 Å root-mean-square deviation (RMSD) from the solution, progress in refining these models has been slow. It is therefore urgently needed to develop effective methods to bring low-quality models into higher-accuracy ranges (e.g., less than 2 Å RMSD). In this thesis, I present several novel computational methods to address the high-accuracy refinement problem. First, an enhanced sampling method, named parallel continuous simulated tempering (PCST), is developed to accelerate the molecular dynamics (MD) simulation. Second, two energy biasing methods, the Structure-Based Model (SBM) and the Ensemble-Based Model (EBM), are introduced to perform targeted sampling around important conformations. Third, a three-step method is developed to blindly select high-quality models along the MD simulation. These methods work together to achieve significant refinement of low-quality models without any knowledge of the solution. The effectiveness of these methods is examined in different applications. Using the PCST-SBM method, models with higher global distance test scores (GDT_TS) are generated and selected in MD simulations of 18 targets from the refinement category of the 10th Critical Assessment of Structure Prediction (CASP10). In addition, in the refinement test of two CASP10 targets using the PCST-EBM method, it is indicated that EBM may bring the initial model to even higher-quality levels. Furthermore, a multi-round refinement protocol of PCST-SBM improves the model quality of a protein to a level that is sufficiently high for molecular replacement in X-ray crystallography. Our results justify the crucial role of enhanced sampling in protein structure prediction and demonstrate that considerable improvement of low-accuracy structures is still achievable with current force fields.
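PCST varies the simulation temperature continuously, so the sketch below is only a simplified stand-in: it shows the conventional replica-exchange (parallel tempering) swap criterion that underlies tempering methods in general, with made-up inverse temperatures and energies.

# Conventional replica-exchange swap test between two replicas at inverse
# temperatures beta_i, beta_j holding energies E_i, E_j. This is a stand-in for
# the tempering idea, not the PCST method itself.
import math, random

def accept_swap(beta_i, beta_j, E_i, E_j, rng=random):
    # Metropolis criterion for exchanging configurations between two replicas
    delta = (beta_i - beta_j) * (E_i - E_j)
    return delta >= 0.0 or rng.random() < math.exp(delta)

# Example: a cold replica holding a higher-energy state than the hot replica
print(accept_swap(beta_i=1.0, beta_j=0.5, E_i=-120.0, E_j=-150.0))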
Wavefront measurement using computational adaptive optics.
South, Fredrick A; Liu, Yuan-Zhi; Bower, Andrew J; Xu, Yang; Carney, P Scott; Boppart, Stephen A
2018-03-01
In many optical imaging applications, it is necessary to correct for aberrations to obtain high quality images. Optical coherence tomography (OCT) provides access to the amplitude and phase of the backscattered optical field for three-dimensional (3D) imaging samples. Computational adaptive optics (CAO) modifies the phase of the OCT data in the spatial frequency domain to correct optical aberrations without using a deformable mirror, as is commonly done in hardware-based adaptive optics (AO). This provides improvement of image quality throughout the 3D volume, enabling imaging across greater depth ranges and in highly aberrated samples. However, the CAO aberration correction has a complicated relation to the imaging pupil and is not a direct measurement of the pupil aberrations. Here we present new methods for recovering the wavefront aberrations directly from the OCT data without the use of hardware adaptive optics. This enables both computational measurement and correction of optical aberrations.
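A minimal sketch of the computational correction step is shown below: the complex OCT field of one en-face plane is multiplied in the spatial frequency domain by a defocus-like phase term and transformed back. The quadratic phase model, its coefficient, and the random test field are assumptions for illustration; the actual method recovers the wavefront from the data rather than assuming it.

# Apply a phase correction to a complex en-face OCT field in the spatial
# frequency domain. The defocus-only model and its coefficient are placeholders.
import numpy as np

def apply_phase_correction(field, defocus_coeff):
    """field: 2D complex OCT en-face field; defocus_coeff: radians at the pupil edge."""
    ny, nx = field.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    rho2 = (fx / fx.max())**2 + (fy / fy.max())**2     # normalized squared pupil radius
    correction = np.exp(-1j * defocus_coeff * rho2)    # conjugate of the assumed aberration
    return np.fft.ifft2(np.fft.fft2(field) * correction)

rng = np.random.default_rng(3)
aberrated = rng.normal(size=(256, 256)) + 1j * rng.normal(size=(256, 256))
corrected = apply_phase_correction(aberrated, defocus_coeff=6.0)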
A High-Performance Genetic Algorithm: Using Traveling Salesman Problem as a Case
Tsai, Chun-Wei; Tseng, Shih-Pang; Yang, Chu-Sing
2014-01-01
This paper presents a simple but efficient algorithm for reducing the computation time of genetic algorithm (GA) and its variants. The proposed algorithm is motivated by the observation that genes common to all the individuals of a GA have a high probability of surviving the evolution and ending up being part of the final solution; as such, they can be saved away to eliminate the redundant computations at the later generations of a GA. To evaluate the performance of the proposed algorithm, we use it not only to solve the traveling salesman problem but also to provide an extensive analysis on the impact it may have on the quality of the end result. Our experimental results indicate that the proposed algorithm can significantly reduce the computation time of GA and GA-based algorithms while limiting the degradation of the quality of the end result to a very small percentage compared to traditional GA. PMID:24892038
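The core observation, detecting genes (here, tour edges) shared by every individual so they can be recorded once and skipped in later evaluations, can be sketched in a few lines; the code below shows only that detection step on a toy population, not the authors' full GA.

# Find the TSP edges common to every tour in the current population.
def tour_edges(tour):
    # undirected edge set of a tour given as a list of city indices
    return {frozenset((tour[i], tour[(i + 1) % len(tour)])) for i in range(len(tour))}

def common_edges(population):
    edge_sets = [tour_edges(t) for t in population]
    return set.intersection(*edge_sets)

population = [
    [0, 1, 2, 3, 4, 5],
    [0, 1, 2, 5, 4, 3],
    [3, 4, 5, 2, 1, 0],
]
print(common_edges(population))   # edges present in every tour of the population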
APQ-102 imaging radar digital image quality study
NASA Technical Reports Server (NTRS)
Griffin, C. R.; Estes, J. M.
1982-01-01
A modified APQ-102 sidelooking radar collected synthetic aperture radar (SAR) data which was digitized and recorded on wideband magnetic tape. These tapes were then ground processed into computer compatible tapes (CCT's). The CCT's may then be processed into high resolution radar images by software on the CYBER computer.
Shortcomings of low-cost imaging systems for viewing computed radiographs.
Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N
2000-01-01
The aim was to assess potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (a PC with a common display card and color monitor), and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing improved the perception of low-contrast details significantly irrespective of the monitor used. No significant difference in perception was observed between monochrome and color monitors. The review at the radiological workstation was superior to the review done using the PC with image processing. Lower-quality hardware (graphics card and monitor) used in low-cost PCs negatively affects perception of low-contrast details in computed radiographs. In this situation, it is highly recommended to use spatial frequency and contrast processing. No significant quality gain was observed for the high-end monochrome monitor compared to the color display; however, the color monitor was more strongly affected by high ambient illumination.
Prazeres, Carlos Eduardo Elias Dos; Magalhães, Tiago Augusto; de Castro Carneiro, Adriano Camargo; Cury, Roberto Caldeira; de Melo Moreira, Valéria; Bello, Juliana Hiromi Silva Matsumoto; Rochitte, Carlos Eduardo
The aim of this study was to compare image quality and radiation dose of coronary computed tomography (CT) angiography performed with a dual-source CT scanner using 2 different protocols in patients with atrial fibrillation (AF). Forty-seven patients with AF underwent 2 different acquisition protocols: double high-pitch (DHP) spiral acquisition and retrospective spiral acquisition. The image quality was ranked according to a qualitative score by 2 experts: 1, no evident motion; 2, minimal motion not influencing coronary artery luminal evaluation; and 3, motion with impaired luminal evaluation. A third expert solved any disagreement. A total of 732 segments were evaluated. The DHP group (24 patients, 374 segments) showed more segments classified as score 1 than the retrospective spiral acquisition group (71.3% vs 37.4%). Image quality evaluation agreement was high between observers (κ = 0.8). There was significantly lower radiation exposure for the DHP group (3.65 [1.29] vs 23.57 [10.32] mSv). In this original direct comparison, a DHP spiral protocol for coronary CT angiography acquisition in patients with atrial fibrillation resulted in lower radiation exposure and superior image quality compared with conventional spiral retrospective acquisition.
Sohrabi, Mehdi; Parsi, Masoumeh; Sina, Sedigheh
2018-05-17
A diagnostic reference level is an advisory dose level set by a regulatory authority in a country as an efficient criterion for protection of patients from unwanted medical exposure. In computed tomography, the direct dose measurement and data collection methods are commonly applied for determination of diagnostic reference levels. Recently, a new quality-control-based dose survey method was proposed by the authors to simplify the diagnostic reference-level determination using a retrospective quality control database usually available at a regulatory authority in a country. In line with such a development, a prospective dual-purpose quality control dosimetry protocol is proposed for determination of diagnostic reference levels in a country, which can be simply applied by quality control service providers. This new proposed method was applied to five computed tomography scanners in Shiraz, Iran, and diagnostic reference levels for head, abdomen/pelvis, sinus, chest, and lumbar spine examinations were determined. The results were compared to those obtained by the data collection and quality-control-based dose survey methods, carried out in parallel in this study, and were found to agree well within approximately 6%. This is highly acceptable for quality-control-based methods according to International Atomic Energy Agency tolerance levels (±20%).
Shao, Feng; Li, Kemeng; Lin, Weisi; Jiang, Gangyi; Yu, Mei; Dai, Qionghai
2015-10-01
Quality assessment of 3D images encounters more challenges than its 2D counterparts. Directly applying 2D image quality metrics is not the solution. In this paper, we propose a new full-reference quality assessment for stereoscopic images by learning binocular receptive field properties to be more in line with human visual perception. To be more specific, in the training phase, we learn a multiscale dictionary from the training database, so that the latent structure of images can be represented as a set of basis vectors. In the quality estimation phase, we compute sparse feature similarity index based on the estimated sparse coefficient vectors by considering their phase difference and amplitude difference, and compute global luminance similarity index by considering luminance changes. The final quality score is obtained by incorporating binocular combination based on sparse energy and sparse complexity. Experimental results on five public 3D image quality assessment databases demonstrate that in comparison with the most related existing methods, the devised algorithm achieves high consistency with subjective assessment.
A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.
De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc
2010-09-01
In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high resolution X-ray computed tomography, where reconstruction volumes contain a high number of volume elements (several giga voxels), this computational burden prevents their actual breakthrough. Besides the large amount of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphical processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.
The persuasiveness of synthetic speech versus human speech.
Stern, S E; Mullennix, J W; Dyson, C; Wilson, S J
1999-12-01
Is computer-synthesized speech as persuasive as the human voice when presenting an argument? After completing an attitude pretest, 193 participants were randomly assigned to listen to a persuasive appeal under three conditions: a high-quality synthesized speech system (DECtalk Express), a low-quality synthesized speech system (Monologue), and a tape recording of a human voice. Following the appeal, participants completed a posttest attitude survey and a series of questionnaires designed to assess perceptions of speech qualities, perceptions of the speaker, and perceptions of the message. The human voice was generally perceived more favorably than the computer-synthesized voice, and the speaker was perceived more favorably when the voice was a human voice than when it was computer synthesized. There was, however, no evidence that computerized speech, as compared with the human voice, affected persuasion or perceptions of the message. Actual or potential applications of this research include issues that should be considered when designing synthetic speech systems.
Zhu, Liping; Aono, Masashi; Kim, Song-Ju; Hara, Masahiko
2013-04-01
A single-celled, multi-nucleated amoeboid organism, a plasmodium of the true slime mold Physarum polycephalum, can perform sophisticated computing by exhibiting complex spatiotemporal oscillatory dynamics while deforming its amorphous body. We previously devised an "amoeba-based computer (ABC)" to quantitatively evaluate the optimization capability of the amoeboid organism in searching for a solution to the traveling salesman problem (TSP) under optical feedback control. In ABC, the organism changes its shape to find a high quality solution (a relatively shorter TSP route) by alternately expanding and contracting its pseudopod-like branches that exhibit local photoavoidance behavior. The quality of the solution serves as a measure of the optimality of which the organism maximizes its global body area (nutrient absorption) while minimizing the risk of being illuminated (exposure to aversive stimuli). ABC found a high quality solution for the 8-city TSP with a high probability. However, it remains unclear whether intracellular communication among the branches of the organism is essential for computing. In this study, we conducted a series of control experiments using two individual cells (two single-celled organisms) to perform parallel searches in the absence of intercellular communication. We found that ABC drastically lost its ability to find a solution when it used two independent individuals. However, interestingly, when two individuals were prepared by dividing one individual, they found a solution for a few tens of minutes. That is, the two divided individuals remained correlated even though they were spatially separated. These results suggest the presence of a long-term memory in the intrinsic dynamics of this organism and its significance in performing sophisticated computing. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liansheng, Sui; Yin, Cheng; Bing, Li; Ailing, Tian; Krishna Asundi, Anand
2018-07-01
A novel computational ghost imaging scheme based on specially designed phase-only masks, which can be efficiently applied to encrypt an original image into a series of measured intensities, is proposed in this paper. First, a Hadamard matrix with a certain order is generated, where the number of elements in each row is equal to the size of the original image to be encrypted. Each row of the matrix is rearranged into the corresponding 2D pattern. Then, each pattern is encoded into the phase-only masks by making use of an iterative phase retrieval algorithm. These specially designed masks can be wholly or partially used in the process of computational ghost imaging to reconstruct the original information with high quality. When a significantly small number of phase-only masks are used to record the measured intensities in a single-pixel bucket detector, the information can be authenticated without clear visualization by calculating the nonlinear correlation map between the original image and its reconstruction. The results illustrate the feasibility and effectiveness of the proposed computational ghost imaging mechanism, which will provide an effective alternative for enriching the related research on the computational ghost imaging technique.
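A bare-bones simulation of the correlation step in computational ghost imaging is sketched below: Hadamard-derived illumination patterns, one bucket (single-pixel) intensity per pattern, and reconstruction as the intensity-weighted sum of patterns. The phase-retrieval step that encodes each pattern into a phase-only mask for encryption is omitted, and the image size is arbitrary.

# Correlation-based computational ghost imaging with Hadamard-derived patterns.
import numpy as np
from scipy.linalg import hadamard

n = 16                                   # image is n x n, so Hadamard order is n*n
H = hadamard(n * n)                      # rows give +/-1 patterns
patterns = (H.reshape(n * n, n, n) + 1) / 2.0   # rearrange each row into a 2D 0/1 pattern

rng = np.random.default_rng(0)
obj = rng.random((n, n))                 # stand-in for the original image
bucket = np.array([(p * obj).sum() for p in patterns])   # single-pixel intensities

# Correlation reconstruction: average of (B_i - <B>) * P_i over all patterns
recon = np.tensordot(bucket - bucket.mean(), patterns, axes=1) / len(bucket)
print(np.corrcoef(recon.ravel(), obj.ravel())[0, 1])      # close to 1 with a full basis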
Machine vision methods for use in grain variety discrimination and quality analysis
NASA Astrophysics Data System (ADS)
Winter, Philip W.; Sokhansanj, Shahab; Wood, Hugh C.
1996-12-01
Decreasing cost of computer technology has made it feasible to incorporate machine vision technology into the agriculture industry. The biggest attraction to using a machine vision system is the computer's ability to be completely consistent and objective. One use is in the variety discrimination and quality inspection of grains. Algorithms have been developed using Fourier descriptors and neural networks for use in variety discrimination of barley seeds. RGB and morphology features have been used in the quality analysis of lentils, and probability distribution functions and L,a,b color values for borage dockage testing. These methods have been shown to be very accurate and have a high potential for agriculture. This paper presents the techniques used and results obtained from projects including: a lentil quality discriminator, a barley variety classifier, a borage dockage tester, a popcorn quality analyzer, and a pistachio nut grading system.
Yang, Wen Jie; Zhang, Huan; Xiao, Hua; Li, Jian Ying; Liu, Yan; Pan, Zi Lai; Chen, Ke Min
2012-01-01
The evaluation of coronary stents by computed tomography (CT) remains difficult. We assessed the imaging performance of a high-definition CT scanner (HDCT) by comparing with a conventional 64-row standard-definition CT (SDCT). One hundred thirty-eight consecutive stented patients underwent coronary CT angiography, among whom 66 patients were examined by HDCT, and 72 patients by SDCT (LightSpeed VCT XT; GE Healthcare, Waukesha, Wis). The image quality score, the inner stent diameter (ISD), and the radiation dose were analyzed. All data were statistically tested by SPSS 13.0 software (SPSS Inc, Chicago, Ill). In 72 patients examined using SDCT, 135 stents were detected; in 66 patients examined using HDCT, 119 stents were detected. The image quality score on HDCT was significantly better than that on SDCT (1.4 [SD, 0.7] vs 1.9 [SD, 0.8]). The ISD on HDCT was significantly higher than that on SDCT (1.8 [SD, 0.5] vs 1.6 [SD, 0.4]). There was no significant difference of either image quality score or ISD between the HDCT and SDCT groups in stents with 2.5-mm diameter. Images on HDCT showed significantly better image quality score and larger ISD than images on SDCT in 2.75-, 3-, and 3.5-mm stents. For patients examined by retrospective electrocardiogram-gated technique, the radiation dose on HDCT was significantly lower than that on SDCT (11.3 [SD, 2.9] vs 15.1 [SD, 3.8] mSv). High-definition CT scanner offered improved image quality and measurement accuracy for imaging coronary stents compared with conventional SDCT, providing higher spatial resolution and lower dose for evaluating coronary stents with 2.75- to 3.5-mm diameter.
Emphasizing Planning for Essay Writing with a Computer-Based Graphic Organizer
ERIC Educational Resources Information Center
Evmenova, Anya S.; Regan, Kelley; Boykin, Andrea; Good, Kevin; Hughes, Melissa; MacVittie, Nichole; Sacco, Donna; Ahn, Soo Y.; Chirinos, David
2016-01-01
The authors conducted a multiple-baseline study to investigate the effects of a computer-based graphic organizer (CBGO) with embedded self-regulated learning strategies on the quantity and quality of persuasive essay writing by students with high-incidence disabilities. Ten seventh- and eighth-grade students with learning disabilities, emotional…
Lee, Casey J.; Murphy, Jennifer C.; Crawford, Charles G.; Deacon, Jeffrey R.
2017-10-24
The U.S. Geological Survey publishes information on concentrations and loads of water-quality constituents at 111 sites across the United States as part of the U.S. Geological Survey National Water Quality Network (NWQN). This report details historical and updated methods for computing water-quality loads at NWQN sites. The primary updates to historical load estimation methods include (1) an adaptation to methods for computing loads to the Gulf of Mexico; (2) the inclusion of loads computed using the Weighted Regressions on Time, Discharge, and Season (WRTDS) method; and (3) the inclusion of loads computed using continuous water-quality data. Loads computed using WRTDS and continuous water-quality data are provided along with those computed using historical methods. Various aspects of method updates are evaluated in this report to help users of water-quality loading data determine which estimation methods best suit their particular application.
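The quantity being estimated rests on the basic relation that an instantaneous constituent load is concentration times streamflow times a unit-conversion constant; the sketch below shows that relation in SI units with illustrative numbers, and does not implement the WRTDS regression itself.

# Instantaneous constituent load from concentration and streamflow.
def instantaneous_load_kg_per_day(concentration_mg_per_L, discharge_m3_per_s):
    # mg/L * m3/s = g/s; times 86400 s/day, divided by 1000 g/kg, gives the 86.4 factor
    return 86.4 * concentration_mg_per_L * discharge_m3_per_s

# Illustrative numbers only: 2 mg/L at 150 m3/s is 25,920 kg/day
print(instantaneous_load_kg_per_day(2.0, 150.0))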
User manual for two simple postscript output FORTRAN plotting routines
NASA Technical Reports Server (NTRS)
Nguyen, T. X.
1991-01-01
Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high-quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of a simple plotting routine is sufficient. With the Postscript language becoming popular, more and more Postscript laser printers are now available. Simple, versatile, low-cost plotting routines that can generate output on high-quality laser printers are needed, and standard FORTRAN plotting routines that produce output in the Postscript language are a logical choice. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the Postscript language.
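To make the idea concrete, the toy sketch below emits a minimal PostScript polyline directly from code, which is essentially what such plotting routines do (here in Python for illustration; the routines described above are FORTRAN 77, and the file name and coordinates are arbitrary).

# Write a simple polyline as device-independent PostScript.
def write_polyline_ps(filename, points, width_pt=1.0):
    """points: list of (x, y) in PostScript points (1/72 inch)."""
    with open(filename, "w") as f:
        f.write("%!PS-Adobe-3.0\n")
        f.write(f"{width_pt} setlinewidth\nnewpath\n")
        x0, y0 = points[0]
        f.write(f"{x0} {y0} moveto\n")
        for x, y in points[1:]:
            f.write(f"{x} {y} lineto\n")
        f.write("stroke\nshowpage\n")

write_polyline_ps("line.ps", [(72, 72), (288, 360), (504, 144)])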
Full 3-D OCT-based pseudophakic custom computer eye model
Sun, M.; Pérez-Merino, P.; Martinez-Enriquez, E.; Velasco-Ocana, M.; Marcos, S.
2016-01-01
We compared measured wave aberrations in pseudophakic eyes implanted with aspheric intraocular lenses (IOLs) with simulated aberrations from numerical ray tracing on customized computer eye models, built using quantitative 3-D OCT-based patient-specific ocular geometry. Experimental and simulated aberrations show high correlation (R = 0.93; p<0.0001) and similarity (RMS for high order aberrations discrepancies within 23.58%). This study shows that full OCT-based pseudophakic custom computer eye models allow understanding the relative contribution of optical geometrical and surgically-related factors to image quality, and are an excellent tool for characterizing and improving cataract surgery. PMID:27231608
NASA Astrophysics Data System (ADS)
Mickevicius, Nikolai J.; Paulson, Eric S.
2017-04-01
The purpose of this work is to investigate the effects of undersampling and reconstruction algorithm on the total processing time and image quality of respiratory phase-resolved 4D MRI data. Specifically, the goal is to obtain quality 4D-MRI data with a combined acquisition and reconstruction time of five minutes or less, which we reasoned would be satisfactory for pre-treatment 4D-MRI in online MRI-gRT. A 3D stack-of-stars, self-navigated, 4D-MRI acquisition was used to scan three healthy volunteers at three image resolutions and two scan durations. The NUFFT, CG-SENSE, SPIRiT, and XD-GRASP reconstruction algorithms were used to reconstruct each dataset on a high-performance reconstruction computer. The overall image quality, reconstruction time, artifact prevalence, and motion estimates were compared. The CG-SENSE and XD-GRASP reconstructions provided superior image quality over the other algorithms. The combination of a 3D SoS sequence and parallelized reconstruction algorithms, using computing hardware more advanced than that typically seen on product MRI scanners, can result in acquisition and reconstruction of high-quality respiratory-correlated 4D-MRI images in less than five minutes.
A self-synchronized high speed computational ghost imaging system: A leap towards dynamic capturing
NASA Astrophysics Data System (ADS)
Suo, Jinli; Bian, Liheng; Xiao, Yudong; Wang, Yongjin; Zhang, Lei; Dai, Qionghai
2015-11-01
High-quality computational ghost imaging needs to acquire a large number of correlated measurements between the to-be-imaged scene and different reference patterns, so ultra-high-speed data acquisition is of crucial importance in real applications. To raise the acquisition efficiency, this paper reports a high-speed computational ghost imaging system using a 20 kHz spatial light modulator together with a 2 MHz photodiode. Technically, the synchronization between such high-frequency illumination and the bucket detector needs nanosecond trigger precision, so the development of the synchronization module is quite challenging. To handle this problem, we propose a simple and effective computational self-synchronization scheme by building a general mathematical model and introducing a high-precision synchronization technique. The resulting acquisition is around 14 times faster than the state of the art, and takes an important step towards ghost imaging of dynamic scenes. Besides, the proposed scheme is a general approach with high flexibility for readily incorporating other illuminators and detectors.
NASA Astrophysics Data System (ADS)
Ota, Junko; Umehara, Kensuke; Ishimaru, Naoki; Ohno, Shunsuke; Okamoto, Kentaro; Suzuki, Takanori; Shirai, Naoki; Ishida, Takayuki
2017-02-01
As the capability of high-resolution displays grows, high-resolution images are often required in Computed Tomography (CT). However, acquiring high-resolution images takes a higher radiation dose and a longer scanning time. In this study, we applied the Sparse-coding-based Super-Resolution (ScSR) method to generate high-resolution images without increasing the radiation dose. We prepared an over-complete dictionary that learned the mapping between low- and high-resolution patches and sought a sparse representation of each patch of the low-resolution input. These coefficients were used to generate the high-resolution output. For evaluation, 44 CT cases were used as the test dataset. We up-sampled images by a factor of 2 or 4 and compared the image quality of the ScSR scheme and bilinear and bicubic interpolations, which are the traditional interpolation schemes. We also compared the image quality obtained with three learning datasets: a total of 45 CT images, 91 non-medical images, and 93 chest radiographs were used for dictionary preparation, respectively. The image quality was evaluated by measuring peak signal-to-noise ratio (PSNR) and structural similarity (SSIM). The differences in PSNRs and SSIMs between the ScSR method and the interpolation methods were statistically significant. Visual assessment confirmed that the ScSR method generated sharp high-resolution images, whereas conventional interpolation methods generated over-smoothed images. Comparing the three training datasets, there was no significant difference among the CT, CXR, and non-medical datasets. These results suggest that ScSR provides a robust approach for up-sampling CT images and yields substantially higher image quality for the enlarged CT images.
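The two metrics named above can be computed with scikit-image as in the hedged example below, where the reference and test arrays are random placeholders standing in for a ground-truth image and its up-sampled counterpart.

# PSNR and SSIM between a reference image and a test image (placeholder data).
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

rng = np.random.default_rng(0)
reference = rng.random((256, 256))                            # stand-in ground truth
test = reference + 0.05 * rng.normal(size=reference.shape)    # stand-in up-sampled result

psnr = peak_signal_noise_ratio(reference, test, data_range=1.0)
ssim = structural_similarity(reference, test, data_range=1.0)
print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.3f}")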
NASA Technical Reports Server (NTRS)
Schulbach, Catherine H. (Editor)
2000-01-01
The purpose of the CAS workshop is to bring together NASA's scientists and engineers and their counterparts in industry, other government agencies, and academia working in the Computational Aerosciences and related fields. This workshop is part of the technology transfer plan of the NASA High Performance Computing and Communications (HPCC) Program. Specific objectives of the CAS workshop are to: (1) communicate the goals and objectives of HPCC and CAS, (2) promote and disseminate CAS technology within the appropriate technical communities, including NASA, industry, academia, and other government labs, (3) help promote synergy among CAS and other HPCC scientists, and (4) permit feedback from peer researchers on issues facing High Performance Computing in general and the CAS project in particular. This year we had a number of exciting presentations in the traditional aeronautics, aerospace sciences, and high-end computing areas and in the less familiar (to many of us affiliated with CAS) earth science, space science, and revolutionary computing areas. Presentations of more than 40 high quality papers were organized into ten sessions and presented over the three-day workshop. The proceedings are organized here for easy access: by author, title and topic.
Guidelines for Calibration and Application of Storm.
1977-12-01
combination method uses the SCS method on pervious areas and the coefficient method on impervious areas of the watershed. Storm water quality is computed...stations, it should be accomplished according to procedures outlined in Reference 7. Adequate storm water quality data are the most difficult and costly...mass discharge of pollutants is negligible. The state-of-the-art in urban storm water quality modeling precludes highly accurate simulation of
Application of machine vision to pup loaf bread evaluation
NASA Astrophysics Data System (ADS)
Zayas, Inna Y.; Chung, O. K.
1996-12-01
Intrinsic end-use quality of hard winter wheat breeding lines is routinely evaluated at the USDA, ARS, USGMRL, Hard Winter Wheat Quality Laboratory. Experimental baking test of pup loaves is the ultimate test for evaluating hard wheat quality. Computer vision was applied to developing an objective methodology for bread quality evaluation for the 1994 and 1995 crop wheat breeding line samples. Computer extracted features for bread crumb grain were studied, using subimages (32 by 32 pixel) and features computed for the slices with different threshold settings. A subsampling grid was located with respect to the axis of symmetry of a slice to provide identical topological subimage information. Different ranking techniques were applied to the databases. Statistical analysis was run on the database with digital image and breadmaking features. Several ranking algorithms and data visualization techniques were employed to create a sensitive scale for porosity patterns of bread crumb. There were significant linear correlations between machine vision extracted features and breadmaking parameters. Crumb grain scores by human experts were correlated more highly with some image features than with breadmaking parameters.
Convolutional Sparse Coding for RGB+NIR Imaging.
Hu, Xuemei; Heide, Felix; Dai, Qionghai; Wetzstein, Gordon
2018-04-01
Emerging sensor designs increasingly rely on novel color filter arrays (CFAs) to sample the incident spectrum in unconventional ways. In particular, capturing a near-infrared (NIR) channel along with conventional RGB color is an exciting new imaging modality. RGB+NIR sensing has broad applications in computational photography, such as low-light denoising, it has applications in computer vision, such as facial recognition and tracking, and it paves the way toward low-cost single-sensor RGB and depth imaging using structured illumination. However, cost-effective commercial CFAs suffer from severe spectral cross talk. This cross talk represents a major challenge in high-quality RGB+NIR imaging, rendering existing spatially multiplexed sensor designs impractical. In this work, we introduce a new approach to RGB+NIR image reconstruction using learned convolutional sparse priors. We demonstrate high-quality color and NIR imaging for challenging scenes, even including high-frequency structured NIR illumination. The effectiveness of the proposed method is validated on a large data set of experimental captures, and simulated benchmark results which demonstrate that this work achieves unprecedented reconstruction quality.
SCEAPI: A unified Restful Web API for High-Performance Computing
NASA Astrophysics Data System (ADS)
Rongqiang, Cao; Haili, Xiao; Shasha, Lu; Yining, Zhao; Xiaoning, Wang; Xuebin, Chi
2017-10-01
The development of scientific computing is increasingly moving to collaborative web and mobile applications. All these applications need high-quality programming interfaces for accessing heterogeneous computing resources consisting of clusters, grid computing or cloud computing. In this paper, we introduce our high-performance computing environment that integrates computing resources from 16 HPC centers across China. Then we present a bundle of web services called SCEAPI and describe how it can be used to access HPC resources with HTTP or HTTPS protocols. We discuss SCEAPI from several aspects including architecture, implementation and security, and address specific challenges in designing compatible interfaces and protecting sensitive data. We describe the functions of SCEAPI, including authentication, file transfer and job management (creating, submitting and monitoring jobs), and how SCEAPI can be used in an easy way. Finally, we discuss how to exploit more HPC resources quickly for the ATLAS experiment by implementing a custom ARC compute element based on SCEAPI, and our work shows that SCEAPI is an easy-to-use and effective solution for extending opportunistic HPC resources.
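The usage pattern of such a RESTful HPC API (authenticate, submit a job, poll its status) can be sketched as below; the base URL, endpoint paths, and JSON fields are hypothetical placeholders and are not the documented SCEAPI interface.

# Generic REST-over-HTTPS job submission pattern with placeholder endpoints.
import time
import requests

BASE = "https://hpc.example.org/api"          # placeholder base URL, not SCEAPI's

token = requests.post(f"{BASE}/auth/tokens",
                      json={"user": "alice", "password": "secret"}).json()["token"]
headers = {"Authorization": f"Bearer {token}"}

job = requests.post(f"{BASE}/jobs", headers=headers, json={
    "executable": "simulation.sh",            # hypothetical job description fields
    "cores": 128,
    "walltime": "02:00:00",
}).json()

while True:
    state = requests.get(f"{BASE}/jobs/{job['id']}", headers=headers).json()["state"]
    if state in ("FINISHED", "FAILED"):
        break
    time.sleep(30)
print("job ended with state:", state)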
ERIC Educational Resources Information Center
McColskey, Wendy; Parke, Helen; Furtak, Erin; Butler, Susan
This article addresses what was learned through the National Computational Science Leadership Program about involving teachers in planning high quality units of instruction around computational science investigations. Two cohorts of roughly 25 teacher teams nationwide were given opportunities to develop "replacement units." The goal was to support…
NASA Technical Reports Server (NTRS)
Lawson, Charles L.; Krogh, Fred; Van Snyder, W.; Oken, Carol A.; Mccreary, Faith A.; Lieske, Jay H.; Perrine, Jack; Coffin, Ralph S.; Wayne, Warren J.
1994-01-01
MATH77 is a high-quality library of ANSI FORTRAN 77 subprograms implementing contemporary algorithms for basic computational processes of science and engineering. Release 4.0 of MATH77 contains 454 user-callable and 136 lower-level subprograms. The MATH77 release 4.0 subroutine library is designed to be usable on any computer system supporting the full ANSI standard FORTRAN 77 language.
ERIC Educational Resources Information Center
Ranney, John D.; Troop-Gordon, Wendy
2012-01-01
Because of recent technological innovations, college freshmen can readily communicate with friends who they see infrequently (e.g., friends from home). The current study addressed whether computer-mediated communication with these distant friends can compensate for a lack of high-quality on-campus friendships during students' first semester of…
Some Problems of Computer-Aided Testing and "Interview-Like Tests"
ERIC Educational Resources Information Center
Smoline, D.V.
2008-01-01
Computer-based testing is an effective tool for teachers, intended to optimize course goals and assessment techniques in a comparatively short time. However, this is accomplished only if we deal with high-quality tests. It is strange, but despite the 100-year history of Testing Theory (see Anastasi, A., Urbina, S. (1997). Psychological testing.…
ERIC Educational Resources Information Center
Hahn, H. A.; And Others
The purposes of this research were to evaluate the cost effectiveness of using Asynchronous Computer Conferencing (ACC) and to develop guidelines for effectively conducting high quality military training using ACC. The evaluation used a portion of the Engineer Officer Advanced Course (EOAC) as a test bed. Course materials which taught the same…
A La Carts: You Want Wireless Mobility? Have a COW
ERIC Educational Resources Information Center
Villano, Matt
2006-01-01
Computers on wheels, or COWs, combine the wireless technology of today with the audio/visual carts of yesteryear for an entirely new spin on mobility. Increasingly used by districts with laptop computing initiatives, COWs are among the hottest high-tech sellers in schools today, according to market research firm Quality Education Data. In this…
The Effect of Experimental Variables on Industrial X-Ray Micro-Computed Tomography Sensitivity
NASA Technical Reports Server (NTRS)
Roth, Don J.; Rauser, Richard W.
2014-01-01
A study was performed on the effect of experimental variables on radiographic sensitivity (image quality) in x-ray micro-computed tomography images for a high density thin wall metallic cylinder containing micro-EDM holes. Image quality was evaluated in terms of signal-to-noise ratio, flaw detectability, and feature sharpness. The variables included: day-to-day reproducibility, current, integration time, voltage, filtering, number of frame averages, number of projection views, beam width, effective object radius, binning, orientation of sample, acquisition angle range (180° to 360°), and directional versus transmission tube.
Real-time control system for adaptive resonator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flath, L; An, J; Brase, J
2000-07-24
Sustained operation of high average power solid-state lasers currently requires an adaptive resonator to produce the optimal beam quality. We describe the architecture of a real-time adaptive control system for correcting intra-cavity aberrations in a heat capacity laser. Image data collected from a wavefront sensor are processed and used to control phase with a high-spatial-resolution deformable mirror. Our controller takes advantage of recent developments in low-cost, high-performance processor technology. A desktop-based computational engine and object-oriented software architecture replace the high-cost rack-mount embedded computers of previous systems.
High performance transcription factor-DNA docking with GPU computing
2012-01-01
Background Protein-DNA docking is a very challenging problem in structural bioinformatics and has important implications in a number of applications, such as structure-based prediction of transcription factor binding sites and rational drug design. Protein-DNA docking is very computationally demanding due to the high cost of energy calculation and the statistical nature of conformational sampling algorithms. More importantly, experiments show that the docking quality depends on the coverage of the conformational sampling space. It is therefore desirable to accelerate the computation of the docking algorithm, not only to reduce computing time, but also to improve docking quality. Methods In an attempt to accelerate the sampling process and to improve the docking performance, we developed a graphics processing unit (GPU)-based protein-DNA docking algorithm. The algorithm employs a potential-based energy function to describe the binding affinity of a protein-DNA pair, and integrates Monte-Carlo simulation and a simulated annealing method to search through the conformational space. Algorithmic techniques were developed to improve the computation efficiency and scalability on GPU-based high performance computing systems. Results The effectiveness of our approach is tested on a non-redundant set of 75 TF-DNA complexes and a newly developed TF-DNA docking benchmark. We demonstrated that the GPU-based docking algorithm can significantly accelerate the simulation process and thereby improve the chance of finding near-native TF-DNA complex structures. This study also suggests that further improvement in protein-DNA docking research would require efforts from two integral aspects: improvement in computation efficiency and energy function design. Conclusions We present a high performance computing approach for improving the prediction accuracy of protein-DNA docking. The GPU-based docking algorithm accelerates the search of the conformational space and thus increases the chance of finding more near-native structures. To the best of our knowledge, this is the first ad hoc effort of applying GPUs or GPU clusters to the protein-DNA docking problem. PMID:22759575
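The conformational search the authors describe combines Monte-Carlo sampling with simulated annealing; a heavily simplified, CPU-only sketch of that loop is given below, with a placeholder energy function and move generator standing in for the potential-based scoring and rigid-body perturbations of the paper, and without the GPU parallelisation.

```python
import math
import random

def simulated_annealing(initial_pose, energy, perturb,
                        t_start=10.0, t_end=0.01, cooling=0.95, steps_per_t=200):
    """Metropolis Monte-Carlo search with geometric cooling.

    `energy(pose)` scores a candidate pose (placeholder for a potential-based
    scoring function); `perturb(pose)` proposes a nearby pose.
    """
    pose, e = initial_pose, energy(initial_pose)
    best_pose, best_e = pose, e
    t = t_start
    while t > t_end:
        for _ in range(steps_per_t):
            cand = perturb(pose)
            e_cand = energy(cand)
            # Metropolis criterion: always accept downhill moves, accept uphill
            # moves with probability exp(-dE / T).
            if e_cand < e or random.random() < math.exp(-(e_cand - e) / t):
                pose, e = cand, e_cand
                if e < best_e:
                    best_pose, best_e = pose, e
        t *= cooling  # geometric annealing schedule
    return best_pose, best_e
```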
Development of Moire machine vision
NASA Technical Reports Server (NTRS)
Harding, Kevin G.
1987-01-01
Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation is developed to be able to make full field range measurement and three dimensional scene analysis.
NASA Astrophysics Data System (ADS)
Hachaj, Tomasz; Ogiela, Marek R.
2012-10-01
The proposed framework for cognitive analysis of perfusion computed tomography images is a fusion of image processing, pattern recognition, and image analysis procedures. The output data of the algorithm consist of: regions of perfusion abnormalities, anatomy atlas descriptions of brain tissues, measures of perfusion parameters, and prognosis for infarcted tissues. That information is superimposed onto volumetric computed tomography data and displayed to radiologists. Our rendering algorithm enables rendering large volumes on off-the-shelf hardware. This portability of the rendering solution is very important because our framework can be run without expensive dedicated hardware. The other important factors are the theoretically unlimited size of the rendered volume and the possibility of trading off image quality for rendering speed. Such high-quality visualizations may be further used for intelligent brain perfusion abnormality identification and computer-aided diagnosis of selected types of pathologies.
[Computer aided design and manufacture of the porcelain fused to metal crown].
Nie, Xin; Cheng, Xiaosheng; Dai, Ning; Yu, Qing; Hao, Guodong; Sun, Quanping
2009-04-01
In order to satisfy the current demand for fast and high-quality prosthodontics, we carried out research on the fabrication process of the porcelain-fused-to-metal crown on a molar with CAD/CAM technology. First, we acquire the surface mesh data of the prepared tooth through a 3D optical grating measuring system. Then, we reconstruct the 3D crown model with computer-aided design software developed in-house. Finally, with the 3D model data, we produce a metallic crown on a high-speed CNC carving machine. The results have proved that the metallic crown matches the prepared tooth ideally. The fabrication process is reliable and efficient, and the restoration is precise and stable in quality.
1990-02-01
Tobias B. Orloff: Work began on developing a high-quality rendering algorithm based on the radiosity method. The algorithm is similar to previous progressive radiosity algorithms except for the following improvements: 1. At each iteration, vertex radiosities are computed using a modified scan-line approach, thus eliminating the quadratic cost associated with a ray-tracing computation of vertex radiosities. 2. At each iteration the scene is…
Visual ergonomic aspects of glare on computer displays: glossy screens and angular dependence
NASA Astrophysics Data System (ADS)
Brunnström, Kjell; Andrén, Börje; Konstantinides, Zacharias; Nordström, Lukas
2007-02-01
Recently, flat panel computer displays and notebook computers designed with a so-called glare panel, i.e. a highly glossy screen, have emerged on the market. The shiny look of the display appeals to customers, and there are arguments that contrast, colour saturation, etc. improve with a glare panel. LCD displays often suffer from angle-dependent picture quality. This has become even more pronounced with the introduction of prism light guide plates into displays for notebook computers. The TCO label is the leading labelling system for computer displays. Currently about 50% of all computer displays on the market are certified according to the TCO requirements. The requirements are periodically updated to keep up with technical development and the latest research in, e.g., visual ergonomics. The gloss level of the screen and the angular dependence have recently been investigated in user studies. A study of the effect of highly glossy screens compared to matt screens has been performed. The results show a slight advantage for the glossy screen when no disturbing reflections are present; however, the difference was not statistically significant. When disturbing reflections are present, the advantage turns into a larger disadvantage, and this difference is statistically significant. Another study, of angular dependence, has also been performed. The results indicate a linear relationship between picture quality and the centre luminance of the screen.
Design Aids for Real-Time Systems (DARTS)
NASA Technical Reports Server (NTRS)
Szulewski, P. A.
1982-01-01
Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree structured graphics, military standard documentation support, and various analyses including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback which can potentially aid in producing efficient, high quality software systems.
White, Pam; Roudsari, Abdul
2014-01-01
In the United Kingdom's National Health Service, quality indicators are generally measured electronically by using queries and data extraction, resulting in overlap and duplication of query components. Electronic measurement of health care quality indicators could be improved through an ontology intended to reduce duplication of effort during healthcare quality monitoring. While much research has been published on ontologies for computer-interpretable guidelines, quality indicators have lagged behind. We aimed to determine progress on the use of ontologies to facilitate computer-interpretable healthcare quality indicators. We assessed potential for improvements to computer-interpretable healthcare quality indicators in England. We concluded that an ontology for a large, diverse set of healthcare quality indicators could benefit the NHS and reduce workload, with potential lessons for other countries.
Establishing a Cloud Computing Success Model for Hospitals in Taiwan
Lian, Jiunn-Woei
2017-01-01
The purpose of this study is to understand the critical quality-related factors that affect cloud computing success of hospitals in Taiwan. In this study, private cloud computing is the major research target. The chief information officers participated in a questionnaire survey. The results indicate that the integration of trust into the information systems success model will have acceptable explanatory power to understand cloud computing success in the hospital. Moreover, information quality and system quality directly affect cloud computing satisfaction, whereas service quality indirectly affects the satisfaction through trust. In other words, trust serves as the mediator between service quality and satisfaction. This cloud computing success model will help hospitals evaluate or achieve success after adopting private cloud computing health care services. PMID:28112020
The State of Software for Evolutionary Biology.
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-05-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software packages. All widely used core tools in the field have grown considerably in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal and science policy as well as, more importantly, funding issues that need to be addressed for improving software engineering quality and ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.
Design of k-Space Channel Combination Kernels and Integration with Parallel Imaging
Beatty, Philip J.; Chang, Shaorong; Holmes, James H.; Wang, Kang; Brau, Anja C. S.; Reeder, Scott B.; Brittain, Jean H.
2014-01-01
Purpose In this work, a new method is described for producing local k-space channel combination kernels using a small amount of low-resolution multichannel calibration data. Additionally, this work describes how these channel combination kernels can be combined with local k-space unaliasing kernels produced by the calibration phase of parallel imaging methods such as GRAPPA, PARS and ARC. Methods Experiments were conducted to evaluate both the image quality and computational efficiency of the proposed method compared to a channel-by-channel parallel imaging approach with image-space sum-of-squares channel combination. Results Results indicate comparable image quality overall, with some very minor differences seen in reduced field-of-view imaging. It was demonstrated that this method enables a speed up in computation time on the order of 3–16X for 32-channel data sets. Conclusion The proposed method enables high quality channel combination to occur earlier in the reconstruction pipeline, reducing computational and memory requirements for image reconstruction. PMID:23943602
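For reference, the conventional image-space sum-of-squares combination that the authors compare against can be sketched in a few lines; the numpy-based function below assumes fully sampled Cartesian multichannel k-space and illustrates only that baseline, not the proposed k-space kernel method.

```python
import numpy as np

def sos_combine(kspace):
    """Root-sum-of-squares channel combination.

    `kspace` is a complex array of shape (num_channels, ny, nx) on a fully
    sampled Cartesian grid; returns a single magnitude image.
    """
    # Per-channel inverse FFT from k-space to image space.
    imgs = np.fft.fftshift(
        np.fft.ifft2(np.fft.ifftshift(kspace, axes=(-2, -1)), axes=(-2, -1)),
        axes=(-2, -1))
    # Combine channels by the root sum of squared magnitudes.
    return np.sqrt(np.sum(np.abs(imgs) ** 2, axis=0))
```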
New seismogenic stress fields for southern Italy from a Bayesian approach
NASA Astrophysics Data System (ADS)
Totaro, Cristina; Orecchio, Barbara; Presti, Debora; Scolaro, Silvia; Neri, Giancarlo
2017-04-01
A new database of high-quality waveform inversion focal mechanisms has been compiled for southern Italy by integrating the highest-quality solutions available from literature and catalogues with 146 newly computed ones. All the selected focal mechanisms are (i) taken from the Italian CMT, Regional CMT and TDMT catalogues (Pondrelli et al., PEPI 2006, PEPI 2011; http://www.ingv.it), or (ii) computed by using the Cut And Paste (CAP) method (Zhao & Helmberger, BSSA 1994; Zhu & Helmberger, BSSA 1996). Specific tests have been carried out in order to evaluate the robustness of the obtained solutions (e.g., by varying both the seismic network configuration and the Earth structure parameters) and to estimate uncertainties in the focal mechanism parameters. Only the resulting highest-quality solutions have been included in the database, which has then been used for computation of posterior density distributions of stress tensor components by a Bayesian method (Arnold & Townend, GJI 2007). This algorithm furnishes the posterior density function of the principal components of the stress tensor (maximum σ1, intermediate σ2, and minimum σ3 compressive stress, respectively) and the stress-magnitude ratio (R). Before stress computation, we applied the k-means clustering algorithm to subdivide the focal mechanism catalogue on the basis of earthquake locations. This approach allows identifying the sectors to be investigated without any "a priori" constraint from the faulting-type distribution. The large amount of data and the application of the Bayesian algorithm allowed us to provide a more accurate local-to-regional scale stress distribution that has shed new light on the kinematics and dynamics of this very complex area, where lithospheric unit configuration and geodynamic engines are still strongly debated. The new high-quality information furnished here will represent a very useful tool and constraint for future geophysical analyses and geodynamic modeling.
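The clustering step described above (grouping focal mechanisms by hypocentre location before stress inversion) can be illustrated with a standard k-means call; the scikit-learn usage, the crude depth scaling and the choice of four clusters below are assumptions for the sketch, not the authors' settings.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_focal_mechanisms(lon, lat, depth_km, n_clusters=4):
    """Group focal mechanisms into spatial clusters prior to stress inversion.

    Inputs are 1-D arrays of hypocentre coordinates; returns one cluster label
    per event. The number of clusters and the depth scaling are illustrative.
    """
    X = np.column_stack([lon, lat, depth_km / 100.0])  # rough rescaling of depth
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
```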
NASA Astrophysics Data System (ADS)
Kokubun, Y.; Washizuka, S.; Ushizawa, J.; Watanabe, M.; Fukuda, T.
1982-11-01
The properties of GaP single crystals grown by an automatically diameter-controlled liquid encapsulated Czochralski technique using a computer have been studied. A dislocation density of less than 5×10⁴ cm⁻² has been observed for crystals grown in a temperature gradient lower than 70 °C/cm near the solid-liquid interface. The crystals have about 10% higher electron mobility than commercially available coracle-controlled crystals and have compensation ratios of 0.2–0.5. Yellow light-emitting diodes using computer-controlled (100) substrates have shown an extremely high external quantum efficiency of 0.3%.
What Does Quality Programming Mean for High Achieving Students?
ERIC Educational Resources Information Center
Samudzi, Cleo
2008-01-01
The Missouri Academy of Science, Mathematics and Computing (Missouri Academy) is a two-year accelerated, early-entrance-to-college, residential school that matches the level, complexity and pace of the curriculum with the readiness and motivation of high achieving high school students. The school is a part of Northwest Missouri State University…
Gradient Magnitude Similarity Deviation: A Highly Efficient Perceptual Image Quality Index.
Xue, Wufeng; Zhang, Lei; Mou, Xuanqin; Bovik, Alan C
2014-02-01
It is an important task to faithfully evaluate the perceptual quality of output images in many applications, such as image compression, image restoration, and multimedia streaming. A good image quality assessment (IQA) model should not only deliver high prediction accuracy, but also be computationally efficient. The efficiency of IQA metrics is becoming particularly important due to the increasing proliferation of high-volume visual data in high-speed networks. We present a new effective and efficient IQA model, called gradient magnitude similarity deviation (GMSD). Image gradients are sensitive to image distortions, while different local structures in a distorted image suffer different degrees of degradation. This motivates us to explore the use of the global variation of a gradient-based local quality map for overall image quality prediction. We find that the pixel-wise gradient magnitude similarity (GMS) between the reference and distorted images, combined with a novel pooling strategy (the standard deviation of the GMS map), can accurately predict perceptual image quality. The resulting GMSD algorithm is much faster than most state-of-the-art IQA methods, and delivers highly competitive prediction accuracy. MATLAB source code of GMSD can be downloaded at http://www4.comp.polyu.edu.hk/~cslzhang/IQA/GMSD/GMSD.htm.
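A minimal numpy/scipy sketch of the GMSD computation as described in the abstract follows: Prewitt gradient magnitudes, a pixel-wise gradient magnitude similarity map, and standard-deviation pooling. The stabilising constant and the omission of the usual pre-downsampling are simplifications; the authors' MATLAB code linked above is the reference implementation.

```python
import numpy as np
from scipy import ndimage

def gmsd(ref, dist, c=170.0):
    """Gradient Magnitude Similarity Deviation (higher = worse quality).

    `ref` and `dist` are greyscale images in [0, 255]; `c` is a stabilising
    constant whose value here is an illustrative choice.
    """
    def grad_mag(img):
        gx = ndimage.prewitt(img.astype(float), axis=0)
        gy = ndimage.prewitt(img.astype(float), axis=1)
        return np.hypot(gx, gy)

    gr, gd = grad_mag(ref), grad_mag(dist)
    gms = (2.0 * gr * gd + c) / (gr ** 2 + gd ** 2 + c)  # pixel-wise similarity map
    return gms.std()  # standard-deviation pooling
```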
ERIC Educational Resources Information Center
Pierson, Susan Jacques
2015-01-01
One way to provide high quality instruction for underserved English Language Learners around the world is to combine Task-Based English Language Learning with Computer- Assisted Instruction. As part of an ongoing project, "Bridges to Swaziland," these approaches have been implemented in a determined effort to improve the ESL program for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habte, A.; Sengupta, M.; Wilcox, S.
This report was part of a multiyear collaboration with the University of Wisconsin and the National Oceanic and Atmospheric Administration (NOAA) to produce high-quality, satellite-based, solar resource datasets for the United States. High-quality solar resource assessment accelerates technology deployment by making a positive impact on decision making and reducing uncertainty in investment decisions. Satellite-based solar resource datasets are used as a primary source in solar resource assessment. This is mainly because satellites provide larger areal coverage and longer periods of record than ground-based measurements. With the advent of newer satellites with increased information content and faster computers that can process increasingly higher data volumes, methods that were considered too computationally intensive are now feasible. One class of sophisticated methods for retrieving solar resource information from satellites is a two-step, physics-based method that computes cloud properties and uses the information in a radiative transfer model to compute solar radiation. This method has the advantage of adding additional information as satellites with newer channels come on board. This report evaluates the two-step method developed at NOAA and adapted for solar resource assessment for renewable energy with the goal of identifying areas that can be improved in the future.
Numerical Boundary Conditions for Computational Aeroacoustics Benchmark Problems
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.; Kurbatskii, Konstantin A.; Fang, Jun
1997-01-01
Category 1, Problems 1 and 2, Category 2, Problem 2, and Category 3, Problem 2 are solved computationally using the Dispersion-Relation-Preserving (DRP) scheme. All these problems are governed by the linearized Euler equations. The resolution requirements of the DRP scheme for maintaining low numerical dispersion and dissipation as well as accurate wave speeds in solving the linearized Euler equations are now well understood. As long as 8 or more mesh points per wavelength are employed in the numerical computation, high-quality results are assured. For the first three categories of benchmark problems, therefore, the real challenge is to develop high-quality numerical boundary conditions. For Category 1, Problems 1 and 2, it is the curved wall boundary conditions. For Category 2, Problem 2, it is the internal radiation boundary conditions inside the duct. For Category 3, Problem 2, they are the inflow and outflow boundary conditions upstream and downstream of the blade row. These are the foci of the present investigation. Special nonhomogeneous radiation boundary conditions that generate the incoming disturbances and at the same time allow the outgoing reflected or scattered acoustic disturbances to leave the computation domain without significant reflection are developed. Numerical results based on these boundary conditions are provided.
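The stated resolution rule (at least 8 mesh points per wavelength for the DRP scheme) translates directly into a maximum allowable grid spacing; the small helper below illustrates that check with example values, which are not taken from the paper.

```python
def max_grid_spacing(sound_speed, frequency_hz, points_per_wavelength=8):
    """Largest grid spacing that still resolves a wave with the recommended
    8 or more mesh points per wavelength."""
    wavelength = sound_speed / frequency_hz
    return wavelength / points_per_wavelength

# Example: a 1 kHz disturbance in air (c ~ 340 m/s) needs dx <= ~4.3 cm.
dx = max_grid_spacing(340.0, 1000.0)
```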
A Study of Complex Deep Learning Networks on High Performance, Neuromorphic, and Quantum Computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potok, Thomas E; Schuman, Catherine D; Young, Steven R
Current Deep Learning models use highly optimized convolutional neural networks (CNN) trained on large graphical processing units (GPU)-based computers with a fairly simple layered network topology, i.e., highly connected layers, without intra-layer connections. Complex topologies have been proposed, but are intractable to train on current systems. Building the topologies of the deep learning network requires hand tuning, and implementing the network in hardware is expensive in both cost and power. In this paper, we evaluate deep learning models using three different computing architectures to address these problems: quantum computing to train complex topologies, high performance computing (HPC) to automatically determine network topology, and neuromorphic computing for a low-power hardware implementation. Due to input size limitations of current quantum computers we use the MNIST dataset for our evaluation. The results show the possibility of using the three architectures in tandem to explore complex deep learning networks that are untrainable using a von Neumann architecture. We show that a quantum computer can find high quality values of intra-layer connections and weights, while yielding a tractable time result as the complexity of the network increases; a high performance computer can find optimal layer-based topologies; and a neuromorphic computer can represent the complex topology and weights derived from the other architectures in low power memristive hardware. This represents a new capability that is not feasible with current von Neumann architecture. It potentially enables the ability to solve very complicated problems unsolvable with current computing technologies.
Chiesura, Gabriele; Luyckx, Geert; Voet, Eli; Lammens, Nicolas; Van Paepegem, Wim; Degrieck, Joris; Dierick, Manuel; Van Hoorebeke, Luc; Vanderniepen, Pieter; Sulejmani, Sanne; Sonnenfeld, Camille; Geernaert, Thomas; Berghmans, Francis
2015-01-01
Quality of embedment of optical fibre sensors in carbon fibre-reinforced polymers plays an important role in the resultant properties of the composite, as well as for the correct monitoring of the structure. Therefore, the availability of a tool able to check the optical fibre sensor-composite interaction becomes essential. High-resolution 3D X-ray Micro-Computed Tomography, or Micro-CT, is a relatively new non-destructive inspection technique which enables investigation of the internal structure of a sample without actually compromising its integrity. In this work, the feasibility of inspecting the position, the orientation and, more generally, the quality of the embedment of an optical fibre sensor in a carbon fibre-reinforced laminate at the unit cell level has been proven. PMID:25961383
NASA Astrophysics Data System (ADS)
Chen, R.; Xi, X.; Zhao, X.; He, L.; Yao, H.; Shen, R.
2016-12-01
Dense 3D magnetotelluric (MT) data acquisition offers the benefit of suppressing static shift and topography effects and can achieve high-precision, high-resolution inversion of underground structure. This method may play an important role in mineral exploration, geothermal resources exploration, and hydrocarbon exploration. It is necessary to greatly reduce the power consumption of an MT signal receiver for large-scale 3D MT data acquisition, while using a sensor network to monitor the data quality of deployed MT receivers. We adopted a series of technologies to realize the above goal. First, we designed a low-power embedded computer that couples tightly with the other parts of the MT receiver and supports a wireless sensor network. The power consumption of our embedded computer is less than 1 watt. Then we designed a 4-channel data acquisition subsystem that supports 24-bit analog-to-digital conversion, GPS synchronization, and real-time digital signal processing. Furthermore, we developed the power supply and power management subsystem for the MT receiver. Finally, a series of software packages supporting data acquisition, calibration, the wireless sensor network, and testing were developed. The software, which runs on a personal computer, can monitor and control over 100 MT receivers in the field for data acquisition and quality control. The total power consumption of the receiver is about 2 watts in full operation, and the standby power consumption is less than 0.1 watt. Our testing showed that the MT receiver can acquire good-quality data on the ground with an electric dipole length of 3 m. Over 100 MT receivers were built and used for large-scale geothermal exploration in China with great success.
JPEG vs. JPEG 2000: an objective comparison of image encoding quality
NASA Astrophysics Data System (ADS)
Ebrahimi, Farzad; Chamik, Matthieu; Winkler, Stefan
2004-11-01
This paper describes an objective comparison of the image quality of different encoders. Our approach is based on estimating the visual impact of compression artifacts on perceived quality. We present a tool that measures these artifacts in an image and uses them to compute a prediction of the Mean Opinion Score (MOS) obtained in subjective experiments. We show that the MOS predictions by our proposed tool are a better indicator of perceived image quality than PSNR, especially for highly compressed images. For the encoder comparison, we compress a set of 29 test images with two JPEG encoders (Adobe Photoshop and IrfanView) and three JPEG2000 encoders (JasPer, Kakadu, and IrfanView) at various compression ratios. We compute blockiness, blur, and MOS predictions as well as PSNR of the compressed images. Our results show that the IrfanView JPEG encoder produces consistently better images than the Adobe Photoshop JPEG encoder at the same data rate. The differences between the JPEG2000 encoders in our test are less pronounced; JasPer comes out as the best codec, closely followed by IrfanView and Kakadu. Comparing the JPEG- and JPEG2000-encoding quality of IrfanView, we find that JPEG has a slight edge at low compression ratios, while JPEG2000 is the clear winner at medium and high compression ratios.
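Among the metrics used in the comparison, PSNR is straightforward to reproduce; the sketch below shows the standard definition for 8-bit images (the study's MOS-prediction tool itself is not reproduced here).

```python
import numpy as np

def psnr(reference, compressed, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two same-sized 8-bit images."""
    mse = np.mean((reference.astype(float) - compressed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_value ** 2 / mse)
```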
Hydrogen Surfactant Effect on ZnO/GaN Heterostructures Growth
NASA Astrophysics Data System (ADS)
Zhang, Jingzhao; Zhang, Yiou; Tse, Kinfai; Zhu, Junyi
To grow high-quality heterostructures based on ZnO and GaN, growth conditions that favor the layer-by-layer (Frank-Van der Merwe) growth mode have to be applied. However, if A wets B, B will not wet A without special treatment. A famous example is the epitaxial growth of the Si/Ge/Si heterostructure with the help of an arsenic surfactant in the late 1980s. It has been confirmed by previous experiments and our calculations that poor crystal quality and a 3D growth mode are obtained when GaN is grown on ZnO polar surfaces, while high-quality ZnO is achieved on (0001)- and (000-1)-oriented GaN. During standard OMVPE growth processes, hydrogen is a common impurity, and hydrogen-involved surface reconstructions have been well investigated experimentally and theoretically elsewhere. Given these facts, we propose key growth strategies that use hydrogen as a surfactant to achieve the ideal growth mode for GaN on the ZnO (000-1) surface. This novel strategy may for the first time make the growth of high-quality GaN single crystals on ZnO substrates possible. This surfactant effect is expected to largely improve the crystal quality and the efficiency of ZnO/GaN superlattices or other heterostructure devices. Part of the computing resources was provided by the High Performance Cluster Computing Centre, Hong Kong Baptist University. This work was supported by start-up funding and Direct Grants with project codes 4053134 and 3132748 at CUHK.
NASA Technical Reports Server (NTRS)
Lawrence, Stella
1992-01-01
This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.
NASA Technical Reports Server (NTRS)
Redhed, D. D.
1978-01-01
Three possible goals for the Numerical Aerodynamic Simulation Facility (NASF) are: (1) a computational fluid dynamics (as opposed to aerodynamics) algorithm development tool; (2) a specialized research laboratory facility for nearly intractable aerodynamics problems that industry encounters; and (3) a facility for industry to use in its normal aerodynamics design work that requires high computing rates. The central system issue for industry use of such a computer is the quality of the user interface as implemented in some kind of a front end to the vector processor.
Ajlan, Amr M; Binzaqr, Salma; Jadkarim, Dalia A; Jamjoom, Lamia G; Leipsic, Jonathon
2016-01-01
The purpose of this study was to compare qualitative and quantitative image parameters of dual-source high-pitch helical computed tomographic pulmonary angiography (CTPA) in breath-holding (BH) versus free-breathing (FB) patients. Ninety-nine consenting patients (61 female; mean age±SD, 49±18.7 y) were randomized into BH (n=45) versus FB (n=54) high-pitch helical CTPA. Patient characteristics and CTPA radiation doses were analyzed. Two readers assessed for pulmonary embolism (PE), transient interruption of contrast, and respiratory and cardiac motion. The readers used a subjective 3-point scale to rate the pulmonary artery opacification and lung parenchymal appearance. A single reader assessed mean pulmonary artery signal intensity, noise, contrast, signal-to-noise ratio, and contrast-to-noise ratio. PE was diagnosed in 16% of BH and 19% of FB patients. CTPAs of both groups were of excellent or acceptable quality for PE evaluation and had similar mean radiation doses (1.3 mSv). Transient interruption of contrast was seen in 5/45 (11%) BH and 5/54 (9%) FB patients (not statistically significant, P=0.54). No statistically significant difference was noted in cardiac, diaphragmatic, and lung parenchymal motion. Lung parenchymal assessment was excellent in all cases, except for 5/54 (9%) motion-affected FB cases with acceptable quality (statistically significant, P=0.03). No CTPA was considered nondiagnostic by any of the readers. No objective image quality differences were noted between the two groups (P>0.05). High-pitch helical CTPA acquired during BH or FB yields comparable image quality for the diagnosis of PE and lung pathology, with low radiation exposure. Only a modest increase in lung parenchymal artifacts is encountered in FB high-pitch helical CTPA.
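The quantitative measures listed (signal intensity, noise, contrast, SNR and CNR) are typically derived from regions of interest; the sketch below uses common ROI-based definitions and is an illustration only, not the study's exact measurement protocol.

```python
import numpy as np

def roi_quality_metrics(artery_roi, background_roi):
    """Common ROI-based image-quality measures for CT angiography.

    `artery_roi` holds pixel values from the pulmonary artery, `background_roi`
    from adjacent tissue (numpy arrays); definitions follow the usual SNR/CNR
    conventions and are illustrative, not the study's protocol.
    """
    signal = artery_roi.mean()
    noise = background_roi.std()          # image noise estimated from background SD
    contrast = signal - background_roi.mean()
    return {"signal": signal,
            "noise": noise,
            "contrast": contrast,
            "snr": signal / noise,
            "cnr": contrast / noise}
```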
Bringing Computational Thinking into the High School Science and Math Classroom
NASA Astrophysics Data System (ADS)
Trouille, Laura; Beheshti, E.; Horn, M.; Jona, K.; Kalogera, V.; Weintrop, D.; Wilensky, U.; University CT-STEM Project, Northwestern; University CenterTalent Development, Northwestern
2013-01-01
Computational thinking (for example, the thought processes involved in developing algorithmic solutions to problems that can then be automated for computation) has revolutionized the way we do science. The Next Generation Science Standards require that teachers support their students’ development of computational thinking and computational modeling skills. As a result, there is a very high demand among teachers for quality materials. Astronomy provides an abundance of opportunities to support student development of computational thinking skills. Our group has taken advantage of this to create a series of astronomy-based computational thinking lesson plans for use in typical physics, astronomy, and math high school classrooms. This project is funded by the NSF Computing Education for the 21st Century grant and is jointly led by Northwestern University’s Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA), the Computer Science department, the Learning Sciences department, and the Office of STEM Education Partnerships (OSEP). I will also briefly present the online ‘Astro Adventures’ courses for middle and high school students I have developed through NU’s Center for Talent Development. The online courses take advantage of many of the amazing online astronomy enrichment materials available to the public, including a range of hands-on activities and the ability to take images with the Global Telescope Network. The course culminates with an independent computational research project.
AnnotCompute: annotation-based exploration and meta-analysis of genomics experiments
Zheng, Jie; Stoyanovich, Julia; Manduchi, Elisabetta; Liu, Junmin; Stoeckert, Christian J.
2011-01-01
The ever-increasing scale of biological data sets, particularly those arising in the context of high-throughput technologies, requires the development of rich data exploration tools. In this article, we present AnnotCompute, an information discovery platform for repositories of functional genomics experiments such as ArrayExpress. Our system leverages semantic annotations of functional genomics experiments with controlled vocabulary and ontology terms, such as those from the MGED Ontology, to compute conceptual dissimilarities between pairs of experiments. These dissimilarities are then used to support two types of exploratory analysis—clustering and query-by-example. We show that our proposed dissimilarity measures correspond to a user's intuition about conceptual dissimilarity, and can be used to support effective query-by-example. We also evaluate the quality of clustering based on these measures. While AnnotCompute can support a richer data exploration experience, its effectiveness is limited in some cases, due to the quality of available annotations. Nonetheless, tools such as AnnotCompute may provide an incentive for richer annotations of experiments. Code is available for download at http://www.cbil.upenn.edu/downloads/AnnotCompute. Database URL: http://www.cbil.upenn.edu/annotCompute/ PMID:22190598
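As a toy illustration of annotation-based dissimilarity, a flat set-based (Jaccard) measure between two experiments' annotation term sets is sketched below; AnnotCompute's actual measure additionally exploits the structure of the controlled vocabularies and ontologies, so this is only a simplified stand-in.

```python
def annotation_dissimilarity(terms_a, terms_b):
    """Jaccard dissimilarity between two sets of annotation terms.

    A flat-set stand-in for an ontology-aware measure: 0 means identical
    annotation sets, 1 means no shared terms.
    """
    a, b = set(terms_a), set(terms_b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)
```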
Okuda, Kyohei; Sakimoto, Shota; Fujii, Susumu; Ida, Tomonobu; Moriyama, Shigeru
The frame-of-reference based on the computed tomography (CT) coordinate system in single-photon emission computed tomography (SPECT) reconstruction is one of the advanced characteristics of the xSPECT reconstruction system. The aim of this study was to reveal the influence of the high-resolution frame-of-reference on the xSPECT reconstruction. A 99mTc line-source phantom and a National Electrical Manufacturers Association (NEMA) image quality phantom were scanned using the SPECT/CT system. xSPECT reconstructions were performed with the reference CT images at different sizes of the display field-of-view (DFOV) and pixel. The pixel sizes of the reconstructed xSPECT images were close to 2.4 mm, as originally acquired in the projection data, even when the reference CT resolution was varied. The full width at half maximum (FWHM) of the line source, the absolute recovery coefficient, and the background variability of the image quality phantom were independent of the DFOV size of the reference CT images. The results of this study revealed that the image quality of the reconstructed xSPECT images is not influenced by the resolution of the frame-of-reference used for SPECT reconstruction.
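The FWHM of the line-source response can be estimated by locating the half-maximum crossings along a 1-D profile; the sketch below uses simple linear interpolation and assumes a single peak on a near-zero background, which may differ from the analysis software actually used.

```python
import numpy as np

def fwhm(profile, pixel_size_mm=1.0):
    """Full width at half maximum of a 1-D peak, by linear interpolation
    at the two half-maximum crossings (background assumed ~0)."""
    profile = np.asarray(profile, dtype=float)
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    left, right = above[0], above[-1]
    # Interpolate the exact crossing positions on each side of the peak.
    lx = left - (profile[left] - half) / (profile[left] - profile[left - 1])
    rx = right + (profile[right] - half) / (profile[right] - profile[right + 1])
    return (rx - lx) * pixel_size_mm
```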
Mainenti, Míriam Raquel Meira; Felicio, Lilian Ramiro; Rodrigues, Erika de Carvalho; Ribeiro da Silva, Dalila Terrinha; Vigário Dos Santos, Patrícia
2014-04-01
[Purpose] Complaint of pain is common in computer workers, encouraging the investigation of pain-related workplace factors. This study investigated the relationship among work-related characteristics, psychosocial factors, and pain among computer workers from a university center. [Subjects and Methods] Fifteen subjects (median age, 32.0 years; interquartile range, 26.8-34.5 years) were subjected to measurement of bioelectrical impedance; photogrammetry; workplace measurements; and pain complaint, quality of life, and motivation questionnaires. [Results] The low back was the most prevalent region of complaint (76.9%). The number of body regions for which subjects complained of pain was greater in the no rest breaks group, which also presented higher prevalences of neck (62.5%) and low back (100%) pain. Associations were also observed between neck complaint and quality of life; neck complaint and head protrusion; wrist complaint and shoulder angle; and use of a chair back and thoracic pain. [Conclusion] Complaint of pain was associated with no short rest breaks, no use of a chair back, poor quality of life, and high head protrusion and shoulder angle while using the computer mouse.
13 point video tape quality guidelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaunt, R.
1997-05-01
Until high-definition television (ATV) arrives, in the U.S. we must still contend with the National Television Systems Committee (NTSC) video standard (or PAL or SECAM, depending on your country). NTSC, a 40-year-old standard designed for transmission of color video camera images over a small bandwidth, is not well suited for the sharp, full-color images that today's computers are capable of producing. PAL and SECAM also suffer from many of NTSC's problems, but to varying degrees. Video professionals, when working with computer graphic (CG) images, use two monitors: a computer monitor for producing CGs and an NTSC monitor to view how a CG will look on video. More often than not, the NTSC image will differ significantly from the CG image, and outputting it to NTSC as an artist works enables him or her to see the image as others will see it. Below are thirteen guidelines designed to increase the quality of computer graphics recorded onto video tape. Viewing your work in NTSC and attempting to follow the tips below will enable you to create higher-quality videos. No video is perfect, so don't expect to abide by every guideline every time.
NASA Astrophysics Data System (ADS)
Olsson, O.
2018-01-01
We present a novel heuristic derived from a probabilistic cost model for approximate N-body simulations. We show that this new heuristic can be used to guide tree construction towards higher quality trees with improved performance over current N-body codes. This represents an important step beyond the current practice of using spatial partitioning for N-body simulations, and enables adoption of a range of state-of-the-art algorithms developed for computer graphics applications to yield further improvements in N-body simulation performance. We outline directions for further developments and review the most promising such algorithms.
An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, Randal Scott
CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation. They have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today’s important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.
Data Quality Control of the French Permanent Broadband Network in the RESIF Framework
NASA Astrophysics Data System (ADS)
Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain
2014-05-01
In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and the distribution of high-quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) in order to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin. This allows us to benefit from high-end quality control based on the national and world-wide seismicity. Here we first present the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists of applying a variety of subprocesses to check the consistency of the whole system and of the processing chain from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbances. The deployed quality control consists of a pipeline that starts with low-level procedures: checking the real-time miniSEED data files (file naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, time quality, spikes). It is followed by higher-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking, and station magnitude discrepancies. The results of quality control are visualized through a web interface. The latter gathers data from different information systems to provide a global view of recent events that could affect the data (such as on-site interventions or seismic events). This work is still an ongoing project. We intend to add more sophisticated procedures to enhance our data quality control. Among them, we will deploy a seismic moment tensor inversion tool for amplitude, time and polarity control, and a noise correlation procedure for time-drift detection.
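One of the pipeline's higher-level checks, the STA/LTA computation, can be sketched with a classic sliding-window ratio on the squared waveform; the window lengths and characteristic function below are illustrative choices, not necessarily those used at EOST.

```python
import numpy as np

def sta_lta(trace, sta_samples, lta_samples):
    """Classic STA/LTA ratio on a 1-D waveform.

    The characteristic function is the squared amplitude; window lengths are in
    samples, and the first lta_samples - 1 output values are left at zero.
    """
    cf = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(cf)))
    # Running means over short and long windows, each ending at the current sample.
    sta = (csum[sta_samples:] - csum[:-sta_samples]) / sta_samples
    lta = (csum[lta_samples:] - csum[:-lta_samples]) / lta_samples
    ratio = np.zeros(len(cf))
    ratio[lta_samples - 1:] = sta[lta_samples - sta_samples:] / np.maximum(lta, 1e-20)
    return ratio
```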
Draft versus finished sequence data for DNA and protein diagnostic signature development
Gardner, Shea N.; Lam, Marisa W.; Smith, Jason R.; Torres, Clinton L.; Slezak, Tom R.
2005-01-01
Sequencing pathogen genomes is costly, demanding careful allocation of limited sequencing resources. We built a computational Sequencing Analysis Pipeline (SAP) to guide decisions regarding the amount of genomic sequencing necessary to develop high-quality diagnostic DNA and protein signatures. SAP uses simulations to estimate the number of target genomes and close phylogenetic relatives (near neighbors or NNs) to sequence. We use SAP to assess whether draft data are sufficient or finished sequencing is required using Marburg and variola virus sequences. Simulations indicate that intermediate to high-quality draft with error rates of 10⁻³–10⁻⁵ (∼8× coverage) of target organisms is suitable for DNA signature prediction. Low-quality draft with error rates of ∼1% (3× to 6× coverage) of target isolates is inadequate for DNA signature prediction, although low-quality draft of NNs is sufficient, as long as the target genomes are of high quality. For protein signature prediction, sequencing errors in target genomes substantially reduce the detection of amino acid sequence conservation, even if the draft is of high quality. In summary, high-quality draft of target and low-quality draft of NNs appears to be a cost-effective investment for DNA signature prediction, but may lead to underestimation of predicted protein signatures. PMID:16243783
Henshaw, Helen; Ferguson, Melanie A.
2013-01-01
Background Auditory training involves active listening to auditory stimuli and aims to improve performance in auditory tasks. As such, auditory training is a potential intervention for the management of people with hearing loss. Objective This systematic review (PROSPERO 2011: CRD42011001406) evaluated the published evidence-base for the efficacy of individual computer-based auditory training to improve speech intelligibility, cognition and communication abilities in adults with hearing loss, with or without hearing aids or cochlear implants. Methods A systematic search of eight databases and key journals identified 229 articles published since 1996, 13 of which met the inclusion criteria. Data were independently extracted and reviewed by the two authors. Study quality was assessed using ten pre-defined scientific and intervention-specific measures. Results Auditory training resulted in improved performance for trained tasks in 9/10 articles that reported on-task outcomes. Although significant generalisation of learning was shown to untrained measures of speech intelligibility (11/13 articles), cognition (1/1 articles) and self-reported hearing abilities (1/2 articles), improvements were small and not robust. Where reported, compliance with computer-based auditory training was high, and retention of learning was shown at post-training follow-ups. Published evidence was of very-low to moderate study quality. Conclusions Our findings demonstrate that published evidence for the efficacy of individual computer-based auditory training for adults with hearing loss is not robust and therefore cannot be reliably used to guide intervention at this time. We identify a need for high-quality evidence to further examine the efficacy of computer-based auditory training for people with hearing loss. PMID:23675431
A digital gigapixel large-format tile-scan camera.
Ben-Ezra, M
2011-01-01
Although the resolution of single-lens reflex (SLR) and medium-format digital cameras has increased in recent years, applications in cultural-heritage preservation and computational photography require even higher resolutions. Addressing this issue, a large-format camera's large image plane can achieve very high resolution without compromising pixel size and thus can provide high-quality, high-resolution images. This digital large-format tile-scan camera can acquire high-quality, high-resolution images of static scenes. It employs unique calibration techniques and a simple algorithm for focal-stack processing of very large images with significant magnification variations. The camera automatically collects overlapping focal stacks and processes them into a high-resolution, extended-depth-of-field image.
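Focal-stack processing of the kind mentioned can be illustrated with a naive per-pixel sharpest-slice merge; the focus measure and parameters below are assumptions, and the camera's actual algorithm additionally compensates for the large magnification variations across the stack.

```python
import numpy as np
from scipy import ndimage

def merge_focal_stack(stack):
    """Naive extended-depth-of-field merge of a registered focal stack.

    `stack` has shape (num_slices, height, width); each output pixel is taken
    from the slice with the strongest local focus response. This is a toy
    stand-in for full focal-stack processing, which must also handle
    magnification changes between slices.
    """
    stack = np.asarray(stack, dtype=float)
    # Focus measure: smoothed absolute Laplacian per slice.
    focus = np.stack([ndimage.gaussian_filter(np.abs(ndimage.laplace(s)), 2.0)
                      for s in stack])
    best = np.argmax(focus, axis=0)                  # sharpest slice per pixel
    return np.take_along_axis(stack, best[None, ...], axis=0)[0]
```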
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm provides realistic looking objects without any noteworthy increase to the computational cost.
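The Phong illumination model integrated into the CGH pipeline evaluates, for each point, an ambient, a diffuse and a specular term; a per-point sketch with a single light source is given below, with illustrative material coefficients that are not taken from the paper.

```python
import numpy as np

def phong_intensity(normal, to_light, to_viewer,
                    ka=0.1, kd=0.7, ks=0.2, shininess=16.0):
    """Phong illumination for one surface point and one light source.

    All direction vectors are unit length and point away from the surface;
    the coefficients are illustrative material parameters.
    """
    n = np.asarray(normal, dtype=float)
    l = np.asarray(to_light, dtype=float)
    v = np.asarray(to_viewer, dtype=float)
    diffuse = max(np.dot(n, l), 0.0)
    # Reflect the light direction about the normal for the specular lobe.
    r = 2.0 * np.dot(n, l) * n - l
    specular = max(np.dot(r, v), 0.0) ** shininess if diffuse > 0 else 0.0
    return ka + kd * diffuse + ks * specular
```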
NASA Astrophysics Data System (ADS)
Melli, S. Ali; Wahid, Khan A.; Babyn, Paul; Cooper, David M. L.; Gopi, Varun P.
2016-12-01
Synchrotron X-ray Micro Computed Tomography (Micro-CT) is an imaging technique which is increasingly used for non-invasive in vivo preclinical imaging. However, it often requires a large number of projections from many different angles to reconstruct high-quality images leading to significantly high radiation doses and long scan times. To utilize this imaging technique further for in vivo imaging, we need to design reconstruction algorithms that reduce the radiation dose and scan time without reduction of reconstructed image quality. This research is focused on using a combination of gradient-based Douglas-Rachford splitting and discrete wavelet packet shrinkage image denoising methods to design an algorithm for reconstruction of large-scale reduced-view synchrotron Micro-CT images with acceptable quality metrics. These quality metrics are computed by comparing the reconstructed images with a high-dose reference image reconstructed from 1800 equally spaced projections spanning 180°. Visual and quantitative-based performance assessment of a synthetic head phantom and a femoral cortical bone sample imaged in the biomedical imaging and therapy bending magnet beamline at the Canadian Light Source demonstrates that the proposed algorithm is superior to the existing reconstruction algorithms. Using the proposed reconstruction algorithm to reduce the number of projections in synchrotron Micro-CT is an effective way to reduce the overall radiation dose and scan time which improves in vivo imaging protocols.
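The shrinkage half of the proposed reconstruction can be illustrated with soft thresholding of wavelet detail coefficients; the sketch below uses an ordinary discrete wavelet transform from PyWavelets rather than the wavelet packets of the paper, and the Douglas-Rachford splitting iteration itself is not reproduced.

```python
import pywt

def wavelet_shrink(image, wavelet="db4", level=3, threshold=0.05):
    """Soft-threshold shrinkage of 2-D wavelet detail coefficients.

    A simplified stand-in for the wavelet-packet shrinkage denoising step;
    the wavelet, level and threshold are illustrative choices.
    """
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    shrunk = [coeffs[0]]  # keep the approximation band untouched
    for detail_bands in coeffs[1:]:
        shrunk.append(tuple(pywt.threshold(band, threshold, mode="soft")
                            for band in detail_bands))
    return pywt.waverec2(shrunk, wavelet)
```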
Controlling under-actuated robot arms using a high speed dynamics process
NASA Technical Reports Server (NTRS)
Jain, Abhinandan (Inventor); Rodriguez, Guillermo (Inventor)
1994-01-01
The invention controls an under-actuated manipulator by first obtaining predetermined active joint accelerations of the active joints and the passive joint friction forces of the passive joints, then computing articulated-body quantities for each of the joints from the current positions of the links, and finally computing, from the articulated-body quantities and from the active joint accelerations and the passive joint forces, the active joint forces of the active joints. Ultimately, the invention transmits servo commands corresponding to the active joint forces thus computed to the respective joint servos. The computation of the active joint forces is accomplished using a recursive dynamics algorithm. In this computation, an inward recursion is first carried out for each link, beginning with the outermost link, in order to compute the residual link force of each link from the active joint acceleration if the corresponding joint is active, or from the known passive joint force if the corresponding joint is passive. Then, an outward recursion is carried out for each link, in which the active joint force is computed from the residual link force if the corresponding joint is active, or the passive joint acceleration is computed from the residual link force if the corresponding joint is passive.
The Gain of Resource Delegation in Distributed Computing Environments
NASA Astrophysics Data System (ADS)
Fölling, Alexander; Grimme, Christian; Lepping, Joachim; Papaspyrou, Alexander
In this paper, we address job scheduling in Distributed Computing Infrastructures, that is, loosely coupled networks of autonomously acting High Performance Computing systems. In contrast to the common approach of mutual workload exchange, we consider the more intuitive operator's viewpoint of load-dependent resource reconfiguration. In the case of a site's over-utilization, the scheduling system is able to lease resources from other sites to keep up service quality for its local user community. Conversely, granting idle resources to other sites can increase utilization in times of low local workload and thus ensure higher efficiency. The evaluation considers real workload data and is done with respect to common service quality indicators. For two simple resource exchange policies and three basic setups we show the possible gain of this approach and analyze the dynamics in workload-adaptive reconfiguration behavior.
Silva, Luiz Antonio F.; Barriviera, Mauricio; Januário, Alessandro L.; Bezerra, Ana Cristina B.; Fioravanti, Maria Clorinda S.
2011-01-01
The development of veterinary dentistry has substantially improved the ability to diagnose canine and feline dental abnormalities. Consequently, examinations previously performed only on humans are now available for small animals, thus improving diagnostic quality. This has increased the need for technical qualification of veterinary professionals and increased technological investments. This study evaluated the use of cone beam computed tomography and intraoral radiography as complementary exams for diagnosing dental abnormalities in dogs and cats. Cone beam computed tomography provided faster image acquisition with high image quality, was associated with low ionizing radiation levels, enabled image editing, and reduced the exam duration. Our results showed that radiography was an effective method for dental radiographic examination with low cost and fast execution times, and can be performed during surgical procedures. PMID:22122905
Optical correction of refractive error for preventing and treating eye symptoms in computer users.
Heus, Pauline; Verbeek, Jos H; Tikka, Christina
2018-04-10
Computer users frequently complain about problems with seeing and functioning of the eyes. Asthenopia is a term generally used to describe symptoms related to (prolonged) use of the eyes like ocular fatigue, headache, pain or aching around the eyes, and burning and itchiness of the eyelids. The prevalence of asthenopia during or after work on a computer ranges from 46.3% to 68.5%. Uncorrected or under-corrected refractive error can contribute to the development of asthenopia. A refractive error is an error in the focusing of light by the eye and can lead to reduced visual acuity. There are various possibilities for optical correction of refractive errors including eyeglasses, contact lenses and refractive surgery. To examine the evidence on the effectiveness, safety and applicability of optical correction of refractive error for reducing and preventing eye symptoms in computer users. We searched the Cochrane Central Register of Controlled Trials (CENTRAL); PubMed; Embase; Web of Science; and OSH update, all to 20 December 2017. Additionally, we searched trial registries and checked references of included studies. We included randomised controlled trials (RCTs) and quasi-randomised trials of interventions evaluating optical correction for computer workers with refractive error for preventing or treating asthenopia and their effect on health related quality of life. Two authors independently assessed study eligibility and risk of bias, and extracted data. Where appropriate, we combined studies in a meta-analysis. We included eight studies with 381 participants. Three were parallel group RCTs, three were cross-over RCTs and two were quasi-randomised cross-over trials. All studies evaluated eyeglasses, there were no studies that evaluated contact lenses or surgery. Seven studies evaluated computer glasses with at least one focal area for the distance of the computer screen with or without additional focal areas in presbyopic persons. Six studies compared computer glasses to other types of glasses; and one study compared them to an ergonomic workplace assessment. The eighth study compared optimal correction of refractive error with the actual spectacle correction in use. Two studies evaluated computer glasses in persons with asthenopia but for the others the glasses were offered to all workers regardless of symptoms. The risk of bias was unclear in five, high in two and low in one study. Asthenopia was measured as eyestrain or a summary score of symptoms but there were no studies on health-related quality of life. Adverse events were measured as headache, nausea or dizziness. Median asthenopia scores at baseline were about 30% of the maximum possible score.
Progressive computer glasses versus monofocal glasses
One study found no considerable difference in asthenopia between various progressive computer glasses and monofocal computer glasses after one-year follow-up (mean difference (MD) change scores 0.23, 95% confidence interval (CI) -5.0 to 5.4 on a 100 mm VAS scale, low quality evidence). For headache the results were in favour of progressive glasses.
Progressive computer glasses with an intermediate focus in the upper part of the glasses versus other glasses
In two studies progressive computer glasses with intermediate focus led to a small decrease in asthenopia symptoms (SMD -0.49, 95% CI -0.75 to -0.23, low-quality evidence) but not in headache score in the short-term compared to general purpose progressive glasses. There were similar small decreases in dizziness. At medium term follow-up, in one study the effect size was not statistically significant (SMD -0.64, 95% CI -1.40 to 0.12). The study did not assess adverse events. Another study found no considerable difference in asthenopia between progressive computer glasses and monofocal computer glasses after one-year follow-up (MD change scores 1.44, 95% CI -6.95 to 9.83 on a 100 mm VAS scale, very low quality evidence). For headache the results were inconsistent.
Progressive computer glasses with far-distance focus in the upper part of the glasses versus other glasses
One study found no considerable difference in number of persons with asthenopia between progressive computer glasses with far-distance focus and bifocal computer glasses after four weeks' follow-up (OR 1.00, 95% CI 0.40 to 2.50, very low quality evidence). The number of persons with headache, nausea and dizziness was also not different between groups. Another study found no considerable difference in asthenopia between progressive computer glasses with far-distance focus and monofocal computer glasses after one-year follow-up (MD change scores -1.79, 95% CI -11.60 to 8.02 on a 100 mm VAS scale, very low quality evidence). The effects on headaches were inconsistent. One study found no difference between progressive far-distance focus computer glasses and trifocal glasses in effect on eyestrain severity (MD -0.50, 95% CI -1.07 to 0.07, very low quality evidence) or on eyestrain frequency (MD -0.75, 95% CI -1.61 to 0.11, very low quality evidence).
Progressive computer glasses versus ergonomic assessment with habitual (computer) glasses
One study found that computer glasses optimised for individual needs reduced asthenopia sum score more than an ergonomic assessment and habitual (computer) glasses (MD -8.9, 95% CI -16.47 to -1.33, scale 0 to 140, very low quality evidence) but there was no effect on the frequency of eyestrain (OR 1.08, 95% CI 0.38 to 3.11, very low quality evidence). We rated the quality of the evidence as low or very low due to risk of bias in the included studies, inconsistency in the results and imprecision. There is low to very low quality evidence that providing computer users with progressive computer glasses does not lead to a considerable decrease in problems with the eyes or headaches compared to other computer glasses. Progressive computer glasses might be slightly better than progressive glasses for daily use in the short term but not in the intermediate term and there is no data on long-term follow-up. The quality of the evidence is low or very low and therefore we are uncertain about this conclusion. Larger studies with several hundreds of participants are needed with proper randomisation, validated outcome measurement methods, and longer follow-up of at least one year to improve the quality of the evidence.
The State of Software for Evolutionary Biology
Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros
2018-01-01
With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525
Computer graphics application in the engineering design integration system
NASA Technical Reports Server (NTRS)
Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.
1975-01-01
The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems were discussed. Three basic types of computer graphics may be used with the EDIN system for the evaluation of aerospace vehicle preliminary designs: offline graphics systems using vellum-inking or photographic processes, online graphics systems characterized by directly coupled low-cost storage tube terminals with limited interactive capabilities, and a minicomputer-based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of the computer results, slow line speed (300 baud), poor hard copy, and the early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer-aided design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnstad, H.
The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, data base systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, data base systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.
A High Performance Cloud-Based Protein-Ligand Docking Prediction Algorithm
Chen, Jui-Le; Yang, Chu-Sing
2013-01-01
The potential of predicting druggability for a particular disease by integrating biological and computer science technologies has witnessed success in recent years. Although computer science technologies can be used to reduce the costs of pharmaceutical research, the computation time of structure-based protein-ligand docking prediction is still unsatisfactory. Hence, in this paper, a novel docking prediction algorithm, named fast cloud-based protein-ligand docking prediction algorithm (FCPLDPA), is presented to accelerate the docking prediction. The proposed algorithm works by leveraging two high-performance operators: (1) the novel migration (information exchange) operator is designed specially for cloud-based environments to reduce the computation time; (2) the efficient operator is aimed at filtering out the worst search directions. Our simulation results illustrate that the proposed method outperforms the other docking algorithms compared in this paper in terms of both the computation time and the quality of the end result. PMID:23762864
Design considerations for computationally constrained two-way real-time video communication
NASA Astrophysics Data System (ADS)
Bivolarski, Lazar M.; Saunders, Steven E.; Ralston, John D.
2009-08-01
Today's video codecs have evolved primarily to meet the requirements of the motion picture and broadcast industries, where high-complexity studio encoding can be utilized to create highly compressed master copies that are then broadcast one-way to less expensive, lower-complexity consumer devices for decoding and playback. Related standards activities have largely ignored the computational complexity and bandwidth constraints of wireless or Internet-based real-time video communications using devices such as cell phones or webcams. Telecommunications industry efforts to develop and standardize video codecs for applications such as video telephony and video conferencing have not yielded image size, quality, and frame-rate performance that match today's consumer expectations and market requirements for Internet and mobile video services. This paper reviews the constraints and the corresponding video codec requirements imposed by real-time, two-way mobile video applications. Several promising elements of a new mobile video codec architecture are identified, and more comprehensive computational complexity metrics and video quality metrics are proposed in order to support the design, testing, and standardization of these new mobile video codecs.
Radiological interpretation of images displayed on tablet computers: a systematic review.
Caffery, L J; Armfield, N R; Smith, A C
2015-06-01
To review the published evidence and to determine if radiological diagnostic accuracy is compromised when images are displayed on a tablet computer, and thereby inform practice on using tablet computers for radiological interpretation by on-call radiologists. We searched the PubMed and EMBASE databases for studies on the diagnostic accuracy or diagnostic reliability of images interpreted on tablet computers. Studies were screened for inclusion based on pre-determined inclusion and exclusion criteria. Studies were assessed for quality and risk of bias using the Quality Appraisal of Diagnostic Reliability Studies or the revised Quality Assessment of Diagnostic Accuracy Studies tool. Treatment of studies was reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). 11 studies met the inclusion criteria. 10 of these studies tested the Apple iPad® (Apple, Cupertino, CA). The included studies reported high sensitivity (84-98%), specificity (74-100%) and accuracy rates (98-100%) for radiological diagnosis. There was no statistically significant difference in accuracy between a tablet computer and a Digital Imaging and Communications in Medicine (DICOM)-calibrated control display. There was a near complete consensus from authors on the non-inferiority of diagnostic accuracy of images displayed on a tablet computer. All of the included studies were judged to be at risk of bias. Our findings suggest that the diagnostic accuracy of radiological interpretation is not compromised by using a tablet computer. This result is only relevant to the Apple iPad and to the modalities of CT, MRI and plain radiography. The iPad may be appropriate for an on-call radiologist to use for radiological interpretation.
Burton, Kirsteen R; Perlis, Nathan; Aviv, Richard I; Moody, Alan R; Kapral, Moira K; Krahn, Murray D; Laupacis, Andreas
2014-03-01
This study reviews the quality of economic evaluations of imaging after acute stroke and identifies areas for improvement. We performed full-text searches of electronic databases that included Medline, Econlit, the National Health Service Economic Evaluation Database, and the Tufts Cost Effectiveness Analysis Registry through July 2012. Search strategy terms included the following: stroke*; cost*; or cost-benefit analysis*; and imag*. Inclusion criteria were empirical studies published in any language that reported the results of economic evaluations of imaging interventions for patients with stroke symptoms. Study quality was assessed by a commonly used checklist (with a score range of 0% to 100%). Of 568 unique potential articles identified, 5 were included in the review. Four of 5 articles were explicit in their analysis perspectives, which included healthcare system payers, hospitals, and stroke services. Two studies reported results during a 5-year time horizon, and 3 studies reported lifetime results. All included the modified Rankin Scale score as an outcome measure. The median quality score was 84.4% (range=71.9%-93.5%). Most studies did not consider the possibility that patients could not tolerate contrast media or could incur contrast-induced nephropathy. Three studies compared perfusion computed tomography with unenhanced computed tomography but assumed that outcomes guided by the results of perfusion computed tomography were equivalent to outcomes guided by the results of magnetic resonance imaging or noncontrast computed tomography. Economic evaluations of imaging modalities after acute ischemic stroke were generally of high methodological quality. However, important radiology-specific clinical components were missing from all of these analyses.
Intelligent fuzzy approach for fast fractal image compression
NASA Astrophysics Data System (ADS)
Nodehi, Ali; Sulong, Ghazali; Al-Rodhaan, Mznah; Al-Dhelaan, Abdullah; Rehman, Amjad; Saba, Tanzila
2014-12-01
Fractal image compression (FIC) is recognized as an NP-hard problem, and it suffers from a high number of mean square error (MSE) computations. In this paper, a two-phase algorithm was proposed to reduce the MSE computation of FIC. In the first phase, range and domain blocks are arranged based on an edge property. In the second, an imperialist competitive algorithm (ICA) is used according to the classified blocks. For maintaining the quality of the retrieved image and accelerating algorithm operation, we divided the solutions into two groups: developed countries and undeveloped countries. Simulations were carried out to evaluate the performance of the developed approach. The promising results thus achieved exhibit performance better than genetic algorithm (GA)-based and full-search algorithms in terms of decreasing the number of MSE computations. The proposed algorithm reduced the number of MSE computations, running 463 times faster than the full-search algorithm, while the retrieved image quality did not change considerably.
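For reference, the per-block MSE that dominates the cost of fractal coding can be written in a few lines of NumPy; the least-squares contrast/brightness fit shown here is the standard affine matching used in most FIC descriptions, not necessarily the exact variant of this paper.

    import numpy as np

    def block_mse(range_block, domain_block):
        """MSE between a range block and a (downsampled) domain block after the
        usual least-squares fit of contrast s and brightness o."""
        r = range_block.astype(float).ravel()
        d = domain_block.astype(float).ravel()
        n = r.size
        denom = n * np.dot(d, d) - d.sum() ** 2
        s = 0.0 if denom == 0 else (n * np.dot(d, r) - d.sum() * r.sum()) / denom
        o = (r.sum() - s * d.sum()) / n
        err = r - (s * d + o)
        return np.dot(err, err) / n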
Initial Determination of Low Earth Orbits Using Commercial Telescopes
2008-03-01
many new technologies have significantly changed the face of private astronomy. Developments such as inexpensive but high-quality sensors, rapid ... astronomy. Unparalleled access to quality equipment, rapid personal computing, and extensive community support enable nearly anyone to achieve feats in ... other subdisciplines of astronomy, this field benefits greatly from recent advances. This project examines how modern equipment is used to track Low Earth
Jun, Goo; Wing, Mary Kate; Abecasis, Gonçalo R; Kang, Hyun Min
2015-06-01
The analysis of next-generation sequencing data is computationally and statistically challenging because of the massive volume of data and imperfect data quality. We present GotCloud, a pipeline for efficiently detecting and genotyping high-quality variants from large-scale sequencing data. GotCloud automates sequence alignment, sample-level quality control, variant calling, filtering of likely artifacts using machine-learning techniques, and genotype refinement using haplotype information. The pipeline can process thousands of samples in parallel and requires fewer computational resources than current alternatives. Experiments with whole-genome and exome-targeted sequence data generated by the 1000 Genomes Project show that the pipeline provides effective filtering against false positive variants and high power to detect true variants. Our pipeline has already contributed to variant detection and genotyping in several large-scale sequencing projects, including the 1000 Genomes Project and the NHLBI Exome Sequencing Project. We hope it will now prove useful to many medical sequencing studies. © 2015 Jun et al.; Published by Cold Spring Harbor Laboratory Press.
A Study of Quality of Service Communication for High-Speed Packet-Switching Computer Sub-Networks
NASA Technical Reports Server (NTRS)
Cui, Zhenqian
1999-01-01
In this thesis, we analyze various factors that affect quality of service (QoS) communication in high-speed, packet-switching sub-networks. We hypothesize that sub-network-wide bandwidth reservation and guaranteed CPU processing power at endpoint systems for handling data traffic are indispensable to achieving hard end-to-end quality of service. Different bandwidth reservation strategies, traffic characterization schemes, and scheduling algorithms affect the network resources and CPU usage as well as the extent to which QoS can be achieved. In order to analyze those factors, we design and implement a communication layer. Our experimental analysis supports our research hypothesis. The Resource ReSerVation Protocol (RSVP) is designed to realize resource reservation. Our analysis of RSVP shows that using RSVP alone is insufficient to provide hard end-to-end quality of service in a high-speed sub-network. Analysis of the IEEE 802.1p protocol also supports the research hypothesis.
Full field image reconstruction is suitable for high-pitch dual-source computed tomography.
Mahnken, Andreas H; Allmendinger, Thomas; Sedlmair, Martin; Tamm, Miriam; Reinartz, Sebastian D; Flohr, Thomas
2012-11-01
The field of view (FOV) in high-pitch dual-source computed tomography (DSCT) is limited by the size of the second detector. The goal of this study was to develop and evaluate a full FOV image reconstruction technique for high-pitch DSCT. For reconstruction beyond the FOV of the second detector, raw data of the second system were extended to the full dimensions of the first system, using the partly existing data of the first system in combination with a very smooth transition weight function. During the weighted filtered backprojection, the data of the second system were applied with an additional weighting factor. This method was tested for different pitch values from 1.5 to 3.5 on a simulated phantom and on 25 high-pitch DSCT data sets acquired at pitch values of 1.6, 2.0, 2.5, 2.8, and 3.0. Images were reconstructed with FOV sizes of 260 × 260 and 500 × 500 mm. Image quality was assessed by 2 radiologists using a 5-point Likert scale and analyzed with repeated-measure analysis of variance. In phantom and patient data, full FOV image quality depended on pitch. Where complete projection data from both tube-detector systems were available, image quality was unaffected by pitch changes. Full FOV image quality was not compromised at pitch values of 1.6 and remained fully diagnostic up to a pitch of 2.0. At higher pitch values, there was an increasing difference in image quality between limited and full FOV images (P = 0.0097). With this new image reconstruction technique, full FOV image reconstruction can be used up to a pitch of 2.0.
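A minimal sketch of the kind of smooth transition weight mentioned above (an assumed raised-cosine form, not the authors' exact function): the weight is 1 well inside the second detector's field of view, rolls off smoothly across a transition band at its edge, and is 0 outside, so that measured second-system data blend into the data extended from the first system.

    import numpy as np

    def transition_weight(channel_pos, fov_b_edge, band_width):
        """Weight for detector-B data: 1 well inside detector B's FOV, 0 outside,
        with a raised-cosine roll-off of width band_width at the FOV edge."""
        x = (np.abs(channel_pos) - (fov_b_edge - band_width)) / band_width
        x = np.clip(x, 0.0, 1.0)
        return 0.5 * (1.0 + np.cos(np.pi * x))

    # blended row = w * data_B + (1 - w) * data_extended_from_A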
NASA Technical Reports Server (NTRS)
1995-01-01
The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.
Low-rate image coding using vector quantization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makur, A.
1990-01-01
This thesis deals with the development and analysis of a computationally simple vector quantization image compression system for coding monochrome images at low bit rate. Vector quantization has been known to be an effective compression scheme when a low bit rate is desirable, but the intensive computation required in a vector quantization encoder has been a handicap in using it for low rate image coding. The present work shows that, without substantially increasing the coder complexity, it is indeed possible to achieve acceptable picture quality while attaining a high compression ratio. Several modifications to the conventional vector quantization coder are proposed in the thesis. These modifications are shown to offer better subjective quality when compared to the basic coder. Distributed blocks are used instead of spatial blocks to construct the input vectors. A class of input-dependent weighted distortion functions is used to incorporate psychovisual characteristics in the distortion measure. Computationally simple filtering techniques are applied to further improve the decoded image quality. Finally, unique designs of the vector quantization coder using electronic neural networks are described, so that the coding delay is reduced considerably.
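The encoder step whose cost the thesis tries to tame is the exhaustive nearest-codeword search; a plain NumPy version (illustrative only, with the basic squared-error distortion rather than the weighted distortions discussed above) looks like this.

    import numpy as np

    def vq_encode(blocks, codebook):
        """blocks: (N, k) image vectors; codebook: (M, k) codewords.
        Returns the index of the nearest codeword for each block under
        squared-error distortion; this full search is what makes plain VQ
        encoding expensive."""
        # squared distances via ||b - c||^2 = ||b||^2 - 2 b.c + ||c||^2
        d = (np.sum(blocks ** 2, axis=1, keepdims=True)
             - 2.0 * blocks @ codebook.T
             + np.sum(codebook ** 2, axis=1))
        return np.argmin(d, axis=1)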
Computer-generated holographic near-eye display system based on LCoS phase only modulator
NASA Astrophysics Data System (ADS)
Sun, Peng; Chang, Shengqian; Zhang, Siman; Xie, Ting; Li, Huaye; Liu, Siqi; Wang, Chang; Tao, Xiao; Zheng, Zhenrong
2017-09-01
Augmented reality (AR) technology has been applied in various areas, such as large-scale manufacturing, national defense, healthcare, and film and mass media. An important way to realize AR display is using a computer-generated hologram (CGH), an approach that has been hampered by low image quality and heavy computational cost. Meanwhile, the diffraction of the Liquid Crystal on Silicon (LCoS) device has a negative effect on image quality. In this paper, a modified algorithm based on the traditional Gerchberg-Saxton (GS) algorithm was proposed to improve the image quality, and a new method of establishing the experimental system was used to broaden the field of view (FOV). In the experiment, undesired zero-order diffracted light was eliminated and a high-definition 2D image was acquired with the FOV broadened to 36.1 degrees. We have also done some pilot research in 3D reconstruction with a tomography algorithm based on Fresnel diffraction. With the same experimental system, experimental results demonstrate the feasibility of 3D reconstruction. These modifications are effective and efficient, and may provide a better solution for AR realization.
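The baseline that the paper modifies is the classical Gerchberg-Saxton iteration; a bare-bones version for a phase-only hologram with a Fourier-plane target (illustrative assumptions only, not the paper's modified variant) is:

    import numpy as np

    def gerchberg_saxton(target_amplitude, n_iter=50, seed=0):
        """Return a phase-only hologram whose far-field intensity approximates
        target_amplitude**2 (classical GS with uniform illumination)."""
        rng = np.random.default_rng(seed)
        phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
        field = np.exp(1j * phase)                    # unit-amplitude hologram plane
        for _ in range(n_iter):
            far = np.fft.fft2(field)
            far = target_amplitude * np.exp(1j * np.angle(far))  # impose target amplitude
            field = np.fft.ifft2(far)
            field = np.exp(1j * np.angle(field))      # impose phase-only constraint
        return np.angle(field)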
Computational chemistry in pharmaceutical research: at the crossroads.
Bajorath, Jürgen
2012-01-01
Computational approaches are an integral part of pharmaceutical research. However, there are many unsolved key questions that limit the scientific progress in the still evolving computational field and its impact on drug discovery. Importantly, a number of these questions are not new but date back many years. Hence, it might be difficult to conclusively answer them in the foreseeable future. Moreover, the computational field as a whole is characterized by a high degree of heterogeneity and so is, unfortunately, the quality of its scientific output. In light of this situation, it is proposed that changes in scientific standards and culture should be seriously considered now in order to lay a foundation for future progress in computational research.
Electron tubes for industrial applications
NASA Astrophysics Data System (ADS)
Gellert, Bernd
1994-05-01
This report reviews research and development efforts in recent years for vacuum electron tubes, in particular power grid tubes for industrial applications. Physical and chemical effects are discussed that determine the performance of today's devices. Due to the progress made in the fundamental understanding of materials and newly developed processes, the reliability and reproducibility of power grid tubes have improved considerably. Modern computer-controlled manufacturing methods ensure a high reproducibility of production, and continuous quality certification according to ISO 9001 guarantees future high quality standards. Some typical applications of these tubes are given as examples.
Franklin, Marvin A.
2000-01-01
The U.S. Geological Survey, Water Resources Division, has a policy that requires each District office to prepare a Surface Water Quality-Assurance Plan. The plan for each District describes the policies and procedures that ensure high quality in the collection, processing, analysis, computer storage, and publication of surface-water data. The North Florida Program Office Surface Water Quality-Assurance Plan documents the standards, policies, and procedures used by the North Florida Program office for activities related to the collection, processing, storage, analysis, and publication of surface-water data.
ERIC Educational Resources Information Center
Halpern, Arthur M.; Glendening, Eric D.
2013-01-01
A project for students in an upper-level course in quantum or computational chemistry is described in which they are introduced to the concepts and applications of a high quality, ab initio treatment of the ground-state potential energy curve (PEC) for H₂ and D₂. Using a commercial computational chemistry application and a…
Instrumentation for Verification of Bomb Damage Repair Computer Code.
1981-09-01
record the data, a conventional 14-track FM analog tape recorder was retained. The unknown factors of signal duration, test duration, and signal ... Kirtland Air Force Base computer centers for more detailed analyses. In addition to the analog recorder, signal conditioning equipment and amplifiers were ... necessary to allow high quality data to be recorded. An Interrange Instrumentation Group (IRIG) code generator/reader placed a coded signal on the tape
Science-Driven Computing: NERSC's Plan for 2006-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Kramer, William T.C.; Bailey, David H.
NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.
NASA Astrophysics Data System (ADS)
Moan, T.
2017-12-01
An overview of integrity management of offshore structures, with emphasis on the oil and gas energy sector, is given. Based on relevant accident experiences and means to control the associated risks, accidents are categorized from a technical-physical as well as a human and organizational point of view. Structural risk relates to extreme actions as well as structural degradation. Risk mitigation measures, including adequate design criteria, inspection, repair and maintenance as well as quality assurance and control of engineering processes, are briefly outlined. The current status of risk and reliability methodology to aid decisions in integrity management is briefly reviewed. Finally, the need to balance the uncertainties in data, methods and computational effort, and to apply quality assurance, control and due caution when using high-fidelity methods so as to avoid human errors, is emphasized, together with a plea to develop both high-fidelity and efficient simplified methods for design.
Robertson, W M; Parker, J M
2012-03-01
A straightforward and inexpensive implementation of acoustic impulse response measurement is described utilizing the signal processing technique of coherent averaging. The technique is capable of high signal-to-noise measurements with personal computer data acquisition equipment, an amplifier/speaker, and a high quality microphone. When coupled with simple waveguide test systems fabricated from commercial PVC plumbing pipe, impulse response measurement has proven to be ideal for undergraduate research projects, often of publishable quality, or for advanced laboratory experiments. The technique provides important learning objectives for science or engineering students in areas such as interfacing and computer control of experiments; analog-to-digital conversion and sampling; time and frequency analysis using Fourier transforms; signal processing; and insight into a variety of current research areas such as acoustic bandgap materials, acoustic metamaterials, and fast and slow wave manipulation. © 2012 Acoustical Society of America
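A minimal sketch of the coherent-averaging idea, assuming repeated, trigger-synchronised excitations of equal length; the frequency-domain deconvolution step is one common way to recover the impulse response from a known excitation and may differ from the paper's exact procedure.

    import numpy as np

    def coherent_average(records):
        """records: (n_repeats, n_samples) trigger-synchronised responses.
        Time-domain averaging keeps the repeatable response and suppresses
        uncorrelated noise by roughly sqrt(n_repeats)."""
        return np.asarray(records, dtype=float).mean(axis=0)

    def impulse_response(avg_response, excitation, eps=1e-12):
        """Estimate the impulse response by frequency-domain deconvolution of the
        averaged response with the known excitation (same length assumed)."""
        H = np.fft.rfft(avg_response) / (np.fft.rfft(excitation) + eps)
        return np.fft.irfft(H, n=len(avg_response))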
Installation of new Generation General Purpose Computer (GPC) compact unit
NASA Technical Reports Server (NTRS)
1991-01-01
In the Kennedy Space Center's (KSC's) Orbiter Processing Facility (OPF) high bay 2, Spacecraft Electronics technician Ed Carter (right), wearing clean suit, prepares for (26864) and installs (26865) the new Generation General Purpose Computer (GPC) compact IBM unit in Atlantis', Orbiter Vehicle (OV) 104's, middeck avionics bay as Orbiter Systems Quality Control technician Doug Snider looks on. Both men work for NASA contractor Lockheed Space Operations Company. All three orbiters are being outfitted with the compact IBM unit, which replaces a two-unit earlier generation computer.
Modeling, Monitoring and Fault Diagnosis of Spacecraft Air Contaminants
NASA Technical Reports Server (NTRS)
Ramirez, W. Fred; Skliar, Mikhail; Narayan, Anand; Morgenthaler, George W.; Smith, Gerald J.
1996-01-01
Progress and results in the development of an integrated air quality modeling, monitoring, fault detection, and isolation system are presented. The focus was on the development of distributed models of air contaminant transport, the study of air quality monitoring techniques based on the model of the transport process and on-line contaminant concentration measurements, and sensor placement. Different approaches to the modeling of spacecraft air contamination are discussed, and a three-dimensional distributed parameter air contaminant dispersion model applicable to both laminar and turbulent transport is proposed. A two-dimensional approximation of the full-scale transport model is also proposed, based on spatial averaging of the three-dimensional model over the least important space coordinate. A computer implementation of the transport model is considered and a detailed development of the two- and three-dimensional models, illustrated by contaminant transport simulation results, is presented. The use of a well-established Kalman filtering approach is suggested as a method for generating on-line contaminant concentration estimates based on both real-time measurements and the model of the contaminant transport process. It is shown that the high computational requirements of the traditional Kalman filter can render its real-time implementation difficult for a high-dimensional transport model, and a novel implicit Kalman filtering algorithm is proposed which is shown to lead to an order of magnitude faster computer implementation in the case of air quality monitoring.
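In its simplest linear form, the monitoring step reduces to the standard Kalman predict/update recursion; the sketch below is generic, with the transport model abstracted into matrices A (state transition) and H (sensor selection), and is not the distributed-parameter or implicit formulation of the paper.

    import numpy as np

    def kalman_step(x, P, z, A, H, Q, R):
        """One predict/update cycle of a linear Kalman filter: x, P are the
        concentration-state estimate and its covariance, z the sensor readings."""
        # Predict with the discretised transport model A and process noise Q.
        x_pred = A @ x
        P_pred = A @ P @ A.T + Q
        # Update with the measurements through sensor matrix H and noise R.
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x_new)) - K @ H) @ P_pred
        return x_new, P_new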
A key factor for improving models of ecosystem benefits is the availability of high quality spatial data. High resolution LIDAR data are now commonly available and can be used to produce more accurate model outputs. However, increased resolution leads to higher computer resource...
Applying the Multisim Technology to Teach the Course of High Frequency Power Amplifier
ERIC Educational Resources Information Center
Lv, Gang; Xue, Yuan-Sheng
2011-01-01
As an important professional foundation course in the electronic information specialty, the course "high frequency electronic circuit" is strongly theoretical and its content abstract. To enhance the teaching quality of this course, computer simulation technology based on Multisim is introduced into the teaching of "high…
Colour calibration of a laboratory computer vision system for quality evaluation of pre-sliced hams.
Valous, Nektarios A; Mendoza, Fernando; Sun, Da-Wen; Allen, Paul
2009-01-01
Due to the high variability and complex colour distribution in meats and meat products, the colour signal calibration of any computer vision system used for colour quality evaluations represents an essential condition for objective and consistent analyses. This paper compares two methods for CIE colour characterization using a computer vision system (CVS) based on digital photography; namely, the polynomial transform procedure and the transform proposed by the sRGB standard. Also, it presents a procedure for evaluating the colour appearance and the presence of pores and fat-connective tissue on pre-sliced hams made from pork, turkey and chicken. Our results showed high precision in colour matching for device characterization when the polynomial transform was used to match the CIE tristimulus values, in comparison with the sRGB standard approach, as indicated by their ΔE*ab values. The [3×20] polynomial transfer matrix yielded a modelling accuracy averaging below 2.2 ΔE*ab units. Using the sRGB transform, high variability was observed among the computed ΔE*ab values (8.8 ± 4.2). The calibrated laboratory CVS, implemented with a low-cost digital camera, exhibited reproducible colour signals across a wide range of colours, was capable of pinpointing regions of interest, and allowed the extraction of quantitative information from the overall ham slice surface with high accuracy. The extracted colour and morphological features showed potential for characterizing the appearance of ham slice surfaces. CVS is a tool that can objectively specify colour and appearance properties of non-uniformly coloured commercial ham slices.
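The polynomial characterization amounts to expanding each RGB triplet into polynomial terms and fitting a linear map to measured tristimulus values by least squares. The sketch below uses an illustrative 11-term expansion as an assumption; the paper's [3×20] matrix implies a specific 20-term expansion that is not reproduced here.

    import numpy as np

    def poly_terms(rgb):
        # Illustrative 11-term expansion of an (N, 3) RGB array; the paper's
        # actual 20-term set may differ.
        r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
        return np.column_stack([np.ones_like(r), r, g, b,
                                r * g, r * b, g * b, r * r, g * g, b * b,
                                r * g * b])

    def fit_polynomial_transform(rgb_train, xyz_train):
        """Least-squares fit of the matrix mapping polynomial RGB terms to XYZ
        (or another CIE space) using paired training measurements."""
        T = poly_terms(rgb_train)                        # (N, n_terms)
        M, *_ = np.linalg.lstsq(T, xyz_train, rcond=None)
        return M                                          # (n_terms, 3)

    def apply_polynomial_transform(rgb, M):
        return poly_terms(rgb) @ M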
Evaluating the Efficacy of the Cloud for Cluster Computation
NASA Technical Reports Server (NTRS)
Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom
2012-01-01
Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Compute Cloud (EC2) that promise improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29 instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
Domain Decomposition By the Advancing-Partition Method for Parallel Unstructured Grid Generation
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.; Zagaris, George
2009-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
Domain Decomposition By the Advancing-Partition Method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2008-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, Z.T.
2001-11-15
The objective of this project was to conduct high-performance computing research and teaching at AAMU, and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. During the project period, eight tasks were accomplished. Student research assistantships, work-study positions, summer internships, and scholarships proved to be among the best ways for us to attract top-quality minority students. Under the support of DOE, through research, summer internships, collaborations, and scholarship programs, AAMU has successfully provided research and educational opportunities to minority students in fields related to computational science.
2010-01-01
Background Both minimally invasive surgery (MIS) and computer-assisted surgery (CAS) for total hip arthroplasty (THA) have gained popularity in recent years. We conducted a qualitative and systematic review to assess the effectiveness of MIS, CAS and computer-assisted MIS for THA. Methods An extensive computerised literature search of PubMed, Medline, Embase and OVIDSP was conducted. Both randomised clinical trials and controlled clinical trials on the effectiveness of MIS, CAS and computer-assisted MIS for THA were included. Methodological quality was independently assessed by two reviewers. Effect estimates were calculated and a best-evidence synthesis was performed. Results Four high-quality and 14 medium-quality studies with MIS THA as study contrast, and three high-quality and four medium-quality studies with CAS THA as study contrast were included. No studies with computer-assisted MIS for THA as study contrast were identified. Strong evidence was found for a decrease in operative time and intraoperative blood loss for MIS THA, with no difference in complication rates and risk for acetabular outliers. Strong evidence exists that there is no difference in physical functioning, measured either by questionnaires or by gait analysis. Moderate evidence was found for a shorter length of hospital stay after MIS THA. Conflicting evidence was found for a positive effect of MIS THA on pain in the early postoperative period, but that effect diminished after three months postoperatively. Strong evidence was found for an increase in operative time for CAS THA, and limited evidence was found for a decrease in intraoperative blood loss. Furthermore, strong evidence was found for no difference in complication rates, as well as for a significantly lower risk for acetabular outliers. Conclusions The results indicate that MIS THA is a safe surgical procedure, without increases in operative time, blood loss, operative complication rates and component malposition rates. However, the beneficial effect of MIS THA on functional recovery has to be proven. The results also indicate that CAS THA, though resulting in an increase in operative time, may have a positive effect on operative blood loss and operative complication rates. More importantly, the use of CAS results in better positioning of acetabular component of the prosthesis. PMID:20470443
Certification of production-quality gLite Job Management components
NASA Astrophysics Data System (ADS)
Andreetto, P.; Bertocco, S.; Capannini, F.; Cecchi, M.; Dorigo, A.; Frizziero, E.; Giacomini, F.; Gianelle, A.; Mezzadri, M.; Molinari, E.; Monforte, S.; Prelz, F.; Rebatto, D.; Sgaravatto, M.; Zangrando, L.
2011-12-01
With the advent of the recent European Union (EU) funded projects aimed at achieving an open, coordinated and proactive collaboration among the European communities that provide distributed computing services, stricter requirements and quality standards will be demanded of middleware providers. Such a highly competitive and dynamic environment, organized to comply with a business-oriented model, has already started pursuing quality criteria, thus requiring rigorous procedures, interfaces and roles to be formally defined for each step of the software life-cycle. This will ensure quality-certified releases and updates of the Grid middleware. In the European Middleware Initiative (EMI), the release management for one or more components will be organized into Product Team (PT) units, fully responsible for delivering production-ready, quality-certified software and for coordinating with each other to contribute to the EMI release as a whole. This paper presents the certification process, with respect to integration, installation, configuration and testing, adopted at INFN by the Product Team responsible for the gLite Web-Service based Computing Element (CREAM CE) and for the Workload Management System (WMS). The resources used, the testbed layouts, the integration and deployment methods, and the certification steps taken to provide feedback to developers and to guarantee quality results are described.
Verma, Sadhna; Sarkar, Saradwata; Young, Jason; Venkataraman, Rajesh; Yang, Xu; Bhavsar, Anil; Patil, Nilesh; Donovan, James; Gaitonde, Krishnanath
2016-05-01
The purpose of this study was to compare high b-value (b = 2000 s/mm²) acquired diffusion-weighted imaging (aDWI) with computed DWI (cDWI) obtained using four diffusion models (mono-exponential (ME), intra-voxel incoherent motion (IVIM), stretched exponential (SE), and diffusional kurtosis (DK)) with respect to lesion visibility, conspicuity, contrast, and ability to predict significant prostate cancer (PCa). Ninety-four patients underwent 3 T MRI including acquisition of b = 2000 s/mm² aDWI and low b-value DWI. High b = 2000 s/mm² cDWI was obtained using the ME, IVIM, SE, and DK models. All images were scored on quality independently by three radiologists. Lesions were identified on all images and graded for lesion conspicuity. For a subset of lesions for which pathological truth was established, lesion-to-background contrast ratios (LBCRs) were computed and a binomial generalized linear mixed model analysis was conducted to compare the clinically significant PCa predictive capabilities of all DWI. For all readers and all models, cDWI demonstrated higher ratings for image quality and lesion conspicuity than aDWI, except for DK (p < 0.001). The LBCRs of ME, IVIM, and SE were significantly higher than the LBCR of aDWI (p < 0.001). Receiver operating characteristic curves obtained from the binomial generalized linear mixed model analysis demonstrated higher areas under the curve for ME, SE, IVIM, and aDWI compared to DK or PSAD alone in predicting significant PCa. High b-value cDWI using the ME, IVIM, and SE diffusion models provides better image quality, lesion conspicuity, and increased LBCR than high b-value aDWI. Using cDWI can potentially provide comparable sensitivity and specificity for detecting significant PCa as high b-value aDWI, without increased scan times and image degradation artifacts.
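For the mono-exponential model, computing a high b-value image from low b-value acquisitions is straightforward: fit ln S(b) = ln S0 − b·ADC per voxel and extrapolate. The sketch below assumes two acquired low b-values (the simplest case); the IVIM, stretched-exponential, and kurtosis fits used in the study are not shown.

    import numpy as np

    def compute_dwi_monoexp(s_low1, s_low2, b1, b2, b_target=2000.0, eps=1e-6):
        """Compute a synthetic DWI image at b_target from two acquired images
        s_low1 (at b1) and s_low2 (at b2, with b2 > b1) under the
        mono-exponential model S(b) = S0 * exp(-b * ADC)."""
        adc = np.log((s_low1 + eps) / (s_low2 + eps)) / (b2 - b1)
        s0 = s_low1 * np.exp(b1 * adc)
        return s0 * np.exp(-b_target * adc)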
GPU-accelerated Kernel Regression Reconstruction for Freehand 3D Ultrasound Imaging.
Wen, Tiexiang; Li, Ling; Zhu, Qingsong; Qin, Wenjian; Gu, Jia; Yang, Feng; Xie, Yaoqin
2017-07-01
Volume reconstruction methods play an important role in improving reconstructed volumetric image quality for freehand three-dimensional (3D) ultrasound imaging. By utilizing the capability of a programmable graphics processing unit (GPU), we can achieve real-time incremental volume reconstruction at a speed of 25-50 frames per second (fps). After incremental reconstruction and visualization, hole-filling is performed on the GPU to fill remaining empty voxels. However, traditional pixel nearest-neighbor-based hole-filling fails to reconstruct volumes with high image quality. On the contrary, kernel regression provides an accurate volume reconstruction method for 3D ultrasound imaging, but at the cost of heavy computational complexity. In this paper, a GPU-based fast kernel regression method is proposed for high-quality volume reconstruction after the incremental reconstruction of freehand ultrasound. The experimental results show that improved image quality for speckle reduction and detail preservation can be obtained with the parameter setting of kernel window size of [Formula: see text] and kernel bandwidth of 1.0. The computational performance of the proposed GPU-based method can be over 200 times faster than that on a central processing unit (CPU), and the volume with a size of 50 million voxels in our experiment can be reconstructed within 10 seconds.
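As a CPU-side illustration, the sketch below fills one empty voxel by zeroth-order (Nadaraya-Watson) kernel regression over neighbouring samples inside the kernel window, using a Gaussian kernel with the bandwidth quoted above; the paper's GPU parallelisation and exact kernel-regression order are not reproduced.

    import numpy as np

    def kernel_regression_fill(center, neighbor_coords, neighbor_values, bandwidth=1.0):
        """Estimate the value at `center` (3-vector) from neighbouring voxel
        samples (coords: (N, 3), values: (N,)) using Gaussian weighting."""
        d2 = np.sum((neighbor_coords - center) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * bandwidth ** 2))
        if w.sum() == 0:
            return 0.0
        return float(np.dot(w, neighbor_values) / w.sum())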
Motion artifact detection in four-dimensional computed tomography images
NASA Astrophysics Data System (ADS)
Bouilhol, G.; Ayadi, M.; Pinho, R.; Rit, S.; Sarrut, D.
2014-03-01
Motion artifacts appear in four-dimensional computed tomography (4DCT) images because of suboptimal acquisition parameters or patient breathing irregularities. Frequency of motion artifacts is high and they may introduce errors in radiation therapy treatment planning. Motion artifact detection can be useful for image quality assessment and 4D reconstruction improvement but manual detection in many images is a tedious process. We propose a novel method to evaluate the quality of 4DCT images by automatic detection of motion artifacts. The method was used to evaluate the impact of the optimization of acquisition parameters on image quality at our institute. 4DCT images of 114 lung cancer patients were analyzed. Acquisitions were performed with a rotation period of 0.5 seconds and a pitch of 0.1 (74 patients) or 0.081 (40 patients). A sensitivity of 0.70 and a specificity of 0.97 were observed. End-exhale phases were less prone to motion artifacts. In phases where motion speed is high, the number of detected artifacts was systematically reduced with a pitch of 0.081 instead of 0.1 and the mean reduction was 0.79. The increase of the number of patients with no artifact detected was statistically significant for the 10%, 70% and 80% respiratory phases, indicating a substantial image quality improvement.
NASA Astrophysics Data System (ADS)
Castellano, Isabel; Geleijns, Jacob
After its clinical introduction in 1973, computed tomography developed from an x-ray modality for axial imaging in neuroradiology into a versatile three-dimensional imaging modality for a wide range of applications in, for example, oncology, vascular radiology, cardiology, traumatology and even interventional radiology. Computed tomography is applied for diagnosis, follow-up studies and screening of healthy subpopulations with specific risk factors. This chapter provides a general introduction to computed tomography, covering a short history of computed tomography, technology, image quality, dosimetry, room shielding, quality control and quality criteria.
New Integrated Video and Graphics Technology: Digital Video Interactive.
ERIC Educational Resources Information Center
Optical Information Systems, 1987
1987-01-01
Describes digital video interactive (DVI), a new technology which combines the interactivity of the graphics capabilities in personal computers with the realism of high-quality motion video and multitrack audio in an all-digital integrated system. (MES)
NASA Astrophysics Data System (ADS)
Kajiwara, Yusuke; Murata, Hiroaki; Kimura, Haruhiko; Abe, Koji
As a communication support tool for cases of amyotrophic lateral sclerosis (ALS), research on eye-gaze human-computer interfaces has been active. However, since voluntary and involuntary eye movements cannot be distinguished in these interfaces, their performance is still not sufficient for practical use. This paper presents a high-performance human-computer interface system which combines high-quality recognition of horizontal eye movements and voluntary blinks. The experimental results show that the number of incorrect inputs is decreased by 35.1% compared with an existing system that recognizes horizontal and vertical eye movements in addition to voluntary blinks, and that character input is sped up by 17.4% relative to that system.
Advanced Capabilities for Wind Tunnel Testing in the 21st Century
NASA Technical Reports Server (NTRS)
Kegelman, Jerome T.; Danehy, Paul M.; Schwartz, Richard J.
2010-01-01
Wind tunnel testing methods and test technologies for the 21st century using advanced capabilities are presented. These capabilities are necessary to capture more accurate, high-quality test results by reducing testing uncertainties and to facilitate verification of computational tools for design. This paper discusses near-term developments underway in ground testing capabilities, which will enhance the quality of information on both the test article and airstream flow details. Also discussed is a selection of new capability investments that have been made to accommodate such developments. Examples include advanced experimental methods for measuring the test gas itself; using efficient experiment methodologies, including quality assurance strategies within the test; and increasing test result information density by using extensive optical visualization together with computed flow field results. These points apply both to major investments in existing tunnel capabilities and to entirely new capabilities.
Aggregative Learning Method and Its Application for Communication Quality Evaluation
NASA Astrophysics Data System (ADS)
Akhmetov, Dauren F.; Kotaki, Minoru
2007-12-01
In this paper, the so-called Aggregative Learning Method (ALM) is proposed to improve and simplify the learning and classification abilities of different data processing systems. It provides a universal basis for the design and analysis of a wide class of mathematical models. A procedure was elaborated for time series model reconstruction and analysis for linear and nonlinear cases. Data approximation accuracy (during the learning phase) and data classification quality (during the recall phase) are estimated from introduced statistical parameters. The validity and efficiency of the proposed approach have been demonstrated through its application to monitoring of wireless communication quality, namely, for a Fixed Wireless Access (FWA) system. The procedure was shown to require little memory and few computational resources, especially in the data classification (recall) stage. Characterized by high computational efficiency and a simple decision-making procedure, the derived approaches can be useful for simple and reliable real-time surveillance and control system design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.
The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.
Quality and security - They work together
NASA Technical Reports Server (NTRS)
Carr, Richard; Tynan, Marie; Davis, Russell
1991-01-01
This paper describes the importance of considering computer security as part of software quality assurance practice. The intended audience is primarily those professionals involved in the design, development, and quality assurance of software. Many issues are raised which point ultimately to the need for integration of the quality assurance and computer security disciplines. To address some of these issues, the NASA Automated Information Security program is presented as a model that may be used to improve interactions between the quality assurance and computer security communities of professionals.
A high-throughput screening approach for the optoelectronic properties of conjugated polymers.
Wilbraham, Liam; Berardo, Enrico; Turcani, Lukas; Jelfs, Kim E; Zwijnenburg, Martijn A
2018-06-25
We propose a general high-throughput virtual screening approach for the optical and electronic properties of conjugated polymers. This approach makes use of the recently developed xTB family of low-computational-cost density functional tight-binding methods from Grimme and co-workers, calibrated here to (TD-)DFT data computed for a representative diverse set of (co-)polymers. Parameters drawn from the resulting calibration using a linear model can then be applied to the xTB derived results for new polymers, thus generating near DFT-quality data with orders of magnitude reduction in computational cost. As a result, after an initial computational investment for calibration, this approach can be used to quickly and accurately screen on the order of thousands of polymers for target applications. We also demonstrate that the (opto)electronic properties of the conjugated polymers show only a very minor variation when considering different conformers and that the results of high-throughput screening are therefore expected to be relatively insensitive with respect to the conformer search methodology applied.
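A minimal sketch of the calibration step described above: fit a linear model mapping xTB-derived values onto (TD-)DFT reference values for a training set of polymers, then apply it to new polymers. The numerical values below are placeholders, not data from the paper.

```python
# Linear calibration of low-cost xTB results against (TD-)DFT reference data,
# then prediction of near-DFT-quality values for new polymers.
import numpy as np

def calibrate(xtb_train, dft_train):
    """Least-squares slope/intercept mapping xTB values to DFT-quality values."""
    slope, intercept = np.polyfit(xtb_train, dft_train, deg=1)
    return slope, intercept

def predict(xtb_new, slope, intercept):
    return slope * np.asarray(xtb_new) + intercept

# Hypothetical optical-gap values (eV) for a small training set
xtb_gaps = np.array([2.9, 3.4, 2.1, 2.6, 3.0])
dft_gaps = np.array([2.4, 2.9, 1.7, 2.2, 2.5])
m, b = calibrate(xtb_gaps, dft_gaps)
print(predict([2.8, 3.2], m, b))   # calibrated estimates for two new polymers
```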
Speckle interferometry. Data acquisition and control for the SPID instrument.
NASA Astrophysics Data System (ADS)
Altarac, S.; Tallon, M.; Thiebaut, E.; Foy, R.
1998-08-01
SPID (SPeckle Imaging by Deconvolution) is a new speckle camera currently under construction at CRAL-Observatoire de Lyon. Its high spectral resolution and high image restoration capabilities open up new astrophysical programs. The instrument SPID is composed of four main optical modules which are fully automated and computer controlled by software written in Tcl/Tk/Tix and C. This software provides intelligent assistance to the user by choosing observational parameters as a function of atmospheric parameters, computed in real time, and the desired restored image quality. Data acquisition is made by a photon-counting detector (CP40). A VME-based computer under OS9 controls the detector and stores the data. The intelligent system runs under Linux on a PC. A slave PC under DOS commands the motors. These 3 computers communicate through an Ethernet network. SPID can be considered as a precursor for the VLT's (Very Large Telescope, four 8-meter telescopes currently being built in Chile by the European Southern Observatory) very high spatial resolution camera.
Radiological interpretation of images displayed on tablet computers: a systematic review
Armfield, N R; Smith, A C
2015-01-01
Objective: To review the published evidence and to determine if radiological diagnostic accuracy is compromised when images are displayed on a tablet computer and thereby inform practice on using tablet computers for radiological interpretation by on-call radiologists. Methods: We searched the PubMed and EMBASE databases for studies on the diagnostic accuracy or diagnostic reliability of images interpreted on tablet computers. Studies were screened for inclusion based on pre-determined inclusion and exclusion criteria. Studies were assessed for quality and risk of bias using Quality Appraisal of Diagnostic Reliability Studies or the revised Quality Assessment of Diagnostic Accuracy Studies tool. Treatment of studies was reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). Results: 11 studies met the inclusion criteria. 10 of these studies tested the Apple iPad® (Apple, Cupertino, CA). The included studies reported high sensitivity (84–98%), specificity (74–100%) and accuracy rates (98–100%) for radiological diagnosis. There was no statistically significant difference in accuracy between a tablet computer and a digital imaging and communication in medicine-calibrated control display. There was a near complete consensus from authors on the non-inferiority of diagnostic accuracy of images displayed on a tablet computer. All of the included studies were judged to be at risk of bias. Conclusion: Our findings suggest that the diagnostic accuracy of radiological interpretation is not compromised by using a tablet computer. This result is only relevant to the Apple iPad and to the modalities of CT, MRI and plain radiography. Advances in knowledge: The iPad may be appropriate for an on-call radiologist to use for radiological interpretation. PMID:25882691
Use of Cone Beam Computed Tomography in Endodontics
Scarfe, William C.; Levin, Martin D.; Gane, David; Farman, Allan G.
2009-01-01
Cone Beam Computed Tomography (CBCT) is a diagnostic imaging modality that provides high-quality, accurate three-dimensional (3D) representations of the osseous elements of the maxillofacial skeleton. CBCT systems are available that provide small field of view images at low dose with sufficient spatial resolution for applications in endodontic diagnosis, treatment guidance, and posttreatment evaluation. This article provides a literature review and pictorial demonstration of CBCT as an imaging adjunct for endodontics. PMID:20379362
Computer numeric control generation of toric surfaces
NASA Astrophysics Data System (ADS)
Bradley, Norman D.; Ball, Gary A.; Keller, John R.
1994-05-01
Until recently, the manufacture of toric ophthalmic lenses relied largely upon expensive, manual techniques for generation and polishing. Recent gains in computer numeric control (CNC) technology and tooling enable lens designers to employ single- point diamond, fly-cutting methods in the production of torics. Fly-cutting methods continue to improve, significantly expanding lens design possibilities while lowering production costs. Advantages of CNC fly cutting include precise control of surface geometry, rapid production with high throughput, and high-quality lens surface finishes requiring minimal polishing. As accessibility and affordability increase within the ophthalmic market, torics promise to dramatically expand lens design choices available to consumers.
Linguistic Features of Writing Quality
ERIC Educational Resources Information Center
McNamara, Danielle S.; Crossley, Scott A.; McCarthy, Philip M.
2010-01-01
In this study, a corpus of expert-graded essays, based on a standardized scoring rubric, is computationally evaluated so as to distinguish the differences between those essays that were rated as high and those rated as low. The automated tool, Coh-Metrix, is used to examine the degree to which high- and low-proficiency essays can be predicted by…
High quality chemical structure inventories provide the foundation of the U.S. EPA’s ToxCast and Tox21 projects, which are employing high-throughput technologies to screen thousands of chemicals in hundreds of biochemical and cell-based assays, probing a wide diversity of targets...
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
NASA Astrophysics Data System (ADS)
Yin, Leilei; Chen, Ying-Chieh; Gelb, Jeff; Stevenson, Darren M.; Braun, Paul A.
2010-09-01
High resolution x-ray computed tomography is a powerful non-destructive 3-D imaging method. It can offer superior resolution on objects that are opaque or low contrast for optical microscopy. Synchrotron-based x-ray computed tomography systems have been available for scientific research, but remain difficult for broader user communities to access. This work introduces a lab-based high-resolution x-ray nanotomography system with 50 nm resolution in absorption and Zernike phase contrast modes. Using this system, we have demonstrated high quality 3-D images of polymerized photonic crystals, which have been analyzed for band gap structures. The isotropic volumetric data show excellent consistency with other characterization results.
A Gateway for Phylogenetic Analysis Powered by Grid Computing Featuring GARLI 2.0
Bazinet, Adam L.; Zwickl, Derrick J.; Cummings, Michael P.
2014-01-01
We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. [garli, gateway, grid computing, maximum likelihood, molecular evolution portal, phylogenetics, web service.] PMID:24789072
NASA Astrophysics Data System (ADS)
Concepción, O.; Escobosa, A.; de Melo, O.
2018-03-01
Bismuth telluride (Bi2Te3), traditionally used in industry as a thermoelectric material, has attracted much attention recently due to its properties as a topological insulator, a kind of material that might have relevant applications in spintronics or quantum computing, among other innovative uses. The preparation of high-quality material has become a very important technological task. Here, we compare the preparation of Bi2Te3 by physical vapor transport from the evaporation of elemental Bi and Te sources, under either low pressure or atmospheric pressure. The layers were characterized by different techniques to evaluate their structural properties. As a result, it is concluded that, as a consequence of the different transport regimes, films grown at atmospheric pressure present better crystal quality.
Naval Research Lab Review 1999
1999-01-01
The Center offers high-quality output from computer-generated files in EPS, PostScript, PICT, TIFF, Photoshop, and PowerPoint, as well as photographic-quality color output. Additional information on the research described in this NRL Review can be obtained from the Public Affairs Office, Code 1230, (202) 767-2541.
Determining crystal structures through crowdsourcing and coursework
NASA Astrophysics Data System (ADS)
Horowitz, Scott; Koepnick, Brian; Martin, Raoul; Tymieniecki, Agnes; Winburn, Amanda A.; Cooper, Seth; Flatten, Jeff; Rogawski, David S.; Koropatkin, Nicole M.; Hailu, Tsinatkeab T.; Jain, Neha; Koldewey, Philipp; Ahlstrom, Logan S.; Chapman, Matthew R.; Sikkema, Andrew P.; Skiba, Meredith A.; Maloney, Finn P.; Beinlich, Felix R. M.; Caglar, Ahmet; Coral, Alan; Jensen, Alice Elizabeth; Lubow, Allen; Boitano, Amanda; Lisle, Amy Elizabeth; Maxwell, Andrew T.; Failer, Barb; Kaszubowski, Bartosz; Hrytsiv, Bohdan; Vincenzo, Brancaccio; de Melo Cruz, Breno Renan; McManus, Brian Joseph; Kestemont, Bruno; Vardeman, Carl; Comisky, Casey; Neilson, Catherine; Landers, Catherine R.; Ince, Christopher; Buske, Daniel Jon; Totonjian, Daniel; Copeland, David Marshall; Murray, David; Jagieła, Dawid; Janz, Dietmar; Wheeler, Douglas C.; Cali, Elie; Croze, Emmanuel; Rezae, Farah; Martin, Floyd Orville; Beecher, Gil; de Jong, Guido Alexander; Ykman, Guy; Feldmann, Harald; Chan, Hugo Paul Perez; Kovanecz, Istvan; Vasilchenko, Ivan; Connellan, James C.; Borman, Jami Lynne; Norrgard, Jane; Kanfer, Jebbie; Canfield, Jeffrey M.; Slone, Jesse David; Oh, Jimmy; Mitchell, Joanne; Bishop, John; Kroeger, John Douglas; Schinkler, Jonas; McLaughlin, Joseph; Brownlee, June M.; Bell, Justin; Fellbaum, Karl Willem; Harper, Kathleen; Abbey, Kirk J.; Isaksson, Lennart E.; Wei, Linda; Cummins, Lisa N.; Miller, Lori Anne; Bain, Lyn; Carpenter, Lynn; Desnouck, Maarten; Sharma, Manasa G.; Belcastro, Marcus; Szew, Martin; Szew, Martin; Britton, Matthew; Gaebel, Matthias; Power, Max; Cassidy, Michael; Pfützenreuter, Michael; Minett, Michele; Wesselingh, Michiel; Yi, Minjune; Cameron, Neil Haydn Tormey; Bolibruch, Nicholas I.; Benevides, Noah; Kathleen Kerr, Norah; Barlow, Nova; Crevits, Nykole Krystyne; Dunn, Paul; Silveira Belo Nascimento Roque, Paulo Sergio; Riber, Peter; Pikkanen, Petri; Shehzad, Raafay; Viosca, Randy; James Fraser, Robert; Leduc, Robert; Madala, Roman; Shnider, Scott; de Boisblanc, Sharon; Butkovich, Slava; Bliven, Spencer; Hettler, Stephen; Telehany, Stephen; Schwegmann, Steven A.; Parkes, Steven; Kleinfelter, Susan C.; Michael Holst, Sven; van der Laan, T. J. A.; Bausewein, Thomas; Simon, Vera; Pulley, Warwick; Hull, William; Kim, Annes Yukyung; Lawton, Alexis; Ruesch, Amanda; Sundar, Anjali; Lawrence, Anna-Lisa; Afrin, Antara; Maheshwer, Bhargavi; Turfe, Bilal; Huebner, Christian; Killeen, Courtney Elizabeth; Antebi-Lerrman, Dalia; Luan, Danny; Wolfe, Derek; Pham, Duc; Michewicz, Elaina; Hull, Elizabeth; Pardington, Emily; Galal, Galal Osama; Sun, Grace; Chen, Grace; Anderson, Halie E.; Chang, Jane; Hewlett, Jeffrey Thomas; Sterbenz, Jennifer; Lim, Jiho; Morof, Joshua; Lee, Junho; Inn, Juyoung Samuel; Hahm, Kaitlin; Roth, Kaitlin; Nair, Karun; Markin, Katherine; Schramm, Katie; Toni Eid, Kevin; Gam, Kristina; Murphy, Lisha; Yuan, Lucy; Kana, Lulia; Daboul, Lynn; Shammas, Mario Karam; Chason, Max; Sinan, Moaz; Andrew Tooley, Nicholas; Korakavi, Nisha; Comer, Patrick; Magur, Pragya; Savliwala, Quresh; Davison, Reid Michael; Sankaran, Roshun Rajiv; Lewe, Sam; Tamkus, Saule; Chen, Shirley; Harvey, Sho; Hwang, Sin Ye; Vatsia, Sohrab; Withrow, Stefan; Luther, Tahra K.; Manett, Taylor; Johnson, Thomas James; Ryan Brash, Timothy; Kuhlman, Wyatt; Park, Yeonjung; Popović, Zoran; Baker, David; Khatib, Firas; Bardwell, James C. A.
2016-09-01
We show here that computer game players can build high-quality crystal structures. Introduction of a new feature into the computer game Foldit allows players to build and real-space refine structures into electron density maps. To assess the usefulness of this feature, we held a crystallographic model-building competition between trained crystallographers, undergraduate students, Foldit players and automatic model-building algorithms. After removal of disordered residues, a team of Foldit players achieved the most accurate structure. Analysing the target protein of the competition, YPL067C, uncovered a new family of histidine triad proteins apparently involved in the prevention of amyloid toxicity. From this study, we conclude that crystallographers can utilize crowdsourcing to interpret electron density information and to produce structure solutions of the highest quality.
NASA Astrophysics Data System (ADS)
Shahiri, Amirah Mohamed; Husain, Wahidah; Rashid, Nur'Aini Abd
2017-10-01
The huge amount of data in educational datasets can make it difficult to produce quality data. Recently, data mining approaches have been increasingly used by educational data mining researchers to analyze data patterns. However, many research studies have concentrated on selecting suitable learning algorithms instead of performing a feature selection process. As a result, these data suffer from high computational complexity and require longer computation times for classification. The main objective of this research is to provide an overview of feature selection techniques that have been used to identify the most significant features. This research then proposes a framework to improve the quality of students' datasets. The proposed framework uses filter- and wrapper-based techniques to support the prediction process in future studies.
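A minimal sketch of the filter-then-wrapper idea behind the proposed framework, using scikit-learn components as stand-ins; the synthetic dataset, the choice of filter score, wrapper, classifier and feature counts are illustrative assumptions rather than the authors' implementation.

```python
# Filter step (univariate scoring) followed by a wrapper step (recursive
# feature elimination) before classification, evaluated by cross-validation.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, RFE
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)   # stand-in for a students' dataset

pipeline = Pipeline([
    ("filter", SelectKBest(f_classif, k=20)),                 # cheap filter step
    ("wrapper", RFE(LogisticRegression(max_iter=1000),        # wrapper step
                    n_features_to_select=8)),
    ("clf", LogisticRegression(max_iter=1000)),
])

print(cross_val_score(pipeline, X, y, cv=5).mean())
```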
Protein sequence annotation in the genome era: the annotation concept of SWISS-PROT+TREMBL.
Apweiler, R; Gateau, A; Contrino, S; Martin, M J; Junker, V; O'Donovan, C; Lang, F; Mitaritonna, N; Kappus, S; Bairoch, A
1997-01-01
SWISS-PROT is a curated protein sequence database which strives to provide a high level of annotation, a minimal level of redundancy and a high level of integration with other databases. Ongoing genome sequencing projects have dramatically increased the number of protein sequences to be incorporated into SWISS-PROT. Since we do not want to dilute the quality standards of SWISS-PROT by incorporating sequences without proper sequence analysis and annotation, we cannot speed up the incorporation of new incoming data indefinitely. However, as we also want to make the sequences available as fast as possible, we introduced TREMBL (TRanslation of EMBL nucleotide sequence database), a supplement to SWISS-PROT. TREMBL consists of computer-annotated entries in SWISS-PROT format derived from the translation of all coding sequences (CDS) in the EMBL nucleotide sequence database, except for CDS already included in SWISS-PROT. While TREMBL is already of immense value, its computer-generated annotation does not match the quality of SWISS-PROT's. The main difference is in the protein functional information attached to sequences. With this in mind, we are dedicating substantial effort to develop and apply computer methods to enhance the functional information attached to TREMBL entries.
Evaluation of the low dose cardiac CT imaging using ASIR technique
NASA Astrophysics Data System (ADS)
Fan, Jiahua; Hsieh, Jiang; Deubig, Amy; Sainath, Paavana; Crandall, Peter
2010-04-01
Today, cardiac imaging is one of the key driving forces for research and development in computed tomography (CT) imaging. It requires high spatial and temporal resolution and is often associated with a high radiation dose. The newly introduced ASIR technique presents an efficient method that offers dose reduction benefits while maintaining image quality and providing fast reconstruction speed. This paper discusses a study of the image quality of the ASIR technique for cardiac CT imaging. Phantom as well as clinical data were evaluated to demonstrate the effectiveness of the ASIR technique for cardiac CT applications.
Teach-Discover-Treat (TDT): Collaborative Computational Drug Discovery for Neglected Diseases
Jansen, Johanna M.; Cornell, Wendy; Tseng, Y. Jane; Amaro, Rommie E.
2012-01-01
Teach – Discover – Treat (TDT) is an initiative to promote the development and sharing of computational tools solicited through a competition with the aim to impact education and collaborative drug discovery for neglected diseases. Collaboration, multidisciplinary integration, and innovation are essential for successful drug discovery. This requires a workforce that is trained in state-of-the-art workflows and equipped with the ability to collaborate on platforms that are accessible and free. The TDT competition solicits high quality computational workflows for neglected disease targets, using freely available, open access tools. PMID:23085175
Aguiar, F C; Segurado, P; Urbanič, G; Cambra, J; Chauvin, C; Ciadamidaro, S; Dörflinger, G; Ferreira, J; Germ, M; Manolaki, P; Minciardi, M R; Munné, A; Papastergiadou, E; Ferreira, M T
2014-04-01
This paper presents a new methodological approach for intercalibrating national river quality methods when a common metric is lacking and most of the countries share the same Water Framework Directive (WFD) assessment method. We provide recommendations for similar work in the future concerning the assessment of ecological accuracy and highlight the importance of good common ground for making the scientific work beyond the intercalibration feasible. The approach presented herein was applied to highly seasonal rivers of the Mediterranean Geographical Intercalibration Group for the Biological Quality Element Macrophytes. The Mediterranean group for river macrophytes involved seven countries and two assessment methods with similar data acquisition and assessment concepts: the Macrophyte Biological Index for Rivers (IBMR) for Cyprus, France, Greece, Italy, Portugal and Spain, and the River Macrophyte Index (RMI) for Slovenia. The database included 318 sites, of which 78 were considered benchmarks. The boundary harmonization was performed for common WFD-assessment methods (all countries except Slovenia) using the median of the Good/Moderate and High/Good boundaries of all countries. Then, whenever possible, the Slovenian method (RMI) was computed for the entire database. The IBMR was also computed for the Slovenian sites and was regressed against RMI in order to check the relatedness of the methods (R(2)=0.45; p<0.00001) and to convert RMI boundaries into the IBMR scale. The boundary bias of RMI was computed using direct comparison of classification and the median boundary values following boundary harmonization. The average absolute class difference after harmonization is 26%, and the percentage of classifications differing by half of a quality class is also small (16.4%). This multi-step approach to the intercalibration was endorsed by the WFD Regulatory Committee. © 2013 Elsevier B.V. All rights reserved.
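A small sketch of the harmonization arithmetic described above: the common High/Good and Good/Moderate boundaries are taken as the median of the national boundaries on the IBMR scale, and the Slovenian RMI boundaries are translated onto the IBMR scale with the fitted linear regression. All boundary values and regression coefficients below are placeholders, not the intercalibration results.

```python
# Median-based boundary harmonization plus regression-based scale conversion.
import numpy as np

# Hypothetical national class boundaries on the IBMR EQR scale
high_good = {"CY": 0.92, "FR": 0.93, "GR": 0.90, "IT": 0.94, "PT": 0.92, "ES": 0.91}
good_mod  = {"CY": 0.71, "FR": 0.70, "GR": 0.68, "IT": 0.72, "PT": 0.69, "ES": 0.70}

harmonized_hg = np.median(list(high_good.values()))
harmonized_gm = np.median(list(good_mod.values()))

# Hypothetical IBMR = a * RMI + b regression (the paper reports R2 = 0.45)
a, b = 0.85, 0.10
rmi_boundaries = {"High/Good": 0.90, "Good/Moderate": 0.70}
ibmr_equivalents = {name: a * value + b for name, value in rmi_boundaries.items()}

print(harmonized_hg, harmonized_gm, ibmr_equivalents)
```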
Unity Power Factor Operated PFC Converter Based Power Supply for Computers
NASA Astrophysics Data System (ADS)
Singh, Shikha; Singh, Bhim; Bhuvaneswari, G.; Bist, Vashist
2017-11-01
Power supplies (PSs) employed in personal computers pollute the single-phase ac mains by drawing distorted current at a substandard power factor (PF). The harmonic distortion of the supply current in these personal computers is observed to be 75% to 90%, with a very high crest factor (CF), which escalates losses in the distribution system. To find a tangible solution to these issues, a non-isolated PFC converter is employed at the input of the isolated converter; it improves the input power quality while regulating the dc voltage at its output. This output feeds the isolated stage, which yields completely isolated and tightly regulated multiple output voltages, the prime requirement of a computer PS. The operation of the proposed PS is evaluated under various operating conditions, and the results show improved performance with nearly unity PF and low input current harmonics. A prototype of this PS was developed in a laboratory environment, and the recorded test results corroborate the power quality improvement observed in simulation under various operating conditions.
Use of handheld computers in clinical practice: a systematic review.
Mickan, Sharon; Atherton, Helen; Roberts, Nia Wyn; Heneghan, Carl; Tilson, Julie K
2014-07-06
Many healthcare professionals use smartphones and tablets to inform patient care. Contemporary research suggests that handheld computers may support aspects of clinical diagnosis and management. This systematic review was designed to synthesise high quality evidence to answer the question: Does healthcare professionals' use of handheld computers improve their access to information and support clinical decision making at the point of care? A detailed search was conducted using Cochrane, MEDLINE, EMBASE, PsycINFO, Science and Social Science Citation Indices since 2001. Interventions promoting healthcare professionals seeking information or making clinical decisions using handheld computers were included. Classroom learning and the use of laptop computers were excluded. Two authors independently selected studies, assessed quality using the Cochrane Risk of Bias tool and extracted data. High levels of data heterogeneity negated statistical synthesis. Instead, evidence for effectiveness was summarised narratively, according to each study's aim for assessing the impact of handheld computer use. We included seven randomised trials investigating medical or nursing staffs' use of Personal Digital Assistants. Effectiveness was demonstrated across three distinct functions that emerged from the data: accessing information for clinical knowledge, adherence to guidelines and diagnostic decision making. When healthcare professionals used handheld computers to access clinical information, their knowledge improved significantly more than peers who used paper resources. When clinical guideline recommendations were presented on handheld computers, clinicians made significantly safer prescribing decisions and adhered more closely to recommendations than peers using paper resources. Finally, healthcare professionals made significantly more appropriate diagnostic decisions using clinical decision making tools on handheld computers compared to colleagues who did not have access to these tools. For these clinical decisions, the numbers needed to test/screen were all less than 11. Healthcare professionals' use of handheld computers may improve their information seeking, adherence to guidelines and clinical decision making. Handheld computers can provide real time access to and analysis of clinical information. The integration of clinical decision support systems within handheld computers offers clinicians the highest level of synthesised evidence at the point of care. Future research is needed to replicate these early results and to identify beneficial clinical outcomes.
NASA Technical Reports Server (NTRS)
Kuczmarski, Maria A.; Neudeck, Philip G.
2000-01-01
Most solid-state electronic devices (diodes, transistors, and integrated circuits) are based on silicon. Although this material works well for many applications, its properties limit its ability to function under extreme high-temperature or high-power operating conditions. Silicon carbide (SiC), with its desirable physical properties, could someday replace silicon for these types of applications. A major roadblock to realizing this potential is the quality of SiC material that can currently be produced. Semiconductors require very uniform, high-quality material, and commercially available SiC tends to suffer from defects in the crystalline structure that have largely been eliminated in silicon. In some power circuits, these defects can focus energy into an extremely small area, leading to overheating that can damage the device. In an effort to better understand the way that these defects affect the electrical performance and reliability of a SiC device in a power circuit, the NASA Glenn Research Center at Lewis Field began an in-house three-dimensional computational modeling effort. The goal is to predict the temperature distributions within a SiC diode structure subjected to the various transient overvoltage breakdown stresses that occur in power management circuits. A commercial computational fluid dynamics computer program (FLUENT; Fluent, Inc., Lebanon, New Hampshire) was used to build a model of a defect-free SiC diode and generate a computational mesh. A typical breakdown power density was applied over 0.5 msec in a heated layer at the junction between the p-type SiC and n-type SiC, and the temperature distribution throughout the diode was then calculated. The peak temperature extracted from the computational model agreed well (within 6 percent) with previous first-order calculations of the maximum expected temperature at the end of the breakdown pulse. This level of agreement is excellent for a model of this type and indicates that three-dimensional computational modeling can provide useful predictions for this class of problem. The model is now being extended to include the effects of crystal defects. The model will provide unique insights into how high the temperature rises in the vicinity of the defects in a diode at various power densities and pulse durations. This information will also help researchers understand and design SiC devices for safe and reliable operation in high-power circuits.
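The modeling step described above deposits a breakdown power density in a thin layer at the p-n junction for 0.5 ms and computes the resulting temperature field. The snippet below is a deliberately simplified one-dimensional explicit finite-difference sketch of that idea, not the three-dimensional FLUENT model; the 4H-SiC material constants, slab thickness, heating density and fixed-temperature boundaries are rough assumptions, and the grid is kept coarse so the explicit scheme finishes in a few seconds.

```python
# 1-D transient heat conduction with a volumetrically heated junction layer.
import numpy as np

k, rho, cp = 370.0, 3210.0, 690.0        # W/m-K, kg/m^3, J/kg-K (approx. SiC)
alpha = k / (rho * cp)

length, n = 20e-6, 40                    # 20 um slab, 40 cells (coarse on purpose)
dx = length / n
dt = 0.4 * dx**2 / alpha                 # below the explicit stability limit
pulse, q_vol = 0.5e-3, 1e16              # 0.5 ms pulse, assumed heating in W/m^3

T = np.full(n, 300.0)                    # initial temperature (K)
junction = slice(n // 2 - 2, n // 2 + 2) # thin heated layer at mid-depth

t = 0.0
while t < pulse:
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * alpha * lap[1:-1]    # diffusion update
    T[junction] += dt * q_vol / (rho * cp)  # heat deposited at the junction
    T[0] = T[-1] = 300.0                 # fixed-temperature boundaries
    t += dt

print(f"peak temperature at end of pulse: {T.max():.1f} K")
```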
MODELS-3/CMAQ APPLICATIONS WHICH ILLUSTRATE CAPABILITY AND FUNCTIONALITY
The Models-3/CMAQ developed by the U.S. Environmental Protection Agency (USEPA) is a third-generation multiscale, multi-pollutant air quality modeling system within a high-level, object-oriented computer framework (Models-3). It has been available to the scientific community ...
Desktop Publishing for Counselors.
ERIC Educational Resources Information Center
Lucking, Robert; Mitchum, Nancy
1990-01-01
Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes that, by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes that computers present a way of streamlining the communications of a counseling…
Forming n/p Junctions With An Excimer Laser
NASA Technical Reports Server (NTRS)
Alexander, Paul, Jr.; Campbell, Robert B.; Wong, David C.; Bottenberg, William L.; Byron, Stanley
1988-01-01
Compact equipment yields high-quality solar cells. Computer controls pulses of excimer laser and movement of silicon wafer. Mirrors direct laser beam to wafer. Lenses focus beam to small spot on surface. Process suitable for silicon made by dendritic-web-growth process.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Science, Space and Technology.
The integration of television into a digital framework makes possible the merger of television and computers. Development of a digital system will permit the consumer to receive television and computer images on the same screen at a quality approaching 35mm film. If fiber optic telecommunications lines are linked to the home and standards are…
Fault tolerant computer control for a Maglev transportation system
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Nagle, Gail A.; Anagnostopoulos, George
1994-01-01
Magnetically levitated (Maglev) vehicles operating on dedicated guideways at speeds of 500 km/hr are an emerging transportation alternative to short-haul air and high-speed rail. They have the potential to offer a service significantly more dependable than air and with less operating cost than both air and high-speed rail. Maglev transportation derives these benefits by using magnetic forces to suspend a vehicle 8 to 200 mm above the guideway. Magnetic forces are also used for propulsion and guidance. The combination of high speed, short headways, stringent ride quality requirements, and a distributed offboard propulsion system necessitates high levels of automation for the Maglev control and operation. Very high levels of safety and availability will be required for the Maglev control system. This paper describes the mission scenario, functional requirements, and dependability and performance requirements of the Maglev command, control, and communications system. A distributed hierarchical architecture consisting of vehicle on-board computers, wayside zone computers, a central computer facility, and communication links between these entities was synthesized to meet the functional and dependability requirements on the maglev. Two variations of the basic architecture are described: the Smart Vehicle Architecture (SVA) and the Zone Control Architecture (ZCA). Preliminary dependability modeling results are also presented.
Soft bilateral filtering volumetric shadows using cube shadow maps
Ali, Hatam H.; Sunar, Mohd Shahrizal; Kolivand, Hoshang
2017-01-01
Volumetric shadows often increase the realism of rendered scenes in computer graphics. Typical volumetric shadow techniques do not provide a smooth transition effect in real time while preserving crisp boundaries. This research presents a new technique for generating high-quality volumetric shadows by sampling and interpolation. In contrast to the conventional ray marching method, which is time-consuming, the proposed technique adopts downsampling when calculating the ray marching. Furthermore, light scattering is computed in a high dynamic range buffer to generate tone mapping. Bilateral interpolation is used along the view rays to smooth the transition of volumetric shadows while preserving edges. In addition, the technique applies a cube shadow map to create multiple shadows. The contribution of this technique is to reduce the number of sample points used in evaluating light scattering and to introduce bilateral interpolation to improve volumetric shadows, significantly mitigating the inherent deficiencies of shadow maps. The technique produces soft volumetric shadows with good performance and high quality, showing its potential for interactive applications. PMID:28632740
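A CPU-side sketch (not the paper's GPU shader code) of one half of the combination described above: the scattering term is assumed to have been ray-marched already into a downsampled buffer, and it is then upsampled with edge-preserving bilateral weights guided by the full-resolution depth buffer. The buffer layout, filter radius and sigma values are assumptions.

```python
# Joint bilateral upsampling of a low-resolution scattering buffer.
import numpy as np

def joint_bilateral_upsample(low_res, depth_full, scale=4,
                             sigma_spatial=2.0, sigma_depth=0.1):
    """low_res: downsampled scattering buffer matching depth_full[::scale, ::scale]."""
    h, w = depth_full.shape
    out = np.zeros((h, w))
    depth_low = depth_full[::scale, ::scale]
    r = 2                                              # filter radius in low-res texels
    for y in range(h):
        for x in range(w):
            ly, lx = y // scale, x // scale
            acc = wsum = 0.0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    sy = min(max(ly + dy, 0), depth_low.shape[0] - 1)
                    sx = min(max(lx + dx, 0), depth_low.shape[1] - 1)
                    w_s = np.exp(-(dy * dy + dx * dx) / (2 * sigma_spatial**2))
                    dz = depth_full[y, x] - depth_low[sy, sx]
                    w_d = np.exp(-(dz * dz) / (2 * sigma_depth**2))  # edge-preserving
                    acc += w_s * w_d * low_res[sy, sx]
                    wsum += w_s * w_d
            out[y, x] = acc / wsum
    return out
```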
Atropos: specific, sensitive, and speedy trimming of sequencing reads.
Didion, John P; Martin, Marcel; Collins, Francis S
2017-01-01
A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos makes it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos.
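Atropos's own algorithms and command line are not reproduced here; the toy function below merely illustrates the two pre-processing operations the abstract refers to, 3' adapter removal and sliding-window quality trimming, so the role of read trimming is concrete. The quality cutoff and window size are illustrative choices.

```python
# Toy adapter removal plus sliding-window quality trimming of one read.
def trim_read(seq, quals, adapter, qual_cutoff=20, window=4):
    """Return (trimmed_seq, trimmed_quals)."""
    # 1) cut at the first exact adapter occurrence (real trimmers allow mismatches)
    idx = seq.find(adapter)
    if idx != -1:
        seq, quals = seq[:idx], quals[:idx]
    # 2) trim the 3' end once the mean quality in a sliding window drops
    end = len(seq)
    for i in range(len(seq) - window + 1):
        if sum(quals[i:i + window]) / window < qual_cutoff:
            end = i
            break
    return seq[:end], quals[:end]

read  = "ACGTACGTTTAGGC" + "AGATCGGAAGAGC"   # insert followed by adapter start
quals = [38] * 18 + [12, 10, 9] + [30] * 6
print(trim_read(read, quals, adapter="AGATCGGAAGAGC"))
```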
Asadollahi, Aziz; Khazanovich, Lev
2018-04-11
The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.
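The paper's contribution is an analytical way of obtaining the source and receiver wavefields; the sketch below shows only the final reverse-time-migration step those wavefields feed, the zero-lag cross-correlation imaging condition, with random arrays standing in for real wavefields. Array sizes and names are assumptions.

```python
# Zero-lag cross-correlation imaging condition used in reverse time migration.
import numpy as np

def rtm_image(src_wavefield, rcv_wavefield):
    """I(z, x) = sum over time of S(t, z, x) * R(t, z, x)."""
    return np.sum(src_wavefield * rcv_wavefield, axis=0)

nt, nz, nx = 400, 64, 64
S = np.random.rand(nt, nz, nx)   # forward-propagated source wavefield (stand-in)
R = np.random.rand(nt, nz, nx)   # back-propagated receiver wavefield (stand-in)
image = rtm_image(S, R)
print(image.shape)               # (64, 64) reflectivity-like image
```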
Atropos: specific, sensitive, and speedy trimming of sequencing reads
Collins, Francis S.
2017-01-01
A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos makes it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos. PMID:28875074
Experimental entanglement purification of arbitrary unknown states.
Pan, Jian-Wei; Gasparoni, Sara; Ursin, Rupert; Weihs, Gregor; Zeilinger, Anton
2003-05-22
Distribution of entangled states between distant locations is essential for quantum communication over large distances. But owing to unavoidable decoherence in the quantum communication channel, the quality of entangled states generally decreases exponentially with the channel length. Entanglement purification--a way to extract a subset of states of high entanglement and high purity from a large set of less entangled states--is thus needed to overcome decoherence. Besides its important application in quantum communication, entanglement purification also plays a crucial role in error correction for quantum computation, because it can significantly increase the quality of logic operations between different qubits. Here we demonstrate entanglement purification for general mixed states of polarization-entangled photons using only linear optics. Typically, one photon pair of fidelity 92% could be obtained from two pairs, each of fidelity 75%. In our experiments, decoherence is overcome to the extent that the technique would achieve tolerable error rates for quantum repeaters in long-distance quantum communication. Our results also imply that the requirement of high-accuracy logic operations in fault-tolerant quantum computation can be considerably relaxed.
Early Citability of Data vs Peer-Review like Data Publishing Procedures
NASA Astrophysics Data System (ADS)
Stockhause, Martina; Höck, Heinke; Toussaint, Frank; Lautenschlager, Michael
2014-05-01
The World Data Center for Climate (WDCC) hosted at the German Climate Computing Center (DKRZ) was one of the first data centers which established a peer-review like data publication procedure resulting in DataCite DOIs. Data in the long-term archive (LTA) are diligently reviewed by data managers and data authors to guarantee high quality and wide reusability of the published data. This traditional data publication procedure for LTA data bearing DOIs is very time-consuming, especially for WDCC's high volumes of climate model data on the order of multiple TBytes. Data are shared with project members and selected scientists months before the data are long-term archived. The scientific community analyses and thus reviews the data, leading to data quality improvements. Scientists wish to cite these unstable data in scientific publications before the long-term archiving and the thorough data review process are finalized. A concept for early preprint DOIs for shared but not yet long-term archived data is presented. Requirements on data documentation, persistence and quality, and use cases for preprint DOIs within the data life cycle, are discussed, as well as the questions of how to document the differences between the two DOI types and how to relate them to each other, with the recommendation to use LTA DOIs in citations. WDCC wants to offer an additional user service for early citation of data of basic quality without compromising the LTA DOIs, i.e. WDCC's standard DOIs, as a trustworthy indicator of high-quality data. Referencing Links: World Data Center for Climate (WDCC): http://www.wdc-climate.de German Climate Computing Center (DKRZ): http://www.dkrz.de DataCite: http://datacite.org
Matsutani, Hideyuki; Sano, Tomonari; Kondo, Takeshi; Fujimoto, Shinichiro; Sekine, Takako; Arai, Takehiro; Morita, Hitomi; Takase, Shinichi
2010-12-20
The high radiation dose associated with 64 multidetector-row computed tomography (64-MDCT) is a major concern for physicians and patients alike. A new 320-row area detector computed tomography (ADCT) scanner can image the entire heart in one rotation (0.35 s) without requiring helical scanning. As such, ADCT is expected to reduce the radiation dose. We studied the image quality and radiation dose of ADCT compared to those of 64-MDCT in patients with a low heart rate (HR≤60). Three hundred eighty-five consecutive patients underwent 64-MDCT and 379 patients, ADCT. Patients with an arrhythmia were excluded. Prospective ECG-gated helical scanning with a high helical pitch (FlashScan) was used for 64-MDCT, and prospective ECG-gated conventional one-beat scanning for 320-ADCT. Image quality was visually evaluated by an image quality score. Radiation dose was estimated by DLP (mGy・cm) for 64-MDCT and DLP.e (mGy・cm) for 320-ADCT. The radiation dose of 320-ADCT (208±48 mGy・cm) was significantly (P<0.0001) lower than that of 64-MDCT (484±112 mGy・cm), and the image quality score of 320-ADCT (3.0±0.2) was significantly (P=0.0011) higher than that of 64-MDCT (2.9±0.4). The scan time of 320-ADCT (1.4±0.1 s) was also significantly (P<0.0001) shorter than that of 64-MDCT (6.8±0.6 s). Compared to 64-MDCT, 320-ADCT achieves not only a reduction in radiation dose but also superior image quality and a shorter scan time.
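The study reports dose as DLP; for context, effective dose is often estimated by multiplying DLP by a region-specific conversion coefficient. The short calculation below uses a commonly quoted chest coefficient of about 0.014 mSv per mGy·cm, which is not necessarily the coefficient the authors applied.

```python
# Rough effective-dose estimate from the reported mean DLP values.
K_CHEST = 0.014  # mSv per mGy*cm, common literature value for chest CT

for name, dlp in [("320-ADCT", 208), ("64-MDCT", 484)]:
    print(f"{name}: DLP = {dlp} mGy*cm -> ~{dlp * K_CHEST:.1f} mSv effective dose")
```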
ESR/ERS white paper on lung cancer screening
Bonomo, Lorenzo; Gaga, Mina; Nackaerts, Kristiaan; Peled, Nir; Prokop, Mathias; Remy-Jardin, Martine; von Stackelberg, Oyunbileg; Sculier, Jean-Paul
2015-01-01
Lung cancer is the most frequently fatal cancer, with poor survival once the disease is advanced. Annual low dose computed tomography has shown a survival benefit in screening individuals at high risk for lung cancer. Based on the available evidence, the European Society of Radiology and the European Respiratory Society recommend lung cancer screening in comprehensive, quality-assured, longitudinal programmes within a clinical trial or in routine clinical practice at certified multidisciplinary medical centres. Minimum requirements include: standardised operating procedures for low dose image acquisition, computer-assisted nodule evaluation, and positive screening results and their management; inclusion/exclusion criteria; expectation management; and smoking cessation programmes. Further refinements are recommended to increase quality, outcome and cost-effectiveness of lung cancer screening: inclusion of risk models, reduction of effective radiation dose, computer-assisted volumetric measurements and assessment of comorbidities (chronic obstructive pulmonary disease and vascular calcification). All these requirements should be adjusted to the regional infrastructure and healthcare system, in order to exactly define eligibility using a risk model, nodule management and quality assurance plan. The establishment of a central registry, including biobank and image bank, and preferably on a European level, is strongly encouraged. PMID:25929956
Computer users' ergonomics and quality of life - evidence from a developing country.
Ahmed, Ishfaq; Shaukat, Muhammad Zeeshan
2018-06-01
This study is aimed at investigating the quality of workplace ergonomics at various Pakistani organizations and the quality of life of computer users working in these organizations. Two hundred and thirty-five computer users (only those employees who have to do most of their job tasks on a computer or laptop, and at their office) responded by filling in a questionnaire covering questions on workplace ergonomics and quality of life. Findings of the study revealed that the ergonomics at those organizations were poor and unfavourable. The quality of life (both physical and mental health) of respondents was poor for employees who had an unfavourable ergonomic environment. The findings thus highlight an important issue prevalent in Pakistani work settings.
Affordable and accurate large-scale hybrid-functional calculations on GPU-accelerated supercomputers
NASA Astrophysics Data System (ADS)
Ratcliff, Laura E.; Degomme, A.; Flores-Livas, José A.; Goedecker, Stefan; Genovese, Luigi
2018-03-01
Performing high accuracy hybrid functional calculations for condensed matter systems containing a large number of atoms is at present computationally very demanding or even out of reach if high quality basis sets are used. We present a highly optimized multiple graphics processing unit implementation of the exact exchange operator which allows one to perform fast hybrid functional density-functional theory (DFT) calculations with systematic basis sets without additional approximations for up to a thousand atoms. With this method hybrid DFT calculations of high quality become accessible on state-of-the-art supercomputers within a time-to-solution that is of the same order of magnitude as traditional semilocal-GGA functionals. The method is implemented in a portable open-source library.
Hwang, I-Shyan
2017-01-01
The K-coverage configuration that guarantees coverage of each location by at least K sensors is highly popular and is extensively used to monitor diversified applications in wireless sensor networks. Long network lifetime and high detection quality are the essentials of such K-covered sleep-scheduling algorithms. However, the existing sleep-scheduling algorithms either cause high cost or cannot preserve the detection quality effectively. In this paper, the Pre-Scheduling-based K-coverage Group Scheduling (PSKGS) and Self-Organized K-coverage Scheduling (SKS) algorithms are proposed to settle the problems in the existing sleep-scheduling algorithms. Simulation results show that our pre-scheduled-based KGS approach enhances the detection quality and network lifetime, whereas the self-organized-based SKS algorithm minimizes the computation and communication cost of the nodes and thereby is energy efficient. Besides, SKS outperforms PSKGS in terms of network lifetime and detection quality as it is self-organized. PMID:29257078
Wire EDM for Refractory Materials
NASA Technical Reports Server (NTRS)
Zellars, G. R.; Harris, F. E.; Lowell, C. E.; Pollman, W. M.; Rys, V. J.; Wills, R. J.
1982-01-01
In an attempt to reduce fabrication time and costs, the Wire Electrical Discharge Machine (Wire EDM) method was investigated as a tool for fabricating matched blade roots and disk slots. Eight high-strength nickel-base superalloys were used. The computer-controlled Wire EDM technique provided high quality surfaces with excellent dimensional tolerances. The Wire EDM method offers potential for substantial reductions in fabrication costs for "hard to machine" alloys and electrically conductive materials in specific high-precision applications.
Volume reconstruction optimization for tomo-PIV algorithms applied to experimental data
NASA Astrophysics Data System (ADS)
Martins, Fabio J. W. A.; Foucaut, Jean-Marc; Thomas, Lionel; Azevedo, Luis F. A.; Stanislas, Michel
2015-08-01
Tomographic PIV is a three-component volumetric velocity measurement technique based on the tomographic reconstruction of a particle distribution imaged by multiple camera views. In essence, the performance and accuracy of this technique are highly dependent on the parametric adjustment and the reconstruction algorithm used. Although synthetic data have been widely employed to optimize experiments, the resulting reconstructed volumes might not have optimal quality. The purpose of the present study is to offer quality indicators that can be applied to data samples in order to improve the quality of velocity results obtained by the tomo-PIV technique. The proposed methodology can potentially lead to a significant reduction in the time required to optimize a tomo-PIV reconstruction, while also yielding better-quality velocity results. Tomo-PIV data provided by a six-camera turbulent boundary-layer experiment were used to optimize the reconstruction algorithms according to this methodology. Velocity statistics obtained by the optimized BIMART, SMART and MART algorithms were compared with hot-wire anemometer data, and velocity measurement uncertainties were computed. Results indicated that the BIMART and SMART algorithms produced reconstructed volumes of quality equivalent to that of the standard MART, with the benefit of reduced computational time.
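Since the abstract above compares MART-family reconstruction algorithms, a minimal sketch of the textbook multiplicative ART update may help situate them: a non-negative intensity field E is corrected pixel by pixel so that its projections match the recorded images I ≈ W E. This is only the basic MART iteration on a toy random weighting matrix, not the BIMART/SMART implementations evaluated in the paper.

```python
import numpy as np

def mart(W, I, n_iter=30, mu=1.0, eps=1e-12):
    """Basic multiplicative ART: reconstruct non-negative E from I ≈ W @ E.
    W[i, j] is the weight of voxel j along pixel i's line of sight."""
    E = np.ones(W.shape[1])                        # positive initial guess
    for _ in range(n_iter):
        for i in range(W.shape[0]):                # one correction per camera pixel
            proj = W[i] @ E + eps
            E *= (I[i] / proj) ** (mu * W[i])      # multiplicative update
    return E

rng = np.random.default_rng(0)
E_true = rng.random(50)                                      # toy voxel intensities
W = rng.random((200, 50)) * (rng.random((200, 50)) < 0.2)    # sparse toy weights
I = W @ E_true                                               # noiseless projections
E_rec = mart(W, I)
print("relative error:", np.linalg.norm(E_rec - E_true) / np.linalg.norm(E_true))
```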
An improved multi-exposure approach for high quality holographic femtosecond laser patterning
NASA Astrophysics Data System (ADS)
Zhang, Chenchu; Hu, Yanlei; Li, Jiawen; Lao, Zhaoxin; Ni, Jincheng; Chu, Jiaru; Huang, Wenhao; Wu, Dong
2014-12-01
High-efficiency two-photon polymerization through single exposure via a spatial light modulator (SLM) has been used to decrease fabrication time and rapidly realize various micro/nanostructures, but surface quality remains a major problem due to the speckle noise of the optical intensity distribution at the defocused plane. Here, a multi-exposure approach in which tens of computer-generated holograms are successively loaded on the SLM is presented to significantly improve the optical uniformity without losing efficiency. By applying multi-exposure, we found that the uniformity at the defocused plane increased from ˜0.02 to ˜0.6 according to our simulation. The two series of letters "HELLO" and "USTC" fabricated under single- and multi-exposure in our experiment also verified that the surface quality was greatly improved. Moreover, with this method several kinds of high-quality beam splitters, e.g., 2 × 2 and 5 × 5 Dammann gratings and complex non-separable 5 × 5 gratings, were fabricated with both high quality and short fabrication time (<1 min, 95% time saving). This multi-exposure SLM two-photon polymerization method shows promise for rapidly fabricating and integrating various binary optical devices and their systems.
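A rough, hedged illustration of why accumulating several exposures smooths SLM speckle (not a reproduction of the paper's hologram-generation procedure): the sketch below sums N statistically independent, fully developed speckle intensity patterns and reports the speckle contrast, which is expected to fall roughly as 1/sqrt(N). The pattern size and exposure counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

def speckle_intensity(shape):
    """One fully developed speckle pattern: |circular complex Gaussian field|^2."""
    field = rng.normal(size=shape) + 1j * rng.normal(size=shape)
    return np.abs(field) ** 2

def contrast(intensity):
    return intensity.std() / intensity.mean()   # lower contrast = more uniform

shape = (256, 256)
for n_exposures in (1, 4, 16, 64):
    accumulated = sum(speckle_intensity(shape) for _ in range(n_exposures))
    print(f"{n_exposures:3d} exposures -> contrast {contrast(accumulated):.3f}")
```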
High Density Aerial Image Matching: State-of-the-Art and Future Prospects
NASA Astrophysics Data System (ADS)
Haala, N.; Cavegn, S.
2016-06-01
Ongoing innovations in matching algorithms are continuously improving the quality of geometric surface representations generated automatically from aerial images. This development motivated the launch of the joint ISPRS/EuroSDR project "Benchmark on High Density Aerial Image Matching", which aims at the evaluation of photogrammetric 3D data capture in view of the current developments in dense multi-view stereo-image matching. Originally, the test aimed at image-based DSM computation from conventional aerial image flights for different land-use and image block configurations. The second phase then put an additional focus on high-quality, high-resolution 3D geometric data capture in complex urban areas. This includes both the extension of the test scenario to oblique aerial image flights and the generation of filtered point clouds as an additional output of the respective multi-view reconstruction. The paper uses the preliminary outcomes of the benchmark to demonstrate the state of the art in airborne image matching, with a special focus on high-quality geometric data capture in urban scenarios.
The relationship between computer games and quality of life in adolescents.
Dolatabadi, Nayereh Kasiri; Eslami, Ahmad Ali; Mostafavi, Firooze; Hassanzade, Akbar; Moradi, Azam
2013-01-01
Playing computer games is growing rapidly among teenagers. This popular phenomenon can cause physical and psychosocial problems. Therefore, this study examined the relationship between computer games and quality of life domains in adolescents aged 12-15 years. In a cross-sectional study using the 2-stage stratified cluster sampling method, 444 male and female students in Borkhar were selected. The data collection tool consisted of 1) the World Health Organization Quality Of Life - BREF questionnaire and 2) a personal information questionnaire. The data were analyzed by Pearson correlation, Spearman correlation, chi-square, independent t-tests and analysis of covariance. The total mean score of quality of life in students was 67.11±13.34. The results showed a significant relationship between the age of starting to play games and the overall quality of life score and its four domains (range r=-0.13 to -0.18). The mean overall quality of life score in computer game users was 68.27±13.03, while it was 64.81±13.69 among those who did not play computer games, and the difference was significant (P=0.01). There were significant differences in the environmental and mental health domains between the two groups (P<0.05). However, there was no significant relationship between BMI and the time spent on or the type of computer games. Playing computer games for a short time under parental supervision can have positive effects on quality of life in adolescents. However, spending long hours playing computer games may have negative long-term effects.
Computer use, symptoms, and quality of life.
Hayes, John R; Sheedy, James E; Stelmack, Joan A; Heaney, Catherine A
2007-08-01
To model the effects of computer use on reported visual and physical symptoms and to measure the effects upon quality of life measures. A survey of 1000 university employees (70.5% adjusted response rate) assessed visual and physical symptoms, job, physical and mental demands, ability to control/influence work, amount of work at a computer, computer work environment, relations with others at work, life and job satisfaction, and quality of life. Data were analyzed to determine whether self-reported eye symptoms are associated with perceived quality of life. The study also explored the factors that are associated with eye symptoms. Structural equation modeling and multiple regression analyses were used to assess the hypotheses. Seventy percent of the employees used some form of vision correction during computer use, 2.9% used glasses specifically prescribed for computer use, and 8% had had refractive surgery. Employees spent an average of 6 h per day at the computer. In a multiple regression framework, the latent variable eye symptoms was significantly associated with a composite quality of life variable (p = 0.02) after adjusting for job quality, job satisfaction, supervisor relations, co-worker relations, mental and physical load of the job, and job demand. Age and gender were not significantly associated with symptoms. After adjusting for age, gender, ergonomics, hours at the computer, and exercise, eye symptoms were significantly associated with physical symptoms (p < 0.001) accounting for 48% of the variance. Environmental variability at work was associated with eye symptoms and eye symptoms demonstrated a significant impact on quality of life and physical symptoms.
Liu, Ping; Li, Guodong; Liu, Xinggao; Xiao, Long; Wang, Yalin; Yang, Chunhua; Gui, Weihua
2018-02-01
A high-quality control method is essential for the implementation of an aircraft autopilot system. An optimal control problem model considering the safe aerodynamic envelope is therefore established to improve the control quality of aircraft flight level tracking. A novel non-uniform control vector parameterization (CVP) method with time grid refinement is then proposed for solving the optimal control problem. By introducing Hilbert-Huang transform (HHT) analysis, an efficient time grid refinement approach is presented and an adaptive time grid is automatically obtained. With this refinement, the proposed method needs fewer optimization parameters to achieve better control quality than the uniform-refinement CVP method, while the computational cost is lower. Two well-known flight level altitude tracking problems and one minimum time cost problem are tested as illustrations, with the uniform-refinement control vector parameterization method adopted as the comparative baseline. Numerical results show that the proposed method achieves better performance in terms of optimization accuracy and computational cost; meanwhile, the control quality is efficiently improved. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
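The control vector parameterization idea referenced above can be sketched on a toy problem: the control is held piecewise constant on a time grid (uniform or locally refined) and the segment values are optimized numerically. The first-order plant, the reference, the two grids, the small control penalty and the Nelder-Mead optimizer below are all made-up stand-ins, not the aircraft flight-level model or the HHT-based refinement of the paper.

```python
import numpy as np
from scipy.optimize import minimize

def simulate(u_seg, grid, tau=2.0, x0=0.0, dt=0.01):
    """First-order lag dx/dt = (u - x)/tau driven by a piecewise-constant
    control that takes value u_seg[i] on [grid[i], grid[i+1])."""
    ts = np.arange(0.0, grid[-1], dt)
    seg = np.clip(np.searchsorted(grid, ts, side="right") - 1, 0, len(u_seg) - 1)
    x = np.empty_like(ts)
    x[0] = x0
    for k in range(1, len(ts)):
        x[k] = x[k - 1] + dt * (u_seg[seg[k - 1]] - x[k - 1]) / tau
    return ts, x

def tracking_cost(u_seg, grid, ref=1.0, dt=0.01):
    _, x = simulate(u_seg, grid, dt=dt)
    # integrated squared tracking error plus a mild control penalty
    return float(np.sum((x - ref) ** 2) * dt + 1e-3 * np.sum(np.square(u_seg)))

uniform = np.linspace(0.0, 10.0, 6)                                     # 5 equal segments
refined = np.concatenate([np.linspace(0.0, 2.0, 5), [4.0, 7.0, 10.0]])  # finer grid early on
for name, grid in [("uniform", uniform), ("refined", refined)]:
    n = len(grid) - 1
    res = minimize(tracking_cost, np.zeros(n), args=(grid,), method="Nelder-Mead")
    print(f"{name}: {n} segments, cost {res.fun:.4f}")
```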
Quality indexing with computer-aided lexicography
NASA Technical Reports Server (NTRS)
Buchan, Ronald L.
1992-01-01
Indexing with computers is a far cry from indexing with the first indexing tool, the manual card sorter. With the aid of computer-aided lexicography, both indexing and indexing tools can provide standardization, consistency, and accuracy, resulting in greater quality control than ever before. A brief survey of computer activity in indexing is presented with detailed illustrations from NASA activity. Applications from techniques mentioned, such as Retrospective Indexing (RI), can be made to many indexing systems. In addition to improving the quality of indexing with computers, the improved efficiency with which certain tasks can be done is demonstrated.
Resource Provisioning in SLA-Based Cluster Computing
NASA Astrophysics Data System (ADS)
Xiong, Kaiqi; Suh, Sang
Cluster computing is excellent for parallel computation and has become increasingly popular. In cluster computing, a service level agreement (SLA) is a set of quality of service (QoS) requirements and a fee agreed between a customer and an application service provider; it plays an important role in e-business applications. An application service provider uses a set of cluster computing resources to support e-business applications subject to an SLA. In this paper, the QoS includes percentile response time and cluster utilization. We present an approach for resource provisioning in such an environment that minimizes the total cost of the cluster computing resources used by the application service provider for an e-business application, which often requires parallel computation for high service performance, availability, and reliability, while satisfying the QoS and the fee negotiated between the customer and the provider. Simulation experiments demonstrate the applicability of the approach.
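A hedged sketch of the kind of provisioning decision described above: choose the smallest number of identical servers whose simulated 95th-percentile response time meets an SLA target. The M/M/c queue, the FCFS discipline, and all rates and targets are illustrative assumptions, not the queueing model or cost structure of the paper.

```python
import numpy as np

def percentile_response(arrival_rate, service_rate, c, n_jobs=50_000, q=95, seed=0):
    """Simulate an FCFS M/M/c queue and return the q-th percentile response time."""
    rng = np.random.default_rng(seed)
    arrivals = np.cumsum(rng.exponential(1.0 / arrival_rate, n_jobs))
    services = rng.exponential(1.0 / service_rate, n_jobs)
    free_at = np.zeros(c)                      # time at which each server next frees up
    resp = np.empty(n_jobs)
    for k in range(n_jobs):
        i = np.argmin(free_at)                 # earliest available server (FCFS)
        start = max(arrivals[k], free_at[i])
        free_at[i] = start + services[k]
        resp[k] = free_at[i] - arrivals[k]
    return np.percentile(resp, q)

def min_servers(arrival_rate, service_rate, target, q=95):
    c = int(np.ceil(arrival_rate / service_rate)) + 1    # start just above stability
    while percentile_response(arrival_rate, service_rate, c, q=q) > target:
        c += 1
    return c

# e.g. 40 requests/s, mean service time 0.2 s, SLA: 95th percentile under 0.5 s
print("servers needed:", min_servers(40.0, 5.0, target=0.5))
```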
Quantitative evaluation of 3D images produced from computer-generated holograms
NASA Astrophysics Data System (ADS)
Sheerin, David T.; Mason, Ian R.; Cameron, Colin D.; Payne, Douglas A.; Slinger, Christopher W.
1999-08-01
Advances in computing and optical modulation techniques now make it possible to anticipate the generation of near real-time, reconfigurable, high quality, three-dimensional images using holographic methods. Computer generated holography (CGH) is the only technique which holds promise of producing synthetic images having the full range of visual depth cues. These realistic images will be viewable by several users simultaneously, without the need for headtracking or special glasses. Such a data visualization tool will be key to speeding up the manufacture of new commercial and military equipment by negating the need for the production of physical 3D models in the design phase. DERA Malvern has been involved in designing and testing fixed CGH in order to understand the connection between the complexity of the CGH, the algorithms used to design them, the processes employed in their implementation and the quality of the images produced. This poster describes results from CGH containing up to 10^8 pixels. The methods used to evaluate the reconstructed images are discussed and quantitative measures of image fidelity made. An understanding of the effect of the various system parameters upon final image quality enables a study of the possible system trade-offs to be carried out. Such an understanding of CGH production and resulting image quality is key to effective implementation of a reconfigurable CGH system currently under development at DERA.
Classification of dried vegetables using computer image analysis and artificial neural networks
NASA Astrophysics Data System (ADS)
Koszela, K.; Łukomski, M.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Zaborowicz, M.; Wojcieszak, D.
2017-07-01
In recent years, there has been a continuously increasing demand for vegetables and dried vegetables. This trend supports the growth of the dehydration industry in Poland, helping to exploit excess production. Dried vegetables are used more and more often in various sectors of the food industry, both because of their high nutritional qualities and because of changes in consumers' food preferences. As consumer awareness of a healthy lifestyle grows and health food booms, the consumption of such food increases, which means that production and crop area can increase further. Among dried vegetables, dried carrots play a strategic role due to their wide application range and high nutritional value. They contain high concentrations of carotene and sugar, which is present in the form of crystals. Carrots are also the vegetables most often subjected to a wide range of dehydration processes, which makes it difficult to perform a reliable qualitative assessment and classification of the dried product. The many qualitative properties of dried carrots that determine a positive or negative quality assessment include colour and shape. The aim of the research was to develop and implement a model of a computer system for the recognition and classification of freeze-dried, convection-dried and microwave-vacuum-dried products using the methods of computer image analysis and artificial neural networks.
NASA Astrophysics Data System (ADS)
Cheng, Tian-Le; Ma, Fengde D.; Zhou, Jie E.; Jennings, Guy; Ren, Yang; Jin, Yongmei M.; Wang, Yu U.
2012-01-01
Diffuse scattering contains rich information on various structural disorders, thus providing a useful means to study the nanoscale structural deviations from the average crystal structures determined by Bragg peak analysis. Extraction of maximal information from diffuse scattering requires concerted efforts in high-quality three-dimensional (3D) data measurement, quantitative data analysis and visualization, theoretical interpretation, and computer simulations. Such an endeavor is undertaken to study the correlated dynamic atomic position fluctuations caused by thermal vibrations (phonons) in the precursor state of shape-memory alloys. High-quality 3D diffuse scattering intensity data around representative Bragg peaks are collected by using in situ high-energy synchrotron x-ray diffraction and a two-dimensional digital x-ray detector (image plate). Computational algorithms and codes are developed to construct the 3D reciprocal-space map of the diffuse scattering intensity distribution from the measured data, which is further visualized and quantitatively analyzed to reveal in situ physical behaviors. The diffuse scattering intensity distribution is explicitly formulated in terms of atomic position fluctuations to interpret the experimental observations and identify the most relevant physical mechanisms, which help set up reduced structural models with minimal parameters to be efficiently determined by computer simulations. These combined procedures are demonstrated by a study of the phonon softening phenomenon in the precursor state and premartensitic transformation of a Ni-Mn-Ga shape-memory alloy.
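A hedged sketch of one step mentioned in the abstract above, re-binning measured detector intensities into a 3D reciprocal-space intensity map, assuming the scattering vector q has already been computed for every detector pixel and frame; the geometry, orientation-matrix and correction steps of the actual pipeline are omitted, and the synthetic data and peak position are invented.

```python
import numpy as np

def bin_to_reciprocal_map(q, intensity, bins=64, q_range=None):
    """Accumulate intensities into a 3D reciprocal-space grid.

    q         : (N, 3) scattering vectors (e.g. in reciprocal-lattice units)
    intensity : (N,) measured intensities for the same pixels
    Returns the mean intensity per voxel and the bin edges."""
    if q_range is None:
        q_range = [(q[:, d].min(), q[:, d].max()) for d in range(3)]
    sums, edges = np.histogramdd(q, bins=bins, range=q_range, weights=intensity)
    counts, _ = np.histogramdd(q, bins=bins, range=q_range)
    with np.errstate(invalid="ignore"):
        mean_map = np.where(counts > 0, sums / counts, np.nan)
    return mean_map, edges

# toy demonstration with synthetic data around a (220)-like position
rng = np.random.default_rng(3)
q = rng.uniform(-0.5, 0.5, size=(200_000, 3)) + np.array([2.0, 2.0, 0.0])
I = np.exp(-np.sum((q - [2.0, 2.0, 0.0]) ** 2, axis=1) / 0.02)   # smooth diffuse blob
volume, _ = bin_to_reciprocal_map(q, I, bins=32)
print(volume.shape, np.nanmax(volume))
```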
Software metrics: Software quality metrics for distributed systems. [reliability engineering
NASA Technical Reports Server (NTRS)
Post, J. V.
1981-01-01
Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.
Short-term forecasting tools for agricultural nutrient management
USDA-ARS?s Scientific Manuscript database
The advent of real time/short term farm management tools is motivated by the need to protect water quality above and beyond the general guidance offered by existing nutrient management plans. Advances in high performance computing and hydrologic/climate modeling have enabled rapid dissemination of ...
The CompTox Chemistry Dashboard - A Community Data Resource for Environmental Chemistry
Despite an abundance of online databases providing access to chemical data, there is increasing demand for high-quality, structure-curated, open data to meet the various needs of the environmental sciences and computational toxicology communities. The U.S. Environmental Protectio...
Evaluating Computer-Generated Domain-Oriented Vocabularies.
ERIC Educational Resources Information Center
Damerau, Fred J.
1990-01-01
Discusses methods for automatically compiling domain-oriented vocabularies in natural language systems and describes techniques for evaluating the quality of the resulting word lists. A study is described that used subject headings from Grolier's Encyclopedia and the United Press International newswire, and filters for removing high frequency…
Midlands Teaching Factory, LTD.
ERIC Educational Resources Information Center
Midlands Technical Coll., Columbia, SC.
In 1987, Midlands Technical College (MTC), in Columbia, South Carolina, initiated a Computer Integrated Manufacturing (CIM) project, the Midlands Teaching Factory, LTD, which integrated various college departments with the goal of manufacturing a high quality, saleable product. The faculty developed a teaching factory model which was designed to…
Analysis and Design of Bridgeless Switched Mode Power Supply for Computers
NASA Astrophysics Data System (ADS)
Singh, S.; Bhuvaneswari, G.; Singh, B.
2014-09-01
Switched mode power supplies (SMPSs) used in computers need multiple isolated and stiffly regulated output dc voltages with different current ratings. These isolated multiple output dc voltages are obtained by using a multi-winding high frequency transformer (HFT). A half-bridge dc-dc converter is used here for obtaining different isolated and well regulated dc voltages. In the front end, non-isolated Single Ended Primary Inductance Converters (SEPICs) are added to improve the power quality in terms of low input current harmonics and high power factor (PF). Two non-isolated SEPICs are connected in a way to completely eliminate the need of single-phase diode-bridge rectifier at the front end. Output dc voltages at both the non-isolated and isolated stages are controlled and regulated separately for power quality improvement. A voltage mode control approach is used in the non-isolated SEPIC stage for simple and effective control whereas average current control is used in the second isolated stage.
NASA Technical Reports Server (NTRS)
Schumann, H. H. (Principal Investigator)
1972-01-01
The author has identified the following significant results. Preliminary analysis of DCS data from the USGS Verde River streamflow measuring site indicates the DCS system is furnishing high quality data more frequently than had been expected. During the 43-day period between Nov. 3 and Dec. 15, 1972, 552 DCS transmissions were received during 193 data passes. The amount of data received far exceeded the single high quality transmission per 12-hour period expected from the DCS system. The digital-parallel ERTS-1 data has furnished sufficient data to accurately compute mean daily gage heights. These, in turn, are used to compute average daily streamflow rates during periods of stable or slowly changing flow conditions. The digital-parallel data has also furnished useful information during peak flow periods. However, the serial-digital DCS capability, currently under development for transmitting streamflow data, should provide data of greater utility for determining the times of flood peaks.
Al-Shahi, R; Sadler, M; Rees, G; Bateman, D
2002-01-01
The growing use of email and the world wide web (WWW), by the public, academics, and clinicians—as well as the increasing availability of high quality information on the WWW—make a working knowledge of the internet important. Although this article aims to enhance readers' existing use of the internet and medical resources on the WWW, it is also intelligible to someone unfamiliar with the internet. A web browser is one of the central pieces of software in modern computing: it is a window on the WWW, file transfer protocol sites, networked newsgroups, and your own computer's files. Effective use of the internet for professional purposes requires an understanding of the best strategies to search the WWW and the mechanisms for ensuring secure data transfer, as well as a compendium of online resources including journals, textbooks, medical portals, and sites providing high quality patient information. This article summarises these resources, available to incorporate into your web browser as downloadable "Favorites" or "Bookmarks" from www.jnnp.com, where there are also freely accessible hypertext links to the recommended sites. PMID:12438460
PRaVDA: High Energy Physics towards proton Computed Tomography
NASA Astrophysics Data System (ADS)
Price, T.; PRaVDA Consortium
2016-07-01
Proton radiotherapy is an increasingly popular modality for treating cancers of the head and neck, and in paediatrics. To maximise the potential of proton radiotherapy it is essential to know the distribution, and more importantly the proton stopping powers, of the body tissues between the proton beam and the tumour. A stopping power map could be measured directly, and uncertainties in the treatment vastly reduced, if the patient were imaged with protons instead of conventional x-rays. Here we outline the application of technologies developed for High Energy Physics to provide clinical-quality proton Computed Tomography, thereby reducing range uncertainties and enhancing the treatment of cancer.
Automated image quality assessment for chest CT scans.
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2018-02-01
Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
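A hedged, simplified sketch of the kind of measurement the preceding abstract describes: given masks for homogeneous regions of a CT scan, report the mean HU (calibration) and standard deviation (noise) in each region against nominal reference values. The nominal HU values, the pre-existing masks and the toy volume are assumptions; the paper's automated segmentation and scanner profiling are not reproduced.

```python
import numpy as np

# nominal CT numbers used as calibration references (illustrative values only)
NOMINAL_HU = {"external_air": -1000.0, "aorta_blood": 40.0}

def region_quality(volume_hu, masks, nominal=NOMINAL_HU):
    """volume_hu: 3D array of CT numbers; masks: dict of name -> boolean mask."""
    report = {}
    for name, mask in masks.items():
        values = volume_hu[mask]
        report[name] = {
            "mean_HU": float(values.mean()),                # calibration estimate
            "noise_SD": float(values.std(ddof=1)),          # noise estimate
            "calibration_error": float(values.mean() - nominal[name]),
        }
    return report

# toy volume with two labelled homogeneous regions
rng = np.random.default_rng(5)
vol = rng.normal(40.0, 12.0, size=(40, 64, 64))             # "blood-like" region
labels = np.zeros(vol.shape, dtype=int)
labels[:, :20, :] = 1                                       # pretend external air
vol[labels == 1] = rng.normal(-998.0, 8.0, size=(labels == 1).sum())
masks = {"external_air": labels == 1, "aorta_blood": labels == 0}
for region, stats in region_quality(vol, masks).items():
    print(region, stats)
```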
Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira
2013-04-26
Developing a water quality index, which is used to convert a water quality dataset into a single number, is the most important task of most water quality monitoring programmes. Because the design of a water quality index depends on local conditions, it is not feasible to introduce a single definitive index that reveals the water quality level everywhere. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate the calculation of a water quality index based on dynamic weight factors, which helps users compute the index in cases where some parameters are missing from the dataset. A dataset containing 735 samples of drinking water quality from different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
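The dynamic-weight idea can be sketched as follows: each parameter gets a sub-index score and a weight, and when a parameter is missing its weight is dropped and the remaining weights renormalized. The parameters, weights and linear sub-index functions below are invented placeholders, not the formulas implemented in IWQIS.

```python
from math import isnan

# illustrative weights and linear scoring ranges (best, worst) per parameter
WEIGHTS = {"pH": 0.2, "turbidity": 0.25, "nitrate": 0.3, "hardness": 0.25}
RANGES = {"pH": (7.0, 9.5), "turbidity": (0.0, 25.0),
          "nitrate": (0.0, 50.0), "hardness": (100.0, 500.0)}

def sub_index(param, value):
    """Map a raw measurement to a 0-100 score (100 = best), clipped to its range."""
    best, worst = RANGES[param]
    frac = (value - best) / (worst - best)
    return 100.0 * (1.0 - min(max(frac, 0.0), 1.0))

def wqi(sample):
    """Weighted-mean index over the parameters actually present in `sample`."""
    present = {p: v for p, v in sample.items()
               if p in WEIGHTS and v is not None and not isnan(v)}
    if not present:
        raise ValueError("no usable parameters in sample")
    total_w = sum(WEIGHTS[p] for p in present)               # dynamic renormalization
    return sum(WEIGHTS[p] / total_w * sub_index(p, v) for p, v in present.items())

print(wqi({"pH": 7.4, "turbidity": 3.0, "nitrate": 12.0, "hardness": 220.0}))
print(wqi({"pH": 7.4, "nitrate": 12.0}))   # turbidity and hardness missing
```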
ERIC Educational Resources Information Center
Cheema, Jehanzeb R.; Zhang, Bo
2013-01-01
This study looked at the effect of both quantity and quality of computer use on achievement. The Program for International Student Assessment (PISA) 2003 student survey comprising of 4,356 students (boys, n = 2,129; girls, n = 2,227) was used to predict academic achievement from quantity and quality of computer use while controlling for…
Inexact hardware for modelling weather & climate
NASA Astrophysics Data System (ADS)
Düben, Peter D.; McNamara, Hugh; Palmer, Tim
2014-05-01
The use of stochastic processing hardware and low-precision arithmetic in atmospheric models is investigated. Stochastic processors allow hardware-induced faults in calculations, sacrificing exact calculations in exchange for improvements in performance, potentially in accuracy, and a reduction in power consumption. A similar trade-off is achieved using low-precision arithmetic, with improvements in computation and communication speed and savings in storage and memory requirements. As high-performance computing becomes more massively parallel and power intensive, these two approaches may be important stepping stones in the pursuit of global cloud-resolving atmospheric modelling. The impact of both hardware-induced faults and low-precision arithmetic is tested in the dynamical core of a global atmosphere model. Our simulations show that both approaches to inexact calculations do not substantially affect the quality of the model simulations, provided they are restricted to act only on the smaller scales. This suggests that inexact calculations at the small scale could reduce computation and power costs without adversely affecting the quality of the simulations.
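A hedged sketch of the low-precision idea restricted to small scales: emulate a reduced-precision significand by rounding, and apply it only to the high-wavenumber part of a field while keeping the large scales in full precision. The 10-bit significand, the cutoff wavenumber and the 1D test field are arbitrary choices, not the configuration used in the paper's dynamical core.

```python
import numpy as np

def round_significand(x, bits):
    """Emulate a reduced-precision significand by rounding to `bits` bits."""
    m, e = np.frexp(x)                      # x = m * 2**e with 0.5 <= |m| < 1
    return np.ldexp(np.round(m * 2**bits) / 2**bits, e)

def mixed_precision(field, bits=10, cutoff_frac=0.2):
    """Keep low wavenumbers exact; round the small-scale part to low precision."""
    spec = np.fft.rfft(field)
    k = np.arange(spec.size)
    cutoff = int(cutoff_frac * spec.size)
    small_scales = np.where(k >= cutoff, spec, 0.0)
    large_scales = spec - small_scales
    small_lowprec = (round_significand(small_scales.real, bits)
                     + 1j * round_significand(small_scales.imag, bits))
    return np.fft.irfft(large_scales + small_lowprec, n=field.size)

x = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
field = (np.sin(3 * x) + 0.1 * np.sin(40 * x)
         + 0.01 * np.random.default_rng(2).normal(size=x.size))
approx = mixed_precision(field)
print("max abs error:", np.max(np.abs(approx - field)))
```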
NASA Astrophysics Data System (ADS)
Hannachi, Ammar; Kohler, Sophie; Lallement, Alex; Hirsch, Ernest
2015-04-01
3D modeling of scene contents is of increasing importance for many computer vision based applications. In particular, industrial applications of computer vision require efficient tools for computing this 3D information. Stereo-vision is routinely used as a powerful technique to obtain the 3D outline of imaged objects from the corresponding 2D images; as a consequence, it provides only a partial description of the scene contents. On the other hand, structured light based reconstruction techniques can often compute the 3D surfaces of imaged objects with high accuracy, but the resulting active range data fail to characterize the object edges. Thus, in order to benefit from the strengths of both acquisition techniques, we introduce in this paper approaches enabling complete 3D reconstruction based on the cooperation of two complementary acquisition and processing techniques, in our case stereoscopic and structured light based methods, which provide two 3D data sets describing respectively the outlines and the surfaces of the imaged objects. We present the principles of three fusion techniques and compare them using evaluation criteria related to the nature of the workpiece and the type of application addressed. The proposed fusion methods rely on geometric characteristics of the workpiece, which favour the quality of the registration. The results obtained demonstrate that the developed approaches are well adapted to 3D modeling of manufactured parts including free-form surfaces and, consequently, to quality control applications using these 3D reconstructions.
Li, T; Zhao, S; Liu, J; Yang, L; Huang, Z; Li, J; Luo, C; Li, X
2017-10-01
To investigate the use of second-generation dual-source high-pitch computed tomography in obtaining confident diagnostic image quality using a low radiation dose in young patients with congenital heart disease (CHD). From July 2014 to June 2016, 50 consecutive children <4 years with complex CHD underwent electrocardiography (ECG)-triggered dual-source computed tomography (CT). The patients were assigned randomly to two groups: high-pitch (pitch 3.4) spiral dual-source CT acquisition (group A) and retrospectively spiral dual-source CT acquisition (group B). The image quality, diagnostic accuracy, coronary artery origin, course demonstration, and radiation exposure were compared between the two groups. Fifty examinations were performed (group A, 25; group B, 25). There were no significant differences in image quality, diagnostic accuracy, coronary artery origin, and course demonstration between the two groups. The image quality scores were 1.3±0.4 in group A and 1.1±0.3 in group B (p=0.2). The diagnostic accuracy was 100% in both groups. The coronary arteries were traceable in 80% in group A and 84% in group B (p=0.7). A single coronary artery was identified in one case in group A and the left anterior descending (LAD) branch originated from the right coronary artery (RCA) in one case in group B. There were significant differences in the effective doses between the two groups (0.40±0.20 mSv in group A and 2.7±1.0 mSv in group B, p<0.05). Intra-cardiac and extra-cardiac malformation, coronary artery origin, and course malformation can be visualised clearly using a high-pitch ECG-triggered dual-source CT with a low radiation dose and good image quality in patients with CHD. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Quality improving techniques for free-viewpoint DIBR
NASA Astrophysics Data System (ADS)
Do, Luat; Zinger, Sveta; de With, Peter H. N.
2010-02-01
Interactive free-viewpoint selection applied to a 3D multi-view signal is a possible attractive feature of the rapidly developing 3D TV media. This paper explores a new rendering algorithm that computes a free-viewpoint based on depth image warping between two reference views from existing cameras. We have developed three quality enhancing techniques that specifically aim at solving the major artifacts. First, resampling artifacts are filled in by a combination of median filtering and inverse warping. Second, contour artifacts are processed while omitting warping of edges at high discontinuities. Third, we employ a depth signal for more accurate disocclusion inpainting. We obtain an average PSNR gain of 3 dB and 4.5 dB for the 'Breakdancers' and 'Ballet' sequences, respectively, compared to recently published results. While experimenting with synthetic data, we observe that the rendering quality is highly dependent on the complexity of the scene. Moreover, experiments are performed using compressed video from surrounding cameras. The overall system quality is dominated by the rendering quality and not by coding.
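As a hedged companion to the rendering pipeline described above, the sketch below fills small disocclusion/resampling holes in a warped view with the median of valid neighbours and reports PSNR against a reference image. It is a generic crack-filling illustration on synthetic data, not the paper's combined median-filtering/inverse-warping or depth-based inpainting scheme.

```python
import numpy as np
from scipy.ndimage import generic_filter

def fill_holes_median(image, hole_mask, size=3):
    """Replace hole pixels by the median of valid pixels in a size x size window."""
    work = image.astype(float).copy()
    work[hole_mask] = np.nan
    def nanmedian_or_keep(window):
        centre = window[window.size // 2]
        if not np.isnan(centre):
            return centre
        valid = window[~np.isnan(window)]
        return np.median(valid) if valid.size else 0.0
    return generic_filter(work, nanmedian_or_keep, size=size, mode="nearest")

def psnr(ref, test, peak=255.0):
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak**2 / mse)

# smooth synthetic view with sparse "resampling cracks"
yy, xx = np.mgrid[0:120, 0:160]
reference = 128.0 + 100.0 * np.sin(xx / 15.0) * np.cos(yy / 20.0)
rng = np.random.default_rng(4)
holes = rng.random(reference.shape) < 0.02
warped = reference.copy()
warped[holes] = 0.0
print("PSNR before filling:", round(psnr(reference, warped), 2))
print("PSNR after  filling:", round(psnr(reference, fill_holes_median(warped, holes)), 2))
```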
Determining crystal structures through crowdsourcing and coursework
Horowitz, Scott; Koepnick, Brian; Martin, Raoul; Tymieniecki, Agnes; Winburn, Amanda A.; Cooper, Seth; Flatten, Jeff; Rogawski, David S.; Koropatkin, Nicole M.; Hailu, Tsinatkeab T.; Jain, Neha; Koldewey, Philipp; Ahlstrom, Logan S.; Chapman, Matthew R.; Sikkema, Andrew P.; Skiba, Meredith A.; Maloney, Finn P.; Beinlich, Felix R. M.; Caglar, Ahmet; Coral, Alan; Jensen, Alice Elizabeth; Lubow, Allen; Boitano, Amanda; Lisle, Amy Elizabeth; Maxwell, Andrew T.; Failer, Barb; Kaszubowski, Bartosz; Hrytsiv, Bohdan; Vincenzo, Brancaccio; de Melo Cruz, Breno Renan; McManus, Brian Joseph; Kestemont, Bruno; Vardeman, Carl; Comisky, Casey; Neilson, Catherine; Landers, Catherine R.; Ince, Christopher; Buske, Daniel Jon; Totonjian, Daniel; Copeland, David Marshall; Murray, David; Jagieła, Dawid; Janz, Dietmar; Wheeler, Douglas C.; Cali, Elie; Croze, Emmanuel; Rezae, Farah; Martin, Floyd Orville; Beecher, Gil; de Jong, Guido Alexander; Ykman, Guy; Feldmann, Harald; Chan, Hugo Paul Perez; Kovanecz, Istvan; Vasilchenko, Ivan; Connellan, James C.; Borman, Jami Lynne; Norrgard, Jane; Kanfer, Jebbie; Canfield, Jeffrey M.; Slone, Jesse David; Oh, Jimmy; Mitchell, Joanne; Bishop, John; Kroeger, John Douglas; Schinkler, Jonas; McLaughlin, Joseph; Brownlee, June M.; Bell, Justin; Fellbaum, Karl Willem; Harper, Kathleen; Abbey, Kirk J.; Isaksson, Lennart E.; Wei, Linda; Cummins, Lisa N.; Miller, Lori Anne; Bain, Lyn; Carpenter, Lynn; Desnouck, Maarten; Sharma, Manasa G.; Belcastro, Marcus; Szew, Martin; Szew, Martin; Britton, Matthew; Gaebel, Matthias; Power, Max; Cassidy, Michael; Pfützenreuter, Michael; Minett, Michele; Wesselingh, Michiel; Yi, Minjune; Cameron, Neil Haydn Tormey; Bolibruch, Nicholas I.; Benevides, Noah; Kathleen Kerr, Norah; Barlow, Nova; Crevits, Nykole Krystyne; Dunn, Paul; Roque, Paulo Sergio Silveira Belo Nascimento; Riber, Peter; Pikkanen, Petri; Shehzad, Raafay; Viosca, Randy; James Fraser, Robert; Leduc, Robert; Madala, Roman; Shnider, Scott; de Boisblanc, Sharon; Butkovich, Slava; Bliven, Spencer; Hettler, Stephen; Telehany, Stephen; Schwegmann, Steven A.; Parkes, Steven; Kleinfelter, Susan C.; Michael Holst, Sven; van der Laan, T. J. A.; Bausewein, Thomas; Simon, Vera; Pulley, Warwick; Hull, William; Kim, Annes Yukyung; Lawton, Alexis; Ruesch, Amanda; Sundar, Anjali; Lawrence, Anna-Lisa; Afrin, Antara; Maheshwer, Bhargavi; Turfe, Bilal; Huebner, Christian; Killeen, Courtney Elizabeth; Antebi-Lerrman, Dalia; Luan, Danny; Wolfe, Derek; Pham, Duc; Michewicz, Elaina; Hull, Elizabeth; Pardington, Emily; Galal, Galal Osama; Sun, Grace; Chen, Grace; Anderson, Halie E.; Chang, Jane; Hewlett, Jeffrey Thomas; Sterbenz, Jennifer; Lim, Jiho; Morof, Joshua; Lee, Junho; Inn, Juyoung Samuel; Hahm, Kaitlin; Roth, Kaitlin; Nair, Karun; Markin, Katherine; Schramm, Katie; Toni Eid, Kevin; Gam, Kristina; Murphy, Lisha; Yuan, Lucy; Kana, Lulia; Daboul, Lynn; Shammas, Mario Karam; Chason, Max; Sinan, Moaz; Andrew Tooley, Nicholas; Korakavi, Nisha; Comer, Patrick; Magur, Pragya; Savliwala, Quresh; Davison, Reid Michael; Sankaran, Roshun Rajiv; Lewe, Sam; Tamkus, Saule; Chen, Shirley; Harvey, Sho; Hwang, Sin Ye; Vatsia, Sohrab; Withrow, Stefan; Luther, Tahra K; Manett, Taylor; Johnson, Thomas James; Ryan Brash, Timothy; Kuhlman, Wyatt; Park, Yeonjung; Popović, Zoran; Baker, David; Khatib, Firas; Bardwell, James C. A.
2016-01-01
We show here that computer game players can build high-quality crystal structures. Introduction of a new feature into the computer game Foldit allows players to build and real-space refine structures into electron density maps. To assess the usefulness of this feature, we held a crystallographic model-building competition between trained crystallographers, undergraduate students, Foldit players and automatic model-building algorithms. After removal of disordered residues, a team of Foldit players achieved the most accurate structure. Analysing the target protein of the competition, YPL067C, uncovered a new family of histidine triad proteins apparently involved in the prevention of amyloid toxicity. From this study, we conclude that crystallographers can utilize crowdsourcing to interpret electron density information and to produce structure solutions of the highest quality. PMID:27633552
An analytical study of electric vehicle handling dynamics
NASA Technical Reports Server (NTRS)
Greene, J. E.; Segal, D. J.
1979-01-01
Hypothetical electric vehicle configurations were studied by applying available analytical methods. Elementary linearized models were used in addition to a highly sophisticated vehicle dynamics computer simulation technique. Physical properties of specific EV's were defined for various battery and powertrain packaging approaches applied to a range of weight distribution and inertial properties which characterize a generic class of EV's. Computer simulations of structured maneuvers were performed for predicting handling qualities in the normal driving range and during various extreme conditions related to accident avoidance. Results indicate that an EV with forward weight bias will possess handling qualities superior to a comparable EV that is rear-heavy or equally balanced. The importance of properly matching tires, suspension systems, and brake system front/rear torque proportioning to a given EV configuration during the design stage is demonstrated.
Mulshine, James L; Avila, Rick; Yankelevitz, David; Baer, Thomas M; Estépar, Raul San Jose; Ambrose, Laurie Fenton; Aldigé, Carolyn R
2015-05-01
The Prevent Cancer Foundation Lung Cancer Workshop XI: Tobacco-Induced Disease: Advances in Policy, Early Detection and Management was held in New York, NY on May 16 and 17, 2014. The two goals of the Workshop were to define strategies to drive innovation in precompetitive quantitative research on the use of imaging to assess new therapies for management of early lung cancer and to discuss a process to implement a national program to provide high quality computed tomography imaging for lung cancer and other tobacco-induced disease. With the central importance of computed tomography imaging for both early detection and volumetric lung cancer assessment, strategic issues around the development of imaging and ensuring its quality are critical to ensure continued progress against this most lethal cancer.
Real-time structured light intraoral 3D measurement pipeline
NASA Astrophysics Data System (ADS)
Gheorghe, Radu; Tchouprakov, Andrei; Sokolov, Roman
2013-02-01
Computer aided design and manufacturing (CAD/CAM) is increasingly becoming a standard feature and service provided to patients in dentist offices and denture manufacturing laboratories. Although the quality of the tools and data has slowly improved in recent years, practical, accurate, in-vivo, real-time acquisition and processing of high quality 3D data still needs improvement due to various surface measurement challenges. Advances in GPU computational power have allowed near real-time 3D intraoral in-vivo scanning of patients' teeth. In this paper we explore, from a real-time perspective, a hardware-software-GPU solution that addresses all the requirements mentioned before. Moreover, we exemplify and quantify the hard and soft deadlines required by such a system and illustrate how they are supported in our implementation.
Encoder fault analysis system based on Moire fringe error signal
NASA Astrophysics Data System (ADS)
Gao, Xu; Chen, Wei; Wan, Qiu-hua; Lu, Xin-ran; Xie, Chun-yu
2018-02-01
To address faults and erroneous codes occurring in the practical application of photoelectric shaft encoders, a fast and accurate encoder fault analysis system is investigated from the standpoint of Moire fringe photoelectric signal processing. A DSP28335 is selected as the core processor, a high-speed serial A/D converter acquisition card is used, and a temperature-measuring circuit based on the AD7420 is designed. Discrete data of the Moire fringe error signal are collected at different temperatures and sent to the host computer through wireless transmission. The error signal quality index and the fault type are displayed on the host computer based on the error signal identification method. The error signal quality can be used to diagnose the state of the error code through the human-machine interface.
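A heavily hedged sketch of one way an error-signal quality index for Moire-fringe quadrature outputs might be computed (the paper's actual index and fault classification are not specified in the abstract): estimate the offsets, the amplitude mismatch and the deviation of the sine/cosine Lissajous trace from an ideal circle, here on synthetic signals with injected imperfections.

```python
import numpy as np

def quadrature_quality(u_cos, u_sin):
    """Simple quality metrics for a pair of Moire-fringe quadrature signals."""
    off_c, off_s = u_cos.mean(), u_sin.mean()            # DC offsets
    a_c = np.sqrt(2.0) * u_cos.std()                     # amplitude estimates
    a_s = np.sqrt(2.0) * u_sin.std()
    # normalized Lissajous radius; zero deviation means a perfect unit circle
    r = np.hypot((u_cos - off_c) / a_c, (u_sin - off_s) / a_s)
    return {
        "offset_cos": float(off_c),
        "offset_sin": float(off_s),
        "amplitude_mismatch": float(abs(a_c - a_s) / max(a_c, a_s)),
        "radius_rms_deviation": float(np.sqrt(np.mean((r - 1.0) ** 2))),
    }

theta = np.linspace(0.0, 40.0 * np.pi, 20000)
rng = np.random.default_rng(8)
u_cos = 1.00 * np.cos(theta) + 0.02 + 0.01 * rng.normal(size=theta.size)   # slight offset
u_sin = 0.95 * np.sin(theta + 0.03) + 0.01 * rng.normal(size=theta.size)   # gain and phase error
print(quadrature_quality(u_cos, u_sin))
```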
Controlling costs without compromising quality: paying hospitals for total knee replacement.
Pine, Michael; Fry, Donald E; Jones, Barbara L; Meimban, Roger J; Pine, Gregory J
2010-10-01
Unit costs of health services are substantially higher in the United States than in any other developed country in the world, without a correspondingly healthier population. An alternative payment structure, especially for high-volume, high-cost episodes of care (eg, total knee replacement), is needed to reward high quality care and reduce costs. The National Inpatient Sample of administrative claims data was used to measure risk-adjusted mortality, postoperative length of stay, costs of routine care, adverse outcome rates, and excess costs of adverse outcomes for total knee replacements performed between 2002 and 2005. Empirically identified inefficient and ineffective hospitals were then removed to create a reference group of high-performance hospitals. Predictive models for outcomes and costs were recalibrated to the reference hospitals and used to compute risk-adjusted outcomes and costs for all hospitals. Per-case predicted costs were computed and compared with observed costs. Of the 688 hospitals with acceptable data, 62 failed to meet effectiveness criteria and 210 were identified as inefficient. The remaining 416 high-performance hospitals had 13.4% fewer risk-adjusted adverse outcomes (4.56% vs 3.95%; P < 0.001; χ² test) and 9.9% lower risk-adjusted total costs ($12,773 vs $11,512; P < 0.001; t test) than all study hospitals. Inefficiency accounted for 96% of excess costs. A payment system based on the demonstrated performance of effective, efficient hospitals can produce sizable cost savings without jeopardizing quality. In this study, 96% of total excess hospital costs resulted from higher routine costs at inefficient hospitals, whereas only 4% was associated with ineffective care.
A comparison of sequential and spiral scanning techniques in brain CT.
Pace, Ivana; Zarb, Francis
2015-01-01
To evaluate and compare image quality and radiation dose of sequential computed tomography (CT) examinations of the brain and spiral CT examinations of the brain imaged on a GE HiSpeed NX/I Dual Slice 2CT scanner. A random sample of 40 patients referred for CT examination of the brain was selected and divided into 2 groups. Half of the patients were scanned using the sequential technique; the other half were scanned using the spiral technique. Radiation dose data—both the computed tomography dose index (CTDI) and the dose length product (DLP)—were recorded on a checklist at the end of each examination. Using the European Guidelines on Quality Criteria for Computed Tomography, 4 radiologists conducted a visual grading analysis and rated the level of visibility of 6 anatomical structures considered necessary to produce images of high quality. The mean CTDI(vol) and DLP values were statistically significantly higher (P <.05) with the sequential scans (CTDI(vol): 22.06 mGy; DLP: 304.60 mGy • cm) than with the spiral scans (CTDI(vol): 14.94 mGy; DLP: 229.10 mGy • cm). The mean image quality rating scores for all criteria of the sequential scanning technique were statistically significantly higher (P <.05) in the visual grading analysis than those of the spiral scanning technique. In this local study, the sequential technique was preferred over the spiral technique for both overall image quality and differentiation between gray and white matter in brain CT scans. Other similar studies counter this finding. The radiation dose seen with the sequential CT scanning technique was significantly higher than that seen with the spiral CT scanning technique. However, image quality with the sequential technique was statistically significantly superior (P <.05).
A physics-motivated Centroidal Voronoi Particle domain decomposition method
NASA Astrophysics Data System (ADS)
Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.
2017-04-01
In this paper, we propose a novel domain decomposition method for large-scale simulations in continuum mechanics by merging the concepts of Centroidal Voronoi Tessellation (CVT) and Voronoi Particle dynamics (VP). The CVT is introduced to achieve a high-level compactness of the partitioning subdomains by the Lloyd algorithm which monotonically decreases the CVT energy. The number of computational elements between neighboring partitioning subdomains, which scales the communication effort for parallel simulations, is optimized implicitly as the generated partitioning subdomains are convex and simply connected with small aspect-ratios. Moreover, Voronoi Particle dynamics employing physical analogy with a tailored equation of state is developed, which relaxes the particle system towards the target partition with good load balance. Since the equilibrium is computed by an iterative approach, the partitioning subdomains exhibit locality and the incremental property. Numerical experiments reveal that the proposed Centroidal Voronoi Particle (CVP) based algorithm produces high-quality partitioning with high efficiency, independently of computational-element types. Thus it can be used for a wide range of applications in computational science and engineering.
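The Lloyd iteration at the heart of the CVT construction described above can be sketched compactly: assign sample points to their nearest generator and move each generator to the centroid of its assigned samples, repeating until the generators stop moving. This discrete (Monte Carlo) version on the unit square with uniform density is only a hedged illustration of the concept, not the CVP partitioner itself.

```python
import numpy as np

def lloyd_cvt(n_generators=8, n_samples=20_000, n_iter=50, seed=0):
    """Approximate a centroidal Voronoi tessellation of the unit square."""
    rng = np.random.default_rng(seed)
    samples = rng.random((n_samples, 2))               # uniform density proxy
    gens = rng.random((n_generators, 2))
    for _ in range(n_iter):
        # nearest-generator assignment (discrete Voronoi regions)
        d = np.linalg.norm(samples[:, None, :] - gens[None, :, :], axis=2)
        owner = d.argmin(axis=1)
        # move each generator to the centroid of its region (Lloyd step)
        for g in range(n_generators):
            members = samples[owner == g]
            if members.size:
                gens[g] = members.mean(axis=0)
    return gens

print(np.round(lloyd_cvt(), 3))
```

Each Lloyd step does not increase the CVT energy, which is why the resulting generators end up in compact, roughly convex regions of similar size.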
High Resolution Peripheral Quantitative Computed Tomography for Assessment of Bone Quality
NASA Astrophysics Data System (ADS)
Kazakia, Galateia
2014-03-01
The study of bone quality is motivated by the high morbidity, mortality, and societal cost of skeletal fractures. Over 10 million people are diagnosed with osteoporosis in the US alone, suffering 1.5 million osteoporotic fractures and costing the health care system over $17 billion annually. Accurate assessment of fracture risk is necessary to ensure that pharmacological and other interventions are appropriately administered. Currently, areal bone mineral density (aBMD) based on 2D dual-energy X-ray absorptiometry (DXA) is used to determine osteoporotic status and predict fracture risk. Though aBMD is a significant predictor of fracture risk, it does not completely explain bone strength or fracture incidence. The major limitation of aBMD is the lack of 3D information, which is necessary to distinguish between cortical and trabecular bone and to quantify bone geometry and microarchitecture. High resolution peripheral quantitative computed tomography (HR-pQCT) enables in vivo assessment of volumetric BMD within specific bone compartments as well as quantification of geometric and microarchitectural measures of bone quality. HR-pQCT studies have documented that trabecular bone microstructure alterations are associated with fracture risk independent of aBMD. Cortical bone microstructure - specifically porosity - is a major determinant of strength, stiffness, and fracture toughness of cortical tissue and may further explain the aBMD-independent effect of age on bone fragility and fracture risk. The application of finite element analysis (FEA) to HR-pQCT data permits estimation of patient-specific bone strength, shown to be associated with fracture incidence independent of aBMD. This talk will describe the HR-pQCT scanner, established metrics of bone quality derived from HR-pQCT data, and novel analyses of bone quality currently in development. Cross-sectional and longitudinal HR-pQCT studies investigating the impact of aging, disease, injury, gender, race, and therapeutics on bone quality will be discussed.
The Mariner 6 and 7 pictures of Mars
NASA Technical Reports Server (NTRS)
Collins, S. A., Jr.
1971-01-01
A comprehensive set of high quality reproductions of the final, computer-processed television pictures of Mars is presented. The genesis and unique characteristics of the pictures are explained, interesting features are pointed out, and some indication of their significance in the history of Mars investigations is provided.
UCSD's Automated Merit Processing System.
ERIC Educational Resources Information Center
Merryman, Robert; Johnson, Judy R.; Block, Ron
1998-01-01
The University of California San Diego replaced its manual staff merit-increase-recommendation process with an online computer program to reduce workloads and improve the quality of the final recommendations. The highly successful system has been enthusiastically embraced by the campus community and recognized by the National Association of…
Equity and Access: All Students Are Mathematical Problem Solvers
ERIC Educational Resources Information Center
Franz, Dana Pompkyl; Ivy, Jessica; McKissick, Bethany R.
2016-01-01
Often mathematical instruction for students with disabilities, especially those with learning disabilities, includes an overabundance of instruction on mathematical computation and does not include high-quality instruction on mathematical reasoning and problem solving. In fact, it is a common misconception that students with learning disabilities…
What We've Learned about Assessing Hands-On Science.
ERIC Educational Resources Information Center
Shavelson, Richard J.; Baxter, Gail P.
1992-01-01
A recent study compared hands-on scientific inquiry assessment to assessments involving lab notebooks, computer simulations, short-answer paper-and-pencil problems, and multiple-choice questions. Creating high quality performance assessments is a costly, time-consuming process requiring considerable scientific and technological know-how. Improved…
NASA Technical Reports Server (NTRS)
Rising, J. J.; Kairys, A. A.; Maass, C. A.; Siegart, C. D.; Rakness, W. L.; Mijares, R. D.; King, R. W.; Peterson, R. S.; Hurley, S. R.; Wickson, D.
1982-01-01
A limited authority pitch active control system (PACS) was developed for a wide body jet transport (L-1011) with a flying horizontal stabilizer. Two dual channel digital computers and the associated software provide command signals to a dual channel series servo which controls the stabilizer power actuators. Input sensor signals to the computer are pitch rate, column-trim position, and dynamic pressure. Control laws are given for the PACS and the system architecture is defined. The piloted flight simulation and vehicle system simulation tests performed to verify control laws and system operation prior to installation on the aircraft are discussed. Modifications to the basic aircraft are described. Flying qualities of the aircraft with the PACS on and off were evaluated. Handling qualities for cruise and high speed flight conditions with the c.g. at 39% mac ( + 1% stability margin) and PACS operating were judged to be as good as the handling qualities with the c.g. at 25% (+15% stability margin) and PACS off.
Automated Theorem Proving in High-Quality Software Design
NASA Technical Reports Server (NTRS)
Schumann, Johann; Swanson, Keith (Technical Monitor)
2001-01-01
The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus accident was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards of D2 GSM mobile phones, or the extraction of (secret) passwords from German T-online users, show that serious flaws can also happen in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we look at the extent to which automated theorem proving can contribute to a more widespread application of formal methods and their tools, and at what automated theorem provers (ATPs) must provide in order to be useful.
2010-01-01
Service quality on computer and network systems has become increasingly important as many conventional service transactions are moved online. Service quality of computer and network services can be measured by the performance of the service process in throughput, delay, and so on. On a computer and network system, competing service requests of users and associated service activities change the state of limited system resources which in turn affects the achieved service ...relations of service activities, system state and service
Accretor: Generative Materiality in the Work of Driessens and Verstappen.
Whitelaw, Mitchell
2015-01-01
Accretor, by the Dutch artists Erwin Driessens and Maria Verstappen, is a generative artwork that adopts and adapts artificial life techniques to produce intricate three-dimensional forms. This article introduces and analyzes Accretor, considering the enigmatic quality of the generated objects and in particular the role of materiality in this highly computational work. Accretor demonstrates a tangled continuity between digital and physical domains, where the constraints and affordances of matter inform both formal processes and aesthetic interpretations. Drawing on Arp's notion of the concrete artwork and McCormack and Dorin's notion of the computational sublime, the article finally argues that Accretor demonstrates what might be called a processual sublime, evoking expansive processes that span both computational and non-computational systems.
Computer-generated holograms by multiple wavefront recording plane method with occlusion culling.
Symeonidou, Athanasia; Blinder, David; Munteanu, Adrian; Schelkens, Peter
2015-08-24
We propose a novel fast method for full-parallax computer-generated holograms with occlusion processing, suitable for volumetric data such as point clouds. A novel light-wave propagation strategy relying on the sequential use of the wavefront recording plane method is proposed, which employs look-up tables in order to reduce the computational complexity of the field calculations. In addition, a novel technique for occlusion culling with little additional computational cost is introduced. The method also applies a Gaussian distribution to the individual points in order to improve visual quality. Performance tests show that for a full-parallax high-definition CGH a speedup factor of more than 2,500 compared to the ray-tracing method can be achieved without hardware acceleration.
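To make the wavefront-recording-plane (WRP) idea concrete, the following Python sketch accumulates point-cloud contributions onto a WRP using a depth-indexed look-up table of precomputed point-source patches. This is only an illustration of the general technique, not the authors' implementation: the wavelength, pixel pitch, patch size and point data are arbitrary assumptions, and both the final propagation from the WRP to the hologram plane (e.g. an angular-spectrum FFT) and the occlusion-culling step are omitted.

```python
import numpy as np

WAVELENGTH = 633e-9           # assumed laser wavelength (m)
PITCH = 8e-6                  # assumed WRP pixel pitch (m)
K = 2.0 * np.pi / WAVELENGTH

def point_patch(depth, radius_px):
    """Complex field that a unit point source at distance `depth` from the WRP
    produces on a small (2*radius_px+1)^2 patch of the plane."""
    ax = np.arange(-radius_px, radius_px + 1) * PITCH
    xx, yy = np.meshgrid(ax, ax)
    r = np.sqrt(xx**2 + yy**2 + depth**2)
    return np.exp(1j * K * r) / r

def build_lut(depths, radius_px):
    """Look-up table: one precomputed patch per quantized point depth."""
    return {d: point_patch(d, radius_px) for d in depths}

def accumulate_wrp(points, lut, wrp_shape, radius_px):
    """Add each point's LUT patch onto the wavefront recording plane."""
    wrp = np.zeros(wrp_shape, dtype=complex)
    for ix, iy, depth, amp in points:
        patch = lut[depth]
        wrp[iy - radius_px: iy + radius_px + 1,
            ix - radius_px: ix + radius_px + 1] += amp * patch   # patch assumed inside
    return wrp

# toy usage: three points at two quantized depths behind a 256x256 WRP
lut = build_lut(depths=(0.01, 0.02), radius_px=32)
points = [(128, 128, 0.01, 1.0), (200, 150, 0.02, 0.8), (80, 90, 0.02, 0.5)]
wrp_field = accumulate_wrp(points, lut, (256, 256), radius_px=32)
print(wrp_field.shape, float(np.abs(wrp_field).max()))
```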
A computational image analysis glossary for biologists.
Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M
2012-09-01
Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies.
Online production validation in a HEP environment
NASA Astrophysics Data System (ADS)
Harenberg, T.; Kuhl, T.; Lang, N.; Mättig, P.; Sandhoff, M.; Schwanenberger, C.; Volkmer, F.
2017-03-01
In high energy physics (HEP) event simulations, petabytes of data are processed and stored, requiring millions of CPU-years. This enormous demand for computing resources is handled by centers distributed worldwide, which form part of the LHC computing grid. The consumption of such a large amount of resources demands an efficient production of simulations and the early detection of potential errors. In this article we present a new monitoring framework for grid environments, which polls a measure of data quality during job execution. This online monitoring facilitates the early detection of configuration errors (especially in simulation parameters), and may thus contribute to significant savings in computing resources.
ERIC Educational Resources Information Center
Yoon, Susan A.; Koehler-Yom, Jessica; Anderson, Emma; Lin, Joyce; Klopfer, Eric
2015-01-01
Background: This exploratory study is part of a larger-scale research project aimed at building theoretical and practical knowledge of complex systems in students and teachers with the goal of improving high school biology learning through professional development and a classroom intervention. Purpose: We propose a model of adaptive expertise to…
Multi-GPU implementation of a VMAT treatment plan optimization algorithm.
Tian, Zhen; Peng, Fei; Folkerts, Michael; Tan, Jun; Jia, Xun; Jiang, Steve B
2015-06-01
Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, GPU's relatively small memory size cannot handle cases with a large dose-deposition coefficient (DDC) matrix in cases of, e.g., those with a large target size, multiple targets, multiple arcs, and/or small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multi-GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of PP and MP problems are implemented on CPU or a single GPU due to their modest problem scale and computational loads. Barzilai and Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single GPU implementation strategies, i.e., truncating DDC matrix (S1), repeatedly transferring DDC matrix between CPU and GPU (S2), and porting computations involving DDC matrix to CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. The authors' multi-GPU implementation can finish the optimization process within ∼ 1 min for the H&N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be in an order of several minutes. The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. 
The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
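As a concrete illustration of the data layout described above (the dose-deposition-coefficient matrix kept in COO format on the CPU and split by beam angle into CSR submatrices), here is a minimal Python/SciPy sketch. It only mimics the splitting step on the CPU; the grouping rule, matrix sizes and density are invented, and the per-GPU transfers, peer-to-peer access and the column-generation optimization itself are not reproduced.

```python
import numpy as np
from scipy import sparse

def split_ddc_by_angle(ddc_coo, beamlet_angles, n_groups=4):
    """Split a sparse dose-deposition-coefficient (DDC) matrix (COO format)
    into per-angle-group CSR submatrices -- one group per GPU in the paper's
    scheme; here everything stays on the CPU for illustration."""
    angles = np.asarray(beamlet_angles, dtype=float)
    edges = np.linspace(angles.min(), angles.max() + 1e-9, n_groups + 1)
    csc = ddc_coo.tocsc()                      # column slicing is cheap in CSC
    parts = []
    for g in range(n_groups):
        cols = np.where((angles >= edges[g]) & (angles < edges[g + 1]))[0]
        parts.append((cols, csc[:, cols].tocsr()))   # CSR suits row-wise dose sums
    return parts

# toy usage: 1000 voxels x 360 beamlets (2% non-zeros), beamlet i at angle i degrees
ddc = sparse.random(1000, 360, density=0.02, format="coo", random_state=0)
for cols, sub in split_ddc_by_angle(ddc, beamlet_angles=np.arange(360)):
    print(sub.shape, sub.nnz, "beamlets:", len(cols))
```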
McCormack, Jane; Baker, Elise; Masso, Sarah; Crowe, Kathryn; McLeod, Sharynne; Wren, Yvonne; Roulstone, Sue
2017-06-01
Implementation fidelity refers to the degree to which an intervention or programme adheres to its original design. This paper examines implementation fidelity in the Sound Start Study, a clustered randomised controlled trial of computer-assisted support for children with speech sound disorders (SSD). Sixty-three children with SSD in 19 early childhood centres received computer-assisted support (Phoneme Factory Sound Sorter [PFSS] - Australian version). Educators facilitated the delivery of PFSS targeting phonological error patterns identified by a speech-language pathologist. Implementation data were gathered via (1) the computer software, which recorded when and how much intervention was completed over 9 weeks; (2) educators' records of practice sessions; and (3) scoring of fidelity (intervention procedure, competence and quality of delivery) from videos of intervention sessions. Less than one-third of children received the prescribed number of days of intervention, while approximately one-half participated in the prescribed number of intervention plays. Computer data differed from educators' data for total number of days and plays in which children participated; the degree of match was lower as data became more specific. Fidelity to intervention procedures, competency and quality of delivery was high. Implementation fidelity may impact intervention outcomes and so needs to be measured in intervention research; however, the way in which it is measured may impact on data.
NASA Astrophysics Data System (ADS)
Mei, Kai; Kopp, Felix K.; Fehringer, Andreas; Pfeiffer, Franz; Rummeny, Ernst J.; Kirschke, Jan S.; Noël, Peter B.; Baum, Thomas
2017-03-01
The trabecular bone microstructure is a key to the early diagnosis and advanced therapy monitoring of osteoporosis. Regularly measuring bone microstructure with conventional multi-detector computed tomography (MDCT) would expose patients to a relatively high radiation dose. One possible way to reduce patient exposure is to sample fewer projection angles. This approach can be supported by advanced reconstruction algorithms, with their ability to achieve better image quality under reduced projection angles or high levels of noise. In this work, we investigated the performance of iterative reconstruction from sparsely sampled projection data on trabecular bone microstructure in in-vivo MDCT scans of human spines. The computed MDCT images were evaluated by calculating bone microstructure parameters. We demonstrate that bone microstructure parameters were still computationally distinguishable when half or less of the radiation dose was employed.
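The following sketch illustrates the general sparse-view idea with scikit-image's Shepp-Logan phantom: reconstructing from a quarter of the usual projection angles with filtered back-projection versus one pass of the iterative SART method. It is a generic illustration under assumed settings, not the study's scanner data, dose levels or reconstruction algorithm.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, iradon_sart, rescale

# Simulate sparse angular sampling and compare FBP with one SART iteration.
image = rescale(shepp_logan_phantom(), 0.5)
theta_sparse = np.linspace(0.0, 180.0, 45, endpoint=False)   # 45 instead of 180 angles

sinogram = radon(image, theta=theta_sparse)
fbp = iradon(sinogram, theta=theta_sparse)
sart = iradon_sart(sinogram, theta=theta_sparse)

for name, rec in (("FBP ", fbp), ("SART", sart)):
    print(name, "RMSE vs. phantom:", np.sqrt(np.mean((rec - image) ** 2)))
```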
NASA Technical Reports Server (NTRS)
Hofmann, R.
1980-01-01
The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.
Analysis of Thickness and Quality factor of a Double Paddle Oscillator at Room Temperature.
Shakeel, Hamza; Metcalf, Thomas H; Pomeroy, J M
2016-01-01
In this paper, we evaluate the quality (Q) factor and the resonance frequency of a double paddle oscillator (DPO) with different thicknesses using analytical, computational and experimental methods. The study is carried out for the 2nd anti-symmetric resonance mode, which provides extremely high experimental Q factors on the order of 10^5. The results show that both the Q factor and the resonance frequency of a DPO increase with the thickness at room temperature.
Advances in Numerical Boundary Conditions for Computational Aeroacoustics
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.
1997-01-01
Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, least dissipative computation algorithms as well as high quality numerical boundary treatments. This paper focuses on the recent developments of numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries. On the external boundaries, boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries. On these internal boundaries, boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much needed research in numerical boundary conditions for CAA.
ECG-triggered high-pitch CT for simultaneous assessment of the aorta and coronary arteries.
Hachulla, Anne-Lise; Ronot, Maxime; Noble, Stéphane; Becker, Christoph D; Montet, Xavier; Vallée, Jean-Paul
2016-01-01
To study the image quality of ECG-gated computed tomography (CT) acquisition with high-pitch CT imaging for the exploration of both the aorta and coronary arteries. Eighty-four patients underwent high-pitch ECG-gated aortic CT without β-blockers, with iterative reconstruction algorithms. Contrast-to-noise ratios (CNR) between vessels and adjacent perivascular fat tissue were calculated for the aorta and the coronary arteries. Dose-length products (DLP) were recorded. Two blinded readers graded image quality of the aorta and the coronary arteries on a 3-point scale. Coronary artery stenoses were compared with coronary angiograms in 24 patients. Kappa values were calculated. High-pitch acquisition resulted in a mean DLP of 234 ± 93 mGy·cm (4.2 mSv) for an acquisition of the entire aorta (mean heart rate 73 ± 16 bpm). CNR for the ascending aorta was 10.6 ± 4 and CNR for the coronary arteries was 9.85 ± 4.1. Image quality was excellent in 79/84 patients (94%), and excellent or moderate but diagnostic in 1087/1127 coronary artery segments (96%). 74 significant stenoses were observed, and 38/40 significant stenoses were confirmed by coronary angiography (K = 0.91, sensitivity = 0.97, specificity = 0.98). High-pitch ECG-gated aortic CT with iterative reconstructions allows an accurate exploration of both the aorta and coronary arteries during the same acquisition, with limited dose deposition, despite the lack of β-blockers and relatively high heart rates. Radiologists need to be aware of the necessity to analyze and report coronary artery disease in aortic examinations. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Barufaldi, Bruno; Lau, Kristen C.; Schiabel, Homero; Maidment, D. A.
2015-03-01
Routine performance of basic test procedures and dose measurements are essential for assuring high quality of mammograms. International guidelines recommend that breast care providers ascertain that mammography systems produce a constant high quality image, using as low a radiation dose as is reasonably achievable. The main purpose of this research is to develop a framework to monitor radiation dose and image quality in a mixed breast screening and diagnostic imaging environment using an automated tracking system. This study presents a module of this framework, consisting of a computerized system to measure the image quality of the American College of Radiology mammography accreditation phantom. The methods developed combine correlation approaches, matched filters, and data mining techniques. These methods have been used to analyze radiological images of the accreditation phantom. The classification of structures of interest is based upon reports produced by four trained readers. As previously reported, human observers demonstrate great variation in their analysis due to the subjectivity of human visual inspection. The software tool was trained with three sets of 60 phantom images in order to generate decision trees using the software WEKA (Waikato Environment for Knowledge Analysis). When tested with 240 images during the classification step, the tool correctly classified 88%, 99%, and 98%, of fibers, speck groups and masses, respectively. The variation between the computer classification and human reading was comparable to the variation between human readers. This computerized system not only automates the quality control procedure in mammography, but also decreases the subjectivity in the expert evaluation of the phantom images.
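As a rough analogue of the classification step described above, the sketch below trains a small decision tree on synthetic "phantom structure" features (a matched-filter correlation peak and a local contrast value) labelled as visible or not visible. The features, labels and scikit-learn tooling are placeholders; the study itself used trained-reader scores and WEKA-generated decision trees.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in data: one row per phantom structure region, with a
# matched-filter correlation peak and a local contrast value; the label says
# whether trained readers judged the structure visible.
rng = np.random.default_rng(42)
n = 300
visible = rng.integers(0, 2, n)
corr_peak = rng.normal(0.4 + 0.3 * visible, 0.10)
contrast = rng.normal(0.2 + 0.2 * visible, 0.05)
X = np.column_stack([corr_peak, contrast])

X_tr, X_te, y_tr, y_te = train_test_split(X, visible, test_size=0.5, random_state=0)
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))
```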
Dynamic power scheduling system for JPEG2000 delivery over wireless networks
NASA Astrophysics Data System (ADS)
Martina, Maurizio; Vacca, Fabrizio
2003-06-01
The diffusion of third-generation mobile terminals is encouraging the development of new multimedia-based applications. The reliable transmission of audiovisual content will gain major interest, being one of the most valuable services. Nevertheless, the mobile scenario is severely power-constrained: high compression ratios and refined energy management strategies are highly advisable. JPEG2000 as the source encoding stage assures excellent performance with extremely good visual quality. However, the limited power budget imposes a limit on the computational effort in order to save as much power as possible. Starting from an error-prone environment, such as the wireless one, high error-resilience features need to be employed. This paper investigates the trade-off between quality and power in such a challenging environment.
A hardware-oriented concurrent TZ search algorithm for High-Efficiency Video Coding
NASA Astrophysics Data System (ADS)
Doan, Nghia; Kim, Tae Sung; Rhee, Chae Eun; Lee, Hyuk-Jae
2017-12-01
High-Efficiency Video Coding (HEVC) is the latest video coding standard, in which the compression performance is double that of its predecessor, the H.264/AVC standard, while the video quality remains unchanged. In HEVC, the test zone (TZ) search algorithm is widely used for integer motion estimation because it effectively finds good-quality motion vectors with a relatively small amount of computation. However, the complex computation structure of the TZ search algorithm makes it difficult to implement in hardware. This paper proposes a new integer motion estimation algorithm which is designed for hardware execution by modifying the conventional TZ search to allow parallel motion estimation of all prediction unit (PU) partitions. The algorithm consists of the three phases of zonal, raster, and refinement searches. At the beginning of each phase, the algorithm obtains the search points required by the original TZ search for all PU partitions in a coding unit (CU). Then, all redundant search points are removed prior to the estimation of the motion costs, and the best search points are then selected for all PUs. Compared to the conventional TZ search algorithm, experimental results show that the proposed algorithm significantly decreases the Bjøntegaard Delta bitrate (BD-BR) by 0.84%, and it also reduces the computational complexity by 54.54%.
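A toy Python version of the point-sharing idea is sketched below: the search points requested by each PU's predictor are pooled, duplicates are removed, 4x4 sub-block SADs are computed once per unique motion vector, and each PU's cost is assembled from the shared sub-block sums. The block sizes, search pattern and cost model are simplified assumptions, not the proposed hardware algorithm.

```python
import numpy as np

def diamond_points(center, stride=1):
    """Search points of one (simplified) zonal-search ring around a predictor."""
    cx, cy = center
    return {(cx + dx * stride, cy + dy * stride)
            for dx, dy in [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]}

def subblock_sads(cur_cu, ref_frame, cu_pos, mv, sub=4):
    """4x4 sub-block SADs of the whole CU for one motion vector; every PU's cost is
    later assembled from these shared sums, so each point is evaluated only once."""
    x0, y0 = cu_pos
    h, w = cur_cu.shape
    ref = ref_frame[y0 + mv[1]: y0 + mv[1] + h, x0 + mv[0]: x0 + mv[0] + w]
    diff = np.abs(cur_cu.astype(int) - ref.astype(int))
    return diff.reshape(h // sub, sub, w // sub, sub).sum(axis=(1, 3))

# toy data: a 16x16 CU in a 64x64 frame, two PU partitions (top/bottom 16x8)
rng = np.random.default_rng(0)
ref_frame = rng.integers(0, 255, (64, 64), dtype=np.uint8)
cu_pos = (24, 24)
cur_cu = ref_frame[24:40, 24:40].copy()            # the CU actually moved by (0, 0)
predictors = {"PU_top": (1, 0), "PU_bottom": (0, 1)}

# 1) gather the points every PU asks for, 2) drop duplicates
all_points = set()
for mv_pred in predictors.values():
    all_points |= diamond_points(mv_pred)

# 3) evaluate shared sub-block SADs once per unique point, 4) pick the best MV per PU
best = {pu: (None, np.inf) for pu in predictors}
for mv in all_points:
    sads = subblock_sads(cur_cu, ref_frame, cu_pos, mv)    # (4, 4) grid of 4x4 sums
    costs = {"PU_top": sads[:2].sum(), "PU_bottom": sads[2:].sum()}
    for pu, cost in costs.items():
        if cost < best[pu][1]:
            best[pu] = (mv, cost)
print(best)
```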
New solutions and applications of 3D computer tomography image processing
NASA Astrophysics Data System (ADS)
Effenberger, Ira; Kroll, Julia; Verl, Alexander
2008-02-01
As industry nowadays aims at fast, high quality product development and manufacturing processes, a modern and efficient quality inspection is essential. Compared to conventional measurement technologies, industrial computed tomography (CT) is a non-destructive technology for 3D image data acquisition which helps to overcome their disadvantages by offering the possibility to scan complex parts with all outer and inner geometric features. In this paper new and optimized methods for 3D image processing are presented, including innovative ways of surface reconstruction and automatic geometric feature detection of complex components, as well as our work on smart online data processing and data handling with an integrated intelligent online mesh reduction, which guarantees the processing of huge, high-resolution data sets. In addition, new approaches for surface reconstruction and segmentation based on statistical methods are demonstrated. On the extracted 3D point cloud or surface triangulation, automated and precise algorithms for geometric inspection are deployed. All algorithms are applied to different real data sets generated by computed tomography in order to demonstrate the capabilities of the new tools. Since CT is an emerging technology for non-destructive testing and inspection, more and more industrial application fields will use and profit from this technology.
Kiluk, Brian D.; Sugarman, Dawn E.; Nich, Charla; Gibbons, Carly J.; Martino, Steve; Rounsaville, Bruce J.; Carroll, Kathleen M.
2013-01-01
Objective Computer-assisted therapies offer a novel, cost-effective strategy for providing evidence-based therapies to a broad range of individuals with psychiatric disorders. However, the extent to which the growing body of randomized trials evaluating computer-assisted therapies meets current standards of methodological rigor for evidence-based interventions is not clear. Method A methodological analysis of randomized clinical trials of computer-assisted therapies for adult psychiatric disorders, published between January 1990 and January 2010, was conducted. Seventy-five studies that examined computer-assisted therapies for a range of axis I disorders were evaluated using a 14-item methodological quality index. Results Results indicated marked heterogeneity in study quality. No study met all 14 basic quality standards, and three met 13 criteria. Consistent weaknesses were noted in evaluation of treatment exposure and adherence, rates of follow-up assessment, and conformity to intention-to-treat principles. Studies utilizing weaker comparison conditions (e.g., wait-list controls) had poorer methodological quality scores and were more likely to report effects favoring the computer-assisted condition. Conclusions While several well-conducted studies have indicated promising results for computer-assisted therapies, this emerging field has not yet achieved a level of methodological quality equivalent to those required for other evidence-based behavioral therapies or pharmacotherapies. Adoption of more consistent standards for methodological quality in this field, with greater attention to potential adverse events, is needed before computer-assisted therapies are widely disseminated or marketed as evidence based. PMID:21536689
Kontopantelis, Evangelos; Buchan, Iain; Reeves, David; Checkland, Kath; Doran, Tim
2013-08-02
To investigate the relationship between performance on the UK Quality and Outcomes Framework pay-for-performance scheme and choice of clinical computer system. Retrospective longitudinal study. Data for 2007-2008 to 2010-2011, extracted from the clinical computer systems of general practices in England. All English practices participating in the pay-for-performance scheme: average 8257 each year, covering over 99% of the English population registered with a general practice. Levels of achievement on 62 quality-of-care indicators, measured as: reported achievement (levels of care after excluding inappropriate patients); population achievement (levels of care for all patients with the relevant condition) and percentage of available quality points attained. Multilevel mixed effects multiple linear regression models were used to identify population, practice and clinical computing system predictors of achievement. Seven clinical computer systems were consistently active in the study period, collectively holding approximately 99% of the market share. Of all population and practice characteristics assessed, choice of clinical computing system was the strongest predictor of performance across all three outcome measures. Differences between systems were greatest for intermediate outcomes indicators (eg, control of cholesterol levels). Under the UK's pay-for-performance scheme, differences in practice performance were associated with the choice of clinical computing system. This raises the question of whether particular system characteristics facilitate higher quality of care, better data recording or both. Inconsistencies across systems need to be understood and addressed, and researchers need to be cautious when generalising findings from samples of providers using a single computing system.
Bansback, Nick; Li, Linda C; Lynd, Larry; Bryan, Stirling
2014-08-01
Patient decision aids (PtDA) are developed to facilitate informed, value-based decisions about health. Research suggests that even when informed with the necessary evidence and information, cognitive errors can prevent patients from choosing the option that is most congruent with their own values. We sought to utilize principles of behavioural economics to develop a computer application that presents information from conventional decision aids in a way that reduces these errors, subsequently promoting higher quality decisions. The Dynamic Computer Interactive Decision Application (DCIDA) was developed to target four common errors that can impede quality decision making with PtDAs: unstable values, order effects, overweighting of rare events, and information overload. Healthy volunteers were recruited to an interview to use three PtDAs converted to the DCIDA on a computer equipped with an eye tracker. Participants first used a conventional PtDA and then used the DCIDA version. User testing assessed whether respondents found the software both usable, evaluated using (a) eye tracking, (b) the system usability scale, and (c) user verbal responses from a 'think aloud' protocol; and useful, evaluated using (a) eye tracking, (b) whether preferences for options were changed, and (c) the decisional conflict scale. Of the 20 participants recruited to the study, 11 were male (55%), the mean age was 35, 18 had at least a high school education (90%), and 8 (40%) had a college or university degree. Eye-tracking results, alongside a mean system usability scale score of 73 (range 68-85), indicated a reasonable degree of usability for the DCIDA. The think aloud study suggested areas for further improvement. The DCIDA also appeared to be useful to participants, who focused more on the features of the decision that were most important to them (21% increase in time spent focusing on the most important feature). Seven subjects (25%) changed their preferred option when using the DCIDA. Preliminary results suggest that the DCIDA has potential to improve the quality of patient decision-making. Next steps include larger studies to test individual components of the DCIDA and feasibility testing with patients making real decisions.
Real-time dynamic display of registered 4D cardiac MR and ultrasound images using a GPU
NASA Astrophysics Data System (ADS)
Zhang, Q.; Huang, X.; Eagleson, R.; Guiraudon, G.; Peters, T. M.
2007-03-01
In minimally invasive image-guided surgical interventions, different imaging modalities, such as magnetic resonance imaging (MRI), computed tomography (CT), and real-time three-dimensional (3D) ultrasound (US), can provide complementary, multi-spectral image information. Multimodality dynamic image registration is a well-established approach that permits real-time diagnostic information to be enhanced by placing lower-quality real-time images within a high quality anatomical context. For the guidance of cardiac procedures, it would be valuable to register dynamic MRI or CT with intraoperative US. However, in practice, either the high computational cost prohibits such real-time visualization of volumetric multimodal images in a real-world medical environment, or else the resulting image quality is not satisfactory for accurate guidance during the intervention. Modern graphics processing units (GPUs) provide the programmability, parallelism and increased computational precision to begin to address this problem. In this work, we first outline our research on dynamic 3D cardiac MR and US image acquisition, real-time dual-modality registration and US tracking. Then we describe image processing and optimization techniques for 4D (3D + time) cardiac image real-time rendering. We also present our multimodality 4D medical image visualization engine, which directly runs on a GPU in real-time by exploiting the advantages of the graphics hardware. In addition, techniques such as multiple transfer functions for different imaging modalities, dynamic texture binding, advanced texture sampling and multimodality image compositing are employed to facilitate the real-time display and manipulation of the registered dual-modality dynamic 3D MR and US cardiac datasets.
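A minimal sketch of the per-modality transfer-function and compositing idea is given below, using NumPy on 2D slices rather than GPU shaders on 4D volumes. The colour/opacity curves, the "over" compositing of a real-time (US-like) layer over an anatomical (MR-like) layer, and the toy data are all assumptions for illustration only.

```python
import numpy as np

def classify(volume, color, alpha_scale):
    """Per-modality transfer function: map scalar values to RGBA."""
    norm = (volume - volume.min()) / (np.ptp(volume) + 1e-9)
    rgba = np.zeros(volume.shape + (4,))
    rgba[..., :3] = norm[..., None] * np.asarray(color, dtype=float)
    rgba[..., 3] = alpha_scale * norm
    return rgba

def over(front, back):
    """Standard 'over' compositing of two pre-classified RGBA layers."""
    a_f = front[..., 3:4]
    rgb = front[..., :3] * a_f + back[..., :3] * back[..., 3:4] * (1.0 - a_f)
    alpha = a_f + back[..., 3:4] * (1.0 - a_f)
    return np.concatenate([rgb, alpha], axis=-1)

# toy registered slices: a smooth "MR-like" slice and a noisier "US-like" slice
rng = np.random.default_rng(0)
xx, yy = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
mr = np.hypot(xx, yy)
us = mr + 0.3 * rng.normal(size=mr.shape)

blended = over(classify(us, (1.0, 0.8, 0.2), 0.6),    # real-time layer in front
               classify(mr, (0.4, 0.4, 1.0), 0.9))    # anatomical context behind
print(blended.shape)
```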
Does exposure to computers affect the routine parameters of semen quality?
Sun, Yue-Lian; Zhou, Wei-Jin; Wu, Jun-Qing; Gao, Er-Sheng
2005-09-01
To assess whether exposure to computers harms the semen quality of healthy young men. A total of 178 subjects were recruited from two maternity and children healthcare centers in Shanghai, 91 with a history of exposure to computers (i.e., exposure for 20 h or more per week in the last 2 years) and 87 persons to act as control (no or little exposure to computers). Data on the history of exposure to computers and other characteristics were obtained by means of a structured questionnaire interview. Semen samples were collected by masturbation in the place where the semen samples were analyzed. No differences in the distribution of the semen parameters (semen volume, sperm density, percentage of progressive sperm, sperm viability and percentage of normal form sperm) were found between the exposed group and the control group. Exposure to computers was not found to be a risk factor for inferior semen quality after adjusting for potential confounders, including abstinence days, testicle size, occupation, history of exposure to toxic substances. The present study did not find that healthy men exposed to computers had inferior semen quality.
NASA Astrophysics Data System (ADS)
Falkner, Katrina; Vivian, Rebecca
2015-10-01
To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources, however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.
Alternatives in Medical Education: Non-Animal Methods.
ERIC Educational Resources Information Center
Carlson, Peggy, Ed.
The technology explosion in medical education has led to the use of computer models, videotapes, interactive videos, and state-of-the-art simulators in medical training. This booklet describes alternatives to using animals in medical education. Although it is mainly intended to describe products applicable to medical school courses, high-quality,…
Using Computer Conferencing Techniques To Maximize Student Learning.
ERIC Educational Resources Information Center
Norton, Robert E.; Stammen, Ronald M.
The Consortium for the Development of Professional Materials for Vocational Education at Ohio State University was organized in 1978 for the purpose of developing high-quality curriculum materials for training leadership personnel in vocational and technical education in the United States, and to pilot test and demonstrate new instructional…
Three Essays on the Economics of Information Systems
ERIC Educational Resources Information Center
Jian, Lian
2010-01-01
My dissertation contains three studies centering on the question: how to motivate people to share high quality information on online information aggregation systems, also known as social computing systems? I take a social scientific approach to "identify" the strategic behavior of individuals in information systems, and "analyze" how non-monetary…
Raffaelli, Marcela; Armstrong, Jessica; Tran, Steve P; Griffith, Aisha N; Walker, Kathrin; Gutierrez, Vanessa
2016-06-01
Computer-assisted data collection offers advantages over traditional paper and pencil measures; however, little guidance is available regarding the logistics of conducting computer-assisted data collection with adolescents in group settings. To address this gap, we draw on our experiences conducting a multi-site longitudinal study of adolescent development. Structured questionnaires programmed on laptop computers using Audio Computer Assisted Self-Interviewing (ACASI) were administered to groups of adolescents in community-based and afterschool programs. Although implementing ACASI required additional work before entering the field, we benefited from reduced data processing time, high data quality, and high levels of youth motivation. Preliminary findings from an ethnically diverse sample of 265 youth indicate favorable perceptions of using ACASI. Using our experiences as a case study, we provide recommendations on selecting an appropriate data collection device (including hardware and software), preparing and testing the ACASI, conducting data collection in the field, and managing data. Copyright © 2016 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
Demirci, Müşerref Duygu Saçar; Allmer, Jens
2017-07-28
MicroRNAs (miRNAs) are involved in the post-transcriptional regulation of protein abundance and thus have a great impact on the resulting phenotype. It is, therefore, no wonder that they have been implicated in many diseases ranging from virus infections to cancer. This impact on the phenotype leads to a great interest in establishing the miRNAs of an organism. Experimental methods are complicated which led to the development of computational methods for pre-miRNA detection. Such methods generally employ machine learning to establish models for the discrimination between miRNAs and other sequences. Positive training data for model establishment, for the most part, stems from miRBase, the miRNA registry. The quality of the entries in miRBase has been questioned, though. This unknown quality led to the development of filtering strategies in attempts to produce high quality positive datasets which can lead to a scarcity of positive data. To analyze the quality of filtered data we developed a machine learning model and found it is well able to establish data quality based on intrinsic measures. Additionally, we analyzed which features describing pre-miRNAs could discriminate between low and high quality data. Both models are applicable to data from miRBase and can be used for establishing high quality positive data. This will facilitate the development of better miRNA detection tools which will make the prediction of miRNAs in disease states more accurate. Finally, we applied both models to all miRBase data and provide the list of high quality hairpins.
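The sketch below mimics the overall approach of training a model to separate high- from low-quality pre-miRNA entries, using synthetic stand-in features (folding energy, GC content, loop length) and a scikit-learn random forest. The features, labels and classifier choice are assumptions; they are not the feature set or model reported by the authors.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for pre-miRNA hairpin feature vectors; a real pipeline would
# derive features from miRBase entries (folding energy, GC content, read support, ...).
rng = np.random.default_rng(1)
n = 400
high_quality = rng.integers(0, 2, n)                      # 1 = "high quality" entry
mfe      = rng.normal(-35 - 10 * high_quality, 5)         # minimum free energy (kcal/mol)
gc       = rng.normal(0.45 + 0.05 * high_quality, 0.05)   # GC content
loop_len = rng.normal(12 - 2 * high_quality, 2)           # hairpin loop length (nt)
X = np.column_stack([mfe, gc, loop_len])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("5-fold CV accuracy:", cross_val_score(clf, X, high_quality, cv=5).mean())
```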
NASA Astrophysics Data System (ADS)
Shi, X.
2015-12-01
As NSF indicated - "Theory and experimentation have for centuries been regarded as two fundamental pillars of science. It is now widely recognized that computational and data-enabled science forms a critical third pillar." Geocomputation is the third pillar of GIScience and geosciences. With the exponential growth of geodata, the challenge of scalable and high performance computing for big data analytics become urgent because many research activities are constrained by the inability of software or tool that even could not complete the computation process. Heterogeneous geodata integration and analytics obviously magnify the complexity and operational time frame. Many large-scale geospatial problems may be not processable at all if the computer system does not have sufficient memory or computational power. Emerging computer architectures, such as Intel's Many Integrated Core (MIC) Architecture and Graphics Processing Unit (GPU), and advanced computing technologies provide promising solutions to employ massive parallelism and hardware resources to achieve scalability and high performance for data intensive computing over large spatiotemporal and social media data. Exploring novel algorithms and deploying the solutions in massively parallel computing environment to achieve the capability for scalable data processing and analytics over large-scale, complex, and heterogeneous geodata with consistent quality and high-performance has been the central theme of our research team in the Department of Geosciences at the University of Arkansas (UARK). New multi-core architectures combined with application accelerators hold the promise to achieve scalability and high performance by exploiting task and data levels of parallelism that are not supported by the conventional computing systems. Such a parallel or distributed computing environment is particularly suitable for large-scale geocomputation over big data as proved by our prior works, while the potential of such advanced infrastructure remains unexplored in this domain. Within this presentation, our prior and on-going initiatives will be summarized to exemplify how we exploit multicore CPUs, GPUs, and MICs, and clusters of CPUs, GPUs and MICs, to accelerate geocomputation in different applications.
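A minimal data-parallel sketch in the spirit of the tile-based geocomputation described above is shown below, using Python's multiprocessing over tiles of a synthetic elevation grid. It is only a CPU illustration of data-level parallelism; the group's GPU/MIC implementations and real geodata workflows are not represented.

```python
import numpy as np
from multiprocessing import Pool

def tile_mean_slope(tile):
    """Toy per-tile kernel: mean gradient magnitude of an elevation tile."""
    gy, gx = np.gradient(tile)
    return float(np.hypot(gx, gy).mean())

def split_tiles(grid, tile=256):
    """Partition a raster into square tiles (the unit of data-level parallelism)."""
    return [grid[y:y + tile, x:x + tile]
            for y in range(0, grid.shape[0], tile)
            for x in range(0, grid.shape[1], tile)]

if __name__ == "__main__":
    dem = np.random.default_rng(0).random((1024, 1024))   # stand-in elevation grid
    with Pool() as pool:                                   # one worker per CPU core
        results = pool.map(tile_mean_slope, split_tiles(dem))
    print(len(results), "tiles, mean slope:", sum(results) / len(results))
```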
NASA Astrophysics Data System (ADS)
Vodenicarevic, D.; Locatelli, N.; Mizrahi, A.; Friedman, J. S.; Vincent, A. F.; Romera, M.; Fukushima, A.; Yakushiji, K.; Kubota, H.; Yuasa, S.; Tiwari, S.; Grollier, J.; Querlioz, D.
2017-11-01
Low-energy random number generation is critical for many emerging computing schemes proposed to complement or replace von Neumann architectures. However, current random number generators are always associated with an energy cost that is prohibitive for these computing schemes. We introduce random number bit generation based on specific nanodevices: superparamagnetic tunnel junctions. We experimentally demonstrate high-quality random bit generation that represents an orders-of-magnitude improvement in energy efficiency over current solutions. We show that the random generation speed improves with nanodevice scaling, and we investigate the impact of temperature, magnetic field, and cross talk. Finally, we show how alternative computing schemes can be implemented using superparamagnetic tunnel junctions as random number generators. These results open the way for fabricating efficient hardware computing devices leveraging stochasticity, and they highlight an alternative use for emerging nanodevices.
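To illustrate the principle (not the device physics or circuits reported in the paper), the sketch below models a superparamagnetic junction as a random telegraph signal, samples it to obtain raw bits, and XORs two independent streams as a simple whitening step. The dwell statistics, sampling rate and whitening scheme are arbitrary assumptions.

```python
import numpy as np

def junction_bits(n_bits, oversample=8, seed=0):
    """Toy random-telegraph model of a superparamagnetic tunnel junction: the state
    flips with a fixed probability per time step, and one sample per bit period
    gives a raw bit stream."""
    rng = np.random.default_rng(seed)
    flips = rng.random(n_bits * oversample) < 1.0 / oversample
    state = np.cumsum(flips) % 2                 # telegraph signal
    return state[::oversample].astype(np.uint8)  # raw (possibly biased) bits

raw_a = junction_bits(100_000, seed=0)
raw_b = junction_bits(100_000, seed=1)
whitened = raw_a ^ raw_b                         # XOR of two independent junctions
print("bias, single junction:", abs(raw_a.mean() - 0.5))
print("bias, XOR-whitened   :", abs(whitened.mean() - 0.5))
```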
NASA Astrophysics Data System (ADS)
Hussein, I.; Wilkins, M.; Roscoe, C.; Faber, W.; Chakravorty, S.; Schumacher, P.
2016-09-01
Finite Set Statistics (FISST) is a rigorous Bayesian multi-hypothesis management tool for the joint detection, classification and tracking of multi-sensor, multi-object systems. Implicit within the approach are solutions to the data association and target label-tracking problems. The full FISST filtering equations, however, are intractable. While FISST-based methods such as the PHD and CPHD filters are tractable, they require heavy moment approximations to the full FISST equations that result in a significant loss of information contained in the collected data. In this paper, we review Smart Sampling Markov Chain Monte Carlo (SSMCMC) that enables FISST to be tractable while avoiding moment approximations. We study the effect of tuning key SSMCMC parameters on tracking quality and computation time. The study is performed on a representative space object catalog with varying numbers of RSOs. The solution is implemented in the Scala computing language at the Maui High Performance Computing Center (MHPCC) facility.
2013-01-01
Background: Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. Findings: In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. Conclusion: A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases. PMID:24499556
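A minimal sketch of a dynamic-weight water quality index is shown below: base weights are renormalized over the parameters actually present in a sample, so records with missing values can still be scored. The parameter list, weights and rating curves are invented placeholders, not the actual IWQIS configuration.

```python
import math

# Hypothetical base weights and rating curves for a few drinking-water parameters;
# the real IWQIS weights and curves are not reproduced here.
BASE_WEIGHTS = {"pH": 0.20, "turbidity": 0.30, "nitrate": 0.25, "TDS": 0.25}
TARGETS = {"pH": (7.0, 1.5), "turbidity": (0.0, 5.0),
           "nitrate": (0.0, 50.0), "TDS": (0.0, 1000.0)}

def sub_index(param, value):
    """Map a raw measurement to a 0-100 quality rating (placeholder linear curve)."""
    ideal, span = TARGETS[param]
    return max(0.0, 100.0 * (1.0 - abs(value - ideal) / span))

def wqi(sample):
    """Dynamic-weight index: weights are renormalized over the parameters that are
    actually present, so samples with missing values can still be scored."""
    present = {p: v for p, v in sample.items() if p in BASE_WEIGHTS and v is not None}
    total_w = sum(BASE_WEIGHTS[p] for p in present)
    if total_w == 0.0:
        return math.nan
    return sum(BASE_WEIGHTS[p] / total_w * sub_index(p, v) for p, v in present.items())

print(wqi({"pH": 7.2, "turbidity": 1.0, "nitrate": 10.0, "TDS": 300.0}))
print(wqi({"pH": 7.2, "turbidity": None, "nitrate": 10.0, "TDS": None}))  # missing data
```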
Jacobs, Richard H A H; Haak, Koen V; Thumfart, Stefan; Renken, Remco; Henson, Brian; Cornelissen, Frans W
2016-01-01
Our world is filled with texture. For the human visual system, this is an important source of information for assessing environmental and material properties. Indeed-and presumably for this reason-the human visual system has regions dedicated to processing textures. Despite their abundance and apparent relevance, only recently the relationships between texture features and high-level judgments have captured the interest of mainstream science, despite long-standing indications for such relationships. In this study, we explore such relationships, as these might be used to predict perceived texture qualities. This is relevant, not only from a psychological/neuroscience perspective, but also for more applied fields such as design, architecture, and the visual arts. In two separate experiments, observers judged various qualities of visual textures such as beauty, roughness, naturalness, elegance, and complexity. Based on factor analysis, we find that in both experiments, ~75% of the variability in the judgments could be explained by a two-dimensional space, with axes that are closely aligned to the beauty and roughness judgments. That a two-dimensional judgment space suffices to capture most of the variability in the perceived texture qualities suggests that observers use a relatively limited set of internal scales on which to base various judgments, including aesthetic ones. Finally, for both of these judgments, we determined the relationship with a large number of texture features computed for each of the texture stimuli. We find that the presence of lower spatial frequencies, oblique orientations, higher intensity variation, higher saturation, and redness correlates with higher beauty ratings. Features that captured image intensity and uniformity correlated with roughness ratings. Therefore, a number of computational texture features are predictive of these judgments. This suggests that perceived texture qualities-including the aesthetic appreciation-are sufficiently universal to be predicted-with reasonable accuracy-based on the computed feature content of the textures.
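The dimensionality-reduction step can be illustrated with a small stand-in: a (textures x judgments) rating matrix reduced to two components. The sketch uses PCA as a proxy for the factor analysis in the study, and the synthetic ratings are constructed so that two latent axes explain most of the variance; none of this reproduces the actual stimuli or data.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic (textures x judgments) rating matrix; columns: beauty, roughness,
# naturalness, elegance, complexity. The last three are built to load on the
# first two, mimicking a two-dimensional judgment space.
rng = np.random.default_rng(3)
n_textures = 60
beauty = rng.normal(size=n_textures)
rough = rng.normal(size=n_textures)
ratings = np.column_stack([
    beauty,
    rough,
    0.8 * beauty + 0.2 * rng.normal(size=n_textures),
    0.9 * beauty + 0.1 * rng.normal(size=n_textures),
    0.7 * rough + 0.3 * rng.normal(size=n_textures),
])

pca = PCA(n_components=2).fit(ratings)
print("share of variance captured by two components:",
      round(pca.explained_variance_ratio_.sum(), 3))
```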
Calibration-quality adiabatic potential energy surfaces for H3(+) and its isotopologues.
Pavanello, Michele; Adamowicz, Ludwik; Alijah, Alexander; Zobov, Nikolai F; Mizus, Irina I; Polyansky, Oleg L; Tennyson, Jonathan; Szidarovszky, Tamás; Császár, Attila G
2012-05-14
Calibration-quality ab initio adiabatic potential energy surfaces (PES) have been determined for all isotopologues of the molecular ion H(3)(+). The underlying Born-Oppenheimer electronic structure computations used optimized explicitly correlated shifted Gaussian functions. The surfaces include diagonal Born-Oppenheimer corrections computed from the accurate electronic wave functions. A fit to the 41,655 ab initio points is presented which gives a standard deviation better than 0.1 cm(-1) when restricted to the points up to 6000 cm(-1) above the first dissociation asymptote. Nuclear motion calculations utilizing this PES, called GLH3P, and an exact kinetic energy operator given in orthogonal internal coordinates are presented. The ro-vibrational transition frequencies for H(3)(+), H(2)D(+), and HD(2)(+) are compared with high resolution measurements. The most sophisticated and complete procedure employed to compute ro-vibrational energy levels, which makes explicit allowance for the inclusion of non-adiabatic effects, reproduces all the known ro-vibrational levels of the H(3)(+) isotopologues considered to better than 0.2 cm(-1). This represents a significant (order-of-magnitude) improvement compared to previous studies of transitions in the visible. Careful treatment of linear geometries is important for high frequency transitions and leads to new assignments for some of the previously observed lines. Prospects for further investigations of non-adiabatic effects in the H(3)(+) isotopologues are discussed. In short, the paper presents (a) an extremely accurate global potential energy surface of H(3)(+) resulting from high accuracy ab initio computations and global fit, (b) very accurate nuclear motion calculations of all available experimental line data up to 16,000 cm(-1), and (c) results suggest that we can predict accurately the lines of H(3)(+) towards dissociation and thus facilitate their experimental observation.
Calibration-quality adiabatic potential energy surfaces for H3+ and its isotopologues
NASA Astrophysics Data System (ADS)
Pavanello, Michele; Adamowicz, Ludwik; Alijah, Alexander; Zobov, Nikolai F.; Mizus, Irina I.; Polyansky, Oleg L.; Tennyson, Jonathan; Szidarovszky, Tamás; Császár, Attila G.
2012-05-01
Calibration-quality ab initio adiabatic potential energy surfaces (PES) have been determined for all isotopologues of the molecular ion H_3^+. The underlying Born-Oppenheimer electronic structure computations used optimized explicitly correlated shifted Gaussian functions. The surfaces include diagonal Born-Oppenheimer corrections computed from the accurate electronic wave functions. A fit to the 41 655 ab initio points is presented which gives a standard deviation better than 0.1 cm-1 when restricted to the points up to 6000 cm-1 above the first dissociation asymptote. Nuclear motion calculations utilizing this PES, called GLH3P, and an exact kinetic energy operator given in orthogonal internal coordinates are presented. The ro-vibrational transition frequencies for H_3^+, H2D+, and HD_2^+ are compared with high resolution measurements. The most sophisticated and complete procedure employed to compute ro-vibrational energy levels, which makes explicit allowance for the inclusion of non-adiabatic effects, reproduces all the known ro-vibrational levels of the H_3^+ isotopologues considered to better than 0.2 cm-1. This represents a significant (order-of-magnitude) improvement compared to previous studies of transitions in the visible. Careful treatment of linear geometries is important for high frequency transitions and leads to new assignments for some of the previously observed lines. Prospects for further investigations of non-adiabatic effects in the H_3^+ isotopologues are discussed. In short, the paper presents (a) an extremely accurate global potential energy surface of H_3^+ resulting from high accuracy ab initio computations and global fit, (b) very accurate nuclear motion calculations of all available experimental line data up to 16 000 cm-1, and (c) results suggest that we can predict accurately the lines of H_3^+ towards dissociation and thus facilitate their experimental observation.
Podcasting: contemporary patient education.
Abreu, Daniel V; Tamura, Thomas K; Sipp, J Andrew; Keamy, Donald G; Eavey, Roland D
2008-04-01
Portable video technology is a widely available new tool with potential to be used by pediatric otolaryngology practices for patient and family education. Podcasts are media broadcasts that employ this new technology. They can be accessed via the Internet and viewed either on a personal computer or on a handheld device, such as an iPod or an MP3 player. We wished to examine the feasibility of establishing a podcast-hosting Web site. We digitally recorded pediatric otologic procedures in the operating room and saved the digital files to DVDs. We then edited the DVDs at home with video-editing software on a personal computer. Next, spoken narrative was recorded with audio-recording software and combined with the edited video clips. The final products were converted into the M4V file format, and the final versions were uploaded onto our hospital's Web site. We then downloaded the podcasts onto a high-quality portable media player so that we could evaluate their quality. All of the podcasts are now on the hospital Web site, where they can be downloaded by patients and families at no cost. The site includes instructions on how to download the appropriate free software for viewing the podcasts on a portable media player or on a computer. Using this technology for patient education expands the audience and permits portability of information. We conclude that a home computer can be used to inexpensively create informative surgery demonstrations that can be accessed via a Web site and transferred to portable viewing devices with excellent quality.
Models of protein–ligand crystal structures: trust, but verify
Deller, Marc C.
2015-01-01
X-ray crystallography provides the most accurate models of protein–ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein–ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein–ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein–ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein–ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein–ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein–ligand models for their computational and biological studies, and we provide an overview of how this can be achieved. PMID:25665575
Models of protein-ligand crystal structures: trust, but verify.
Deller, Marc C; Rupp, Bernhard
2015-09-01
X-ray crystallography provides the most accurate models of protein-ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein-ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein-ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein-ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein-ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein-ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein-ligand models for their computational and biological studies, and we provide an overview of how this can be achieved.
Huang, Chao-Tsung; Wang, Yu-Wen; Huang, Li-Ren; Chin, Jui; Chen, Liang-Gee
2017-02-01
Digital refocusing has a tradeoff between complexity and quality when using sparsely sampled light fields for low-storage applications. In this paper, we propose a fast physically correct refocusing algorithm to address this issue in a twofold way. First, view interpolation is adopted to provide photorealistic quality at infocus-defocus hybrid boundaries. Regarding its conventional high complexity, we devised a fast line-scan method specifically for refocusing, and its 1D kernel can be 30× faster than the benchmark View Synthesis Reference Software (VSRS)-1D-Fast. Second, we propose a block-based multi-rate processing flow for accelerating purely infocused or defocused regions, and a further 3- 34× speedup can be achieved for high-resolution images. All candidate blocks of variable sizes can interpolate different numbers of rendered views and perform refocusing in different subsampled layers. To avoid visible aliasing and block artifacts, we determine these parameters and the simulated aperture filter through a localized filter response analysis using defocus blur statistics. The final quadtree block partitions are then optimized in terms of computation time. Extensive experimental results are provided to show superior refocusing quality and fast computation speed. In particular, the run time is comparable with the conventional single-image blurring, which causes serious boundary artifacts.
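A toy version of the block-classification step that drives the multi-rate processing is sketched below: blocks whose blur radii all fall on one side of a threshold can be handled cheaply, while mixed ("hybrid") blocks would receive the expensive boundary treatment. The disparity map, circle-of-confusion model and thresholds are assumptions, and the view interpolation and quadtree optimization of the paper are not reproduced.

```python
import numpy as np

def classify_blocks(disparity, focus_disp, coc_scale=2.0, block=32, tol=0.5):
    """Label blocks for multi-rate refocusing: purely in-focus or purely defocused
    blocks can be processed cheaply, hybrid blocks need boundary handling."""
    labels = {}
    h, w = disparity.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = disparity[y:y + block, x:x + block]
            coc = coc_scale * np.abs(tile - focus_disp)   # per-pixel blur radius
            if coc.max() < tol:
                labels[(y, x)] = "infocus"
            elif coc.min() > tol:
                labels[(y, x)] = "defocus"
            else:
                labels[(y, x)] = "hybrid"
    return labels

# toy disparity map: an in-focus foreground square over a defocused background
disp = np.full((128, 128), 2.0)
disp[32:96, 32:96] = 0.0
labels = classify_blocks(disp, focus_disp=0.0)
counts = {lab: list(labels.values()).count(lab) for lab in ("infocus", "defocus", "hybrid")}
print(counts)
```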
Purely Structural Protein Scoring Functions Using Support Vector Machine and Ensemble Learning.
Mirzaei, Shokoufeh; Sidi, Tomer; Keasar, Chen; Crivelli, Silvia
2016-08-24
The function of a protein is determined by its structure, which creates a need for efficient methods of protein structure determination to advance scientific and medical research. Because current experimental structure determination methods carry a high price tag, computational predictions are highly desirable. Given a protein sequence, computational methods produce numerous 3D structures known as decoys. However, selection of the best quality decoys is challenging as the end users can handle only a few ones. Therefore, scoring functions are central to decoy selection. They combine measurable features into a single number indicator of decoy quality. Unfortunately, current scoring functions do not consistently select the best decoys. Machine learning techniques offer great potential to improve decoy scoring. This paper presents two machine-learning based scoring functions to predict the quality of proteins structures, i.e., the similarity between the predicted structure and the experimental one without knowing the latter. We use different metrics to compare these scoring functions against three state-of-the-art scores. This is a first attempt at comparing different scoring functions using the same non-redundant dataset for training and testing and the same features. The results show that adding informative features may be more significant than the method used.
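The sketch below shows the general shape of such a learned scoring function: a regressor trained on synthetic decoy features to predict a similarity-to-native score (GDT_TS-like). The features, target construction and gradient-boosting model are placeholders; they are not the descriptors or learners used by the authors.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic decoy features (fraction of residues in favoured Ramachandran regions,
# steric clash count, deviation of the radius of gyration) and a GDT_TS-like target.
rng = np.random.default_rng(7)
n = 500
rama_ok = rng.uniform(0.6, 1.0, n)
clashes = rng.poisson(5, n)
rg_dev = np.abs(rng.normal(0.0, 2.0, n))
gdt_ts = np.clip(80 * rama_ok - 2 * clashes - 3 * rg_dev + rng.normal(0, 5, n), 0, 100)

X = np.column_stack([rama_ok, clashes, rg_dev])
X_tr, X_te, y_tr, y_te = train_test_split(X, gdt_ts, test_size=0.3, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out decoys:", model.score(X_te, y_te))
```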
An improved multi-exposure approach for high quality holographic femtosecond laser patterning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chenchu; Hu, Yanlei (E-mail: huyl@ustc.edu.cn); Li, Jiawen (E-mail: jwl@ustc.edu.cn)
High-efficiency two-photon polymerization through single exposure via a spatial light modulator (SLM) has been used to decrease fabrication time and rapidly realize various micro/nanostructures, but surface quality remains a major problem due to the speckle noise of the optical intensity distribution at the defocused plane. Here, a multi-exposure approach in which tens of computer-generated holograms are successively loaded onto the SLM is presented to significantly improve the optical uniformity without losing efficiency. By applying multi-exposure, the uniformity at the defocused plane increased from ∼0.02 to ∼0.6 in our simulations. Two series of fabricated letters, "HELLO" and "USTC", produced under single- and multi-exposure in our experiments also verified that the surface quality was greatly improved. Moreover, with this method several kinds of high-quality beam splitters, e.g., 2 × 2 and 5 × 5 Dammann gratings and complex non-separable 5 × 5 gratings, were fabricated in a short time (<1 min, 95% time saving). This multi-exposure SLM two-photon polymerization method shows promise for rapidly fabricating and integrating various binary optical devices and their systems.
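The benefit of accumulating exposures with independent hologram realizations can be illustrated numerically: averaging N independent speckle patterns reduces the relative intensity fluctuation roughly as 1/sqrt(N). The following Python sketch is a simplified far-field simulation with random-phase holograms and a simple uniformity measure; it illustrates the principle only and is not the authors' CGH design or their uniformity metric.

```python
import numpy as np

rng = np.random.default_rng(1)
N_PIX = 256
target = np.ones((N_PIX, N_PIX))  # desired uniform illumination in the target plane

def speckle_intensity(rng):
    """Far-field intensity of one random-phase hologram (simple Fourier model)."""
    phase = rng.uniform(0, 2 * np.pi, size=(N_PIX, N_PIX))
    field = np.sqrt(target) * np.exp(1j * phase)
    far_field = np.fft.fft2(field) / N_PIX
    return np.abs(far_field) ** 2

def uniformity(intensity):
    """A simple uniformity measure: 1 - std/mean of the intensity."""
    return 1.0 - intensity.std() / intensity.mean()

for n_exposures in (1, 10, 50):
    accumulated = sum(speckle_intensity(rng) for _ in range(n_exposures)) / n_exposures
    print(n_exposures, "exposures -> uniformity", round(uniformity(accumulated), 3))
```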
Recording high quality speech during tagged cine-MRI studies using a fiber optic microphone.
NessAiver, Moriel S; Stone, Maureen; Parthasarathy, Vijay; Kahana, Yuvi; Paritsky, Alexander
2006-01-01
To investigate the feasibility of obtaining high quality speech recordings during cine imaging of tongue movement using a fiber optic microphone. A Complementary Spatial Modulation of Magnetization (C-SPAMM) tagged cine sequence triggered by an electrocardiogram (ECG) simulator was used to image a volunteer while speaking the syllable pairs /a/-/u/, /i/-/u/, and the words "golly" and "Tamil" in sync with the imaging sequence. A noise-canceling, optical microphone was fastened approximately 1-2 inches above the mouth of the volunteer. The microphone was attached via optical fiber to a laptop computer, where the speech was sampled at 44.1 kHz. A reference recording of gradient activity with no speech was subtracted from target recordings. Good quality speech was discernible above the background gradient sound using the fiber optic microphone without reference subtraction. The audio waveform of gradient activity was extremely stable and reproducible. Subtraction of the reference gradient recording further reduced gradient noise by roughly 21 dB, resulting in exceptionally high quality speech waveforms. It is possible to obtain high quality speech recordings using an optical microphone even during exceptionally loud cine imaging sequences. This opens up the possibility of more elaborate MRI studies of speech including spectral analysis of the speech signal in all types of MRI.
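A minimal sketch of the reference-subtraction step on synthetic signals, assuming the gradient noise is reproducible and already time-aligned between the reference and speech recordings; real data would require careful alignment and scaling.

```python
import numpy as np

FS = 44_100  # sampling rate (Hz), matching the recordings described above

def subtract_reference(recording, reference):
    """Subtract a time-aligned reference recording of gradient noise."""
    n = min(len(recording), len(reference))
    return recording[:n] - reference[:n]

def reduction_db(before, after):
    """Reduction in residual noise power, in dB."""
    return 10.0 * np.log10(np.mean(before ** 2) / np.mean(after ** 2))

rng = np.random.default_rng(0)
t = np.arange(FS) / FS
gradient_noise = 0.5 * np.sign(np.sin(2 * np.pi * 440 * t))    # repetitive gradient buzz
speech = 0.1 * np.sin(2 * np.pi * 180 * t) * np.exp(-2 * t)    # stand-in for a spoken syllable
recording = speech + gradient_noise
reference = gradient_noise + 0.02 * rng.normal(size=FS)        # separately recorded, imperfect reference

cleaned = subtract_reference(recording, reference)
print("gradient noise reduced by",
      round(reduction_db(recording - speech, cleaned - speech), 1), "dB")
```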
Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes
2014-01-01
The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. The software for creating publication-quality heatmaps is developed with the R and C++ programming languages and the OpenGL application programming interface (API) to produce industry-grade, high-performance graphics. We created a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative that allows researchers with minimal prior coding experience to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. HeatmapGenerator only requires the user to upload a preformatted input file and to download the publicly available R software language, among a few other operating system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
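HeatmapGenerator itself wraps R graphics; purely to illustrate the kind of customizable heatmap it produces, here is a short Python/matplotlib sketch with a synthetic expression matrix (the gene and sample names are made up).

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
genes = [f"gene_{i}" for i in range(20)]
samples = [f"sample_{j}" for j in range(8)]
expression = rng.normal(size=(len(genes), len(samples)))  # stand-in for expression values

fig, ax = plt.subplots(figsize=(5, 7))
im = ax.imshow(expression, cmap="RdBu_r", aspect="auto", vmin=-3, vmax=3)
ax.set_xticks(range(len(samples)), labels=samples, rotation=45, ha="right")
ax.set_yticks(range(len(genes)), labels=genes, fontsize=7)
fig.colorbar(im, ax=ax, label="relative expression")
ax.set_title("Example heatmap")
fig.tight_layout()
fig.savefig("heatmap.png", dpi=300)  # high-resolution raster output
```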
GenomicTools: a computational platform for developing high-throughput analytics in genomics.
Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo
2012-01-15
Recent advances in sequencing technology have resulted in a dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time and memory, as well as rapid prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
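The kind of mathematical operation between sets of genomic regions that such a platform provides can be sketched independently of the C++ implementation; the snippet below intersects two sorted interval lists with a simple sweep, purely as an illustration.

```python
from typing import List, Tuple

Region = Tuple[str, int, int]  # (chromosome, start, end), half-open coordinates

def intersect(a: List[Region], b: List[Region]) -> List[Region]:
    """Intersect two lists of genomic regions sorted by (chromosome, start)."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        chrom_a, start_a, end_a = a[i]
        chrom_b, start_b, end_b = b[j]
        if chrom_a != chrom_b:
            # Advance whichever list is behind in chromosome order.
            if chrom_a < chrom_b:
                i += 1
            else:
                j += 1
            continue
        lo, hi = max(start_a, start_b), min(end_a, end_b)
        if lo < hi:
            out.append((chrom_a, lo, hi))
        # Move past the interval that ends first.
        if end_a <= end_b:
            i += 1
        else:
            j += 1
    return out

peaks = [("chr1", 100, 200), ("chr1", 500, 800), ("chr2", 40, 90)]
genes = [("chr1", 150, 600), ("chr2", 10, 50)]
print(intersect(peaks, genes))  # [('chr1', 150, 200), ('chr1', 500, 600), ('chr2', 40, 50)]
```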
Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T
2017-12-15
Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which differed only with respect to the candidate predicting variables: (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, including only the self-reported factors and software-recorded computer usage patterns, which are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) error values and exposure classification agreement to low-, medium-, and high-exposure categories (in the practical models only). The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43. Interquartile ranges were similar for the two model types, indicating that the full models performed better only for some physical exposures. Relative RMS errors ranged between 5% and 19% for the full models, and between 10% and 19% for the practical models. When the predicted physical exposures were classified into low, medium, and high, classification agreement ranged from 26% to 71%. The full prediction models, based on self-reported factors, software-recorded computer usage patterns, and additional measurements of anthropometrics and workstation set-up, show better predictive quality than the practical models based on self-reported factors and recorded computer usage patterns only. However, predictive quality varied widely across different arm-wrist-hand exposure parameters. Future exploration of the relation between predicted physical exposure and symptoms is therefore only recommended for physical exposures that can be reasonably well predicted. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
NASA Technical Reports Server (NTRS)
Butler, C. M.; Hogge, J. E.
1978-01-01
Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai, T; UT Southwestern Medical Center, Dallas, TX; Yan, H
2014-06-15
Purpose: To develop a 3D dictionary learning based statistical reconstruction algorithm on graphic processing units (GPU), to improve the quality of low-dose cone beam CT (CBCT) imaging with high efficiency. Methods: A 3D dictionary containing 256 small volumes (atoms) of 3x3x3 voxels was trained from a high quality volume image. During reconstruction, we utilized a Cholesky decomposition based orthogonal matching pursuit algorithm to find a sparse representation on this dictionary basis of each patch in the reconstructed image, in order to regularize the image quality. To accelerate the time-consuming sparse coding in the 3D case, we implemented our algorithm in a parallel fashion by taking advantage of the tremendous computational power of the GPU. Evaluations are performed based on a head-neck patient case. FDK reconstruction with the full dataset of 364 projections is used as the reference. We compared the proposed 3D dictionary learning based method with a tight frame (TF) based one using a subset of 121 projections. The image qualities under different resolutions in the z-direction, with or without statistical weighting, are also studied. Results: Compared to the TF-based CBCT reconstruction, our experiments indicated that 3D dictionary learning based CBCT reconstruction is able to recover finer structures, to remove more streaking artifacts, and is less susceptible to blocky artifacts. It is also observed that the statistical reconstruction approach is sensitive to inconsistency between the forward and backward projection operations in parallel computing. Using a high spatial resolution along the z direction helps improve the algorithm's robustness. Conclusion: The 3D dictionary learning based CBCT reconstruction algorithm is able to sense structural information while suppressing noise, and hence to achieve high quality reconstruction. The GPU realization of the whole algorithm offers a significant efficiency enhancement, making this algorithm more feasible for potential clinical application. A high z-resolution is preferred to stabilize statistical iterative reconstruction. This work was supported in part by NIH (1R01CA154747-01), NSFC (No. 61172163), the Research Fund for the Doctoral Program of Higher Education of China (No. 20110201110011), and the China Scholarship Council.
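A minimal sketch of the per-patch sparse-coding step using scikit-learn's orthogonal matching pursuit rather than the authors' GPU implementation; the dictionary here is random and purely illustrative.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)

# Hypothetical learned dictionary: 256 atoms, each a flattened 3x3x3 patch (27 voxels).
n_atoms, patch_len = 256, 27
dictionary = rng.normal(size=(patch_len, n_atoms))
dictionary /= np.linalg.norm(dictionary, axis=0)   # unit-norm atoms

def sparse_code(patch, n_nonzero=4):
    """Sparse representation of one flattened patch on the dictionary."""
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero, fit_intercept=False)
    omp.fit(dictionary, patch)
    return omp.coef_

# Regularize a noisy patch by projecting it onto its sparse approximation.
clean = dictionary[:, [10, 50, 200]] @ np.array([1.0, -0.5, 0.8])
noisy = clean + 0.05 * rng.normal(size=patch_len)
coef = sparse_code(noisy)
denoised = dictionary @ coef
print("residual before:", round(np.linalg.norm(noisy - clean), 3),
      " after:", round(np.linalg.norm(denoised - clean), 3))
```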
Eller, Achim; Wuest, Wolfgang; Scharf, Michael; Brand, Michael; Achenbach, Stephan; Uder, Michael; Lell, Michael M
2013-12-01
To evaluate an automated attenuation-based kV-selection in computed tomography of the chest with respect to radiation dose and image quality, compared to a standard 120 kV protocol. 104 patients were examined using a 128-slice scanner. Fifty examinations (58 ± 15 years, study group) were performed using the automated adaptation of tube potential (100-140 kV) based on the attenuation profile of the scout scan, and 54 examinations (62 ± 14 years, control group) with fixed 120 kV. The estimated CT dose index (CTDI) of the software-proposed setting was compared with a 120 kV protocol. After the scan, CTDI volume (CTDIvol) and dose-length product (DLP) were recorded. Image quality was assessed by region of interest (ROI) measurements, and subjective image quality by two observers on a 4-point scale (3 = excellent, 0 = not diagnostic). The algorithm selected 100 kV in 78% and 120 kV in 22%. Overall CTDIvol reduction was 26.6% (34% at 100 kV) and overall DLP reduction was 22.8% (32.1% at 100 kV) (all p<0.001). Subjective image quality was excellent in both groups. The attenuation-based kV-selection algorithm enables a relevant dose reduction (~27%) in chest CT while keeping image quality parameters at high levels. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Reliability history of the Apollo guidance computer
NASA Technical Reports Server (NTRS)
Hall, E. C.
1972-01-01
The Apollo guidance computer was designed to provide the computation necessary for guidance, navigation and control of the command module and the lunar landing module of the Apollo spacecraft. The computer was designed using the technology of the early 1960's and the production was completed by 1969. During the development, production, and operational phase of the program, the computer has accumulated a very interesting history which is valuable for evaluating the technology, production methods, system integration, and the reliability of the hardware. The operational experience in the Apollo guidance systems includes 17 computers which flew missions and another 26 flight type computers which are still in various phases of prelaunch activity including storage, system checkout, prelaunch spacecraft checkout, etc. These computers were manufactured and maintained under very strict quality control procedures with requirements for reporting and analyzing all indications of failure. Probably no other computer or electronic equipment with equivalent complexity has been as well documented and monitored. Since it has demonstrated a unique reliability history, it is important to evaluate the techniques and methods which have contributed to the high reliability of this computer.
VizieR Online Data Catalog: l Car radial velocity curves (Anderson, 2016)
NASA Astrophysics Data System (ADS)
Anderson, R. I.
2018-02-01
Line-of-sight (radial) velocities of the long-period classical Cepheid l Carinae were measured from 925 high-quality optical spectra recorded using the fiber-fed high-resolution (R~60,000) Coralie spectrograph located at the Euler telescope at La Silla Observatory, Chile. The data were taken between 2014 and 2016. This is the full version of Tab. 2 presented partially in the paper. Line shape parameters (depth, width, asymmetry) are listed for the computed cross-correlation profiles (CCFs). Radial velocities were determined using different techniques (Gaussian, bi-Gaussian) and measured on CCFs computed using three different numerical masks (G2, weak lines, strong lines). (1 data file).
NASA Technical Reports Server (NTRS)
Mcmurtry, G. J.; Petersen, G. W. (Principal Investigator)
1975-01-01
The author has identified the following significant results. It was found that the high speed man machine interaction capability is a distinct advantage of the image 100; however, the small size of the digital computer in the system is a definite limitation. The system can be highly useful in an analysis mode in which it complements a large general purpose computer. The image 100 was found to be extremely valuable in the analysis of aircraft MSS data where the spatial resolution begins to approach photographic quality and the analyst can exercise interpretation judgements and readily interact with the machine.
The viability of ADVANTG deterministic method for synthetic radiography generation
NASA Astrophysics Data System (ADS)
Bingham, Andrew; Lee, Hyoung K.
2018-07-01
Fast simulation techniques to generate synthetic radiographic images of high resolution are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times, with poorer statistics at higher resolution. The viability of a deterministic approach to synthetic radiography image generation was therefore investigated, with the aim of quantifying the decrease in computational time relative to the stochastic method. ADVANTG was compared to MCNP in multiple scenarios, including a small radiography system prototype, to simulate high resolution radiography images. By using the ADVANTG deterministic code to simulate radiography images, the computational time was found to decrease by a factor of 10 to 13 compared to the MCNP stochastic approach while retaining image quality.
Convex relaxations for gas expansion planning
Borraz-Sanchez, Conrado; Bent, Russell Whitford; Backhaus, Scott N.; ...
2016-01-01
Expansion of natural gas networks is a critical process involving substantial capital expenditures with complex decision-support requirements. Given the non-convex nature of gas transmission constraints, global optimality and infeasibility guarantees can only be offered by global optimisation approaches. Unfortunately, state-of-the-art global optimisation solvers are unable to scale up to real-world size instances. In this study, we present a convex mixed-integer second-order cone relaxation for the gas expansion planning problem under steady-state conditions. The underlying model offers tight lower bounds with high computational efficiency. In addition, the optimal solution of the relaxation can often be used to derive high-quality solutions to the original problem, leading to provably tight optimality gaps and, in some cases, global optimal solutions. The convex relaxation is based on a few key ideas, including the introduction of flux direction variables, exact McCormick relaxations, on/off constraints, and integer cuts. Numerical experiments are conducted on the traditional Belgian gas network, as well as other, larger real networks. The results demonstrate both the accuracy and computational speed of the relaxation and its ability to produce high-quality solutions.
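For context on one of those ideas, a McCormick relaxation replaces a bilinear term w = x*y with four linear inequalities derived from the variable bounds. The helper below merely generates those inequalities for given bounds; it is a generic sketch, not the paper's gas-network formulation.

```python
from dataclasses import dataclass

@dataclass
class Bounds:
    lo: float
    hi: float

def mccormick_envelope(x: Bounds, y: Bounds):
    """McCormick envelope for w = x*y with xL <= x <= xU and yL <= y <= yU.

    Returns tuples (a, b, c, sense) meaning:  w  sense  a*x + b*y + c.
    """
    xL, xU, yL, yU = x.lo, x.hi, y.lo, y.hi
    return [
        (yL, xL, -xL * yL, ">="),  # w >= yL*x + xL*y - xL*yL
        (yU, xU, -xU * yU, ">="),  # w >= yU*x + xU*y - xU*yU
        (yL, xU, -xU * yL, "<="),  # w <= yL*x + xU*y - xU*yL
        (yU, xL, -xL * yU, "<="),  # w <= yU*x + xL*y - xL*yU
    ]

# Example: a pressure-like variable bounded in [0, 1] and a flux bounded in [-1, 1].
for a, b, c, sense in mccormick_envelope(Bounds(0.0, 1.0), Bounds(-1.0, 1.0)):
    print(f"w {sense} {a}*x + {b}*y + {c}")
```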
Scheduling algorithms for automatic control systems for technological processes
NASA Astrophysics Data System (ADS)
Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.
2017-01-01
Wide use of automatic process control systems and of high-performance systems containing a number of computers (processors) creates opportunities for high-quality, fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computations, and processing of large data arrays all require a high level of productivity and, at the same time, minimum time for data handling and delivery of results. In order to achieve the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. This paper considers some basic task scheduling methods for multi-machine process control systems, highlights their advantages and disadvantages, and offers some considerations on their use when developing software for automatic process control systems.
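As one concrete example of such a scheduling technique (a generic baseline, not a method taken from the paper), longest-processing-time-first list scheduling assigns each task to the currently least-loaded processor:

```python
import heapq

def lpt_schedule(task_times, n_processors):
    """Longest-Processing-Time-first list scheduling on identical processors.

    Returns (makespan, assignment) where assignment[p] lists the task indices
    placed on processor p.
    """
    loads = [(0.0, p) for p in range(n_processors)]   # min-heap of (load, processor)
    heapq.heapify(loads)
    assignment = [[] for _ in range(n_processors)]

    # Consider tasks in decreasing processing-time order.
    for task in sorted(range(len(task_times)), key=lambda i: -task_times[i]):
        load, p = heapq.heappop(loads)                # least-loaded processor
        assignment[p].append(task)
        heapq.heappush(loads, (load + task_times[task], p))

    makespan = max(load for load, _ in loads)
    return makespan, assignment

times = [7.0, 5.0, 4.0, 4.0, 3.0, 3.0, 2.0]
makespan, plan = lpt_schedule(times, n_processors=3)
print("makespan:", makespan, "assignment:", plan)
```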
ERIC Educational Resources Information Center
Dunn, Peter
2008-01-01
Quality encompasses a very broad range of ideas in learning materials, yet the accuracy of the content is often overlooked as a measure of quality. Various aspects of accuracy are briefly considered, and the issue of computational accuracy is then considered further. When learning materials are produced containing the results of mathematical…
A Structure for Creating Quality Software.
ERIC Educational Resources Information Center
Christensen, Larry C.; Bodey, Michael R.
1990-01-01
Addresses the issue of assuring quality software for use in computer-aided instruction and presents a structure by which developers can create quality courseware. Differences between courseware and computer-aided instruction software are discussed, methods for testing software are described, and human factors issues as well as instructional design…
NASA Technical Reports Server (NTRS)
Cornwell, Donald M., Jr.; Saif, Babak N.
1991-01-01
The spatial pointing angle and far field beamwidth of a high-power semiconductor laser are characterized as a function of CW power and also as a function of temperature. The time-averaged spatial pointing angle and spatial lobe width were measured under intensity-modulated conditions. The measured pointing deviations are determined to be well within the pointing requirements of the NASA Laser Communications Transceiver (LCT) program. A computer-controlled Mach-Zehnder phase-shifter interferometer is used to characterize the wavefront quality of the laser. The rms phase error over the entire pupil was measured as a function of CW output power. Time-averaged measurements of the wavefront quality are also made under intensity-modulated conditions. The measured rms phase errors are determined to be well within the wavefront quality requirements of the LCT program.
Men, Kuo; Dai, Jianrong
2017-12-01
To develop a projection quality-driven tube current modulation method in cone-beam computed tomography for image-guided radiotherapy, based on the prior attenuation information obtained from the planning computed tomography, and then evaluate its effect on reducing the imaging dose. The QCKV-1 phantom with different thicknesses (0-400 mm) of solid water upon it was used to simulate different attenuation (μ). Projections were acquired with a series of tube current-exposure time product (mAs) settings, and a 2-dimensional contrast to noise ratio was analyzed for each projection to create a lookup table of mAs versus 2-dimensional contrast to noise ratio and μ. Before a patient underwent cone-beam computed tomography, the maximum attenuation μmax(θ) within the 95% range of each projection angle θ was estimated according to the planning computed tomography images. Then, a desired 2-dimensional contrast to noise ratio value was selected, and the mAs setting at θ was calculated from the lookup table of mAs versus 2-dimensional contrast to noise ratio and μmax(θ). Three-dimensional cone-beam computed tomography images were reconstructed using the projections acquired with the selected mAs. The imaging dose was evaluated with a polymethyl methacrylate dosimetry phantom in terms of volume computed tomography dose index. Image quality was analyzed using a Catphan 503 phantom with an oval body annulus and a pelvis phantom. For the Catphan 503 phantom, the cone-beam computed tomography image obtained by the projection quality-driven tube current modulation method had a similar quality to that of conventional cone-beam computed tomography. However, the proposed method could reduce the imaging dose by 16% to 33% to achieve an equivalent contrast to noise ratio value. For the pelvis phantom, the structural similarity index was 0.992 with a dose reduction of 39.7% for the projection quality-driven tube current modulation method. The proposed method could reduce the additional dose to the patient while not degrading the image quality for cone-beam computed tomography. The projection quality-driven tube current modulation method could be especially beneficial to patients who undergo cone-beam computed tomography frequently during a treatment course.
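A schematic of the angle-by-angle mAs selection described above, assuming the calibration lookup table and the per-angle maximum attenuation estimates are already available; all names and numbers below are illustrative placeholders.

```python
import numpy as np

# Hypothetical lookup table from phantom calibration:
# rows indexed by attenuation level, columns by mAs setting, values are 2D CNR.
mas_settings = np.array([0.2, 0.4, 0.8, 1.6])               # candidate mAs per projection
attenuation_levels = np.array([1.0, 2.0, 3.0, 4.0, 5.0])     # mu_max bins
cnr_table = np.array([[8.0, 11.0, 15.0, 20.0],
                      [5.0,  8.0, 11.0, 15.0],
                      [3.5,  5.5,  8.0, 11.0],
                      [2.5,  4.0,  6.0,  8.5],
                      [1.8,  3.0,  4.5,  6.5]])

def select_mas(mu_max_per_angle, desired_cnr):
    """Pick, for each projection angle, the lowest mAs reaching the desired 2D CNR."""
    chosen = []
    for mu in mu_max_per_angle:
        row = int(np.clip(np.searchsorted(attenuation_levels, mu), 0,
                          len(attenuation_levels) - 1))
        ok = np.nonzero(cnr_table[row] >= desired_cnr)[0]
        chosen.append(mas_settings[ok[0]] if ok.size else mas_settings[-1])
    return np.array(chosen)

# Example: mu_max estimated from the planning CT for a few angles
# (lateral views attenuate more, so they receive higher mAs).
angles_mu_max = np.array([1.2, 2.8, 4.6, 2.9, 1.3])
print(select_mas(angles_mu_max, desired_cnr=5.0))
```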
Chapter 11. Quality evaluation of apple by computer vision
USDA-ARS?s Scientific Manuscript database
Apple is one of the most consumed fruits in the world, and there is a critical need for enhanced computer vision technology for quality assessment of apples. This chapter gives a comprehensive review on recent advances in various computer vision techniques for detecting surface and internal defects ...
Li, Yanhui; Guo, Hao; Wang, Lin; Fu, Jing
2013-01-01
Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business. Much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing model with returns of non-defective merchandise. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms a GA in computing time, solution quality, and computing stability. The proposed model is useful for helping managers make the right decisions in an e-supply chain environment.
Aktas, Aynur; Hullihen, Barbara; Shrotriya, Shiva; Thomas, Shirley; Walsh, Declan; Estfan, Bassam
2015-03-01
Incorporation of tablet computers (TCs) into patient assessment may facilitate safe and secure data collection. We evaluated the usefulness and acceptability of a TC as an electronic self-report symptom assessment instrument. Research Electronic Data Capture Web-based application supported data capture. Information was collected and disseminated in real time and a structured format. Completed questionnaires were printed and given to the physician before the patient visit. Most participants completed the survey without assistance. Completion rate was 100%. The median global quality of life was high for all. More than half reported pain. Based on Edmonton Symptom Assessment System, the top 3 most common symptoms were tiredness, anxiety, and decreased well-being. Patient and physician acceptability for these quick and useful TC-based surveys was excellent. © The Author(s) 2013.
Computer implemented classification of vegetation using aircraft acquired multispectral scanner data
NASA Technical Reports Server (NTRS)
Cibula, W. G.
1975-01-01
The use of aircraft 24-channel multispectral scanner data in conjunction with computer processing techniques to obtain an automated classification of plant species associations was discussed. The classification of various plant species associations was related to information needed for specific applications. In addition, the necessity of selecting multiple training fields for a single class in situations where the study area consists of highly irregular terrain was detailed. A single class may be illuminated differently in different areas, resulting in multiple spectral signatures for that class. These different signatures arise because different qualities of radiation upwell to the detector from areas receiving different qualities of incident radiation. Techniques of training field selection were outlined, and a classification obtained from a natural area in Tishomingo State Park in northern Mississippi was presented.
Bettina Ohse; Falk Huettmann; Stefanie M. Ickert-Bond; Glenn P. Juday
2009-01-01
Most wilderness areas still lack accurate distribution information on tree species. We met this need with a predictive GIS modeling approach, using freely available digital data and computer programs to efficiently obtain high-quality species distribution maps. Here we present a digital map with the predicted distribution of white spruce (Picea glauca...
A comparative study of integrators for constructing ephemerides with high precision.
NASA Astrophysics Data System (ADS)
Huang, Tian-Yi
1990-09-01
Four indices are used to evaluate integrators: local truncation error, numerical stability, computational complexity, and quality of adaptation. A review and comparative study of several numerical integration methods popular for constructing high-precision ephemerides, such as Adams, Cowell, Runge-Kutta-Fehlberg, Gragg-Bulirsch-Stoer extrapolation, Everhart, Taylor series, and Krogh, is presented.
Is there a preference for linearity when viewing natural images?
NASA Astrophysics Data System (ADS)
Kane, David; Bertamío, Marcelo
2015-01-01
The system gamma of the imaging pipeline, defined as the product of the encoding and decoding gammas, is typically greater than one and is stronger for images viewed with a dark background (e.g. cinema) than those viewed in lighter conditions (e.g. office displays).1-3 However, for high dynamic range (HDR) images reproduced on a low dynamic range (LDR) monitor, subjects often prefer a system gamma of less than one,4 presumably reflecting the greater need for histogram equalization in HDR images. In this study we ask subjects to rate the perceived quality of images presented on a LDR monitor using various levels of system gamma. We reveal that the optimal system gamma is below one for images with a HDR and approaches or exceeds one for images with a LDR. Additionally, the highest quality scores occur for images where a system gamma of one is optimal, suggesting a preference for linearity (where possible). We find that subjective image quality scores can be predicted by computing the degree of histogram equalization of the lightness distribution. Accordingly, an optimal, image dependent system gamma can be computed that maximizes perceived image quality.
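A rough sketch of the kind of predictor described: render a scene lightness channel with a candidate system gamma and score how close the resulting histogram is to uniform, here via the Kolmogorov distance between its CDF and a uniform CDF. The metric and the synthetic scene below are assumptions for illustration, not the authors' exact computation.

```python
import numpy as np

def degree_of_equalization(lightness, bins=256):
    """1 minus the Kolmogorov distance between the lightness CDF and a uniform CDF."""
    hist, _ = np.histogram(lightness, bins=bins, range=(0.0, 1.0))
    cdf = np.cumsum(hist) / hist.sum()
    uniform_cdf = np.linspace(1.0 / bins, 1.0, bins)
    return 1.0 - np.max(np.abs(cdf - uniform_cdf))

def apply_system_gamma(linear_lightness, gamma):
    """Display lightness after the combined encoding/decoding (system) gamma."""
    return np.clip(linear_lightness, 0.0, 1.0) ** gamma

# Synthetic HDR-like lightness distribution: heavily skewed toward dark values.
rng = np.random.default_rng(0)
scene = np.clip(rng.lognormal(mean=-2.5, sigma=1.0, size=100_000), 0.0, 1.0)

for gamma in (0.6, 0.8, 1.0, 1.2):
    rendered = apply_system_gamma(scene, gamma)
    print(f"system gamma {gamma:.1f}: equalization = {degree_of_equalization(rendered):.3f}")
```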
Rich client data exploration and research prototyping for NOAA
NASA Astrophysics Data System (ADS)
Grossberg, Michael; Gladkova, Irina; Guch, Ingrid; Alabi, Paul; Shahriar, Fazlul; Bonev, George; Aizenman, Hannah
2009-08-01
Data from satellites and model simulations is increasing exponentially as observations and model computing power improve rapidly. Not only is technology producing more data, but it often comes from sources all over the world. Researchers and scientists who must collaborate are also located globally. This work presents a software design and technologies which will make it possible for groups of researchers to explore large data sets visually together without the need to download these data sets locally. The design will also make it possible to exploit high performance computing remotely and transparently to analyze and explore large data sets. Computer power, high quality sensing, and data storage capacity have improved at a rate that outstrips our ability to develop software applications that exploit these resources. It is impractical for NOAA scientists to download all of the satellite and model data that may be relevant to a given problem and the computing environments available to a given researcher range from supercomputers to only a web browser. The size and volume of satellite and model data are increasing exponentially. There are at least 50 multisensor satellite platforms collecting Earth science data. On the ground and in the sea there are sensor networks, as well as networks of ground based radar stations, producing a rich real-time stream of data. This new wealth of data would have limited use were it not for the arrival of large-scale high-performance computation provided by parallel computers, clusters, grids, and clouds. With these computational resources and vast archives available, it is now possible to analyze subtle relationships which are global, multi-modal and cut across many data sources. Researchers, educators, and even the general public, need tools to access, discover, and use vast data center archives and high performance computing through a simple yet flexible interface.
Fan, Ming; Kuwahara, Hiroyuki; Wang, Xiaolei; Wang, Suojin; Gao, Xin
2015-11-01
Parameter estimation is a challenging computational problem in the reverse engineering of biological systems. Because advances in biotechnology have facilitated wide availability of time-series gene expression data, systematic parameter estimation of gene circuit models from such time-series mRNA data has become an important method for quantitatively dissecting the regulation of gene expression. By focusing on the modeling of gene circuits, we examine here the performance of three types of state-of-the-art parameter estimation methods: population-based methods, online methods and model-decomposition-based methods. Our results show that certain population-based methods are able to generate high-quality parameter solutions. The performance of these methods, however, is heavily dependent on the size of the parameter search space, and their computational requirements substantially increase as the size of the search space increases. In comparison, online methods and model decomposition-based methods are computationally faster alternatives and are less dependent on the size of the search space. Among other things, our results show that a hybrid approach that augments computationally fast methods with local search as a subsequent refinement procedure can substantially increase the quality of their parameter estimates to the level on par with the best solution obtained from the population-based methods while maintaining high computational speed. These suggest that such hybrid methods can be a promising alternative to the more commonly used population-based methods for parameter estimation of gene circuit models when limited prior knowledge about the underlying regulatory mechanisms makes the size of the parameter search space vastly large. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
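A generic illustration of the hybrid strategy described (a fast population-based global search followed by local refinement), applied to a toy two-parameter gene-circuit-like model with SciPy; it is not one of the benchmarked methods.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import differential_evolution, minimize

# Toy model: mRNA produced at rate k and degraded at rate d, so x' = k - d*x.
t_obs = np.linspace(0.0, 10.0, 25)
true_params = (2.0, 0.5)

def simulate(params):
    k, d = params
    return odeint(lambda x, t: k - d * x, y0=0.0, t=t_obs).ravel()

rng = np.random.default_rng(0)
data = simulate(true_params) + 0.05 * rng.normal(size=t_obs.size)  # synthetic time series

def sse(params):
    return float(np.sum((simulate(params) - data) ** 2))

# Stage 1: coarse population-based global search over a wide parameter space.
bounds = [(1e-3, 10.0), (1e-3, 5.0)]
coarse = differential_evolution(sse, bounds, maxiter=30, popsize=10, seed=1, tol=1e-2)

# Stage 2: local refinement starting from the best population member.
refined = minimize(sse, coarse.x, method="Nelder-Mead")
print("coarse:", np.round(coarse.x, 3), " refined:", np.round(refined.x, 3))
```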
Kontopantelis, Evangelos; Buchan, Iain; Reeves, David; Checkland, Kath; Doran, Tim
2013-01-01
Objectives: To investigate the relationship between performance on the UK Quality and Outcomes Framework pay-for-performance scheme and choice of clinical computer system. Design: Retrospective longitudinal study. Setting: Data for 2007–2008 to 2010–2011, extracted from the clinical computer systems of general practices in England. Participants: All English practices participating in the pay-for-performance scheme: average 8257 each year, covering over 99% of the English population registered with a general practice. Main outcome measures: Levels of achievement on 62 quality-of-care indicators, measured as: reported achievement (levels of care after excluding inappropriate patients); population achievement (levels of care for all patients with the relevant condition) and percentage of available quality points attained. Multilevel mixed effects multiple linear regression models were used to identify population, practice and clinical computing system predictors of achievement. Results: Seven clinical computer systems were consistently active in the study period, collectively holding approximately 99% of the market share. Of all population and practice characteristics assessed, choice of clinical computing system was the strongest predictor of performance across all three outcome measures. Differences between systems were greatest for intermediate outcomes indicators (eg, control of cholesterol levels). Conclusions: Under the UK's pay-for-performance scheme, differences in practice performance were associated with the choice of clinical computing system. This raises the question of whether particular system characteristics facilitate higher quality of care, better data recording or both. Inconsistencies across systems need to be understood and addressed, and researchers need to be cautious when generalising findings from samples of providers using a single computing system. PMID:23913774
NASA Astrophysics Data System (ADS)
Lanzagorta, Marco O.; Gomez, Richard B.; Uhlmann, Jeffrey K.
2003-08-01
In recent years, computer graphics has emerged as a critical component of the scientific and engineering process, and it is recognized as an important computer science research area. Computer graphics are extensively used for a variety of aerospace and defense training systems and by Hollywood's special effects companies. All these applications require the computer graphics systems to produce high quality renderings of extremely large data sets in short periods of time. Much research has been done in "classical computing" toward the development of efficient methods and techniques to reduce the rendering time required for large datasets. Quantum Computing's unique algorithmic features offer the possibility of speeding up some of the known rendering algorithms currently used in computer graphics. In this paper we discuss possible implementations of quantum rendering algorithms. In particular, we concentrate on the implementation of Grover's quantum search algorithm for Z-buffering, ray-tracing, radiosity, and scene management techniques. We also compare the theoretical performance between the classical and quantum versions of the algorithms.
Ultra-High-Resolution Computed Tomography of the Lung: Image Quality of a Prototype Scanner.
Kakinuma, Ryutaro; Moriyama, Noriyuki; Muramatsu, Yukio; Gomi, Shiho; Suzuki, Masahiro; Nagasawa, Hirobumi; Kusumoto, Masahiko; Aso, Tomohiko; Muramatsu, Yoshihisa; Tsuchida, Takaaki; Tsuta, Koji; Maeshima, Akiko Miyagi; Tochigi, Naobumi; Watanabe, Shun-Ichi; Sugihara, Naoki; Tsukagoshi, Shinsuke; Saito, Yasuo; Kazama, Masahiro; Ashizawa, Kazuto; Awai, Kazuo; Honda, Osamu; Ishikawa, Hiroyuki; Koizumi, Naoya; Komoto, Daisuke; Moriya, Hiroshi; Oda, Seitaro; Oshiro, Yasuji; Yanagawa, Masahiro; Tomiyama, Noriyuki; Asamura, Hisao
2015-01-01
The image noise and image quality of a prototype ultra-high-resolution computed tomography (U-HRCT) scanner was evaluated and compared with those of conventional high-resolution CT (C-HRCT) scanners. This study was approved by the institutional review board. A U-HRCT scanner prototype with 0.25 mm x 4 rows and operating at 120 mAs was used. The C-HRCT images were obtained using a 0.5 mm x 16 or 0.5 mm x 64 detector-row CT scanner operating at 150 mAs. Images from both scanners were reconstructed at 0.1-mm intervals; the slice thickness was 0.25 mm for the U-HRCT scanner and 0.5 mm for the C-HRCT scanners. For both scanners, the display field of view was 80 mm. The image noise of each scanner was evaluated using a phantom. U-HRCT and C-HRCT images of 53 images selected from 37 lung nodules were then observed and graded using a 5-point score by 10 board-certified thoracic radiologists. The images were presented to the observers randomly and in a blinded manner. The image noise for U-HRCT (100.87 ± 0.51 Hounsfield units [HU]) was greater than that for C-HRCT (40.41 ± 0.52 HU; P < .0001). The image quality of U-HRCT was graded as superior to that of C-HRCT (P < .0001) for all of the following parameters that were examined: margins of subsolid and solid nodules, edges of solid components and pulmonary vessels in subsolid nodules, air bronchograms, pleural indentations, margins of pulmonary vessels, edges of bronchi, and interlobar fissures. Despite a larger image noise, the prototype U-HRCT scanner had a significantly better image quality than the C-HRCT scanners.
Koplay, M; Kizilca, O; Cimen, D; Sivri, M; Erdogan, H; Guvenc, O; Oc, M; Oran, B
2016-11-01
The goal of this study was to investigate the radiation dose and diagnostic efficacy of cardiac computed tomography angiography (CCTA) using prospective ECG-gated high-pitch dual-source computed tomography (DSCT) in the diagnosis of congenital cardiovascular abnormalities in a pediatric population. One hundred five pediatric patients who were clinically diagnosed with congenital heart disease with suspected extracardiac vascular abnormalities were included in the study. All CCTAs were performed on a 128×2-section DSCT scanner. CCTA findings were compared with surgical and/or conventional cardiac angiography findings. Dose-length product (DLP) and effective doses (ED) were calculated for each patient. Patients were divided into 4 groups by age, and ED and DLP values were compared among groups. The image quality was evaluated using a five-point scale. CCTA showed 173 abnormalities in 105 patients. There were 2 patients with false positive and 3 with false negative findings. The sensitivity and specificity of CCTA were 98.3% and 99.9%, respectively. The positive predictive value and negative predictive value of CCTA were 98.9% and 99.9%, respectively. The average DLP and ED values were 15.6±9.6 (SD) mGy·cm and 0.34±0.10 (SD) mSv, respectively. The mean image quality score was 4.8±0.5 (SD) in all patients. The inter-observer agreement for the image quality scores was good (κ=0.80). CCTA is an excellent imaging modality for evaluation of cardiovascular abnormalities and provides excellent image quality with very low radiation exposure when low-dose prospective ECG-triggered high-pitch DSCT is used. Copyright © 2016 Editions françaises de radiologie. Published by Elsevier Masson SAS. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Various computer models, ranging from simple to complex, have been developed to simulate hydrology and water quality from field to watershed scales. However, many users are uncertain about which model to choose when estimating water quantity and quality conditions in a watershed. This study compared...
A maximum entropy reconstruction technique for tomographic particle image velocimetry
NASA Astrophysics Data System (ADS)
Bilsky, A. V.; Lozhkin, V. A.; Markovich, D. M.; Tokarev, M. P.
2013-04-01
This paper studies a novel approach for reducing tomographic PIV computational complexity. The proposed approach is an algebraic reconstruction technique, termed MENT (maximum entropy). This technique computes the three-dimensional light intensity distribution several times faster than SMART, using at least ten times less memory. Additionally, the reconstruction quality remains nearly the same as with SMART. This paper presents the theoretical computation performance comparison for MENT, SMART and MART, followed by validation using synthetic particle images. Both the theoretical assessment and validation of synthetic images demonstrate significant computational time reduction. The data processing accuracy of MENT was compared to that of SMART in a slot jet experiment. A comparison of the average velocity profiles shows a high level of agreement between the results obtained with MENT and those obtained with SMART.
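For orientation, the multiplicative algebraic family to which techniques such as MART belong updates voxel intensities by the ratio of measured to reprojected values; the MENT algorithm itself is not reproduced here. The tiny 2D sketch below, with hypothetical ray weights, is only meant to illustrate that style of update.

```python
import numpy as np

def mart(weights, measurements, n_voxels, n_iter=50, relax=0.5):
    """Multiplicative ART: reconstruct non-negative intensities from line integrals.

    weights      : (n_rays, n_voxels) projection matrix (ray/voxel weights).
    measurements : (n_rays,) recorded pixel values.
    """
    x = np.ones(n_voxels)                          # uniform positive initial guess
    for _ in range(n_iter):
        for ray, meas in zip(weights, measurements):
            proj = ray @ x
            if proj <= 0 or meas <= 0:
                continue
            # Multiplicative correction, raised to the (relaxed) ray weights.
            x *= (meas / proj) ** (relax * ray)
    return x

# Tiny 2x2 "volume" observed along rows and columns (4 rays). With so few rays
# the problem is underdetermined; the multiplicative scheme converges to a
# smooth, entropy-favoured solution consistent with the measurements.
true = np.array([1.0, 0.0, 0.0, 2.0])
W = np.array([[1, 1, 0, 0],     # row sums
              [0, 0, 1, 1],
              [1, 0, 1, 0],     # column sums
              [0, 1, 0, 1]], dtype=float)
b = W @ true
recon = mart(W, b, n_voxels=4)
print("reconstruction:", np.round(recon, 3))
print("reprojection residual:", np.round(W @ recon - b, 4))
```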
LittleQuickWarp: an ultrafast image warping tool.
Qu, Lei; Peng, Hanchuan
2015-02-01
Warping images into a standard coordinate space is critical for many image computing related tasks. However, for multi-dimensional and high-resolution images, an accurate warping operation itself is often very expensive in terms of computer memory and computational time. For high-throughput image analysis studies such as brain mapping projects, it is desirable to have high performance image warping tools that are compatible with common image analysis pipelines. In this article, we present LittleQuickWarp, a swift and memory efficient tool that boosts 3D image warping performance dramatically and at the same time has high warping quality similar to the widely used thin plate spline (TPS) warping. Compared to the TPS, LittleQuickWarp can improve the warping speed 2-5 times and reduce the memory consumption 6-20 times. We have implemented LittleQuickWarp as an Open Source plug-in program on top of the Vaa3D system (http://vaa3d.org). The source code and a brief tutorial can be found in the Vaa3D plugin source code repository. Copyright © 2014 Elsevier Inc. All rights reserved.
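For reference, the thin plate spline mapping that such tools approximate can be expressed compactly with SciPy; the sketch below warps 2D points from a small set of landmark correspondences and is a generic TPS illustration, not the Vaa3D plug-in's implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Landmark correspondences: where control points in the subject image
# should land in the standard (template) coordinate space.
subject_landmarks = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                              [1.0, 1.0], [0.5, 0.5]])
template_landmarks = np.array([[0.0, 0.0], [1.1, 0.05], [0.05, 1.0],
                               [1.05, 1.1], [0.55, 0.6]])

# Thin plate spline interpolant from subject space to template space.
tps = RBFInterpolator(subject_landmarks, template_landmarks,
                      kernel="thin_plate_spline")

# Warp an arbitrary grid of points (e.g. voxel centres of the subject image).
grid_y, grid_x = np.mgrid[0:1:5j, 0:1:5j]
points = np.column_stack([grid_x.ravel(), grid_y.ravel()])
warped = tps(points)
print(np.round(warped[:5], 3))
```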
Analysis of computer images in the presence of metals
NASA Astrophysics Data System (ADS)
Buzmakov, Alexey; Ingacheva, Anastasia; Prun, Victor; Nikolaev, Dmitry; Chukalina, Marina; Ferrero, Claudio; Asadchikov, Victor
2018-04-01
Artifacts caused by intensely absorbing inclusions are encountered in computed tomography with polychromatic scanning and may obscure or simulate pathologies in medical applications. To improve reconstruction quality in the presence of high-Z inclusions, we previously proposed, and tested with synthetic data, an iterative technique with a soft penalty mimicking linear inequalities on the photon-starved rays. This note reports a test at the tomographic laboratory set-up at the Institute of Crystallography FSRC "Crystallography and Photonics" RAS, in which tomographic scans of a temporary tooth were successfully made both without an inclusion and with a Pb inclusion.
Hypermedia = hypercommunication
NASA Technical Reports Server (NTRS)
Laff, Mark R.
1990-01-01
New hardware and software technology gave application designers the freedom to use new realism in human computer interaction. High-quality images, motion video, stereo sound and music, speech, touch, gesture provide richer data channels between the person and the machine. Ultimately, this will lead to richer communication between people with the computer as an intermediary. The whole point of hyper-books, hyper-newspapers, virtual worlds, is to transfer the concept and relationships, the 'data structure', from the mind of creator to that of user. Some of the characteristics of this rich information channel are discussed, and some examples are presented.
A gateway for phylogenetic analysis powered by grid computing featuring GARLI 2.0.
Bazinet, Adam L; Zwickl, Derrick J; Cummings, Michael P
2014-09-01
We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
Using machine learning to accelerate sampling-based inversion
NASA Astrophysics Data System (ADS)
Valentine, A. P.; Sambridge, M.
2017-12-01
In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods such as the Neighbourhood Algorithm, and which bridges the gap between prior- and posterior-sampling frameworks.
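A toy sketch of the idea: a Gaussian-process surrogate is trained on a handful of expensive forward evaluations and then drives a Metropolis sampler, with occasional exact evaluations used to refine it as sampling proceeds. The model, misfit function and refinement schedule below are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def expensive_misfit(m):
    """Stand-in for an expensive forward solve plus data misfit (e.g. waveform modelling)."""
    return float((m[0] - 1.0) ** 2 + 2.0 * (m[1] + 0.5) ** 2)

# Train the surrogate on a small initial design of exact evaluations.
design = rng.uniform(-3, 3, size=(30, 2))
misfits = np.array([expensive_misfit(m) for m in design])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(design, misfits)

# Metropolis sampling driven by the cheap surrogate, refined every 50 steps.
current = np.zeros(2)
current_misfit = gp.predict(current.reshape(1, -1))[0]
samples = []
for step in range(500):
    proposal = current + 0.3 * rng.normal(size=2)
    prop_misfit = gp.predict(proposal.reshape(1, -1))[0]
    if np.log(rng.uniform()) < current_misfit - prop_misfit:   # target ~ exp(-misfit)
        current, current_misfit = proposal, prop_misfit
    samples.append(current.copy())
    if step % 50 == 49:   # occasionally refine the surrogate with an exact evaluation
        design = np.vstack([design, current])
        misfits = np.append(misfits, expensive_misfit(current))
        gp.fit(design, misfits)

print("posterior mean estimate:", np.round(np.mean(samples[100:], axis=0), 2))
```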
A case study for cloud based high throughput analysis of NGS data using the globus genomics system
Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha
2014-01-01
Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205
Wagner, Roland; Helin, Tapio; Obereder, Andreas; Ramlau, Ronny
2016-02-20
The imaging quality of modern ground-based telescopes such as the planned European Extremely Large Telescope is affected by atmospheric turbulence. In consequence, they heavily depend on stable and high-performance adaptive optics (AO) systems. Using measurements of incoming light from guide stars, an AO system compensates for the effects of turbulence by adjusting so-called deformable mirror(s) (DMs) in real time. In this paper, we introduce a novel reconstruction method for ground layer adaptive optics. In the literature, a common approach to this problem is to use Bayesian inference in order to model the specific noise structure appearing due to spot elongation. This approach leads to large coupled systems with high computational effort. Recently, fast solvers of linear order, i.e., with computational complexity O(n), where n is the number of DM actuators, have emerged. However, the quality of such methods typically degrades in low flux conditions. Our key contribution is to achieve the high quality of the standard Bayesian approach while at the same time maintaining the linear order speed of the recent solvers. Our method is based on performing a separate preprocessing step before applying the cumulative reconstructor (CuReD). The efficiency and performance of the new reconstructor are demonstrated using the OCTOPUS, the official end-to-end simulation environment of the ESO for extremely large telescopes. For more specific simulations we also use the MOST toolbox.
Morsbach, Fabian; Gordic, Sonja; Desbiolles, Lotus; Husarik, Daniela; Frauenfelder, Thomas; Schmidt, Bernhard; Allmendinger, Thomas; Wildermuth, Simon; Alkadhi, Hatem; Leschka, Sebastian
2014-08-01
To evaluate image quality, maximal heart rate allowing for diagnostic imaging, and radiation dose of turbo high-pitch dual-source coronary computed tomographic angiography (CCTA). First, a cardiac motion phantom simulating heart rates (HRs) from 60-90 bpm in 5-bpm steps was examined on a third-generation dual-source 192-slice CT (prospective ECG-triggering, pitch 3.2; rotation time, 250 ms). Subjective image quality regarding the presence of motion artefacts was rated by two readers on a four-point scale (1, excellent; 4, non-diagnostic). Objective image quality was assessed by calculating distortion vectors. Thereafter, 20 consecutive patients (median, 50 years) undergoing clinically indicated CCTA were included. In the phantom study, image quality was rated diagnostic up to an HR of 75 bpm, with object distortion of 1 mm or less. Distortion increased above 1 mm at HRs of 80-90 bpm. Patients had a mean HR of 66 bpm (47-78 bpm). Coronary segments were of diagnostic image quality for all patients with HR up to 73 bpm. Average effective radiation dose in patients was 0.6 ± 0.3 mSv. Our combined phantom and patient study indicates that CCTA with turbo high-pitch third-generation dual-source 192-slice CT can be performed at HRs up to 75 bpm while maintaining diagnostic image quality, with an average radiation dose of 0.6 mSv. • CCTA is feasible with the turbo high-pitch mode. • Turbo high-pitch CCTA provides diagnostic image quality up to 73 bpm. • The radiation dose of high-pitch CCTA is 0.6 mSv on average.
NASA Astrophysics Data System (ADS)
Comyn-Wattiau, Isabelle; Thalheim, Bernhard
Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects, including quality definition and quality models, and practical/empirical aspects such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and the specific requirements of applications in domains such as the health sector, logistics, the public sector, the financial sector, manufacturing, services, e-commerce, and software. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance typically requires resources and therefore involves, besides its benefits, a computational and economic trade-off: a compromise between the value of quality data and the cost of quality assurance.
Future Reality: How Emerging Technologies Will Change Language Itself.
Perlin, Ken
2016-01-01
Just as notebook computers once freed us to take our computers with us, smartphones freed us to walk around with computers in our pockets, and wearables will soon free us from needing to hold a screen at all. Today, as high-quality virtual and augmented reality begins to become available at consumer prices, the "screen" will soon be all around us. But the largest long-term impact here may not merely be one of form factor, but rather one of language itself. Once wearables become small enough, cheap enough, and therefore ubiquitous enough to be accepted as part of our everyday reality, our use of language will evolve in important ways.
Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke
2015-01-01
Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842
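The profiling workflow described above (locating bottlenecks before deciding whether optimization or parallelization is worthwhile) is language-agnostic; the authors provide the aprof package for R. As an illustrative sketch only, the same idea in Python with the standard-library profiler looks like this; the analysis functions are hypothetical stand-ins for real ecological code.

```python
import cProfile
import io
import pstats

def slow_pairwise_distances(points):
    """Deliberately naive O(n^2) loop, a typical bottleneck."""
    n = len(points)
    dists = []
    for i in range(n):
        for j in range(i + 1, n):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            dists.append((dx * dx + dy * dy) ** 0.5)
    return dists

def analysis(points):
    """A hypothetical analysis whose runtime is dominated by one helper."""
    d = slow_pairwise_distances(points)
    return sum(d) / len(d)

if __name__ == "__main__":
    pts = [(i * 0.1, (i * 7 % 13) * 0.3) for i in range(800)]
    profiler = cProfile.Profile()
    profiler.enable()
    analysis(pts)
    profiler.disable()
    # Print the functions that consumed the most cumulative time;
    # the bottleneck worth optimizing (or parallelizing) shows up at the top.
    stream = io.StringIO()
    pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
    print(stream.getvalue())
```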
Image quality classification for DR screening using deep learning.
FengLi Yu; Jing Sun; Annan Li; Jun Cheng; Cheng Wan; Jiang Liu
2017-07-01
The quality of input images significantly affects the outcome of automated diabetic retinopathy (DR) screening systems. Unlike previous methods that only consider simple low-level features such as hand-crafted geometric and structural features, in this paper we propose a novel method for retinal image quality classification (IQC) that applies computational algorithms imitating the working of the human visual system. The proposed algorithm combines unsupervised features from the saliency map with supervised features from convolutional neural networks (CNN), which are fed to an SVM to automatically distinguish high-quality from poor-quality retinal fundus images. We demonstrate the superior performance of the proposed algorithm on a large retinal fundus image dataset, where it achieves higher accuracy than competing methods. Although retinal images are used in this study, the methodology is applicable to the image quality assessment and enhancement of other types of medical images.
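A minimal sketch of the general pattern described above: unsupervised saliency-derived statistics are concatenated with (stand-in) CNN features and classified with an SVM. The feature extractors below are toy placeholders, not the networks or saliency model used in the study.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def saliency_features(image):
    """Stand-in for unsupervised saliency-map statistics (mean, spread, coverage)."""
    sal = np.abs(image - image.mean())
    sal = sal / (sal.max() + 1e-8)
    return np.array([sal.mean(), sal.std(), (sal > 0.5).mean()])

def cnn_features(image, projection):
    """Stand-in for a CNN feature vector (here: a fixed random projection of pixels)."""
    return image.ravel() @ projection

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    projection = rng.standard_normal((32 * 32, 16))
    # Toy dataset: label 1 = "good" (smooth) images, label 0 = "poor" (heavily degraded).
    X, y = [], []
    for label in (0, 1):
        for _ in range(60):
            base = np.outer(np.hanning(32), np.hanning(32)) + 0.1 * rng.standard_normal((32, 32))
            img = base if label == 1 else base + 1.5 * rng.standard_normal((32, 32))
            X.append(np.concatenate([saliency_features(img), cnn_features(img, projection)]))
            y.append(label)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(np.array(X), np.array(y))
    print("training accuracy:", clf.score(np.array(X), np.array(y)))
```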
Practical quality control tools for curves and surfaces
NASA Technical Reports Server (NTRS)
Small, Scott G.
1992-01-01
Curves and surfaces created by Computer Aided Geometric Design systems in the engineering environment must satisfy two basic quality criteria: the geometric shape must have the desired engineering properties; and the objects must be parameterized in a way which does not cause computational difficulty for geometric processing and engineering analysis. Interactive techniques are described which are in use at Boeing to evaluate the quality of aircraft geometry prior to Computational Fluid Dynamic analysis, including newly developed methods for examining surface parameterization and its effects.
NASA Astrophysics Data System (ADS)
Mundhra, A.; Sain, K.; Shankar, U.
2012-12-01
The Indian National Gas Hydrate Program Expedition (NGHP) 01 discovered gas hydrate in unconsolidated sediments at several drilling sites along the continental margins of the Krishna-Godavari Basin, India. The presence of gas hydrate reduces the attenuation of travelling seismic waves, which can be measured through estimation of the seismic quality factor (Dasgupta and Clark, 1998). Here, we use the log spectral ratio method (Sain et al., 2009) to compute the quality factor at three locations along a seismic cross-line near one of the drilling sites; two of the locations have a strong bottom simulating reflector (BSR) and one has none. Interval quality factors have been measured for three submarine sedimentary layers bounded by the seafloor, the BSR, one reflector above and another reflector below the BSR. To compute the quality factor, unprocessed pre-stack seismic data have been used to avoid any influence of the processing sequence. We estimate that the interval quality factor lies within 200-220 in the interval containing the BSR, while it varies within 90-100 in the other intervals. The high interval quality factor thus supports the interpretation that the observed BSR is due to the presence of gas hydrate. We have performed rock physics modelling using isotropic and anisotropic models to quantitatively estimate gas hydrate saturation at the location where an interval has a high quality factor. Anomalously high measured resistivity and high P-wave velocity in the interval lead to very high hydrate saturation estimates (Archie, 1942; Helgerud et al., 1999) in comparison with the lower gas hydrate saturations estimated by pressure core and chlorinity measurements. The overestimation of saturation is attributed to the presence of near-vertical fractures identified from logging-while-drilling resistivity images. We have carried out anisotropic modeling (Kennedy and Herrick, 2004; Lee, 2009) incorporating fracture volume and fracture porosity to estimate hydrate saturation and have observed that the modeled gas hydrate saturations agree with the lower saturations obtained from pressure core and chlorinity measurements. Therefore, we find that 1) the quality factor is significantly higher in the gas hydrate bearing interval and is a useful tool for discovering hydrate deposits, 2) anisotropy due to near-vertical, hydrate-filled fractures translates into elevated saturation estimates because of the high measured resistivity and velocity, and 3) the anisotropic model greatly improves the saturation estimates in fractured media. References: Archie, G.E., 1942. Petroleum Transactions of AIME, 146, 54-62. Dasgupta, R., Clark, R.A., 1998. Geophysics 63, 2120-2128. Kennedy, W.D., Herrick, D.C., 2004. Petrophysics 45, 38-58. Lee, M.W., 2009. U.S. Geological Survey Scientific Investigations Report 2009-5141, 13. Sain, K., Singh, A.K., Thakur, N.K., Khanna, R.K., 2009. Marine Geophysical Researches 30, 137-145.
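The log spectral ratio method cited above (Sain et al., 2009) estimates Q from the slope of the log ratio of amplitude spectra measured at the top and bottom of an interval; under a constant-Q model the slope equals -πΔt/Q, with Δt the travel time through the interval. A minimal sketch with synthetic spectra and an assumed Δt:

```python
import numpy as np

def q_from_log_spectral_ratio(freqs, spec_upper, spec_lower, delta_t):
    """Estimate interval Q from the slope of ln(A_lower/A_upper) versus frequency.

    Under a constant-Q model, ln(A_lower/A_upper) = -pi * f * delta_t / Q + const,
    so Q = -pi * delta_t / slope.
    """
    log_ratio = np.log(spec_lower / spec_upper)
    slope, _ = np.polyfit(freqs, log_ratio, 1)
    return -np.pi * delta_t / slope

if __name__ == "__main__":
    # Synthetic example: a true interval Q of 210 and an assumed 0.4 s travel-time separation.
    freqs = np.linspace(10.0, 60.0, 51)          # Hz, usable bandwidth
    true_q, delta_t = 210.0, 0.4                 # hypothetical values
    spec_upper = np.exp(-0.02 * freqs)           # arbitrary source/overburden spectrum
    spec_lower = spec_upper * np.exp(-np.pi * freqs * delta_t / true_q)
    print("estimated Q:", round(q_from_log_spectral_ratio(freqs, spec_upper, spec_lower, delta_t), 1))
```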
2014-01-01
Background Single-pass, contrast-enhanced whole-body multidetector computed tomography (MDCT) has emerged as the diagnostic standard for evaluating patients with major trauma. Modern iterative image reconstruction algorithms have shown high image quality at a much lower radiation dose in the non-trauma setting. This study aims at investigating whether the radiation dose can safely be reduced in trauma patients without compromising diagnostic accuracy and image quality. Methods/Design Prospective observational study with two consecutive cohorts of patients. Setting: a high-volume, academic, supra-regional trauma centre in Germany. Study population: consecutive male and female patients who 1. have been exposed to a high-velocity trauma mechanism, 2. present with clinical evidence or high suspicion of multiple trauma (predicted Injury Severity Score [ISS] ≥16) and 3. are scheduled for primary MDCT based on the decision of the trauma leader on call. Imaging protocols: in a before/after design, a consecutive series of 500 patients will undergo single-pass, whole-body 128-row multidetector computed tomography (MDCT) with a standard radiation dose kept as low as possible. This will be followed by a consecutive series of 500 patients undergoing an approved ultra-low-dose MDCT protocol using an image processing algorithm. Data: routine administrative data and electronic patient records, as well as digital images stored in a picture archiving and communication system, will serve as the primary data source. The protocol was approved by the institutional review board. Main outcomes: (1) incidence of delayed diagnoses, (2) diagnostic accuracy, as correlated to the reference standard of a synopsis of all subsequent clinical, imaging, surgical and autopsy findings, (3) patients’ safety, (4) radiation exposure (e.g. effective dose), (5) subjective image quality (assessed independently by radiologists and trauma surgeons on a 100-mm visual analogue scale), (6) objective image quality (e.g., contrast-to-noise ratio). Analysis: multivariate regression will be employed to adjust and correct the findings for time and cohort effects. An exploratory interim analysis will be conducted halfway through the low-dose MDCT cohort to assess whether this protocol is clearly inferior or superior to the current standard. Discussion Although non-experimental, this study will generate the first large-scale data on the utility of image-enhancing algorithms in whole-body MDCT for major blunt trauma. Trial registration Current Controlled Trials ISRCTN74557102. PMID:24589310
Computer Integrated Hardwood Processing
Luis G. Occeña; Daniel L. Schmoldt
1997-01-01
The planning of how the hardwood log can be sawn to improve recovery of high-value lumber has always been hampered by the limited information provided by external defects, and whatever internal defects are eventually revealed on the cut log faces by the sawing pattern. With expanded export and domestic markets, low-quality logs, increased competition from non-wood...
ERIC Educational Resources Information Center
Association of Small Computer Users in Education, Greencastle, IN.
This proceedings report includes 37 papers presented at the 1993 conference on the following topics: information technology in college recruiting; introductory networks in the classroom; Total Quality Management in higher education and a computing services organization; a High Tech Student Workstation; network communication for students and…
USDA-ARS?s Scientific Manuscript database
The temptation to include more model parameters and high-resolution input data, together with the availability of powerful optimization and uncertainty analysis algorithms, has significantly increased the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...
Speckle imaging techniques of the turbulence degraded images
NASA Astrophysics Data System (ADS)
Liu, Jin; Huang, Zongfu; Mao, Hongjun; Liang, Yonghui
2018-03-01
We propose a speckle imaging algorithm in which we use an improved form of the spectral ratio to obtain the Fried parameter, and we apply a filter to reduce the effects of high-frequency noise. Our algorithm improves the quality of the reconstructed images. Its performance is illustrated by computer simulations.
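A minimal sketch of the noise-suppression step mentioned above: attenuating high spatial frequencies of an image in the Fourier domain. The circular cutoff used here is arbitrary and is not the filter proposed by the authors.

```python
import numpy as np

def lowpass_filter_image(image, cutoff_fraction=0.25):
    """Suppress spatial frequencies above a fraction of Nyquist via an FFT mask."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    ny, nx = image.shape
    yy, xx = np.mgrid[-ny // 2:ny - ny // 2, -nx // 2:nx - nx // 2]
    radius = np.hypot(yy / (ny / 2.0), xx / (nx / 2.0))
    mask = radius <= cutoff_fraction
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask))
    return filtered.real

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    clean = np.outer(np.hanning(128), np.hanning(128))   # smooth toy "object"
    noisy = clean + 0.2 * rng.standard_normal((128, 128))
    smoothed = lowpass_filter_image(noisy)
    print("residual noise std before/after:",
          round(np.std(noisy - clean), 3), round(np.std(smoothed - clean), 3))
```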
An Implementation of Interactive Objects on the Web.
ERIC Educational Resources Information Center
Fritze, Paul
With the release of Shockwave, Macromedia Director animations can now be incorporated directly into Web pages to provide high-quality animation and interactivity, to support, for example, tutorial-style questions and instantaneous feedback. This paper looks at the application of this technique in the translation of a traditional computer-based…
Sensing Surveillance & Navigation
2012-03-07
Removing atmospheric turbulence. Goal: to restore a single high-quality image from the observed sequence. (Fragmentary slide text follows on wavelet and time-frequency transformations, reduced-signature targets, low probability of intercept, range-dependent beam patterns, and electronic steering with frequency offsets.)
Manheimer, Eric D.; Peters, M. Robert; Wolff, Steven D.; Qureshi, Mehreen A.; Atluri, Prashanth; Pearson, Gregory D.N.; Einstein, Andrew J.
2011-01-01
Triple-rule-out computed tomography angiography (TRO CTA), performed to evaluate the coronary arteries, pulmonary arteries, and thoracic aorta, has been associated with high radiation exposure. Utilization of sequential scanning for coronary computed tomography angiography (CCTA) reduces radiation dose. The application of sequential scanning to TRO CTA is much less well defined. We analyzed radiation dose and image quality from TRO CTA performed in a single outpatient center, comparing scans from a period during which helical scanning with electrocardiographically controlled tube current modulation was used for all patients (n=35) and after adoption of a strategy incorporating sequential scanning whenever appropriate (n=35). Sequential scanning could be employed in 86% of cases. The sequential-if-appropriate strategy, compared to the helical-only strategy, was associated with a 61.6% dose decrease (mean dose-length product [DLP] of 439 mGy×cm vs 1144 mGy×cm and mean effective dose of 7.5 mSv vs 19.4 mSv, respectively, p<0.0001). Similarly, there was a 71.5% dose reduction among 30 patients scanned with the sequential protocol compared to 40 patients scanned with the helical protocol under either strategy (326 mGy×cm vs 1141 mGy×cm and 5.5 mSv vs 19.4 mSv, respectively, p<0.0001). Although image quality did not differ between strategies, there was a non-statistically significant trend towards better quality in the sequential protocol compared to the helical protocol. In conclusion, approaching TRO CTA with a diagnostic strategy of sequential scanning as appropriate offers a marked reduction in radiation dose while maintaining image quality. PMID:21306693
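The effective doses quoted above follow the usual convention of multiplying the dose-length product (DLP) by a region-specific conversion coefficient; the chest coefficient of 0.017 mSv/(mGy·cm) assumed below reproduces the reported numbers (e.g., 439 mGy·cm maps to about 7.5 mSv).

```python
def effective_dose_from_dlp(dlp_mgy_cm, k_msv_per_mgy_cm=0.017):
    """Convert dose-length product to effective dose via a region-specific coefficient.

    k = 0.017 mSv/(mGy*cm) is assumed here for the chest because it reproduces the
    paper's figures (e.g. 439 mGy*cm -> ~7.5 mSv); other regions use other coefficients.
    """
    return dlp_mgy_cm * k_msv_per_mgy_cm

if __name__ == "__main__":
    for dlp in (439.0, 1144.0, 326.0, 1141.0):
        print(f"DLP {dlp:7.1f} mGy*cm -> {effective_dose_from_dlp(dlp):5.1f} mSv")
```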
TU-AB-202-05: GPU-Based 4D Deformable Image Registration Using Adaptive Tetrahedral Mesh Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhong, Z; Zhuang, L; Gu, X
Purpose: Deformable image registration (DIR) has been employed today as an automated and effective segmentation method to transfer tumor or organ contours from the planning image to daily images, instead of manual segmentation. However, the computational time and accuracy of current DIR approaches are still insufficient for online adaptive radiation therapy (ART), which requires real-time and high-quality image segmentation, especially for large datasets of 4D-CT images. The objective of this work is to propose a new DIR algorithm, with fast computational speed and high accuracy, by using adaptive feature-based tetrahedral meshing and GPU-based parallelization. Methods: The first step is to generate the adaptive tetrahedral mesh based on the image features of a reference phase of 4D-CT, so that the deformation can be well captured and accurately diffused from the mesh vertices to voxels of the image volume. Subsequently, the deformation vector fields (DVF) and other phases of 4D-CT can be obtained by matching each phase of the target 4D-CT images with the corresponding deformed reference phase. The proposed 4D DIR method is implemented on GPU, significantly increasing the computational efficiency due to its parallel computing ability. Results: A 4D NCAT digital phantom was used to test the efficiency and accuracy of our method. Both the image and DVF results show that the fine structures and shapes of the lung are well preserved, and the tumor position is well captured, i.e., the 3D distance error is 1.14 mm. Compared to a previous voxel-based CPU implementation of DIR, such as demons, the proposed method is about 160x faster for registering a 10-phase 4D-CT with a phase dimension of 256×256×150. Conclusion: The proposed 4D DIR method uses a feature-based mesh and GPU-based parallelism, which demonstrates the capability to compute both high-quality image and motion results, with significant improvement in computational speed.
NASA Astrophysics Data System (ADS)
Volcke, P.; Pequegnat, C.; Grunberg, M.; Lecointre, A.; Bzeznik, B.; Wolyniec, D.; Engels, F.; Maron, C.; Cheze, J.; Pardo, C.; Saurel, J. M.; André, F.
2015-12-01
RESIF is a nationwide French project aimed at building a high-quality observation system to observe and understand the Earth's interior. RESIF deals with data from permanent seismic networks as well as from mobile networks, including dense/semi-dense arrays. The RESIF project is distributed among different nodes providing qualified data to the main data centre at Université Grenoble Alpes, France. Data control and qualification are performed by each individual node: the poster will provide some insights into quality control of RESIF broadband seismic component data. We will then present data that have recently been made publicly available. Data are distributed through the worldwide FDSN and European EIDA standard protocols. A new web portal is now open to explore and download seismic data and metadata. The RESIF data centre is also now connected to the Grenoble University High Performance Computing (HPC) facility: a typical use case will be presented using iRODS technologies. The use of dense observation networks is increasing, bringing challenges in data growth and handling: we will present an example where the HDF5 data format was used as an alternative to the usual seismology data formats.
Analysis of the impact of digital watermarking on computer-aided diagnosis in medical imaging.
Garcia-Hernandez, Jose Juan; Gomez-Flores, Wilfrido; Rubio-Loyola, Javier
2016-01-01
Medical images (MI) are relevant sources of information for detecting and diagnosing a large number of illnesses and abnormalities. Due to their importance, this study is focused on breast ultrasound (BUS), which is the main adjunct to mammography for detecting common breast lesions among women worldwide. On the other hand, aiming to enhance data security, image fidelity, authenticity, and content verification in e-health environments, MI watermarking has been widely used; its main goal is to embed patient metadata into MI so that the resulting image keeps its original quality. In this sense, this paper compares two watermarking approaches, namely spread spectrum based on the discrete cosine transform (SS-DCT) and the high-capacity data-hiding (HCDH) algorithm, so that the watermarked BUS images are guaranteed to be adequate for a computer-aided diagnosis (CADx) system, whose two principal outcomes are lesion segmentation and classification. Experimental results show that the HCDH algorithm is highly recommended for watermarking medical images, maintaining image quality without introducing distortion into the output of the CADx system.
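For orientation, a generic spread-spectrum DCT watermark embeds a pseudorandom ±1 sequence into mid-frequency DCT coefficients and detects it by correlation. The sketch below illustrates that textbook scheme only; it is not the SS-DCT or HCDH implementation evaluated in the study, and the strength and coefficient choices are arbitrary.

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed_watermark(image, key, strength=5.0, n_coeffs=4096):
    """Add a pseudorandom +/-1 sequence to mid-frequency 2D-DCT coefficients."""
    rng = np.random.default_rng(key)
    coeffs = dctn(image, norm="ortho")
    flat = coeffs.ravel()
    idx = rng.choice(np.arange(1000, flat.size // 2), size=n_coeffs, replace=False)
    mark = rng.choice([-1.0, 1.0], size=n_coeffs)
    flat[idx] += strength * mark
    return idctn(coeffs, norm="ortho"), (idx, mark)

def detect_watermark(image, idx, mark):
    """Correlate the selected DCT coefficients with the embedded sequence."""
    coeffs = dctn(image, norm="ortho").ravel()
    return float(np.dot(coeffs[idx], mark) / len(mark))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    host = rng.random((256, 256)) * 255.0          # stand-in for a grayscale BUS image
    marked, (idx, mark) = embed_watermark(host, key=42)
    print("detector response, watermarked image:", round(detect_watermark(marked, idx, mark), 2))
    print("detector response, unmarked image:   ", round(detect_watermark(host, idx, mark), 2))
```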
Computer-generated holograms and diffraction gratings in optical security applications
NASA Astrophysics Data System (ADS)
Stepien, Pawel J.
2000-04-01
The term 'computer generated hologram' (CGH) describes a diffractive structure strictly calculated and recorded to diffract light in a desired way. The CGH surface profile is a result of the wavefront calculation rather than of interference. CGHs are able to form 2D and 3D images. Optically variable devices (OVDs) composed of diffractive gratings are often used in security applications. There are various types of optically and digitally recorded gratings in security applications. Grating-based OVDs are used to record bright 2D images with a limited range of cinematic effects. These effects result from various orientations or densities of the recorded gratings. It is difficult to record high-quality OVDs of 3D objects using gratings. Stereograms and analogue rainbow holograms offer 3D imaging, but they are darker and have lower resolution than grating OVDs. CGH-based OVDs contain an unlimited range of cinematic effects and high-quality 3D images. Images recorded using CGHs are usually noisier than grating-based OVDs because of numerical inaccuracies in CGH calculation and mastering. CGH-based OVDs enable smooth integration of hidden and machine-readable features within an OVD design.
The Quality of Talk in Children's Joint Activity at the Computer.
ERIC Educational Resources Information Center
Mercer, Neil
1994-01-01
Describes findings of the Spoken Language and New Technology (SLANT) research project which studied the talk of primary school children in the United Kingdom who were working in small groups at computers with various kinds of software. Improvements in the quality of talk and collaboration during computer-based activities are suggested. (Contains…
Energy Efficient Image/Video Data Transmission on Commercial Multi-Core Processors
Lee, Sungju; Kim, Heegon; Chung, Yongwha; Park, Daihee
2012-01-01
In transmitting image/video data over Video Sensor Networks (VSNs), energy consumption must be minimized while maintaining high image/video quality. Although image/video compression is well known for its efficiency and usefulness in VSNs, the excessive costs associated with encoding computation and complexity still hinder its adoption for practical use. However, it is anticipated that high-performance handheld multi-core devices will be used as VSN processing nodes in the near future. In this paper, we propose a way to improve the energy efficiency of image and video compression with multi-core processors while maintaining the image/video quality. We improve the compression efficiency at the algorithmic level or derive the optimal parameters for the combination of a machine and compression based on the tradeoff between the energy consumption and the image/video quality. Based on experimental results, we confirm that the proposed approach can improve the energy efficiency of the straightforward approach by a factor of 2∼5 without compromising image/video quality. PMID:23202181
Geoinformatics 2007: data to knowledge
Brady, Shailaja R.; Sinha, A. Krishna; Gundersen, Linda C.
2007-01-01
Geoinformatics is the term used to describe a variety of efforts to promote collaboration between the computer sciences and the geosciences to solve complex scientific questions. It refers to the distributed, integrated digital information system and working environment that provides innovative means for the study of Earth systems, as well as other planets, through the use of advanced information technologies. Geoinformatics activities range from major research and development efforts creating new technologies to provide high-quality, sustained production-level services for data discovery, integration and analysis, to small, discipline-specific efforts that develop earth science data collections and data analysis tools serving the needs of individual communities. The ultimate vision of Geoinformatics is a highly interconnected data system populated with high-quality, freely available data, as well as a robust set of software for analysis, visualization, and modeling.
NASA Astrophysics Data System (ADS)
Veltri, Pierangelo
The use of computer-based solutions for data management in biology and clinical science has contributed to improving quality of life and to obtaining research results in a shorter time. Indeed, new algorithms and high-performance computation have been used in proteomics and genomics studies for treating chronic diseases (e.g., drug design), as well as for supporting clinicians both in diagnosis (e.g., image-based diagnosis) and in patient care (e.g., computer-based analysis of information gathered from the patient). In this paper we survey examples of computer-based techniques applied in both biological and clinical contexts. The reported applications also draw on experience with real-world applications at the University Medical School of Catanzaro and on the national project Staywell SH 2.0, which involves many research centers and companies aiming to study and improve citizen wellness.
High quality digital holographic reconstruction on analog film
NASA Astrophysics Data System (ADS)
Nelsen, B.; Hartmann, P.
2017-05-01
High-quality real-time digital holographic reconstruction, i.e. at 30 Hz frame rates, has been at the forefront of research and has been hailed as the holy grail of display systems. While these efforts have produced a fascinating array of computer algorithms and technology, many applications of reconstructing high-quality digital holograms do not require such high frame rates. In fact, applications such as 3D holographic lithography even require a stationary mask. Typical devices used for digital hologram reconstruction are based on spatial-light-modulator technology, and this technology is well suited for reconstructing arbitrary holograms on the fly; however, it lacks the high spatial resolution achievable by its analog counterpart, holographic film. Analog holographic film is therefore the method of choice for reconstructing high-quality static holograms. The challenge lies in taking a static, high-quality, digitally calculated hologram and effectively writing it to holographic film. We have developed a theoretical system based on a tunable phase plate, an intensity-adjustable high-coherence laser and a slip-stick based piezo rotation stage to effectively produce a digitally calculated hologram on analog film. The configuration reproduces the individual components, both the amplitude and the phase, of the hologram in the Fourier domain. These Fourier components are then individually written on the holographic film after interfering with a reference beam. The system is analogous to writing angularly multiplexed plane waves with individual component phase control.
Ground-water quality atlas of Wisconsin
Kammerer, Phil A.
1981-01-01
This report summarizes data on ground-water quality stored in the U.S. Geological Survey's computer system (WATSTORE). The summary includes water quality data for 2,443 single-aquifer wells, which tap one of the State's three major aquifers (sand and gravel, Silurian dolomite, and sandstone). Data for dissolved solids, hardness, alkalinity, calcium, magnesium, sodium, potassium, iron, manganese, sulfate, chloride, fluoride, and nitrate are summarized by aquifer and by county, and locations of wells for which data are available are shown for each aquifer. Calcium, magnesium, and bicarbonate (the principal component of alkalinity) are the major dissolved constituents in Wisconsin's ground water. High iron concentrations and hardness cause ground-water quality problems in much of the State. Statewide summaries of trace constituent (selected trace metals; arsenic, boron, and organic carbon) concentrations show that these constituents impair water quality in only a few isolated wells.
May, Matthias Stefan; Bruegel, Joscha; Brand, Michael; Wiesmueller, Marco; Krauss, Bernhard; Allmendinger, Thomas; Uder, Michael; Wuest, Wolfgang
2017-09-01
The aim of this study was to intra-individually compare the image quality obtained by dual-source, dual-energy (DSDE) computed tomography (CT) examinations and different virtual monoenergetic reconstructions with that of a low single-energy (SE) scan. Third-generation DSDE-CT was performed in 49 patients with histologically proven malignant disease of the head and neck region. Weighted average images (WAIs) and virtual monoenergetic images (VMIs) for low (40 and 60 keV) and high (120 and 190 keV) energies were reconstructed. A second scan aligned to the jaw, covering the oral cavity, was performed for every patient to reduce artifacts caused by dental hardware, using an SE-CT protocol with a 70-kV tube voltage and matching radiation dose settings. Objective image quality was evaluated by calculating contrast-to-noise ratios. Subjective image quality was evaluated by experienced radiologists. The highest contrast-to-noise ratios for vessel and tumor attenuation were obtained in 40-keV VMI (all P < 0.05). Comparable objective results were found in 60-keV VMI, WAI, and the 70-kV SE examinations. Overall subjective image quality was also highest for 40-keV VMI, but differences from 60-keV VMI, WAI, and 70-kV SE were nonsignificant (all P > 0.05). High-kiloelectron-volt VMIs reduce metal artifacts but have only limited diagnostic impact, since they remain insufficient in cases of severe dental hardware. CTDIvol did not differ significantly between the two examination protocols (DSDE: 18.6 mGy; 70-kV SE: 19.4 mGy; P = 0.10). High overall image quality for tumor delineation in head and neck imaging was obtained with 40-keV VMI. However, 70-kV SE examinations are an alternative, and modified projections aligned to the jaw are recommended in cases of severe artifacts caused by dental hardware.
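The contrast-to-noise ratio used as the objective metric above is typically computed from the mean attenuation in a lesion region of interest, the mean attenuation in a reference tissue, and the noise (standard deviation) in a homogeneous region. A minimal sketch with hypothetical Hounsfield-unit samples:

```python
import numpy as np

def contrast_to_noise_ratio(roi_lesion, roi_reference, roi_noise):
    """CNR = |mean(lesion HU) - mean(reference HU)| / std(homogeneous region HU)."""
    return abs(np.mean(roi_lesion) - np.mean(roi_reference)) / np.std(roi_noise)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical Hounsfield-unit samples, e.g. from a 40-keV virtual monoenergetic image.
    tumor = rng.normal(180.0, 15.0, 200)      # strongly enhancing lesion ROI
    muscle = rng.normal(60.0, 15.0, 200)      # reference tissue ROI
    background = rng.normal(0.0, 15.0, 200)   # homogeneous region used for noise
    print("CNR:", round(contrast_to_noise_ratio(tumor, muscle, background), 1))
```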
An Application-Based Performance Evaluation of NASAs Nebula Cloud Computing Platform
NASA Technical Reports Server (NTRS)
Saini, Subhash; Heistand, Steve; Jin, Haoqiang; Chang, Johnny; Hood, Robert T.; Mehrotra, Piyush; Biswas, Rupak
2012-01-01
The high performance computing (HPC) community has shown tremendous interest in exploring cloud computing because of its high potential. In this paper, we examine the feasibility, performance, and scalability of production-quality scientific and engineering applications of interest to NASA on NASA's cloud computing platform, called Nebula, hosted at Ames Research Center. This work represents a comprehensive evaluation of Nebula using NUTTCP, HPCC, NPB, I/O, and MPI function benchmarks as well as four applications representative of the NASA HPC workload. Specifically, we compare Nebula performance on some of these benchmarks and applications to that of NASA's Pleiades supercomputer, a traditional HPC system. We also investigate the impact of virtIO and jumbo frames on interconnect performance. Overall results indicate that on Nebula (i) virtIO and jumbo frames improve network bandwidth by a factor of 5x, (ii) there is a significant virtualization layer overhead of about 10% to 25%, (iii) write performance is lower by a factor of 25x, (iv) latency for short MPI messages is very high, and (v) overall performance is 15% to 48% lower than that on Pleiades for NASA HPC applications. We also comment on the usability of the cloud platform.
Ultrafast Comparison of Personal Genomes via Precomputed Genome Fingerprints.
Glusman, Gustavo; Mauldin, Denise E; Hood, Leroy E; Robinson, Max
2017-01-01
We present an ultrafast method for comparing personal genomes. We transform the standard genome representation (lists of variants relative to a reference) into "genome fingerprints" via locality sensitive hashing. The resulting genome fingerprints can be meaningfully compared even when the input data were obtained using different sequencing technologies, processed using different pipelines, represented in different data formats and relative to different reference versions. Furthermore, genome fingerprints are robust to up to 30% missing data. Because of their reduced size, computation on the genome fingerprints is fast and requires little memory. For example, we could compute all-against-all pairwise comparisons among the 2504 genomes in the 1000 Genomes data set in 67 s at high quality (21 μs per comparison, on a single processor), and achieved a lower quality approximation in just 11 s. Efficient computation enables scaling up a variety of important genome analyses, including quantifying relatedness, recognizing duplicative sequenced genomes in a set, population reconstruction, and many others. The original genome representation cannot be reconstructed from its fingerprint, effectively decoupling genome comparison from genome interpretation; the method thus has significant implications for privacy-preserving genome analytics.
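A minimal sketch of the fingerprinting idea described above: variant descriptors are hashed into a fixed-length count vector, and fingerprints are compared by correlation. The hashing below is a simplification for illustration, not the authors' locality sensitive hashing scheme.

```python
import hashlib
import numpy as np

def genome_fingerprint(variants, n_bins=512):
    """Hash each variant key (e.g. 'chr:pos:ref:alt') into a fixed-length count vector."""
    fp = np.zeros(n_bins)
    for v in variants:
        h = int(hashlib.sha1(v.encode()).hexdigest(), 16)
        fp[h % n_bins] += 1.0
    return fp

def fingerprint_similarity(fp_a, fp_b):
    """Pearson correlation between two fingerprints (fast, independent of genome size)."""
    return float(np.corrcoef(fp_a, fp_b)[0, 1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pool = [f"chr1:{p}:A:G" for p in rng.integers(1, 10**6, 5000)]
    genome_a = pool[:4000]
    genome_b = pool[1000:5000]                 # overlaps genome_a substantially
    genome_c = [f"chr2:{p}:C:T" for p in rng.integers(1, 10**6, 4000)]
    fa, fb, fc = (genome_fingerprint(g) for g in (genome_a, genome_b, genome_c))
    print("related pair:  ", round(fingerprint_similarity(fa, fb), 2))
    print("unrelated pair:", round(fingerprint_similarity(fa, fc), 2))
```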
NASA Astrophysics Data System (ADS)
Sasmal, Sudipta; Chakrabarti, Sandip Kumar; Pal, Sujay
To examine the quality and propagation characteristics of radio waves over a very long propagation path, the Indian Centre for Space Physics participated in the 27th Indian scientific expedition to Antarctica during 2007-2008. A Stanford University-made AWESOME (Atmospheric Weather Educational System for Observation and Modeling of Effects) Very Low Frequency (VLF) receiving system was installed at the Indian Antarctic station Maitri, and about five weeks of data were recorded successfully from the Indian transmitter VTX and several other transmitting stations worldwide. The signal quality of VTX was found to be very good and the signal amplitude was highly stable. The signal showed evidence of round-the-clock solar radiation in the Antarctic region during local summer. We compute the elevation angle of the Sun theoretically during this period. We compute the spatial distribution of the signal by using the LWPC model under the all-day and all-night propagation conditions. We compute the attenuation coefficients of the different propagation modes and observe that different modes dominate under different propagation conditions. We also observe effects of the Antarctic polar ice on the propagation modes.
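The solar elevation computation mentioned above can be approximated with standard formulas (solar declination from the day of year, hour angle from local solar time). A minimal sketch; the Maitri coordinates and the simple declination formula are assumptions for illustration.

```python
import math

def solar_elevation_deg(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation (degrees) from latitude, day of year, and local solar time."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, decl_r, ha = map(math.radians, (lat_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(decl_r)
                + math.cos(lat) * math.cos(decl_r) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

if __name__ == "__main__":
    lat_maitri = -70.77                      # approximate latitude of Maitri station (assumed)
    day = 5                                  # early January, austral summer
    for hour in (0, 6, 12, 18):
        elev = solar_elevation_deg(lat_maitri, day, hour)
        # Positive elevation even at local midnight illustrates round-the-clock sunlight.
        print(f"solar hour {hour:2d}: elevation {elev:6.1f} deg")
```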
Computer aided manual validation of mass spectrometry-based proteomic data.
Curran, Timothy G; Bryson, Bryan D; Reigelhaupt, Michael; Johnson, Hannah; White, Forest M
2013-06-15
Advances in mass spectrometry-based proteomic technologies have increased the speed of analysis and the depth provided by a single analysis. Computational tools to evaluate the accuracy of peptide identifications from these high-throughput analyses have not kept pace with technological advances; currently the most common quality evaluation methods are based on statistical analysis of the likelihood of false positive identifications in large-scale data sets. While helpful, these calculations do not consider the accuracy of each identification, thus creating a precarious situation for biologists relying on the data to inform experimental design. Manual validation is the gold standard approach to confirm accuracy of database identifications, but is extremely time-intensive. To palliate the increasing time required to manually validate large proteomic datasets, we provide computer aided manual validation software (CAMV) to expedite the process. Relevant spectra are collected, catalogued, and pre-labeled, allowing users to efficiently judge the quality of each identification and summarize applicable quantitative information. CAMV significantly reduces the burden associated with manual validation and will hopefully encourage broader adoption of manual validation in mass spectrometry-based proteomics.
Demonstration of a stable ultrafast laser based on a nonlinear microcavity
Peccianti, M.; Pasquazi, A.; Park, Y.; Little, B.E.; Chu, S.T.; Moss, D.J.; Morandotti, R.
2012-01-01
Ultrashort pulsed lasers, operating through the phenomenon of mode-locking, have had a significant role in many facets of our society for 50 years, for example, in the way we exchange information, measure and diagnose diseases, process materials, and in many other applications. Recently, high-quality resonators have been exploited to demonstrate optical combs. The ability to phase-lock their modes would allow mode-locked lasers to benefit from their high optical spectral quality, helping to realize novel sources such as precision optical clocks for applications in metrology, telecommunication, microchip-computing, and many other areas. Here we demonstrate the first mode-locked laser based on a microcavity resonator. It operates via a new mode-locking method, which we term filter-driven four-wave mixing, and is based on a CMOS-compatible high quality factor microring resonator. It achieves stable self-starting oscillation with negligible amplitude noise at ultrahigh repetition rates, and spectral linewidths well below 130 kHz. PMID:22473009
This article describes the governing equations, computational algorithms, and other components entering into the Community Multiscale Air Quality (CMAQ) modeling system. This system has been designed to approach air quality as a whole by including state-of-the-science capabiliti...
Prototyping of Dental Structures Using Laser Milling
NASA Astrophysics Data System (ADS)
Andreev, A. O.; Kosenko, M. S.; Petrovskiy, V. N.; Mironov, V. D.
2016-02-01
The results of experimental studies of the effect of ytterbium fiber laser radiation parameters on the processing efficiency and quality of ZrO2 ceramics widely used in stomatology are presented. Laser operating conditions with optimum characteristics for obtaining high-quality final surfaces and rapid material removal for dental structures are determined. The ability to form thin-walled ceramic structures by laser milling technology (with a minimum wall thickness of 50 μm) is demonstrated. Examples of three-dimensional dental structures created from computer 3D models of human teeth using laser milling are shown.
Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork
Wood, Nathan J.; Halsing, David L.
2006-01-01
To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water-quality data, water samples, and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau. Generally, respondents support greater use of mobile-computing technology at the USGS and are interested in training opportunities and further discussions related to data archiving, access to additional digital data types, and technology development.
Garrido-Morgado, Álvaro; González-Benito, Óscar; Martos-Partal, Mercedes
2016-01-01
Creating and maintaining customer loyalty are strategic requirements for modern business. In the current competitive context, product quality and brand experience are crucial in building and maintaining customer loyalty. Consumer loyalty, which may be classified into cognitive loyalty and affective loyalty, is related to customers' quality perception. Cue utilization theory distinguishes two dimensions of perceived quality: extrinsic quality, linked to the brand, and intrinsic quality, related to internal product characteristics. We propose that (i) cognitive loyalty is more influenced by intrinsic product quality, whereas extrinsic product quality (brand name) is more salient for affective loyalty, and (ii) different commercial stimuli have a differential effectiveness on intrinsic and extrinsic perceived quality. In fact, in this study, we analyze how perceived quality dimensions may influence the effectiveness of two different commercial stimuli: displays and advertising flyers. While displays work within the point of sale under time-constrained conditions, where consumers are more likely to use heuristics to simplify their decisions, advertising flyers work outside the point of sale under low time-constrained conditions and therefore favor a more reasoned purchase decision, where systematic processing will be more likely. We analyze the role of quality perception in determining the effectiveness of both of these commercial stimuli for selling products that induce high purchase involvement and perceived risk. The empirical analysis focuses on computer products sold by one of Europe's largest computer retailers and combines scanner, observational, and survey data. The results show that both dimensions of quality perception moderate the influence of displays and advertising flyers on sales, but their impact differs between the two commercial stimuli. Extrinsic quality perception increases the effect of displays to a greater extent, due to the use of a brand-name heuristic. However, intrinsic quality perception improves to a greater extent the effect of advertising flyers, which in turn are more closely related to systematic decision processing.
Kim, Joshua; Lu, Weiguo; Zhang, Tiezhi
2014-02-07
Cone-beam computed tomography (CBCT) is an important online imaging modality for image guided radiotherapy. But suboptimal image quality and the lack of a real-time stereoscopic imaging function limit its implementation in advanced treatment techniques, such as online adaptive and 4D radiotherapy. Tetrahedron beam computed tomography (TBCT) is a novel online imaging modality designed to improve on the image quality provided by CBCT. TBCT geometry is flexible, and multiple detector and source arrays can be used for different applications. In this paper, we describe a novel dual source-dual detector TBCT system that is specially designed for LINAC radiation treatment machines. The imaging system is positioned in-line with the MV beam and is composed of two linear array x-ray sources mounted alongside the electronic portal imaging device and two linear arrays of x-ray detectors mounted below the machine head. The detector and x-ray source arrays are orthogonal to each other, and each pair of source and detector arrays forms a tetrahedral volume. Four planar images can be obtained from different view angles at each gantry position at a frame rate as high as 20 frames per second. The overlapped regions provide a stereoscopic field of view of approximately 10-15 cm. With a half gantry rotation, a volumetric CT image can be reconstructed having a 45 cm field of view. Due to the scatter-rejecting design of the TBCT geometry, the system can potentially produce high quality 2D and 3D images with less radiation exposure. The design of the dual source-dual detector system is described, and preliminary results of studies performed on numerical phantoms and simulated patient data are presented.
Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping
2015-09-15
Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors’ goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
High-kVp Assisted Metal Artifact Reduction for X-ray Computed Tomography
Xi, Yan; Jin, Yannan; De Man, Bruno; Wang, Ge
2016-01-01
In X-ray computed tomography (CT), the presence of metallic parts in patients causes serious artifacts and degrades image quality. Many algorithms were published for metal artifact reduction (MAR) over the past decades with various degrees of success but without a perfect solution. Some MAR algorithms are based on the assumption that metal artifacts are due only to strong beam hardening and may fail in the case of serious photon starvation. Iterative methods handle photon starvation by discarding or underweighting corrupted data, but the results are not always stable and they come with high computational cost. In this paper, we propose a high-kVp-assisted CT scan mode combining a standard CT scan with a few projection views at a high-kVp value to obtain critical projection information near the metal parts. This method only requires minor hardware modifications on a modern CT scanner. Two MAR algorithms are proposed: dual-energy normalized MAR (DNMAR) and high-energy embedded MAR (HEMAR), aiming at situations without and with photon starvation respectively. Simulation results obtained with the CT simulator CatSim demonstrate that the proposed DNMAR and HEMAR methods can eliminate metal artifacts effectively. PMID:27891293
Performance evaluation of objective quality metrics for HDR image compression
NASA Astrophysics Data System (ADS)
Valenzise, Giuseppe; De Simone, Francesca; Lauga, Paul; Dufaux, Frederic
2014-09-01
Due to the much wider luminance and contrast range of high dynamic range (HDR) images, well-known objective quality metrics, widely used for the assessment of low dynamic range (LDR) content, cannot be directly applied to HDR images in order to predict their perceptual fidelity. To overcome this limitation, advanced fidelity metrics, such as the HDR-VDP, have been proposed to accurately predict visually significant differences. However, their complex calibration may make them difficult to use in practice. A simpler approach consists in computing arithmetic or structural fidelity metrics, such as PSNR and SSIM, on perceptually encoded luminance values, but the performance of quality prediction in this case has not been clearly studied. In this paper, we aim at providing a better comprehension of the limits and the potential of this approach by means of a subjective study. We compare the performance of HDR-VDP to that of PSNR and SSIM computed on perceptually encoded luminance values, when considering compressed HDR images. Our results show that these simpler metrics can be effectively employed to assess image fidelity for applications such as HDR image compression.
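A minimal sketch of the simpler approach discussed above: apply a perceptual encoding to absolute luminance and compute an ordinary PSNR on the encoded values. A logarithmic encoding is used here as a stand-in for the PU encoding, and the peak definition is an assumption.

```python
import numpy as np

def perceptual_encode(luminance_cd_m2):
    """Stand-in perceptual encoding: log10 of luminance, clipped to a plausible HDR range."""
    lum = np.clip(luminance_cd_m2, 1e-3, 1e5)
    return np.log10(lum)

def psnr(encoded_ref, encoded_test, peak):
    """Ordinary PSNR, but evaluated on perceptually encoded luminance values."""
    mse = np.mean((encoded_ref - encoded_test) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic HDR luminance map spanning several orders of magnitude (cd/m^2).
    reference = 10.0 ** rng.uniform(-1.0, 4.0, size=(256, 256))
    distorted = reference * (1.0 + 0.05 * rng.standard_normal((256, 256)))
    enc_ref, enc_dist = perceptual_encode(reference), perceptual_encode(distorted)
    peak = enc_ref.max() - enc_ref.min()        # dynamic range of the encoded values (assumed)
    print("PSNR on encoded luminance:", round(psnr(enc_ref, enc_dist, peak), 1), "dB")
```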
NASA Astrophysics Data System (ADS)
Zhou, Wanmeng; Wang, Hua; Tang, Guojin; Guo, Shuai
2016-09-01
The time-consuming experimental method for handling qualities assessment cannot meet the increasingly fast design requirements of manned space flight. As a tool for aircraft handling qualities research, model-predictive-control structured inverse simulation (MPC-IS) has potential applications in the aerospace field to guide astronauts' operations and evaluate handling qualities more effectively. Therefore, this paper establishes MPC-IS for manually controlled rendezvous and docking (RVD) and proposes a novel artificial neural network inverse simulation system (ANN-IS) to further decrease the computational cost. The novel system was obtained by replacing the inverse model of MPC-IS with an artificial neural network. The optimal neural network was trained by the genetic Levenberg-Marquardt algorithm and finally determined by the Levenberg-Marquardt algorithm. In order to validate MPC-IS and ANN-IS, manually controlled RVD experiments were carried out on the simulator. The comparisons between simulation results and experimental data demonstrated the validity of the two systems and the high computational efficiency of ANN-IS.
Cosmological N-body Simulation
NASA Astrophysics Data System (ADS)
Lake, George
1994-05-01
The "N" in N-body calculations has doubled every year for the last two decades. To continue this trend, the UW N-body group is working on algorithms for the fast evaluation of gravitational forces on parallel computers and establishing rigorous standards for the computations. In these algorithms, the computational cost per time step is ~10^3 pairwise forces per particle. A new adaptive time integrator enables us to perform high quality integrations that are fully temporally and spatially adaptive. SPH (smoothed particle hydrodynamics) will be added to simulate the effects of dissipating gas and magnetic fields. The importance of these calculations is two-fold. First, they determine the nonlinear consequences of theories for the structure of the Universe. Second, they are essential for the interpretation of observations. Every galaxy has six coordinates of velocity and position. Observations determine two sky coordinates and a line of sight velocity that bundles universal expansion (distance) together with a random velocity created by the mass distribution. Simulations are needed to determine the underlying structure and masses. The importance of simulations has moved from ex post facto explanation to an integral part of planning large observational programs. I will show why high quality simulations with "large N" are essential to accomplish our scientific goals. This year, our simulations have N >~ 10^7. This is sufficient to tackle some niche problems, but well short of our 5 year goal: simulating The Sloan Digital Sky Survey using a few Billion particles (a Teraflop-year simulation). Extrapolating past trends, we would have to "wait" 7 years for this hundred-fold improvement. Like past gains, significant changes in the computational methods are required for these advances. I will describe new algorithms, algorithmic hacks and a dedicated computer to perform Billion particle simulations. Finally, I will describe research that can be enabled by Petaflop computers. This research is supported by the NASA HPCC/ESS program.
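For context on the cost figure above, a brute-force gravity evaluation requires O(N^2) pairwise forces, which is what tree-based algorithms reduce to roughly 10^3 pairwise force evaluations per particle. A minimal sketch of the direct sum that such codes approximate (softening and units are arbitrary):

```python
import numpy as np

def direct_gravity_accelerations(positions, masses, softening=1e-2, G=1.0):
    """O(N^2) direct-sum gravitational accelerations (the cost that tree codes avoid)."""
    n = len(positions)
    acc = np.zeros_like(positions)
    for i in range(n):
        dr = positions - positions[i]                       # vectors to all other particles
        dist2 = np.einsum("ij,ij->i", dr, dr) + softening**2
        inv_d3 = dist2 ** -1.5
        inv_d3[i] = 0.0                                     # skip self-interaction
        acc[i] = G * np.sum((masses * inv_d3)[:, None] * dr, axis=0)
    return acc

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 2000
    pos = rng.standard_normal((n, 3))
    mass = np.full(n, 1.0 / n)
    a = direct_gravity_accelerations(pos, mass)
    print("pairwise interactions evaluated:", n * (n - 1))
    print("rms acceleration:", round(float(np.sqrt((a ** 2).sum(axis=1).mean())), 3))
```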
Computable visually observed phenotype ontological framework for plants
2011-01-01
Background The ability to search for and precisely compare similar phenotypic appearances within and across species has vast potential in plant science and genetic research. The difficulty in doing so lies in the fact that many visual phenotypic data, especially visually observed phenotypes that often times cannot be directly measured quantitatively, are in the form of text annotations, and these descriptions are plagued by semantic ambiguity, heterogeneity, and low granularity. Though several bio-ontologies have been developed to standardize phenotypic (and genotypic) information and permit comparisons across species, these semantic issues persist and prevent precise analysis and retrieval of information. A framework suitable for the modeling and analysis of precise computable representations of such phenotypic appearances is needed. Results We have developed a new framework called the Computable Visually Observed Phenotype Ontological Framework for plants. This work provides a novel quantitative view of descriptions of plant phenotypes that leverages existing bio-ontologies and utilizes a computational approach to capture and represent domain knowledge in a machine-interpretable form. This is accomplished by means of a robust and accurate semantic mapping module that automatically maps high-level semantics to low-level measurements computed from phenotype imagery. The framework was applied to two different plant species with semantic rules mined and an ontology constructed. Rule quality was evaluated and showed high quality rules for most semantics. This framework also facilitates automatic annotation of phenotype images and can be adopted by different plant communities to aid in their research. Conclusions The Computable Visually Observed Phenotype Ontological Framework for plants has been developed for more efficient and accurate management of visually observed phenotypes, which play a significant role in plant genomics research. The uniqueness of this framework is its ability to bridge the knowledge of informaticians and plant science researchers by translating descriptions of visually observed phenotypes into standardized, machine-understandable representations, thus enabling the development of advanced information retrieval and phenotype annotation analysis tools for the plant science community. PMID:21702966
Two-stage atlas subset selection in multi-atlas based image segmentation.
Zhao, Tingting; Ruan, Dan
2015-06-01
Fast-growing access to large databases and cloud-stored data presents a unique opportunity for multi-atlas based image segmentation, but also presents challenges of heterogeneous atlas quality and computational burden. This work aims to develop a novel two-stage method tailored to the special needs that arise with large atlas collections of varied quality, so that high-accuracy segmentation can be achieved at low computational cost. An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. The performance of the proposed scheme has been assessed with cross validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates comparable end-to-end segmentation performance to the conventional single-stage selection method, but with a significant computation reduction. Compared with the alternative computation reduction method, their scheme improves the mean and median Dice similarity coefficient values from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance. The authors have developed a novel two-stage atlas subset selection scheme for multi-atlas based segmentation. It achieves good segmentation accuracy with significantly reduced computation cost, making it a suitable configuration in the presence of extensive heterogeneous atlases.
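The Dice similarity coefficient reported above measures the overlap between a computed segmentation and a reference segmentation, DSC = 2|A∩B| / (|A| + |B|). A minimal sketch on binary masks:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

if __name__ == "__main__":
    # Toy 2D example: a reference disc and a slightly shifted automatic segmentation.
    yy, xx = np.mgrid[0:100, 0:100]
    reference = (yy - 50) ** 2 + (xx - 50) ** 2 < 20 ** 2
    automatic = (yy - 52) ** 2 + (xx - 53) ** 2 < 20 ** 2
    print("Dice:", round(dice_coefficient(reference, automatic), 3))
```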
Fang, Ruogu; Karlsson, Kolbeinn; Chen, Tsuhan; Sanelli, Pina C.
2014-01-01
Blood-brain-barrier permeability (BBBP) measurements extracted from perfusion computed tomography (PCT) using the Patlak model can be a valuable indicator to predict hemorrhagic transformation in patients with acute stroke. Unfortunately, standard Patlak-model-based PCT requires excessive radiation exposure, which has raised concerns about radiation safety. Minimizing radiation dose is of high value in clinical practice but can degrade image quality because of the severe noise introduced. The purpose of this work is to construct high quality BBBP maps from low-dose PCT data by using the brain structural similarity between different individuals and the relations between the high- and low-dose maps. The proposed sparse high-dose induced (shd-Patlak) model performs by building a high-dose induced prior for the Patlak model with a set of location adaptive dictionaries, followed by an optimized estimation of the BBBP map with the prior regularized Patlak model. Evaluation with simulated low-dose clinical brain PCT datasets clearly demonstrates that the shd-Patlak model can achieve more significant gains than the standard Patlak model, with improved visual quality, higher fidelity to the gold standard and more accurate details for clinical analysis. PMID:24200529
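For orientation, the unregularized Patlak fit that the shd-Patlak model builds on can be written as a per-voxel linear regression. A minimal numpy sketch follows, assuming the arterial input is positive over the fitted time points; this is the textbook Patlak model, not the authors' dictionary-prior method.

```python
import numpy as np

def patlak_bbbp(c_tissue, c_artery, t):
    """Standard (unregularized) Patlak fit for one voxel.

    Fits C_t(t)/C_a(t) against the normalized cumulative arterial input;
    the slope estimates permeability (BBBP) and the intercept the
    fractional blood volume.  Assumes c_artery > 0 at the fitted times.
    """
    # trapezoidal cumulative integral of the arterial input
    cum_ca = np.concatenate(([0.0],
                             np.cumsum(0.5 * (c_artery[1:] + c_artery[:-1]) * np.diff(t))))
    x = cum_ca / c_artery          # Patlak abscissa
    y = c_tissue / c_artery        # Patlak ordinate
    A = np.stack([x, np.ones_like(x)], axis=1)
    (slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
    return slope, intercept        # slope ~ BBBP, intercept ~ blood volume fraction
```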
Gebhard, Cathérine; Fuchs, Tobias A; Fiechter, Michael; Stehli, Julia; Stähli, Barbara E; Gaemperli, Oliver; Kaufmann, Philipp A
2013-10-01
The accuracy of coronary computed tomography angiography (CCTA) in obese persons is compromised by increased image noise. We investigated CCTA image quality acquired on a high-definition 64-slice CT scanner using modern adaptive statistical iterative reconstruction (ASIR). Seventy overweight and obese patients (24 males; mean age 57 years, mean body mass index 33 kg/m(2)) were studied with clinically-indicated contrast enhanced CCTA. Thirty-five patients underwent a standard definition protocol with filtered backprojection reconstruction (SD-FBP) while 35 patients matched for gender, age, body mass index and coronary artery calcifications underwent a novel high definition protocol with ASIR (HD-ASIR). Segment by segment image quality was assessed using a four-point scale (1 = excellent, 2 = good, 3 = moderate, 4 = non-diagnostic) and revealed better scores for HD-ASIR compared to SD-FBP (1.5 ± 0.43 vs. 1.8 ± 0.48; p < 0.05). The smallest detectable vessel diameter was also improved, 1.0 ± 0.5 mm for HD-ASIR as compared to 1.4 ± 0.4 mm for SD-FBP (p < 0.001). Average vessel attenuation was higher for HD-ASIR (388.3 ± 109.6 versus 350.6 ± 90.3 Hounsfield Units, HU; p < 0.05), while image noise, signal-to-noise ratio and contrast-to-noise ratio did not differ significantly between reconstruction protocols (p = NS). The estimated effective radiation doses were similar, 2.3 ± 0.1 and 2.5 ± 0.1 mSv (HD-ASIR vs. SD-FBP, respectively). Compared to a standard definition backprojection protocol (SD-FBP), a newer high definition scan protocol in combination with ASIR (HD-ASIR) incrementally improved image quality and visualization of distal coronary artery segments in overweight and obese individuals, without increasing image noise and radiation dose.
ERIC Educational Resources Information Center
Zigic, Sasha; Lemckert, Charles J.
2007-01-01
The following paper presents a computer-based learning strategy to assist in introducing and teaching water quality modelling to undergraduate civil engineering students. As part of the learning strategy, an interactive computer-based instructional (CBI) aid was specifically developed to assist students to set up, run and analyse the output from a…
Mekitarian Filho, Eduardo; de Carvalho, Werther Brunow; Gilio, Alfredo Elias; Robinson, Fay; Mason, Keira P
2013-10-01
This pilot study introduces the aerosolized route for midazolam as an option for infant and pediatric sedation for computed tomography imaging. This technique produced predictable and effective sedation for quality computed tomography imaging studies with minimal artifact and no significant adverse events. Copyright © 2013 Mosby, Inc. All rights reserved.
Nauer, Claude Bertrand; Zubler, Christoph; Weisstanner, Christian; Stieger, Christof; Senn, Pascal; Arnold, Andreas
2012-03-01
The purpose of this experimental study was to investigate the effect of tube tension reduction on image contrast and image quality in pediatric temporal bone computed tomography (CT). Seven lamb heads with infant-equivalent sizes were scanned repeatedly, using four tube tensions from 140 to 80 kV while the CT-Dose Index (CTDI) was held constant. Scanning was repeated with four CTDI values from 30 to 3 mGy. Image contrast was calculated for the middle ear as the Hounsfield unit (HU) difference between bone and air and for the inner ear as the HU difference between bone and fluid. The influence of tube tension on high-contrast detail delineation was evaluated using a phantom. The subjective image quality of eight middle and inner ear structures was assessed using a 4-point scale (scores 1-2 = insufficient; scores 3-4 = sufficient). Middle and inner ear contrast showed a near linear increase with tube tension reduction (r = -0.94/-0.88) and was highest at 80 kV. Tube tension had no influence on spatial resolution. Subjective image quality analysis showed significantly better scoring at lower tube tensions, with highest image quality at 80 kV. However, image quality improvement was most relevant for low-dose scans. Image contrast in the temporal bone is significantly higher at low tube tensions, leading to a better subjective image quality. Highest contrast and best quality were found at 80 kV. This image quality improvement might be utilized to further reduce the radiation dose in pediatric low-dose CT protocols.
Depth image super-resolution via semi self-taught learning framework
NASA Astrophysics Data System (ADS)
Zhao, Furong; Cao, Zhiguo; Xiao, Yang; Zhang, Xiaodi; Xian, Ke; Li, Ruibo
2017-06-01
Depth images have recently attracted much attention in computer vision and in the production of high-quality 3D content for 3DTV and 3D movies. In this paper, we present a new semi self-taught learning application framework for enhancing the resolution of depth maps without making use of ancillary color image data at the target resolution, or multiple aligned depth maps. Our framework consists of cascaded random forests that proceed from coarse to fine results. We learn the surface information and structure transformations both from a small set of high-quality depth exemplars and from the input depth map itself across different scales. Considering that edges play an important role in depth map quality, we optimize an effective regularized objective that operates on the output image space and the input edge space within the random forests. Experiments show the effectiveness and superiority of our method over other techniques, with or without aligned RGB information.
A protocol for generating a high-quality genome-scale metabolic reconstruction.
Thiele, Ines; Palsson, Bernhard Ø
2010-01-01
Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have been developed over the last 10 years. These reconstructions represent structured knowledge bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates a myriad of computational biological studies, including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge bases. Here we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction, as well as the common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process.
A protocol for generating a high-quality genome-scale metabolic reconstruction
Thiele, Ines; Palsson, Bernhard Ø.
2011-01-01
Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have developed over the past 10 years. These reconstructions represent structured knowledge-bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates myriad computational biological studies including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics, and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge-bases. Here, we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction as well as common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process. PMID:20057383
Multi-Resolution Unstructured Grid-Generation for Geophysical Applications on the Sphere
NASA Technical Reports Server (NTRS)
Engwirda, Darren
2015-01-01
An algorithm for the generation of non-uniform unstructured grids on ellipsoidal geometries is described. This technique is designed to generate high quality triangular and polygonal meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric and ocean simulation, and numerical weather prediction. Using a recently developed Frontal-Delaunay-refinement technique, a method for the construction of high-quality unstructured ellipsoidal Delaunay triangulations is introduced. A dual polygonal grid, derived from the associated Voronoi diagram, is also optionally generated as a by-product. Compared to existing techniques, it is shown that the Frontal-Delaunay approach typically produces grids with near-optimal element quality and smooth grading characteristics, while imposing relatively low computational expense. Initial results are presented for a selection of uniform and non-uniform ellipsoidal grids appropriate for large-scale geophysical applications. The use of user-defined mesh-sizing functions to generate smoothly graded, non-uniform grids is discussed.
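The user-defined mesh-sizing functions mentioned at the end are simply scalar fields h(position) that the mesh generator queries. The toy example below (an assumed equatorial refinement band, not taken from the paper) shows the general shape of such a function.

```python
import numpy as np

def mesh_size_km(lat_deg, h_coarse=120.0, h_fine=30.0, band_deg=15.0):
    """Illustrative mesh-sizing function h(lat) in km.

    Returns the target edge length: fine resolution inside an equatorial
    band, smoothly graded to a coarse background value.  Real sizing
    functions are application-specific; this only sketches the idea.
    """
    lat = np.asarray(lat_deg, dtype=float)
    weight = np.exp(-(lat / band_deg) ** 2)   # 1 at the equator, tends to 0 poleward
    return h_coarse - (h_coarse - h_fine) * weight

# Example: target edge lengths at a few latitudes
print(mesh_size_km([0.0, 15.0, 45.0, 80.0]))
```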
Low-temperature magnetotransport in Si/SiGe heterostructures on 300 mm Si wafers
NASA Astrophysics Data System (ADS)
Scappucci, Giordano; Yeoh, L.; Sabbagh, D.; Sammak, A.; Boter, J.; Droulers, G.; Kalhor, N.; Brousse, D.; Veldhorst, M.; Vandersypen, L. M. K.; Thomas, N.; Roberts, J.; Pillarisetty, R.; Amin, P.; George, H. C.; Singh, K. J.; Clarke, J. S.
Undoped Si/SiGe heterostructures are a promising material stack for the development of spin qubits in silicon. Deploying qubits in high-volume manufacturing of a quantum computer requires stringent control over substrate uniformity and quality. Electron mobility and valley splitting are two key electrical metrics of substrate quality relevant for qubits. Here we present low-temperature magnetotransport measurements of strained Si quantum wells with mobilities in excess of 100000 cm2/Vs fabricated on 300 mm wafers within the framework of advanced semiconductor manufacturing. These results are benchmarked against the results obtained in Si quantum wells deposited on 100 mm Si wafers in an academic research environment. To ensure rapid progress in quantum well quality, we have implemented fast feedback loops from materials growth to heterostructure FET fabrication and low-temperature characterisation. On this topic we will present recent progress in developing a cryogenic platform for high-throughput magnetotransport measurements.
Study of ceramic products and processing techniques in space. [using computerized simulation
NASA Technical Reports Server (NTRS)
Markworth, A. J.; Oldfield, W.
1974-01-01
An analysis of the solidification kinetics of beta alumina in a zero-gravity environment was carried out, using computer-simulation techniques, in order to assess the feasibility of producing high-quality single crystals of this material in space. The two coupled transport processes included were movement of the solid-liquid interface and diffusion of sodium atoms in the melt. Results of the simulation indicate that appreciable crystal-growth rates can be attained in space. Considerations were also made of the advantages offered by high-quality single crystals of beta alumina for use as a solid electrolyte; these clearly indicate that space-grown materials are superior in many respects to analogous terrestrially-grown crystals. Likewise, economic considerations, based on the rapidly expanding technological applications for beta alumina and related fast ionic conductors, reveal that the many superior qualities of space-grown material justify the added expense and experimental detail associated with space processing.
SAIL: Summation-bAsed Incremental Learning for Information-Theoretic Text Clustering.
Cao, Jie; Wu, Zhiang; Wu, Junjie; Xiong, Hui
2013-04-01
Information-theoretic clustering aims to exploit information-theoretic measures as the clustering criteria. A common practice on this topic is the so-called Info-Kmeans, which performs K-means clustering with KL-divergence as the proximity function. While expert efforts on Info-Kmeans have shown promising results, a remaining challenge is to deal with high-dimensional sparse data such as text corpora. Indeed, it is possible that the centroids contain many zero-value features for high-dimensional text vectors, which leads to infinite KL-divergence values and creates a dilemma in assigning objects to centroids during the iteration process of Info-Kmeans. To meet this challenge, in this paper, we propose a Summation-bAsed Incremental Learning (SAIL) algorithm for Info-Kmeans clustering. Specifically, by using an equivalent objective function, SAIL replaces the computation of KL-divergence by the incremental computation of Shannon entropy. This can avoid the zero-feature dilemma caused by the use of KL-divergence. To improve the clustering quality, we further introduce the variable neighborhood search scheme and propose the V-SAIL algorithm, which is then accelerated by a multithreaded scheme in PV-SAIL. Our experimental results on various real-world text collections have shown that, with SAIL as a booster, the clustering performance of Info-Kmeans can be significantly improved. Also, V-SAIL and PV-SAIL indeed help improve the clustering quality at a lower cost of computation.
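A small sketch may help illustrate the zero-feature dilemma and the flavor of a summation-based, entropy-driven alternative. The delta_objective criterion below is illustrative only; the exact equivalent objective used by SAIL is the one derived in the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) for probability vectors; infinite when q has a zero where
    p does not, which is the dilemma sparse text centroids create."""
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def entropy(p):
    """Shannon entropy of a probability vector (0*log 0 treated as 0)."""
    nz = p[p > 0]
    return float(-np.sum(nz * np.log(nz)))

def delta_objective(cluster_sum, cluster_count, x):
    """Change in n_k * H(mean_k) if document x (an L1-normalized term
    vector) joins the cluster whose summed vector is cluster_sum.
    An incremental, summation-based criterion in the spirit of SAIL."""
    old = cluster_count * entropy(cluster_sum / cluster_count) if cluster_count else 0.0
    new_sum, new_count = cluster_sum + x, cluster_count + 1
    return new_count * entropy(new_sum / new_count) - old
```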
Room temperature linelists for CO2 asymmetric isotopologues with ab initio computed intensities
NASA Astrophysics Data System (ADS)
Zak, Emil J.; Tennyson, Jonathan; Polyansky, Oleg L.; Lodi, Lorenzo; Zobov, Nikolay F.; Tashkun, Sergei A.; Perevalov, Valery I.
2017-12-01
The present paper reports room temperature line lists for six asymmetric isotopologues of carbon dioxide: 16O12C18O (628), 16O12C17O (627), 16O13C18O (638), 16O13C17O (637), 17O12C18O (728) and 17O13C18O (738), covering the range 0-8000 cm-1. Variational rotation-vibration wavefunctions and energy levels are computed using the DVR3D software suite and a high quality semi-empirical potential energy surface (PES), followed by computation of intensities using an ab initio dipole moment surface (DMS). A theoretical procedure for quantifying the sensitivity of line intensities to minor distortions of the PES/DMS allows our theoretical model to be critically evaluated. Several recent high quality measurements and theoretical approaches are discussed to provide a benchmark of our results against the most accurate available data. Indeed, the thesis of transferability of accuracy among different isotopologues using a mass-independent PES is supported by several examples. We therefore conclude that the majority of line intensities for strong bands are predicted with sub-percent accuracy. Accurate line positions are generated using an effective Hamiltonian, constructed from the latest experiments. This study completes the list of relevant isotopologues of carbon dioxide; these line lists are available for remote sensing studies and for inclusion in databases.
Data Curation: Improving Environmental Health Data Quality.
Yang, Lin; Li, Jiao; Hou, Li; Qian, Qing
2015-01-01
With the growing recognition of the influence of climate change on human health, scientists have turned their attention to analyzing the relationship between meteorological factors and adverse health effects. However, the paucity of high quality integrated data is one of the great challenges, especially when scientific studies rely on data-intensive computing. This paper aims to design an appropriate curation process to address this problem. We present a data curation workflow that: (i) follows the guidance of the DCC Curation Lifecycle Model; (ii) combines manual curation with automatic curation; and (iii) solves the environmental health data curation problem. The workflow was applied to a medical knowledge service system and was shown to be capable of improving work efficiency and data quality.
GPU-accelerated regularized iterative reconstruction for few-view cone beam CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matenine, Dmitri, E-mail: dmitri.matenine.1@ulaval.ca; Goussard, Yves, E-mail: yves.goussard@polymtl.ca; Després, Philippe, E-mail: philippe.despres@phy.ulaval.ca
2015-04-15
Purpose: The present work proposes an iterative reconstruction technique designed for x-ray transmission computed tomography (CT). The main objective is to provide a model-based solution to the cone-beam CT reconstruction problem, yielding accurate low-dose images via few-view acquisitions in clinically acceptable time frames. Methods: The proposed technique combines a modified ordered subsets convex (OSC) algorithm and the total variation minimization (TV) regularization technique and is called OSC-TV. The number of subsets of each OSC iteration follows a reduction pattern in order to ensure the best performance of the regularization method. Considering the high computational cost of the algorithm, it is implemented on a graphics processing unit, using parallelization to accelerate computations. Results: The reconstructions were performed on computer-simulated as well as human pelvic cone-beam CT projection data and image quality was assessed. In terms of convergence and image quality, OSC-TV performs well in reconstruction of low-dose cone-beam CT data obtained via a few-view acquisition protocol. It compares favorably to the few-view TV-regularized projections onto convex sets (POCS-TV) algorithm. It also appears to be a viable alternative to full-dataset filtered backprojection. Execution times are 1–2 min and are compatible with the typical clinical workflow for nonreal-time applications. Conclusions: Considering the image quality and execution times, this method may be useful for reconstruction of low-dose clinical acquisitions. It may be of particular benefit to patients who undergo multiple acquisitions by reducing the overall imaging radiation dose and associated risks.
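As a rough illustration of the TV half of an OSC-TV iteration, the following numpy sketch performs one explicit descent step on a smoothed isotropic total-variation term; the ordered-subsets convex data update and the GPU parallelization are omitted.

```python
import numpy as np

def tv_denoise_step(img, step=0.1, eps=1e-8):
    """One explicit gradient step on the (smoothed, isotropic) total
    variation of a 2-D image: the regularization half of an OSC-TV-style
    iteration.  The data-fidelity (OSC) update is not included here."""
    gx = np.diff(img, axis=1, append=img[:, -1:])   # forward differences
    gy = np.diff(img, axis=0, append=img[-1:, :])
    norm = np.sqrt(gx ** 2 + gy ** 2 + eps)
    # divergence of the normalized gradient field (negative TV gradient)
    div_x = np.diff(gx / norm, axis=1, prepend=(gx / norm)[:, :1])
    div_y = np.diff(gy / norm, axis=0, prepend=(gy / norm)[:1, :])
    return img + step * (div_x + div_y)
```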
Emotion Analysis of Telephone Complaints from Customer Based on Affective Computing.
Gong, Shuangping; Dai, Yonghui; Ji, Jun; Wang, Jinzhao; Sun, Hai
2015-01-01
Customer complaint has been the important feedback for modern enterprises to improve their product and service quality as well as the customer's loyalty. As one of the commonly used manners in customer complaint, telephone communication carries rich emotional information of speeches, which provides valuable resources for perceiving the customer's satisfaction and studying the complaint handling skills. This paper studies the characteristics of telephone complaint speeches and proposes an analysis method based on affective computing technology, which can recognize the dynamic changes of customer emotions from the conversations between the service staff and the customer. The recognition process includes speaker recognition, emotional feature parameter extraction, and dynamic emotion recognition. Experimental results show that this method is effective and can reach high recognition rates of happy and angry states. It has been successfully applied to the operation quality and service administration in telecom and Internet service company.
Youpi: YOUr processing PIpeline
NASA Astrophysics Data System (ADS)
Monnerville, Mathias; Sémah, Gregory
2012-03-01
Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.
High Available COTS Based Computer for Space
NASA Astrophysics Data System (ADS)
Hartmann, J.; Magistrati, Giorgio
2015-09-01
The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to the flight control system of an autonomously navigating spacecraft, each application defines its specific availability factor under the target application boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures to fulfill the availability and reliability demands as well as the increase in required data processing power. In contrast to these increased quality requirements, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems has not always been possible because of the obsolescence of EEE parts, insufficient IO capabilities, or the fact that available data processing systems did not provide the required scalability and performance.
Guo, Hao; Fu, Jing
2013-01-01
Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business, and much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms a GA in computing time, solution optimality, and computing stability. The proposed model is very useful in helping managers make the right decisions in an e-supply chain environment. PMID:24489489
Gao, Yuan; Peters, Ove A; Wu, Hongkun; Zhou, Xuedong
2009-02-01
The purpose of this study was to customize an application framework by using the MeVisLab image processing and visualization platform for three-dimensional reconstruction and assessment of tooth and root canal morphology. One maxillary first molar was scanned before and after preparation with ProTaper by using micro-computed tomography. With a customized application framework based on MeVisLab, internal and external anatomy was reconstructed. Furthermore, the dimensions of root canal and radicular dentin were quantified, and effects of canal preparation were assessed. Finally, a virtual preparation with risk analysis was performed to simulate the removal of a broken instrument. This application framework provided an economical platform and met current requirements of endodontic research. The broad-based use of high-quality free software and the resulting exchange of experience might help to improve the quality of endodontic research with micro-computed tomography.
LLSURE: local linear SURE-based edge-preserving image filtering.
Qiu, Tianshuang; Wang, Aiqi; Yu, Nannan; Song, Aimin
2013-01-01
In this paper, we propose a novel approach for performing high-quality edge-preserving image filtering. Based on a local linear model and using the principle of Stein's unbiased risk estimate as an estimator for the mean squared error from the noisy image only, we derive a simple explicit image filter which can filter out noise while preserving edges and fine-scale details. Moreover, this filter has a fast and exact linear-time algorithm whose computational complexity is independent of the filtering kernel size; thus, it can be applied to real time image processing tasks. The experimental results demonstrate the effectiveness of the new filter for various computer vision applications, including noise reduction, detail smoothing and enhancement, high dynamic range compression, and flash/no-flash denoising.
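The local linear model with box-filter (kernel-size-independent) cost can be sketched as below. Note that this self-guided variant uses a fixed regularization constant eps, whereas LLSURE selects the regularization from Stein's unbiased risk estimate, so this is only a structural analogue, not the paper's filter.

```python
import numpy as np

def box_filter(img, r):
    """Mean over a (2r+1)x(2r+1) window via a summed-area table, so the
    cost per pixel is independent of the window radius."""
    pad = np.pad(img, ((r + 1, r), (r + 1, r)), mode='edge')
    c = pad.cumsum(0).cumsum(1)
    s = (c[2*r+1:, 2*r+1:] - c[:-2*r-1, 2*r+1:]
         - c[2*r+1:, :-2*r-1] + c[:-2*r-1, :-2*r-1])
    return s / (2 * r + 1) ** 2

def local_linear_filter(p, r=4, eps=1e-2):
    """Self-guided local linear smoothing: q = a*p + b in every window,
    with a shrinking toward 0 in flat regions and toward 1 at strong
    edges.  A fixed eps stands in for the SURE-based choice of LLSURE."""
    p = p.astype(float)
    mean_p = box_filter(p, r)
    var_p = box_filter(p * p, r) - mean_p ** 2
    a = var_p / (var_p + eps)
    b = (1.0 - a) * mean_p
    return box_filter(a, r) * p + box_filter(b, r)
```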
Converting laserdisc video to digital video: a demonstration project using brain animations.
Jao, C S; Hier, D B; Brint, S U
1995-01-01
Interactive laserdiscs are of limited value in large group learning situations due to the expense of establishing multiple workstations. The authors implemented an alternative to laserdisc video by using indexed digital video combined with an expert system. High-quality video was captured from a laserdisc player and combined with waveform audio into an audio-video-interleave (AVI) file format in the Microsoft Video-for-Windows environment (Microsoft Corp., Seattle, WA). With the use of an expert system, a knowledge-based computer program provided random access to these indexed AVI files. The program can be played on any multimedia computer without the need for laserdiscs. This system offers a high level of interactive video without the overhead and cost of a laserdisc player.
SOA-based digital library services and composition in biomedical applications.
Zhao, Xia; Liu, Enjie; Clapworthy, Gordon J; Viceconti, Marco; Testi, Debora
2012-06-01
Carefully collected, high-quality data are crucial in biomedical visualization, and it is important that the user community has ready access to both this data and the high-performance computing resources needed by the complex, computational algorithms that will process it. Biological researchers generally require data, tools and algorithms from multiple providers to achieve their goals. This paper illustrates our response to the problems that result from this. The Living Human Digital Library (LHDL) project presented in this paper has taken advantage of Web Services to build a biomedical digital library infrastructure that allows clinicians and researchers not only to preserve, trace and share data resources, but also to collaborate at the data-processing level. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Load Balancing Strategies for Multi-Block Overset Grid Applications
NASA Technical Reports Server (NTRS)
Djomehri, M. Jahed; Biswas, Rupak; Lopez-Benitez, Noe; Biegel, Bryan (Technical Monitor)
2002-01-01
The multi-block overset grid method is a powerful technique for high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process uses a grid system that discretizes the problem domain by using separately generated but overlapping structured grids that periodically update and exchange boundary information through interpolation. For efficient high performance computations of large-scale realistic applications using this methodology, the individual grids must be properly partitioned among the parallel processors. Overall performance, therefore, largely depends on the quality of load balancing. In this paper, we present three different load balancing strategies for overset grids and analyze their effects on the parallel efficiency of a Navier-Stokes CFD application running on an SGI Origin2000 machine.
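One simple baseline for the partitioning problem described here is a largest-first greedy assignment of grids to processors by point count. The sketch below illustrates that baseline; it is not necessarily one of the three strategies compared in the paper.

```python
import heapq

def greedy_partition(grid_sizes, n_procs):
    """Largest-first greedy assignment of overset grids to processors,
    balancing the total number of grid points per processor."""
    heap = [(0.0, p) for p in range(n_procs)]        # (current load, processor id)
    heapq.heapify(heap)
    assignment = {}
    for gid, size in sorted(enumerate(grid_sizes), key=lambda t: -t[1]):
        load, proc = heapq.heappop(heap)             # least-loaded processor
        assignment[gid] = proc
        heapq.heappush(heap, (load + size, proc))
    return assignment

# Example: 6 grids of different sizes distributed over 3 processors
print(greedy_partition([5.0e6, 3.2e6, 2.8e6, 1.1e6, 0.9e6, 0.4e6], 3))
```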
Silveira, Augusta; Gonçalves, Joaquim; Sequeira, Teresa; Ribeiro, Cláudia; Lopes, Carlos; Monteiro, Eurico; Pimentel, Francisco Luís
2011-12-01
Quality of Life is a distinct and important emerging health focus, guiding practice and research. The routine Quality of Life evaluation in clinical, economic, and epidemiological studies and in medical practice promises a better Quality of Life and improved health resources optimization. The use of information technology and a Knowledge Management System related to Quality of Life assessment is essential to routine clinical evaluation and can define a clinical research methodology that is more efficient and better organized. In this paper, a Validation Model using the Quality of Life informatics platform is presented. Portuguese PC software using the European Organization for Research and Treatment of Cancer questionnaires (EORTC-QLQ C30 and EORTC-H&N35) is compared with the original paper-pen approach in the Quality of Life monitoring of head and neck cancer patients. The Quality of Life informatics platform was designed specifically for this study with a simple and intuitive interface that ensures confidentiality while providing Quality of Life evaluation for all cancer patients. For the Validation Model, the sample selection was random. Fifty-four head and neck cancer patients completed 216 questionnaires (108 using the informatics platform and 108 using the original paper-pen approach) with a one-hour interval in between. Patient preferences and computer experience were registered. The Quality of Life informatics platform showed high usability as a user-friendly tool. This informatics platform allows data collection by auto-reply, database construction, and statistical data analysis and also facilitates the automatic listing of the questionnaires. When comparing the approaches (Wilcoxon test by item, percentile distribution and Cronbach's alpha), most of the responses were similar. Most of the patients (53.6%) reported a preference for the software version. The Quality of Life informatics platform has proven to be a powerful and effective tool, allowing real-time analysis of Quality of Life data. Computer-based quality-of-life monitoring in head and neck cancer patients is essential to get clinically meaningful data that can support clinical decisions, identify potential needs, and support a stepped-care model. This represents a fundamental step toward routine Quality of Life implementation in the clinical practice of the ORL and C&P department services of the Portuguese Oncology Institute (IPO-Porto).
Computational efficiency for the surface renewal method
NASA Astrophysics Data System (ADS)
Kelley, Jason; Higgins, Chad
2018-04-01
Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of different methods have been published describing automated calibration of SR parameters. Because the SR method utilizes high-frequency (10 Hz+) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly, and these were tested for sensitivity to the length of the flux averaging period, the ability to measure over a large range of lag timescales, and overall computational efficiency. These algorithms utilize signal processing techniques and algebraic simplifications, demonstrating that simple modifications can dramatically improve computational efficiency. The results here complement efforts by other authors to standardize a robust and accurate computational SR method. The increased speed of computation grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, for applied monitoring, and in novel field deployments.
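One of the repeated, lag-dependent computations in SR calibration is the evaluation of structure functions of the high-frequency scalar series. A vectorized numpy version is sketched below as an example of the kind of simplification discussed, not as the authors' specific algorithms.

```python
import numpy as np

def structure_functions(x, lags, orders=(2, 3, 5)):
    """n-th order structure functions S_n(r) = <(x(t) - x(t - r))^n>
    of a high-frequency scalar series, for every integer lag r in `lags`.
    These moments feed the surface-renewal ramp model; vectorizing them
    is one way to cut the cost of repeated SR calibrations."""
    out = {n: np.empty(len(lags)) for n in orders}
    for i, r in enumerate(lags):
        d = x[r:] - x[:-r]                  # lagged differences in one shot
        for n in orders:
            out[n][i] = np.mean(d ** n)
    return out

# Example with 10 Hz temperature data and lags of 0.1 s to 2 s:
#   temps = np.loadtxt("sonic_T.txt")
#   sf = structure_functions(temps, range(1, 21))
```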
Serang, Oliver; MacCoss, Michael J.; Noble, William Stafford
2010-01-01
The problem of identifying proteins from a shotgun proteomics experiment has not been definitively solved. Identifying the proteins in a sample requires ranking them, ideally with interpretable scores. In particular, “degenerate” peptides, which map to multiple proteins, have made such a ranking difficult to compute. The problem of computing posterior probabilities for the proteins, which can be interpreted as confidence in a protein’s presence, has been especially daunting. Previous approaches have either ignored the peptide degeneracy problem completely, addressed it by computing a heuristic set of proteins or heuristic posterior probabilities, or by estimating the posterior probabilities with sampling methods. We present a probabilistic model for protein identification in tandem mass spectrometry that recognizes peptide degeneracy. We then introduce graph-transforming algorithms that facilitate efficient computation of protein probabilities, even for large data sets. We evaluate our identification procedure on five different well-characterized data sets and demonstrate our ability to efficiently compute high-quality protein posteriors. PMID:20712337
Designing a Broadband Pump for High-Quality Micro-Lasers via Modified Net Radiation Method.
Nechayev, Sergey; Reusswig, Philip D; Baldo, Marc A; Rotschild, Carmel
2016-12-07
High-quality micro-lasers are key ingredients in non-linear optics, communication, sensing and low-threshold solar-pumped lasers. However, such micro-lasers exhibit negligible absorption of free-space broadband pump light. Recently, this limitation was lifted by cascade energy transfer, in which the absorption and quality factor are modulated with wavelength, enabling non-resonant pumping of high-quality micro-lasers and solar-pumped laser to operate at record low solar concentration. Here, we present a generic theoretical framework for modeling the absorption, emission and energy transfer of incoherent radiation between cascade sensitizer and laser gain media. Our model is based on linear equations of the modified net radiation method and is therefore robust, fast converging and has low complexity. We apply this formalism to compute the optimal parameters of low-threshold solar-pumped lasers. It is revealed that the interplay between the absorption and self-absorption of such lasers defines the optimal pump absorption below the maximal value, which is in contrast to conventional lasers for which full pump absorption is desired. Numerical results are compared to experimental data on a sensitized Nd 3+ :YAG cavity, and quantitative agreement with theoretical models is found. Our work modularizes the gain and sensitizing components and paves the way for the optimal design of broadband-pumped high-quality micro-lasers and efficient solar-pumped lasers.
Mpeg2 codec HD improvements with medical and robotic imaging benefits
NASA Astrophysics Data System (ADS)
Picard, Wayne F. J.
2010-02-01
In this report, we propose an efficient scheme to use High Definition Television (HDTV) in a console or notebook format as a computer terminal in addition to its role as a TV display unit. In the proposed scheme, we assume that the main computer is situated at a remote location. The computer raster in the remote server is compressed using an HD E->Mpeg2 encoder and transmitted to the terminal at home. The built-in E->Mpeg2 decoder in the terminal decompresses the compressed bit stream, and displays the raster. The terminal will be fitted with a mouse and keyboard, through which the interaction with the remote computer server can be performed via a communications back channel. The terminal in a notebook format can thus be used as a high resolution computer and multimedia device. We will consider developments such as the required HD enhanced Mpeg2 resolution (E->Mpeg2) and its medical ramifications due to improvements on compressed image quality with 2D to 3D conversion (Mpeg3) and using the compressed Discrete Cosine Transform coefficients in the reality compression of vision and control of medical robotic surgeons.
Improved depth estimation with the light field camera
NASA Astrophysics Data System (ADS)
Wang, Huachun; Sang, Xinzhu; Chen, Duo; Guo, Nan; Wang, Peng; Yu, Xunbo; Yan, Binbin; Wang, Kuiru; Yu, Chongxiu
2017-10-01
Light-field cameras are used in consumer and industrial applications. An array of micro-lenses captures enough information that one can refocus images after acquisition, as well as shift one's viewpoint within the sub-apertures of the main lens, effectively obtaining multiple views. Thus, depth estimation from both defocus and correspondence is now available in a single capture. Lytro, Inc. also provides depth estimation from a single-shot capture with its light field cameras, such as the Lytro Illum. This Lytro depth estimation contains much correct depth information and can be used for higher quality estimation. In this paper, we present a novel, simple and principled algorithm that computes dense depth estimates by combining defocus, correspondence and Lytro depth estimations. We analyze 2D epipolar images (EPIs) to get defocus and correspondence depth maps. Defocus depth is obtained by computing the spatial gradient after angular integration, and correspondence depth by computing the angular variance from EPIs. Lytro depth can be extracted from the Lytro Illum with software. We then show how to combine the three cues into a high quality depth map. Our method for depth estimation is suitable for computer vision applications such as matting, full control of depth-of-field, and surface reconstruction, as well as light field display.
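The two EPI-based cues named in the abstract can be sketched directly: shear the EPI for each candidate slope, then take the spatial gradient of the angular mean (defocus cue) and the angular variance (correspondence cue). The code below is an illustrative numpy version; fusing these cues with the Lytro map and regularizing the result are not shown.

```python
import numpy as np

def epi_depth_cues(epi, slopes):
    """Defocus and correspondence responses for one EPI (angle x space).

    For each candidate slope the EPI rows are sheared so that a scene
    point at that depth lines up vertically; the angular mean's spatial
    gradient is the defocus cue and the angular variance the
    correspondence cue."""
    n_u, n_x = epi.shape
    u = np.arange(n_u) - (n_u - 1) / 2.0
    x = np.arange(n_x)
    defocus = np.empty((len(slopes), n_x))
    corresp = np.empty((len(slopes), n_x))
    for k, s in enumerate(slopes):
        # shear: resample each angular row with a slope-dependent shift
        sheared = np.stack([np.interp(x + s * ui, x, epi[i]) for i, ui in enumerate(u)])
        mean_u = sheared.mean(axis=0)
        defocus[k] = np.abs(np.gradient(mean_u))   # high where the refocused slice is sharp
        corresp[k] = sheared.var(axis=0)           # low where all angular views agree
    best = np.argmin(corresp, axis=0)              # per-pixel slope index from correspondence
    return defocus, corresp, best
```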
CZT sensors for Computed Tomography: from crystal growth to image quality
NASA Astrophysics Data System (ADS)
Iniewski, K.
2016-12-01
Recent advances in Traveling Heater Method (THM) growth and device fabrication that require additional processing steps have enabled dramatic improvements in hole transport properties and reduced polarization effects in Cadmium Zinc Telluride (CZT) material. As a result, high flux operation of CZT sensors at rates in excess of 200 Mcps/mm2 is now possible and has enabled multiple medical imaging companies to start building prototype Computed Tomography (CT) scanners. CZT sensors are also finding new commercial applications in non-destructive testing (NDT) and baggage scanning. In order to prepare for high volume commercial production we are moving from individual tile processing to whole wafer processing using silicon methodologies, such as waxless processing and cassette-based/touchless wafer handling. We have been developing parametric level screening at the wafer stage to ensure high wafer quality before detector fabrication in order to maximize production yields. These process improvements enable us, and other CZT manufacturers who pursue similar developments, to provide high volume production for photon counting applications in an economically feasible manner. CZT sensors are capable of delivering both high count rates and high-resolution spectroscopic performance, although it is challenging to achieve both of these attributes simultaneously. The paper discusses material challenges, detector design trade-offs and ASIC architectures required to build cost-effective CZT based detection systems. Photon counting ASICs are an essential part of the integrated module platforms, as the charge-sensitive electronics needs to deal with charge-sharing and pile-up effects.
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
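To give a feel for what a formalized indicator looks like once translated to SPARQL, here is a hedged rdflib example. The class and property IRIs (ex:DiabetesPatient, ex:HbA1c, ex:effectiveDate) and the file name are placeholders invented for illustration, not the actual ArchMS ontology terms or CLIF output.

```python
from rdflib import Graph

# Hypothetical indicator: "number of HbA1c measurements for diabetes
# patients within the reporting period".  All IRIs below are placeholders.
QUERY = """
PREFIX ex: <http://example.org/clinical#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
SELECT (COUNT(DISTINCT ?obs) AS ?numerator) WHERE {
  ?patient a ex:DiabetesPatient .
  ?obs ex:subject ?patient ;
       ex:code ex:HbA1c ;
       ex:effectiveDate ?date .
  FILTER (?date >= "2016-01-01"^^xsd:date)
}
"""

g = Graph()
g.parse("clinical_data.ttl", format="turtle")   # assumed RDF export of patient data
for row in g.query(QUERY):
    print("HbA1c measurements in period:", row.numerator)
```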
Occupational stress in human computer interaction.
Smith, M J; Conway, F T; Karsh, B T
1999-04-01
There have been a variety of research approaches that have examined the stress issues related to human computer interaction including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.
Discovery & Interaction in Astro 101 Laboratory Experiments
NASA Astrophysics Data System (ADS)
Maloney, Frank Patrick; Maurone, Philip; DeWarf, Laurence E.
2016-01-01
The availability of low-cost, high-performance computing hardware and software has transformed the manner by which astronomical concepts can be re-discovered and explored in a laboratory that accompanies an astronomy course for arts students. We report on a strategy, begun in 1992, for allowing each student to understand fundamental scientific principles by interactively confronting astronomical and physical phenomena, through direct observation and by computer simulation. These experiments have evolved as: a) the quality and speed of the hardware has greatly increased; b) the corresponding hardware costs have decreased; c) the students have become computer and Internet literate; d) the importance of computationally and scientifically literate arts graduates in the workplace has increased. We present the current suite of laboratory experiments, and describe the nature, procedures, and goals in this two-semester laboratory for liberal arts majors at the Astro 101 university level.
SubspaceEM: A Fast Maximum-a-posteriori Algorithm for Cryo-EM Single Particle Reconstruction
Dvornek, Nicha C.; Sigworth, Fred J.; Tagare, Hemant D.
2015-01-01
Single particle reconstruction methods based on the maximum-likelihood principle and the expectation-maximization (E–M) algorithm are popular because of their ability to produce high resolution structures. However, these algorithms are computationally very expensive, requiring a network of computational servers. To overcome this computational bottleneck, we propose a new mathematical framework for accelerating maximum-likelihood reconstructions. The speedup is by orders of magnitude and the proposed algorithm produces similar quality reconstructions compared to the standard maximum-likelihood formulation. Our approach uses subspace approximations of the cryo-electron microscopy (cryo-EM) data and projection images, greatly reducing the number of image transformations and comparisons that are computed. Experiments using simulated and actual cryo-EM data show that speedup in overall execution time compared to traditional maximum-likelihood reconstruction reaches factors of over 300. PMID:25839831
Computational fluid dynamics modeling and analysis for the Giant Magellan Telescope (GMT)
NASA Astrophysics Data System (ADS)
Ladd, John; Slotnick, Jeffrey; Norby, William; Bigelow, Bruce; Burgett, William
2016-08-01
The Giant Magellan Telescope (GMT) is planned for construction at the summit of Cerro Las Campanas at the Las Campanas Observatory (LCO) in Chile. GMT will be the most powerful ground-based telescope in operation in the world. Aero-thermal interactions between the site topography, enclosure, internal systems, and optics are complex. A key parameter for optical quality is the thermal gradient between the terrain and the air entering the enclosure, and how quickly that gradient can be dissipated to equilibrium. To ensure the highest quality optical performance, careful design of the telescope enclosure building, location of the enclosure on the summit, and proper venting of the airflow within the enclosure are essential to minimize the impact of velocity and temperature gradients in the air entering the enclosure. High-fidelity Reynolds-Averaged Navier Stokes (RANS) Computational Fluid Dynamics (CFD) analysis of the GMT, enclosure, and LCO terrain is performed to study (a) the impact of either an open or closed enclosure base soffit external shape design, (b) the effect of telescope/enclosure location on the mountain summit, and (c) the effect of enclosure venting patterns. Details on the geometry modeling, grid discretization, and flow solution are first described. Then selected computational results are shown to quantify the quality of the airflow entering the GMT enclosure based on soffit, site location, and venting considerations. Based on the results, conclusions are provided on GMT soffit design, site location, and enclosure venting. The current work does not estimate image quality; this will be addressed in future analyses, as described in the conclusions.
Golestaneh, S Alireza; Karam, Lina
2016-08-24
Perceptual image quality assessment (IQA) attempts to use computational models to estimate the image quality in accordance with subjective evaluations. Reduced-reference (RR) IQA methods make use of partial information or features extracted from the reference image for estimating the quality of distorted images. Finding a balance between the number of RR features and accuracy of the estimated image quality is essential and important in IQA. In this paper we propose a training-free low-cost RRIQA method that requires a very small number of RR features (6 RR features). The proposed RRIQA algorithm is based on the discrete wavelet transform (DWT) of locally weighted gradient magnitudes. We apply the human visual system's contrast sensitivity and neighborhood gradient information to weight the gradient magnitudes in a locally adaptive manner. The RR features are computed by measuring the entropy of each DWT subband, for each scale, and pooling the subband entropies along all orientations, resulting in L RR features (one average entropy per scale) for an L-level DWT. Extensive experiments performed on seven large-scale benchmark databases demonstrate that the proposed RRIQA method delivers highly competitive performance as compared to the state-of-the-art RRIQA models as well as full reference ones for both natural and texture images. The MATLAB source code of REDLOG and the evaluation results are publicly available online at http://lab.engineering.asu.edu/ivulab/software/redlog/.
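A stripped-down version of the per-scale RR features can be computed with pywt as below; the locally adaptive contrast-sensitivity and neighborhood weighting of the gradient magnitudes, which the method relies on, is deliberately omitted here.

```python
import numpy as np
import pywt

def rr_features(img, levels=3, bins=256):
    """Reduced-reference features in the spirit of the described method:
    entropy of each DWT subband of the gradient magnitude, pooled over
    orientations to one feature per scale (unweighted simplification)."""
    gy, gx = np.gradient(img.astype(float))
    gmag = np.hypot(gx, gy)
    coeffs = pywt.wavedec2(gmag, 'db2', level=levels)
    feats = []
    for detail in coeffs[1:]:                   # (cH, cV, cD) triplet per scale
        ents = []
        for band in detail:
            hist, _ = np.histogram(band, bins=bins, density=True)
            p = hist[hist > 0]
            p = p / p.sum()
            ents.append(-np.sum(p * np.log2(p)))
        feats.append(float(np.mean(ents)))      # pool orientations: one feature per scale
    return feats                                # L features for an L-level DWT
```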
de Oliveira, Marcus Vinicius Linhares; Santos, António Carvalho; Paulo, Graciano; Campos, Paulo Sergio Flores; Santos, Joana
2017-06-01
The purpose of this study was to apply a newly developed free software program, at low cost and with minimal time, to evaluate the quality of dental and maxillofacial cone-beam computed tomography (CBCT) images. A polymethyl methacrylate (PMMA) phantom, CQP-IFBA, was scanned in 3 CBCT units with 7 protocols. A macro program was developed, using the free software ImageJ, to automatically evaluate the image quality parameters. The image quality evaluation was based on 8 parameters: uniformity, the signal-to-noise ratio (SNR), noise, the contrast-to-noise ratio (CNR), spatial resolution, the artifact index, geometric accuracy, and low-contrast resolution. The image uniformity and noise depended on the protocol that was applied. Regarding the CNR, high-density structures were more sensitive to the effect of scanning parameters. There were no significant differences between SNR and CNR in centered and peripheral objects. The geometric accuracy assessment showed that all the distance measurements were lower than the real values. Low-contrast resolution was influenced by the scanning parameters, and the 1-mm rod present in the phantom was not depicted in any of the 3 CBCT units. Smaller voxel sizes presented higher spatial resolution. There were no significant differences among the protocols regarding artifact presence. This software package provided a fast, low-cost, and feasible method for the evaluation of image quality parameters in CBCT.
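Several of the listed parameters reduce to ROI statistics. The sketch below shows how noise, uniformity, SNR and CNR might be computed from user-supplied ROI positions; the ROI layout is an assumption for illustration and does not reproduce the CQP-IFBA macro.

```python
import numpy as np

def roi_stats(img, center, size):
    """Mean and standard deviation inside a square ROI (pixel units)."""
    y, x = center
    h = size // 2
    roi = img[y - h:y + h, x - h:x + h]
    return float(roi.mean()), float(roi.std())

def basic_qa_metrics(img, center_roi, edge_rois, insert_roi, background_roi, roi=20):
    """Noise, uniformity, SNR and CNR from ROI statistics.  ROI positions
    depend on the phantom layout and must be supplied by the user."""
    mean_c, sd_c = roi_stats(img, center_roi, roi)
    edge_means = [roi_stats(img, c, roi)[0] for c in edge_rois]
    mean_i, _ = roi_stats(img, insert_roi, roi)
    mean_b, sd_b = roi_stats(img, background_roi, roi)
    return {
        "noise": sd_c,
        "uniformity": max(abs(m - mean_c) for m in edge_means),
        "snr": mean_c / sd_c if sd_c else float("inf"),
        "cnr": abs(mean_i - mean_b) / sd_b if sd_b else float("inf"),
    }
```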
May, Matthias S; Wüst, Wolfgang; Brand, Michael; Stahl, Christian; Allmendinger, Thomas; Schmidt, Bernhard; Uder, Michael; Lell, Michael M
2011-07-01
We sought to evaluate the image quality of iterative reconstruction in image space (IRIS) in half-dose (HD) datasets compared with full-dose (FD) and HD filtered back projection (FBP) reconstruction in abdominal computed tomography (CT). To acquire data with FD and HD simultaneously, contrast-enhanced abdominal CT was performed with a dual-source CT system, both tubes operating at 120 kV, 100 ref.mAs, and pitch 0.8. Three different image datasets were reconstructed from the raw data: standard FD images applying FBP, which served as the reference; HD images applying FBP; and HD images applying IRIS. For the HD datasets, only data from one tube-detector system were used. Quantitative image quality analysis was performed by measuring image noise in tissue and air. Qualitative image quality was evaluated according to the European Guidelines on Quality criteria for CT. Additional assessment of artifacts, lesion conspicuity, and edge sharpness was performed. Image noise in soft tissue was substantially decreased in HD-IRIS (-3.4 HU, -22%) and increased in HD-FBP (+6.2 HU, +39%) images when compared with the reference (mean noise, 15.9 HU). No significant differences between the FD-FBP and HD-IRIS images were found for the visually sharp anatomic reproduction, overall diagnostic acceptability (P = 0.923), lesion conspicuity (P = 0.592), and edge sharpness (P = 0.589), while HD-FBP was rated inferior. Streak artifacts and beam hardening were significantly more prominent in HD-FBP, while HD-IRIS images exhibited a slightly different noise pattern. Direct intrapatient comparison of standard FD body protocols and HD-IRIS reconstruction suggests that the latest iterative reconstruction algorithms allow for approximately 50% dose reduction without deterioration of the high image quality necessary for confident diagnosis.
Morphometric analysis - Cone beam computed tomography to predict bone quality and quantity.
Hohlweg-Majert, B; Metzger, M C; Kummer, T; Schulze, D
2011-07-01
Modified quantitative computed tomography is a method used to predict bone quality and quantify the bone mass of the jaw. The aim of this study was to determine whether bone quantity or quality was detected by cone beam computed tomography (CBCT) combined with image analysis. MATERIALS AND PROCEDURES: Different measurements recorded on two phantoms (Siemens phantom, Comac phantom) were evaluated on images taken with the Somatom VolumeZoom (Siemens Medical Solutions, Erlangen, Germany) and the NewTom 9000 (NIM s.r.l., Verona, Italy) in order to calculate a calibration curve. The spatial relationships of six sample cylinders and the repositioning from four pig skull halves relative to adjacent defined anatomical structures were assessed by means of three-dimensional visualization software. The calibration curves for computed tomography (CT) and cone beam computed tomography (CBCT) using the Siemens phantom showed linear correlation in both modalities between the Hounsfield Units (HU) and bone morphology. A correction factor for CBCT was calculated. Exact information about the micromorphology of the bone cylinders was only available using micro-computed tomography. Cone-beam computed tomography is a suitable choice for analysing bone mass, but it does not give any information about bone quality. 2010 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
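The calibration-curve step amounts to a linear fit between known phantom densities and measured grey values; a minimal numpy example follows, with entirely hypothetical readings.

```python
import numpy as np

# Hypothetical phantom readings: measured grey values of calibration
# inserts with known densities (values are illustrative only).
known_density = np.array([0.0, 100.0, 200.0, 400.0, 800.0])     # mg/cm^3 equivalents
measured_value = np.array([-8.0, 95.0, 210.0, 430.0, 815.0])    # CBCT grey values

# Linear calibration curve (slope, intercept) relating grey value to density;
# comparing such fits between CT and CBCT yields a correction factor.
slope, intercept = np.polyfit(measured_value, known_density, 1)
print(f"density = {slope:.3f} * grey_value + {intercept:.1f}")
```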
Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs
NASA Astrophysics Data System (ADS)
Ringenburg, Michael F.
Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
A Markovian model market—Akerlof's lemons and the asymmetry of information
NASA Astrophysics Data System (ADS)
Tilles, Paulo F. C.; Ferreira, Fernando F.; Francisco, Gerson; Pereira, Carlos de B.; Sarti, Flavia M.
2011-07-01
In this work we study an agent based model to investigate the role of asymmetric information degrees for market evolution. This model is quite simple and may be treated analytically since the consumers evaluate the quality of a certain good taking into account only the quality of the last good purchased plus her perceptive capacity β. As a consequence, the system evolves according to a stationary Markov chain. The value of a good offered by the firms increases along with quality according to an exponent α, which is a measure of the technology. It incorporates all the technological capacity of the production systems such as education, scientific development and techniques that change the productivity rates. The technological level plays an important role to explain how the asymmetry of information may affect the market evolution in this model. We observe that, for high technological levels, the market can detect adverse selection. The model allows us to compute the maximum asymmetric information degree before the market collapses. Below this critical point the market evolves during a limited period of time and then dies out completely. When β is closer to 1 (symmetric information), the market becomes more profitable for high quality goods, although high and low quality markets coexist. The maximum asymmetric information level is a consequence of an ergodicity breakdown in the process of quality evaluation.
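As a loose illustration only, the following toy simulation sketches the kind of dynamics described above; the update rule, the trading condition, and all parameter values are assumptions for demonstration, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_market(alpha=1.5, beta=0.8, n_steps=5000):
    """Toy sketch of the lemons-market dynamics described above (not the authors'
    exact equations): the consumer's perceived quality mixes the true quality of
    the offered good with the quality of the last good purchased, weighted by the
    perceptive capacity beta; the asking price grows with quality as quality**alpha;
    a purchase happens only when perceived quality covers the price."""
    q_last, trades = 0.5, 0
    for _ in range(n_steps):
        q_true = rng.uniform(0.0, 1.0)
        q_perceived = beta * q_true + (1.0 - beta) * q_last
        price = q_true ** alpha
        if q_perceived >= price:
            q_last = q_true
            trades += 1
    return trades / n_steps

for beta in (0.3, 0.6, 0.9):
    print(f"beta = {beta:.1f}: fraction of offers traded = {simulate_market(beta=beta):.2f}")
```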
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Qishi; Zhu, Mengxia; Rao, Nageswara S
We propose an intelligent decision support system based on sensor and computer networks that incorporates various component techniques for sensor deployment, data routing, distributed computing, and information fusion. The integrated system is deployed in a distributed environment composed of both wireless sensor networks for data collection and wired computer networks for data processing in support of homeland security defense. We present the system framework, formulate the analytical problems, and develop approximate or exact solutions for the subtasks: (i) a sensor deployment strategy based on a two-dimensional genetic algorithm to achieve maximum coverage with cost constraints; (ii) a data routing scheme to achieve maximum signal strength with minimum path loss, high energy efficiency, and effective fault tolerance; (iii) a network mapping method to assign computing modules to network nodes for high-performance distributed data processing; and (iv) a binary decision fusion rule that derives threshold bounds to improve system hit rate and false alarm rate. These component solutions are implemented and evaluated through either experiments or simulations in various application scenarios. The extensive results demonstrate that these component solutions imbue the integrated system with the desirable and useful quality of intelligence in decision making.
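As a hedged illustration of item (iv), the sketch below evaluates a simple k-out-of-n fusion rule by Monte Carlo; the per-sensor probabilities and the rule itself are illustrative assumptions rather than the system's actual fusion design:

```python
import numpy as np

rng = np.random.default_rng(1)

def fuse(local_decisions, k):
    """Declare a detection when at least k of the n local sensors fire (k-out-of-n rule)."""
    return local_decisions.sum(axis=1) >= k

# Hypothetical per-sensor hit and false-alarm probabilities, evaluated by Monte Carlo.
n_sensors, n_trials = 8, 100_000
p_hit, p_false_alarm = 0.7, 0.05
present = rng.random((n_trials, n_sensors)) < p_hit          # target present
absent = rng.random((n_trials, n_sensors)) < p_false_alarm   # target absent

for k in range(1, n_sensors + 1):
    print(f"k={k}: hit rate {fuse(present, k).mean():.3f}, "
          f"false alarm rate {fuse(absent, k).mean():.5f}")
```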
ERIC Educational Resources Information Center
Austin, Peter C.
2012-01-01
Researchers are increasingly using observational or nonrandomized data to estimate causal treatment effects. Essential to the production of high-quality evidence is the ability to reduce or minimize the confounding that frequently occurs in observational studies. When using the potential outcome framework to define causal treatment effects, one…
1986-06-30
features of computer aided design systems and statistical quality control procedures that are generic to chip sets and processes. RADIATION HARDNESS - The...
Markets and Models for Large-Scale Courseware Development.
ERIC Educational Resources Information Center
Bunderson, C. Victor
Computer-assisted instruction (CAI) is not making an important, visible impact on the educational system of this country. Though its instructional value has been proven time after time, the high cost of the hardware and the lack of quality courseware are preventing CAI from becoming a market success. In order for CAI to reach its market potential…
Making Materials Based on TeX and CAS/DGS--Reports on CADGME 2012 Conference Working Group
ERIC Educational Resources Information Center
Kaneko, Masataka; Yamashita, Satoshi; Kitahara, Kiyoshi; Maeda, Yoshifumi; Usui, Hisashi; Takato, Setsuo
2014-01-01
TeX has become one of the most popular tools for editing teaching materials or textbooks in collegiate mathematics education, since it enables mathematics teachers to easily produce high-quality mathematical documents. Its capabilities for visualization and computation are fairly limited, so that many teachers simultaneously use various…
Banić, Nikola; Lončarić, Sven
2015-11-01
Removing the influence of illumination on image colors and adjusting the brightness across the scene are important image enhancement problems. This is achieved by applying adequate color constancy and brightness adjustment methods. One of the earliest models to deal with both of these problems was the Retinex theory. Some of the Retinex implementations tend to give high-quality results by performing local operations, but they are computationally relatively slow. One of the recent Retinex implementations is light random sprays Retinex (LRSR). In this paper, a new method is proposed for brightness adjustment and color correction that overcomes the main disadvantages of LRSR. There are three main contributions of this paper. First, a concept of memory sprays is proposed to reduce the number of LRSR's per-pixel operations to a constant regardless of the parameter values, thereby enabling a fast Retinex-based local image enhancement. Second, an effective remapping of image intensities is proposed that results in significantly higher quality. Third, the problem of LRSR's halo effect is significantly reduced by using an alternative illumination processing method. The proposed method enables a fast Retinex-based image enhancement by processing Retinex paths in a constant number of steps regardless of the path size. Due to the halo effect removal and remapping of the resulting intensities, the method outperforms many of the well-known image enhancement methods in terms of resulting image quality. The results are presented and discussed. It is shown that the proposed method outperforms most of the tested methods in terms of image brightness adjustment, color correction, and computational speed.
NASA Astrophysics Data System (ADS)
Henderson, J. M.; Eluszkiewicz, J.; Mountain, M. E.; Nehrkorn, T.; Chang, R. Y.-W.; Karion, A.; Miller, J. B.; Sweeney, C.; Steiner, N.; Wofsy, S. C.; Miller, C. E.
2014-10-01
This paper describes the atmospheric modeling that underlies the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) science analysis, including its meteorological and atmospheric transport components (Polar variant of the Weather Research and Forecasting (WRF) and Stochastic Time Inverted Lagrangian Transport (STILT) models), and provides WRF validation for May-October 2012 and March-November 2013 - the first two years of the aircraft field campaign. A triply nested computational domain for WRF was chosen so that the innermost domain with 3.3 km grid spacing encompasses the entire mainland of Alaska and enables the substantial orography of the state to be represented by the underlying high-resolution topographic input field. Summary statistics of the WRF model performance on the 3.3 km grid indicate good overall agreement with quality-controlled surface and radiosonde observations. Two-meter temperatures are generally too cold by approximately 1.4 K in 2012 and 1.1 K in 2013, while 2 m dewpoint temperatures are too low (dry) by 0.2 K in 2012 and too high (moist) by 0.6 K in 2013. Wind speeds are biased too low by 0.2 m s-1 in 2012 and 0.3 m s-1 in 2013. Model representation of upper level variables is very good. These measures are comparable to model performance metrics of similar model configurations found in the literature. The high quality of these fine-resolution WRF meteorological fields inspires confidence in their use to drive STILT for the purpose of computing surface influences ("footprints") at commensurably increased resolution. Indeed, footprints generated on a 0.1° grid show increased spatial detail compared with those on the more common 0.5° grid, lending itself better for convolution with flux models for carbon dioxide and methane across the heterogeneous Alaskan landscape. Ozone deposition rates computed using STILT footprints indicate good agreement with observations and exhibit realistic seasonal variability, further indicating that WRF-STILT footprints are of high quality and will support accurate estimates of CO2 and CH4 surface-atmosphere fluxes using CARVE observations.
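The bias statistics quoted above can be illustrated with a minimal sketch that computes mean bias and RMSE from paired model and observation values; the numbers below are placeholders, not CARVE data:

```python
import numpy as np

def validation_stats(model, obs):
    """Mean bias and RMSE for paired model/observation values (e.g. 2 m temperature)."""
    diff = np.asarray(model, float) - np.asarray(obs, float)
    return {"bias": float(diff.mean()), "rmse": float(np.sqrt((diff ** 2).mean()))}

# Placeholder numbers standing in for matched WRF output and station observations (K).
t2m_model = np.array([271.2, 268.9, 274.1, 270.0])
t2m_obs = np.array([272.5, 270.3, 275.6, 271.4])
print(validation_stats(t2m_model, t2m_obs))    # negative bias means the model is too cold
```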
NASA Astrophysics Data System (ADS)
Henderson, J. M.; Eluszkiewicz, J.; Mountain, M. E.; Nehrkorn, T.; Chang, R. Y.-W.; Karion, A.; Miller, J. B.; Sweeney, C.; Steiner, N.; Wofsy, S. C.; Miller, C. E.
2015-04-01
This paper describes the atmospheric modeling that underlies the Carbon in Arctic Reservoirs Vulnerability Experiment (CARVE) science analysis, including its meteorological and atmospheric transport components (polar variant of the Weather Research and Forecasting (WRF) and Stochastic Time Inverted Lagrangian Transport (STILT) models), and provides WRF validation for May-October 2012 and March-November 2013 - the first 2 years of the aircraft field campaign. A triply nested computational domain for WRF was chosen so that the innermost domain with 3.3 km grid spacing encompasses the entire mainland of Alaska and enables the substantial orography of the state to be represented by the underlying high-resolution topographic input field. Summary statistics of the WRF model performance on the 3.3 km grid indicate good overall agreement with quality-controlled surface and radiosonde observations. Two-meter temperatures are generally too cold by approximately 1.4 K in 2012 and 1.1 K in 2013, while 2 m dewpoint temperatures are too low (dry) by 0.2 K in 2012 and too high (moist) by 0.6 K in 2013. Wind speeds are biased too low by 0.2 m s-1 in 2012 and 0.3 m s-1 in 2013. Model representation of upper level variables is very good. These measures are comparable to model performance metrics of similar model configurations found in the literature. The high quality of these fine-resolution WRF meteorological fields inspires confidence in their use to drive STILT for the purpose of computing surface influences ("footprints") at commensurably increased resolution. Indeed, footprints generated on a 0.1° grid show increased spatial detail compared with those on the more common 0.5° grid, better allowing for convolution with flux models for carbon dioxide and methane across the heterogeneous Alaskan landscape. Ozone deposition rates computed using STILT footprints indicate good agreement with observations and exhibit realistic seasonal variability, further indicating that WRF-STILT footprints are of high quality and will support accurate estimates of CO2 and CH4 surface-atmosphere fluxes using CARVE observations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise
2006-09-01
The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies the responsibilities of ASC management and software project teams in implementing the software quality practices and in assessing progress towards achieving their software quality goals.
Extreme-Scale De Novo Genome Assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georganas, Evangelos; Hofmeyr, Steven; Egan, Rob
De novo whole genome assembly reconstructs genomic sequence from short, overlapping, and potentially erroneous DNA segments and is one of the most important computations in modern genomics. This work presents HipMer, a high-quality end-to-end de novo assembler designed for extreme scale analysis, via efficient parallelization of the Meraculous code. Genome assembly software has many components, each of which stresses different components of a computer system. This chapter explains the computational challenges involved in each step of the HipMer pipeline, the key distributed data structures, and communication costs in detail. We present performance results of assembling the human genome and the large hexaploid wheat genome on large supercomputers up to tens of thousands of cores.
Efficient tiled calculation of over-10-gigapixel holograms using ray-wavefront conversion.
Igarashi, Shunsuke; Nakamura, Tomoya; Matsushima, Kyoji; Yamaguchi, Masahiro
2018-04-16
In the calculation of large-scale computer-generated holograms, an approach called "tiling," which divides the hologram plane into small rectangles, is often employed due to limitations on computational memory. However, the total amount of computational complexity severely increases with the number of divisions. In this paper, we propose an efficient method for calculating tiled large-scale holograms using ray-wavefront conversion. In experiments, the effectiveness of the proposed method was verified by comparing its calculation cost with that using the previous method. Additionally, a hologram of 128K × 128K pixels was calculated and fabricated by a laser-lithography system, and a high-quality 105 mm × 105 mm 3D image including complicated reflection and translucency was optically reconstructed.
Garrido-Morgado, Álvaro; González-Benito, Óscar; Martos-Partal, Mercedes
2016-01-01
Creating and maintaining customer loyalty are strategic requirements for modern business. In the current competitive context, product quality and brand experience are crucial in building and maintaining customer loyalty. Consumer loyalty, which may be classified into cognitive loyalty and affective loyalty, is related to customers' quality perception. Cue utilization theory distinguishes two dimensions for perceived quality: extrinsic quality (linked to the brand) and intrinsic quality (related to internal product characteristics). We propose that (i) cognitive loyalty is more influenced by intrinsic product quality whereas extrinsic product quality (brand name) is more salient for affective loyalty, and (ii) different commercial stimuli have a differential effectiveness on intrinsic and extrinsic perceived quality. In fact, in this study, we analyze how perceived quality dimensions may influence the effectiveness of two different commercial stimuli: displays and advertising flyers. While displays work within the point of sale under time-constrained conditions where consumers are more likely to use heuristics to simplify their decisions, advertising flyers work outside of the point of sale under low time-constrained conditions, and therefore favor a more reasoned purchase decision where systematic processing will be more likely. We analyze the role of quality perception in determining the effectiveness of both these commercial stimuli for selling products that induce high purchase involvement and perceived risk. The empirical analysis focuses on computer products sold by one of Europe's largest computer retailers and it combines scanner, observational, and survey data. The results show that both dimensions of quality perceptions moderate the influence of displays and advertising flyers on sales, but their impact is different for each commercial stimulus. Extrinsic quality perception increases to a greater extent the effect of displays due to the use of a brand name heuristic. However, intrinsic quality perception improves to a greater extent the effect of advertising flyers, which in turn are more closely related to systematic decision processing.
A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.
Kim, Minchan; Seo, Jiwon; Lee, Jiyun
2014-08-14
Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies by providing the global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor quality GNSS observation data is also increasing and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric product. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters when used in combination can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.
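The abstract does not list the individual quality parameters; as a hedged sketch, two simple stand-ins are shown below (data completeness and the RMS of a detrended code-minus-carrier series as a noise/multipath proxy), with synthetic data:

```python
import numpy as np

def completeness(observed_epochs, expected_epochs):
    """Fraction of expected observation epochs for which usable data exist."""
    return observed_epochs / expected_epochs

def code_minus_carrier_rms(code_m, carrier_m, window=30):
    """RMS of a detrended code-minus-carrier series (m): a rough per-station proxy
    for pseudorange noise and multipath, standing in for the paper's parameters."""
    cmc = np.asarray(code_m, float) - np.asarray(carrier_m, float)
    trend = np.convolve(cmc, np.ones(window) / window, mode="same")
    return float(np.sqrt(np.mean((cmc - trend) ** 2)))

# Synthetic example: millimetre-level carrier phase versus metre-level code noise.
rng = np.random.default_rng(2)
n = 600
carrier = np.cumsum(rng.normal(0.0, 0.002, n))      # smooth carrier-phase range (m)
code = carrier + rng.normal(0.0, 0.5, n)            # noisy pseudorange (m)
print(f"completeness: {completeness(580, 600):.1%}")
print(f"code-minus-carrier RMS: {code_minus_carrier_rms(code, carrier):.2f} m")
```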
Use of parallel computing in mass processing of laser data
NASA Astrophysics Data System (ADS)
Będkowski, J.; Bratuś, R.; Prochaska, M.; Rzonca, A.
2015-12-01
The first part of the paper includes a description of the rules used to generate the algorithm needed for the purpose of parallel computing and also discusses the origins of the idea of research on the use of graphics processors in large scale processing of laser scanning data. The next part of the paper includes the results of an efficiency assessment performed for an array of different processing options, all of which were substantially accelerated with parallel computing. The processing options were divided into the generation of orthophotos using point clouds, coloring of point clouds, transformations, and the generation of a regular grid, as well as advanced processes such as the detection of planes and edges, point cloud classification, and the analysis of data for the purpose of quality control. Most algorithms had to be formulated from scratch in the context of the requirements of parallel computing. A few of the algorithms were based on existing technology developed by the Dephos Software Company and then adapted to parallel computing in the course of this research study. Processing time was determined for each process employed for a typical quantity of data processed, which helped confirm the high efficiency of the solutions proposed and the applicability of parallel computing to the processing of laser scanning data. The high efficiency of parallel computing yields new opportunities in the creation and organization of processing methods for laser scanning data.
Inexpensive DAQ based physics labs
NASA Astrophysics Data System (ADS)
Lewis, Benjamin; Clark, Shane
2015-11-01
Quality Data Acquisition (DAQ) based physics labs can be designed using microcontrollers and very low cost sensors with minimal lab equipment. A prototype device with several sensors and documentation for a number of DAQ-based labs is showcased. The device connects to a computer through Bluetooth and uses a simple interface to control the DAQ and display real-time graphs, storing the data in .txt and .xls formats. A full device including a larger number of sensors, combined with a software interface and detailed documentation, would provide a high-quality physics lab education for minimal cost, for instance in high schools lacking lab equipment or for students taking online classes. An entire semester's lab course could be conducted using a single device with a manufacturing cost of under $20.
Design of Restoration Method Based on Compressed Sensing and TwIST Algorithm
NASA Astrophysics Data System (ADS)
Zhang, Fei; Piao, Yan
2018-04-01
In order to effectively improve the subjective and objective quality of degraded images at low sampling rates, while saving storage space and reducing computational complexity, this paper proposes a joint restoration algorithm combining compressed sensing and two-step iterative shrinkage/thresholding (TwIST). The algorithm applies the TwIST algorithm, which is used in image restoration, to compressed sensing theory. Then, a small amount of sparse high-frequency information is obtained in the frequency domain. The TwIST algorithm based on compressed sensing theory is used to accurately reconstruct the high-frequency image. The experimental results show that the proposed algorithm achieves better subjective visual effects and objective quality of degraded images while accurately restoring degraded images.
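The abstract does not spell out the iteration; as a rough sketch of the shrinkage/thresholding machinery that TwIST builds on, here is the basic single-step variant (ISTA) applied to a synthetic compressed-sensing problem, with all parameters chosen for illustration:

```python
import numpy as np

def soft_threshold(v, lam):
    """Soft-thresholding (shrinkage) operator used by IST/TwIST-type methods."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ista(A, y, lam=0.02, n_iter=500):
    """Basic iterative shrinkage/thresholding for min_x 0.5*||y - A x||^2 + lam*||x||_1.
    TwIST replaces this single-step update with a two-step combination of the
    previous two iterates, which converges faster on ill-conditioned problems."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

# Synthetic compressed-sensing test: recover a sparse vector from random projections.
rng = np.random.default_rng(3)
n, m, k = 256, 96, 8
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
y = A @ x_true
x_hat = ista(A, y)
print("relative reconstruction error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```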
Development and testing of a superconducting link for an IR detector
NASA Technical Reports Server (NTRS)
Caton, R.; Selim, R.
1991-01-01
The development and testing of a ceramic superconducting link for an infrared detector is summarized. Areas of study included the materials used, the electrical contacts, radiation and temperature cycling effects, aging, thermal conductivity, and computer models of an ideal link. Material samples were processed in a tube furnace at temperatures of 840 C to 865 C for periods up to 17 days, and transition temperatures and critical current densities were recorded. The project achieved better-quality material with a high superconducting transition temperature through improved processing, and also achieved high-quality electrical contacts. Studies on the effects of electron irradiation, temperature cycling, and aging on superconducting properties indicate that the materials will be suitable for space applications. Various presentations and publications on the study's results are reported.
Facial expression reconstruction on the basis of selected vertices of triangle mesh
NASA Astrophysics Data System (ADS)
Peszor, Damian; Wojciechowska, Marzena
2016-06-01
Facial expression reconstruction is an important issue in the field of computer graphics. While it is relatively easy to create an animation based on meshes constructed through video recordings, this kind of high-quality data is often not transferred to another model because of lack of intermediary, anthropometry-based way to do so. However, if a high-quality mesh is sampled with sufficient density, it is possible to use obtained feature points to encode the shape of surrounding vertices in a way that can be easily transferred to another mesh with corresponding feature points. In this paper we present a method used for obtaining information for the purpose of reconstructing changes in facial surface on the basis of selected feature points.
Extracting transient Rayleigh wave and its application in detecting quality of highway roadbed
Liu, J.; Xia, J.; Luo, Y.; Li, X.; Xu, S.; ,
2004-01-01
This paper first explains the tau-p mapping method of extracting Rayleigh waves (LR waves) from field shot gathers. It also explains a mathematical model of the physical parameters that characterize the quality of high-grade roads. The paper then discusses an algorithm for computing dispersion curves using adjacent channels. Shear velocity and physical character parameters are obtained by inversion of the dispersion curves. The algorithm that uses adjacent channels to calculate dispersion curves eliminates the averaging effects that arise when multiple channels are used to obtain dispersion curves, so it improves the longitudinal and transverse resolution of LR waves and the precision of non-invasive detection, and also broadens its application fields. From analysis of modeling results of detached computation of the ground roll and real examples of detecting the density and pressure strength of a high-grade roadbed, and from comparison of the shallow seismic image method with borehole cores, we concluded that: (1) the abnormal scale and configuration obtained by LR waves are mostly the same as the result of the shallow seismic image method; (2) the average relative error of density obtained from LR wave inversion is 1.6% compared with borehole coring; (3) transient LR waves are feasible and effective for detecting the density and pressure strength of a high-grade roadbed.
2002-01-01
1-hour and proposed 8-hour National Ambient Air Quality Standards. Reactive biogenic (natural) volatile organic compounds emitted from plants have...uncertainty in predicting plant species composition and frequency. Isoprene emissions computed for the study area from the project’s high-resolution...Landcover Database (BELD 2), while monoterpene and other reactive volatile organic compound emission rates were almost 26% and 28% lower, respectively
Granata, Massimo; Craig, Kieran; Cagnoli, Gianpietro; Carcy, Cécile; Cunningham, William; Degallaix, Jérôme; Flaminio, Raffaele; Forest, Danièle; Hart, Martin; Hennig, Jan-Simon; Hough, James; MacLaren, Ian; Martin, Iain William; Michel, Christophe; Morgado, Nazario; Otmani, Salim; Pinard, Laurent; Rowan, Sheila
2013-12-15
We report on low-frequency measurements of the mechanical loss of a high-quality (transmissivity T<5 ppm at λ(0)=1064 nm, absorption loss <0.5 ppm) multilayer dielectric coating of ion-beam-sputtered fused silica and titanium-doped tantala in the 10-300 K temperature range. A useful parameter for the computation of coating thermal noise on different substrates is derived as a function of temperature and frequency.
2014-01-01
Background Patient decision aids (PtDA) are developed to facilitate informed, value-based decisions about health. Research suggests that even when patients are informed with the necessary evidence and information, cognitive errors can prevent them from choosing the option that is most congruent with their own values. We sought to utilize principles of behavioural economics to develop a computer application that presents information from conventional decision aids in a way that reduces these errors, subsequently promoting higher quality decisions. Method The Dynamic Computer Interactive Decision Application (DCIDA) was developed to target four common errors that can impede quality decision making with PtDAs: unstable values, order effects, overweighting of rare events, and information overload. Healthy volunteers were recruited to an interview to use three PtDAs converted to the DCIDA on a computer equipped with an eye tracker. Participants first used a conventional PtDA, and then subsequently used the DCIDA version. User testing was assessed based on whether respondents found the software both usable: evaluated using a) eye-tracking, b) the system usability scale, and c) user verbal responses from a 'think aloud' protocol; and useful: evaluated using a) eye-tracking, b) whether preferences for options were changed, and c) the decisional conflict scale. Results Of the 20 participants recruited to the study, 11 were male (55%), the mean age was 35, 18 had at least a high school education (90%), and 8 (40%) had a college or university degree. Eye-tracking results, alongside a mean system usability scale score of 73 (range 68–85), indicated a reasonable degree of usability for the DCIDA. The think aloud study suggested areas for further improvement. The DCIDA also appeared to be useful to participants, in that subjects focused more on the features of the decision that were most important to them (21% increase in time spent focusing on the most important feature). Seven subjects (25%) changed their preferred option when using DCIDA. Conclusion Preliminary results suggest that DCIDA has potential to improve the quality of patient decision-making. Next steps include larger studies to test individual components of DCIDA and feasibility testing with patients making real decisions.
Windows Program For Driving The TDU-850 Printer
NASA Technical Reports Server (NTRS)
Parrish, Brett T.
1995-01-01
Program provides WYSIWYG compatibility between video display and printout. PDW is Microsoft Windows printer-driver computer program for use with Raytheon TDU-850 printer. Provides previously unavailable linkage between printer and IBM PC-compatible computers running Microsoft Windows. Enhances capabilities of Raytheon TDU-850 hardcopier by emulating all textual and graphical features normally supported by laser/ink-jet printers and makes printer compatible with any Microsoft Windows application. Also provides capabilities not found in laser/ink-jet printer drivers by providing certain Windows applications with ability to render high quality, true gray-scale photographic hardcopy on TDU-850. Written in C language.
A method for brain 3D surface reconstruction from MR images
NASA Astrophysics Data System (ADS)
Zhao, De-xin
2014-09-01
Because encephalic tissues are highly irregular, three-dimensional (3D) modeling of the brain always leads to complicated computation. In this paper, we explore an efficient method for brain surface reconstruction from magnetic resonance (MR) images of the head, which is helpful for surgery planning and tumor localization. A heuristic algorithm is proposed for surface triangle mesh generation with preserved features, in which the diagonal length is used as the heuristic information to optimize the shape of the triangles. The experimental results show that our approach not only reduces the computational complexity, but also completes the 3D visualization with good quality.
Spectral decontamination of a real-time helicopter simulation
NASA Technical Reports Server (NTRS)
Mcfarland, R. E.
1983-01-01
Nonlinear mathematical models of a rotor system, referred to as rotating blade-element models, produce steady-state, high-frequency harmonics of significant magnitude. In a discrete simulation model, certain of these harmonics may be incompatible with realistic real-time computational constraints because of their aliasing into the operational low-pass region. However, the energy in an aliased harmonic may be suppressed by increasing the computation rate of an isolated, causal nonlinearity and using an appropriate filter. This decontamination technique is applied to Sikorsky's real-time model of the Black Hawk helicopter, as supplied to NASA for handling-qualities investigations.
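A minimal numerical sketch of the general idea (evaluate the isolated nonlinearity at a higher rate, low-pass filter, then decimate back to the simulation rate) is shown below; the signal, rates, and nonlinearity are illustrative and unrelated to the rotor model:

```python
import numpy as np
from scipy.signal import decimate

fs = 100.0                     # base simulation rate (Hz), illustrative
over = 8                       # oversampling factor for the isolated nonlinearity
t_hi = np.arange(0, 2.0, 1.0 / (fs * over))
t_lo = t_hi[::over]

def nonlinearity(t):
    """Causal static nonlinearity that generates odd harmonics of a 12 Hz input."""
    s = np.sin(2 * np.pi * 12.0 * t)
    return np.sign(s) * s ** 2

# Decontaminated path: evaluate at the high rate, low-pass filter, decimate.
y_clean = decimate(nonlinearity(t_hi), over, ftype="fir")
# Contaminated path: evaluate directly at the base rate, so harmonics above
# the base Nyquist frequency (50 Hz) alias into the passband.
y_alias = nonlinearity(t_lo)

def spectrum(sig, fs_sig):
    f = np.fft.rfftfreq(sig.size, 1.0 / fs_sig)
    return f, np.abs(np.fft.rfft(sig)) / sig.size

f, mag_clean = spectrum(y_clean, fs)
_, mag_alias = spectrum(y_alias, fs)
k40 = int(np.argmin(np.abs(f - 40.0)))     # the 60 Hz harmonic aliases to 40 Hz
print(f"magnitude near 40 Hz: decontaminated {mag_clean[k40]:.4f}, direct {mag_alias[k40]:.4f}")
```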
Zhang, H H; Gao, S; Chen, W; Shi, L; D'Souza, W D; Meyer, R R
2013-03-21
An important element of radiation treatment planning for cancer therapy is the selection of beam angles (out of all possible coplanar and non-coplanar angles in relation to the patient) in order to maximize the delivery of radiation to the tumor site and minimize radiation damage to nearby organs-at-risk. This category of combinatorial optimization problem is particularly difficult because direct evaluation of the quality of treatment corresponding to any proposed selection of beams requires the solution of a large-scale dose optimization problem involving many thousands of variables that represent doses delivered to volume elements (voxels) in the patient. However, if the quality of angle sets can be accurately estimated without expensive computation, a large number of angle sets can be considered, increasing the likelihood of identifying a very high quality set. Using a computationally efficient surrogate beam set evaluation procedure based on single-beam data extracted from plans employing equally spaced beams (eplans), we have developed a global search metaheuristic process based on the nested partitions framework for this combinatorial optimization problem. The surrogate scoring mechanism allows us to assess thousands of beam set samples within a clinically acceptable time frame. Tests on difficult clinical cases demonstrate that the beam sets obtained via our method are of superior quality.
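As a hedged sketch of the surrogate idea, the toy code below scores candidate five-beam sets by summing precomputed single-beam scores and searches over random samples; the actual surrogate and the nested-partitions search are considerably more elaborate:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical surrogate data: a precomputed per-beam quality score for each of
# 72 candidate coplanar angles (e.g. extracted from equally spaced reference plans).
angles = np.arange(0, 360, 5)
beam_score = rng.random(angles.size)

def surrogate_score(beam_idx):
    """Cheap proxy for plan quality: sum of single-beam scores minus a penalty
    for clustering beams too closely together (an illustrative choice)."""
    chosen = np.sort(angles[beam_idx])
    clustering_penalty = 0.5 * np.sum(np.diff(chosen) < 20)
    return beam_score[beam_idx].sum() - clustering_penalty

# Random search stands in here for the nested-partitions metaheuristic.
best_idx, best_score = None, -np.inf
for _ in range(20_000):
    cand = rng.choice(angles.size, size=5, replace=False)
    score = surrogate_score(cand)
    if score > best_score:
        best_idx, best_score = cand, score
print("best angle set:", np.sort(angles[best_idx]), "surrogate score:", round(best_score, 3))
```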
Cost-effective handling of digital medical images in the telemedicine environment.
Choong, Miew Keen; Logeswaran, Rajasvaran; Bister, Michel
2007-09-01
This paper concentrates on strategies for less costly handling of medical images. Aspects of digitization using conventional digital cameras, lossy compression with good diagnostic quality, and visualization through less costly monitors are discussed. For digitization of film-based media, a subjective evaluation of the suitability of digital cameras as an alternative to the digitizer was undertaken. To save on storage, bandwidth, and transmission time, the acceptable degree of compression with diagnostically no loss of important data was studied through randomized double-blind tests of subjective image quality when compression noise was kept lower than the inherent noise. A diagnostic experiment was undertaken to evaluate normal low-cost computer monitors as viable viewing displays for clinicians. The results show that digital camera images of X-ray films were diagnostically similar to those from the expensive digitizer. Lossy compression, when used moderately with the imaging noise to compression noise ratio (ICR) greater than four, can bring about image improvement with better diagnostic quality than the original image. Statistical analysis shows that there is no diagnostic difference between expensive high-quality monitors and conventional computer monitors. The results presented show good potential in implementing the proposed strategies to promote widespread cost-effective telemedicine and digital medical environments.
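The ICR rule can be illustrated with a small sketch; the region-of-interest choice, the synthetic image, and the threshold handling are illustrative assumptions:

```python
import numpy as np

def icr(original, compressed, uniform_roi):
    """Imaging-noise to compression-noise ratio: the std of pixel values in a
    uniform region of the original divided by the std of the difference image."""
    imaging_noise = float(np.std(original[uniform_roi]))
    compression_noise = float(np.std(original.astype(float) - compressed.astype(float)))
    return imaging_noise / compression_noise

# Synthetic example: a noisy flat image and a lightly degraded "compressed" copy.
rng = np.random.default_rng(5)
img = 100.0 + rng.normal(0.0, 4.0, (128, 128))
img_c = img + rng.normal(0.0, 0.8, img.shape)      # stand-in for compression error
roi = (slice(10, 60), slice(10, 60))
ratio = icr(img, img_c, roi)
print(f"ICR = {ratio:.1f} -> {'acceptable' if ratio > 4 else 'too much compression'}")
```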
Liu, Yanchi; Wang, Xue; Liu, Youda; Cui, Sujin
2016-06-27
Power quality analysis issues, especially the measurement of harmonics and interharmonics in cyber-physical energy systems, are addressed in this paper. As new situations are introduced to the power system, the impact of electric vehicles, distributed generation, and renewable energy has placed extra demands on distributed sensors, waveform-level information, and power quality data analytics. Harmonics and interharmonics, as the most significant disturbances, require carefully designed detection methods for an accurate measurement of electric loads, whose information is crucial to subsequent analysis and control. This paper gives a detailed description of the power quality analysis framework in a networked environment and presents a fast and resolution-enhanced method for harmonic and interharmonic measurement. The proposed method first extracts harmonic and interharmonic components efficiently using the single-channel version of Robust Independent Component Analysis (RobustICA), then estimates the high-resolution frequency from three discrete Fourier transform (DFT) samples with little additional computation, and finally computes the amplitudes and phases with an adaptive linear neuron network. The experiments show that the proposed method is time-efficient and leads to better accuracy on the simulated and experimental signals in the presence of noise and fundamental frequency deviation, thus providing a deeper insight into the (inter)harmonic sources or even the whole system.
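The three-DFT-sample frequency refinement step can be illustrated with a common estimator, parabolic interpolation of the magnitude peak; the paper's exact estimator may differ:

```python
import numpy as np

def refine_peak_frequency(x, fs):
    """Refine a tone's frequency from the DFT peak bin and its two neighbours
    using parabolic (three-sample) interpolation of the magnitude spectrum."""
    X = np.abs(np.fft.rfft(x * np.hanning(x.size)))
    k = int(np.argmax(X[1:-1])) + 1
    delta = 0.5 * (X[k - 1] - X[k + 1]) / (X[k - 1] - 2.0 * X[k] + X[k + 1])
    return (k + delta) * fs / x.size         # fractional-bin frequency estimate

fs, f_true = 5000.0, 152.5                   # Hz; f_true lies between DFT bins
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * f_true * t)
coarse = np.argmax(np.abs(np.fft.rfft(x))) * fs / x.size
print(f"coarse DFT bin: {coarse:.2f} Hz, refined: {refine_peak_frequency(x, fs):.2f} Hz")
```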
Light Weight MP3 Watermarking Method for Mobile Terminals
NASA Astrophysics Data System (ADS)
Takagi, Koichi; Sakazawa, Shigeyuki; Takishima, Yasuhiro
This paper proposes a novel MP3 watermarking method which is applicable to a mobile terminal with limited computational resources. Considering that in most cases the embedded information is copyright information or metadata, which should be extracted before playing back audio contents, the watermark detection process should be executed at high speed. However, when conventional methods are used with a mobile terminal, it takes a considerable amount of time to detect a digital watermark. This paper focuses on scalefactor manipulation to enable high speed watermark embedding/detection for MP3 audio and also proposes the manipulation method which minimizes audio quality degradation adaptively. Evaluation tests showed that the proposed method is capable of embedding 3 bits/frame information without degrading audio quality and detecting it at very high speed. Finally, this paper describes application examples for authentication with a digital signature.
NASA Astrophysics Data System (ADS)
Jeon, P.-H.; Lee, C.-L.; Kim, D.-H.; Lee, Y.-J.; Jeon, S.-S.; Kim, H.-J.
2014-03-01
Multi-detector computed tomography (MDCT) can be used to easily and rapidly perform numerous acquisitions, possibly leading to a marked increase in the radiation dose to individual patients. Technical options dedicated to automatically adjusting the acquisition parameters according to the patient's size are of specific interest in pediatric radiology. A constant tube potential reduction can be achieved for adults and children, while maintaining a constant detector energy fluence. To evaluate radiation dose, the weighted CT dose index (CTDIw) was calculated based on the CT dose index (CTDI) measured using an ion chamber, and image noise and image contrast were measured from a scanned image to evaluate image quality. The dose-weighted contrast-to-noise ratio (CNRD) was calculated from the radiation dose, image noise, and image contrast measured from a scanned image. The noise derivative (ND) is a quality index for dose efficiency. X-ray spectra with tube voltages ranging from 80 to 140 kVp were used to compute the average photon energy. Image contrast and the corresponding contrast-to-noise ratio (CNR) were determined for lesions of soft tissue, muscle, bone, and iodine relative to a uniform water background, as the iodine contrast increases at lower energy (i.e., the k-edge of iodine at 33 keV is closer to the beam energy), using mixed water-iodine contrast normalization (water 0; iodine 25, 100, 200, and 1000 HU, respectively). The proposed values correspond to high quality images and can be reduced if only high-contrast organs are assessed. The potential benefit of lowering the tube voltage is an improved CNRD, resulting in a lower radiation dose and optimization of image quality. Adjusting the tube potential in abdominal CT would be useful in current pediatric radiography, where the choice of X-ray techniques generally takes into account the size of the patient as well as the need to balance the conflicting requirements of diagnostic image quality and radiation dose optimization.
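For reference, a worked example of the standard weighted CT dose index and one plausible form of the dose-weighted CNR is given below; the exact definitions used in the study may differ, and the numbers are illustrative:

```python
def ctdi_w(ctdi_center, ctdi_periphery):
    """Weighted CT dose index from centre and peripheral phantom measurements:
    CTDIw = (1/3)*CTDI_100,centre + (2/3)*CTDI_100,periphery (values in mGy)."""
    return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

def cnrd(contrast_hu, noise_hu, dose_mgy):
    """Dose-weighted contrast-to-noise ratio: CNR normalised by the square root
    of dose, so protocols acquired at different dose levels can be compared."""
    return (contrast_hu / noise_hu) / dose_mgy ** 0.5

# Illustrative numbers only (not taken from the study above).
dose = ctdi_w(ctdi_center=10.0, ctdi_periphery=14.0)
print(f"CTDIw = {dose:.1f} mGy")
print(f"CNRD  = {cnrd(contrast_hu=50.0, noise_hu=12.0, dose_mgy=dose):.2f}")
```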
Stability-Constrained Aerodynamic Shape Optimization with Applications to Flying Wings
NASA Astrophysics Data System (ADS)
Mader, Charles Alexander
A set of techniques is developed that allows the incorporation of flight dynamics metrics as an additional discipline in a high-fidelity aerodynamic optimization. Specifically, techniques for including static stability constraints and handling qualities constraints in a high-fidelity aerodynamic optimization are demonstrated. These constraints are developed from stability derivative information calculated using high-fidelity computational fluid dynamics (CFD). Two techniques are explored for computing the stability derivatives from CFD. One technique uses an automatic differentiation adjoint technique (ADjoint) to efficiently and accurately compute a full set of static and dynamic stability derivatives from a single steady solution. The other technique uses a linear regression method to compute the stability derivatives from a quasi-unsteady time-spectral CFD solution, allowing for the computation of static, dynamic and transient stability derivatives. Based on the characteristics of the two methods, the time-spectral technique is selected for further development, incorporated into an optimization framework, and used to conduct stability-constrained aerodynamic optimization. This stability-constrained optimization framework is then used to conduct an optimization study of a flying wing configuration. This study shows that stability constraints have a significant impact on the optimal design of flying wings and that, while static stability constraints can often be satisfied by modifying the airfoil profiles of the wing, dynamic stability constraints can require a significant change in the planform of the aircraft in order for the constraints to be satisfied.
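A minimal sketch of the regression idea, fitting a moment coefficient as a linear function of sampled flow states so that the slopes are the stability derivatives, is shown below with synthetic data standing in for CFD output:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic stand-ins for quasi-unsteady CFD samples: angle of attack (rad),
# nondimensional pitch rate, and the resulting pitching-moment coefficient Cm.
n = 40
alpha = np.deg2rad(rng.uniform(-2.0, 4.0, n))
q_hat = rng.uniform(-0.02, 0.02, n)
cm0, cm_alpha, cm_q = 0.05, -1.2, -8.0                  # "true" derivatives to recover
cm = cm0 + cm_alpha * alpha + cm_q * q_hat + rng.normal(0, 1e-3, n)

# Least-squares fit Cm ~ 1 + alpha + q_hat returns the stability derivatives.
X = np.column_stack([np.ones(n), alpha, q_hat])
coef, *_ = np.linalg.lstsq(X, cm, rcond=None)
print(f"Cm0 ~ {coef[0]:.3f}, Cm_alpha ~ {coef[1]:.3f} per rad, Cm_q ~ {coef[2]:.3f}")
```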
Automatic retinal interest evaluation system (ARIES).
Yin, Fengshou; Wong, Damon Wing Kee; Yow, Ai Ping; Lee, Beng Hai; Quan, Ying; Zhang, Zhuo; Gopalakrishnan, Kavitha; Li, Ruoying; Liu, Jiang
2014-01-01
In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases such as glaucoma, age-related macular degeneration and diabetic retinopathy. However, in practice, retinal image quality is a big concern as automatic systems without consideration of degraded image quality will likely generate unreliable results. In this paper, an automatic retinal image quality assessment system (ARIES) is introduced to assess both image quality of the whole image and focal regions of interest. ARIES achieves 99.54% accuracy in distinguishing fundus images from other types of images through a retinal image identification step in a dataset of 35342 images. The system employs high level image quality measures (HIQM) to perform image quality assessment, and achieves areas under curve (AUCs) of 0.958 and 0.987 for whole image and optic disk region respectively in a testing dataset of 370 images. ARIES acts as a form of automatic quality control which ensures good quality images are used for processing, and can also be used to alert operators of poor quality images at the time of acquisition.
MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control
NASA Astrophysics Data System (ADS)
Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming
2017-09-01
The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under dynamic production conditions. A Bayesian network and big data analytics integrated approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification, and reasoning, are embedded into the Reduce step. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, a Bayesian network of the factors affecting quality is built from prior probability distributions and modified with posterior probability distributions. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost in direct proportion to the increase in computing nodes. It is also shown that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision making for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.
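A minimal in-process imitation of the map/reduce counting pattern that a Bayesian network learner needs (co-occurrence counts normalised into conditional probability tables) is sketched below; Hadoop specifics and the actual shipbuilding variables are omitted:

```python
from collections import Counter
from functools import reduce

# Each record is one manufactured part with discretised quality-related variables.
records = [
    {"machine": "A", "temp": "high", "defect": "yes"},
    {"machine": "A", "temp": "high", "defect": "no"},
    {"machine": "A", "temp": "low",  "defect": "no"},
    {"machine": "B", "temp": "high", "defect": "no"},
    {"machine": "B", "temp": "high", "defect": "yes"},
]

def map_counts(record):
    """Map step: emit a count of one for the observed (parent states, child state) tuple."""
    return Counter({(record["machine"], record["temp"], record["defect"]): 1})

def reduce_counts(c1, c2):
    """Reduce step: merge partial counters coming from different workers."""
    return c1 + c2

counts = reduce(reduce_counts, map(map_counts, records), Counter())

# A conditional probability table entry is obtained by normalising the counts.
total = sum(v for (m, t, _), v in counts.items() if (m, t) == ("A", "high"))
p = counts[("A", "high", "yes")] / total
print(f"P(defect=yes | machine=A, temp=high) = {p:.2f}")   # 0.50 for this toy data
```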
Technical Note: Improving the VMERGE treatment planning algorithm for rotational radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaddy, Melissa R., E-mail: mrgaddy@ncsu.edu; Papp,
2016-07-15
Purpose: The authors revisit the VMERGE treatment planning algorithm by Craft et al. [“Multicriteria VMAT optimization,” Med. Phys. 39, 686–696 (2012)] for arc therapy planning and propose two changes to the method that are aimed at improving the achieved trade-off between treatment time and plan quality at little additional planning time cost, while retaining other desirable properties of the original algorithm. Methods: The original VMERGE algorithm first computes an “ideal,” high quality but also highly time consuming treatment plan that irradiates the patient from all possible angles in a fine angular grid with a highly modulated beam, and then makes this plan deliverable within practical treatment time by an iterative fluence map merging and sequencing algorithm. We propose two changes to this method. First, we regularize the ideal plan obtained in the first step by adding an explicit constraint on treatment time. Second, we propose a different merging criterion that consists of identifying and merging adjacent maps whose merging results in the least degradation of radiation dose. Results: The effect of both suggested modifications is evaluated individually and jointly on clinical prostate and paraspinal cases. Details of the two cases are reported. Conclusions: In the authors’ computational study they found that both proposed modifications, especially the regularization, yield noticeably improved treatment plans for the same treatment times compared with what can be obtained using the original VMERGE method. The resulting plans match the quality of 20-beam step-and-shoot IMRT plans with a delivery time of approximately 2 min.
Short-term Temperature Prediction Using Adaptive Computing on Dynamic Scales
NASA Astrophysics Data System (ADS)
Hu, W.; Cervone, G.; Jha, S.; Balasubramanian, V.; Turilli, M.
2017-12-01
When predicting temperature, there are specific places and times where high-accuracy predictions are harder to obtain. For example, not all the sub-regions in the domain require the same amount of computing resources to generate an accurate prediction. Plateau areas might require fewer computing resources than mountainous areas because of the steeper gradient of temperature change in the latter. However, it is difficult to estimate beforehand the optimal allocation of computational resources because several parameters, in addition to orography, play a role in determining the accuracy of the forecasts. The allocation of resources to perform simulations can become a bottleneck because it requires human intervention to stop jobs or start new ones. The goal of this project is to design and develop a dynamic approach to generate short-term temperature predictions that can automatically determine the required computing resources and the geographic scales of the predictions based on the spatial and temporal uncertainties. The predictions and the prediction quality metrics are computed using a numeric weather prediction model, Analog Ensemble (AnEn), and the parallelization on high performance computing systems is accomplished using Ensemble Toolkit, one component of the RADICAL-Cybertools family of tools. RADICAL-Cybertools decouple the science needs from the computational capabilities by building an intermediate layer to run general ensemble patterns, regardless of the science. In this research, we show how the Ensemble Toolkit allows generating high resolution temperature forecasts at different spatial and temporal resolutions. The AnEn algorithm is run using NAM analysis and forecast data for the continental United States for a period of 2 years. The AnEn results show that the temperature forecasts perform well according to different probabilistic and deterministic statistical tests.
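A toy sketch of the analog ensemble idea, selecting the k past forecasts most similar to a new forecast and using their verifying observations as ensemble members, is given below; the distance metric and data are placeholders for AnEn's weighted, windowed metric and the NAM dataset:

```python
import numpy as np

rng = np.random.default_rng(7)

def analog_ensemble(hist_forecasts, hist_obs, new_forecast, k=20):
    """Return the k past observations whose forecasts best match the new forecast."""
    dist = np.linalg.norm(hist_forecasts - new_forecast, axis=1)
    analogs = np.argsort(dist)[:k]
    return hist_obs[analogs]

# Synthetic history: forecast predictors (2 m temperature, wind speed) and the
# temperatures that were actually observed at verification time.
n_hist = 2000
forecasts = np.column_stack([rng.normal(280.0, 8.0, n_hist), rng.normal(5.0, 2.0, n_hist)])
obs = forecasts[:, 0] + rng.normal(0.0, 1.5, n_hist)

members = analog_ensemble(forecasts, obs, new_forecast=np.array([284.0, 6.5]))
print(f"ensemble mean {members.mean():.1f} K, spread {members.std():.1f} K")
```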
Jacobs, Richard H. A. H.; Haak, Koen V.; Thumfart, Stefan; Renken, Remco; Henson, Brian; Cornelissen, Frans W.
2016-01-01
Our world is filled with texture. For the human visual system, this is an important source of information for assessing environmental and material properties. Indeed, and presumably for this reason, the human visual system has regions dedicated to processing textures. Despite their abundance and apparent relevance, only recently have the relationships between texture features and high-level judgments captured the interest of mainstream science, despite long-standing indications of such relationships. In this study, we explore such relationships, as these might be used to predict perceived texture qualities. This is relevant not only from a psychological/neuroscience perspective, but also for more applied fields such as design, architecture, and the visual arts. In two separate experiments, observers judged various qualities of visual textures such as beauty, roughness, naturalness, elegance, and complexity. Based on factor analysis, we find that in both experiments, ~75% of the variability in the judgments could be explained by a two-dimensional space, with axes that are closely aligned to the beauty and roughness judgments. That a two-dimensional judgment space suffices to capture most of the variability in the perceived texture qualities suggests that observers use a relatively limited set of internal scales on which to base various judgments, including aesthetic ones. Finally, for both of these judgments, we determined the relationship with a large number of texture features computed for each of the texture stimuli. We find that the presence of lower spatial frequencies, oblique orientations, higher intensity variation, higher saturation, and redness correlates with higher beauty ratings. Features that captured image intensity and uniformity correlated with roughness ratings. Therefore, a number of computational texture features are predictive of these judgments. This suggests that perceived texture qualities, including the aesthetic appreciation, are sufficiently universal to be predicted, with reasonable accuracy, based on the computed feature content of the textures.
Searching for a business case for quality in Medicaid managed care.
Greene, Sandra B; Reiter, Kristin L; Kilpatrick, Kerry E; Leatherman, Sheila; Somers, Stephen A; Hamblin, Allison
2008-01-01
Despite the prevalence of evidence-based interventions to improve quality in health care systems, there is a paucity of documented evidence of a financial return on investment (ROI) for these interventions from the perspective of the investing entity. To report on a demonstration project designed to measure the business case for selected quality interventions in high-risk high-cost patient populations in 10 Medicaid managed care organizations across the United States. Using claims and enrollment data gathered over a 3-year period and data on the costs of designing, implementing, and operating the interventions, ROIs were computed for 11 discrete evidence-based quality-enhancing interventions. A complex case management program to treat adults with multiple comorbidities achieved the largest ROI of 12.21:1. This was followed by an ROI of 6.35:1 for a program which treated children with asthma with a history of high emergency room (ER) use and/or inpatient admissions for their disease. An intervention for high-risk pregnant mothers produced a 1.26:1 ROI, and a program for adult patients with diabetes resulted in a 1.16:1 return. The remaining seven interventions failed to show positive returns, although four sites came close to realizing sufficient savings to offset investment costs. Evidence-based interventions designed to improve the quality of patient care may have the best opportunity to yield a positive financial return if it is focused on high-risk high-cost populations and conditions associated with avoidable emergency and inpatient utilization. Developing the necessary tracking systems for the claims and financial investments is critical to perform accurate financial ROI analyses.
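The ROI arithmetic implied by the reported ratios can be made explicit; the dollar figures below are purely illustrative:

```python
def roi_ratio(savings, investment):
    """Return on investment as a ratio: 12.21 means $12.21 returned per $1 invested."""
    return savings / investment

# The dollar amounts below are illustrative; the study reports only the ratios.
investment = 100_000.0
for name, reported in [("complex case management", 12.21),
                       ("pediatric asthma, high ER use", 6.35),
                       ("high-risk pregnancy", 1.26),
                       ("adult diabetes", 1.16)]:
    savings = reported * investment
    print(f"{name}: ROI {roi_ratio(savings, investment):.2f}:1, "
          f"net gain ${savings - investment:,.0f} per ${investment:,.0f} invested")
```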
Measuring the impact of computer resource quality on the software development process and product
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Valett, Jon; Hall, Dana
1985-01-01
The availability and quality of computer resources during the software development process were hypothesized to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects, all of which supported satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by these NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.
Quantitative Prediction of Computational Quality (so the S and C Folks will Accept it)
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Luckring, James M.; Morrison, Joseph H.
2004-01-01
Our choice of title may seem strange but we mean each word. In this talk, we are not going to be concerned with computations made "after the fact", i.e. those for which data are available and which are being conducted for explanation and insight. Here we are interested in preventing S&C design problems by finding them through computation before data are available. For such a computation to have any credibility with those who absorb the risk, it is necessary to quantitatively PREDICT the quality of the computational results.
Computer-aided US diagnosis of breast lesions by using cell-based contour grouping.
Cheng, Jie-Zhi; Chou, Yi-Hong; Huang, Chiun-Sheng; Chang, Yeun-Chung; Tiu, Chui-Mei; Chen, Kuei-Wu; Chen, Chung-Ming
2010-06-01
To develop a computer-aided diagnostic algorithm with automatic boundary delineation for differential diagnosis of benign and malignant breast lesions at ultrasonography (US) and investigate the effect of boundary quality on the performance of a computer-aided diagnostic algorithm. This was an institutional review board-approved retrospective study with waiver of informed consent. A cell-based contour grouping (CBCG) segmentation algorithm was used to delineate the lesion boundaries automatically. Seven morphologic features were extracted. The classifier was a logistic regression function. Five hundred twenty breast US scans were obtained from 520 subjects (age range, 15-89 years), including 275 benign (mean size, 15 mm; range, 5-35 mm) and 245 malignant (mean size, 18 mm; range, 8-29 mm) lesions. The newly developed computer-aided diagnostic algorithm was evaluated on the basis of boundary quality and differentiation performance. The segmentation algorithms and features in two conventional computer-aided diagnostic algorithms were used for comparative study. The CBCG-generated boundaries were shown to be comparable with the manually delineated boundaries. The area under the receiver operating characteristic curve (AUC) and differentiation accuracy were 0.968 +/- 0.010 and 93.1% +/- 0.7, respectively, for all 520 breast lesions. At the 5% significance level, the newly developed algorithm was shown to be superior to the use of the boundaries and features of the two conventional computer-aided diagnostic algorithms in terms of AUC (0.974 +/- 0.007 versus 0.890 +/- 0.008 and 0.788 +/- 0.024, respectively). The newly developed computer-aided diagnostic algorithm that used a CBCG segmentation method to measure boundaries achieved a high differentiation performance. Copyright RSNA, 2010
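The classification stage described above (a logistic regression over morphologic features, evaluated by ROC AUC) can be sketched as follows. The seven features and labels are simulated stand-ins; no CBCG segmentation is performed, and the cross-validation scheme is an assumption rather than the paper's evaluation protocol.

```python
# Illustrative sketch of the classification stage: logistic regression over
# morphologic features, evaluated by ROC AUC. Features are simulated stand-ins
# for the seven CBCG-derived descriptors; no segmentation is performed here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_lesions, n_features = 520, 7
X = rng.normal(size=(n_lesions, n_features))
# Simulated labels: 0 = benign, 1 = malignant, loosely tied to two features.
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(size=n_lesions) > 0).astype(int)

clf = LogisticRegression(max_iter=1000)
probs = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
print(f"cross-validated AUC: {roc_auc_score(y, probs):.3f}")
```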
In-process fault detection for textile fabric production: onloom imaging
NASA Astrophysics Data System (ADS)
Neumann, Florian; Holtermann, Timm; Schneider, Dorian; Kulczycki, Ashley; Gries, Thomas; Aach, Til
2011-05-01
Constant and traceable high fabric quality is of great importance both for technical and for high-quality conventional fabrics. Usually, quality inspection is carried out by trained personnel, whose detection rate and maximum period of concentration are limited. Low-resolution automated fabric inspection machines using texture analysis have been developed, and since 2003 systems for in-process inspection on weaving machines ("onloom") have been commercially available. With these systems, defects can be detected but not measured quantitatively and precisely. Most systems are also prone to inevitable machine vibrations, and feedback loops for fault prevention are not established. Technology has evolved since 2003: camera and computer prices have dropped, resolutions have been enhanced, and recording speeds have increased. These are the preconditions for real-time processing of high-resolution images. So far, these technological advances are not used in textile fabric production. For efficient use, a measurement system must be integrated into the weaving process, and new algorithms for defect detection and measurement must be developed. The goal of the joint project is the development of a modern machine vision system for nondestructive onloom fabric inspection. The system consists of a vibration-resistant machine integration, a high-resolution machine vision system, and new, reliable, and robust algorithms with a quality database for defect documentation. The system is meant to detect, measure, and classify at least 80% of economically relevant defects. Concepts for feedback loops into the weaving process will also be outlined.
Monitoring techniques and alarm procedures for CMS services and sites in WLCG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molina-Perez, J.; Bonacorsi, D.; Gutsche, O.
2012-01-01
The CMS offline computing system is composed of roughly 80 sites (including the most experienced T3s) and a number of central services to distribute, process, and analyze data worldwide. A high level of stability and reliability is required from the underlying infrastructure and services, partially covered by local or automated monitoring and alarming systems such as Lemon and SLS: the former collects metrics from sensors installed on computing nodes and triggers alarms when values are out of range, while the latter measures the quality of service and warns managers when service is affected. CMS has established computing shift procedures with personnel operating worldwide from remote Computing Centers, under the supervision of the Computing Run Coordinator at CERN. These dedicated 24/7 computing shift personnel help to detect and react in a timely manner to any unexpected error and hence ensure that CMS workflows are carried out efficiently and in a sustained manner. Synergy among all the involved actors is exploited to ensure the 24/7 monitoring, alarming, and troubleshooting of the CMS computing sites and services. We review the deployment of the monitoring and alarming procedures, and report on the experience gained throughout the first two years of LHC operation. We describe the efficiency of the communication tools employed, the coherent monitoring framework, the proactive alarming systems, and the proficient troubleshooting procedures that helped the CMS computing facilities and infrastructure operate at high reliability levels.
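The abstract does not describe the internals of Lemon or SLS, so the sketch below only illustrates the general pattern they embody: collecting metrics and raising alarms when values fall outside configured ranges. The metric names and limits are invented for the example and do not correspond to any real Lemon/SLS configuration.

```python
# Generic threshold-based alarming pattern (illustrative only; this is NOT the
# Lemon or SLS API). Metric names and limits below are invented for the example.
from dataclasses import dataclass

@dataclass
class MetricLimit:
    name: str
    low: float
    high: float

LIMITS = [
    MetricLimit("cpu_load", 0.0, 0.9),
    MetricLimit("transfer_quality", 0.8, 1.0),   # fraction of successful transfers
]

def check_alarms(sample: dict[str, float]) -> list[str]:
    """Return alarm messages for metrics outside their configured range."""
    alarms = []
    for limit in LIMITS:
        value = sample.get(limit.name)
        if value is not None and not (limit.low <= value <= limit.high):
            alarms.append(f"ALARM: {limit.name}={value} outside [{limit.low}, {limit.high}]")
    return alarms

print(check_alarms({"cpu_load": 0.97, "transfer_quality": 0.95}))
```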
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Minnis, P.; Spangenberg, D.; Ayers, J. K.; Palikonda, R.; Vakhnin, A.; Dubois, R.; Murphy, P. R.
2014-12-01
The processing, storage and dissemination of satellite cloud and radiation products produced at NASA Langley Research Center are key activities for the Climate Science Branch. A constellation of systems operates in sync to accomplish these goals. Because of the complexity involved with operating such intricate systems, there are both high failure rates and high costs for hardware and system maintenance. Cloud computing has the potential to ameliorate cost and complexity issues. Over time, the cloud computing model has evolved and hybrid systems comprising off-site as well as on-site resources are now common. Towards our mission of providing the highest quality research products to the widest audience, we have explored the use of the Amazon Web Services (AWS) Cloud and Storage and present a case study of our results and efforts. This project builds upon NASA Langley Cloud and Radiation Group's experience with operating large and complex computing infrastructures in a reliable and cost effective manner to explore novel ways to leverage cloud computing resources in the atmospheric science environment. Our case study presents the project requirements and then examines the fit of AWS with the LaRC computing model. We also discuss the evaluation metrics, feasibility, and outcomes and close the case study with the lessons we learned that would apply to others interested in exploring the implementation of the AWS system in their own atmospheric science computing environments.
78 FR 25482 - Notice of Revised Determination on Reconsideration
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-01
...-PROGRESSIVE SOFTWARE COMPUTING, QUALITY TESTING SERVICES, INC., RAILROAD CONSTRUCTION CO. OF SOUTH JERSEY, INC..., LP, PSCI- Progressive Software Computing, Quality Testing Services, Inc., Railroad Construction Co..., ANDERSON CONSTRUCTION SERVICES, BAKER PETROLITE, BAKERCORP, BELL-FAST FIRE PROTECTION INC., BOLTTECH INC...
DOT National Transportation Integrated Search
1997-04-01
The Land Use, Air Quality, and Transportation Integrated Modeling Environment (LATIME) represents an integrated approach to computer modeling and simulation of land use allocation, travel demand, and mobile source emissions for the Albuquerque, New M...
Thomas, Christoph; Brodoefel, Harald; Tsiflikas, Ilias; Bruckner, Friederike; Reimann, Anja; Ketelsen, Dominik; Drosch, Tanja; Claussen, Claus D; Kopp, Andreas; Heuschmid, Martin; Burgstahler, Christof
2010-02-01
To prospectively evaluate the influence of the clinical pretest probability assessed by the Morise score on image quality and diagnostic accuracy in coronary dual-source computed tomography angiography (DSCTA). In 61 patients, DSCTA and invasive coronary angiography were performed. Subjective image quality and the accuracy of DSCTA for stenosis detection (>50%), with invasive coronary angiography as the gold standard, were evaluated. The influence of pretest probability on image quality and accuracy was assessed by logistic regression and chi-square testing. Correlations of image quality and accuracy with the Morise score were determined using linear regression. Thirty-eight patients were categorized into the high, 21 into the intermediate, and 2 into the low probability group. Accuracies for the detection of significant stenoses were 0.94, 0.97, and 1.00, respectively. Logistic regressions and chi-square tests showed statistically significant correlations between Morise score and image quality (P < .0001 and P < .001) and accuracy (P = .0049 and P = .027). Linear regression revealed a cutoff Morise score of 16 for good image quality, while the cutoff for barely diagnostic image quality lay beyond the upper end of the Morise scale. Pretest probability is a weak predictor of image quality and diagnostic accuracy in coronary DSCTA. Sufficient image quality for diagnosis can be reached at all pretest probabilities. Therefore, coronary DSCTA might also be suitable for patients with a high pretest probability. Copyright 2010 AUR. Published by Elsevier Inc. All rights reserved.
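A minimal sketch of the regression step described above: relating a pretest score to a (simulated) image-quality rating and locating the score at which predicted quality crosses a chosen cutoff. All values, the rating scale, and the cutoff are assumptions for illustration, not the study's patient data.

```python
# Illustrative sketch of relating a clinical pretest score to image quality:
# linear regression of a simulated quality rating on the Morise score, and the
# score at which predicted quality drops below an assumed cutoff.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
morise = rng.uniform(0, 24, size=61)                             # simulated Morise scores
quality = 4.5 - 0.08 * morise + rng.normal(scale=0.4, size=61)   # e.g. 1 (poor) .. 5 (excellent)

fit = stats.linregress(morise, quality)
good_quality_threshold = 3.0                                     # assumed "good image quality" cutoff
cutoff_score = (good_quality_threshold - fit.intercept) / fit.slope
print(f"slope = {fit.slope:.3f}; predicted quality falls below "
      f"{good_quality_threshold} at a Morise score of about {cutoff_score:.1f}")
```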
Lell, Michael M; May, Matthias; Deak, Paul; Alibek, Sedat; Kuefner, Michael; Kuettner, Axel; Köhler, Henrik; Achenbach, Stephan; Uder, Michael; Radkow, Tanja
2011-02-01
Computed tomography (CT) is considered the method of choice in thoracic imaging for a variety of indications. In small children, sedation is usually necessary to enable CT and to avoid deterioration of image quality because of patient movement. We evaluated a new, subsecond high-pitch scan mode (HPM), which obviates the need for sedation and breath-holding. A total of 60 patients were included in this study. Thirty patients (mean age, 14 ± 17 months; range, 0-55 months) were examined with a dual-source CT system in an HPM. Scan parameters were as follows: pitch = 3.0, 128 × 0.6 mm slice acquisition, 0.28 seconds gantry rotation time, and reference mAs adapted to body weight (50-100 mAs) at 80 kV. Images were reconstructed with a slice thickness of 0.75 mm. None of the children was sedated for the CT examination, and no breathing instructions were given. Image quality was assessed with a focus on motion artifacts and the delineation of vascular structures and lung parenchyma. Thirty patients (mean age, 15 ± 17 months; range, 0-55 months) were examined under sedation on 2 different CT systems (10-slice CT, n = 18; 64-slice CT, n = 13 patients) in conventional pitch mode (CPM). Dose values were calculated from the dose-length product provided in the patient protocol/dose reports; Monte Carlo simulations were performed to assess the dose distribution for CPM and HPM. All scans were performed without complications. Image quality was superior with HPM because of a significant reduction in motion artifacts compared with CPM on 10- and 64-slice CT. In the control group, artifacts were encountered at the level of the diaphragm (n = 30; 100%), the borders of the heart (n = 30; 100%), the ribs (n = 20; 67%), and the spine (n = 6; 20%), whereas motion artifacts were detected in the HPM group in only 6 patients, in the lung parenchyma next to the diaphragm or the heart (P < 0.001). Dose values were within the same range in the patient examinations (CPM, 1.9 ± 0.6 mSv; HPM, 1.9 ± 0.5 mSv; P = 0.95), although z-overscanning increased with detector width and pitch value. High-pitch chest CT is a robust method that provides high image quality, making sedation or controlled ventilation unnecessary for the examination of infants and small or uncooperative children, while maintaining low radiation dose values.
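The dose figures above are derived from the dose-length product (DLP) reported by the scanner; a minimal sketch of that conversion is shown below. The age-dependent chest conversion coefficients are assumed placeholder values, not the coefficients used in the study, and the Monte Carlo dose-distribution step is not reproduced.

```python
# Effective dose estimated from the dose-length product (DLP):
#   E [mSv] ~ k [mSv / (mGy*cm)] * DLP [mGy*cm]
# The k-factor depends on body region and patient age; the values below are
# placeholders for illustration, not the coefficients used in the study.
K_CHEST_BY_AGE_BAND = {"newborn": 0.039, "1y": 0.026, "5y": 0.018}  # assumed values

def effective_dose_msv(dlp_mgy_cm: float, age_band: str) -> float:
    """Return the estimated effective dose in mSv for a chest scan."""
    return K_CHEST_BY_AGE_BAND[age_band] * dlp_mgy_cm

print(f"{effective_dose_msv(73.0, '1y'):.2f} mSv")   # hypothetical DLP of 73 mGy*cm
```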
Multi-GPU implementation of a VMAT treatment plan optimization algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tian, Zhen, E-mail: Zhen.Tian@UTSouthwestern.edu, E-mail: Xun.Jia@UTSouthwestern.edu, E-mail: Steve.Jiang@UTSouthwestern.edu; Folkerts, Michael; Tan, Jun
Purpose: Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory cannot handle cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or a small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for the GPU implementation. The authors would also like to use this particular problem as an example to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. Methods: The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list (COO) format. On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of the beamlet price, the first step in the PP, is accomplished using multiple GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of the PP and MP are implemented on the CPU or a single GPU due to their modest problem scale and computational loads. The Barzilai and Borwein algorithm with a subspace step scheme is adopted here to solve the MP. A head and neck (H and N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H and N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. Results: The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H and N patient case. S1 leads to inferior plan quality, although its total time was 10 s shorter than that of the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23–46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of the VMAT cases tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. Conclusions: The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality.
The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
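A minimal, CPU-only sketch of the data-layout step described above: a sparse DDC matrix held in COO form is split by beam-angle group into CSR submatrices, and a toy "beamlet price" is computed per group. The group assignment, matrix sizes, and pricing expression are illustrative assumptions; the peer-to-peer GPU transfers, the full pricing and master problems, and the Barzilai-Borwein solver are not reproduced here.

```python
# Illustrative data-layout sketch: a sparse dose-deposition coefficient (DDC)
# matrix in COO form, split by beam-angle group into CSR submatrices, with a
# toy per-group "beamlet price" (dose-times-weight sum). CPU-only stand-in;
# the paper's multi-GPU transfers and pricing problem are omitted.
import numpy as np
from scipy import sparse

rng = np.random.default_rng(3)
n_voxels, n_beamlets, n_groups = 1000, 400, 4
ddc_coo = sparse.random(n_voxels, n_beamlets, density=0.02, random_state=3, format="coo")

# Assign each beamlet (column) to one of four beam-angle groups.
group_of_beamlet = rng.integers(0, n_groups, size=n_beamlets)
ddc_csc = ddc_coo.tocsc()
submatrices = [
    ddc_csc[:, np.flatnonzero(group_of_beamlet == g)].tocsr()   # per-group CSR block
    for g in range(n_groups)
]

# Toy pricing step: score each group's beamlets against a voxel weight vector.
voxel_weights = rng.uniform(size=n_voxels)
for g, sub in enumerate(submatrices):
    prices = sub.T @ voxel_weights                    # one price per beamlet in the group
    print(f"group {g}: {sub.shape[1]} beamlets, best price {prices.max():.3f}")
```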
Optical Scanning Architectures For Electronic Printing Applications
NASA Astrophysics Data System (ADS)
Johnson, Richard V.
1987-06-01
The explosive growth of computer technology in recent years has precipitated an equally dramatic growth in the market for nonimpact electronic printers. One of the most popular ways to implement a high-quality nonimpact electronic printer is to integrate a laser scanner with a xerographic copier/duplicator. This article discusses alternative optical scanner architectures, including both traditional designs that are well represented in the marketplace and more exotic designs configured with spatial light modulators, which to date have had scant market penetration but can offer superior image quality.
Development and Application of New Quality Model for Software Projects
Karnavel, K.; Dillibabu, R.
2014-01-01
The IT industry employs a number of models to identify defects during the construction of software projects. In this paper, we present COQUALMO and its limitations, and aim to increase quality without increasing cost and time. The computation time, cost, and effort required to predict residual defects are very high; this was addressed by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of residual defects remaining in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects, and the implementation of the model is validated using statistical inference, which shows a significant improvement in the quality of the software projects. PMID:25478594
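The abstract does not specify the STDCM's internals, so the sketch below is only a generic illustration of residual-defect estimation: a simple exponential defect-discovery curve is fitted to cumulative defect counts, and the remaining defects are estimated as the fitted asymptote minus the defects found so far. The weekly counts are invented.

```python
# Generic residual-defect estimation (illustrative; this is NOT the STDCM).
# Fit an exponential defect-discovery curve N(t) = a * (1 - exp(-b * t)) to
# cumulative defect counts and report the estimated remaining defects (a - found).
import numpy as np
from scipy.optimize import curve_fit

def discovery_curve(t, a, b):
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 13)                               # 12 weeks of testing (made up)
cumulative_defects = np.array([12, 22, 31, 38, 44, 48, 52, 55, 57, 59, 60, 61])

(a, b), _ = curve_fit(discovery_curve, weeks, cumulative_defects, p0=(70.0, 0.2))
residual = a - cumulative_defects[-1]
print(f"estimated total defects: {a:.0f}, estimated residual defects: {residual:.0f}")
```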