Evaluating virtual hosted desktops for graphics-intensive astronomy
NASA Astrophysics Data System (ADS)
Meade, B. F.; Fluke, C. J.
2018-04-01
Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing, with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware at opposite ends of the useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations - typical of astronomy visualisation workflows - we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.
Improved Distance Learning Environment For Marine Forces Reserve
2016-09-01
keyboard, to form a desktop computer. Laptop computers share similar components but add mobility to the user. If additional desktop computers ...for stationary computing devices such as desktop PCs and laptops include the Microsoft Windows, Mac OS, and Linux families of OSs (Hopkins...opportunities to all Marines. For active duty Marines, government-provided desktops and laptops (GPDLs) typically support DL T&E or learning resource
Development of a small-scale computer cluster
NASA Astrophysics Data System (ADS)
Wilhelm, Jay; Smith, Justin T.; Smith, James E.
2008-04-01
An increase in demand for computing power in academia has created a need for high-performance machines. Computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers can, with the proper software, multiply the performance of a single computer. Cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full-speed operation and take up more space than rack-mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount configuration, gaining the space savings of traditional rack-mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components that multiplies the performance of a single desktop machine, while minimizing occupied space and remaining cost effective.
Where the Cloud Meets the Commons
ERIC Educational Resources Information Center
Ipri, Tom
2011-01-01
Changes presented by cloud computing--shared computing services, applications, and storage available to end users via the Internet--have the potential to seriously alter how libraries provide services, not only remotely, but also within the physical library, specifically concerning challenges facing the typical desktop computing experience.…
Three Approaches to Green Computing on Campus
ERIC Educational Resources Information Center
Thompson, John T.
2009-01-01
A "carbon footprint" is the "total set of greenhouse gas emissions caused directly and indirectly by an (individual, event, organization, and product) expressed as CO2" emissions. Since CO2 emissions are indicative of energy use, the higher the associated CO2 emissions, typically the greater the associated costs. A typical desktop PC system…
MICA: desktop software for comprehensive searching of DNA databases
Stokes, William A; Glick, Benjamin S
2006-01-01
Background Molecular biologists work with DNA databases that often include entire genomes. A common requirement is to search a DNA database to find exact matches for a nondegenerate or partially degenerate query. The software programs available for such purposes are normally designed to run on remote servers, but an appealing alternative is to work with DNA databases stored on local computers. We describe a desktop software program termed MICA (K-Mer Indexing with Compact Arrays) that allows large DNA databases to be searched efficiently using very little memory. Results MICA rapidly indexes a DNA database. On a Macintosh G5 computer, the complete human genome could be indexed in about 5 minutes. The indexing algorithm recognizes all 15 characters of the DNA alphabet and fully captures the information in any DNA sequence, yet for a typical sequence of length L, the index occupies only about 2L bytes. The index can be searched to return a complete list of exact matches for a nondegenerate or partially degenerate query of any length. A typical search of a long DNA sequence involves reading only a small fraction of the index into memory. As a result, searches are fast even when the available RAM is limited. Conclusion MICA is suitable as a search engine for desktop DNA analysis software. PMID:17018144
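To make the k-mer indexing idea concrete, here is a minimal Python sketch of exact-match lookup against a k-mer index; it is only illustrative, since MICA itself stores the index in compact arrays and handles all 15 IUPAC DNA codes for partially degenerate queries, which this toy version does not. The example sequence and k value are made up.

```python
# Minimal sketch of k-mer indexing for exact-match search, in the spirit of
# MICA's approach; the real program uses compact arrays and degenerate bases.

def build_kmer_index(seq, k=8):
    """Map every k-mer in seq to the list of positions where it starts."""
    index = {}
    for i in range(len(seq) - k + 1):
        index.setdefault(seq[i:i + k], []).append(i)
    return index

def find_exact_matches(seq, index, query, k=8):
    """Return start positions of exact, nondegenerate matches of query."""
    if len(query) < k:
        raise ValueError("query shorter than k")
    hits = []
    for pos in index.get(query[:k], []):
        if seq[pos:pos + len(query)] == query:
            hits.append(pos)
    return hits

genome = "ACGTACGTTTGACGTACGTACGGGT"
idx = build_kmer_index(genome, k=4)
print(find_exact_matches(genome, idx, "ACGTACG", k=4))  # [0, 11, 15]
```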
Propagation Environment Assessment Using UAV Electromagnetic Sensors
2018-03-01
could be added, we limit this study to two dimensions.) The computer program then processes the data and determines the existence of any atmospheric... computer to have large processing capacity, and a typical workstation desktop or laptop can perform the function. E. FLIGHT PATTERNS AND DATA...different types of flight patterns were studied, and our findings show that the vertical flight pattern using a rotary platform is more efficient
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-12
... limited to) desktop computers, integrated desktop computers, laptop/notebook/netbook computers, and... computer, and 65% of U.S. households owning a notebook, laptop, or netbook computer, in 2013. Coverage... recently published studies. In these studies, the average annual energy use for a desktop computer was...
Farias Zuniga, Amanda M; Côté, Julie N
2017-06-01
The effects of performing a 90-minute computer task with a laptop versus a dual monitor desktop workstation were investigated in healthy young male and female adults. Work-related musculoskeletal disorders are common among computer (especially female) users. Laptops have surpassed desktop computer sales, and working with multiple monitors has also become popular. However, few studies have provided objective evidence on how they affect the musculoskeletal system in both genders. Twenty-seven healthy participants (mean age = 24.6 years; 13 males) completed a 90-minute computer task while using a laptop or dual monitor (DualMon) desktop. Electromyography (EMG) from eight upper body muscles and visual strain were measured throughout the task. Neck proprioception was tested before and after the computer task using a head-repositioning test. EMG amplitude (root mean square [RMS]), variability (coefficients of variation [CV]), and normalized mutual information (NMI) were computed. Visual strain ( p < .01) and right upper trapezius RMS ( p = .03) increased significantly over time regardless of workstation. Right cervical erector spinae RMS and cervical NMI were smaller, while degrees of overshoot (mean = 4.15°) and end position error (mean = 1.26°) were larger in DualMon regardless of time. Effects on muscle activity were more pronounced in males, whereas effects on proprioception were more pronounced in females. Results suggest that compared to laptop, DualMon work is effective in reducing cervical muscle activity, dissociating cervical connectivity, and maintaining more typical neck repositioning patterns, suggesting some health-protective effects. This evidence could be considered when deciding on computer workstation designs.
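As a rough illustration of the summary measures named above (RMS amplitude, coefficient of variation across windows, and normalized mutual information between two muscles' EMG envelopes), here is a numpy sketch; the window length, histogram bin count, and synthetic signals are assumptions for demonstration, not the study's actual processing parameters.

```python
import numpy as np

def emg_rms(x):
    """Root-mean-square amplitude of an EMG segment."""
    return np.sqrt(np.mean(np.square(x)))

def emg_cv(window_rms):
    """Coefficient of variation (%) of per-window RMS values (amplitude variability)."""
    return 100.0 * np.std(window_rms) / np.mean(window_rms)

def normalized_mutual_info(x, y, bins=16):
    """NMI between two amplitude signals, estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return 2.0 * mi / (hx + hy)

rng = np.random.default_rng(0)
trapezius = np.abs(rng.normal(size=6000))            # stand-in for a rectified EMG envelope
cervical = 0.6 * trapezius + 0.4 * np.abs(rng.normal(size=6000))
windows = trapezius.reshape(60, 100)                 # 60 consecutive windows of 100 samples
window_rms = np.sqrt(np.mean(np.square(windows), axis=1))
print(emg_rms(trapezius), emg_cv(window_rms), normalized_mutual_info(trapezius, cervical))
```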
Transitioning EEG experiments away from the laboratory using a Raspberry Pi 2.
Kuziek, Jonathan W P; Shienh, Axita; Mathewson, Kyle E
2017-02-01
Electroencephalography (EEG) experiments are typically performed in controlled laboratory settings to minimise noise and produce reliable measurements. These controlled conditions also reduce the applicability of the obtained results to more varied environments and may limit their relevance to everyday situations. Advances in computer portability may increase the mobility and applicability of EEG results while decreasing costs. In this experiment we show that stimulus presentation using a Raspberry Pi 2 computer provides a low cost, reliable alternative to a traditional desktop PC in the administration of EEG experimental tasks. Significant and reliable MMN and P3 activity, typical event-related potentials (ERPs) associated with an auditory oddball paradigm, were measured while experiments were administered using the Raspberry Pi 2. While latency differences in ERP triggering were observed between systems, these differences reduced power only marginally, likely due to the reduced processing power of the Raspberry Pi 2. An auditory oddball task administered using the Raspberry Pi 2 produced similar ERPs to those derived from a desktop PC in a laboratory setting. Despite temporal differences and slight increases in trials needed for similar statistical power, the Raspberry Pi 2 can be used to design and present auditory experiments comparable to a PC. Our results show that the Raspberry Pi 2 is a low cost alternative to the desktop PC when administering EEG experiments and, due to its small size and low power consumption, will enable mobile EEG experiments unconstrained by a traditional laboratory setting. Copyright © 2016 Elsevier B.V. All rights reserved.
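A minimal sketch, using only the Python standard library, of generating and timing an auditory oddball trial sequence like the one described; the trial count, deviant probability, and stimulus-onset asynchrony are illustrative, and actual tone playback plus EEG event marking (handled by the study's own presentation setup) are only indicated by comments.

```python
import random
import time

def oddball_sequence(n_trials=200, p_deviant=0.2, seed=1):
    """Trial list for an auditory oddball: ~20% deviants, never two in a row."""
    rng = random.Random(seed)
    trials, prev = [], "standard"
    for _ in range(n_trials):
        t = "deviant" if (prev != "deviant" and rng.random() < p_deviant) else "standard"
        trials.append(t)
        prev = t
    return trials

def run_block(trials, soa=1.0):
    """Present trials at a fixed stimulus-onset asynchrony, logging onset times."""
    log = []
    start = time.perf_counter()
    for i, t in enumerate(trials):
        target = start + i * soa
        while time.perf_counter() < target:   # busy-wait keeps onsets close to target
            pass
        log.append((t, time.perf_counter() - start))
        # a real experiment would play the standard or deviant tone here and
        # send an event marker (trigger) to the EEG amplifier
    return log

log = run_block(oddball_sequence(n_trials=5), soa=0.05)
print(log)
```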
36 CFR 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
36 CFR 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
36 CFR 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
36 CFR 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
Task Scheduling in Desktop Grids: Open Problems
NASA Astrophysics Data System (ADS)
Chernov, Ilya; Nikitina, Natalia; Ivashko, Evgeny
2017-12-01
We survey the areas of Desktop Grid task scheduling that seem to be insufficiently studied so far and are promising for efficiency, reliability, and quality of Desktop Grid computing. These topics include optimal task grouping, "needle in a haystack" paradigm, game-theoretical scheduling, domain-imposed approaches, special optimization of the final stage of the batch computation, and Enterprise Desktop Grids.
36 CFR § 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
Desktop Computing Integration Project
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1992-01-01
The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.
A convenient and accurate parallel Input/Output USB device for E-Prime.
Canto, Rosario; Bufalari, Ilaria; D'Ausilio, Alessandro
2011-03-01
Psychological and neurophysiological experiments require the accurate control of timing and synchrony for Input/Output signals. For instance, a typical Event-Related Potential (ERP) study requires an extremely accurate synchronization of stimulus delivery with recordings. This is typically done via computer software such as E-Prime, and fast communications are typically assured by the Parallel Port (PP). However, the PP is an old and disappearing technology that, for example, is no longer available on portable computers. Here we propose a convenient USB device enabling parallel I/O capabilities. We tested this device against the PP on both a desktop and a laptop machine in different stress tests. Our data demonstrate the accuracy of our system, which suggests that it may be a good substitute for the PP with E-Prime.
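The kind of stress test described, repeatedly issuing a trigger write and summarizing the latency distribution, can be sketched as follows; DummyPort is a stand-in for a real parallel-port or USB driver, so the numbers it produces only exercise the measurement code, not actual hardware.

```python
import statistics
import time

class DummyPort:
    """Stand-in for a parallel-port or USB I/O device driver."""
    def write(self, value):
        pass  # a real driver would set the output lines here

def measure_trigger_latency(port, n=10000):
    """Record per-call latency of trigger writes, in microseconds."""
    latencies = []
    for i in range(n):
        t0 = time.perf_counter()
        port.write(i % 256)
        latencies.append((time.perf_counter() - t0) * 1e6)
    return latencies

lat = measure_trigger_latency(DummyPort())
print(f"mean {statistics.mean(lat):.2f} us, "
      f"sd {statistics.stdev(lat):.2f} us, max {max(lat):.2f} us")
```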
Undergraduate computational physics projects on quantum computing
NASA Astrophysics Data System (ADS)
Candela, D.
2015-08-01
Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
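A self-contained numpy sketch of the sort of simulation these projects involve: the state vector over N = 2^n basis states, a phase-flip oracle, and the diffusion (inversion about the mean) operator iterated roughly (pi/4)*sqrt(N) times. This is the generic textbook construction of Grover's search, not code from the article.

```python
import numpy as np

def grover_search(n_qubits, marked, iterations=None):
    """Simulate Grover's search over N = 2**n_qubits states for one marked item."""
    n = 2 ** n_qubits
    state = np.full(n, 1 / np.sqrt(n))           # uniform superposition after Hadamards
    if iterations is None:
        iterations = int(np.floor(np.pi / 4 * np.sqrt(n)))
    for _ in range(iterations):
        state[marked] *= -1                      # oracle: phase-flip the marked state
        state = 2 * state.mean() - state         # diffusion: inversion about the mean
    return np.abs(state) ** 2                    # measurement probabilities

probs = grover_search(n_qubits=6, marked=42)
print(probs[42], probs.sum())   # marked-state probability near 1; total stays 1
```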
IMAGE EXPLORER: Astronomical Image Analysis on an HTML5-based Web Application
NASA Astrophysics Data System (ADS)
Gopu, A.; Hayashi, S.; Young, M. D.
2014-05-01
Large datasets produced by recent astronomical imagers cause the traditional paradigm for basic visual analysis - typically downloading one's entire image dataset and using desktop clients like DS9, Aladin, etc. - to not scale, despite advances in desktop computing power and storage. This paper describes Image Explorer, a web framework that offers several of the basic visualization and analysis functions commonly provided by tools like DS9, on any HTML5 capable web browser on various platforms. It uses a combination of the modern HTML5 canvas, JavaScript, and several layers of lossless PNG tiles produced from the FITS image data. Astronomers are able to rapidly and simultaneously open up several images on their web-browser, adjust the intensity min/max cutoff, its scaling function, and the zoom level, apply color-maps, view position and FITS header information, execute typically used data reduction codes on the corresponding FITS data using the FRIAA framework, and overlay tiles for source catalog objects, etc.
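A hedged sketch of the core display transform described: clip pixel values to min/max cutoffs, apply a scaling (stretch) function, and cut the 8-bit result into fixed-size tiles. Only numpy is used, with a random array standing in for FITS pixel data; reading FITS files and writing PNG tiles (e.g., with astropy and Pillow) is omitted.

```python
import numpy as np

def scale_image(data, vmin, vmax, stretch="linear"):
    """Clip to [vmin, vmax], apply a stretch, and map to 8-bit display values."""
    x = np.clip((data - vmin) / (vmax - vmin), 0.0, 1.0)
    if stretch == "log":
        x = np.log1p(1000 * x) / np.log1p(1000)
    elif stretch == "sqrt":
        x = np.sqrt(x)
    return (255 * x).astype(np.uint8)

def make_tiles(img8, tile=256):
    """Split an 8-bit image into (row, col) -> tile arrays; edge tiles may be smaller."""
    h, w = img8.shape
    tiles = {}
    for r in range(0, h, tile):
        for c in range(0, w, tile):
            tiles[(r // tile, c // tile)] = img8[r:r + tile, c:c + tile]
    return tiles

data = np.random.default_rng(0).normal(1000.0, 50.0, size=(1024, 1024))
tiles = make_tiles(scale_image(data, vmin=900, vmax=1200, stretch="sqrt"))
print(len(tiles), tiles[(0, 0)].shape)
```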
Distributed-Memory Computing With the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)
NASA Technical Reports Server (NTRS)
Riley, Christopher J.; Cheatwood, F. McNeil
1997-01-01
The Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA), a Navier-Stokes solver, has been modified for use in a parallel, distributed-memory environment using the Message-Passing Interface (MPI) standard. A standard domain decomposition strategy is used in which the computational domain is divided into subdomains with each subdomain assigned to a processor. Performance is examined on dedicated parallel machines and a network of desktop workstations. The effect of domain decomposition and frequency of boundary updates on performance and convergence is also examined for several realistic configurations and conditions typical of large-scale computational fluid dynamic analysis.
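A minimal mpi4py sketch of the one-dimensional domain decomposition with ghost-cell (boundary) exchange that the strategy above implies; LAURA itself is a Fortran flow solver, so the relaxation sweep here is only a placeholder update, and mpi4py is an assumed dependency. Run with, e.g., mpiexec -n 4 python decomp.py.

```python
# Minimal 1-D domain decomposition with ghost-cell exchange; assumes mpi4py.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100                                # interior cells owned by this rank
u = np.full(n_local + 2, float(rank))        # one ghost cell on each end

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(10):
    # boundary update: send edge cells to neighbours, receive into ghost cells
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # placeholder relaxation sweep on interior cells (stands in for the flow solver)
    u[1:-1] = 0.5 * (u[:-2] + u[2:])

print(rank, float(u[1]), float(u[-2]))
```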
75 FR 25185 - Broadband Initiatives Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
..., excluding desktop or laptop computers, computer hardware and software (including anti-virus, anti-spyware, and other security software), audio or video equipment, computer network components... 10 desktop or laptop computers and individual workstations to be located within the rural library...
Comparison of bias analysis strategies applied to a large data set.
Lash, Timothy L; Abrams, Barbara; Bodnar, Lisa M
2014-07-01
Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames.
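A sketch of the third strategy mentioned, collapsing the cohort into unique covariate patterns with frequency weights so that each probabilistic-bias replication operates on a handful of rows instead of hundreds of thousands; pandas is assumed, the variables are invented, and the per-replication "correction" is a placeholder rather than the paper's bias model.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 100000   # stand-in for the 773,625-birth cohort
cohort = pd.DataFrame({
    "bmi_cat": rng.choice(["under", "normal", "over", "obese"], size=n),
    "early_preterm": rng.random(size=n) < 0.02,
})

# Collapse identical records into one row with a frequency weight.
weighted = (cohort.groupby(["bmi_cat", "early_preterm"])
                  .size().reset_index(name="freq"))

def one_bias_replication(w, sens_range=(0.75, 0.95)):
    """One probabilistic-bias draw: sample a classification sensitivity and
    rescale the weighted counts accordingly (illustrative only)."""
    sens = rng.uniform(*sens_range)
    corrected = w["freq"] * sens        # placeholder correction on weighted counts
    return corrected.sum()

print(weighted)
print(one_bias_replication(weighted))
```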
Centrifuge: rapid and sensitive classification of metagenomic sequences.
Kim, Daehwan; Song, Li; Breitwieser, Florian P; Salzberg, Steven L
2016-12-01
Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. © 2016 Kim et al.; Published by Cold Spring Harbor Laboratory Press.
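A toy FM-index in Python (BWT from a full suffix sort, cumulative character counts, an occurrence table, and backward search) to illustrate the kind of index Centrifuge builds; the real index is vastly more space-efficient, covers thousands of genomes, and attaches taxonomic labels, none of which is attempted here.

```python
def bwt_index(text):
    """Build BWT, smaller-character counts, and occurrence table for a small text."""
    text += "$"
    sa = sorted(range(len(text)), key=lambda i: text[i:])
    bwt = "".join(text[i - 1] for i in sa)
    counts = {ch: sum(1 for c in text if c < ch) for ch in set(text)}  # C[ch]
    occ = {ch: [0] for ch in set(text)}
    for c in bwt:
        for ch in occ:
            occ[ch].append(occ[ch][-1] + (1 if c == ch else 0))
    return bwt, counts, occ

def backward_search(pattern, counts, occ, n):
    """Count occurrences of pattern via FM-index backward search."""
    lo, hi = 0, n
    for ch in reversed(pattern):
        if ch not in counts:
            return 0
        lo = counts[ch] + occ[ch][lo]
        hi = counts[ch] + occ[ch][hi]
        if lo >= hi:
            return 0
    return hi - lo

genome = "ACGTACGTGACGA"
bwt, counts, occ = bwt_index(genome)
print(backward_search("ACG", counts, occ, len(genome) + 1))   # 3
```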
Illuminator, a desktop program for mutation detection using short-read clonal sequencing.
Carr, Ian M; Morgan, Joanne E; Diggle, Christine P; Sheridan, Eamonn; Markham, Alexander F; Logan, Clare V; Inglehearn, Chris F; Taylor, Graham R; Bonthron, David T
2011-10-01
Current methods for sequencing clonal populations of DNA molecules yield several gigabases of data per day, typically comprising reads of < 100 nt. Such datasets permit widespread genome resequencing and transcriptome analysis or other quantitative tasks. However, this huge capacity can also be harnessed for the resequencing of smaller (gene-sized) target regions, through the simultaneous parallel analysis of multiple subjects, using sample "tagging" or "indexing". These methods promise to have a huge impact on diagnostic mutation analysis and candidate gene testing. Here we describe a software package developed for such studies, offering the ability to resolve pooled samples carrying barcode tags and to align reads to a reference sequence using a mutation-tolerant process. The program, Illuminator, can identify rare sequence variants, including insertions and deletions, and permits interactive data analysis on standard desktop computers. It facilitates the effective analysis of targeted clonal sequencer data without dedicated computational infrastructure or specialized training. Copyright © 2011 Elsevier Inc. All rights reserved.
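A minimal sketch of the first stage described, resolving pooled reads by their barcode tags with a small mismatch tolerance; the barcodes, reads, and one-mismatch threshold are invented for illustration, and the mutation-tolerant alignment stage is not shown.

```python
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def demultiplex(reads, barcodes, max_mismatch=1):
    """Assign each read to the sample whose barcode tag best matches its prefix."""
    pools = {sample: [] for sample in barcodes}
    pools["unassigned"] = []
    for read in reads:
        best, best_d = "unassigned", max_mismatch + 1
        for sample, tag in barcodes.items():
            d = hamming(read[:len(tag)], tag)
            if d < best_d:
                best, best_d = sample, d
        trimmed = read[len(barcodes[best]):] if best != "unassigned" else read
        pools[best].append(trimmed)
    return pools

barcodes = {"patient_1": "ACGT", "patient_2": "TGCA"}
reads = ["ACGTTTGGCCA", "AGGTTTGGCCA", "TGCACCATGGA", "GGGGCCATGGA"]
print(demultiplex(reads, barcodes))
```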
Desktop Publishing Made Simple.
ERIC Educational Resources Information Center
Wentling, Rose Mary
1989-01-01
The author discusses the types of computer hardware and software necessary to set up a desktop publishing system, both for use in educational administration and for instructional purposes. Classroom applications of desktop publishing are presented. The author also provides guidelines for preparing to teach desktop publishing. (CH)
Running GUI Applications on Peregrine from OSX | High-Performance Computing
Learn how to use Virtual Network Computing (VNC) to access a Linux graphical desktop environment on Peregrine. A VNC server process, reached through a forwarded local port (on, e.g., your laptop), manages a virtual desktop on Peregrine. The VNC password is persistent, so remember it; you will use this password whenever accessing your virtual desktop.
ERIC Educational Resources Information Center
Hall, H. L.
1988-01-01
Reports on the advantages and disadvantages of desktop publishing, using the Apple Macintosh and "Pagemaker" software, to produce a high school yearbook. Asserts that while desktop publishing may be initially more time consuming for those unfamiliar with computers, desktop publishing gives high school journalism staffs more control over…
Pages from the Desktop: Desktop Publishing Today.
ERIC Educational Resources Information Center
Crawford, Walt
1994-01-01
Discusses changes that have made desktop publishing appealing and reasonably priced. Hardware, software, and printer options for getting started and moving on, typeface developments, and the key characteristics of desktop publishing are described. The author's notes on 33 articles from the personal computing literature from January-March 1994 are…
Common Sense Wordworking III: Desktop Publishing and Desktop Typesetting.
ERIC Educational Resources Information Center
Crawford, Walt
1987-01-01
Describes current desktop publishing packages available for microcomputers and discusses the disadvantages, especially in cost, for most personal computer users. Also described is a less expensive alternative technology--desktop typesetting--which meets the requirements of users who do not need elaborate techniques for combining text and graphics.…
Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.
Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P
2017-11-01
Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater, equal, or lesser than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0% (P < .001). Participants demonstrated no significant differences in lipid-layer grade and tear meniscus height between the two environments (all P > .05). However, a relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment (P < .001), which was associated with a higher proportion of subjects reporting greater comfort relative to baseline (36% vs. 5%, P < .001). Even with a modest increase in relative humidity locally, the desktop humidifier shows potential to improve tear-film stability and subjective comfort during computer use.Trial registration no: ACTRN12617000326392.
2007-04-01
judgmental self-doubt, depression, and causal uncertainty, tend to take fewer risks, and have lower self-esteem. Results from two studies (Nygren, 2000...). U.S. Army Research Institute for the Behavioral and Social Sciences, Research Report 1869: Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit...
Seat Interfaces for Aircrew Performance and Safety
2010-01-01
Quantum-II Desktop System consists of a keyboard and hardware accessories (electrodes, cables, etc.), and interfaces with a desktop computer via software...segment. Resistance and reactance data was collected to estimate blood volume changes. The Quantum-II Desktop system collected continuous data of...mockup also included a laptop computer, a
Competition in Defense Acquisitions
2008-05-14
NASA employees to maintain desktop assets; no way to track costs, no standardization, not tracking service quality. NASA's Outsourcing Desktop Initiative (ODIN) transferred the responsibility for providing and managing the vast...assets to the private sector. ODIN goals: cut desktop computing costs, increase service quality, achieve interoperability and standardization, focus...
Desktop Technology for Newspapers: Use of the Computer Tool.
ERIC Educational Resources Information Center
Wilson, Howard Alan
This work considers desktop publishing technology as a way used to paginate newspapers electronically, tracing the technology's development from the beginning of desktop publishing in the mid-1980s to the 1990s. The work emphasizes how desktop publishing technology is and can be used by weekly newspapers. It reports on a Pennsylvania weekly…
Mobility, Emotion, and Universality in Future Collaboration
NASA Astrophysics Data System (ADS)
Chignell, Mark; Hosono, Naotsune; Fels, Deborah; Lottridge, Danielle; Waterworth, John
The graphical user interface has traditionally supported personal productivity, efficiency, and usability. With computer-supported cooperative work, the focus has been on typical people, doing typical work in a highly rational model of interaction. Recent trends towards mobility, and emotional and universal design, are extending the user interface paradigm beyond the routine. As computing moves into the hand and away from the desktop, there is a greater need for dealing with emotions and distractions. Busy and distracted people represent a new kind of disability, but one that will be increasingly prevalent. In this panel we examine the current state of the art, and prospects for future collaboration in non-normative computing requirements. This panel draws together researchers who are studying the problems of mobility, emotion and universality. The goal of the panel is to discuss how progress in these areas will change the nature of future collaboration.
What's New in Software? Mastery of the Computer through Desktop Publishing.
ERIC Educational Resources Information Center
Hedley, Carolyn N.; Ellsworth, Nancy J.
1993-01-01
Offers thoughts on the phenomenon of the underuse of classroom computers. Argues that desktop publishing is one way of overcoming the computer malaise occurring in schools, using the incentive of classroom reading and writing for mastery of many aspects of computer production, including writing, illustrating, reading, and publishing. (RS)
48 CFR 252.204-7011 - Alternative Line Item Structure.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Item 0001: Computer, Desktop with CPU, Monitor, Keyboard and Mouse; Quantity 20; Unit EA. Alternative line item structure: Item 0001: Computer, Desktop with CPU, Keyboard and Mouse, 20 EA; Item 0002: Monitor, 20 EA...
48 CFR 252.204-7011 - Alternative Line Item Structure.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Item 0001: Computer, Desktop with CPU, Monitor, Keyboard and Mouse; Quantity 20; Unit EA. Alternative line item structure: Item 0001: Computer, Desktop with CPU, Keyboard and Mouse, 20 EA; Item 0002: Monitor, 20 EA...
48 CFR 252.204-7011 - Alternative Line Item Structure.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Item 0001: Computer, Desktop with CPU, Monitor, Keyboard and Mouse; Quantity 20; Unit EA. Alternative line item structure: Item 0001: Computer, Desktop with CPU, Keyboard and Mouse, 20 EA; Item 0002: Monitor, 20 EA...
48 CFR 252.204-7011 - Alternative Line Item Structure.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Item 0001: Computer, Desktop with CPU, Monitor, Keyboard and Mouse; Quantity 20; Unit EA. Alternative line item structure: Item 0001: Computer, Desktop with CPU, Keyboard and Mouse, 20 EA; Item 0002: Monitor, 20 EA...
Desktop Publishing: Its Impact on Community College Journalism.
ERIC Educational Resources Information Center
Grzywacz-Gray, John; And Others
1987-01-01
Illustrates the kinds of copy that can be created on Apple Macintosh computers and laser printers. Shows font and type specification options. Discusses desktop publishing costs, potential problems, and computer compatibility. Considers the use of computers in college journalism in production, graphics, accounting, advertising, and promotion. (AYC)
Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms
Hasbrouck, W.P.
1983-01-01
Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.
omniClassifier: a Desktop Grid Computing System for Big Data Prediction Modeling
Phan, John H.; Kothari, Sonal; Wang, May D.
2016-01-01
Robust prediction models are important for numerous science, engineering, and biomedical applications. However, best-practice procedures for optimizing prediction models can be computationally complex, especially when choosing models from among hundreds or thousands of parameter choices. Computational complexity has further increased with the growth of data in these fields, concurrent with the era of “Big Data”. Grid computing is a potential solution to the computational challenges of Big Data. Desktop grid computing, which uses idle CPU cycles of commodity desktop machines, coupled with commercial cloud computing resources can enable research labs to gain easier and more cost effective access to vast computing resources. We have developed omniClassifier, a multi-purpose prediction modeling application that provides researchers with a tool for conducting machine learning research within the guidelines of recommended best-practices. omniClassifier is implemented as a desktop grid computing system using the Berkeley Open Infrastructure for Network Computing (BOINC) middleware. In addition to describing implementation details, we use various gene expression datasets to demonstrate the potential scalability of omniClassifier for efficient and robust Big Data prediction modeling. A prototype of omniClassifier can be accessed at http://omniclassifier.bme.gatech.edu/. PMID:27532062
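A sketch of how a model-selection sweep decomposes into independent work units suitable for a desktop grid: one unit per (parameter setting, cross-validation fold), evaluated anywhere and aggregated afterwards. The BOINC scheduling layer is out of scope here, and run_work_unit returns a deterministic fake score rather than training a real model.

```python
import itertools
import random

def make_work_units(param_grid, n_folds):
    """Enumerate every (parameter setting, CV fold) pair as an independent work unit."""
    keys = sorted(param_grid)
    settings = [dict(zip(keys, values))
                for values in itertools.product(*(param_grid[k] for k in keys))]
    return [{"params": s, "fold": f} for s in settings for f in range(n_folds)]

def run_work_unit(unit):
    """Stand-in for the grid client: train/evaluate one model on one fold."""
    random.seed(str(unit))                 # deterministic fake score for the sketch
    return {"unit": unit, "score": random.random()}

def aggregate(results, n_folds):
    """Average fold scores per parameter setting and pick the best."""
    totals = {}
    for r in results:
        key = tuple(sorted(r["unit"]["params"].items()))
        totals[key] = totals.get(key, 0.0) + r["score"] / n_folds
    return max(totals.items(), key=lambda kv: kv[1])

grid = {"n_features": [10, 50, 100], "classifier": ["svm", "knn"]}
units = make_work_units(grid, n_folds=5)
results = [run_work_unit(u) for u in units]    # a desktop grid would farm these out
print(len(units), aggregate(results, n_folds=5))
```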
Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues
NASA Astrophysics Data System (ADS)
Chakravarthy, Srinivas R.; Rumyantsev, Alexander
2018-03-01
Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses as well as academia as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise/community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization both in cloud computing and in desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study the optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
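A Monte Carlo sketch of the replication question: each request is copied to r servers and the fastest copy wins. Exponential service times are used purely for illustration and queueing is ignored, whereas the paper's models use MAP arrivals with phase-type, shifted exponential, and Weibull service.

```python
import random

def mean_latency(replicas, n_jobs=20000, service_rate=1.0, seed=7):
    """Average completion time when each job runs on `replicas` servers and the
    fastest copy wins; queueing is ignored to isolate the replication effect."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_jobs):
        copies = [rng.expovariate(service_rate) for _ in range(replicas)]
        total += min(copies)
    return total / n_jobs

for r in (1, 2, 3, 4):
    print(f"replicas={r}: mean latency {mean_latency(r):.3f}")
```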
Disaster easily averted? Data confidentiality and the hospital desktop computer.
Sethi, Neeraj; Lane, Gethin; Newton, Sophie; Egan, Philip; Ghosh, Samit
2014-05-01
We specifically identified the hospital desktop computer as a potential source of breaches in confidentiality. We aimed to evaluate if there was accessible, unprotected, confidential information stored on the desktop screen on computers in a district general hospital and if so, how a teaching intervention could improve this situation. An unannounced spot check of 59 ward computers was performed. Data were collected regarding how many had confidential information stored on the desktop screen without any password protection. An online learning module was mandated for healthcare staff and a second cycle of inspection performed. A district general hospital. Two doctors conducted the audit. Computers in clinical areas were assessed. All clinical staff with computer access underwent the online learning module. An online learning module regarding data protection and confidentiality. In the first cycle, 55% of ward computers had easily accessible patient or staff confidential information stored on their desktop screen. This included handovers, referral letters, staff sick leave lists, audits and nursing reports. The majority (85%) of computers accessed were logged in under a generic username and password. The intervention produced an improvement in the second cycle findings with only 26% of computers being found to have unprotected confidential information stored on them. The failure to comply with appropriate confidential data protection regulations is a persistent problem. Education produces some improvement but we also propose a systemic approach to solving this problem. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Feasibility of video codec algorithms for software-only playback
NASA Astrophysics Data System (ADS)
Rodriguez, Arturo A.; Morse, Ken
1994-05-01
Software-only video codecs can provide good playback performance in desktop computers with a 486 or 68040 CPU running at 33 MHz without special hardware assistance. Typically, playback of compressed video can be categorized into three tasks: the actual decoding of the video stream, color conversion, and the transfer of decoded video data from system RAM to video RAM. By current standards, good playback performance is the decoding and display of video streams of 320 by 240 (or larger) compressed frames at 15 (or greater) frames-per-second. Software-only video codecs have evolved by modifying and tailoring existing compression methodologies to suit video playback in desktop computers. In this paper we examine the characteristics used to evaluate software-only video codec algorithms, namely: image fidelity (i.e., image quality), bandwidth (i.e., compression), ease-of-decoding (i.e., playback performance), memory consumption, compression to decompression asymmetry, scalability, and delay. We discuss the tradeoffs among these variables and the compromises that can be made to achieve low numerical complexity for software-only playback. Frame-differencing approaches are described since software-only video codecs typically employ them to enhance playback performance. To complement other papers that appear in this session of the Proceedings, we review methods derived from binary pattern image coding since these methods are amenable for software-only playback. In particular, we introduce a novel approach called pixel distribution image coding.
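A numpy sketch of the frame-differencing idea reviewed above: transmit only the blocks that changed appreciably relative to the decoder's current frame buffer. The block size, threshold, and synthetic frames are arbitrary; this is a generic illustration, not the codec proposed in the paper.

```python
import numpy as np

BLOCK = 8

def encode(frames, threshold=4):
    """Yield (frame, row, col, block) only for blocks that changed appreciably."""
    recon = np.zeros_like(frames[0])                 # mirrors the decoder's frame buffer
    for f, frame in enumerate(frames):
        for r in range(0, frame.shape[0], BLOCK):
            for c in range(0, frame.shape[1], BLOCK):
                cur = frame[r:r + BLOCK, c:c + BLOCK]
                ref = recon[r:r + BLOCK, c:c + BLOCK]
                if np.mean(np.abs(cur.astype(int) - ref.astype(int))) > threshold:
                    recon[r:r + BLOCK, c:c + BLOCK] = cur
                    yield f, r, c, cur.copy()

rng = np.random.default_rng(0)
video = [rng.integers(0, 255, (240, 320), dtype=np.uint8) for _ in range(2)]
video.append(video[1].copy())                        # an unchanged frame adds no blocks
stream = list(encode(video))
full = 3 * (240 // BLOCK) * (320 // BLOCK)
print(f"{len(stream)} of {full} blocks transmitted")
```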
Campus Computing 1993. The USC National Survey of Desktop Computing in Higher Education.
ERIC Educational Resources Information Center
Green, Kenneth C.; Eastman, Skip
A national survey of desktop computing in higher education was conducted in spring and summer 1993 at over 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges and community colleges. Respondents (N=1011) were individuals specifically responsible for the operation and future…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-12
... services and manage networked resources for client devices such as desktop and laptop computers. These... include desktop or laptop computers, which are not primarily accessed via network connections. DOE seeks... Determination of Computer Servers as a Covered Consumer Product AGENCY: Office of Energy Efficiency and...
Campus Computing 1991. The EDUCOM-USC Survey of Desktop Computing in Higher Education.
ERIC Educational Resources Information Center
Green, Kenneth C.; Eastman, Skip
A national survey of desktop computing in higher education was conducted in 1991 of 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=1099) were individuals specifically responsible for the operation and future direction of academic…
Desktop Publishing for Counselors.
ERIC Educational Resources Information Center
Lucking, Robert; Mitchum, Nancy
1990-01-01
Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes that by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes that computers present a way of streamlining the communications of a counseling…
Desktop Publishing: A Powerful Tool for Advanced Composition Courses.
ERIC Educational Resources Information Center
Sullivan, Patricia
1988-01-01
Examines the advantages of using desktop publishing in advanced writing classes. Explains how desktop publishing can spur creativity, call attention to the interaction between words and pictures, encourage the social dimensions of computing and composing, and provide students with practical skills. (MM)
Argonne National Laboratory High Energy Physics Division: Windows Desktops. Support pages cover problem reports, service requests, password help, new users, email on ANL Exchange (see the Windows Clients section; Outlook or Thunderbird recommended), web browsers for Windows desktops, and available software.
The Next Generation of Lab and Classroom Computing - The Silver Lining
2016-12-01
desktop infrastructure (VDI) solution, as well as the computing solutions at three universities, was selected as the basis for comparison. The research...infrastructure, VDI, hardware cost, software cost, manpower, availability, cloud computing, private cloud, bring your own device, BYOD, thin client...
Campus Computing 1990: The EDUCOM/USC Survey of Desktop Computing in Higher Education.
ERIC Educational Resources Information Center
Green, Kenneth C.; Eastman, Skip
The National Survey of Desktop Computer Use in Higher Education was conducted in the spring and summer of 1990 by the Center for Scholarly Technology at the University of Southern California, in cooperation with EDUCOM and with support from 15 corporate sponsors. The survey was designed to collect information about campus planning, policies, and…
NASA Technical Reports Server (NTRS)
Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
With programs such as the US High Performance Computing and Communications Program (HPCCP), the attention of scientists and engineers worldwide has been focused on the potential of very high performance scientific computing, namely systems that are hundreds or thousands of times more powerful than those typically available in desktop systems at any given point in time. Extending the frontiers of computing in this manner has resulted in remarkable advances, both in computing technology itself and also in the various scientific and engineering disciplines that utilize these systems. Within a month or two, a sustained rate of 1 Tflop/s (also written 1 teraflops, or 10(exp 12) floating-point operations per second) is likely to be achieved by the 'ASCI Red' system at Sandia National Laboratory in New Mexico. With this objective in sight, it is reasonable to ask what lies ahead for high-end computing.
NASA Technical Reports Server (NTRS)
Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan
1998-01-01
Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.
A Toolkit for ARB to Integrate Custom Databases and Externally Built Phylogenies
Essinger, Steven D.; Reichenberger, Erin; Morrison, Calvin; Blackwood, Christopher B.; Rosen, Gail L.
2015-01-01
Researchers are perpetually amassing biological sequence data. The computational approaches employed by ecologists for organizing this data (e.g. alignment, phylogeny, etc.) typically scale nonlinearly in execution time with the size of the dataset. This often serves as a bottleneck for processing experimental data since many molecular studies are characterized by massive datasets. To keep up with experimental data demands, ecologists are forced to choose between continually upgrading expensive in-house computer hardware or outsourcing the most demanding computations to the cloud. Outsourcing is attractive since it is the least expensive option, but does not necessarily allow direct user interaction with the data for exploratory analysis. Desktop analytical tools such as ARB are indispensable for this purpose, but they do not necessarily offer a convenient solution for the coordination and integration of datasets between local and outsourced destinations. Therefore, researchers are currently left with an undesirable tradeoff between computational throughput and analytical capability. To mitigate this tradeoff we introduce a software package to leverage the utility of the interactive exploratory tools offered by ARB with the computational throughput of cloud-based resources. Our pipeline serves as middleware between the desktop and the cloud allowing researchers to form local custom databases containing sequences and metadata from multiple resources and a method for linking data outsourced for computation back to the local database. A tutorial implementation of the toolkit is provided in the supporting information, S1 Tutorial. Availability: http://www.ece.drexel.edu/gailr/EESI/tutorial.php. PMID:25607539
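A sketch of the coordination idea under stated assumptions: every local record carries a stable accession, plain sequences are exported for the outsourced computation, and per-accession results are later merged back onto the local metadata. The file formats, field names, and accessions are invented; the actual toolkit integrates with ARB databases.

```python
import json

local_db = {
    "ACC0001": {"sequence": "ACGTACGT", "site": "forest soil", "primer": "515F"},
    "ACC0002": {"sequence": "ACGTTCGT", "site": "grassland", "primer": "515F"},
}

def export_fasta(db, path):
    """Write sequences keyed by stable accessions for the outsourced job."""
    with open(path, "w") as fh:
        for acc, rec in db.items():
            fh.write(f">{acc}\n{rec['sequence']}\n")

def link_results(db, result_path):
    """Attach outsourced per-accession results (e.g., cluster or clade labels)
    back onto the local records."""
    with open(result_path) as fh:
        results = json.load(fh)            # assumed format: {accession: label}
    for acc, label in results.items():
        if acc in db:
            db[acc]["external_label"] = label
    return db

export_fasta(local_db, "outsourced_input.fasta")
with open("external_results.json", "w") as fh:
    json.dump({"ACC0001": "cluster_7", "ACC0002": "cluster_7"}, fh)
print(link_results(local_db, "external_results.json"))
```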
Performance Comparison of Mainframe, Workstations, Clusters, and Desktop Computers
NASA Technical Reports Server (NTRS)
Farley, Douglas L.
2005-01-01
A performance evaluation of a variety of computers frequently found in a scientific or engineering research environment was conducted using synthetic and application program benchmarks. From a performance perspective, emerging commodity processors have superior performance relative to legacy mainframe computers. In many cases, the PC clusters exhibited comparable performance with traditional mainframe hardware when 8-12 processors were used. The main advantage of the PC clusters was related to their cost. Regardless of whether the clusters were built from new computers or whether they were created from retired computers, their performance-to-cost ratio was superior to the legacy mainframe computers. Finally, the typical annual maintenance cost of legacy mainframe computers is several times the cost of new equipment such as multiprocessor PC workstations. The savings from eliminating the annual maintenance fee on legacy hardware can result in a yearly increase in total computational capability for an organization.
Desk-top publishing using IBM-compatible computers.
Grencis, P W
1991-01-01
This paper sets out to describe one Medical Illustration Department's experience of the introduction of computers for desk-top publishing. In this particular case, after careful consideration of all the options open, an IBM-compatible system was installed rather than the often popular choice of an Apple Macintosh.
Video Conferencing: The Next Wave for International Business Communication.
ERIC Educational Resources Information Center
Sondak, Norman E.; Sondak, Eileen M.
This paper suggests that desktop computer-based video conferencing, with high fidelity sound, and group software support, is emerging as a major communications option. Briefly addressed are the following critical factors that are propelling the computer-based video conferencing revolution: (1) widespread availability of desktop computers…
Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists
Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco
2013-01-01
Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programing parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low cost graphic cards (graphic processor units) without any specific programing effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphic card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphic card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior. PMID:23653617
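A numpy sketch of one contrastive-divergence (CD-1) update for a binary restricted Boltzmann machine, the building block of the deep belief networks discussed; on a GPU the same array expressions would run through a library such as CuPy or PyTorch instead of numpy. The layer sizes, learning rate, and synthetic batch are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.05):
    """One CD-1 step for a binary RBM: visible batch v0, weights W, biases b (visible), c (hidden)."""
    h0_prob = sigmoid(v0 @ W + c)
    h0 = (rng.random(h0_prob.shape) < h0_prob).astype(float)   # sample hidden units
    v1_prob = sigmoid(h0 @ W.T + b)                             # reconstruct visibles
    h1_prob = sigmoid(v1_prob @ W + c)
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / len(v0)
    b += lr * (v0 - v1_prob).mean(axis=0)
    c += lr * (h0_prob - h1_prob).mean(axis=0)
    return np.mean((v0 - v1_prob) ** 2)                         # reconstruction error

n_visible, n_hidden = 784, 256
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b, c = np.zeros(n_visible), np.zeros(n_hidden)
batch = (rng.random((64, n_visible)) < 0.1).astype(float)       # fake binary "images"
for epoch in range(5):
    print(epoch, cd1_update(batch, W, b, c))
```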
Optical Design Using Small Dedicated Computers
NASA Astrophysics Data System (ADS)
Sinclair, Douglas C.
1980-09-01
Since the time of the 1975 International Lens Design Conference, we have developed a series of optical design programs for Hewlett-Packard desktop computers. The latest programs in the series, OSLO-25G and OSLO-45G, have most of the capabilities of general-purpose optical design programs, including optimization based on exact ray-trace data. The computational techniques used in the programs are similar to ones used in other programs, but the creative environment experienced by a designer working directly with these small dedicated systems is typically much different from that obtained with shared-computer systems. Some of the differences are due to the psychological factors associated with using a system having zero running cost, while others are due to the design of the program, which emphasizes graphical output and ease of use, as opposed to computational speed.
Hasbrouck, W.P.
1983-01-01
Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report presents computer programs used to develop rms velocity functions and apply mute and normal moveout to a 12-trace seismogram.
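A numpy sketch of the normal-moveout correction such programs apply: a reflection with zero-offset time t0 recorded at offset x arrives at t(x) = sqrt(t0^2 + x^2/v^2), so each sample is mapped back to t0 using the rms velocity. Velocity-function picking and muting are omitted, and the sampling interval, offset, and velocity are illustrative values, not the survey's.

```python
import numpy as np

def nmo_correct(trace, offset, v_rms, dt=0.002):
    """Map a single trace from recorded time t(x) back to zero-offset time t0."""
    n = len(trace)
    t0 = np.arange(n) * dt                       # output (zero-offset) times
    tx = np.sqrt(t0**2 + (offset / v_rms)**2)    # where each t0 sample was recorded
    return np.interp(tx, np.arange(n) * dt, trace, left=0.0, right=0.0)

# Synthetic example: a spike recorded at the hyperbolic arrival time for one offset.
dt, v, offset, t0_true = 0.002, 2000.0, 800.0, 0.400
trace = np.zeros(500)
trace[int(round(np.sqrt(t0_true**2 + (offset / v)**2) / dt))] = 1.0
corrected = nmo_correct(trace, offset, v, dt)
print(np.argmax(corrected) * dt)    # close to 0.400 s after correction
```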
MDA-image: an environment of networked desktop computers for teleradiology/pathology.
Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P
1991-04-01
MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.
Writing Essays on a Laptop or a Desktop Computer: Does It Matter?
ERIC Educational Resources Information Center
Ling, Guangming; Bridgeman, Brent
2013-01-01
To explore the potential effect of computer type on the Test of English as a Foreign Language-Internet-Based Test (TOEFL iBT) Writing Test, a sample of 444 international students was used. The students were randomly assigned to either a laptop or a desktop computer to write two TOEFL iBT practice essays in a simulated testing environment, followed…
Using Cloud Computing infrastructure with CloudBioLinux, CloudMan and Galaxy
Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James
2012-01-01
Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this protocol, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatics analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command line interface, and the web-based Galaxy interface. PMID:22700313
Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.
Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James
2012-06-01
Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.
Contemporary issues in HIM. The application layer--III.
Wear, L L; Pinkert, J R
1993-07-01
We have seen document preparation systems evolve from basic line editors through powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used, and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful. They will range from a simple notebook reminder to a sophisticated photograph processor. Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS. Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphic programs, spreadsheets, and other applications. Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines. Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.
A comparison of the postures assumed when using laptop computers and desktop computers.
Straker, L; Jones, K J; Miller, J
1997-08-01
This study evaluated the postural implications of using a laptop computer. Laptop computer screens and keyboards are joined, and are therefore unable to be adjusted separately in terms of screen height and distance, and keyboard height and distance. The posture required for their use is likely to be constrained, as little adjustment can be made for the anthropometric differences of users. In addition to the postural constraints, the study looked at discomfort levels and performance when using laptops as compared with desktops. Statistical analysis showed significantly greater neck flexion and head tilt with laptop use. The other body angles measured (trunk, shoulder, elbow, wrist, and scapula and neck protraction/retraction) showed no statistical differences. The average discomfort experienced after using the laptop for 20 min, although appearing greater than the discomfort experienced after using the desktop, was not significantly greater. When using the laptop, subjects tended to perform better than when using the desktop, though not significantly so. Possible reasons for the results are discussed and implications of the findings outlined.
Hasbrouck, W.P.
1983-01-01
Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language used by the Tektronix 4051 Graphic System. This report presents computer programs to perform X-square/T-square analyses and to plot normal moveout lines on a seismogram overlay.
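The X-squared/T-squared analysis referred to above rests on the relation T^2 = T0^2 + X^2/V^2: a straight-line fit of picked reflection times squared against offsets squared has slope 1/V^2 and intercept T0^2. A minimal sketch with invented picks follows.

```python
# X^2-T^2 velocity analysis: fit T^2 vs X^2 to recover V_rms and T0 (invented picks).
import numpy as np

x = np.array([30.0, 60.0, 90.0, 120.0, 150.0])           # offsets, m (assumed)
t = np.array([0.2010, 0.2040, 0.2088, 0.2154, 0.2236])   # picked times, s (assumed)

slope, intercept = np.polyfit(x**2, t**2, 1)
v = 1.0 / np.sqrt(slope)        # rms (stacking) velocity, m/s
t0 = np.sqrt(intercept)         # zero-offset two-way time, s
print(f"V_rms ~ {v:.0f} m/s, T0 ~ {t0:.3f} s")
```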
Open Radio Communications Architecture Core Framework V1.1.0 Volume 1 Software Users Manual
2005-02-01
on a PC utilizing the KDE desktop that comes with Red Hat Linux. The default desktop for most Red Hat Linux installations is the GNOME desktop. The... SCA) v2.2. The software was designed for a desktop computer running the Linux operating system (OS). It was developed in C++, uses ACE/TAO for CORBA... middleware, Xerces for the XML parser, and Red Hat Linux for the Operating System. The software is referred to as Open Radio Communication
ERIC Educational Resources Information Center
Zilka, Gila
2017-01-01
The viewing and browsing habits of Israeli children age 8-12 are the subject of this study. The participants did not have a computer at home and were given either a desktop or hybrid computer for home use. Television viewing and internet surfing habits were described, examining whether the children did so with their parents, family members, and…
Creative Computer Detective: The Basics of Teaching Desktop Publishing.
ERIC Educational Resources Information Center
Slothower, Jodie
Teaching desktop publishing (dtp) in college journalism classes is most effective when the instructor integrates into specific courses four types of software--a word processor, a draw program, a paint program and a layout program. In a course on design and layout, the instructor can demonstrate with the computer how good design can be created and…
26 CFR 1.179-5 - Time and manner of making election.
Code of Federal Regulations, 2010 CFR
2010-04-01
... desktop computer costing $1,500. On Taxpayer's 2003 Federal tax return filed on April 15, 2004, Taxpayer elected to expense under section 179 the full cost of the laptop computer and the full cost of the desktop... provided by the Internal Revenue Code, the regulations under the Code, or other guidance published in the...
Application of desktop computers in nuclear engineering education
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graves, H.W. Jr.
1990-01-01
Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment - increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy Analysis Software Service have been utilized. These programs have been designed to be sufficiently structured to permit an orderly, disciplined solution to the problem being solved, and yet be flexible enough to accommodate most problem solution options.
Desktop Manufacturing Technologies.
ERIC Educational Resources Information Center
Snyder, Mark
1991-01-01
Desktop manufacturing is the use of data from a computer-assisted design system to construct actual models of an object. Emerging processes are stereolithography, laser sintering, ballistic particle manufacturing, laminated object manufacturing, and photochemical machining. (SK)
Does It Matter Whether One Takes a Test on an iPad or a Desktop Computer?
ERIC Educational Resources Information Center
Ling, Guangming
2016-01-01
To investigate possible iPad related mode effect, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…
ERIC Educational Resources Information Center
Fryer, Wesley
2004-01-01
There has long been a power struggle between techies and teachers over classroom computer desktops. IT personnel tend to believe allowing "inept" educators to have unfettered access to their computer's hard drive is an open invitation for trouble. Conversely, teachers often perceive tech support to be "uncaring" adversaries standing in the way of…
Computer usage and national energy consumption: Results from a field-metering study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desroches, Louis-Benoit; Fuchs, Heidi; Greenblatt, Jeffery
The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power of power supplies to computing needs, and improving the efficiency of individual components.
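For readers who want to see how usage hours translate into the unit and national figures quoted above, the back-of-the-envelope sketch below multiplies an assumed On-mode power draw by daily usage and an assumed installed base. The wattages and stock counts are illustrative assumptions, not values from the study, so the results only land in the same ballpark as the reported 194 kWh/yr, 75 kWh/yr, 20 TWh, and 11 TWh.

```python
# Rough annual-energy arithmetic with assumed power draws and installed bases.
HOURS_PER_YEAR_FACTOR = 365   # days per year

def annual_energy_kwh(on_watts, hours_per_day):
    """Unit annual energy consumption in kWh, ignoring sleep/off modes."""
    return on_watts * hours_per_day * HOURS_PER_YEAR_FACTOR / 1000.0

desktop_aec = annual_energy_kwh(on_watts=70, hours_per_day=7.3)   # ~187 kWh/yr
laptop_aec  = annual_energy_kwh(on_watts=40, hours_per_day=4.8)   # ~70 kWh/yr

# National totals scale unit energy by an assumed installed base (units).
desktop_twh = desktop_aec * 100e6 / 1e9    # ~19 TWh for an assumed 100 million desktops
laptop_twh  = laptop_aec  * 150e6 / 1e9    # ~10.5 TWh for an assumed 150 million laptops
print(desktop_aec, laptop_aec, desktop_twh, laptop_twh)
```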
Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo
2014-01-01
Neuroscience is an increasingly multidisciplinary and highly cooperative field where neuroimaging plays an important role. The rapid evolution of neuroimaging demands a growing number of computing resources and skills that need to be put in place at every lab. Typically, each group tries to set up its own servers and workstations to support its neuroimaging needs, having to learn everything from operating system management to the details of specific neuroscience software tools before any results can be obtained from each setup. This setup and learning process is replicated in every lab, even when a strong collaboration among several groups is going on. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - the Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services in an interactive, neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as a contribution to a high-impact paper published in the January issue of the NeuroImage journal. The ABrIL system has shown its applicability in several neuroscience projects at a relatively low cost, promoting truly collaborative actions and speeding up project results and their clinical applicability.
Accuracy of remote chest X-ray interpretation using Google Glass technology.
Spaedy, Emily; Christakopoulos, Georgios E; Tarar, Muhammad Nauman J; Christopoulos, Georgios; Rangan, Bavana V; Roesle, Michele; Ochoa, Cristhiaan D; Yarbrough, William; Banerjee, Subhash; Brilakis, Emmanouil S
2016-09-15
We sought to explore the accuracy of remote chest X-ray reading using hands-free, wearable technology (Google Glass, Google, Mountain View, California). We compared interpretation of twelve chest X-rays with 23 major cardiopulmonary findings by faculty and fellows from cardiology, radiology, and pulmonary-critical care via: (1) viewing the chest X-ray image on the Google Glass screen; (2) viewing a photograph of the chest X-ray taken using Google Glass and interpreted on a mobile device; (3) viewing the original chest X-ray on a desktop computer screen. One point was given for identification of each correct finding and a subjective rating of user experience was recorded. Fifteen physicians (5 faculty and 10 fellows) participated. The average chest X-ray reading score (maximum 23 points) as viewed through the Google Glass, Google Glass photograph on a mobile device, and the original X-ray viewed on a desktop computer was 14.1±2.2, 18.5±1.5 and 21.3±1.7, respectively (p<0.0001 between Google Glass and mobile device, p<0.0001 between Google Glass and desktop computer and p=0.0004 between mobile device and desktop computer). Of 15 physicians, 11 (73.3%) felt confident in detecting findings using the photograph taken by Google Glass as viewed on a mobile device. Remote chest X-ray interpretation using hands-free, wearable technology (Google Glass) is less accurate than interpretation using a desktop computer or a mobile device, suggesting that further technical improvements are needed before widespread application of this novel technology. Published by Elsevier Ireland Ltd.
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.; Kroutter, Paul J.
2013-01-01
This study used a cross-case analysis methodology to compare four line-of-inquiry studies of desktop virtual environments (DVEs) to examine the relationships of gender and computer gaming experience to learning performance and perceptions. Comparison was made of learning patterns in a general non-technical DVE with patterns in technically complex,…
ERIC Educational Resources Information Center
Technology & Learning, 2008
2008-01-01
When it comes to IT, there has always been an important link between data center control and client flexibility. As computing power increases, so do the potentially crippling threats to security, productivity and financial stability. This article talks about Dell's On-Demand Desktop Streaming solution which is designed to centralize complete…
Office Computer Software: A Comprehensive Review of Software Programs.
ERIC Educational Resources Information Center
Secretary, 1992
1992-01-01
Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)
An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.
Szeto, Grace P; Lee, Raymond
2002-04-01
To evaluate and compare the postures and movements of the cervical and upper thoracic spine, the typing performance, and workstation ergonomic factors when using desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students between ages 20 and 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task by using each of the 3 computers. Measurements during the typing task were taken at set intervals. The cervical and thoracic spine adopted a more flexed posture when using the smaller-sized computers. There were significantly greater neck movements when using the desktop computer compared with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as the computer size decreased. Typing performance and subjective rating of difficulty in using the keyboards were also significantly different among the 3 types of computers. Computer users need to consider the posture of the spine and the potential risk of developing musculoskeletal discomfort when choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation
Davis, J P; Akella, S; Waddell, P H
2004-01-01
Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists/statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example is Felsenstein's PHYLIP code, written in C, for UPGMA and neighbor-joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred, against that of PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to not only accelerate phylogenetics algorithm performance on the desktop, but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
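The UPGMA algorithm that the custom hardware accelerates is itself simple: repeatedly join the two closest clusters and average their distances, weighted by cluster size. The following unoptimized sketch, on an invented four-taxon distance matrix, shows the arithmetic that dominates the running time for large numbers of taxa.

```python
# Naive UPGMA clustering of an invented 4-taxon distance matrix.
import numpy as np

names = ["A", "B", "C", "D"]                      # hypothetical taxa
d = np.array([[0.0, 2.0, 6.0, 6.0],
              [2.0, 0.0, 6.0, 6.0],
              [6.0, 6.0, 0.0, 4.0],
              [6.0, 6.0, 4.0, 0.0]])

clusters = [(name, 1) for name in names]          # (label, number of leaves)
while len(clusters) > 1:
    n = len(clusters)
    # Find the closest pair of clusters
    i, j = min(((a, b) for a in range(n) for b in range(a + 1, n)),
               key=lambda ab: d[ab[0], ab[1]])
    (li, si), (lj, sj) = clusters[i], clusters[j]
    # Size-weighted average distance from the merged cluster to the others
    new_row = (si * d[i] + sj * d[j]) / (si + sj)
    keep = [k for k in range(n) if k not in (i, j)]
    d = np.vstack([d[keep][:, keep], new_row[keep][np.newaxis, :]])
    d = np.hstack([d, np.append(new_row[keep], 0.0)[:, np.newaxis]])
    clusters = [clusters[k] for k in keep] + [(f"({li},{lj})", si + sj)]

print(clusters[0][0])   # e.g. "((A,B),(C,D))"
```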
Interaction devices for hands-on desktop design
NASA Astrophysics Data System (ADS)
Ju, Wendy; Madsen, Sally; Fiene, Jonathan; Bolas, Mark T.; McDowall, Ian E.; Faste, Rolf
2003-05-01
Starting with a list of typical hand actions - such as touching or twisting - a collection of physical input device prototypes was created to study better ways of engaging the body and mind in the computer aided design process. These devices were interchangeably coupled with a graphics system to allow for rapid exploration of the interplay between the designer's intent, body motions, and the resulting on-screen design. User testing showed that a number of key considerations should influence the future development of such devices: coupling between the physical and virtual worlds, tactile feedback, and scale. It is hoped that these explorations contribute to the greater goal of creating user interface devices that increase the fluency, productivity and joy of computer-augmented design.
GPU-Accelerated Molecular Modeling Coming Of Age
Stone, John E.; Hardy, David J.; Ufimtsev, Ivan S.
2010-01-01
Graphics processing units (GPUs) have traditionally been used in molecular modeling solely for visualization of molecular structures and animation of trajectories resulting from molecular dynamics simulations. Modern GPUs have evolved into fully programmable, massively parallel co-processors that can now be exploited to accelerate many scientific computations, typically providing about one order of magnitude speedup over CPU code and in special cases providing speedups of two orders of magnitude. This paper surveys the development of molecular modeling algorithms that leverage GPU computing, the advances already made and remaining issues to be resolved, and the continuing evolution of GPU technology that promises to become even more useful to molecular modeling. Hardware acceleration with commodity GPUs is expected to benefit the overall computational biology community by bringing teraflops performance to desktop workstations and in some cases potentially changing what were formerly batch-mode computational jobs into interactive tasks. PMID:20675161
GPU-accelerated molecular modeling coming of age.
Stone, John E; Hardy, David J; Ufimtsev, Ivan S; Schulten, Klaus
2010-09-01
Graphics processing units (GPUs) have traditionally been used in molecular modeling solely for visualization of molecular structures and animation of trajectories resulting from molecular dynamics simulations. Modern GPUs have evolved into fully programmable, massively parallel co-processors that can now be exploited to accelerate many scientific computations, typically providing about one order of magnitude speedup over CPU code and in special cases providing speedups of two orders of magnitude. This paper surveys the development of molecular modeling algorithms that leverage GPU computing, the advances already made and remaining issues to be resolved, and the continuing evolution of GPU technology that promises to become even more useful to molecular modeling. Hardware acceleration with commodity GPUs is expected to benefit the overall computational biology community by bringing teraflops performance to desktop workstations and in some cases potentially changing what were formerly batch-mode computational jobs into interactive tasks. (c) 2010 Elsevier Inc. All rights reserved.
Pinthong, Watthanai; Muangruen, Panya
2016-01-01
Development of high-throughput technologies, such as Next-generation sequencing, allows thousands of experiments to be performed simultaneously while reducing resource requirement. Consequently, a massive amount of experiment data is now rapidly generated. Nevertheless, the data are not readily usable or meaningful until they are further analysed and interpreted. Due to the size of the data, a high performance computer (HPC) is required for the analysis and interpretation. However, the HPC is expensive and difficult to access. Other means were developed to allow researchers to acquire the power of HPC without a need to purchase and maintain one such as cloud computing services and grid computing system. In this study, we implemented grid computing in a computer training center environment using Berkeley Open Infrastructure for Network Computing (BOINC) as a job distributor and data manager combining all desktop computers to virtualize the HPC. Fifty desktop computers were used for setting up a grid system during the off-hours. In order to test the performance of the grid system, we adapted the Basic Local Alignment Search Tools (BLAST) to the BOINC system. Sequencing results from Illumina platform were aligned to the human genome database by BLAST on the grid system. The result and processing time were compared to those from a single desktop computer and HPC. The estimated durations of BLAST analysis for 4 million sequence reads on a desktop PC, HPC and the grid system were 568, 24 and 5 days, respectively. Thus, the grid implementation of BLAST by BOINC is an efficient alternative to the HPC for sequence alignment. The grid implementation by BOINC also helped tap unused computing resources during the off-hours and could be easily modified for other available bioinformatics software. PMID:27547555
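The data-parallel idea behind such a grid is that a large set of query reads can be split into independent chunks, each of which becomes one work unit for an idle desktop. The sketch below only illustrates the splitting step; file names and chunk size are placeholders, and the actual scheduling in the study is handled by BOINC rather than by a script like this.

```python
# Split a FASTA file into fixed-size chunks that could each be one BLAST work unit.
from pathlib import Path

def split_fasta(path, reads_per_chunk=10000, out_dir="chunks"):
    """Write consecutive groups of FASTA records to separate chunk files."""
    Path(out_dir).mkdir(exist_ok=True)
    chunk, count, index = [], 0, 0
    with open(path) as handle:
        for line in handle:
            if line.startswith(">") and count == reads_per_chunk:
                Path(out_dir, f"chunk_{index:04d}.fasta").write_text("".join(chunk))
                chunk, count, index = [], 0, index + 1
            if line.startswith(">"):
                count += 1
            chunk.append(line)
    if chunk:
        Path(out_dir, f"chunk_{index:04d}.fasta").write_text("".join(chunk))

# Each chunk_XXXX.fasta then becomes one work unit, e.g. a command such as
# "blastn -query chunk_0000.fasta -db human_genome -out chunk_0000.out"
# dispatched to an idle desktop by the job distributor.
split_fasta("reads.fasta")   # hypothetical input file
```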
Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States
ERIC Educational Resources Information Center
Sung, Eunmo; Mayer, Richard E.
2012-01-01
College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…
Validation of tablet-based evaluation of color fundus images
Christopher, Mark; Moga, Daniela C.; Russell, Stephen R.; Folk, James C.; Scheetz, Todd; Abràmoff, Michael D.
2012-01-01
Purpose To compare diabetic retinopathy (DR) referral recommendations made by viewing fundus images using a tablet computer to recommendations made using a standard desktop display. Methods A tablet computer (iPad) and a desktop PC with a high-definition color display were compared. For each platform, two retinal specialists independently rated 1200 color fundus images from patients at risk for DR using an annotation program, Truthseeker. The specialists determined whether each image had referable DR, and also how urgently each patient should be referred for medical examination. Graders viewed and rated the randomly presented images independently and were masked to their ratings on the alternative platform. Tablet- and desktop display-based referral ratings were compared using cross-platform, intra-observer kappa as the primary outcome measure. Additionally, inter-observer kappa, sensitivity, specificity, and area under ROC (AUC) were determined. Results A high level of cross-platform, intra-observer agreement was found for the DR referral ratings between the platforms (κ=0.778), and for the two graders, (κ=0.812). Inter-observer agreement was similar for the two platforms (κ=0.544 and κ=0.625 for tablet and desktop, respectively). The tablet-based ratings achieved a sensitivity of 0.848, a specificity of 0.987, and an AUC of 0.950 compared to desktop display-based ratings. Conclusions In this pilot study, tablet-based rating of color fundus images for subjects at risk for DR was consistent with desktop display-based rating. These results indicate that tablet computers can be reliably used for clinical evaluation of fundus images for DR. PMID:22495326
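As a small worked example of the agreement statistics reported above, the sketch below computes Cohen's kappa between two sets of referral decisions and sensitivity/specificity against one of them taken as the reference. The rating vectors are invented and far shorter than the study's 1200 images.

```python
# Cohen's kappa and sensitivity/specificity from paired referral decisions
# (1 = refer, 0 = do not refer); the ratings are invented.
import numpy as np

tablet  = np.array([1, 1, 0, 0, 1, 0, 1, 0, 0, 1])   # hypothetical ratings
desktop = np.array([1, 1, 0, 0, 1, 0, 0, 0, 1, 1])

p_observed = np.mean(tablet == desktop)
p_tablet, p_desktop = tablet.mean(), desktop.mean()
p_chance = p_tablet * p_desktop + (1 - p_tablet) * (1 - p_desktop)
kappa = (p_observed - p_chance) / (1 - p_chance)

# Treating the desktop readings as the reference standard:
tp = np.sum((tablet == 1) & (desktop == 1))
tn = np.sum((tablet == 0) & (desktop == 0))
fp = np.sum((tablet == 1) & (desktop == 0))
fn = np.sum((tablet == 0) & (desktop == 1))
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"kappa={kappa:.2f}, sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```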
Using Avizo Software on the Peregrine System | High-Performance Computing
Avizo can be run remotely from the Peregrine visualization node. First, launch a TurboVNC remote desktop. Then, from a terminal in that remote desktop: % module load avizo % vglrun avizo. Running locally, Avizo can…
Desktop Publishing: A New Frontier for Instructional Technologists.
ERIC Educational Resources Information Center
Bell, Norman T.; Warner, James W.
1986-01-01
Discusses new possibilities that computers and laser printers offer instructional technologists. Includes a brief history of printed communications, a description of new technological advances referred to as "desktop publishing," and suggests the application of this technology to instructional tasks. (TW)
Implementing desktop image access of GI images
NASA Astrophysics Data System (ADS)
Grevera, George J.; Feingold, Eric R.; Horii, Steven C.; Laufer, Igor
1996-05-01
In this paper we present a specific example of the current state-of-the-art in desktop image access in the GI section of the Department of Radiology at the Hospital of the University of Pennsylvania. We describe a system which allows physicians to view and manipulate images from a Philips digital fluoroscopy system at the workstations in their offices. Typically they manipulate and view these images on their desktop Macs and then submit the results for slide making or save the images in digital teaching files. In addition to a discussion of the current state-of-the-art here at HUP, we also discuss some future directions that we are pursuing.
CIM for 300-mm semiconductor fab
NASA Astrophysics Data System (ADS)
Luk, Arthur
1997-08-01
Five years ago, factory automation (F/A) was not prevalent in the fab. Today, facing a drastically changed market and intense competition, management requests that plant floor data be forwarded to their desktop computers. This increased demand rapidly pushed F/A toward computer integrated manufacturing (CIM). Through personalization, computers have been reduced in size so that they can sit on our desktops; the PC initiated a new era of computing. With the advent of the network, the network computer (NC) creates fresh problems for us. As we plan to invest more than $3 billion to build a new 300 mm fab, the next-generation technology raises a challenging bar.
MICROPROCESSOR-BASED DATA-ACQUISITION SYSTEM FOR A BOREHOLE RADAR.
Bradley, Jerry A.; Wright, David L.
1987-01-01
An efficient microprocessor-based system is described that permits real-time acquisition, stacking, and digital recording of data generated by a borehole radar system. Although the system digitizes, stacks, and records independently of a computer, it is interfaced to a desktop computer for program control over system parameters such as sampling interval, number of samples, number of times the data are stacked prior to recording on nine-track tape, and for graphics display of the digitized data. The data can be transferred to the desktop computer during recording, or it can be played back from a tape at a later time. Using the desktop computer, the operator observes results while recording data and generates hard-copy graphics in the field. Thus, the radar operator can immediately evaluate the quality of data being obtained, modify system parameters, study the radar logs before leaving the field, and rerun borehole logs if necessary. The system has proven to be reliable in the field and has increased productivity both in the field and in the laboratory.
Desktop computer graphics for RMS/payload handling flight design
NASA Technical Reports Server (NTRS)
Homan, D. J.
1984-01-01
A computer program, the Multi-Adaptive Drawings, Renderings and Similitudes (MADRAS) program, is discussed. The modeling program, written for a desktop computer system (the Hewlett-Packard 9845/C), is written in BASIC and uses modular construction of objects while generating both wire-frame and hidden-line drawings from any viewpoint. The dimensions and placement of objects are user definable. Once the hidden-line calculations are made for a particular viewpoint, the viewpoint may be rotated in pan, tilt, and roll without further hidden-line calculations. The use and results of this program are discussed.
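The viewpoint rotation described above can be illustrated with a composed pan/tilt/roll rotation applied to wire-frame vertices. This is a generic sketch rather than the MADRAS implementation; the axis conventions, angles, and vertex list are assumptions made only for illustration.

```python
# Compose pan/tilt/roll rotations and apply them to wire-frame vertices.
import numpy as np

def rotation(pan, tilt, roll):
    """Rotations about the vertical (pan), horizontal (tilt), and viewing (roll) axes; radians."""
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    cr, sr = np.cos(roll), np.sin(roll)
    r_pan  = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    r_tilt = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    r_roll = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return r_roll @ r_tilt @ r_pan

vertices = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], float)  # a unit square
rotated = vertices @ rotation(np.radians(30), np.radians(10), np.radians(5)).T
```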
A VM-shared desktop virtualization system based on OpenStack
NASA Astrophysics Data System (ADS)
Liu, Xi; Zhu, Mingfa; Xiao, Limin; Jiang, Yuanjie
2018-04-01
With the increasing popularity of cloud computing, desktop virtualization has been rising in recent years as a branch of virtualization technology. However, existing desktop virtualization systems are mostly designed around a one-to-one mode, in which one VM can be accessed by only one user. Meanwhile, previous desktop virtualization systems perform weakly in terms of response time and cost saving. This paper proposes a novel VM-shared desktop virtualization system based on the OpenStack platform. The paper modifies the connecting process and the display data transmission process of the remote display protocol SPICE to support the VM-shared function. On the other hand, we propose a server-push display mode to improve the interactive user experience. The experimental results show that our system performs well in response time and achieves low CPU consumption.
The Nimrod computational workbench: a case study in desktop metacomputing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramson, D.; Sosic, R.; Foster, I.
The coordinated use of geographically distributed computers, or metacomputing, can in principle provide more accessible and cost-effective supercomputing than conventional high-performance systems. However, we lack evidence that metacomputing systems can be made easily usable, or that there exist large numbers of applications able to exploit metacomputing resources. In this paper, we present work that addresses both these concerns. The basis for this work is a system called Nimrod that provides a desktop problem-solving environment for parametric experiments. We describe how Nimrod has been extended to support the scheduling of computational resources located in a wide-area environment, and report on an experiment in which Nimrod was used to schedule a large parametric study across the Australian Internet. The experiment provided both new scientific results and insights into Nimrod capabilities. We relate the results of this experiment to lessons learned from the I-WAY distributed computing experiment, and draw conclusions as to how Nimrod and I-WAY-like computing environments should be developed to support desktop metacomputing.
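The parametric experiments Nimrod manages amount to expanding a cross-product of parameter values into independent jobs that can be farmed out to distributed machines. The sketch below shows that expansion; the parameter names, values, and command template are invented, and the real scheduling is Nimrod's job, not this snippet's.

```python
# Expand a cross-product of parameter values into a list of independent jobs.
from itertools import product

parameters = {
    "reynolds": [1e4, 1e5, 1e6],
    "angle":    [0, 5, 10, 15],
    "mesh":     ["coarse", "fine"],
}

jobs = []
for values in product(*parameters.values()):
    settings = dict(zip(parameters.keys(), values))
    command = ("./solver --re {reynolds} --angle {angle} --mesh {mesh}"
               .format(**settings))   # hypothetical solver command line
    jobs.append(command)

print(len(jobs), "independent runs")   # 3 * 4 * 2 = 24 jobs to farm out
```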
Desktop Virtualization: Applications and Considerations
ERIC Educational Resources Information Center
Hodgman, Matthew R.
2013-01-01
As educational technology continues to rapidly become a vital part of a school district's infrastructure, desktop virtualization promises to provide cost-effective and education-enhancing solutions to school-based computer technology problems in school systems locally and abroad. This article outlines the history of and basic concepts behind…
Modeling of Heat Transfer and Ablation of Refractory Material Due to Rocket Plume Impingement
NASA Technical Reports Server (NTRS)
Harris, Michael F.; Vu, Bruce T.
2012-01-01
CR Tech's Thermal Desktop-SINDA/FLUINT software was used in the thermal analysis of a flame deflector design for Launch Complex 39B at Kennedy Space Center, Florida. The analysis of the flame deflector takes into account heat transfer due to plume impingement from expected vehicles to be launched at KSC. The heat flux from the plume was computed using computational fluid dynamics provided by Ames Research Center in Moffet Field, California. The results from the CFD solutions were mapped onto a 3-D Thermal Desktop model of the flame deflector using the boundary condition mapping capabilities in Thermal Desktop. The ablation subroutine in SINDA/FLUINT was then used to model the ablation of the refractory material.
Basics of Desktop Publishing. Teacher Edition.
ERIC Educational Resources Information Center
Beeby, Ellen
This color-coded teacher's guide contains curriculum materials designed to give students an awareness of various desktop publishing techniques before they determine their computer hardware and software needs. The guide contains six units, each of which includes some or all of the following basic components: objective sheet, suggested activities…
Desktop Virtualization in Action: Simplicity Is Power
ERIC Educational Resources Information Center
Fennell, Dustin
2010-01-01
Discover how your institution can better manage and increase access to instructional applications and desktops while providing a blended learning environment. Receive practical insight into how academic computing virtualization can be leveraged to enhance education at your institution while lowering Total Cost of Ownership (TCO) and reducing the…
ERIC Educational Resources Information Center
Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd
2008-01-01
This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD) based on the Computer Algebra System Mathematica was used for symbolical and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…
ERIC Educational Resources Information Center
Tenopir, Carol
2004-01-01
With wireless connectivity and small laptop computers, people are no longer tied to the desktop for online searching. Handheld personal digital assistants (PDAs) offer even greater portability. So far, the most common uses of PDAs are as calendars and address books, or to interface with a laptop or desktop machine. More advanced PDAs, like…
Designing Design into an Advanced Desktop Publishing Course (A Teaching Tip).
ERIC Educational Resources Information Center
Guthrie, Jim
1995-01-01
Describes an advanced desktop publishing course that combines instruction in a few advanced techniques for using software with extensive discussion of such design principles as consistency, proportion, asymmetry, appropriateness, contrast, and color. Describes computer hardware and software, class assignments, problems, and the rationale for such…
WEB-BASED MODELING OF A FERTILIZER SOLUTION SPILL IN THE OHIO RIVER
Environmental computer models are usually desktop models. Some web-enabled models are beginning to appear where the user can use a browser to run the models on a central web server. Several issues arise when a desktop model is transferred to a web architecture. This paper discuss...
RighTime: A real time clock correcting program for MS-DOS-based computer systems
NASA Technical Reports Server (NTRS)
Becker, G. Thomas
1993-01-01
A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, display of its history log, and which provide data for graphing of the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.
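The core idea of a learned software clock correction can be shown in a few lines: estimate a linear drift rate from two accurate time sets, then add the accumulated drift to later raw readings. The numbers below are invented, and the real program also handles power-off ('cool') drift and the CMOS clock, which this sketch does not attempt.

```python
# Learn a linear clock drift rate and correct later raw readings (times in seconds).
def learn_drift(raw_then, true_then, raw_now, true_now):
    """Seconds of error accumulated per second of elapsed raw time."""
    return ((true_now - raw_now) - (true_then - raw_then)) / (raw_now - raw_then)

def corrected(raw, raw_ref, true_ref, drift_rate):
    """Apply the offset known at the reference point plus drift accumulated since."""
    return raw + (true_ref - raw_ref) + drift_rate * (raw - raw_ref)

# Example: the clock gained 0.6 s over one day relative to an accurate reference,
# so a reading one further day later is pulled back by another 0.6 s.
rate = learn_drift(raw_then=0.0, true_then=0.0, raw_now=86400.0, true_now=86399.4)
print(corrected(raw=172800.0, raw_ref=86400.0, true_ref=86399.4, drift_rate=rate))
```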
A Discussion of Using a Reconfigurable Processor to Implement the Discrete Fourier Transform
NASA Technical Reports Server (NTRS)
White, Michael J.
2004-01-01
This paper presents the design and implementation of the Discrete Fourier Transform (DFT) algorithm on a reconfigurable processor system. While highly applicable to many engineering problems, the DFT is an extremely computationally intensive algorithm. Consequently, the eventual goal of this work is to enhance the execution of a floating-point precision DFT algorithm by off-loading the algorithm from the computing system. This computing system, within the context of this research, is a typical high-performance desktop computer with an array of field programmable gate arrays (FPGAs). FPGAs are hardware devices that are configured by software to execute an algorithm. If it is desired to change the algorithm, the software is changed to reflect the modification and then downloaded to the FPGA, which is then itself modified. This paper will discuss the methodology for developing the DFT algorithm to be implemented on the FPGA. We will discuss the algorithm, the FPGA code effort, and the results to date.
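For reference, the computation being off-loaded is the discrete Fourier transform X[k] = sum_n x[n] exp(-2*pi*i*k*n/N). The minimal direct evaluation below, checked against a library FFT, shows the O(N^2) arithmetic that makes hardware acceleration attractive; it is an illustration, not the FPGA implementation discussed in the paper.

```python
# Direct O(N^2) DFT via a matrix of complex exponentials, verified against numpy.fft.
import numpy as np

def dft(x):
    n_samples = len(x)
    n = np.arange(n_samples)
    k = n.reshape(-1, 1)
    return np.exp(-2j * np.pi * k * n / n_samples) @ x

x = np.random.default_rng(2).standard_normal(64)
assert np.allclose(dft(x), np.fft.fft(x))
```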
Back to the future: virtualization of the computing environment at the W. M. Keck Observatory
NASA Astrophysics Data System (ADS)
McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.
2014-07-01
Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit, which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these was the substantial power savings, which resulted in strong broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, virtualized desktops for typical office users, as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.
Generating Alternative Engineering Designs by Integrating Desktop VR with Genetic Algorithms
ERIC Educational Resources Information Center
Chandramouli, Magesh; Bertoline, Gary; Connolly, Patrick
2009-01-01
This study proposes an innovative solution to the problem of multiobjective engineering design optimization by integrating desktop VR with genetic computing. Although, this study considers the case of construction design as an example to illustrate the framework, this method can very much be extended to other engineering design problems as well.…
Meaning-Making in Online Language Learner Interactions via Desktop Videoconferencing
ERIC Educational Resources Information Center
Satar, H. Müge
2016-01-01
Online language learning and teaching in multimodal contexts has been identified as one of the key research areas in computer-assisted language learning (CALL) (Lamy, 2013; White, 2014). This paper aims to explore meaning-making in online language learner interactions via desktop videoconferencing (DVC) and in doing so illustrate multimodal transcription and…
Desktop Publishing: The Effects of Computerized Formats on Reading Speed and Comprehension.
ERIC Educational Resources Information Center
Knupfer, Nancy Nelson; McIsaac, Marina Stock
1989-01-01
Describes study that was conducted to determine the effects of two electronic text variables used in desktop publishing on undergraduate students' reading speed and comprehension. Research on text variables, graphic design, instructional text design, and computer screen design is discussed, and further studies are suggested. (22 references) (LRW)
Practical Downloading to Desktop Publishing: Enhancing the Delivery of Information.
ERIC Educational Resources Information Center
Danziger, Pamela N.
This paper is addressed to librarians and information managers who, as one of the many activities they routinely perform, frequently publish information in such formats as newsletters, manuals, brochures, forms, presentations, or reports. It is argued that desktop publishing--a personal computer-based software package used to generate documents of…
Design Analysis Kit for Optimization and Terascale Applications 6.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-19
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.
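One of the analyses such a toolkit automates, propagating input uncertainty through a black-box model by sampling, can be sketched as follows. The toy model, parameter distributions, and sample size are invented for illustration; Dakota itself drives an external simulation through its generic interface rather than a Python function.

```python
# Monte Carlo propagation of input uncertainty through a stand-in model.
import numpy as np

rng = np.random.default_rng(3)

def model(thickness, load):
    """Stand-in for an expensive simulation returning a margin of interest."""
    return 4.0 * thickness - 0.002 * load

samples = 10000
thickness = rng.normal(2.5, 0.1, samples)        # assumed uncertain inputs
load = rng.uniform(800.0, 1200.0, samples)
margin = model(thickness, load)

print(f"mean margin = {margin.mean():.2f}")
print(f"5th percentile = {np.percentile(margin, 5):.2f}")
print(f"probability of negative margin = {(margin < 0).mean():.4f}")
```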
Micromagnetics on high-performance workstation and mobile computational platforms
NASA Astrophysics Data System (ADS)
Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.
2015-05-01
The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing units, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite-element-method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of the embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.
2004-01-01
Virtual Reality has been defined in many different ways and now means different things in various contexts. VR can range from simple environments presented on a desktop computer to fully immersive multisensory environments experienced through complex headgear and bodysuits. In all of its manifestations, VR is basically a way of simulating or…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younkin, James R; Kuhn, Michael J; Gradle, Colleen
New Brunswick Laboratory (NBL) has a large inventory containing thousands of plutonium and uranium certified reference materials. The current manual inventory process is well established but is lengthy and requires significant oversight and double checking to ensure correctness. Oak Ridge National Laboratory has worked with NBL to develop and deploy a new inventory system, termed the Tagged Item Inventory System (TIIS), which utilizes handheld computers with barcode scanners and radio frequency identification (RFID) readers. Certified reference materials are identified by labels which incorporate RFID tags and barcodes. The label printing process and RFID tag association process are integrated into the main desktop software application. Software on the handheld computers syncs with software on designated desktop machines and the NBL inventory database to provide a seamless inventory process. This process includes: 1) identifying items to be inventoried, 2) downloading the current inventory information to the handheld computer, 3) using the handheld to read item and location labels, and 4) syncing the handheld computer with a designated desktop machine to analyze the results, print reports, etc. The security of this inventory software has been a major concern. Designated roles linked to authenticated logins are used to control access to the desktop software, while password protection and badge verification are used to control access to the handheld computers. The overall system design and deployment at NBL will be presented. The performance of the system will also be discussed with respect to a small piece of the overall inventory. Future work includes performing a full inventory at NBL with the Tagged Item Inventory System and comparing performance, cost, and radiation exposures to the current manual inventory process.
Virtual network computing: cross-platform remote display and collaboration software.
Konerding, D E
1999-04-01
VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits it back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.
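The client/server model described above can be caricatured in a few lines: the server side produces framebuffer updates, the client displays them and returns input events, and the server applies those events. The single-process toy below is only a conceptual illustration of that loop, not the actual RFB protocol or any part of VNC.

```python
# Toy illustration of a remote-display update loop (conceptual, not the RFB protocol).
import numpy as np

class Server:
    def __init__(self):
        self.framebuffer = np.zeros((480, 640, 3), dtype=np.uint8)
        self.cursor = [0, 0]

    def get_update(self):
        """Return the current desktop image (a real server sends only changed regions)."""
        return self.framebuffer.copy()

    def apply_input(self, event):
        """Apply a pointer or key event received from the client."""
        if event["type"] == "pointer":
            self.cursor = [event["x"], event["y"]]

class Client:
    def display(self, frame):
        print(f"displaying {frame.shape[1]}x{frame.shape[0]} frame")

    def poll_input(self):
        return {"type": "pointer", "x": 100, "y": 200}   # stand-in user input

server, client = Server(), Client()
for _ in range(3):                     # three iterations of the update loop
    client.display(server.get_update())
    server.apply_input(client.poll_input())
print("cursor now at", server.cursor)
```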
Modems and More: The Computer Branches Out.
ERIC Educational Resources Information Center
Dyrli, Odvard Egil
1986-01-01
Surveys new "peripherals," electronic devices that attach to computers. Devices such as videodisc players, desktop laser printers, large screen projectors, and input mechanisms that circumvent the keyboard dramatically expand the computer's instructional uses. (Author/LHW)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-20
... DoD published a proposed rule in the Federal Register at 76 FR 21847 on April 19, 2011, to add DFARS..., the contract line item may be for a desktop computer, but the actual items delivered, invoiced, and..., Desktop with 20 EA CPU, Monitor, Keyboard and Mouse. Alternative line-item structure offer where monitors...
Digital Dome versus Desktop Display: Learning Outcome Assessments by Domain Experts
ERIC Educational Resources Information Center
Jacobson, Jeffery
2013-01-01
In previous publications, the author reported that students learned about Egyptian architecture and society by playing an educational game based on a virtual representation of a temple. Students played the game in a digital dome or on a standard desktop computer, and (each) then recorded a video tour of the temple. Those who had used the dome…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-07
....usda.gov . SUPPLEMENTARY INFORMATION: A. Background A proposed rule was published in the Federal.... Computers or other technical equipment means central processing units, laptops, desktops, computer mouses...
Creating a Parallel Version of VisIt for Microsoft Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlock, B J; Biagas, K S; Rawson, P L
2011-12-07
VisIt is a popular, free interactive parallel visualization and analysis tool for scientific data. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images or movies for presentations. VisIt was designed from the ground up to work on many scales of computers from modest desktops up to massively parallel clusters. VisIt comprises a set of cooperating programs. All programs can be run locally or in client/server mode in which some run locally and some run remotely on compute clusters. The VisIt program most able to harness today's computing power is the VisIt compute engine. The compute engine is responsible for reading simulation data from disk, processing it, and sending results or images back to the VisIt viewer program. In a parallel environment, the compute engine runs several processes, coordinating using the Message Passing Interface (MPI) library. Each MPI process reads some subset of the scientific data and filters the data in various ways to create useful visualizations. By using MPI, VisIt has been able to scale well into the thousands of processors on large computers such as dawn and graph at LLNL. The advent of multicore CPUs has made parallelism the 'new' way to achieve increasing performance. With today's computers having at least 2 cores and in many cases up to 8 and beyond, it is more important than ever to deploy parallel software that can use that computing power not only on clusters but also on the desktop. We have created a parallel version of VisIt for Windows that uses Microsoft's MPI implementation (MSMPI) to process data in parallel on the Windows desktop as well as on a Windows HPC cluster running Microsoft Windows Server 2008. Initial desktop parallel support for Windows was deployed in VisIt 2.4.0. Windows HPC cluster support has been completed and will appear in the VisIt 2.5.0 release. We plan to continue supporting parallel VisIt on Windows so our users will be able to take full advantage of their multicore resources.
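The MPI pattern the compute engine relies on, in which each rank handles its own subset of the data and a reduction combines the results, can be sketched with the mpi4py package. The array below is an invented stand-in for simulation data, and the snippet is a generic illustration rather than VisIt code; it would be launched with something like mpiexec -n 4 python script.py.

```python
# Each MPI rank processes its own slice of the data; a reduction combines results.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n_cells = 1_000_000
# Partition the global index range evenly across ranks.
start = rank * n_cells // size
stop = (rank + 1) * n_cells // size
local = np.sin(np.arange(start, stop, dtype=float))   # this rank's subset

# Each rank filters its own piece, then a reduction combines the results.
local_max = local.max()
global_max = comm.allreduce(local_max, op=MPI.MAX)
if rank == 0:
    print(f"{size} ranks, global max = {global_max:.4f}")
```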
An Inverse Modeling Plugin for HydroDesktop using the Method of Anchored Distributions (MAD)
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio, C.; Over, M. W.; Rubin, Y.
2011-12-01
The CUAHSI Hydrologic Information System (HIS) software stack is based on an open and extensible architecture that facilitates the addition of new functions and capabilities at both the server side (using HydroServer) and the client side (using HydroDesktop). The HydroDesktop client plugin architecture is used here to expose a new scripting based plugin that makes use of the R statistics software as a means for conducting inverse modeling using the Method of Anchored Distributions (MAD). MAD is a Bayesian inversion technique for conditioning computational model parameters on relevant field observations yielding probabilistic distributions of the model parameters, related to the spatial random variable of interest, by assimilating multi-type and multi-scale data. The implementation of a desktop software tool for using the MAD technique is expected to significantly lower the barrier to use of inverse modeling in education, research, and resource management. The HydroDesktop MAD plugin is being developed following a community-based, open-source approach that will help both its adoption and long term sustainability as a user tool. This presentation will briefly introduce MAD, HydroDesktop, and the MAD plugin and software development effort.
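As a deliberately simple illustration of Bayesian inversion in the spirit of what the plugin supports, and not the MAD algorithm itself, the sketch below conditions a single model parameter on noisy observations by evaluating prior times likelihood on a grid. The forward model, prior, observations, and noise level are all invented.

```python
# Grid-based Bayesian inversion of one parameter against noisy observations.
import numpy as np

def forward(k):
    """Stand-in forward model: predicted drawdown as a function of log-conductivity."""
    return np.array([1.2, 0.8, 0.5]) * np.exp(-k)

observed = np.array([0.45, 0.30, 0.18])   # hypothetical field observations
noise_sd = 0.05

k_grid = np.linspace(-1.0, 3.0, 401)                  # candidate parameter values
prior = np.exp(-0.5 * ((k_grid - 1.0) / 1.0) ** 2)    # Gaussian prior, mean 1, sd 1

misfit = np.array([np.sum((forward(k) - observed) ** 2) for k in k_grid])
likelihood = np.exp(-0.5 * misfit / noise_sd**2)
posterior = prior * likelihood
posterior /= np.trapz(posterior, k_grid)              # normalize on the grid

k_map = k_grid[np.argmax(posterior)]
print(f"posterior mode of log-conductivity ~ {k_map:.2f}")
```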
Bufton, Marcia J; Marklin, Richard W; Nagurka, Mark L; Simoneau, Guy G
2006-08-15
This study aimed to compare and analyse rubber-dome desktop, spring-column desktop and notebook keyboards in terms of key stiffness and fingertip typing force. The spring-column keyboard resulted in the highest mean peak contact force (0.86 N), followed by the rubber dome desktop (0.68 N) and the notebook (0.59 N). All these differences were statistically significant. Likewise, the spring-column keyboard registered the highest fingertip typing force and the notebook keyboard the lowest. A comparison of forces showed the notebook (rubber dome) keyboard had the highest fingertip-to-peak contact force ratio (overstrike force), and the spring-column generated the least excess force (as a ratio of peak contact force). The results of this study could aid in optimizing computer key design that could possibly reduce subject discomfort and fatigue.
A Platform-Independent Plugin for Navigating Online Radiology Cases.
Balkman, Jason D; Awan, Omer A
2016-06-01
Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.
Almeida, Jonas S.; Iriabho, Egiebade E.; Gorrepati, Vijaya L.; Wilkinson, Sean R.; Grüneberg, Alexander; Robbins, David E.; Hackney, James R.
2012-01-01
Background: Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. Materials and Methods: ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Results: Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. Conclusions: The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local “download and installation”. PMID:22934238
76 FR 70861 - Promoting Efficient Spending
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... heads to take even more aggressive steps to ensure the Government is a good steward of taxpayer money...., mobile phones, smartphones, desktop and laptop computers, and tablet personal computers) issued to...
The Printout: Computers and Reading in the United Kingdom.
ERIC Educational Resources Information Center
Ewing, James M.
1988-01-01
Offers an overview of some reading and language arts computer projects in the United Kingdom, including language teaching and intelligent knowledge-based systems, assessment of written style by computer, and desktop publishing in the primary school. (ARH)
Computational biomedicine: a challenge for the twenty-first century.
Coveney, Peter V; Shublaq, Nour W
2012-01-01
With the relentless increase of computer power and the widespread availability of digital patient-specific medical data, we are now entering an era when it is becoming possible to develop predictive models of human disease and pathology, which can be used to support and enhance clinical decision-making. The approach amounts to a grand challenge to computational science insofar as we need to be able to provide seamless yet secure access to large scale heterogeneous personal healthcare data in a facile way, typically integrated into complex workflows (some parts of which may need to be run on high performance computers) and into clinical decision support software. In this paper, we review the state of the art in terms of case studies drawn from neurovascular pathologies and HIV/AIDS. These studies are representative of a large number of projects currently being performed within the Virtual Physiological Human initiative. They make demands of information technology at many scales, from the desktop to national and international infrastructures for data storage and processing, linked by high performance networks.
Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria
2018-03-01
Recently, computer, mobile phone and Internet use has increased. This study aimed to determine the possible relation between self-reported wrist and finger symptoms (aches, pain or numbness) and using computers/mobile phones, and to analyze how the symptoms are specifically associated with utilizing desktop computers, portable computers or mini-computers and mobile phones. A questionnaire was sent to 15,000 working-age Finns (age 18-65). Among the responses, 723 persons reported wrist and finger symptoms often or more frequently with device use. Over 80% of them used mobile phones daily and less than 30% used desktop computers or the Internet daily at leisure; 89.8% quite often or often experienced pain, numbness or aches in the neck, and 61.3% had aches in the hips and the lower back. Only 33.7% connected their symptoms to computer use. In the future, the development of new devices and Internet services should incorporate the ergonomics of the hands and wrists.
Acceleration of FDTD mode solver by high-performance computing techniques.
Han, Lin; Xi, Yanping; Huang, Wei-Ping
2010-06-21
A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against the benchmark finite-difference (FD) eigen mode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that the high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than 30 times improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as that of the standard finite-difference eigen mode solver, yet it requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides even when the conventional eigen value mode solvers are no longer applicable due to memory limitation.
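The core of any FDTD solver is a local stencil update, which is what makes it map well to GPUs. The sketch below is a plain NumPy time-stepping loop for the 2D scalar wave equation; the grid size, Courant number, and point excitation are arbitrary illustrative choices, and this is not the paper's compact mode solver or its matrix pencil post-processing.

```python
# Illustrative 2D scalar wave-equation FDTD update in NumPy. The same stencil
# structure is what a CUDA implementation parallelizes; grid size, time step,
# and source are arbitrary choices, and boundaries are periodic via np.roll.
import numpy as np

nx = ny = 200
dx, c = 1.0, 1.0
dt = 0.5 * dx / c                      # below the 2D stability limit dx / (c * sqrt(2))
coeff = (c * dt / dx) ** 2

u_prev = np.zeros((nx, ny))
u_curr = np.zeros((nx, ny))
u_curr[nx // 2, ny // 2] = 1.0         # point excitation at the grid centre

for _ in range(500):
    lap = (np.roll(u_curr, 1, 0) + np.roll(u_curr, -1, 0)
           + np.roll(u_curr, 1, 1) + np.roll(u_curr, -1, 1) - 4.0 * u_curr)
    u_next = 2.0 * u_curr - u_prev + coeff * lap
    u_prev, u_curr = u_curr, u_next

print("peak field after 500 steps:", float(np.abs(u_curr).max()))
```

Because the update touches only nearest neighbours, porting such a loop to a GPU leaves the stencil unchanged, which is one reason FDTD codes tend to accelerate well on graphics hardware.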
Desktop supercomputer: what can it do?
NASA Astrophysics Data System (ADS)
Bogdanov, A.; Degtyarev, A.; Korkhov, V.
2017-12-01
The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available for most researchers nowadays. Efficient distribution of high performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to build a virtual private supercomputer available at user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on light-weight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.
A virtual computer lab for distance biomedical technology education.
Locatis, Craig; Vega, Anibal; Bhagwat, Medha; Liu, Wei-Li; Conde, Jose
2008-03-13
The National Library of Medicine's National Center for Biotechnology Information offers mini-courses which entail applying concepts in biochemistry and genetics to search genomics databases and other information sources. They are highly interactive and involve use of 3D molecular visualization software that can be computationally taxing. Methods were devised to offer the courses at a distance so as to provide as much functionality of a computer lab as possible, the venue where they are normally taught. The methods, which can be employed with varied videoconferencing technology and desktop sharing software, were used to deliver mini-courses at a distance in pilot applications where students could see demonstrations by the instructor and the instructor could observe and interact with students working at their remote desktops. Student ratings of the learning experience and comments to open ended questions were similar to those when the courses are offered face to face. The real time interaction and the instructor's ability to access student desktops from a distance in order to provide individual assistance and feedback were considered invaluable. The technologies and methods mimic much of the functionality of computer labs and may be usefully applied in any context where content changes frequently, training needs to be offered on complex computer applications at a distance in real time, and where it is necessary for the instructor to monitor students as they work.
Operation of the HP2250 with the HP9000 series 200 using PASCAL 3.0
NASA Technical Reports Server (NTRS)
Perry, John; Stroud, C. W.
1986-01-01
A computer program has been written to provide an interface between the HP Series 200 desktop computers, operating under HP Standard Pascal 3.0, and the HP2250 Data Acquisition and Control System. Pascal 3.0 for the HP9000 desktop computer gives a number of procedures for handling bus communication at various levels. It is necessary, however, to reach the lowest possible level in Pascal to handle the bus protocols required by the HP2250. This makes programming extremely complex since these protocols are not documented. The program described solves those problems and allows the user to immediately program, simply and efficiently, any measurement and control language (MCL/50) application with a few procedure calls. The complete set of procedures is available on a 5 1/4 inch diskette from Cosmic. Included in this group of procedures is an Exerciser which allows the user to exercise his HP2250 interactively. The exerciser operates in a fashion similar to the Series 200 operating system programs, but is adapted to the requirements of the HP2250. The programs on the diskette and the user's manual assume the user is acquainted with both the MCL/50 programming language and HP Standard Pascal 3.0 for the HP series 200 desktop computers.
Comfort with Computers in the Library.
ERIC Educational Resources Information Center
Agati, Joseph
2002-01-01
Sets forth a list of do's and don't's when integrating aesthetics, functionality, and technology into college library computer workstation furniture. The article discusses workstation access for both portable computer users and for staff, whose needs involve desktop computers that are possibly networked with printers and other peripherals. (GR)
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2012 CFR
2012-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2014 CFR
2014-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2013 CFR
2013-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2011 CFR
2011-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2010 CFR
2010-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
Temporal and spatial organization of doctors' computer usage in a UK hospital department.
Martins, H M G; Nightingale, P; Jones, M R
2005-06-01
This paper describes the use of an application accessible via distributed desktop computing and wireless mobile devices in a specialist department of a UK acute hospital. Data (application logs, in-depth interviews, and ethnographic observation) were simultaneously collected to study doctors' work via this application, when and where they accessed different areas of it, and from what computing devices. These show that the application is widely used, but in significantly different ways over time and space. For example, physicians and surgeons differ in how they use the application and in their choice of mobile or desktop computing. Consultants and junior doctors in the same teams also seem to access different sources of patient information, at different times, and from different locations. Mobile technology was used almost exclusively during the morning by groups of clinicians, predominantly for ward rounds.
Align and conquer: moving toward plug-and-play color imaging
NASA Astrophysics Data System (ADS)
Lee, Ho J.
1996-03-01
The rapid evolution of the low-cost color printing and image capture markets has precipitated a huge increase in the use of color imagery by casual end users on desktop systems, as opposed to traditional professional color users working with specialized equipment. While the cost of color equipment and software has decreased dramatically, the underlying system-level problems associated with color reproduction have remained the same, and in many cases are more difficult to address in a casual environment than in a professional setting. The proliferation of color imaging technologies so far has resulted in a wide availability of component solutions which work together poorly. A similar situation in the desktop computing market has led to the various 'Plug-and-Play' standards, which provide a degree of interoperability between a range of products on disparate computing platforms. This presentation will discuss some of the underlying issues and emerging trends in the desktop and consumer digital color imaging markets.
Streamlined, Inexpensive 3D Printing of the Brain and Skull.
Naftulin, Jason S; Kimchi, Eyal Y; Cash, Sydney S
2015-01-01
Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.
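The DICOM-to-STL conversion is the step most readers will want to automate. The sketch below shows one hedged way to do it in Python with pydicom, scikit-image marching cubes, and numpy-stl; the input directory, threshold value, and output name are placeholders, and the authors' own open-source pipeline may use different tools and settings.

```python
# Hedged sketch of converting a CT series (DICOM) into an STL surface for 3D
# printing, using pydicom, scikit-image, and numpy-stl. Paths and the
# threshold are placeholders; real pipelines also apply HU rescaling,
# cropping, and smoothing before printing.
from pathlib import Path

import numpy as np
import pydicom
from skimage import measure
from stl import mesh

slice_paths = sorted(
    Path("ct_series").glob("*.dcm"),                        # hypothetical folder
    key=lambda p: float(pydicom.dcmread(p).ImagePositionPatient[2]),
)
volume = np.stack([pydicom.dcmread(p).pixel_array for p in slice_paths], axis=0)

# Extract an isosurface near bone intensity (placeholder level) with marching cubes.
verts, faces, _, _ = measure.marching_cubes(volume.astype(np.float32), level=300.0)

surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
surface.vectors[:] = verts[faces]                            # one triangle per face
surface.save("skull.stl")                                    # then slice to gcode
```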
Assessment of drug information resource preferences of pharmacy students and faculty
Hanrahan, Conor T.; Cole, Sabrina W.
2014-01-01
A 39-item survey instrument was distributed to faculty and students at Wingate University School of Pharmacy to assess student and faculty drug information (DI) resource use and access preferences. The response rate was 81% (n = 289). Faculty and professional year 2 to 4 students preferred access on laptop or desktop computers (67% and 75%, respectively), followed by smartphones (27% and 22%, respectively). Most faculty and students preferred using Lexicomp Online for drug information (53% and 74%, respectively). Results indicate that DI resource use is similar between students and faculty; laptop or desktop computers are the preferred platforms for accessing drug information. PMID:24860270
An Analysis of Scalable GPU-Based Ray-Guided Volume Rendering
Fogal, Thomas; Schiewe, Alexander; Krüger, Jens
2014-01-01
Volume rendering continues to be a critical method for analyzing large-scale scalar fields, in disciplines as diverse as biomedical engineering and computational fluid dynamics. Commodity desktop hardware has struggled to keep pace with data size increases, challenging modern visualization software to deliver responsive interactions for O(N^3) algorithms such as volume rendering. We target the data type common in these domains: regularly-structured data. In this work, we demonstrate that the major limitation of most volume rendering approaches is their inability to switch the data sampling rate (and thus data size) quickly. Using a volume renderer inspired by recent work, we demonstrate that the actual amount of visualizable data for a scene is typically bounded considerably lower than the memory available on a commodity GPU. Our instrumented renderer is used to investigate design decisions typically swept under the rug in volume rendering literature. The renderer is freely available, with binaries for all major platforms as well as full source code, to encourage reproduction and comparison with future research. PMID:25506079
Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.
2015-12-01
Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to or the know-how to take advantage of advanced high performance computing (HPC) capability? Web based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and web based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models and facilitates the management of online data and execution of models on HPC systems. This presentation will illustrate the use of web based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions that are executed completely in the cloud, on HPC resources using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples for how this approach can be extended to other models to enhance the use of web and data services in the geosciences.
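The abstract above mentions a Python client library for driving HydroShare's web services from a desktop script. The hedged sketch below uses the hs_restclient package to upload a local file as a new resource and list a few existing ones; the credentials and file name are placeholders, the metadata keys are assumed from typical client behaviour and may vary by version, and the TauDEM web-service calls themselves are not shown.

```python
# Hedged sketch of scripting HydroShare from the desktop with hs_restclient.
# Credentials and the local file are placeholders; metadata field names are
# assumptions and may differ between client versions.
from hs_restclient import HydroShare, HydroShareAuthBasic

auth = HydroShareAuthBasic(username="your_user", password="your_password")
hs = HydroShare(auth=auth)

# Upload a local DEM as a new resource that cloud-side tools can then process.
resource_id = hs.createResource(
    "GenericResource",
    "Example DEM for watershed delineation",
    resource_file="dem.tif",                 # placeholder local file
)
print("created resource:", resource_id)

# List a handful of resources visible to this account.
for i, res in enumerate(hs.resources()):
    if i >= 5:
        break
    print(res.get("resource_id"), res.get("resource_title"))
```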
Optimizing ion channel models using a parallel genetic algorithm on graphical processors.
Ben-Shalom, Roy; Aviv, Amit; Razon, Benjamin; Korngreen, Alon
2012-01-01
We have recently shown that we can semi-automatically constrain models of voltage-gated ion channels by combining a stochastic search algorithm with ionic currents measured using multiple voltage-clamp protocols. Although numerically successful, this approach is highly demanding computationally, with optimization on a high performance Linux cluster typically lasting several days. To solve this computational bottleneck we converted our optimization algorithm for work on a graphical processing unit (GPU) using NVIDIA's CUDA. Parallelizing the process on a Fermi graphic computing engine from NVIDIA increased the speed ∼180 times over an application running on an 80 node Linux cluster, considerably reducing simulation times. This application allows users to optimize models for ion channel kinetics on a single, inexpensive, desktop "super computer," greatly reducing the time and cost of building models relevant to neuronal physiology. We also demonstrate that the point of algorithm parallelization is crucial to its performance. We substantially reduced computing time by solving the ODEs (Ordinary Differential Equations) so as to massively reduce memory transfers to and from the GPU. This approach may be applied to speed up other data intensive applications requiring iterative solutions of ODEs. Copyright © 2012 Elsevier B.V. All rights reserved.
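To make the stochastic-search idea concrete, the toy sketch below fits two parameters of a Boltzmann activation curve to synthetic "recorded" data with a simple genetic algorithm in NumPy. It is serial and illustrative only; the population size, mutation scale, and activation model are assumptions, and the authors' optimizer runs full kinetic ODE models in parallel on a GPU with CUDA.

```python
# Toy genetic algorithm fitting a two-parameter Boltzmann activation curve to
# synthetic data, illustrating the stochastic search described above. Serial
# NumPy only; not the authors' CUDA implementation or their ODE-based models.
import numpy as np

rng = np.random.default_rng(0)
v = np.linspace(-80.0, 40.0, 60)                         # command voltages (mV)
true_vhalf, true_k = -25.0, 9.0
target = 1.0 / (1.0 + np.exp(-(v - true_vhalf) / true_k))   # "measured" curve

def cost(pop):
    vhalf, k = pop[:, 0:1], pop[:, 1:2]
    model = 1.0 / (1.0 + np.exp(-(v - vhalf) / k))
    return ((model - target) ** 2).sum(axis=1)

pop = np.column_stack([rng.uniform(-60, 0, 200), rng.uniform(1, 20, 200)])
for _ in range(100):
    keep = pop[np.argsort(cost(pop))[:50]]                # elitist selection
    children = keep[rng.integers(0, 50, 150)] + rng.normal(0.0, 0.5, (150, 2))
    children[:, 1] = np.clip(children[:, 1], 0.5, None)   # keep slope factor positive
    pop = np.vstack([keep, children])

best = pop[np.argmin(cost(pop))]
print("estimated V1/2 = %.1f mV, slope k = %.1f mV" % (best[0], best[1]))
```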
Bruno Garza, J L; Young, J G
2015-01-01
Extended use of conventional computer input devices is associated with negative musculoskeletal outcomes. While many alternative designs have been proposed, it is unclear whether these devices reduce biomechanical loading and musculoskeletal outcomes. The objective was to review studies describing and evaluating the biomechanical loading and musculoskeletal outcomes associated with conventional and alternative input devices. Included studies evaluated biomechanical loading and/or musculoskeletal outcomes of users' distal or proximal upper extremity regions associated with the operation of alternative input devices (pointing devices, mice, other devices) that could be used in a desktop personal computing environment during typical office work. Some alternative pointing device designs (e.g. rollerbar) were consistently associated with decreased biomechanical loading while other designs had inconsistent results across studies. Most alternative keyboards evaluated in the literature reduce biomechanical loading and musculoskeletal outcomes. Studies of other input devices (e.g. touchscreen and gestural controls) were rare; however, those reported to date indicate that these devices are currently unsuitable as replacements for traditional devices. Alternative input devices that reduce biomechanical loading may be better choices for preventing or alleviating musculoskeletal outcomes during computer use; however, it is unclear whether many existing designs are effective.
Anandakrishnan, Ramu; Scogland, Tom R. W.; Fenley, Andrew T.; Gordon, John C.; Feng, Wu-chun; Onufriev, Alexey V.
2010-01-01
Tools that compute and visualize biomolecular electrostatic surface potential have been used extensively for studying biomolecular function. However, determining the surface potential for large biomolecules on a typical desktop computer can take days or longer using currently available tools and methods. Two commonly used techniques to speed up these types of electrostatic computations are approximations based on multi-scale coarse-graining and parallelization across multiple processors. This paper demonstrates that for the computation of electrostatic surface potential, these two techniques can be combined to deliver significantly greater speed-up than either one separately, something that is in general not always possible. Specifically, the electrostatic potential computation, using an analytical linearized Poisson Boltzmann (ALPB) method, is approximated using the hierarchical charge partitioning (HCP) multiscale method, and parallelized on an ATI Radeon 4870 graphical processing unit (GPU). The implementation delivers a combined 934-fold speed-up for a 476,040 atom viral capsid, compared to an equivalent non-parallel implementation on an Intel E6550 CPU without the approximation. This speed-up is significantly greater than the 42-fold speed-up for the HCP approximation alone or the 182-fold speed-up for the GPU alone. PMID:20452792
Cloud BioLinux: pre-configured and on-demand bioinformatics computing for the genomics community.
Krampis, Konstantinos; Booth, Tim; Chapman, Brad; Tiwari, Bela; Bicak, Mesude; Field, Dawn; Nelson, Karen E
2012-03-19
A steep drop in the cost of next-generation sequencing during recent years has made the technology affordable to the majority of researchers, but downstream bioinformatic analysis still poses a resource bottleneck for smaller laboratories and institutes that do not have access to substantial computational resources. Sequencing instruments are typically bundled with only the minimal processing and storage capacity required for data capture during sequencing runs. Given the scale of sequence datasets, scientific value cannot be obtained from acquiring a sequencer unless it is accompanied by an equal investment in informatics infrastructure. Cloud BioLinux is a publicly accessible Virtual Machine (VM) that enables scientists to quickly provision on-demand infrastructures for high-performance bioinformatics computing using cloud platforms. Users have instant access to a range of pre-configured command line and graphical software applications, including a full-featured desktop interface, documentation and over 135 bioinformatics packages for applications including sequence alignment, clustering, assembly, display, editing, and phylogeny. Each tool's functionality is fully described in the documentation directly accessible from the graphical interface of the VM. Besides the Amazon EC2 cloud, we have started instances of Cloud BioLinux on a private Eucalyptus cloud installed at the J. Craig Venter Institute, and demonstrated access to the bioinformatic tools interface through a remote connection to EC2 instances from a local desktop computer. Documentation for using Cloud BioLinux on EC2 is available from our project website, while a Eucalyptus cloud image and VirtualBox Appliance is also publicly available for download and use by researchers with access to private clouds. Cloud BioLinux provides a platform for developing bioinformatics infrastructures on the cloud. An automated and configurable process builds Virtual Machines, allowing the development of highly customized versions from a shared code base. This shared community toolkit enables application specific analysis platforms on the cloud by minimizing the effort required to prepare and maintain them.
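For readers who want to try the Amazon EC2 route described above, the hedged sketch below launches an instance with the boto3 library. The AMI ID, key pair, region, and instance type are placeholders to be replaced with the Cloud BioLinux image and settings valid for your own account; the project website documents the actual images.

```python
# Hedged sketch of launching a Cloud BioLinux-style VM on Amazon EC2 with
# boto3. All identifiers below are placeholders; use the image ID and key
# pair appropriate for your account and region.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",     # placeholder for the Cloud BioLinux AMI
    InstanceType="m5.xlarge",
    KeyName="my-keypair",                # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)

instance = instances[0]
instance.wait_until_running()
instance.reload()
print("connect with: ssh ubuntu@" + instance.public_dns_name)
```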
76 FR 43278 - Privacy Act; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... computer (PC). The Security Management Officer's office remains locked when not in use. RETENTION AND... records to include names, addresses, social security numbers, service computation dates, leave usage data... that resides on a desktop computer. RETRIEVABILITY: Records maintained in file folders are indexed and...
Anderson, Karen L; Goldstein, Howard
2004-04-01
Children typically learn in classroom environments that have background noise and reverberation that interfere with accurate speech perception. Amplification technology can enhance the speech perception of students who are hard of hearing. This study used a single-subject alternating treatments design to compare the speech recognition abilities of children who are hard of hearing when they were using hearing aids with each of three frequency modulated (FM) or infrared devices. Eight 9-12-year-olds with mild to severe hearing loss repeated Hearing in Noise Test (HINT) sentence lists under controlled conditions in a typical kindergarten classroom with a background noise level of +10 dB signal-to-noise (S/N) ratio and 1.1 s reverberation time. Participants listened to HINT lists using hearing aids alone and hearing aids in combination with three types of S/N-enhancing devices that are currently used in mainstream classrooms: (a) FM systems linked to personal hearing aids, (b) infrared sound field systems with speakers placed throughout the classroom, and (c) desktop personal sound field FM systems. The infrared ceiling sound field system did not provide benefit beyond that provided by hearing aids alone. Desktop and personal FM systems in combination with personal hearing aids provided substantial improvements in speech recognition. This information can assist in making S/N-enhancing device decisions for students using hearing aids. In a reverberant and noisy classroom setting, classroom sound field devices are not beneficial to speech perception for students with hearing aids, whereas either personal FM or desktop sound field systems provide listening benefits.
Biologically inspired collision avoidance system for unmanned vehicles
NASA Astrophysics Data System (ADS)
Ortiz, Fernando E.; Graham, Brett; Spagnoli, Kyle; Kelmelis, Eric J.
2009-05-01
In this project, we collaborate with researchers in the neuroscience department at the University of Delaware to develop a Field Programmable Gate Array (FPGA)-based embedded computer, inspired by the brains of small vertebrates (fish). The mechanisms of object detection and avoidance in fish have been extensively studied by our Delaware collaborators. The midbrain optic tectum is a biological multimodal navigation controller capable of processing input from all senses that convey spatial information, including vision, audition, touch, and lateral-line (water current sensing in fish). Unfortunately, computational complexity makes these models too slow for use in real-time applications. These simulations are run offline on state-of-the-art desktop computers, presenting a gap between the application and the target platform: a low-power embedded device. EM Photonics has expertise in the development of high-performance computers based on commodity platforms such as graphics cards (GPUs) and FPGAs. FPGAs offer (1) high computational power, low power consumption and small footprint (in line with typical autonomous vehicle constraints), and (2) the ability to implement massively-parallel computational architectures, which can be leveraged to closely emulate biological systems. Combining UD's brain modeling algorithms and the power of FPGAs, this computer enables autonomous navigation in complex environments, and further types of onboard neural processing in future applications.
Template Interfaces for Agile Parallel Data-Intensive Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilberto Z.
Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
Addressing Small Computers in the First OS Course
ERIC Educational Resources Information Center
Nutt, Gary
2006-01-01
Small computers are emerging as important components of the contemporary computing scene. Their operating systems vary from specialized software for an embedded system to the same style of OS used on a generic desktop or server computer. This article describes a course in which systems are classified by their hardware capability and the…
75 FR 32915 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-10
... used to authenticate authorized desktop and laptop computer users. Computer servers are scanned monthly... data is also used for management and statistical reports and studies. Routine uses of records... duties. The computer files are password protected with access restricted to authorized users. Records are...
Code of Federal Regulations, 2012 CFR
2012-01-01
..., or stored by electronic means. E-mail means a document created or received on a computer network for... conduct of the business of a regulated entity or the Office of Finance (which business, in the case of the... is stored or located, including network servers, desktop or laptop computers and handheld computers...
Code of Federal Regulations, 2013 CFR
2013-01-01
..., or stored by electronic means. E-mail means a document created or received on a computer network for... conduct of the business of a regulated entity or the Office of Finance (which business, in the case of the... is stored or located, including network servers, desktop or laptop computers and handheld computers...
The Natural Link between Teaching History and Computer Skills.
ERIC Educational Resources Information Center
Farnworth, George M.
1992-01-01
Suggests that, because both history and computers are information based, there is an natural link between the two. Argues that history teachers should exploit the technology to help students to understand history while they become computer literate. Points out uses for databases, word processing, desktop publishing, and telecommunications in…
NASA Technical Reports Server (NTRS)
Peabody, Hume L.
2017-01-01
This presentation is meant to be an overview of the model building process. It is based on typical techniques (Monte Carlo Ray Tracing for radiation exchange, Lumped Parameter, Finite Difference for thermal solution) used by the aerospace industry. This is not intended to be a "How to Use ThermalDesktop" course. It is intended to be a "How to Build Thermal Models" course and the techniques will be demonstrated using the capabilities of ThermalDesktop (TD). Other codes may or may not have similar capabilities. The General Model Building Process can be broken into four top level steps: 1. Build Model; 2. Check Model; 3. Execute Model; 4. Verify Results.
Dynamic provisioning of local and remote compute resources with OpenStack
NASA Astrophysics Data System (ADS)
Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.
2015-12-01
Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
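A minimal version of the provisioning step described above can be written against the openstacksdk Python library, as sketched below. The cloud name, image, flavor, and network are placeholders, and the institute's actual integration of these virtual machines with its batch system and HEP software environment is not shown.

```python
# Hedged sketch of provisioning one worker VM on an OpenStack cloud with
# openstacksdk. Names are placeholders; authentication comes from an entry
# in the local clouds.yaml configuration file.
import openstack

conn = openstack.connect(cloud="institute-cloud")          # placeholder cloud name

image = conn.compute.find_image("worker-node-image")        # placeholder image
flavor = conn.compute.find_flavor("m1.large")                # placeholder flavor
network = conn.network.find_network("private")               # placeholder network

server = conn.compute.create_server(
    name="mc-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print("worker ready at", server.access_ipv4)
```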
An Ecological Framework for Cancer Communication: Implications for Research
Intille, Stephen S; Zabinski, Marion F
2005-01-01
The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614
Cabbage, Kathryn; Brinkley, Shara; Gray, Shelley; Alt, Mary; Cowan, Nelson; Green, Samuel; Kuo, Trudy; Hogan, Tiffany P
2017-06-12
The Comprehensive Assessment Battery for Children - Working Memory (CABC-WM) is a computer-based battery designed to assess different components of working memory in young school-age children. Working memory deficits have been identified in children with language-based learning disabilities, including dyslexia1,2 and language impairment3,4, but it is not clear whether these children exhibit deficits in subcomponents of working memory, such as visuospatial or phonological working memory. The CABC-WM is administered on a desktop computer with a touchscreen interface and was specifically developed to be engaging and motivating for children. Although the long-term goal of the CABC-WM is to provide individualized working memory profiles in children, the present study focuses on the initial success and utility of the CABC-WM for measuring central executive, visuospatial, phonological loop, and binding constructs in children with typical development. Immediate next steps are to administer the CABC-WM to children with specific language impairment, dyslexia, and comorbid specific language impairment and dyslexia.
A hardware-in-the-loop simulation program for ground-based radar
NASA Astrophysics Data System (ADS)
Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna
2011-06-01
A radar system created using an embedded computer system needs testing. The way to test an embedded computer system is different from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the outputs of the radar. Often, not all of the building blocks of the radar system are available to test. This will require the engineer to test parts of the radar system using a "black box" approach. A common way to test software code on a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations. The outputs are compared against the values that are expected. This requires the engineer to use valid test scenarios. We will present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real-time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows a rapid simulated evaluation of ground-based radar performance in a laboratory environment.
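The closed-loop idea in the abstract (feed the unit under test synthetic inputs, compare its outputs against the scenario truth) can be sketched in a few lines. Everything below is hypothetical: the tracker stub stands in for the embedded radar software, and in a real hardware-in-the-loop setup that call would be replaced by real-time I/O to the embedded system.

```python
# Hypothetical sketch of a hardware-in-the-loop style test loop: generate a
# scenario, feed noisy measurements to the unit under test, and check its
# output against truth. The tracker stub is a stand-in for embedded software.
import math

def tracker_stub(measured_range):
    """Placeholder unit under test: a trivial exponential smoothing filter."""
    tracker_stub.state = 0.8 * tracker_stub.state + 0.2 * measured_range
    return tracker_stub.state
tracker_stub.state = 0.0

def truth_range(t):
    """Scenario generator: a target closing at 50 m/s from 10 km."""
    return 10_000.0 - 50.0 * t

dt, failures = 0.1, 0
for step in range(1000):
    t = step * dt
    measurement = truth_range(t) + 5.0 * math.sin(7.0 * t)    # synthetic "noise"
    estimate = tracker_stub(measurement)
    if step > 100 and abs(estimate - truth_range(t)) > 50.0:   # pass/fail criterion
        failures += 1

print("checks failed:", failures)
```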
Multimedia architectures: from desktop systems to portable appliances
NASA Astrophysics Data System (ADS)
Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.
1997-01-01
Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market wherein the needs and standards are changing and system manufacturers may demand a single component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general purpose processors and a more recent effort in developing single component mixed media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system also are presented.
Introduction to the Use of Computers in Libraries: A Textbook for the Non-Technical Student.
ERIC Educational Resources Information Center
Ogg, Harold C.
This book outlines computing and information science from the perspective of what librarians and educators need to do with computer technology and how it can help them perform their jobs more efficiently. It provides practical explanations and library applications for non-technical users of desktop computers and other library automation tools.…
Code of Federal Regulations, 2011 CFR
2011-01-01
... home computer systems of an employee; or (4) Whether the information is active or inactive. (k) Record... (e.g., e-mail, databases, spreadsheets, PowerPoint presentations, electronic reporting systems... information is stored or located, including network servers, desktop or laptop computers and handheld...
ERIC Educational Resources Information Center
Zhao, Jensen J.; Ray, Charles M.; Dye, Lee J.; Davis, Rodney
1998-01-01
Executives (n=63) and office-systems educators (n=88) recommended for workers the following categories of computer end-user skills: hardware, operating systems, word processing, spreadsheets, database, desktop publishing, and presentation. (SK)
Desktop publishing and validation of custom near visual acuity charts.
Marran, Lynn; Liu, Lei; Lau, George
2008-11-01
Customized visual acuity (VA) assessment is an important part of basic and clinical vision research. Desktop computer based distance VA measurements have been utilized, and shown to be accurate and reliable, but computer based near VA measurements have not been attempted, mainly due to the limited spatial resolution of computer monitors. In this paper, we demonstrate how to use desktop publishing to create printed custom near VA charts. We created a set of six near VA charts in a logarithmic progression, 20/20 through 20/63, with multiple lines of the same acuity level, different letter arrangements in each line and a random noise background. This design allowed repeated measures of subjective accommodative amplitude without the potential artifact of familiarity of the optotypes. The background maintained a constant and spatial frequency rich peripheral stimulus for accommodation across the six different acuity levels. The paper describes in detail how pixel-wise accurate black and white bitmaps of Sloan optotypes were used to create the printed custom VA charts. At all acuity levels, the physical sizes of the printed custom optotypes deviated no more than 0.034 log units from that of the standard, satisfying the 0.05 log unit ISO criterion we used to demonstrate physical equivalence. Also, at all acuity levels, log unit differences in the mean target distance for which reliable recognition of letters first occurred for the printed custom optotypes compared to the standard were found to be below 0.05, satisfying the 0.05 log unit ISO criterion we used to demonstrate functional equivalence. It is possible to use desktop publishing to create custom near VA charts that are physically and functionally equivalent to standard VA charts produced by a commercial printing process.
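As a worked example of the sizing such charts require, the snippet below computes the physical letter heights for the six acuity levels, assuming a 40 cm design distance (an assumption for illustration; the paper's charts may be designed for a different distance) and the convention that a 20/X optotype subtends 5 × (X/20) arcminutes.

```python
# Worked optotype sizing for a near chart, assuming a 40 cm design distance.
# A 20/X letter subtends 5 * (X / 20) arcminutes, so its physical height is
# distance * tan(angle). Values are illustrative, not the paper's exact sizes.
import math

design_distance_mm = 400.0
acuity_levels = [20, 25, 32, 40, 50, 63]      # the six logarithmic levels

for x in acuity_levels:
    angle_deg = (5.0 * x / 20.0) / 60.0       # arcminutes -> degrees
    height_mm = design_distance_mm * math.tan(math.radians(angle_deg))
    print(f"20/{x}: {height_mm:.2f} mm letter height")
```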
Tablet PCs: A Physical Educator's New Clipboard
ERIC Educational Resources Information Center
Nye, Susan B.
2010-01-01
Computers in education have come a long way from the abacus of 5,000 years ago to the desktop and laptop computers of today. Computers have transformed the educational environment, and with each new iteration of smaller and more powerful machines come additional advantages for teaching practices. The Tablet PC is one. Tablet PCs are fully…
ERIC Educational Resources Information Center
Cornforth, David; Atkinson, John; Spennemann, Dirk H. R.
2006-01-01
Purpose: Many researchers require access to computer facilities beyond those offered by desktop workstations. Traditionally, these are offered either through partnerships, to share the cost of supercomputing facilities, or through purpose-built cluster facilities. However, funds are not always available to satisfy either of these options, and…
The Role of Wireless Computing Technology in the Design of Schools.
ERIC Educational Resources Information Center
Nair, Prakash
This document discusses integrating computers logically and affordably into a school building's infrastructure through the use of wireless technology. It begins by discussing why wireless networks using mobile computers are preferable to desktop machines in each classroom. It then explains the features of a wireless local area network (WLAN) and…
Providing Assistive Technology Applications as a Service Through Cloud Computing.
Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio
2015-01-01
Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using the remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.
Ultra-Scale Computing for Emergency Evacuation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhaduri, Budhendra L; Nutaro, James J; Liu, Cheng
2010-01-01
Emergency evacuations are carried out in anticipation of a disaster such as hurricane landfall or flooding, and in response to a disaster that strikes without a warning. Existing emergency evacuation modeling and simulation tools are primarily designed for evacuation planning and are of limited value in operational support for real time evacuation management. In order to align with desktop computing, these models reduce the data and computational complexities through simple approximations and representations of real network conditions and traffic behaviors, which rarely represent real-world scenarios. With the emergence of high resolution physiographic, demographic, and socioeconomic data and supercomputing platforms, it is possible to develop micro-simulation based emergency evacuation models that can foster development of novel algorithms for human behavior and traffic assignments, and can simulate evacuation of millions of people over a large geographic area. However, such advances in evacuation modeling and simulations demand computational capacity beyond the desktop scales and can be supported by high performance computing platforms. This paper explores the motivation and feasibility of ultra-scale computing for increasing the speed of high resolution emergency evacuation simulations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System... DRUG-FREE WORKPLACE; Contracting for Environmentally Preferable Products and Services; 23.701 Definitions. As used in this subpart— Computer monitor means a video display unit used with a computer. Desktop...
Lock It Up! Computer Security.
ERIC Educational Resources Information Center
Wodarz, Nan
1997-01-01
The data contained on desktop computer systems and networks pose security issues for virtually every district. Sensitive information can be protected by educating users, altering the physical layout, using password protection, designating access levels, backing up data, reformatting floppy disks, using antivirus software, and installing encryption…
Desktop chaotic systems: Intuition and visualization
NASA Technical Reports Server (NTRS)
Bright, Michelle M.; Melcher, Kevin J.; Qammar, Helen K.; Hartley, Tom T.
1993-01-01
This paper presents a dynamic study of the Wildwood Pendulum, a commercially available desktop system which exhibits a strange attractor. The purpose of studying this chaotic pendulum is twofold: to gain insight in the paradigmatic approach of modeling, simulating, and determining chaos in nonlinear systems; and to provide a desktop model of chaos as a visual tool. For this study, the nonlinear behavior of this chaotic pendulum is modeled, a computer simulation is performed, and an experimental performance is measured. An assessment of the pendulum in the phase plane shows the strange attractor. Through the use of a box-assisted correlation dimension methodology, the attractor dimension is determined for both the model and the experimental pendulum systems. Correlation dimension results indicate that the pendulum and the model are chaotic and their fractal dimensions are similar.
ERIC Educational Resources Information Center
Green, Kenneth C.
This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…
ERIC Educational Resources Information Center
Porec, Carol J.
1989-01-01
Describes how "The Children's Writing and Publishing Center" (a desktop publishing program for elementary students) combines word processing with computer graphics and motivates students to write letters. (MM)
ERIC Educational Resources Information Center
Brink, Dan
1987-01-01
Reviews the current state of printing software and printing hardware compatibility and capacity. Discusses the changing relationship between author and publisher resulting from the advent of desktop publishing. (LMO)
A Librarian Without Books:Systems Librarianship in Astronomy
NASA Astrophysics Data System (ADS)
Kneale, R. A.
2007-10-01
The author discusses one aspect of the changing nature of librarianship by focusing on a high-tech microcosm of an already high-tech profession, that of systems librarianship. She is the Systems Librarian for the Advanced Technology Solar Telescope (ATST) project, based in Tucson, Arizona. The project is engaged in the design and development of a 4-meter solar telescope, planned for the summit of Haleakalā, Maui, Hawai'i. Most of the day-to-day tasks at ATST involve software in one form or another; the author makes heavy use of Remote Desktop and Virtual Network Computing (VNC) to manage installations on eight different servers (four Windows, four Unix) in two states, plus staff desktops (Windows XP) from the comfy chair in front of her computer.
Anandakrishnan, Ramu; Scogland, Tom R W; Fenley, Andrew T; Gordon, John C; Feng, Wu-chun; Onufriev, Alexey V
2010-06-01
Tools that compute and visualize biomolecular electrostatic surface potential have been used extensively for studying biomolecular function. However, determining the surface potential for large biomolecules on a typical desktop computer can take days or longer using currently available tools and methods. Two commonly used techniques to speed up these types of electrostatic computations are approximations based on multi-scale coarse-graining and parallelization across multiple processors. This paper demonstrates that for the computation of electrostatic surface potential, these two techniques can be combined to deliver significantly greater speed-up than either one separately, something that is not always possible. Specifically, the electrostatic potential computation, using an analytical linearized Poisson-Boltzmann (ALPB) method, is approximated using the hierarchical charge partitioning (HCP) multi-scale method, and parallelized on an ATI Radeon 4870 graphical processing unit (GPU). The implementation delivers a combined 934-fold speed-up for a 476,040 atom viral capsid, compared to an equivalent non-parallel implementation on an Intel E6550 CPU without the approximation. This speed-up is significantly greater than the 42-fold speed-up for the HCP approximation alone or the 182-fold speed-up for the GPU alone. Copyright (c) 2010 Elsevier Inc. All rights reserved.
New generation of 3D desktop computer interfaces
NASA Astrophysics Data System (ADS)
Skerjanc, Robert; Pastoor, Siegmund
1997-05-01
Today's computer interfaces use 2-D displays showing windows, icons and menus and support mouse interactions for handling programs and data files. The interface metaphor is that of a writing desk with (partly) overlapping sheets of documents placed on its top. Recent advances in the development of 3-D display technology give the opportunity to take the interface concept a radical stage further by breaking the design limits of the desktop metaphor. The major advantage of the envisioned 'application space' is that it offers an additional, immediately perceptible dimension to clearly and constantly visualize the structure and current state of interrelations between documents, videos, application programs and networked systems. In this context, we describe the development of a visual operating system (VOS). Under VOS, applications appear as objects in 3-D space. Users can graphically connect selected objects to enable communication between the respective applications. VOS includes a general concept of visual and object-oriented programming for tasks ranging from, e.g., low-level programming up to high-level application configuration. In order to enable practical operation in an office or at home for many hours, the system should be very comfortable to use. Since typical 3-D equipment used, e.g., in virtual-reality applications (head-mounted displays, data gloves) is rather cumbersome and straining, we suggest using off-head displays and contact-free interaction techniques. In this article, we introduce an autostereoscopic 3-D display and connected video-based interaction techniques which allow viewpoint-dependent imaging (by head tracking) and visually controlled modification of data objects and links (by gaze tracking, e.g., to pick 3-D objects just by looking at them).
Suzuki, Keishiro; Hirasawa, Yukinori; Yaegashi, Yuji; Miyamoto, Hideki; Shirato, Hiroki
2009-01-01
We developed a web-based, remote radiation treatment planning system which allowed staff at an affiliated hospital to obtain support from a fully staffed central institution. Network security was based on a firewall and a virtual private network (VPN). Client computers were installed at a cancer centre, at a university hospital and at a staff home. We remotely operated the treatment planning computer using the Remote Desktop function built in to the Windows operating system. Except for the initial setup of the VPN router, no special knowledge was needed to operate the remote radiation treatment planning system. There was a time lag that seemed to depend on the volume of data traffic on the Internet, but it did not affect smooth operation. The initial cost and running cost of the system were reasonable.
LTCP 2D Graphical User Interface. Application Description and User's Guide
NASA Technical Reports Server (NTRS)
Ball, Robert; Navaz, Homayun K.
1996-01-01
A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2013 CFR
2013-10-01
... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2012 CFR
2012-10-01
... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2014 CFR
2014-10-01
... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...
7 CFR 2.98 - Director, Management Services.
Code of Federal Regulations, 2011 CFR
2011-01-01
... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...
7 CFR 2.98 - Director, Management Services.
Code of Federal Regulations, 2013 CFR
2013-01-01
... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...
7 CFR 2.98 - Director, Management Services.
Code of Federal Regulations, 2012 CFR
2012-01-01
... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...
Farahani, Navid; Post, Robert; Duboy, Jon; Ahmed, Ishtiaque; Kolowitz, Brian J; Krinchai, Teppituk; Monaco, Sara E; Fine, Jeffrey L; Hartman, Douglas J; Pantanowitz, Liron
2016-01-01
Digital slides obtained from whole slide imaging (WSI) platforms are typically viewed in two dimensions using desktop personal computer monitors or more recently on mobile devices. To the best of our knowledge, we are not aware of any studies viewing digital pathology slides in a virtual reality (VR) environment. VR technology enables users to be artificially immersed in and interact with a computer-simulated world. Oculus Rift is among the world's first consumer-targeted VR headsets, intended primarily for enhanced gaming. Our aim was to explore the use of the Oculus Rift for examining digital pathology slides in a VR environment. An Oculus Rift Development Kit 2 (DK2) was connected to a 64-bit computer running Virtual Desktop software. Glass slides from twenty randomly selected lymph node cases (ten with benign and ten malignant diagnoses) were digitized using a WSI scanner. Three pathologists reviewed these digital slides on a 27-inch 5K display and with the Oculus Rift after a 2-week washout period. Recorded endpoints included concordance of final diagnoses and time required to examine slides. The pathologists also rated their ease of navigation, image quality, and diagnostic confidence for both modalities. There was 90% diagnostic concordance when reviewing WSI using a 5K display and Oculus Rift. The time required to examine digital pathology slides on the 5K display averaged 39 s (range 10-120 s), compared to 62 s with the Oculus Rift (range 15-270 s). All pathologists confirmed that digital pathology slides were easily viewable in a VR environment. The ratings for image quality and diagnostic confidence were higher when using the 5K display. Using the Oculus Rift DK2 to view and navigate pathology whole slide images in a virtual environment is feasible for diagnostic purposes. However, image resolution using the Oculus Rift device was limited. Interactive VR technologies such as the Oculus Rift are novel tools that may be of use in digital pathology.
48 CFR 52.223-16 - Acquisition of EPEAT-Registered Personal Computer Products.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) and liquid crystal display (LCD). Desktop computer means a computer where the main unit is intended to... periods of time either with or without a direct connection to an AC power source. Notebooks must utilize... products that, at the time of submission of proposals and at the time of award, were EPEAT® bronze...
Using "Audacity" and One Classroom Computer to Experiment with Timbre
ERIC Educational Resources Information Center
Smith, Kenneth H.
2011-01-01
One computer, one class, and one educator can be an effective combination to engage students as a group in music composition, performance, and analysis. Having one desktop computer and a television monitor in the music classroom is not an uncommon or new scenario, especially in a time when many school budgets are being cut. This article…
Numerical Optimization Using Desktop Computers
1980-09-11
concentrating compound parabolic trough solar collector. Thermophysical, geophysical, optical and economic analyses were used to compute a life-cycle... third computer program, NISCO, was developed to model a nonimaging concentrating compound parabolic trough solar collector using thermophysical... concentrating compound parabolic trough solar collector. C. OBJECTIVE: The objective of this thesis was to develop a system of interactive programs for the Hewlett
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong
2017-01-01
Abstract Background: Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line–based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. Results: We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. Conclusions: As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. PMID:28327936
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee
2017-04-01
Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.
Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?
Billah, Syed Masum; Ashok, Vikas; Porter, Donald E.; Ramakrishnan, IV
2017-01-01
Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access—an early forerunner of true ubiquitous access—screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments. PMID:28782061
Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?
Billah, Syed Masum; Ashok, Vikas; Porter, Donald E; Ramakrishnan, I V
2017-05-01
Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access-an early forerunner of true ubiquitous access-screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments.
NASA Astrophysics Data System (ADS)
Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.
2013-12-01
A new software application called MAD# has been coupled with the HTCondor high-throughput computing system to aid scientists and educators with the characterization of spatial random fields and to enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop software application used to characterize spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times to transfer information from the indirect data to the target variable. MAD# uses two parallelization profiles according to the computational resources available: a single computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers for submitting serial or parallel jobs using scheduling policies, resource monitoring, and a job queuing mechanism. This poster shows how MAD# reduces the execution time of random field characterization using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the single-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (approximately 1,200 hours for the full 10 million). In the evaluation on HTCondor, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor reduced the processing time for uncertainty characterization by a factor of 20 (1,200 hours reduced to 60 hours).
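The forward-model evaluations in such an inversion are mutually independent, so they parallelize naturally across local cores or across an HTCondor pool of desktops. The following is a minimal sketch of the single-machine profile only; it is not the MAD# code, and the parameter names, forward model, and misfit function are hypothetical placeholders.

```python
# Minimal sketch of the single-machine parallelization profile described above.
# NOT the MAD# code: run_forward_model and sample_parameters stand in for a
# HYDRUS-like forward simulation and its candidate parameter sets.
from concurrent.futures import ProcessPoolExecutor
import random

def sample_parameters(rng):
    # Draw one candidate parameter set (e.g., saturated conductivity Ks and
    # residual water content theta_r); ranges here are arbitrary.
    return {"Ks": rng.uniform(0.1, 10.0), "theta_r": rng.uniform(0.0, 0.1)}

def run_forward_model(params):
    # Placeholder forward model returning a scalar misfit against "data".
    return (params["Ks"] - 1.0) ** 2 + params["theta_r"]

def evaluate(seed):
    rng = random.Random(seed)
    params = sample_parameters(rng)
    return params, run_forward_model(params)

if __name__ == "__main__":
    # Each evaluation is independent, so the runs spread across all local
    # cores; an HTCondor pool generalizes the same pattern across many desktops.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(evaluate, range(10_000), chunksize=100))
    best_params, best_misfit = min(results, key=lambda r: r[1])
    print("best parameter set:", best_params, "misfit:", best_misfit)
```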
Technology in Education: Research Says!!
ERIC Educational Resources Information Center
Canuel, Ron
2011-01-01
A large amount of research existed in the field of technology in the classroom; however, almost all was focused on the impact of desktop computers and the infamous "school computer room". However, the activities in a classroom represent a multitude of behaviours and interventions, including personal dynamics, classroom management and…
USDA-ARS?s Scientific Manuscript database
Service oriented architectures allow modelling engines to be hosted over the Internet abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on user's personal computers (PCs). Migration ...
Distributing Data from Desktop to Hand-Held Computers
NASA Technical Reports Server (NTRS)
Elmore, Jason L.
2005-01-01
A system of server and client software formats and redistributes data from commercially available desktop to commercially available hand-held computers via both wired and wireless networks. This software is an inexpensive means of enabling engineers and technicians to gain access to current sensor data while working in locations in which such data would otherwise be inaccessible. The sensor data are first gathered by a data-acquisition server computer, then transmitted via a wired network to a data-distribution computer that executes the server portion of the present software. Data in all sensor channels -- both raw sensor outputs in millivolt units and results of conversion to engineering units -- are made available for distribution. Selected subsets of the data are transmitted to each hand-held computer via the wired and then a wireless network. The selection of the subsets and the choice of the sequences and formats for displaying the data is made by means of a user interface generated by the client portion of the software. The data displayed on the screens of hand-held units can be updated at rates from 1 to
"Software Tools" to Improve Student Writing.
ERIC Educational Resources Information Center
Oates, Rita Haugh
1987-01-01
Reviews several software packages that analyze text readability, check for spelling and style problems, offer desktop publishing capabilities, teach interviewing skills, and teach grammar using a computer game. (SRT)
Desktop system for accounting, audit, and research in A&E.
Taylor, C J; Brain, S G; Bull, F; Crosby, A C; Ferguson, D G
1997-01-01
The development of a database for audit, research, and accounting in accident and emergency (A&E) is described. The system uses a desktop computer, an optical scanner, sophisticated optical mark reader software, and workload management data. The system is highly flexible, easy to use, and, at a cost of around 16,000 pounds, affordable for larger departments wishing to move towards accounting. For smaller departments, it may be an alternative to full computerisation. PMID:9132200
Bringing the medical library to the office desktop.
Brown, S R; Decker, G; Pletzke, C J
1991-01-01
This demonstration illustrates LRC Remote Computer Services: a dual operating system, multi-protocol system for delivering medical library services to the medical professional's desktop. A working model draws resources from CD-ROM and magnetic media file services, Novell and AppleTalk network protocol suites and gating, LAN and asynchronous (dial-in) access strategies, commercial applications for MS-DOS and Macintosh workstations and custom user interfaces. The demonstration includes a discussion of issues relevant to the delivery of said services, particularly with respect to maintenance, security, training/support, staffing, software licensing and costs.
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged and pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lower the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing.
SkinScan©: A PORTABLE LIBRARY FOR MELANOMA DETECTION ON HANDHELD DEVICES
Wadhawan, Tarun; Situ, Ning; Lancaster, Keith; Yuan, Xiaojing; Zouridakis, George
2011-01-01
We have developed a portable library for automated detection of melanoma termed SkinScan© that can be used on smartphones and other handheld devices. Compared to desktop computers, embedded processors have limited processing speed, memory, and power, but they have the advantage of portability and low cost. In this study we explored the feasibility of running a sophisticated application for automated skin cancer detection on an Apple iPhone 4. Our results demonstrate that the proposed library with the advanced image processing and analysis algorithms has excellent performance on handheld and desktop computers. Therefore, deployment of smartphones as screening devices for skin cancer and other skin diseases can have a significant impact on health care delivery in underserved and remote areas. PMID:21892382
The social computing room: a multi-purpose collaborative visualization environment
NASA Astrophysics Data System (ADS)
Borland, David; Conway, Michael; Coposky, Jason; Ginn, Warren; Idaszak, Ray
2010-01-01
The Social Computing Room (SCR) is a novel collaborative visualization environment for viewing and interacting with large amounts of visual data. The SCR consists of a square room with 12 projectors (3 per wall) used to display a single 360-degree desktop environment that provides a large physical real estate for arranging visual information. The SCR was designed to be cost-effective, collaborative, configurable, widely applicable, and approachable for naive users. Because the SCR displays a single desktop, a wide range of applications is easily supported, making it possible for a variety of disciplines to take advantage of the room. We provide a technical overview of the room and highlight its application to scientific visualization, arts and humanities projects, research group meetings, and virtual worlds, among other uses.
GREEN SUPERCOMPUTING IN A DESKTOP BOX
DOE Office of Scientific and Technical Information (OSTI.GOV)
HSU, CHUNG-HSING; FENG, WU-CHUN; CHING, AVERY
2007-01-17
The computer workstation, introduced by Sun Microsystems in 1982, was the tool of choice for scientists and engineers as an interactive computing environment for the development of scientific codes. However, by the mid-1990s, the performance of workstations began to lag behind high-end commodity PCs. This, coupled with the disappearance of BSD-based operating systems in workstations and the emergence of Linux as an open-source operating system for PCs, arguably led to the demise of the workstation as we knew it. Around the same time, computational scientists started to leverage PCs running Linux to create a commodity-based (Beowulf) cluster that provided dedicated computer cycles, i.e., supercomputing for the rest of us, as a cost-effective alternative to large supercomputers, i.e., supercomputing for the few. However, as the cluster movement has matured, with respect to cluster hardware and open-source software, these clusters have become much more like their large-scale supercomputing brethren - a shared (and power-hungry) datacenter resource that must reside in a machine-cooled room in order to operate properly. Consequently, the above observations, when coupled with the ever-increasing performance gap between the PC and cluster supercomputer, provide the motivation for a 'green' desktop supercomputer - a turnkey solution that provides an interactive and parallel computing environment with the approximate form factor of a Sun SPARCstation 1 'pizza box' workstation. In this paper, they present the hardware and software architecture of such a solution as well as its prowess as a developmental platform for parallel codes. In short, imagine a 12-node personal desktop supercomputer that achieves 14 Gflops on Linpack but sips only 185 watts of power at load, resulting in a performance-power ratio that is over 300% better than their reference SMP platform.
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
A Survey of Students Participating in a Computer-Assisted Education Programme
ERIC Educational Resources Information Center
Yel, Elif Binboga; Korhan, Orhan
2015-01-01
This paper mainly examines anthropometric data, data regarding the habits, experiences, and attitudes of the students about their tablet/laptop/desktop computer use, in addition to self-reported musculoskeletal discomfort levels and frequencies of students participating in a tablet-assisted interactive education programme. A two-part questionnaire…
ERIC Educational Resources Information Center
Chester, Ivan
2007-01-01
CAD (Computer Aided Design) has now become an integral part of Technology Education. The recent introduction of highly sophisticated, low-cost CAD software and CAM hardware capable of running on desktop computers has accelerated this trend. There is now quite widespread introduction of solid modeling CAD software into secondary schools but how…
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
The Human-Computer Interaction of Cross-Cultural Gaming Strategy
ERIC Educational Resources Information Center
Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander
2015-01-01
This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are then discussed. Following this, the…
Streamlined, Inexpensive 3D Printing of the Brain and Skull
Cash, Sydney S.
2015-01-01
Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3–4 in consumable plastic filament as described, and the total process takes 14–17 hours, almost all of which is unsupervised (preprocessing = 4–6 hr; printing = 9–11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1–5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes. PMID:26295459
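A minimal sketch of the volume-to-STL step of such a pipeline is shown below, using common open-source Python tools (nibabel, scikit-image, numpy-stl). This is illustrative only, not the authors' scripts; the input file name and iso-level threshold are assumptions.

```python
# Illustrative sketch of the volume-to-STL step only; not the authors' scripts.
# Assumes a binary brain mask already exists in brain_mask.nii.gz and that
# nibabel, scikit-image, and numpy-stl are installed.
import nibabel as nib
import numpy as np
from skimage import measure
from stl import mesh  # numpy-stl

volume = nib.load("brain_mask.nii.gz").get_fdata()

# Triangulate the mask boundary (marching cubes at the 0.5 iso-level).
verts, faces, _, _ = measure.marching_cubes(volume, level=0.5)

# Pack the triangles into an STL mesh; the STL file can then be sliced to
# gcode by any open-source slicer for desktop printing.
surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
for i, face in enumerate(faces):
    surface.vectors[i] = verts[face]
surface.save("brain.stl")
```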
Interpretation of Coronary Angiograms Recorded Using Google Glass: A Comparative Analysis.
Duong, Thao; Wosik, Jedrek; Christakopoulos, Georgios E; Martínez Parachini, José Roberto; Karatasakis, Aris; Tarar, Muhammad Nauman Javed; Resendes, Erica; Rangan, Bavana V; Roesle, Michele; Grodin, Jerrold; Abdullah, Shuaib M; Banerjee, Subhash; Brilakis, Emmanouil S
2015-10-01
Google Glass (Google, Inc) is a voice-activated, hands-free, optical head-mounted display device capable of taking pictures, recording videos, and transmitting data via wi-fi. In the present study, we examined the accuracy of coronary angiogram interpretation, recorded using Google Glass. Google Glass was used to record 15 angiograms with 17 major findings and the participants were asked to interpret those recordings on: (1) an iPad (Apple, Inc); or (2) a desktop computer. Interpretation was compared with the original angiograms viewed on a desktop. Ten physicians (2 interventional cardiologists and 8 cardiology fellows) participated. One point was assigned for each correct finding, for a maximum of 17 points. The mean angiogram interpretation score for Google Glass angiogram recordings viewed on an iPad or a desktop vs the original angiograms viewed on a desktop was 14.9 ± 1.1, 15.2 ± 1.8, and 15.9 ± 1.1, respectively (P=.06 between the iPad and the original angiograms, P=.51 between the iPad and recordings viewed on a desktop, and P=.43 between the recordings viewed on a desktop and the original angiograms). In a post-study survey, one of the 10 physicians (10%) was "neutral" with the quality of the recordings using Google Glass, 6 physicians (60%) were "somewhat satisfied," and 3 physicians (30%) were "very satisfied." This small pilot study suggests that the quality of coronary angiogram video recordings obtained using Google Glass may be adequate for recognition of major findings, supporting its expanding use in telemedicine.
Group Velocity Dispersion Curves from Wigner-Ville Distributions
NASA Astrophysics Data System (ADS)
Lloyd, Simon; Bokelmann, Goetz; Sucic, Victor
2013-04-01
With the widespread adoption of ambient noise tomography, and the increasing number of local earthquakes recorded worldwide due to dense seismic networks and many very dense temporary experiments, we consider it worthwhile to evaluate alternative methods to measure surface wave group velocity dispersion curves. Moreover, the increased computing power of even a simple desktop computer makes it feasible to routinely use methods other than the typically employed multiple filtering technique (MFT). To that end we perform tests with synthetic and observed seismograms using Wigner-Ville distribution (WVD) frequency-time analysis, and compare dispersion curves measured with WVD and MFT with each other. Initial results suggest WVD to be at least as good as MFT at measuring dispersion, albeit at a greater computational expense. We therefore need to investigate if, and under which circumstances, WVD yields better dispersion curves than MFT, before considering routinely applying the method. As both MFT and WVD generally work well for teleseismic events and at longer periods, we explore how well the WVD method performs at shorter periods and for local events with smaller epicentral distances. Such dispersion information could potentially be beneficial for improving velocity structure resolution within the crust.
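For reference, the Wigner-Ville distribution of a signal x(t) is conventionally defined as below; the discretisation actually used by the authors is not stated in the abstract.

```latex
% Standard (continuous-time) definition of the Wigner-Ville distribution.
W_x(t, f) = \int_{-\infty}^{\infty}
            x\!\left(t + \tfrac{\tau}{2}\right)\,
            x^{*}\!\left(t - \tfrac{\tau}{2}\right)
            e^{-j 2\pi f \tau}\, d\tau
```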
A physical and economic model of the nuclear fuel cycle
NASA Astrophysics Data System (ADS)
Schneider, Erich Alfred
A model of the nuclear fuel cycle that is suitable for use in strategic planning and economic forecasting is presented. The model, to be made available as a stand-alone software package, requires only a small set of fuel cycle and reactor specific input parameters. Critical design criteria include ease of use by nonspecialists, suppression of errors to within a range dictated by unit cost uncertainties, and limitation of runtime to under one minute on a typical desktop computer. Collision probability approximations to the neutron transport equation that lead to a computationally efficient decoupling of the spatial and energy variables are presented and implemented. The energy dependent flux, governed by coupled integral equations, is treated by multigroup or continuous thermalization methods. The model's output includes a comprehensive nuclear materials flowchart that begins with ore requirements, calculates the buildup of 24 actinides as well as fission products, and concludes with spent fuel or reprocessed material composition. The costs, direct and hidden, of the fuel cycle under study are also computed. In addition to direct disposal and plutonium recycling strategies in current use, the model addresses hypothetical cycles. These include cycles chosen for minor actinide burning and for their low weapons-usable content.
Potential Flow Theory and Operation Guide for the Panel Code PMARC. Version 14
NASA Technical Reports Server (NTRS)
Ashby, Dale L.
1999-01-01
The theoretical basis for PMARC, a low-order panel code for modeling complex three-dimensional bodies, in potential flow, is outlined. PMARC can be run on a wide variety of computer platforms, including desktop machines, workstations, and supercomputers. Execution times for PMARC vary tremendously depending on the computer resources used, but typically range from several minutes for simple or moderately complex cases to several hours for very large complex cases. Several of the advanced features currently included in the code, such as internal flow modeling, boundary layer analysis, and time-dependent flow analysis, including problems involving relative motion, are discussed in some detail. The code is written in Fortran77, using adjustable-size arrays so that it can be easily redimensioned to match problem requirements and computer hardware constraints. An overview of the program input is presented. A detailed description of the input parameters is provided in the appendices. PMARC results for several test cases are presented along with analytic or experimental data, where available. The input files for these test cases are given in the appendices. PMARC currently supports plotfile output formats for several commercially available graphics packages. The supported graphics packages are Plot3D, Tecplot, and PmarcViewer.
BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs
NASA Astrophysics Data System (ADS)
Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes
2017-06-01
Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulations are a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos and ball mills, as laboratory-scale tests come at a significant cost. However, the computational time required for an industrial-scale simulation, which consists of tens of millions of particles, can be months on large CPU clusters, making the Discrete Element Method (DEM) unfeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes, or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open-source GPU-based DEM code, BlazeDEM3D-GPU, that can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single or multiple GPUs.
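To make the contact-mechanics core of a DEM time step concrete, the following toy sketch advances a handful of spheres under gravity with a linear-spring normal contact model. It is not BlazeDEM3D-GPU code; the stiffness, mass, time step, and geometry are illustrative values only.

```python
# Toy DEM sketch, not BlazeDEM3D-GPU: a few spheres falling onto a flat floor
# with a linear-spring normal contact model and no damping.
import numpy as np

n, radius = 4, 0.05
pos = np.array([[0.00, 0.00, 0.06],
                [0.02, 0.00, 0.18],
                [0.00, 0.03, 0.30],
                [0.01, 0.01, 0.42]])
vel = np.zeros((n, 3))
mass, k, dt = 0.1, 1.0e4, 1.0e-5
gravity = np.array([0.0, 0.0, -9.81])

for step in range(20000):                       # 0.2 s of simulated time
    force = np.tile(mass * gravity, (n, 1))
    # Sphere-sphere contacts: repulsive force proportional to overlap.
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            dist = np.linalg.norm(d)
            overlap = 2.0 * radius - dist
            if overlap > 0.0:
                normal = d / dist
                force[i] += k * overlap * normal
                force[j] -= k * overlap * normal
    # Sphere-floor contacts (floor at z = 0).
    floor_overlap = radius - pos[:, 2]
    force[:, 2] += np.where(floor_overlap > 0.0, k * floor_overlap, 0.0)
    # Semi-implicit Euler update.
    vel += force / mass * dt
    pos += vel * dt

print(pos)  # final sphere centres after settling/bouncing
```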
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brun, B.
1997-07-01
Computer technology has improved tremendously in recent years, with larger media capacity, more memory, and more computational power. Visual computing, with high-performance graphical interfaces and desktop computational power, has changed the way engineers accomplish everyday tasks, development work, and safety study analyses. The emergence of parallel computing will permit simulation over larger domains. In addition, new development methods, languages, and tools have appeared in the last several years.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., electronic tapes and back-up tapes, optical discs, CD-ROMS, and DVDs), and voicemail records; (2) Where the information is stored or located, including network servers, desktop or laptop computers and handheld...
A framework for interactive visualization of digital medical images.
Koehring, Andrew; Foo, Jung Leng; Miyano, Go; Lobe, Thom; Winer, Eliot
2008-10-01
The visualization of medical images obtained from scanning techniques such as computed tomography and magnetic resonance imaging is a well-researched field. However, advanced tools and methods to manipulate these data for surgical planning and other tasks have not seen widespread use among medical professionals. Radiologists have begun using more advanced visualization packages on desktop computer systems, but most physicians continue to work with basic two-dimensional grayscale images or do not work directly with the data at all. In addition, new display technologies that are in use in other fields have yet to be fully applied in medicine. It is our estimation that usability is the key factor keeping this new technology from being more widely used by the medical community at large. Therefore, we have developed a software and hardware framework that not only makes use of advanced visualization techniques, but also features powerful, yet simple-to-use, interfaces. A virtual reality system was created to display volume-rendered medical models in three dimensions. It was designed to run in many configurations, from a large cluster of machines powering a multiwalled display down to a single desktop computer. An augmented reality system was also created for, literally, hands-on interaction when viewing models of medical data. Last, a desktop application was designed to provide a simple visualization tool, which can be run on nearly any computer at a user's disposal. This research is directed toward improving the capabilities of medical professionals in the tasks of preoperative planning, surgical training, diagnostic assistance, and patient education.
ERIC Educational Resources Information Center
Bozzone, Meg A.
1997-01-01
Purchasing custom-made desks with durable glass tops to house computers and double as student work space solved the problem of how to squeeze in additional classroom computers at Johnson Park Elementary School in Princeton, New Jersey. This article describes a K-5 grade school's efforts to overcome barriers to integrating technology. (PEN)
LOSCAR: Long-term Ocean-atmosphere-Sediment CArbon cycle Reservoir Model
NASA Astrophysics Data System (ADS)
Zeebe, R. E.
2011-06-01
The LOSCAR model is designed to efficiently compute the partitioning of carbon between ocean, atmosphere, and sediments on time scales ranging from centuries to millions of years. While a variety of computationally inexpensive carbon cycle models are already available, many are missing a critical sediment component, which is indispensable for long-term integrations. One of LOSCAR's strengths is the coupling of ocean-atmosphere routines to a computationally efficient sediment module. This allows, for instance, adequate computation of CaCO3 dissolution, calcite compensation, and long-term carbon cycle fluxes, including weathering of carbonate and silicate rocks. The ocean component includes various biogeochemical tracers such as total carbon, alkalinity, phosphate, oxygen, and stable carbon isotopes. We have previously published applications of the model tackling future projections of ocean chemistry and weathering, pCO2 sensitivity to carbon cycle perturbations throughout the Cenozoic, and carbon/calcium cycling during the Paleocene-Eocene Thermal Maximum. The focus of the present contribution is the detailed description of the model including numerical architecture, processes and parameterizations, tuning, and examples of input and output. Typical CPU integration times of LOSCAR are of order seconds for several thousand model years on current standard desktop machines. The LOSCAR source code in C can be obtained from the author by sending a request to loscar.model@gmail.com.
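As a rough illustration of the reservoir-model idea described above (this is not LOSCAR itself; the reservoir sizes, exchange time scale, and emission pulse are arbitrary toy values), a single ocean box exchanging carbon with the atmosphere can be integrated in a few lines:

```python
# Toy two-reservoir sketch, not LOSCAR: carbon exchange between the atmosphere
# and a single ocean box relaxing toward a fixed partitioning.
atm, ocean = 600.0, 38000.0        # Pg C, roughly pre-industrial magnitudes
tau = 200.0                        # air-sea equilibration time scale (years)
ratio = ocean / atm                # equilibrium ocean:atmosphere partitioning
dt, years = 1.0, 10000

atm += 1000.0                      # instantaneous 1000 Pg C emission pulse

for _ in range(int(years / dt)):
    # Flux that restores the atmosphere toward its equilibrium share of the
    # (conserved) total carbon inventory.
    atm_eq = (atm + ocean) / (1.0 + ratio)
    flux = (atm - atm_eq) / tau
    atm -= flux * dt
    ocean += flux * dt

print("atmospheric carbon %d years after the pulse: %.0f Pg C" % (years, atm))
```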
Scalable computing for evolutionary genomics.
Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert
2012-01-01
Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers, in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel, running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster, and pipeline, in a few steps. This allows researchers to scale-up computations from their desktop, using available hardware, anytime it is required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages, of interest to evolutionary biology, are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software. Where Debian Med encourages packaging free and open source bioinformatics software through one central project, BioNode encourages creating free and open source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the Cloud. Next to the downloadable BioNode images, we provide tutorials online, which empower bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives, on creating and building such images.
Identifying Broadband Rotational Spectra with Neural Networks
NASA Astrophysics Data System (ADS)
Zaleski, Daniel P.; Prozument, Kirill
2017-06-01
A typical broadband rotational spectrum may contain several thousand observable transitions, spanning many species. Identifying the individual spectra, particularly when the dynamic range reaches 1,000:1 or even 10,000:1, can be challenging. One approach is to apply automated fitting routines. In this approach, combinations of 3 transitions can be created to form a "triple", which allows fitting of the A, B, and C rotational constants in a Watson-type Hamiltonian. On a standard desktop computer, with a target molecule of interest, a typical AUTOFIT routine takes 2-12 hours depending on the spectral density. A new approach is to utilize machine learning to train a computer to recognize the patterns (frequency spacing and relative intensities) inherent in rotational spectra and to identify the individual spectra in a raw broadband rotational spectrum. Here, recurrent neural networks have been trained to identify different types of rotational spectra and classify them accordingly. Furthermore, early results in applying convolutional neural networks for spectral object recognition in broadband rotational spectra appear promising. Perez et al. "Broadband Fourier transform rotational spectroscopy for structure determination: The water heptamer." Chem. Phys. Lett., 2013, 571, 1-15. Seifert et al. "AUTOFIT, an Automated Fitting Tool for Broadband Rotational Spectra, and Applications to 1-Hexanal." J. Mol. Spectrosc., 2015, 312, 13-21. Bishop. "Neural networks for pattern recognition." Oxford university press, 1995.
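A minimal sketch of the classification idea is shown below: synthetic stick spectra of a linear rotor (lines near 2B(J+1)) are separated from randomly placed lines by a small neural network. This is not the authors' model; the frequency grid, rotational-constant range, and network size are arbitrary choices.

```python
# Illustrative sketch, not the authors' networks: classify binned stick spectra
# as "linear rotor" vs "random lines" with a small multilayer perceptron.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_bins, f_max = 512, 40.0                      # frequency grid in GHz

def to_bins(freqs):
    spec = np.zeros(n_bins)
    idx = np.clip((freqs / f_max * n_bins).astype(int), 0, n_bins - 1)
    spec[idx] = 1.0                            # unit-intensity stick spectrum
    return spec

def rotor_spectrum():
    B = rng.uniform(0.5, 3.0)                  # rotational constant (GHz)
    j_max = int(f_max / (2.0 * B))
    return to_bins(2.0 * B * np.arange(1, j_max + 1))

def random_spectrum():
    return to_bins(rng.uniform(0.0, f_max, size=rng.integers(5, 40)))

X = np.array([rotor_spectrum() for _ in range(500)] +
             [random_spectrum() for _ in range(500)])
y = np.array([1] * 500 + [0] * 500)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```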
A Streaming Language Implementation of the Discontinuous Galerkin Method
NASA Technical Reports Server (NTRS)
Barth, Timothy; Knight, Timothy
2005-01-01
We present a Brook streaming language implementation of the 3-D discontinuous Galerkin method for compressible fluid flow on tetrahedral meshes. Efficient implementation of the discontinuous Galerkin method using the streaming model of computation introduces several algorithmic design challenges. Using a cycle-accurate simulator, performance characteristics have been obtained for the Stanford Merrimac stream processor. The current Merrimac design achieves 128 Gflops per chip and the desktop board is populated with 16 chips yielding a peak performance of 2 Teraflops. Total parts cost for the desktop board is less than $20K. Current cycle-accurate simulations for discretizations of the 3-D compressible flow equations yield approximately 40-50% of the peak performance of the Merrimac streaming processor chip. Ongoing work includes the assessment of the performance of the same algorithm on the 2 Teraflop desktop board with a target goal of achieving 1 Teraflop performance.
A Parallel Particle Swarm Optimization Algorithm Accelerated by Asynchronous Evaluations
NASA Technical Reports Server (NTRS)
Venter, Gerhard; Sobieszczanski-Sobieski, Jaroslaw
2005-01-01
A parallel Particle Swarm Optimization (PSO) algorithm is presented. Particle swarm optimization is a fairly recent addition to the family of non-gradient based, probabilistic search algorithms that is based on a simplified social model and is closely tied to swarming theory. Although PSO algorithms present several attractive properties to the designer, they are plagued by high computational cost as measured by elapsed time. One approach to reduce the elapsed time is to make use of coarse-grained parallelization to evaluate the design points. Previous parallel PSO algorithms were mostly implemented in a synchronous manner, where all design points within a design iteration are evaluated before the next iteration is started. This approach leads to poor parallel speedup in cases where a heterogeneous parallel environment is used and/or where the analysis time depends on the design point being analyzed. This paper introduces an asynchronous parallel PSO algorithm that greatly improves the parallel efficiency. The asynchronous algorithm is benchmarked on a cluster assembled of Apple Macintosh G5 desktop computers, using the multi-disciplinary optimization of a typical transport aircraft wing as an example.
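A minimal sketch of the asynchronous evaluation idea, assuming a toy objective function and illustrative PSO constants, is shown below: each particle is updated and resubmitted as soon as its own evaluation returns, using whatever global best is known at that moment, rather than waiting for the whole swarm.

```python
# Minimal sketch of asynchronous parallel evaluation in a PSO loop; the
# objective and constants are illustrative stand-ins for an expensive analysis.
import random
from concurrent.futures import FIRST_COMPLETED, ProcessPoolExecutor, wait

DIM, N_PARTICLES, EVALS = 4, 8, 200
W, C1, C2 = 0.7, 1.5, 1.5

def objective(x):                       # stand-in for an expensive analysis
    return sum(v * v for v in x)

def async_pso(seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(DIM)] for _ in range(N_PARTICLES)]
    vel = [[0.0] * DIM for _ in range(N_PARTICLES)]
    pbest, pbest_f = [p[:] for p in pos], [float("inf")] * N_PARTICLES
    gbest, gbest_f = pos[0][:], float("inf")
    done = 0
    with ProcessPoolExecutor() as pool:
        running = {pool.submit(objective, pos[i]): i for i in range(N_PARTICLES)}
        while done < EVALS:
            finished, _ = wait(running, return_when=FIRST_COMPLETED)
            for fut in finished:
                i = running.pop(fut)
                f = fut.result()
                done += 1
                if f < pbest_f[i]:
                    pbest_f[i], pbest[i] = f, pos[i][:]
                if f < gbest_f:
                    gbest_f, gbest = f, pos[i][:]
                # asynchronous step: move this particle now, without waiting for the swarm
                for d in range(DIM):
                    vel[i][d] = (W * vel[i][d]
                                 + C1 * rng.random() * (pbest[i][d] - pos[i][d])
                                 + C2 * rng.random() * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                running[pool.submit(objective, pos[i])] = i
    return gbest_f

if __name__ == "__main__":
    print(async_pso())
```

In a heterogeneous pool, fast workers stay busy while slow evaluations are still running, which is the source of the improved parallel efficiency reported in the abstract.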
Jannovar: a java library for exome annotation.
Jäger, Marten; Wang, Kai; Bauer, Sebastian; Smedley, Damian; Krawitz, Peter; Robinson, Peter N
2014-05-01
Transcript-based annotation and pedigree analysis are two basic steps in the computational analysis of whole-exome sequencing experiments in genetic diagnostics and disease-gene discovery projects. Here, we present Jannovar, a stand-alone Java application as well as a Java library designed to be used in larger software frameworks for exome and genome analysis. Jannovar uses an interval tree to identify all transcripts affected by a given variant, and provides Human Genome Variation Society-compliant annotations both for variants affecting coding sequences and splice junctions as well as untranslated regions and noncoding RNA transcripts. Jannovar can also perform family-based pedigree analysis with Variant Call Format (VCF) files with data from members of a family segregating a Mendelian disorder. Using a desktop computer, Jannovar requires a few seconds to annotate a typical VCF file with exome data. Jannovar is freely available under the BSD2 license. Source code as well as the Java application and library file can be downloaded from http://compbio.charite.de (with tutorial) and https://github.com/charite/jannovar. © 2014 WILEY PERIODICALS, INC.
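The lookup Jannovar performs can be pictured with a toy, brute-force version of the same query; the transcript coordinates below are invented, and a real implementation replaces the linear scan with an interval tree so the query runs in O(log n + k).

```python
# Minimal sketch, assuming invented transcript coordinates; this toy version
# answers "which transcripts overlap this variant?" by scanning a list, where
# Jannovar uses an interval tree for the same query.
from collections import namedtuple

Transcript = namedtuple("Transcript", "name chrom start end")

TRANSCRIPTS = [
    Transcript("TX1", "chr1", 1000, 5000),
    Transcript("TX2", "chr1", 4500, 9000),
    Transcript("TX3", "chr2", 200, 700),
]

def transcripts_overlapping(chrom, pos):
    """Return every transcript whose genomic interval contains the variant position."""
    return [t for t in TRANSCRIPTS if t.chrom == chrom and t.start <= pos <= t.end]

print(transcripts_overlapping("chr1", 4800))   # expects TX1 and TX2
```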
Utilization of KSC Present Broadband Communications Data System for Digital Video Services
NASA Technical Reports Server (NTRS)
Andrawis, Alfred S.
2002-01-01
This report covers a feasibility study of utilizing the present KSC broadband communications data system (BCDS) for digital video services. Digital video services include compressed digital TV delivery and video-on-demand. Furthermore, the study examines the possibility of providing interactive video on demand to desktop personal computers via the KSC computer network.
ERIC Educational Resources Information Center
Peng, Hsinyi; Chou, Chien; Chang, Chun-Yu
2008-01-01
Computing devices and applications are now used beyond the desktop, in diverse environments, and this trend toward ubiquitous computing is evolving. In this study, we re-visit the interactivity concept and its applications for interactive function design in a ubiquitous-learning system (ULS). Further, we compare interactivity dimensions and…
Taking the Plunge: Districts Leap into Virtualization
ERIC Educational Resources Information Center
Demski, Jennifer
2010-01-01
Moving from a traditional desktop computing environment to a virtualized solution is a daunting task. In this article, the author presents case histories of three districts that have made the conversion to virtual computing to learn about their experiences: What prompted them to make the move, and what were their objectives? Which obstacles prove…
Computer assisted yarding cost analysis.
Ronald W. Mifflin
1980-01-01
Programs for a programmable calculator and a desk-top computer are provided for quickly determining yarding cost and comparing the economics of alternative yarding systems. The programs emphasize the importance of the relationship between production rate and machine rate, which is the hourly cost of owning and operating yarding equipment. In addition to generating the...
Desktop Social Science: Coming of Age.
ERIC Educational Resources Information Center
Dwyer, David C.; And Others
Beginning in 1985, Apple Computer, Inc. and several school districts began a collaboration to examine the impact of intensive computer use on instruction and learning in K-12 classrooms. This paper follows the development of a Macintosh II-based management and retrieval system for text data undertaken to store and retrieve oral reflections of…
Refocusing the Vision: The Future of Instructional Technology
ERIC Educational Resources Information Center
Pence, Harry E.; McIntosh, Steven
2011-01-01
Two decades ago, many campuses mobilized a major effort to deal with a clear problem: faculty and students needed access to desktop computing technologies. Now the situation is much more complex. Responding to the current challenges, like mobile computing and social networking, will be more difficult but equally important. There is a clear need for…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-12
...., desktop, laptop, handheld or other computer types) containing protected personal identifiers or PHI is... as the National Indian Women's Resource Center, to conduct analytical and evaluation studies. 8... SYSTEM: STORAGE: File folders, ledgers, card files, microfiche, microfilm, computer tapes, disk packs...
Utilization of KSC Present Broadband Communications Data System For Digital Video Services
NASA Technical Reports Server (NTRS)
Andrawis, Alfred S.
2001-01-01
This report covers a feasibility study of utilizing the present KSC broadband communications data system (BCDS) for digital video services. Digital video services include compressed digital TV delivery and video-on-demand. Furthermore, the study examines the possibility of providing interactive video on demand to desktop personal computers via the KSC computer network.
Taib, Mohd Firdaus Mohd; Bahn, Sangwoo; Yun, Myung Hwan
2016-06-27
The popularity of mobile computing products is well known. Thus, it is crucial to evaluate their contribution to musculoskeletal disorders during computer usage in both comfortable and stressful environments. This study explores the effect on muscle activity of using different computer products with different tasks designed to induce psychosocial stress. Fourteen male subjects performed computer tasks: sixteen combinations of four different computer products with four different tasks used to induce stress. Electromyography for four muscles in the forearm, shoulder and neck regions and task performance were recorded. The increment in trapezius muscle activity depended on the task used to induce the stress, with higher levels of stress producing greater increments. However, this relationship was not found in the other three muscles. In addition, compared to desktop and laptop use, the lowest activity for all muscles was obtained during the use of a tablet or smart phone. The best net performance was obtained in a comfortable environment. However, during stressful conditions, the best performance can be obtained using the device that a user is most comfortable with or has the most experience with. Different computer products and different levels of stress play a large role in muscle activity during computer work. Both of these factors must be taken into account in order to reduce the occurrence of musculoskeletal disorders or problems.
Multi-Attribute Task Battery - Applications in pilot workload and strategic behavior research
NASA Technical Reports Server (NTRS)
Arnegard, Ruth J.; Comstock, J. R., Jr.
1991-01-01
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
The multi-attribute task battery for human operator workload and strategic behavior research
NASA Technical Reports Server (NTRS)
Comstock, J. Raymond, Jr.; Arnegard, Ruth J.
1992-01-01
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
Goostrey, Sonya; Treleaven, Julia; Johnston, Venerina
2014-05-01
This study evaluated the impact on neck movement and muscle activity of placing documents in three commonly used locations: in-line, flat on the desktop to the left of the keyboard, and laterally placed level with the computer screen. Neck excursion during three standard head movements between the computer monitor and each document location, and neck extensor and upper trapezius muscle activity during a 5 min typing task for each of the document locations, were measured in 20 healthy participants. Results indicated that muscle activity and neck flexion were least when documents were placed laterally, suggesting it may be the optimal location. The desktop option produced both the greatest neck movement and the greatest muscle activity in all muscle groups. The in-line document location required significantly more neck flexion but less lateral flexion and rotation than the laterally placed document. Evaluation of other holders is needed to guide decision making for this commonly used office equipment. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
DataPlus - a revolutionary applications generator for DOS hand-held computers
David Dean; Linda Dean
2000-01-01
DataPlus allows the user to easily design data collection templates for DOS-based hand-held computers that mimic clipboard data sheets. The user designs and tests the application on the desktop PC and then transfers it to a DOS field computer. Other features include: error checking, missing data checks, and sensor input from RS-232 devices such as bar code wands,...
2-D Animation's Not Just for Mickey Mouse.
ERIC Educational Resources Information Center
Weinman, Lynda
1995-01-01
Discusses characteristics of two-dimensional (2-D) animation; highlights include character animation, painting issues, and motion graphics. Sidebars present Silicon Graphics animations tools and 2-D animation programs for the desktop computer. (DGM)
Metallurgical Plant Optimization Through the use of Flowsheet Simulation Modelling
NASA Astrophysics Data System (ADS)
Kennedy, Mark William
Modern metallurgical plants typically have complex flowsheets and operate on a continuous basis. Real time interactions within such processes can be complex and the impacts of streams such as recycles on process efficiency and stability can be highly unexpected prior to actual operation. Current desktop computing power, combined with state-of-the-art flowsheet simulation software like Metsim, allow for thorough analysis of designs to explore the interaction between operating rate, heat and mass balances and in particular the potential negative impact of recycles. Using plant information systems, it is possible to combine real plant data with simple steady state models, using dynamic data exchange links to allow for near real time de-bottlenecking of operations. Accurate analytical results can also be combined with detailed unit operations models to allow for feed-forward model-based-control. This paper will explore some examples of the application of Metsim to real world engineering and plant operational issues.
Simulating Isotope Enrichment by Gaseous Diffusion
NASA Astrophysics Data System (ADS)
Reed, Cameron
2015-04-01
A desktop-computer simulation of isotope enrichment by gaseous diffusion has been developed. The simulation incorporates two non-interacting point-mass species whose members pass through a cascade of cells containing porous membranes and retain constant speeds as they reflect off the walls of the cells and the spaces between holes in the membranes. A particular feature is periodic forward recycling of enriched material to cells further along the cascade along with simultaneous return of depleted material to preceding cells. The number of particles, the mass ratio, the initial fractional abundance of the lighter species, and the time between recycling operations can be chosen by the user. The simulation is simple enough to be understood on the basis of two-dimensional kinematics, and demonstrates that the fractional abundance of the lighter-isotope species increases along the cascade. The logic of the simulation will be described and results of some typical runs will be presented and discussed.
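The enrichment trend the simulation demonstrates can also be seen in a much simpler, non-kinematic sketch: applying the ideal effusion separation factor of roughly sqrt(m_heavy/m_light) once per cell (Graham's law). This is deliberately not the particle-tracking simulation described above, and the masses, cell count, and starting abundance are illustrative assumptions.

```python
# Minimal sketch in the ideal-effusion limit, not the kinematic simulation
# described above; masses, starting abundance and cell count are assumed.
import math

m_light, m_heavy = 235.0, 238.0            # masses of the two species (assumed)
alpha = math.sqrt(m_heavy / m_light)       # ideal single-cell separation factor
fraction = 0.0072                          # initial abundance of the lighter species (assumed)

for cell in range(1, 11):
    ratio = (fraction / (1.0 - fraction)) * alpha   # abundance ratio after one more cell
    fraction = ratio / (1.0 + ratio)
    print(f"cell {cell:2d}: lighter-species fraction = {fraction:.5f}")
```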
Gobba, Fabriziomaria
2017-01-01
The aim of this research is to study the symptoms and use of computers/mobile phones of individuals nearing retirement age (≥55 years). A questionnaire was sent to 15,000 Finns (aged 18–65). People who were ≥55 years of age were compared to the rest of the population. Six thousand one hundred and twenty-one persons responded to the questionnaire; 1226 of them were ≥55 years of age. Twenty-four percent of the ≥55-year-old respondents used desktop computers daily for leisure; 47.8% of them frequently experienced symptoms in the neck, and 38.5% in the shoulders. Workers aged ≥55 years had many more physical symptoms than younger people, except with respect to symptoms of the neck. Female daily occupational users of desktop computers had more physical symptoms in the neck. It is essential to take into account that, for people aged ≥55 years, the use of technology can be a sign of wellness. However, physical symptoms in the neck can be associated with the use of computers. PMID:28991182
2011-01-01
Background: Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results: We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion: The CloVR VM and associated architecture lower the barrier to entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105
Using the Power of Media to Communicate Science: A Question of Style?
ERIC Educational Resources Information Center
Imhof, Heidi
1991-01-01
Discusses educational effects of the style, content, and quality inherent in several multimedia and desktop-publishing products available to science teachers, including books, interactive software, videos, and computer simulations. (JJK)
... used to track you on all kinds of internet-connected devices that have browsers, such as smart phones, tablets, laptop and desktop computers. How does tracking in mobile apps occur? When you access mobile applications, companies don’t have access to ...
Greenhouse Gas Emissions Model (GEM) for Medium- and Heavy-Duty Vehicle Compliance
EPA’s Greenhouse Gas Emissions Model (GEM) is a free, desktop computer application that estimates the greenhouse gas (GHG) emissions and fuel efficiency performance of specific aspects of heavy-duty vehicles.
Demonstrate provider accessibility with desktop and online services.
2001-10-01
It's available on personal computers with a CD or through Internet access. Assess instantly the accessibility of your provider network or the most promising areas to establish a health service with new GIS tools.
Pest management in Douglas-fir seed orchards: a microcomputer decision method
James B. Hoy; Michael I. Haverty
1988-01-01
The computer program described provides a Douglas-fir seed orchard manager (user) with a quantitative method for making insect pest management decisions on a desk-top computer. The decision system uses site-specific information such as estimates of seed crop size, insect attack rates, insecticide efficacy and application costs, weather, and crop value. At sites where...
Computerized preparation of a scientific poster.
Lugo, M; Speaker, M G; Cohen, E J
1989-01-01
We prepared an attractive and effective poster using a Macintosh computer and LaserWriter and ImageWriter II printers. The advantages of preparing the poster in this fashion were increased control of the final product, decreased cost, and a sense of artistic satisfaction. Although we employed only the above-mentioned computer, the desktop publishing techniques described can be used with other systems.
Detecting Target Data in Network Traffic
2017-03-01
Data exfiltration over a network poses a threat ... phone. Further, Guri et al. were able to use these GSM frequencies to obtain information from a desktop computer by manipulating memory to produce GSM
National Mobile Inventory Model (NMIM)
The National Mobile Inventory Model (NMIM) is a free, desktop computer application developed by EPA to help you develop estimates of current and future emission inventories for on-road motor vehicles and nonroad equipment. To learn more search the archive
Lumped Parameter Model (LPM) for Light-Duty Vehicles
EPA’s Lumped Parameter Model (LPM) is a free, desktop computer application that estimates the effectiveness (CO2 Reduction) of various technology combinations or “packages,” in a manner that accounts for synergies between technologies.
Accelerating Monte Carlo simulations with an NVIDIA ® graphics processor
NASA Astrophysics Data System (ADS)
Martinsen, Paul; Blaschke, Johannes; Künnemeyer, Rainer; Jordan, Robert
2009-10-01
Modern graphics cards, commonly used in desktop computers, have evolved beyond a simple interface between processor and display to incorporate sophisticated calculation engines that can be applied to general purpose computing. The Monte Carlo algorithm for modelling photon transport in turbid media has been implemented on an NVIDIA® 8800 GT graphics card using the CUDA toolkit. The Monte Carlo method relies on following the trajectory of millions of photons through the sample, often taking hours or days to complete. The graphics-processor implementation, processing roughly 110 million scattering events per second, was found to run more than 70 times faster than a similar, single-threaded implementation on a 2.67 GHz desktop computer.
Program summary
Program title: Phoogle-C/Phoogle-G
Catalogue identifier: AEEB_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEB_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 51 264
No. of bytes in distributed program, including test data, etc.: 2 238 805
Distribution format: tar.gz
Programming language: C++
Computer: Designed for Intel PCs. Phoogle-G requires an NVIDIA graphics card with support for CUDA 1.1
Operating system: Windows XP
Has the code been vectorised or parallelized?: Phoogle-G is written for SIMD architectures
RAM: 1 GB
Classification: 21.1
External routines: Charles Karney random number library; Microsoft Foundation Class library; NVIDIA CUDA library [1]
Nature of problem: The Monte Carlo technique is an effective algorithm for exploring the propagation of light in turbid media. However, accurate results require tracing the path of many photons within the media. The independence of photons naturally lends the Monte Carlo technique to implementation on parallel architectures. Generally, parallel computing can be expensive, but recent advances in consumer grade graphics cards have opened the possibility of high-performance desktop parallel computing.
Solution method: In this pair of programmes we have implemented the Monte Carlo algorithm described by Prahl et al. [2] for photon transport in infinite scattering media to compare the performance of two readily accessible architectures: a standard desktop PC and a consumer grade graphics card from NVIDIA.
Restrictions: The graphics card implementation uses single precision floating point numbers for all calculations. Only photon transport from an isotropic point source is supported. The graphics-card version has no user interface; the simulation parameters must be set in the source code. The desktop version has a simple user interface; however some properties can only be accessed through an ActiveX client (such as Matlab).
Additional comments: The random number library used has an LGPL (http://www.gnu.org/copyleft/lesser.html) licence.
Running time: Runtime can range from minutes to months depending on the number of photons simulated and the optical properties of the medium.
References: [1] http://www.nvidia.com/object/cuda_home.html. [2] S. Prahl, M. Keijzer, S.L. Jacques, A. Welch, SPIE Institute Series 5 (1989) 102.
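The parallelism exploited here comes from the independence of individual photons. A minimal, CPU-only sketch of such a random walk, with isotropic scattering and illustrative optical coefficients (not Phoogle's actual model or parameters), is:

```python
# Minimal sketch of an isotropic-scattering photon random walk in an infinite
# turbid medium; coefficients and photon count are assumed, for illustration.
import numpy as np

rng = np.random.default_rng(0)
mu_a, mu_s = 0.1, 10.0                      # absorption / scattering coefficients (1/mm), assumed
mu_t = mu_a + mu_s
albedo = mu_s / mu_t
n_photons, n_steps = 100_000, 200

positions = np.zeros((n_photons, 3))
directions = rng.normal(size=(n_photons, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
weights = np.ones(n_photons)

for _ in range(n_steps):
    steps = -np.log(rng.random(n_photons)) / mu_t          # sampled free path lengths
    positions += directions * steps[:, None]
    weights *= albedo                                      # partial absorption at each event
    directions = rng.normal(size=(n_photons, 3))           # isotropic re-scattering
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)

print("mean distance from source:", np.linalg.norm(positions, axis=1).mean())
```

Each photon's loop body is independent of every other photon's, which is exactly what maps onto one GPU thread per photon in a CUDA implementation.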
Los Alamos radiation transport code system on desktop computing platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific work-stations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As the personal computer (PC) performance approaches that of the SWS, the hardware options for desk-top radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.
Farahani, Navid; Post, Robert; Duboy, Jon; Ahmed, Ishtiaque; Kolowitz, Brian J.; Krinchai, Teppituk; Monaco, Sara E.; Fine, Jeffrey L.; Hartman, Douglas J.; Pantanowitz, Liron
2016-01-01
Background: Digital slides obtained from whole slide imaging (WSI) platforms are typically viewed in two dimensions using desktop personal computer monitors or more recently on mobile devices. To the best of our knowledge, we are not aware of any studies viewing digital pathology slides in a virtual reality (VR) environment. VR technology enables users to be artificially immersed in and interact with a computer-simulated world. Oculus Rift is among the world's first consumer-targeted VR headsets, intended primarily for enhanced gaming. Our aim was to explore the use of the Oculus Rift for examining digital pathology slides in a VR environment. Methods: An Oculus Rift Development Kit 2 (DK2) was connected to a 64-bit computer running Virtual Desktop software. Glass slides from twenty randomly selected lymph node cases (ten with benign and ten malignant diagnoses) were digitized using a WSI scanner. Three pathologists reviewed these digital slides on a 27-inch 5K display and with the Oculus Rift after a 2-week washout period. Recorded endpoints included concordance of final diagnoses and time required to examine slides. The pathologists also rated their ease of navigation, image quality, and diagnostic confidence for both modalities. Results: There was 90% diagnostic concordance when reviewing WSI using a 5K display and Oculus Rift. The time required to examine digital pathology slides on the 5K display averaged 39 s (range 10–120 s), compared to 62 s with the Oculus Rift (range 15–270 s). All pathologists confirmed that digital pathology slides were easily viewable in a VR environment. The ratings for image quality and diagnostic confidence were higher when using the 5K display. Conclusion: Using the Oculus Rift DK2 to view and navigate pathology whole slide images in a virtual environment is feasible for diagnostic purposes. However, image resolution using the Oculus Rift device was limited. Interactive VR technologies such as the Oculus Rift are novel tools that may be of use in digital pathology. PMID:27217972
Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine
NASA Astrophysics Data System (ADS)
Boehm, J.; Liu, K.; Alis, C.
2016-06-01
In the geospatial domain we have now reached the point where data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore naturally lucrative to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not naturally supported by the existing big data frameworks. Instead such file formats are supported by software libraries that are restricted to single CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
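A sketch of the ingestion pattern described above, under stated assumptions: binaryFiles() distributes whole files to executors, and a flatMap calls a single-CPU reader on each one. The parse_points helper, the input path, and the plain-xyz binary layout are placeholders; a real pipeline would call a LAS/LAZ format library at that point.

```python
# Minimal sketch of map-based point cloud ingestion on Apache Spark; the path,
# binary layout and parse_points helper are illustrative placeholders.
import numpy as np
from pyspark.sql import SparkSession

def parse_points(filename_and_bytes):
    """Decode one whole point cloud file on a single executor core."""
    filename, raw = filename_and_bytes
    # A real pipeline would hand `raw` to a point cloud format library here;
    # this placeholder assumes a raw dump of float64 (x, y, z) triples.
    xyz = np.frombuffer(raw, dtype=np.float64).reshape(-1, 3)
    return [(filename, float(x), float(y), float(z)) for x, y, z in xyz]

spark = SparkSession.builder.appName("pointcloud-ingest").getOrCreate()
files = spark.sparkContext.binaryFiles("hdfs:///data/pointclouds/*.bin")   # assumed path
points = files.flatMap(parse_points)                                       # distributed ingestion
df = spark.createDataFrame(points, ["source", "x", "y", "z"])
print(df.count())
```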
Touch-screen tablet user configurations and case-supported tilt affect head and neck flexion angles.
Young, Justin G; Trudeau, Matthieu; Odell, Dan; Marinelli, Kim; Dennerlein, Jack T
2012-01-01
The aim of this study was to determine how head and neck postures vary when using two media tablet (slate) computers in four common user configurations. Fifteen experienced media tablet users completed a set of simulated tasks with two media tablets in four typical user configurations. The four configurations were: on the lap and held with the user's hands, on the lap and in a case, on a table and in a case, and on a table and in a case set at a high angle for watching movies. An infra-red LED marker based motion analysis system measured head/neck postures. Head and neck flexion significantly varied across the four configurations and across the two tablets tested. Head and neck flexion angles during tablet use were greater, in general, than angles previously reported for desktop and notebook computing. Postural differences between tablets were driven by case designs, which provided significantly different tilt angles, while postural differences between configurations were driven by gaze and viewing angles. Head and neck posture during tablet computing can be improved by placing the tablet higher to avoid low gaze angles (i.e. on a table rather than on the lap) and through the use of a case that provides optimal viewing angles.
Lemaire, Edward; Greene, G
2003-01-01
We produced continuing education material in physical rehabilitation using a variety of electronic media. We compared four methods of delivering the learning modules: in person with a computer projector, desktop videoconferencing, Web pages and CD-ROM. Health-care workers at eight community hospitals and two nursing homes were asked to participate in the project. A total of 394 questionnaires were received for all modalities: 73 for in-person sessions, 50 for desktop conferencing, 227 for Web pages and 44 for CD-ROM. This represents a 100% response rate from the in-person, desktop conferencing and CD-ROM groups; the response rate for the Web group is unknown, since the questionnaires were completed online. Almost all participants found the modules to be helpful in their work. The CD-ROM group gave significantly higher ratings than the Web page group, although all four learning modalities received high ratings. A combination of all four modalities would be required to provide the best possible learning opportunity.
Fabrication of low cost soft tissue prostheses with the desktop 3D printer
NASA Astrophysics Data System (ADS)
He, Yong; Xue, Guang-Huai; Fu, Jian-Zhong
2014-11-01
Soft tissue prostheses such as artificial ear, eye and nose are widely used in the maxillofacial rehabilitation. In this report we demonstrate how to fabricate soft prostheses mold with a low cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). Firstly the anatomy is scanned with a 3D scanner, then a tissue casting mold is designed on computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold by removing the staircase effect and acquiring a smooth surface. Finally, the last step is to cast medical grade silicone into the mold. After the silicone is cured, the fine soft prostheses can be removed from the mold. Utilizing the SPPC method, soft prostheses with smooth surface and complicated structure can be fabricated at a low cost. Accordingly, the total cost of fabricating ear prosthesis is about $30, which is much lower than the current soft prostheses fabrication methods.
Fabrication of low cost soft tissue prostheses with the desktop 3D printer
He, Yong; Xue, Guang-huai; Fu, Jian-zhong
2014-01-01
Soft tissue prostheses such as artificial ear, eye and nose are widely used in the maxillofacial rehabilitation. In this report we demonstrate how to fabricate soft prostheses mold with a low cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). Firstly the anatomy is scanned with a 3D scanner, then a tissue casting mold is designed on computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold by removing the staircase effect and acquiring a smooth surface. Finally, the last step is to cast medical grade silicone into the mold. After the silicone is cured, the fine soft prostheses can be removed from the mold. Utilizing the SPPC method, soft prostheses with smooth surface and complicated structure can be fabricated at a low cost. Accordingly, the total cost of fabricating ear prosthesis is about $30, which is much lower than the current soft prostheses fabrication methods. PMID:25427880
Cybersickness and desktop simulations: field of view effects and user experience
NASA Astrophysics Data System (ADS)
Toet, Alexander; de Vries, Sjoerd C.; van Emmerik, Martijn L.; Bos, Jelte E.
2008-04-01
We used a desktop computer game environment to study the effect of Field-of-View (FOV) on cybersickness. In particular, we examined the effect of differences between the internal FOV (iFOV, the FOV which the graphics generator is using to render its images) and the external FOV (eFOV, the FOV of the presented images as seen from the physical viewpoint of the observer). Somewhat counter-intuitively, we find that congruent iFOVs and eFOVs lead to a higher incidence of cybersickness. A possible explanation is that the incongruent conditions were too extreme, thereby reducing the experience of vection. We also studied the user experience (appraisal) of this virtual environment as a function of the degree of cybersickness. We find that cybersick participants experience the simulated environment as less pleasant and more arousing, and possibly also as more distressing. Our present findings have serious implications for desktop simulations used both in military and in civilian training, instruction and planning applications.
Open source OCR framework using mobile devices
NASA Astrophysics Data System (ADS)
Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan
2008-02-01
Mobile phones have evolved from passive one-to-one communication devices to powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet, and much more. Exciting new social applications are emerging on the mobile landscape, such as business card readers, sign detectors and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. However, with all these advancements, we find very little open source software available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none are available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community. This includes a popular open source OCR engine named Tesseract for text detection and recognition, and the Flite speech synthesis module for adding text-to-speech ability.
Fabrication of low cost soft tissue prostheses with the desktop 3D printer.
He, Yong; Xue, Guang-huai; Fu, Jian-zhong
2014-11-27
Soft tissue prostheses such as artificial ear, eye and nose are widely used in the maxillofacial rehabilitation. In this report we demonstrate how to fabricate soft prostheses mold with a low cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). Firstly the anatomy is scanned with a 3D scanner, then a tissue casting mold is designed on computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold by removing the staircase effect and acquiring a smooth surface. Finally, the last step is to cast medical grade silicone into the mold. After the silicone is cured, the fine soft prostheses can be removed from the mold. Utilizing the SPPC method, soft prostheses with smooth surface and complicated structure can be fabricated at a low cost. Accordingly, the total cost of fabricating ear prosthesis is about $30, which is much lower than the current soft prostheses fabrication methods.
Architectures for single-chip image computing
NASA Astrophysics Data System (ADS)
Gove, Robert J.
1992-04-01
This paper will focus on the architectures of VLSI programmable processing components for image computing applications. TI, the maker of industry-leading RISC, DSP, and graphics components, has developed an architecture for a new generation of image processors capable of implementing a plurality of image, graphics, video, and audio computing functions. We will show that the use of a single-chip heterogeneous MIMD parallel architecture best suits this class of processors--those which will dominate the desktop multimedia, document imaging, computer graphics, and visualization systems of this decade.
Practical advantages of evolutionary computation
NASA Astrophysics Data System (ADS)
Fogel, David B.
1997-10-01
Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
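The self-adaptation property mentioned above can be illustrated with a tiny (1+1) evolution strategy in which the mutation step size is itself mutated and inherited; the objective function, dimension and constants below are illustrative only.

```python
# Minimal sketch of a (1+1) evolution strategy with a self-adaptive mutation
# step size; objective and constants are illustrative.
import math
import random

def sphere(x):
    return sum(v * v for v in x)

def one_plus_one_es(dim=5, generations=500, seed=1):
    rng = random.Random(seed)
    parent = [rng.uniform(-5, 5) for _ in range(dim)]
    sigma = 1.0                          # mutation step size, adapted during the run
    tau = 1.0 / math.sqrt(dim)
    best = sphere(parent)
    for _ in range(generations):
        child_sigma = sigma * math.exp(tau * rng.gauss(0, 1))    # self-adaptation step
        child = [p + child_sigma * rng.gauss(0, 1) for p in parent]
        fitness = sphere(child)
        if fitness <= best:              # keep the child (and its step size) if it is no worse
            parent, sigma, best = child, child_sigma, fitness
    return best

print(one_plus_one_es())
```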
Hands in space: gesture interaction with augmented-reality interfaces.
Billinghurst, Mark; Piumsomboon, Tham; Huidong Bai
2014-01-01
Researchers at the Human Interface Technology Laboratory New Zealand (HIT Lab NZ) are investigating free-hand gestures for natural interaction with augmented-reality interfaces. They've applied the results to systems for desktop computers and mobile devices.
3 CFR 13589 - Executive Order 13589 of November 9, 2011. Promoting Efficient Spending
Code of Federal Regulations, 2012 CFR
2012-01-01
... more aggressive steps to ensure the Government is a good steward of taxpayer money. Sec. 2. Agency... the number of IT devices (e.g., mobile phones, smartphones, desktop and laptop computers, and tablet...
Computerized Fortune Cookies--a Classroom Treat.
ERIC Educational Resources Information Center
Reissman, Rose
1996-01-01
Discusses the use of fortune cookie fortunes in a middle school class combined with computer graphics, desktop publishing, and word processing technology to create writing assignments, games, and discussions. Topics include cultural contexts, and students creating their own fortunes. (LRW)
75 FR 20385 - Amended Certification Regarding Eligibility To Apply for Worker Adjustment Assistance
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
... Carolina. The notice will be published soon in the Federal Register. At the request of the State Agency... production of desktop computers. The company reports that workers leased from Staffing Solutions, South East...
NASA Technical Reports Server (NTRS)
Harrison, Cecil A.
1986-01-01
The efforts to automate the electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center were examined. A battery of nine standard tests is to be integrated by means of a desktop computer-controller in order to provide near real-time data assessment, store the data acquired during testing on flexible disk, and provide computer production of the certification report.
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC.
This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms also learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…
Use an Interactive Whiteboard: Get a Handle on How This Technology Can Spice up the Classroom
ERIC Educational Resources Information Center
Branzburg, Jeffrey
2006-01-01
Interactive whiteboards are desirable peripherals these days. When hooked up to a computer, the whiteboard's screen becomes a "live" computer desktop, which can be tapped to pull down menus, highlight, and move or open files. Users can also circle relevant sections on the projected image, draw geometric figures, and underline. Then they can save…
Information Technology: A Road to the Future? To Promote Academic Justice and Excellence Series.
ERIC Educational Resources Information Center
Gilbert, Steven W.; Green, Kenneth C.
This publication is intended to provide college faculty and staff with a guide to information technology issues in higher education. Midway through the 1990s, higher education confronts the second phase of the information technology (IT) revolution, a shift in emphasis from the computer as a desktop tool to the computer as a communications…
WinHPC System User Basics | High-Performance Computing | NREL
Guidance for starting to use this high-performance computing (HPC) system at NREL; also see the WinHPC policies. Log in with your NREL.gov username/password and remember to log out when finished, since simply quitting Remote Desktop will keep your session active and using resources.
Situated Computing: The Next Frontier for HCI Research
2002-01-01
population works and lives with information. Most individuals interact with information through a single portal: a personal desktop or laptop...of single devices, nor will one person necessarily own each device. This leap of imagination requires that human-computer interaction (HCI...wireless technologies, including Bluetooth [16], IrDA [22] (Infrared Data Association- standards for infrared communications) and HomeRF TM [21
Powering Down from the Bottom up: Greener Client Computing
ERIC Educational Resources Information Center
O'Donnell, Tom
2009-01-01
A decade ago, people wanting to practice "green computing" recycled their printer paper, turned their personal desktop systems off from time to time, and tried their best to donate old equipment to a nonprofit instead of throwing it away. A campus IT department can shave a few watts off just about any IT process--the real trick is planning and…
Visual attention for a desktop virtual environment with ambient scent
Toet, Alexander; van Schaik, Martin G.
2013-01-01
In the current study participants explored a desktop virtual environment (VE) representing a suburban neighborhood with signs of public disorder (neglect, vandalism, and crime), while being exposed to either room air (control group), or subliminal levels of tar (unpleasant; typically associated with burned or waste material) or freshly cut grass (pleasant; typically associated with natural or fresh material) ambient odor. They reported all signs of disorder they noticed during their walk together with their associated emotional response. Based on recent evidence that odors reflexively direct visual attention to (either semantically or affectively) congruent visual objects, we hypothesized that participants would notice more signs of disorder in the presence of ambient tar odor (since this odor may bias attention to unpleasant and negative features), and less signs of disorder in the presence of ambient grass odor (since this odor may bias visual attention toward the vegetation in the environment and away from the signs of disorder). Contrary to our expectations the results provide no indication that the presence of an ambient odor affected the participants’ visual attention for signs of disorder or their emotional response. However, the paradigm used in present study does not allow us to draw any conclusions in this respect. We conclude that a closer affective, semantic, or spatiotemporal link between the contents of a desktop VE and ambient scents may be required to effectively establish diagnostic associations that guide a user’s attention. In the absence of these direct links, ambient scent may be more diagnostic for the physical environment of the observer as a whole than for the particular items in that environment (or, in this case, items represented in the VE). PMID:24324453
NASA Astrophysics Data System (ADS)
Santagati, C.; Inzerillo, L.; Di Paola, F.
2013-07-01
3D reconstruction from images has undergone a revolution in the last few years. Computer vision techniques use collections of photographs to rapidly build detailed 3D models. The simultaneous application of different multi-view stereo (MVS) algorithms and different techniques for image matching, feature extraction and mesh optimization is an active field of research in computer vision. The results are promising: the obtained models are beginning to challenge the precision of laser-based reconstructions. Among all the possibilities we can mainly distinguish desktop and web-based packages. The latter offer the opportunity to exploit the power of cloud computing to carry out semi-automatic data processing, allowing the user to perform other tasks on their computer, whereas desktop systems demand long processing times and heavier hardware. Computer vision researchers have explored many ways to verify the visual accuracy of 3D models, but approaches to verify metric accuracy are few, and none addresses Autodesk 123D Catch applied to architectural heritage documentation. Our approach to this challenging problem is to compare 3D models produced by Autodesk 123D Catch with 3D models produced by terrestrial LIDAR, considering different object sizes, from details (capitals, moldings, bases) to large-scale buildings, for practitioner purposes.
A desktop 3D printer with dual extruders to produce customised electronic circuitry
NASA Astrophysics Data System (ADS)
Butt, Javaid; Onimowo, Dominic Adaoiza; Gohrabian, Mohammed; Sharma, Tinku; Shirvani, Hassan
2018-03-01
3D printing has opened new horizons for the manufacturing industry in general, and 3D printers have become the tools for technological advancements. There is a huge divide between the pricing of industrial and desktop 3D printers, with the former being on the expensive side, capable of producing excellent quality products, and the latter being on the low-cost side with moderate quality results. However, there is more room for improvement and enhancement in the desktop systems as compared to the industrial ones. In this paper, a desktop 3D printer called Prusa Mendel i2 has been modified and integrated with an additional extruder so that the system can work with dual extruders and produce bespoke electronic circuits. The communication between the two extruders has been established by making use of the In-Circuit Serial Programming port on the Arduino Uno controlling the printer. The biggest challenge is to control the flow of electric paint (to be dispensed by the new extruder), and CFD (Computational Fluid Dynamics) analysis has been carried out to ascertain the optimal conditions for proper dispensing. The final product is a customised electronic circuit with a base of plastic (from the 3D printer's extruder) and electronic paint (from the additional extruder) properly dispensed to create a live circuit on a plastic platform. This low-cost enhancement to a desktop 3D printer can provide a new prospect for producing multi-material parts, where the additional extruder can be filled with any material that can be properly dispensed from its nozzle.
GBOOST: a GPU-based tool for detecting gene-gene interactions in genome-wide case control studies.
Yung, Ling Sing; Yang, Can; Wan, Xiang; Yu, Weichuan
2011-05-01
Collecting millions of genetic variations is feasible with the advanced genotyping technology. With a huge amount of genetic variations data in hand, developing efficient algorithms to carry out the gene-gene interaction analysis in a timely manner has become one of the key problems in genome-wide association studies (GWAS). Boolean operation-based screening and testing (BOOST), a recent work in GWAS, completes gene-gene interaction analysis in 2.5 days on a desktop computer. Compared with central processing units (CPUs), graphic processing units (GPUs) are highly parallel hardware and provide massive computing resources. We are, therefore, motivated to use GPUs to further speed up the analysis of gene-gene interactions. We implement the BOOST method based on a GPU framework and name it GBOOST. GBOOST achieves a 40-fold speedup compared with BOOST. It completes the analysis of Wellcome Trust Case Control Consortium Type 2 Diabetes (WTCCC T2D) genome data within 1.34 h on a desktop computer equipped with Nvidia GeForce GTX 285 display card. GBOOST code is available at http://bioinformatics.ust.hk/BOOST.html#GBOOST.
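The inner loop that BOOST reduces to Boolean operations, and that GBOOST spreads across GPU threads, is essentially contingency-table counting for every SNP pair. A plain NumPy sketch of that counting step for one randomly generated pair, with an assumed 0/1/2 genotype coding, is:

```python
# Minimal sketch of the counting step behind BOOST-style screening; the random
# genotypes and 0/1/2 coding are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 2000
snp_a = rng.integers(0, 3, n_samples)       # genotypes coded 0/1/2
snp_b = rng.integers(0, 3, n_samples)
phenotype = rng.integers(0, 2, n_samples)   # 0 = control, 1 = case

# flatten (genotype_a, genotype_b, phenotype) into one index and count occurrences
index = (snp_a * 3 + snp_b) * 2 + phenotype
table = np.bincount(index, minlength=18).reshape(3, 3, 2)
print(table)   # counts per genotype combination, split by case/control status
```

Repeating this for every pair of SNPs is what makes the problem so large, and why bit-level counting and GPU parallelism pay off.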
NASA Astrophysics Data System (ADS)
Pérez, Israel; Ángel Hernández Cuevas, José; Trinidad Elizalde Galindo, José
2018-05-01
We designed and developed a desktop AC susceptometer for the characterization of materials. The system consists of a lock-in amplifier, an AC function generator, a couple of coils, a sample holder, a computer running custom software written in C++ and distributed as freeware, and an Arduino card coupled to a Bluetooth module. The Arduino/Bluetooth serial interface allows the user to connect to almost any computer and thus avoids the problem of connectivity between the computer and the peripherals, such as the lock-in amplifier and the function generator. The Bluetooth transmitter/receiver used is a commercial device which is robust and fast. These new features reduce the size and increase the versatility of the susceptometer, for it can be used with a simple laptop. To test our instrument, we performed measurements on magnetic materials and show that the system is reliable at both room temperature and cryogenic temperatures (77 K). The instrument is suitable for any physics or engineering laboratory, for either research or academic purposes.
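A sketch of the kind of read-out loop such an Arduino/Bluetooth serial interface implies, using pyserial: the port name, baud rate, and comma-separated in-phase/quadrature line format are assumptions for illustration, not the instrument's documented protocol.

```python
# Minimal sketch, assuming a pyserial connection, a placeholder port name and a
# "in_phase,quadrature" line format; none of these reflect the real protocol.
import serial   # pyserial

def read_susceptibility(port="/dev/rfcomm0", baud=9600, n_points=100):
    readings = []
    with serial.Serial(port, baud, timeout=2) as link:
        for _ in range(n_points):
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue          # timeout or empty line
            try:
                in_phase, quadrature = (float(value) for value in line.split(","))
            except ValueError:
                continue          # skip malformed lines
            readings.append((in_phase, quadrature))
    return readings

if __name__ == "__main__":
    for chi_real, chi_imag in read_susceptibility():
        print(chi_real, chi_imag)
```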
Lessons from a doctoral thesis.
Peiris, A N; Mueller, R A; Sheridan, D P
1990-01-01
The production of a doctoral thesis is a time-consuming affair that until recently was done in conjunction with professional publishing services. Advances in computer technology have made many sophisticated desktop publishing techniques available to the microcomputer user. We describe the computer method used, the problems encountered, and the solutions improvised in the production of a doctoral thesis by computer. The Apple Macintosh was selected for its ease of use and intrinsic graphics capabilities. A scanner was used to incorporate text from published papers into a word processing program. The body of the text was updated and supplemented with new sections. Scanned graphics from the published papers were less suitable for publication, and the original data were replotted and modified with a graphics-drawing program. Graphics were imported and incorporated in the text. Final hard copy was produced by a laser printer and bound with both conventional and rapid new binding techniques. Microcomputer-based desktop processing methods provide a rapid and cost-effective means of communicating the written word. We anticipate that this evolving technology will have increased use by physicians in both the private and academic sectors.
What caused the breach? An examination of use of information technology and health data breaches.
Wikina, Suanu Bliss
2014-01-01
Data breaches arising from theft, loss, unauthorized access/disclosure, improper disclosure, or hacking incidents involving personal health information continue to increase every year. As of September 2013, reported breaches affecting individuals reached close to 27 million since 2009, when compilation of records on breaches began. These breaches, which involved 674 covered entities and 153 business associates, involved computer systems and networks, desktop computers, laptops, paper, e-mail, electronic health records, and removable/portable devices (CDs, USBs, x-ray films, backup tapes, etc.). Even with the increased use of health information technology by health institutions and allied businesses, theft and loss (not hacking) constitute the major types of data breaches encountered. Removable/portable devices, desktop computers, and laptops were the top sources or locations of the breached information, while the top six states-Virginia, Illinois, California, Florida, New York, and Tennessee-in terms of the number of reported breaches accounted for nearly 75 percent of the total individual breaches, 33 percent of breaches in covered entities, and about 30 percent of the total breaches involving business associates.
What Caused the Breach? An Examination of Use of Information Technology and Health Data Breaches
Wikina, Suanu Bliss
2014-01-01
Data breaches arising from theft, loss, unauthorized access/disclosure, improper disclosure, or hacking incidents involving personal health information continue to increase every year. As of September 2013, reported breaches affecting individuals reached close to 27 million since 2009, when compilation of records on breaches began. These breaches, which involved 674 covered entities and 153 business associates, involved computer systems and networks, desktop computers, laptops, paper, e-mail, electronic health records, and removable/portable devices (CDs, USBs, x-ray films, backup tapes, etc.). Even with the increased use of health information technology by health institutions and allied businesses, theft and loss (not hacking) constitute the major types of data breaches encountered. Removable/portable devices, desktop computers, and laptops were the top sources or locations of the breached information, while the top six states—Virginia, Illinois, California, Florida, New York, and Tennessee—in terms of the number of reported breaches accounted for nearly 75 percent of the total individual breaches, 33 percent of breaches in covered entities, and about 30 percent of the total breaches involving business associates. PMID:25593574
Yang, Yiqun; Urban, Matthew W; McGough, Robert J
2018-05-15
Shear wave calculations induced by an acoustic radiation force are very time-consuming on desktop computers, and high-performance graphics processing units (GPUs) achieve dramatic reductions in the computation time for these simulations. The acoustic radiation force is calculated using the fast near-field method and the angular spectrum approach, and then the shear waves are calculated in parallel with Green's functions on a GPU. This combination enables rapid evaluation of shear waves for push beams with different spatial samplings and for apertures with different f/#. Relative to shear wave simulations that evaluate the same algorithm on an Intel i7 desktop computer, a high performance nVidia GPU reduces the time required for these calculations by a factor of 45 and 700 when applied to elastic and viscoelastic shear wave simulation models, respectively. These GPU-accelerated simulations were also compared to measurements in different viscoelastic phantoms, and the results are similar. For parametric evaluations and for comparisons with measured shear wave data, shear wave simulations with the Green's function approach are ideally suited for high-performance GPUs.
NASA Technical Reports Server (NTRS)
Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil
2008-01-01
We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].
Review of Collaborative Tools for Planning and Engineering
2007-10-01
including PDAs) and Operating Systems 1 In general, should support laptops, desktops, Windows OS, Mac OS, Palm OS, Windows CE, Blackberry, Sun...better), voting (to establish operating parameters), reactor design, wind tunnel simulation. Display same material on every computer, synchronisation
Hyped Type: An Exercise in Creative Typography.
ERIC Educational Resources Information Center
Osterer, Irv
2001-01-01
Provides a history of typography and discusses the effects of technology. Describes an art project in which high school students designed contemporary typographic specimen sheets. Explains that the students created their own broadsheets using Macintosh computers and QuarkXPress desktop publishing. (CMK)
Report #10-P-0194, August 23, 2010. Although EPA indicated it could avoid spending more than $115.4 million over 8.5 years by consolidating the desktop computing environment, improved management practices are needed.
Connecting to HPC VPN | High-Performance Computing | NREL
and password will match your NREL network account login/password. From OS X or Linux, open a terminal. Open a Remote Desktop connection using server name WINHPC02 (this is the login node).
ERIC Educational Resources Information Center
Jordan, Jim
1988-01-01
Summarizes how infographics are produced and how they provide information graphically in high school publications. Offers suggestions concerning information gathering, graphic format, and software selection, and provides examples of computer/student-designed infographics. (MM)
Designing Communication and Learning Environments.
ERIC Educational Resources Information Center
Gayeski, Diane M., Ed.
Designing and remodeling educational facilities are becoming more complex with options that include computer-based collaboration, classrooms with multimedia podiums, conference centers, and workplaces with desktop communication systems. This book provides a collection of articles that address educational facility design categorized in the…
Cloud Based Electronic Health Record Applications are Essential to Expeditionary Patient Care
2017-05-01
security and privacy concerns). Privacy/Security Risks of Cloud Computing: A quantitative study based on the preceding literature review...to medical IT wherever there is a Wi-Fi connection and a computing device (desktop, laptop, tablet, phone, etc.). In 2015 the DoD launched MiCare, a...Hosting Services: a Study on Students' Acceptance," Computers in Human Behavior, 2013. Takai, Teri. DoD CIO's 10-Point Plan for IT Modernization
Simulating an Exploding Fission-Bomb Core
NASA Astrophysics Data System (ADS)
Reed, Cameron
2016-03-01
A time-dependent desktop-computer simulation of the core of an exploding fission bomb (nuclear weapon) has been developed. The simulation models a core comprising a mixture of two isotopes: a fissile one (such as U-235) and an inert one (such as U-238) that captures neutrons and removes them from circulation. The user sets the enrichment percentage and scattering and fission cross-sections of the fissile isotope, the capture cross-section of the inert isotope, the number of neutrons liberated per fission, the number of "initiator" neutrons, the radius of the core, and the neutron-reflection efficiency of a surrounding tamper. The simulation, which is predicated on ordinary kinematics, follows the three-dimensional motions and fates of neutrons as they travel through the core. Limitations of time and computer memory render it impossible to model a real-life core, but results of numerous runs clearly demonstrate the existence of a critical mass for a given set of parameters and the dramatic effects of enrichment and tamper efficiency on the growth (or decay) of the neutron population. The logic of the simulation will be described and results of typical runs will be presented and discussed.
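The abstract above does not include the simulation's code, but the mechanism it describes (neutrons taking random flights through a finite core and being scattered, captured, or causing fission) can be illustrated with a short Monte Carlo sketch. The sketch below is a minimal stand-in with made-up cross-sections, no tamper reflection, and a fixed number of initiator neutrons; it only shows how generation-by-generation neutron counts grow or decay.

```python
# Minimal Monte Carlo sketch of neutron multiplication in a bare spherical core.
# Illustrative only: cross-sections, radius, and the absence of a tamper are
# simplified placeholders, not the parameters of the simulation described above.
import numpy as np

rng = np.random.default_rng(0)

RADIUS = 8.0            # core radius, cm (hypothetical)
SIGMA_FISSION = 0.05    # macroscopic fission cross-section, 1/cm (hypothetical)
SIGMA_SCATTER = 0.20    # macroscopic scattering cross-section, 1/cm
SIGMA_CAPTURE = 0.01    # macroscopic capture cross-section, 1/cm
NU = 2.5                # mean neutrons released per fission
SIGMA_TOTAL = SIGMA_FISSION + SIGMA_SCATTER + SIGMA_CAPTURE

def random_direction():
    """Isotropic unit vector."""
    cos_t = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * np.pi)
    sin_t = np.sqrt(1.0 - cos_t**2)
    return np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])

def follow_generation(positions):
    """Transport one generation of neutrons; return birth sites of the next."""
    next_gen = []
    for pos in positions:
        pos = pos.copy()
        while True:
            # Distance to the next interaction, sampled from an exponential law.
            pos = pos + random_direction() * rng.exponential(1.0 / SIGMA_TOTAL)
            if np.linalg.norm(pos) > RADIUS:          # escaped the core (no tamper)
                break
            u = rng.uniform(0.0, SIGMA_TOTAL)
            if u < SIGMA_FISSION:                     # fission: emit ~NU new neutrons
                for _ in range(rng.poisson(NU)):
                    next_gen.append(pos.copy())
                break
            elif u < SIGMA_FISSION + SIGMA_CAPTURE:   # captured and removed
                break
            # otherwise scattered: keep walking from the new position
    return next_gen

population = [np.zeros(3) for _ in range(100)]        # "initiator" neutrons at the centre
for gen in range(10):
    population = follow_generation(population)
    print(f"generation {gen}: {len(population)} neutrons")
```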
ERIC Educational Resources Information Center
Griffin, Irma Amado
This study describes a pilot program utilizing various multimedia computer programs on a MacQuadra 840 AV. The target group consisted of six advanced dance students who participated in the pilot program within the dance curriculum by creating a database of dance movement using video and still photography. The students combined desktop publishing,…
Design of MPPT Controller Monitoring Software Based on QT Framework
NASA Astrophysics Data System (ADS)
Meng, X. Z.; Lu, P. G.
2017-10-01
The MPPT controller is a hardware device that tracks the maximum power point of a solar photovoltaic array. Multiple controllers can operate in a networked mode using a specific communication protocol. In this article, based on C++ GUI programming with the Qt framework, we designed a desktop application for monitoring and analyzing the operational parameters of MPPT controllers. The network is built on the Modbus protocol in Remote Terminal Unit mode, and the host-computer desktop application connects to all the controllers in the network through RS485 or ZigBee wireless communication. Using this application, users can monitor controller parameters from anywhere over the internet.
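The abstract names Modbus in Remote Terminal Unit (RTU) mode as the protocol linking the host application to the controllers. As a hedged illustration of what such a host has to do at the byte level, the sketch below builds a standard Modbus RTU "read holding registers" request with its CRC-16 in Python; the slave address and register layout are hypothetical, since the paper's register map is not given, and the real application is a C++/Qt program rather than Python.

```python
# Minimal sketch of a Modbus RTU "read holding registers" request, as a host
# application might poll an MPPT controller over RS485. The slave address and
# register map below are hypothetical.
import struct

def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS over the frame bytes."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers(slave: int, start: int, count: int) -> bytes:
    """Build an RTU request frame: address, function 0x03, start, count, CRC."""
    pdu = struct.pack(">BBHH", slave, 0x03, start, count)
    crc = crc16_modbus(pdu)
    return pdu + struct.pack("<H", crc)   # CRC is transmitted low byte first

# Example: ask controller 1 for two registers starting at address 0x0000
# (e.g. panel voltage and current in a hypothetical register map).
frame = read_holding_registers(slave=1, start=0x0000, count=2)
print(frame.hex(" "))   # -> 01 03 00 00 00 02 c4 0b
```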
ERIC Educational Resources Information Center
Cutsinger, John
1988-01-01
Explains how a high school literary magazine staff accessed the journalism department's Apple Macintosh computers to typeset its publication. Provides examples of magazine layouts designed partially or completely by "Pagemaker" software on a Macintosh. (MM)
2014-05-01
natural choice. In this document, we describe several aspects of video streaming and the challenges of performing video streaming between Android-based...client application was needed. Typically something like VideoLAN Client (VLC) is used for this purpose in a desktop environment. However, while VLC is...a very mature application on Windows and Linux, VLC for Android is still in a beta testing phase, and versions have only been developed to work
Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments
Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu
2017-01-01
High-end graphics processing units (GPUs), such as NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied to high-performance computing over the past decade. These desktop GPU cards must be installed in personal computers or servers with desktop CPUs, so the cost and power consumption of constructing a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, called Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture cores (belonging to the Kepler GPU family). Jetson TK1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied in several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed; that work also showed that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio, by comparing the STK platform with desktop CPUs and GPUs. In this work, an embedded GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup are necessary first; then, two job assignment modes are designed for the MTK platform to provide services for users. Finally, ClustalW v2.0.11 and ClustalWtk are ported to the MTK platform. The experimental results showed that the speedup ratios reached 5.5 and 4.8 times for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing 6 TK1s with a single TK1. The MTK platform is thus proven to be useful for multiple sequence alignments. PMID:28835734
Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis
NASA Technical Reports Server (NTRS)
Bradley, James R.
2012-01-01
This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required to develop that model is then determined; we conclude that six hours of training are required to teach these skills to MBA students. The simulation presented here contains all the fundamental functionality of a simulation model, so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year of an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
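To make the idea of the "fundamental functionality" of a discrete-event model concrete, the sketch below implements the core machinery the paper refers to, a future-event list processed in time order, for a single-server queue. It is written in Python rather than Excel purely for compactness, and the arrival and service rates are arbitrary illustrative values, not figures from the paper.

```python
# Minimal discrete-event simulation sketch: a single-server queue driven by a
# time-ordered event list. Rates are made up for illustration.
import heapq
import random

random.seed(1)
ARRIVAL_RATE = 1.0      # customers per time unit (hypothetical)
SERVICE_RATE = 1.25     # services per time unit (hypothetical)
HORIZON = 10_000.0

events = [(random.expovariate(ARRIVAL_RATE), "arrival")]   # (time, kind) heap
queue_len = 0
server_busy = False
completed = 0
queue_area = 0.0        # time-integral of queue length
last_t = 0.0

while events:
    t, kind = heapq.heappop(events)
    if t > HORIZON:
        break
    queue_area += queue_len * (t - last_t)
    last_t = t
    if kind == "arrival":
        # Schedule the next arrival, then either start service or join the queue.
        heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
        if server_busy:
            queue_len += 1
        else:
            server_busy = True
            heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
    else:  # departure
        completed += 1
        if queue_len > 0:
            queue_len -= 1
            heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "departure"))
        else:
            server_busy = False

print(f"served {completed} customers, mean queue length {queue_area / last_t:.2f}")
```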
ANNA: A Convolutional Neural Network Code for Spectroscopic Analysis
NASA Astrophysics Data System (ADS)
Lee-Brown, Donald; Anthony-Twarog, Barbara J.; Twarog, Bruce A.
2018-01-01
We present ANNA, a Python-based convolutional neural network code for the automated analysis of stellar spectra. ANNA provides a flexible framework that allows atmospheric parameters such as temperature and metallicity to be determined with accuracies comparable to those of established but less efficient techniques. ANNA performs its parameterization extremely quickly; typically several thousand spectra can be analyzed in less than a second. Additionally, the code incorporates features which greatly speed up the training process necessary for the neural network to measure spectra accurately, resulting in a tool that can easily be run on a single desktop or laptop computer. Thus, ANNA is useful in an era when spectrographs increasingly have the capability to collect dozens to hundreds of spectra each night. This talk will cover the basic features included in ANNA and demonstrate its performance in two use cases: an open cluster abundance analysis involving several hundred spectra, and a metal-rich field star study. Applicability of the code to large survey datasets will also be discussed.
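The abstract does not specify ANNA's network architecture, so the sketch below is only a generic 1D convolutional network of the kind described: it maps a spectrum to two stellar labels (for example temperature and metallicity) and runs a single training step on random stand-in data. Layer sizes, the 4000-pixel input length, and the optimizer settings are arbitrary assumptions, and PyTorch is used simply as a convenient framework.

```python
# Generic 1D convolutional network mapping a stellar spectrum to two parameters
# (e.g. Teff and [Fe/H]). Illustrative sketch only, not ANNA's architecture.
import torch
import torch.nn as nn

class SpectrumCNN(nn.Module):
    def __init__(self, n_pixels: int = 4000, n_labels: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 8, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(8, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * (n_pixels // 16), 64), nn.ReLU(),
            nn.Linear(64, n_labels),
        )

    def forward(self, x):            # x: (batch, 1, n_pixels)
        return self.head(self.features(x))

model = SpectrumCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One illustrative training step on random stand-in data.
spectra = torch.randn(32, 1, 4000)          # batch of normalised spectra
labels = torch.randn(32, 2)                 # scaled (Teff, [Fe/H]) targets
optimizer.zero_grad()
loss = loss_fn(model(spectra), labels)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.3f}")
```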
CaveCAD: a tool for architectural design in immersive virtual environments
NASA Astrophysics Data System (ADS)
Schulze, Jürgen P.; Hughes, Cathleen E.; Zhang, Lelin; Edelstein, Eve; Macagno, Eduardo
2014-02-01
Existing 3D modeling tools were designed to run on desktop computers with monitor, keyboard and mouse. To make 3D modeling possible with mouse and keyboard, many 3D interactions, such as point placement or translations of geometry, had to be mapped to the 2D parameter space of the mouse, possibly supported by mouse buttons or keyboard keys. We hypothesize that had the designers of these existing systems been able to assume immersive virtual reality systems as their target platforms, they would have designed 3D interactions much more intuitively. In collaboration with professional architects, we created a simple but complete 3D modeling tool for virtual environments from the ground up, using direct 3D interaction wherever possible and adequate. In this publication, we present our approaches to interactions for typical 3D modeling functions, such as geometry creation, modification of existing geometry, and assignment of surface materials. We also discuss preliminary user experiences with this system.
Simulating and Detecting Radiation-Induced Errors for Onboard Machine Learning
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri L.; Bornstein, Benjamin; Granat, Robert; Tang, Benyang; Turmon, Michael
2009-01-01
Spacecraft processors and memory are subjected to high radiation doses and therefore employ radiation-hardened components. However, these components are orders of magnitude more expensive than typical desktop components, and they lag years behind in terms of speed and size. We have integrated algorithm-based fault tolerance (ABFT) methods into onboard data analysis algorithms to detect radiation-induced errors, which ultimately may permit the use of spacecraft memory that need not be fully hardened, reducing cost and increasing capability at the same time. We have also developed a lightweight software radiation simulator, BITFLIPS, that permits evaluation of error detection strategies in a controlled fashion, including the specification of the radiation rate and selective exposure of individual data structures. Using BITFLIPS, we evaluated our error detection methods when using a support vector machine to analyze data collected by the Mars Odyssey spacecraft. We found ABFT error detection for matrix multiplication is very successful, while error detection for Gaussian kernel computation still has room for improvement.
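The matrix-multiplication case mentioned above is the textbook example of algorithm-based fault tolerance: append a column-checksum row to one factor and a row-checksum column to the other, and a flipped element of the product betrays itself as an inconsistent row and column checksum. The NumPy sketch below shows that standard scheme; it is illustrative only and not the authors' onboard implementation.

```python
# Classic algorithm-based fault tolerance (ABFT) for matrix multiplication:
# augment A with a column-checksum row and B with a row-checksum column, so a
# corrupted element of C = A @ B shows up as an inconsistent checksum.
import numpy as np

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 5))

# Checksum-augmented factors.
A_c = np.vstack([A, A.sum(axis=0)])                   # extra row = column sums of A
B_r = np.hstack([B, B.sum(axis=1, keepdims=True)])    # extra column = row sums of B
C_full = A_c @ B_r                                    # checksum-augmented product

# Simulate a radiation-induced error in one element of the product.
C_faulty = C_full.copy()
C_faulty[1, 2] += 1e3

def abft_check(C, tol=1e-8):
    """Return (row, col) of a detected error, or None if checksums agree."""
    data = C[:-1, :-1]
    bad_rows = np.where(np.abs(data.sum(axis=1) - C[:-1, -1]) > tol)[0]
    bad_cols = np.where(np.abs(data.sum(axis=0) - C[-1, :-1]) > tol)[0]
    if bad_rows.size and bad_cols.size:
        return bad_rows[0], bad_cols[0]
    return None

print(abft_check(C_full))     # None: checksums consistent
print(abft_check(C_faulty))   # (1, 2): error detected and located
```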
DOT National Transportation Integrated Search
2013-01-01
The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...
Teach Your Computer to Read: Scanners and Optical Character Recognition.
ERIC Educational Resources Information Center
Marsden, Jim
1993-01-01
Desktop scanners can be used with a software technology called optical character recognition (OCR) to convert the text on virtually any paper document into an electronic form. OCR offers educators new flexibility in incorporating text into tests, lesson plans, and other materials. (MLF)
Personal Computers and Laser Printers Are Becoming Popular Tools for Creating Documents on Campuses.
ERIC Educational Resources Information Center
DeLoughry, Thomas J.
1987-01-01
Desktop publishing techniques are bringing control over institutional newsletters, catalogues, brochures, and many other print materials directly to the author's office. The technology also has the potential for integrating campus information systems and saving much time and money. (MSE)
CWRUnet--Case History of a Campus-Wide Fiber-to-the-Desktop Network.
ERIC Educational Resources Information Center
Neff, Raymond K.; Haigh, Peter J.
1992-01-01
This article describes the development at Case Western Reserve University of an all-fiber optic communications network linking 7,300 outlets (faculty offices, student residences, classrooms, libraries, and laboratories) with computer data, television, audio, facsimile, and image information services. (Author/DB)
NASA Technical Reports Server (NTRS)
2008-01-01
NASA s advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.
NASA's Climate in a Box: Desktop Supercomputing for Open Scientific Model Development
NASA Astrophysics Data System (ADS)
Wojcik, G. S.; Seablom, M. S.; Lee, T. J.; McConaughy, G. R.; Syed, R.; Oloso, A.; Kemp, E. M.; Greenseid, J.; Smith, R.
2009-12-01
NASA's High Performance Computing Portfolio in cooperation with its Modeling, Analysis, and Prediction program intends to make its climate and earth science models more accessible to a larger community. A key goal of this effort is to open the model development and validation process to the scientific community at large such that a natural selection process is enabled and results in a more efficient scientific process. One obstacle to others using NASA models is the complexity of the models and the difficulty in learning how to use them. This situation applies not only to scientists who regularly use these models but also non-typical users who may want to use the models such as scientists from different domains, policy makers, and teachers. Another obstacle to the use of these models is that access to high performance computing (HPC) accounts, from which the models are implemented, can be restrictive with long wait times in job queues and delays caused by an arduous process of obtaining an account, especially for foreign nationals. This project explores the utility of using desktop supercomputers in providing a complete ready-to-use toolkit of climate research products to investigators and on demand access to an HPC system. One objective of this work is to pre-package NASA and NOAA models so that new users will not have to spend significant time porting the models. In addition, the prepackaged toolkit will include tools, such as workflow, visualization, social networking web sites, and analysis tools, to assist users in running the models and analyzing the data. The system architecture to be developed will allow for automatic code updates for each user and an effective means with which to deal with data that are generated. We plan to investigate several desktop systems, but our work to date has focused on a Cray CX1. Currently, we are investigating the potential capabilities of several non-traditional development environments. While most NASA and NOAA models are designed for Linux operating systems (OS), the arrival of the WindowsHPC 2008 OS provides the opportunity to evaluate the use of a new platform on which to develop and port climate and earth science models. In particular, we are evaluating Microsoft's Visual Studio Integrated Developer Environment to determine its appropriateness for the climate modeling community. In the initial phases of this project, we have ported GEOS-5, WRF, GISS ModelE, and GFS to Linux on a CX1 and are in the process of porting WRF and ModelE to WindowsHPC 2008. Initial tests on the CX1 Linux OS indicate favorable comparisons in terms of performance and consistency of scientific results when compared with experiments executed on NASA high end systems. As in the past, NASA's large clusters will continue to be an important part of our objectives. We envision a seamless environment in which an investigator performs model development and testing on a desktop system and can seamlessly transfer execution to supercomputer clusters for production.
2016-09-01
and network. The computing and network hardware are identified and include routers, servers, firewalls, laptops, backup hard drives, smart phones...deployable hardware units will be necessary. This includes the use of ruggedized laptops and desktop computers, a projector system, communications system...ENGINEERING STUDY AND CONCEPT DEVELOPMENT FOR A HUMANITARIAN AID AND DISASTER RELIEF OPERATIONS MANAGEMENT PLATFORM by Julie A. Reed September
2011-10-01
Fortunately, some products offer centralized management and deployment tools for local desktop implementation. Figure 5 illustrates the...implementation of a secure desktop infrastructure based on virtualization. It includes an overview of desktop virtualization, including an in-depth...environment in the data centre, whereas LHVD places it on the endpoint itself. Desktop virtualization implementation considerations and potential
Development of a Wireless Computer Vision Instrument to Detect Biotic Stress in Wheat
Casanova, Joaquin J.; O'Shaughnessy, Susan A.; Evett, Steven R.; Rush, Charles M.
2014-01-01
Knowledge of crop abiotic and biotic stress is important for optimal irrigation management. While spectral reflectance and infrared thermometry provide a means to quantify crop stress remotely, these measurements can be cumbersome. Computer vision offers an inexpensive way to remotely detect crop stress independent of vegetation cover. This paper presents a technique using computer vision to detect disease stress in wheat. Digital images of differentially stressed wheat were segmented into soil and vegetation pixels using expectation maximization (EM). In the first season, the algorithm to segment vegetation from soil and distinguish between healthy and stressed wheat was developed and tested using digital images taken in the field and later processed on a desktop computer. In the second season, a wireless camera with near real-time computer vision capabilities was tested in conjunction with the conventional camera and desktop computer. For wheat irrigated at different levels and inoculated with wheat streak mosaic virus (WSMV), vegetation hue determined by the EM algorithm showed significant effects from irrigation level and infection. Unstressed wheat had a higher hue (118.32) than stressed wheat (111.34). In the second season, the hue and cover measured by the wireless computer vision sensor showed significant effects from infection (p = 0.0014), as did the conventional camera (p < 0.0001). Vegetation hue obtained through a wireless computer vision system in this study is a viable option for determining biotic crop stress in irrigation scheduling. Such a low-cost system could be suitable for use in the field in automated irrigation scheduling applications. PMID:25251410
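The segmentation step described above (expectation maximization separating vegetation from soil, followed by a hue statistic for the vegetation pixels) can be sketched with an off-the-shelf Gaussian mixture model, which is fitted by EM. The code below is an illustrative stand-in: it runs on random pixel data, uses a two-component mixture on hue alone, and labels the component nearest green as vegetation, which is a simplification of the published pipeline.

```python
# Sketch of EM-based segmentation of vegetation vs soil on per-pixel hue.
# The image data, hue scaling (0-360 degrees), and "closest to green" rule are
# illustrative, not the exact pipeline used with the wireless camera.
import numpy as np
from sklearn.mixture import GaussianMixture
from matplotlib.colors import rgb_to_hsv

# Stand-in for an image: random RGB values in [0, 1] with shape (H, W, 3).
rng = np.random.default_rng(7)
image = rng.random((120, 160, 3))

hue = rgb_to_hsv(image.reshape(-1, 3))[:, 0] * 360.0   # hue in degrees

# EM clustering of hue into two components (soil vs vegetation).
gmm = GaussianMixture(n_components=2, random_state=0).fit(hue.reshape(-1, 1))
labels = gmm.predict(hue.reshape(-1, 1))

# Call the component whose mean hue is closer to green (~120 deg) "vegetation".
veg_component = int(np.argmin(np.abs(gmm.means_.ravel() - 120.0)))
veg_hue = hue[labels == veg_component]

print(f"vegetation cover: {veg_hue.size / hue.size:.1%}")
print(f"mean vegetation hue: {veg_hue.mean():.1f} degrees")
```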
ERIC Educational Resources Information Center
Yee, Kevin; Hargis, Jace
2010-01-01
This article discusses the benefits of screencasts and their instructional uses. Well-known for some years to advanced technology users, Screen Capture Software (SCS) offers the promise of recording action on the computer desktop together with voiceover narration, all combined into a single movie file that can be shared, emailed, or uploaded.…
The Technological Evolution in Schools: Reflections and Projections.
ERIC Educational Resources Information Center
Higgins, James E.
1991-01-01
Presents a first-person account of one teacher's experiences with computer hardware and software. The article discusses various programs and applications, such as integrated learning systems, database searching via CD-ROM, desktop publishing, authoring programs, and indicates future changes in instruction with increasing use of technology. (SM)
Incorporating a Human-Computer Interaction Course into Software Development Curriculums
ERIC Educational Resources Information Center
Janicki, Thomas N.; Cummings, Jeffrey; Healy, R. Joseph
2015-01-01
Individuals have increasing options on retrieving information related to hardware and software. Specific hardware devices include desktops, tablets and smart devices. Also, the number of software applications has significantly increased the user's capability to access data. Software applications include the traditional web site, smart device…
Massive Query Resolution for Rapid Selective Dissemination of Information.
ERIC Educational Resources Information Center
Cohen, Jonathan D.
1999-01-01
Outlines an efficient approach to performing query resolution which, when matched with a keyword scanner, offers rapid selecting and routing for massive Boolean queries, and which is suitable for implementation on a desktop computer. Demonstrates the system's operation with large examples in a practical setting. (AEF)
Medical Recording Tools for Biodosimetry in Radiation Incidents
2005-01-01
assistant) devices. To accomplish this, the second edition of the AFRRI handbook will be redesigned, using Adobe FrameMaker desktop publishing...Handbook will be redesigned for display on hand-held computer devices, using Adobe FrameMaker desktop publishing software. Portions of the text
TOOLS FOR PRESENTING SPATIAL AND TEMPORAL PATTERNS OF ENVIRONMENTAL MONITORING DATA
The EPA Health Effects Research Laboratory has developed this data presentation tool for use with a variety of types of data which may contain spatial and temporal patterns of interest. The technology links mainframe computing power to the new generation of "desktop publishing" ha...
Controlling Robots with Personal Computers.
ERIC Educational Resources Information Center
Singer, Andrew; Rony, Peter
1983-01-01
Discusses new robots that are mechanical arms small enough to sit on a desktop. They offer scaled-down price and performance, but are able to handle light production tasks such as spray painting or part orientation. (Available from W. C. Publications Inc., P.O. Box 1578, Montclair, NJ 07042.) (JOW)
NASA Astrophysics Data System (ADS)
Yang, Yiqun; Urban, Matthew W.; McGough, Robert J.
2018-05-01
Shear wave calculations induced by an acoustic radiation force are very time-consuming on desktop computers, and high-performance graphics processing units (GPUs) achieve dramatic reductions in the computation time for these simulations. The acoustic radiation force is calculated using the fast near field method and the angular spectrum approach, and then the shear waves are calculated in parallel with Green's functions on a GPU. This combination enables rapid evaluation of shear waves for push beams with different spatial samplings and for apertures with different f/#. Relative to shear wave simulations that evaluate the same algorithm on an Intel i7 desktop computer, a high performance nVidia GPU reduces the time required for these calculations by a factor of 45 and 700 when applied to elastic and viscoelastic shear wave simulation models, respectively. These GPU-accelerated simulations are also compared to measurements in different viscoelastic phantoms, and the results are similar. For parametric evaluations and for comparisons with measured shear wave data, shear wave simulations with the Green's function approach are ideally suited for high-performance GPUs.
LOSCAR: Long-term Ocean-atmosphere-Sediment CArbon cycle Reservoir Model v2.0.4
NASA Astrophysics Data System (ADS)
Zeebe, R. E.
2012-01-01
The LOSCAR model is designed to efficiently compute the partitioning of carbon between ocean, atmosphere, and sediments on time scales ranging from centuries to millions of years. While a variety of computationally inexpensive carbon cycle models are already available, many are missing a critical sediment component, which is indispensable for long-term integrations. One of LOSCAR's strengths is the coupling of ocean-atmosphere routines to a computationally efficient sediment module. This allows, for instance, adequate computation of CaCO3 dissolution, calcite compensation, and long-term carbon cycle fluxes, including weathering of carbonate and silicate rocks. The ocean component includes various biogeochemical tracers such as total carbon, alkalinity, phosphate, oxygen, and stable carbon isotopes. LOSCAR's configuration of ocean geometry is flexible and allows for easy switching between modern and paleo-versions. We have previously published applications of the model tackling future projections of ocean chemistry and weathering, pCO2 sensitivity to carbon cycle perturbations throughout the Cenozoic, and carbon/calcium cycling during the Paleocene-Eocene Thermal Maximum. The focus of the present contribution is the detailed description of the model including numerical architecture, processes and parameterizations, tuning, and examples of input and output. Typical CPU integration times of LOSCAR are of order seconds for several thousand model years on current standard desktop machines. The LOSCAR source code in C can be obtained from the author by sending a request to loscar.model@gmail.com.
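LOSCAR's own equations and parameter values are given in the paper rather than in this abstract, so the sketch below only illustrates the general box-model structure it builds on: reservoirs exchanging carbon through fluxes integrated forward in time. The two-reservoir system, exchange time scale, and emission pulse are round illustrative numbers, not LOSCAR parameters.

```python
# Toy two-reservoir (atmosphere <-> surface ocean) carbon exchange, integrated
# with scipy, to illustrate the box-model structure that LOSCAR generalises to
# many ocean boxes plus sediments. All numbers are illustrative, not LOSCAR values.
import numpy as np
from scipy.integrate import solve_ivp

M_ATM0, M_OCN0 = 600.0, 900.0    # initial carbon inventories, GtC (illustrative)
TAU_EXCHANGE = 20.0              # air-sea equilibration time scale, years (illustrative)

def emissions(t):
    """A 1000 GtC pulse released uniformly over the first 100 years."""
    return 10.0 if t < 100.0 else 0.0

def rhs(t, y):
    m_atm, m_ocn = y
    # Flux that relaxes the two reservoirs back toward their initial ratio.
    flux = (m_atm - M_ATM0 * (m_ocn / M_OCN0)) / TAU_EXCHANGE
    return [emissions(t) - flux, flux]

sol = solve_ivp(rhs, (0.0, 2000.0), [M_ATM0, M_OCN0], max_step=1.0)
print(f"atmospheric carbon after 2000 years: {sol.y[0, -1]:.0f} GtC")
```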
Desktop Systems for Manufacturing Carbon Nanotube Films by Chemical Vapor Deposition
2007-06-01
existing low cost tube furnace designs limit the researcher’s ability to fully separate critical reaction parameters such as temperature and flow...Often heated using an external resistive heater coil, a typical configuration, shown in Figure 4, might place a tube made of a non-reactive ...researcher’s ability to fully separate critical parameters such as temperature and flow profiles. Additionally, the use of heating elements external to
Image Harvest: an open-source platform for high-throughput plant image processing and analysis
Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal
2016-01-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
NASA Astrophysics Data System (ADS)
Li, J.; Zhang, T.; Huang, Q.; Liu, Q.
2014-12-01
Today's climate datasets are characterized by large volumes, a high degree of spatiotemporal complexity, and fast evolution over time. As visualizing large volume distributed climate datasets is computationally intensive, traditional desktop-based visualization applications fail to handle the computational intensity. Recently, scientists have developed remote visualization techniques to address the computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built based on Paraview, which is one of the most popular open source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support the deployment of the platform. In this platform, all climate datasets are regular grid data which are stored in NetCDF format. Three types of data access methods are supported in the platform: accessing remote datasets provided by OpenDAP servers, accessing datasets hosted on the web visualization server and accessing local datasets. Despite different data access methods, all visualization tasks are completed at the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computation limitation of desktop-based visualization applications.
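Of the three data-access methods listed above, the OPeNDAP path is easy to illustrate: a client (or the visualization server) opens a remote NetCDF dataset by URL and reads only the slices it needs. The sketch below does this with the netCDF4 library; the URL and variable name are placeholders, not endpoints from the platform described.

```python
# Sketch of the OPeNDAP data-access path: open a remote dataset by URL and read
# one time slice. The URL and variable name are hypothetical placeholders.
from netCDF4 import Dataset   # requires a netCDF4 build with OPeNDAP support

URL = "http://example.org/thredds/dodsC/climate/tas_monthly.nc"   # hypothetical

with Dataset(URL) as ds:                       # no bulk download: data read lazily
    tas = ds.variables["tas"]                  # e.g. surface air temperature
    first_slice = tas[0, :, :]                 # fetch only the first time step
    print(tas.dimensions, first_slice.shape, float(first_slice.mean()))
```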
The CosmicWatch Desktop Muon Detector: a self-contained, pocket sized particle detector
NASA Astrophysics Data System (ADS)
Axani, S. N.; Frankiewicz, K.; Conrad, J. M.
2018-03-01
The CosmicWatch Desktop Muon Detector is a self-contained, hand-held cosmic ray muon detector that is valuable for astro/particle physics research applications and outreach. The material cost of each detector is under $100, and it takes a novice student approximately four hours to build their first detector. The detectors are powered via a USB connection and the data can either be recorded directly to a computer or to a microSD card. Arduino- and Python-based software is provided to operate the detector, along with an online application to plot the data in real-time. In this paper, we describe the various design features, evaluate the performance, and illustrate the detector's capabilities by providing several example measurements.
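Since the detector streams data over its USB serial connection, a minimal logging script is easy to sketch. The snippet below uses pyserial to read and save event lines; the port name, baud rate, and one-line-per-event format are assumptions for illustration, and the project's own Python tools should be used for the real data format.

```python
# Minimal sketch of logging events from a USB serial link with pyserial.
# Port name, baud rate, and the ASCII line-per-event format are assumptions.
import serial   # pip install pyserial

PORT = "/dev/ttyUSB0"     # hypothetical; on Windows this might be "COM3"
BAUD = 9600               # assumed baud rate

with serial.Serial(PORT, BAUD, timeout=2.0) as det, open("muons.txt", "w") as log:
    for _ in range(100):                      # record 100 events, then stop
        line = det.readline().decode(errors="replace").strip()
        if line:                              # skip timeouts / empty reads
            print(line)
            log.write(line + "\n")
```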
Computing at DESY — current setup, trends and strategic directions
NASA Astrophysics Data System (ADS)
Ernst, Michael
1998-05-01
Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. Running mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has a multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide for clear and homogeneous concepts in the desktop area. Given that lowest level commodity hardware draws more and more attention, combined with the financial constraints we are facing already today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially addressing the PC management/support issues, are expected to come to market in the next 2-3 years, we need to provide for suitable solutions now. Buying PCs at DESY currently at a rate of about 30/month will otherwise absorb any available manpower in central computing and still will leave hundreds of unhappy people alone. Though certainly not the only region, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.
A discrete Fourier transform for virtual memory machines
NASA Technical Reports Server (NTRS)
Galant, David C.
1992-01-01
An algebraic theory of the Discrete Fourier Transform is developed in great detail. Examination of the details of the theory leads to a computationally efficient fast Fourier transform for the use on computers with virtual memory. Such an algorithm is of great use on modern desktop machines. A FORTRAN coded version of the algorithm is given for the case when the sequence of numbers to be transformed is a power of two.
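The report's FORTRAN algorithm is not reproduced in this abstract, but the general idea behind FFTs that behave well under virtual memory is to factor the transform so that memory is touched in large contiguous blocks. The NumPy sketch below shows one standard such factorization (the "four-step" split of a length N1*N2 transform into column FFTs, a twiddle multiplication, and row FFTs); it is a generic illustration, not the specific algorithm from the report.

```python
# Four-step FFT sketch: a length n1*n2 DFT computed as column FFTs, a twiddle
# multiply, and row FFTs, which keeps each pass working on contiguous blocks.
import numpy as np

def four_step_fft(x: np.ndarray, n1: int, n2: int) -> np.ndarray:
    """Length n1*n2 DFT via the four-step factorization."""
    assert x.size == n1 * n2
    X = x.reshape(n1, n2)                       # x[n2*j1 + j2] -> X[j1, j2]
    A = np.fft.fft(X, axis=0)                   # length-n1 DFTs down the columns
    k1 = np.arange(n1).reshape(-1, 1)
    j2 = np.arange(n2).reshape(1, -1)
    A *= np.exp(-2j * np.pi * k1 * j2 / (n1 * n2))   # twiddle factors
    C = np.fft.fft(A, axis=1)                   # length-n2 DFTs along the rows
    return C.T.reshape(-1)                      # reorder output: y[k1 + n1*k2]

rng = np.random.default_rng(0)
x = rng.standard_normal(1024) + 1j * rng.standard_normal(1024)
y = four_step_fft(x, 32, 32)
print(np.allclose(y, np.fft.fft(x)))            # True
```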
AstroGrid: Taverna in the Virtual Observatory .
NASA Astrophysics Data System (ADS)
Benson, K. M.; Walton, N. A.
This paper reports on the implementation of the Taverna workbench by AstroGrid, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analyses over public and private data and requires computational resources as minimal as a desktop computer. Some integration issues and future work are discussed in this article.
When Everyone Is a Probe, Everyone Is a Learner
ERIC Educational Resources Information Center
Berenfeld, Boris; Krupa, Tatiana; Lebedev, Arseny; Stafeev, Sergey
2014-01-01
Most students globally have mobile devices and the Global Students Laboratory (GlobalLab) project is integrating mobility into learning. First launched in 1991, GlobalLab builds a community of learners engaged in collaborative, distributed investigations. Long relying on stationary desktop computers, or students inputting their observations by…
Software Reviews: Programs Worth a Second Look.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1989
1989-01-01
Reviews three computer software programs: (1) "The Children's Writing and Publishing Center"--writing and creative arts, grades 2-8, Apple II; (2) "Slide Shop"--graphics and desktop presentations, grades 4-12, Apple II and IBM; and (3) "Solve It"--problem solving and language arts, grades 4-12, Apple II. (MVL)
Managing Information Technology in Academic Medical Centers: A "Multicultural" Experience.
ERIC Educational Resources Information Center
Friedman, Charles P.; Corn, Milton; Krumrey, Arthur; Perry, David R.; Stevens, Ronald H.
1998-01-01
Examines how beliefs and concerns of academic medicine's diverse professional cultures affect management of information technology. Two scenarios, one dealing with standardization of desktop personal computers and the other with publication of syllabi on an institutional intranet, form the basis for an exercise in which four prototypical members…
The Library Macintosh at SCIL [Small Computers in Libraries]'88.
ERIC Educational Resources Information Center
Valauskas, Edward J.; And Others
1988-01-01
The first of three papers describes the role of Macintosh workstations in a library. The second paper explains why the Macintosh was selected for end-user searching in an academic library, and the third discusses advantages and disadvantages of desktop publishing for librarians. (8 references) (MES)
Ideas without Words--Internationalizing Business Presentations.
ERIC Educational Resources Information Center
Sondak, Norman; Sondak, Eileen
This paper presents elements of the computer graphics environment including information on: Lotus 1-2-3; Apple Macintosh; Desktop Publishing; Object-Oriented Programming; and Microsoft's Windows 3. A brief scenario illustrates the use of the minimization principle in presenting a new product to a group of international financiers. A taxonomy of…
Most Social Scientists Shun Free Use of Supercomputers.
ERIC Educational Resources Information Center
Kiernan, Vincent
1998-01-01
Social scientists, who frequently complain that the federal government spends too little on them, are passing up what scholars in the physical and natural sciences see as the government's best give-aways: free access to supercomputers. Some social scientists say the supercomputers are difficult to use; others find desktop computers provide…
Classrooms for the Millennials: An Approach for the Next Generation
ERIC Educational Resources Information Center
Gerber, Lindsey N.; Ward, Debra D.
2016-01-01
The purpose of this paper is to introduce educators to three types of applets that are compatible with smartphones, tablets, and desktop computers: screencasting applets, graphing calculator applets, and student response applets. The applets discussed can be seamlessly and effectively integrated into classrooms to help facilitate lectures, collect…
Evaluating Technology Integration in the Elementary School: A Site-Based Approach.
ERIC Educational Resources Information Center
Mowe, Richard
This book enables educators at the elementary level to conduct formative evaluations of their technology programs in minimum time. Most of the technology is computer related, including word processing, graphics, desktop publishing, spreadsheets, databases, instructional software, programming, and telecommunications. The design of the book is aimed…
The Impact on Homes with a School-Provided Laptop
ERIC Educational Resources Information Center
Bu Shell, Shawna M.
2012-01-01
For decades, educational policy advocates have argued for providing technology to students to enhance their learning environments. From filmstrips (Oppenheimer, 1997) to desktop computers (Cuban, 2002) to laptops (Silvernail, 2009; Warschauer, 2006), they have attempted to change the environment and style in which students learn, and how tools can…
ERIC Educational Resources Information Center
Van Horn, Royal
2001-01-01
Several years after the first audiovisual Macintosh computer appeared, most educators are still oblivious of this technology. Almost every other economic sector (including the porn industry) makes abundant use of digital and streaming video. Desktop movie production is so easy that primary grade students can do it. Tips are provided. (MLH)
Colleges' Effort To Prepare for Y2K May Yield Benefits for Many Years.
ERIC Educational Resources Information Center
Olsen, Florence
2000-01-01
Suggests that the money spent ($100 billion) to fix the Y2K bug in the United States resulted in improved campus computer systems. Reports from campuses around the country indicate that both mainframe and desktop systems experienced fewer problems than expected. (DB)
Student Record Automating Using Desktop Computer Technologies.
ERIC Educational Resources Information Center
Almerico, Gina M.; Baker, Russell K.; Matassini, Norma
Teacher education programs nationwide are required by state and federal governments to maintain comprehensive student records of all current and graduated students in their programs. A private, mid-sized university established a faculty team to analyze record-keeping procedures to comply with these government requirements. The team's mandate was…
Coal-seismic computer programs in BASIC: Part I; Store, plot, and edit array data
Hasbrouck, Wilfred P.
1979-01-01
Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in an extended BASIC language specially augmented for acceptance by the Tektronix 4051 Graphic System. This report presents five computer programs used to store, plot, and edit array data for the line, cross, and triangle arrays commonly employed in our coal-seismic investigations. * Use of brand names in this report is for descriptive purposes only and does not constitute endorsement by the U.S. Geological Survey.
The spinal posture of computing adolescents in a real-life setting
2014-01-01
Background: It is assumed that good postural alignment is associated with a lower likelihood of musculoskeletal pain symptoms. Interventions encouraging good sitting postures have not reported consequent reductions in musculoskeletal pain in school-based populations, possibly due to a lack of clear understanding of good posture. This paper therefore describes the variability of postural angles in a cohort of asymptomatic high-school students whilst working on desktop computers in a school computer classroom, and reports on the relationship between the postural angles and age, gender, height, weight and computer use. Methods: The baseline data from a 12 month longitudinal study are reported. The study was conducted in South African school computer classrooms. 194 Grade 10 high-school students, from randomly selected high-schools, aged 15–17 years, enrolled in Computer Application Technology for the first time, asymptomatic during the preceding month, and from whom written informed consent was obtained, participated in the study. The 3D Posture Analysis Tool captured five postural angles (head flexion, neck flexion, cranio-cervical angle, trunk flexion and head lateral bend) while the students were working on desktop computers. Height, weight and computer use were also measured. Individual and combinations of postural angles were analysed. Results: 944 students were screened for eligibility, of which the data of 194 students are reported. Trunk flexion was the most variable angle. Increased neck flexion and the combination of increased head flexion, neck flexion and trunk flexion were significantly associated with increased weight and BMI (p = 0.0001). Conclusions: High-school students sit with greater ranges of trunk flexion (leaning forward or reclining) when using the classroom computer. Increased weight is significantly associated with increased sagittal plane postural angles. PMID:24950887
Volunteered Cloud Computing for Disaster Management
NASA Astrophysics Data System (ADS)
Evans, J. D.; Hao, W.; Chettri, S. R.
2014-12-01
Disaster management relies increasingly on interpreting earth observations and running numerical models; which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects; automates reconfiguration of their virtual machines; ensures accountability for donated computing; and optimizes the use of "interstitial" computing. Initial applications include fire detection from multispectral satellite imagery and flood risk mapping through hydrological simulations.
Azimi, Parham; Zhao, Dan; Pouzet, Claire; Crain, Neil E; Stephens, Brent
2016-02-02
Previous research has shown that desktop 3D printers can emit large numbers of ultrafine particles (UFPs, particles less than 100 nm) and some hazardous volatile organic compounds (VOCs) during printing, although very few filament and 3D printer combinations have been tested to date. Here we quantify emissions of UFPs and speciated VOCs from five commercially available filament extrusion desktop 3D printers utilizing up to nine different filaments by controlled experiments in a test chamber. Median estimates of time-varying UFP emission rates ranged from ∼10^8 to ∼10^11 min^-1 across all tested combinations, varying primarily by filament material and, to a lesser extent, bed temperature. The individual VOCs emitted in the largest quantities included caprolactam from nylon-based and imitation wood and brick filaments (ranging from ∼2 to ∼180 μg/min), styrene from acrylonitrile butadiene styrene (ABS) and high-impact polystyrene (HIPS) filaments (ranging from ∼10 to ∼110 μg/min), and lactide from polylactic acid (PLA) filaments (ranging from ∼4 to ∼5 μg/min). Results from a screening analysis of potential exposure to these products in a typical small office environment suggest caution should be used when operating many of the printer and filament combinations in poorly ventilated spaces or without the aid of combined gas and particle filtration systems.
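The "screening analysis" mentioned at the end of the abstract is, in spirit, a well-mixed single-zone mass balance: at steady state the indoor concentration is the emission rate divided by the product of room volume and total loss rate. The sketch below works through that arithmetic with round, assumed values for the office volume, air-exchange rate, deposition rate, and printer emission rate; none of these numbers are taken from the study.

```python
# Screening-level, well-mixed single-zone estimate of the ultrafine particle
# concentration from one printer: C_ss = E / (V * (lambda_air + k_dep)).
# All values below are illustrative assumptions, not figures from the study.
V = 30.0            # office volume, m^3
lambda_air = 1.0    # air exchange rate, 1/h
k_dep = 1.3         # particle deposition loss rate, 1/h (assumed)
E = 1e10 * 60.0     # UFP emission rate: 1e10 particles/min expressed per hour

C_ss = E / (V * (lambda_air + k_dep))      # particles per m^3 at steady state
print(f"steady-state UFP concentration ≈ {C_ss:.2e} particles/m^3")
print(f"                              ≈ {C_ss / 1e6:.0f} particles/cm^3")
```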
2015-11-01
provided by a stand-alone desktop or hand-held computing device. This introduces into the discussion a large number of mobile, tactical command...control, communications, and computer (C4) systems across the Services. A couple of examples are mobile command posts mounted on the back of an M1152... infrastructure (DCPI). This term encompasses on-site backup generators, switchgear, uninterruptible power supplies (UPS), power distribution units
Evaluating computer capabilities in a primary care practice-based research network.
Ariza, Adolfo J; Binns, Helen J; Christoffel, Katherine Kaufer
2004-01-01
We wanted to assess computer capabilities in a primary care practice-based research network and to understand how receptive the practices were to new ideas for automation of practice activities and research. This study was conducted among members of the Pediatric Practice Research Group (PPRG). A survey to assess computer capabilities was developed to explore hardware types, software programs, Internet connectivity and data transmission; views on privacy and security; and receptivity to future electronic data collection approaches. Of the 40 PPRG practices participating in the study during the autumn of 2001, all used IBM-compatible systems. Of these, 45% used stand-alone desktops, 40% had networked desktops, and approximately 15% used laptops and minicomputers. A variety of software packages were used, with most practices (82%) having software for some aspect of patient care documentation, patient accounting (90%), business support (60%), and management reports and analysis (97%). The main obstacles to expanding use of computers in patient care were insufficient staff training (63%) and privacy concerns (82%). If provided with training and support, most practices indicated they were willing to consider an array of electronic data collection options for practice-based research activities. There is wide variability in hardware and software use in the pediatric practice setting. Implementing electronic data collection in the PPRG would require a substantial start-up effort and ongoing training and support at the practice site.
Developing and validating an instrument for measuring mobile computing self-efficacy.
Wang, Yi-Shun; Wang, Hsiu-Yuan
2008-08-01
IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.
Economic analysis of cloud-based desktop virtualization implementation at a hospital
2012-01-01
Background: Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computer environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. Methods: This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. Results: The results of the five-year cost-benefit analysis given for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. From our sensitivity analysis of changing the number of VMs (in terms of number of users), the greater the number of adopted VMs was, the more investable the system was. Conclusions: This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting. PMID:23110661
Economic analysis of cloud-based desktop virtualization implementation at a hospital.
Yoo, Sooyoung; Kim, Seok; Kim, Taeki; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee
2012-10-30
Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computer environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. The results of the five-year cost-benefit analysis given for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. From our sensitivity analysis of changing the number of VMs (in terms of number of users), the greater the number of adopted VMs, the more investable the system was. This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.
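The VDI study above reports its findings with standard capital-budgeting indexes (break-even year, ROI, NPV and IRR). As a minimal illustration of how such indexes are computed, the Python sketch below uses hypothetical cash flows and a hypothetical discount rate; the actual SNUBH cost and benefit figures are not given in the abstract and are not reproduced here.

```python
# Minimal capital-budgeting sketch for a VDI-style investment appraisal.
# The cash flows and discount rate below are illustrative assumptions,
# not the values used in the SNUBH study.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 (up-front) flow."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return found by bisection on the NPV function."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

def roi(cash_flows):
    """Simple (undiscounted) return on investment over the whole horizon."""
    investment = -cash_flows[0]
    gains = sum(cash_flows[1:])
    return (gains - investment) / investment

# Hypothetical example: an up-front cost followed by five years of savings.
flows = [-500_000, 120_000, 150_000, 170_000, 190_000, 200_000]
print(f"NPV @ 8%: {npv(0.08, flows):,.0f}")
print(f"IRR: {irr(flows):.1%}")
print(f"ROI: {roi(flows):.1%}")
```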
Technical Writing Teachers and the Challenges of Desktop Publishing.
ERIC Educational Resources Information Center
Kalmbach, James
1988-01-01
Argues that technical writing teachers must understand desktop publishing. Discusses the strengths that technical writing teachers bring to desktop publishing, and the impact desktop publishing will have on technical writing courses and programs. (ARH)
Freiberger, Manuel; Egger, Herbert; Liebmann, Manfred; Scharfetter, Hermann
2011-11-01
Image reconstruction in fluorescence optical tomography is a three-dimensional nonlinear ill-posed problem governed by a system of partial differential equations. In this paper we demonstrate that a combination of state-of-the-art numerical algorithms and a carefully hardware-optimized implementation makes it possible to solve this large-scale inverse problem in a few seconds on standard desktop PCs with modern graphics hardware. In particular, we present methods to solve not only the forward but also the non-linear inverse problem by massively parallel programming on graphics processors. A comparison of optimized CPU and GPU implementations shows that the reconstruction can be accelerated by factors of about 15 through the use of the graphics hardware without compromising the accuracy of the reconstructed images.
Automated micromanipulation desktop station based on mobile piezoelectric microrobots
NASA Astrophysics Data System (ADS)
Fatikow, Sergej
1996-12-01
One of the main problems of present-day research on microsystem technology (MST) is to assemble a whole microsystem from different microcomponents. This paper presents a new concept of an automated micromanipulation desktop station including piezoelectrically driven microrobots placed on a high-precision x-y stage of a light microscope, a CCD-camera as a local sensor subsystem, a laser sensor unit as a global sensor subsystem, a parallel computer system with C167 microcontrollers, and a Pentium PC equipped additionally with an optical grabber. The microrobots can perform high-precision manipulations (with an accuracy of up to 10 nm) and nondestructive transport (at a speed of about 3 cm/sec) of very small objects under the microscope. To control the desktop station automatically, an advanced control system that includes a task planning level and a real-time execution level is being developed. The main function of the task planning subsystem is to interpret the implicit action plan and to generate a sequence of explicit operations which are sent to the execution level of the control system. The main functions of the execution control level are object recognition, image processing, and feedback position control of the microrobot and the microscope stage.
Cross-Platform User Interface of E-Learning Applications
ERIC Educational Resources Information Center
Stoces, Michal; Masner, Jan; Jarolímek, Jan; Šimek, Pavel; Vanek, Jirí; Ulman, Miloš
2015-01-01
The paper discusses the development of Web educational services for specific groups. A key feature is to allow the display and use of educational materials and training services on the widest possible set of different devices, especially in the browsers of classic desktop computers, notebooks, tablets, and mobile phones, and also on different readers for…
Blocking of Goal-Location Learning Based on Shape
ERIC Educational Resources Information Center
Alexander, Tim; Wilson, Stuart P.; Wilson, Paul N.
2009-01-01
Using desktop, computer-simulated virtual environments (VEs), the authors conducted 5 experiments to investigate blocking of learning about a goal location based on Shape B as a consequence of preliminary training to locate that goal using Shape A. The shapes were large 2-dimensional horizontal figures on the ground. Blocking of spatial learning…
High Resolution Displays In The Apple Macintosh And IBM PC Environments
NASA Astrophysics Data System (ADS)
Winegarden, Steven
1989-07-01
High resolution displays are one of the key elements that distinguish user oriented document finishing or publishing stations. A number of factors have been involved in bringing these to the desktop environment. At Sigma Designs we have concentrated on enhancing the capabilities of IBM PCs and compatibles and Apple Macintosh computer systems.
Notebooks, Handhelds, and Software in Physical Education (Grades 5-8)
ERIC Educational Resources Information Center
Mohnsen, Bonnie
2005-01-01
Heart monitors, pedometers, and now virtual reality-based equipment (e.g., Cyberbikes, "Dance Dance Revolution") have been embraced by physical educators as technologies worth using in the physical education program; however, the use of computers (be it a desktop, notebook, or handheld) in the physical education instructional program, has not been…
Epilepsy Forewarning Using A Hand-Held Device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hively, LM
2005-02-21
Over the last decade, ORNL has developed and patented a novel approach for forewarning of a large variety of machine and biomedical events. The present implementation uses desktop computers to analyze archival data. This report describes the next logical step in this effort, namely use of a hand-held device for the analysis.
Full Immersive Virtual Environment Cave[TM] in Chemistry Education
ERIC Educational Resources Information Center
Limniou, M.; Roberts, D.; Papadopoulos, N.
2008-01-01
By comparing two-dimensional (2D) chemical animations designed for the computer's desktop with three-dimensional (3D) chemical animations designed for the full immersive virtual reality environment CAVE[TM], we studied how virtual reality environments could raise students' interest and motivation for learning. By using the 3ds max[TM], we can visualize…
Reading, Writing, and Documentation and Managing the Development of User Documentation.
ERIC Educational Resources Information Center
Lindberg, Wayne; Hoffman, Terrye
1987-01-01
The first of two articles addressing the issue of user documentation for computer software discusses the need to teach users how to read documentation. The second presents a guide for writing documentation that is based on the instructional systems design model, and makes suggestions for the desktop publishing of user manuals. (CLB)
Learning Computer Hardware by Doing: Are Tablets Better than Desktops?
ERIC Educational Resources Information Center
Raven, John; Qalawee, Mohamed; Atroshi, Hanar
2016-01-01
In this world of rapidly evolving technologies, educational institutions often struggle to keep up with change. Change often requires a state of readiness at both the micro and macro levels. This paper looks at a tertiary institution that undertook a significant technology change initiative by introducing tablet based components for teaching a…
Quantitative Assay for Starch by Colorimetry Using a Desktop Scanner
ERIC Educational Resources Information Center
Matthews, Kurt R.; Landmark, James D.; Stickle, Douglas F.
2004-01-01
The procedure to produce a standard curve for starch concentration measurement by image analysis, using a color scanner and computer for data acquisition and color analysis, is described. Color analysis is performed by a Visual Basic program that measures red, green, and blue (RGB) color intensities for pixels within the scanner image.
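The starch assay above builds a standard curve from scanner RGB intensities using a Visual Basic program. That program is not reproduced in the abstract; the Python sketch below only illustrates the general idea, with synthetic pixel values and concentrations standing in for real scanned standards.

```python
# Illustrative sketch (not the authors' Visual Basic program): build a starch
# standard curve from mean blue-channel intensity measured in scanned wells.
# The pixel data and concentrations below are synthetic placeholders.

def mean_channel(pixels, channel):
    """Mean intensity of one RGB channel (0=R, 1=G, 2=B) over a pixel list."""
    return sum(p[channel] for p in pixels) / len(pixels)

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    return a, mean_y - a * mean_x

# Synthetic "scanned" wells: stronger starch-iodine colour -> lower blue value.
standards = {0.0: [(200, 190, 210)] * 4, 0.5: [(150, 140, 170)] * 4,
             1.0: [(110, 100, 130)] * 4, 2.0: [(60, 55, 80)] * 4}
concs = sorted(standards)
blues = [mean_channel(standards[c], 2) for c in concs]
slope, intercept = fit_line(blues, concs)

# Estimate an unknown sample from its mean blue intensity.
unknown_blue = mean_channel([(95, 90, 118)] * 4, 2)
print(f"Estimated starch concentration: {slope * unknown_blue + intercept:.2f} mg/mL")
```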
WLANs for the 21st Century Library
ERIC Educational Resources Information Center
Calamari, Cal
2009-01-01
As educational and research needs have changed, libraries have changed as well. They must meet ever-increasing demand for access to online media, subscriptions to archives, video, audio, and other content. The way a user/patron accesses this information has also changed. Gone are the days of a few hardwired desktops or computer carts. While…
Infrastructure Suitability Assessment Modeling for Cloud Computing Solutions
2011-09-01
Figures in the report contrast full virtualization with para-virtualization (e.g., XEN) and relate modeling alternatives to the model. The excerpted text also notes that, besides Microsoft's own client implementations, dubbed "Remote Desktop Connection Client" for the Windows and Apple operating systems, various open…
Redesigning a Library Space for Collaborative Learning
ERIC Educational Resources Information Center
Gabbard, Ralph B.; Kaiser, Anthony; Kaunelis, David
2007-01-01
The reference desk at Indiana State University's (ISU) library offers an excellent view of student work areas on the first floor. From this vantage point, the reference librarians noticed students, especially in the evening and on weekends, huddled together in small groups, with one student at the keyboard of a laptop or desktop computer. The…
The Promise of Zoomable User Interfaces
ERIC Educational Resources Information Center
Bederson, Benjamin B.
2011-01-01
Zoomable user interfaces (ZUIs) have received a significant amount of attention in the 18 years since they were introduced. They have enjoyed some success, and elements of ZUIs are widely used in computers today, although the grand vision of a zoomable desktop has not materialised. This paper describes the premise and promise of ZUIs along with…
Growth-simulation model for lodgepole pine in central Oregon.
Walter G. Dahms
1983-01-01
A growth-simulation model for central Oregon lodgepole pine (Pinus contorta Dougl.) has been constructed by combining data from temporary and permanent sample plots. The model is similar to a conventional yield table with the added capacity for dealing with the stand-density variable. The simulator runs on a desk-top computer.
The Influence of Textual Cues on First Impressions of an Email Sender
ERIC Educational Resources Information Center
Marlow, Shannon L.; Lacerenza, Christina N.; Iwig, Chelsea
2018-01-01
The present study experimentally manipulated the gender of an email sender, closing salutation, and sending mode (i.e., email sent via desktop computer/laptop as compared with email sent via a mobile device) to determine if these specific cues influence first impressions of the sender's competence, professionalism, positive affect, and negative…
ERIC Educational Resources Information Center
Bucknall, Ruary
1996-01-01
Overview of the interactive technologies used by the Northern Territory Secondary Correspondence School in Australia: print media utilizing desktop publishing and electronic transfer; telephone or H-F radio; interactive television; and interactive computing. More fully describes its interactive CD-ROM courses. Emphasizes that the programs are…
From Floppies to Flash--Your Guide to Removable Media
ERIC Educational Resources Information Center
Berdinka, Matthew J.
2005-01-01
Technology that once involved a scary, mysterious machine the size of a small house now fits on desktops and commonly appears in offices, schools, and homes. Computers allow for processing, storing and transmitting data between two or more people virtually anywhere in the world. They also allow users to save documents, presentations, photos and…
Briggs, Andrew; Straker, Leon; Greig, Alison
2004-06-10
The objective of this study was to quantitatively analyse the sitting posture of school children interacting with both old (book) and new (laptop and desktop computers) information technologies to test the hypothesis that posture is affected by the type of information technology (IT) used. A mixed model design was used to test the effect of IT type (within subjects) and age and gender (between subjects). The sitting posture of 32 children aged 4-17 years was measured whilst they read from a book, laptop, and desktop computer at a standard school chair and desk. Video images were captured and then digitized to calculate mean angles for head tilt, neck flexion, trunk flexion, and gaze angle. Posture was found to be influenced by IT type (p < 0.001), age (p < 0.001) and gender (p = 0.024) and significantly correlated to the stature of the participants. Measurement of resting posture and the maximal range of motion of the upper and lower cervical spines in the sagittal plane was also undertaken. The biophysical impact and the suitability of the three different information technologies are discussed.
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.
Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J
2017-10-30
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code is hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
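The abstract above notes that data are stored as JSON structures supporting computational chemistry calculations and served to a JavaScript web client. The Python sketch below shows what such a record might look like; the field names and values are illustrative assumptions, not the platform's actual schema.

```python
import json

# Hypothetical sketch of a JSON record for a computational chemistry result;
# the field names and values below are illustrative, not the platform's schema.
calculation = {
    "molecule": {
        "formula": "H2O",
        "atoms": [
            {"element": "O", "coords": [0.000, 0.000, 0.117]},
            {"element": "H", "coords": [0.000, 0.757, -0.469]},
            {"element": "H", "coords": [0.000, -0.757, -0.469]},
        ],
    },
    "calculation": {"code": "NWChem", "theory": "DFT", "basis": "6-31G*"},
    "properties": {"total_energy_hartree": -76.4089},
}

# A record like this can be stored, indexed, and sent unchanged to a
# JavaScript web client, or parsed from any language with a JSON library.
print(json.dumps(calculation, indent=2))
```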
Transformation of personal computers and mobile phones into genetic diagnostic systems.
Walker, Faye M; Ahmad, Kareem M; Eisenstein, Michael; Soh, H Tom
2014-09-16
Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone--devices that have become readily accessible in developing countries--into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite.
Transformation of Personal Computers and Mobile Phones into Genetic Diagnostic Systems
2014-01-01
Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone—devices that have become readily accessible in developing countries—into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite. PMID:25223929
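The two records above describe turning a desktop computer into a de facto thermal cycler by software control of CPU temperature; no code is published in the abstracts. The following Python sketch illustrates only the control idea, with a simulated CPU standing in for the real temperature read-out and load-generation interfaces.

```python
# Illustration only: the published system drives PCR by controlling CPU
# temperature with software, but no code appears in the abstract. Here a
# SimulatedCpu stands in for the real temperature sensor and load control,
# and a simple bang-bang controller steps through the three PCR stages.

class SimulatedCpu:
    """Crude thermal model: heats under load, cools passively when idle."""

    def __init__(self, ambient=25.0):
        self.temp = ambient
        self.load = 0.0            # 0.0 = idle, 1.0 = busy loop pinned on CPU

    def step(self):
        if self.load > 0:
            self.temp += (100.0 - self.temp) * 0.10   # heating under full load
        else:
            self.temp += (25.0 - self.temp) * 0.03    # passive cooling

def run_pcr(cpu, cycles=3, tolerance=2.0):
    stages = [("denature", 94.0, 10), ("anneal", 55.0, 10), ("extend", 72.0, 15)]
    for cycle in range(1, cycles + 1):
        for name, setpoint, hold_steps in stages:
            held = 0
            while held < hold_steps:
                cpu.load = 1.0 if cpu.temp < setpoint else 0.0   # bang-bang
                cpu.step()
                if abs(cpu.temp - setpoint) < tolerance:
                    held += 1                       # count time spent in band
            print(f"cycle {cycle}: {name} held near {setpoint} C")

run_pcr(SimulatedCpu())
```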
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application
Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.
2017-10-30
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
NASA Astrophysics Data System (ADS)
van Leunen, J. A. J.; Dreessen, J.
1984-05-01
The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automation of data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were powerful enough, however, to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically but they can also be typed in by hand. Due to the automation we are able to implement proper archival of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to the much better reproducibility of the automatic optimization, which resulted in better reproducibility of the measurement result. Another advantage of the automation is that the programs that control the data handling and the automatic measurement are "user friendly". They guide the operator through the measuring procedure using information from earlier measurements of equivalent test specimens. This makes it possible to let routine measurements be done by much less skilled assistants. It also removes much of the tedious routine labour normally involved in MTF measurements. It can be concluded that automation of MTF measurements as described in the foregoing enhances the usefulness of MTF results as well as reducing the cost of MTF measurements.
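The MTF abstract above stresses that a result is useful only when filed together with a complete description of its measuring conditions. As one way to make that policy concrete, the hypothetical Python sketch below refuses to archive a result unless every required condition field is present; the field names are assumptions, not those used by the authors' desktop-computer programs.

```python
import json
import time

# Illustrative sketch only: archive each MTF result together with its measuring
# conditions, and refuse to store a record with conditions missing. The field
# names are assumptions, not those of the authors' programs.
REQUIRED_CONDITIONS = ("specimen_id", "spatial_frequency_lp_mm",
                       "wavelength_nm", "aperture", "focus_position_um")

def archive_mtf(result, conditions, path="mtf_archive.jsonl"):
    missing = [k for k in REQUIRED_CONDITIONS if k not in conditions]
    if missing:
        raise ValueError(f"refusing to archive: missing conditions {missing}")
    record = {"timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
              "conditions": conditions, "mtf": result}
    with open(path, "a") as f:                 # one JSON record per line
        f.write(json.dumps(record) + "\n")

archive_mtf(0.62, {"specimen_id": "lens-042", "spatial_frequency_lp_mm": 40,
                   "wavelength_nm": 546, "aperture": "f/4",
                   "focus_position_um": 12.5})
```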
NASA Technical Reports Server (NTRS)
Smith, Jeffrey
2003-01-01
The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. The human-computer interface consists of two magnetic tracking devices (Ascention Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force-feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of Space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-system systems in a desktop-sized work volume.
Influence of direct computer experience on older adults' attitudes toward computers.
Jay, G M; Willis, S L
1992-07-01
This research examined whether older adults' attitudes toward computers became more positive as a function of computer experience. The sample comprised 101 community-dwelling older adults aged 57 to 87. The intervention involved a 2-week computer training program in which subjects learned to use a desktop publishing software program. A multidimensional computer attitude measure was used to assess differential attitude change and maintenance of change following training. The results indicated that older adults' computer attitudes are modifiable and that direct computer experience is an effective means of change. Attitude change as a function of training was found for the attitude dimensions targeted by the intervention program: computer comfort and efficacy. In addition, maintenance of attitude change was established for at least two weeks following training.
Desktop Publishing Choices: Making an Appropriate Decision.
ERIC Educational Resources Information Center
Crawford, Walt
1991-01-01
Discusses various choices available for desktop publishing systems. Four categories of software are described, including advanced word processing, graphics software, low-end desktop publishing, and mainstream desktop publishing; appropriate hardware is considered; and selection guidelines are offered, including current and future publishing needs,…
Integrated IMA (Information Mission Areas) IC (Information Center) Guide
1989-06-01
Table-of-contents topics excerpted from the guide include computer-aided design and manufacture (CAD/CAM), liquid crystal display panels, artificial intelligence applied to VI, desktop publishing, intelligent copiers, electronic alternatives to printed documents, electronic forms, and related technologies (optical disk storage, image scanners, graphics and forms software, intelligent copiers, and work group devices).
CERN's Common Unix and X Terminal Environment
NASA Astrophysics Data System (ADS)
Cass, Tony
The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix based Interactive Computing. The CUTE architecture relies on a distributed filesystem—currently Transarc's AFS—to enable essentially interchangeable client workstations to access both "home directory" and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture.
Data Transfers Among the HP-75, HP-86, and HP-9845 Microcomputers.
1983-01-01
AD-A139 438. Data Transfers Among the HP-75, HP-86, and HP-9845 Microcomputers. Air Force Inst. of Tech., Wright-Patterson AFB, OH; D. P. Connor, 1983. ...hereafter called the "75") and the HP-86 (hereafter called the "86"). The computers are to be used for classroom instruction and research at SOC. On...the main campus another Hewlett-Packard desktop computer, the HP-9845 (hereafter called the "45"), is already in use; it controls and processes data
2006-09-01
required directional control for each thruster due to their high precision and equivalent power and computer interface requirements to those for the...Universal Serial Bus) ports, LPT (Line Printing Terminal) and KVM (Keyboard-Video-Mouse) interfaces. Additionally, power is supplied to the computer through...of the IDE cable to the Prometheus Development Kit ACC-IDEEXT. Connect a small drive power connector from the desktop ATX power supply to the ACC
Video control system for a drilling in furniture workpiece
NASA Astrophysics Data System (ADS)
Khmelev, V. L.; Satarov, R. N.; Zavyalova, K. V.
2018-05-01
During the last 5 years, Russian industry has been undergoing robotisation, and scientific groups have therefore received new tasks. One of these new tasks is machine vision systems, which should solve the problem of automatic quality control. Systems of this type cost several thousand dollars each, a price that is out of reach for regional small businesses. In this article, we describe the principle and algorithm of an inexpensive video control system that uses web cameras and a notebook or desktop computer as the computing unit.
Computer virus information update CIAC-2301
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orvis, W.J.
1994-01-15
While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's Virus database.
Neural simulations on multi-core architectures.
Eichner, Hubert; Klug, Tobias; Borst, Alexander
2009-01-01
Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing.
Neural Simulations on Multi-Core Architectures
Eichner, Hubert; Klug, Tobias; Borst, Alexander
2009-01-01
Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing. PMID:19636393
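The two neural-simulation records above concern distributing compartmental-model computations over multi-core processors with user-transparent load balancing. The toy Python sketch below illustrates only the coarse-grained idea, farming independent single-compartment neuron simulations out to worker processes; it is not the authors' implementation, and a real compartmental simulator must also exchange coupling currents between compartments.

```python
# Toy sketch, not the authors' implementation: distribute independent neuron
# simulations across CPU cores with multiprocessing. A single-compartment
# leaky integrate-and-fire update stands in for a real compartmental solver.
from multiprocessing import Pool

def simulate_neuron(args):
    neuron_id, input_current = args
    v, dt, spikes = -65.0, 0.1, 0
    for _ in range(10_000):                      # 1 s of simulated time
        v += dt * (-(v + 65.0) / 10.0 + input_current)
        if v >= -50.0:                           # threshold crossing
            spikes += 1
            v = -65.0                            # reset
    return neuron_id, spikes

if __name__ == "__main__":
    jobs = [(i, 1.0 + 0.1 * i) for i in range(16)]   # 16 neurons, varied drive
    with Pool() as pool:                             # one worker per core
        for neuron_id, spikes in pool.map(simulate_neuron, jobs):
            print(f"neuron {neuron_id}: {spikes} spikes")
```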
Image Harvest: an open-source platform for high-throughput plant image processing and analysis.
Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal
2016-05-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
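Image Harvest, as described above, extracts digital traits from plant images at grid scale. As a much-reduced illustration of what a single digital-trait extraction can look like, the Python sketch below segments a synthetic image with a simple greenness threshold and reports projected area and height in pixels; it is not part of the IH pipeline, and the threshold value is an assumption.

```python
import numpy as np

# Illustration only (not the Image Harvest pipeline): extract two simple
# digital traits, projected shoot area and plant height in pixels, from a
# plant/background mask. A synthetic green-dominant image stands in for a
# real phenotyping photograph.
rng = np.random.default_rng(0)
img = rng.integers(40, 90, size=(120, 160, 3)).astype(float)   # background
img[30:110, 70:95, 1] += 120                                    # fake green shoot

# Segment "plant" pixels with a simple greenness-index threshold (assumed).
greenness = img[:, :, 1] - (img[:, :, 0] + img[:, :, 2]) / 2
mask = greenness > 60

area_px = int(mask.sum())
rows = np.where(mask.any(axis=1))[0]
height_px = int(rows.max() - rows.min() + 1) if rows.size else 0
print(f"projected area: {area_px} px, height: {height_px} px")
```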
Using Desk-Top Publishing to Develop Literacy.
ERIC Educational Resources Information Center
Wray, David; Medwell, Jane
1989-01-01
Examines the learning benefits which may accrue from using desk-top publishing techniques with children, especially in terms of the development of literacy skills. Analyzes desk-top publishing as an extension of word processing and describes some ways of using desk-top publishing in the classroom. (RS)
A Fine-Tuned Look at White Space Variation in Desktop Publishing.
ERIC Educational Resources Information Center
Knupfer, Nancy Nelson; McIsaac, Marina Stock
This investigation of the use of white space in print-based, computer-generated text focused on the point at which the white space interferes with reading speed and comprehension. It was hypothesized that reading speed and comprehension would be significantly greater when text was wrapped tightly around the graphic than when it had one-half inch…
Simulation Packages Expand Aircraft Design Options
NASA Technical Reports Server (NTRS)
2013-01-01
In 2001, NASA released a new approach to computational fluid dynamics that allows users to perform automated analysis on complex vehicle designs. In 2010, Palo Alto, California-based Desktop Aeronautics acquired a license from Ames Research Center to sell the technology. Today, the product assists organizations in the design of subsonic aircraft, space planes, spacecraft, and high speed commercial jets.
iChat[TM] Do You? Using Desktop Web Conferencing in Education
ERIC Educational Resources Information Center
Bell, Randy L.; Garofalo, Joe
2006-01-01
Videoconferencing is not a new technology and it has been widely used in educational settings since the mid-1980s. Videoconferencing has evolved into the integration of personal computers to what is now referred to as Web conferencing. In the mid-1990s, Internet Protocol (IP) was introduced into the mainstream but the educational community has…
3D Printing in Technology and Engineering Education
ERIC Educational Resources Information Center
Martin, Robert L.; Bowden, Nicholas S.; Merrill, Chris
2014-01-01
In the past five years, there has been tremendous growth in the production and use of desktop 3D printers. This growth has been driven by the increasing availability of inexpensive computing and electronics technologies. The ability to rapidly share ideas and intelligence over the Internet has also played a key role in the growth. Growth is also…
The Role of Theory and Technology in Learning Video Production: The Challenge of Change
ERIC Educational Resources Information Center
Shewbridge, William; Berge, Zane L.
2004-01-01
The video production field has evolved beyond being exclusively relevant to broadcast television. The convergence of low-cost consumer cameras and desktop computer editing has led to new applications of video in a wide range of areas, including the classroom. This presents educators with an opportunity to rethink how students learn video…
ERIC Educational Resources Information Center
Gamble-Risley, Michelle
2006-01-01
In the past, projection systems were large, heavy, and unwieldy and cost $3,000 to $5,000. Setup was fraught with the challenges of multiple wires plugged into the backs of desktop computers, often causing confusion about what went where. Systems were sometimes so difficult to set up that teachers had to spend pre-class time putting them together.…
Training Learners to Use Quizlet Vocabulary Activities on Mobile Phones in Vietnam with Facebook
ERIC Educational Resources Information Center
Tran, Phuong
2016-01-01
Mobile phone ownership among university students in Vietnam has reached almost 100%, exceeding that of Internet-capable desktop computers. This has made them increasingly popular to allow learners to carry out learning activities outside of the classroom, but some studies have suggested that learners are not always willing to engage in activities…
CALLing All Foreign Language Teachers: Computer-Assisted Language Learning in the Classroom
ERIC Educational Resources Information Center
Erben, Tony, Ed.; Sarieva, Iona, Ed.
2008-01-01
This book is a comprehensive guide to help foreign language teachers use technology in their classrooms. It offers the best ways to integrate technology into teaching for student-centered learning. CALL Activities include: Email; Building a Web site; Using search engines; Powerpoint; Desktop publishing; Creating sound files; iMovie; Internet chat;…
Teacher Associations: Extending Our Advocacy Reach
ERIC Educational Resources Information Center
Boitnott, Kitty
2012-01-01
School library media specialists have seen more fundamental changes in their jobs and in their roles within their schools than any other group of education professionals. The author started her first job as a school librarian when there were no desktop computers, and card catalogs were the order of the day. Over the course of the thirty-seven…
Programs for road network planning.
Ward W. Carson; Dennis P. Dykstra
1978-01-01
This paper describes four computer programs developed to assist logging engineers to plan transportation in a forest. The objective of these programs, to be used together, is to find the shortest path through a transportation network from a point of departure to a destination. Three of the programs use the digitizing and plotting capabilities of a programmable desk-top...
Fire characteristics charts for fire behavior and U.S. fire danger rating
Faith Ann Heinsch; Pat Andrews
2010-01-01
The fire characteristics chart is a graphical method of presenting U.S. National Fire Danger Rating indices or primary surface or crown fire behavior characteristics. A desktop computer application has been developed to produce fire characteristics charts in a format suitable for inclusion in reports and presentations. Many options include change of scales, colors,...
ERIC Educational Resources Information Center
Fehn, Bruce; Johnson, Melanie; Smith, Tyson
2010-01-01
Elementary and secondary school history students demonstrate a great deal of enthusiasm for making documentary films. With free and easy-to-use software, as well as vast online, archival resources containing images and sounds, students can sit at a computer and make serious and engaging documentary productions. With students affectively engaged by…
Designing a Mobile Training System in Rural Areas with Bayesian Factor Models
ERIC Educational Resources Information Center
Omidi Najafabadi, Maryam; Mirdamadi, Seyed Mehdi; Payandeh Najafabadi, Amir Teimour
2014-01-01
The facts that wireless technologies (1) are more convenient and (2) require less skill than desktop computers play a crucial role in decreasing the digital gap in rural areas. This study employed Bayesian Confirmatory Factor Analysis (CFA) to design a mobile training system in rural areas of Iran. It categorized challenges, potential, and…
ERIC Educational Resources Information Center
Lawless-Reljic, Sabine Karine
2010-01-01
Growing interest of educational institutions in desktop 3D graphic virtual environments for hybrid and distance education prompts questions on the efficacy of such tools. Virtual worlds, such as Second Life[R], enable computer-mediated immersion and interactions encompassing multimodal communication channels including audio, video, and text…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-07
..., Winston-Salem, North Carolina. The notice was published in the Federal Register on April 23, 2010 (75 FR... from Staffing Solutions, South East, and Omni Resources and Recovery. The notices were published in the... firm. The workers are engaged in employment related to the production of desktop computers. New...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-14
..., APN and ICONMA, Winston-Salem, North Carolina. The notice was published in the Federal Register on... Seaton Corporation. The notices were published on the Federal Register on April 19, 2010 (75 FR 20385... workers are engaged in employment related to the production of desktop computers. New information shows...
Ethnography at a Distance: Globally Mobile Parents Choosing International Schools
ERIC Educational Resources Information Center
Forsey, Martin; Breidenstein, Georg; Krüger, Oliver; Roch, Anna
2015-01-01
The research we report on was conducted from our computer desktops. We have not met the people we have studied; they are part of what Eichhorn described as a "textual community", gathered around the threads of online conversations associated with a website servicing the needs of English-language speakers in Germany. The thread in…
When Neurons Meet Electrons: Three Trends That Are Sparking Change in Computer Publishing.
ERIC Educational Resources Information Center
Cranney, Charles
1992-01-01
Three important trends in desktop publishing include (1) use of multiple media in presentation of information; (2) networking; and (3) "hot links" (integrated file-exchange formats). It is also important for college publications professionals to be familiar with sources of information about technological change and to be able to sort out the…
ERIC Educational Resources Information Center
Anderson, Mary Alice, Ed.
This notebook is a compilation of 53 lesson plans for grades 6-12, written by various authors and focusing on the integration of technology into the curriculum. Lesson plans include topics such as online catalog searching, electronic encyclopedias, CD-ROM databases, exploring the Internet, creating a computer slide show, desktop publishing, and…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-04
... desktop computers and mobile devices, we expect significant innovation to continue in the provision of... inside of buildings.'' 47. Discussion. Publicly available reports, such as a March 2011 study from J. D... installed Wi-Fi access points, and a growing number of mobile devices (e.g., smartphones, laptops, and...
An Ethical Dilemma: Talking about Plagiarism and Academic Integrity in the Digital Age
ERIC Educational Resources Information Center
Thomas, Ebony Elizabeth; Sassi, Kelly
2011-01-01
Today, many students not only access the Internet through desktop and laptop computers at home or at school but also have copious amounts of information at their fingertips via portable devices (e.g., iPods, iPads, netbooks, smartphones). While some teachers welcome the proliferation of portable technologies and easy wireless Internet access, and…
ERIC Educational Resources Information Center
Goins, L. Keith, Ed.
This proceedings includes the following papers: "Dealing with Discipline Problems in Schools" (Allen); "Developing Global Awareness" (Arnold); "Desktop Publishing Using WordPerfect 6.0 for Windows" (Broughton); "Learn and Earn" (Cauley); "Using the Computer to Teach Merchandising Math"…
Code White: A Signed Code Protection Mechanism for Smartphones
2010-09-01
analogous to computer security is the use of antivirus (AV) software. AV software is a brute force approach to security. The software...these users, numerous malicious programs have also surfaced. And while smartphones have desktop-like capabilities to execute software, they do not...
Basics of Desktop Publishing. Second Edition.
ERIC Educational Resources Information Center
Beeby, Ellen; Crummett, Jerrie
This document contains teacher and student materials for a basic course in desktop publishing. Six units of instruction cover the following: (1) introduction to desktop publishing; (2) desktop publishing systems; (3) software; (4) type selection; (5) document design; and (6) layout. The teacher edition contains some or all of the following…
Desktop Publishing: A Brave New World and Publishing from the Desktop.
ERIC Educational Resources Information Center
Lormand, Robert; Rowe, Jane J.
1988-01-01
The first of two articles presents basic selection criteria for desktop publishing software packages, including discussion of expectations, required equipment, training costs, publication size, desired software features, additional equipment needed, and quality control. The second provides a brief description of desktop publishing using the Apple…
Performance of the Heavy Flavor Tracker (HFT) detector in star experiment at RHIC
NASA Astrophysics Data System (ADS)
Alruwaili, Manal
With growing technology, the number of processors is becoming massive. Current supercomputer processing will be available on desktops in the next decade. For mass-scale application software development on the massive parallel computing available on desktops, existing popular languages with large libraries have to be augmented with new constructs and paradigms that exploit massive parallel computing and distributed memory models while retaining user-friendliness. Currently available object-oriented languages for massive parallel computing, such as Chapel, X10 and UPC++, exploit distributed computing, data-parallel computing and thread-parallelism at the process level in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) any extension for object distribution to exploit the PGAS model; 2) the flexibility of migrating or cloning an object between places to exploit load balancing; and 3) the programming paradigms that result from integrating data- and thread-level parallelism with object distribution. In the proposed thesis, I compare different languages in the PGAS model; propose new constructs that extend C++ with object distribution, object cloning and object migration; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked and work concurrently on different elements of distributed data using remote method invocations. I present the new constructs, their grammar and their behavior. The new constructs have been explained using simple programs utilizing these constructs.
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.
2009-05-01
It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.
Analysis of helium-ion scattering with a desktop computer
NASA Astrophysics Data System (ADS)
Butler, J. W.
1986-04-01
This paper describes a program written in an enhanced BASIC language for a desktop computer, for simulating the energy spectra of high-energy helium ions scattered into two concurrent detectors (backward and glancing). The program is designed for 512-channel spectra from samples containing up to 8 elements and 55 user-defined layers. The program is intended to meet the needs of analyses in materials sciences, such as metallurgy, where more than a few elements may be present, where several elements may be near each other in the periodic table, and where relatively deep structure may be important. These conditions preclude the use of completely automatic procedures for obtaining the sample composition directly from the scattered ion spectrum. Therefore, efficient methods are needed for entering and editing large amounts of composition data, with many iterations and with much feedback of information from the computer to the user. The internal video screen is used exclusively for verbal and numeric communications between user and computer. The composition matrix is edited on screen with a two-dimensional forms-fill-in text editor and with many automatic procedures, such as doubling the number of layers with appropriate interpolations and extrapolations. The control center of the program is a bank of 10 keys that initiate on-event branching of program flow. The experimental and calculated spectra, including those of individual elements if desired, are displayed on an external color monitor, with an optional inset plot of the depth concentration profiles of the elements in the sample.
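The helium-ion scattering program above simulates backscattered energy spectra for multi-element, multi-layer samples in enhanced BASIC. The physics behind the element edges in such spectra can be sketched independently: the surface energy of an ion scattered from an element of mass M2 is E = K·E0, with K the standard Rutherford backscattering kinematic factor. The Python sketch below evaluates K for a few elements at two detector angles; the angles and masses are illustrative, and the code is not the program described in the paper.

```python
import math

# Sketch, not the desktop-BASIC program from the paper: the surface-edge energy
# of a backscattered He ion is E = K(theta, M2) * E0, with K the standard
# Rutherford backscattering kinematic factor. Masses and angles are approximate.
def kinematic_factor(m_projectile, m_target, theta_deg):
    t = math.radians(theta_deg)
    root = math.sqrt(m_target**2 - (m_projectile * math.sin(t))**2)
    return ((root + m_projectile * math.cos(t)) / (m_projectile + m_target)) ** 2

E0_MeV, M_HE = 2.0, 4.0026                    # 2 MeV 4He beam (illustrative)
elements = {"O": 15.999, "Si": 28.086, "Fe": 55.845, "Au": 196.967}

for theta in (170.0, 100.0):                  # backward / glancing-geometry angles
    print(f"detector at {theta} deg:")
    for name, mass in elements.items():
        k = kinematic_factor(M_HE, mass, theta)
        print(f"  {name}: K = {k:.3f}, surface edge = {k * E0_MeV:.2f} MeV")
```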
Ito, C; Satoh, I; Michiya, H; Kitayama, Y; Miyazaki, K; Ota, S; Satoh, H; Sakurai, T; Shirato, H; Miyasaka, K
1997-01-01
A computerised nursing support system (CNSS) linked to the hospital information system (HIS) was developed and has been in use for one year, in order to reduce the workload of nurses. CNSS consists of (1) a hand held computer for each nurse (2) desk-top computers in the nurses' station and doctors' rooms (3) a data server (4) an interface with the main hospital information system. Nurses enter vital signs, food intake and other information about the patients into the hand held computer at the bed-side. The information is then sent automatically to the CNSS data server, which also receives patients' details (prescribed medicines etc.) from the HIS. Nurses and doctors can see all the information on the desk-top and hand held computers. This system was introduced in May 1995 into a university hospital ward with 40 beds. A questionnaire was completed by 23 nurses before and after the introduction of CNSS. The mean time required to post vital data was significantly reduced from 121 seconds to 54 seconds (p < 0.01). After three months 30% of nurses felt CNSS had reduced their workload, while 30% felt it had complicated their work; after five months 70% noted a reduction and 0% reported that CNSS had made their work more complex. The study therefore concludes that the interface between a computerised nursing support system and the hospital information system reduced the workload of nurses.
Electronic patient data confidentiality practices among surgical trainees: questionnaire study.
Mole, Damian J; Fox, Colin; Napolitano, Giulio
2006-10-01
The objective of this work was to evaluate the safeguards implemented by surgical trainees to protect the confidentiality of electronic patient data through a structured questionnaire sent to Northern Ireland surgical trainees. A group of 32 basic and higher surgical trainees attending a meeting of the Northern Ireland Association of Surgeons-in-Training were invited to complete a questionnaire regarding their computer use, UK Data Protection Act, 1988 registration and electronic data confidentiality practices. Of these 32 trainees, 29 returned completed questionnaires of whom 26 trainees regularly stored sensitive patient data for audit or research purposes on a computer. Only one person was registered under the Data Protection Act, 1988. Of the computers used to store and analyse sensitive data, only 3 of 14 desktops, 8 of 19 laptops and 3 of 14 hand-held computers forced a password logon. Of the 29 trainees, 16 used the same password for all machines, and 25 of 27 passwords were less than 8 characters long. Two respondents declined to reveal details of their secure passwords. Half of all trainees had never adjusted their internet security settings, despite all 14 desktops, 16 of 19 laptops and 5 of 14 hand-helds being routinely connected to the internet. Of the 29 trainees, 28 never encrypted their sensitive data files. Ten trainees had sent unencrypted sensitive patient data over the internet, using a non-secure server. Electronic data confidentiality practices amongst Northern Ireland surgical trainees are unsafe. Simple practical measures to safeguard confidentiality are recommended.
Desktop Video Productions. ICEM Guidelines Publications No. 6.
ERIC Educational Resources Information Center
Taufour, P. A.
Desktop video consists of integrating the processing of the video signal in a microcomputer. This definition implies that desktop video can take multiple forms such as virtual editing or digital video. Desktop video, which does not imply any particular technology, has been approached in different ways in different technical fields. It remains a…
ERIC Educational Resources Information Center
Lee, Paul
This report explores the implementation of desktop publishing in the Minnesota Extension Service (MES) and provides a framework for its implementation in other organizations. The document begins with historical background on the development of desktop publishing. Criteria for deciding whether to purchase a desktop publishing system, advantages and…
Schatz, Philip; Moser, Rosemarie Scolaro; Solomon, Gary S.; Ott, Summer D.; Karpf, Robin
2012-01-01
Context: Limited data are available regarding the prevalence and nature of invalid computerized baseline neurocognitive test data. Objective: To identify the prevalence of invalid baselines on the desktop and online versions of ImPACT and to document the utility of correcting for left-right (L-R) confusion on the desktop version of ImPACT. Design: Cross-sectional study of independent samples of high school (HS) and collegiate athletes who completed the desktop or online versions of ImPACT. Participants or Other Participants: A total of 3769 HS (desktop = 1617, online = 2152) and 2130 collegiate (desktop = 742, online = 1388) athletes completed preseason baseline assessments. Main Outcome Measure(s): Prevalence of 5 ImPACT validity indicators, with correction for L-R confusion (reversing left and right mouse-click responses) on the desktop version, by test version and group. Chi-square analyses were conducted for sex and attentional or learning disorders. Results: At least 1 invalid indicator was present on 11.9% (desktop) versus 6.3% (online) of the HS baselines and 10.2% (desktop) versus 4.1% (online) of collegiate baselines; correcting for L-R confusion (desktop) decreased this overall prevalence to 8.4% (HS) and 7.5% (collegiate). Online Impulse Control scores alone yielded 0.4% (HS) and 0.9% (collegiate) invalid baselines, compared with 9.0% (HS) and 5.4% (collegiate) on the desktop version; correcting for L-R confusion (desktop) decreased the prevalence of invalid Impulse Control scores to 5.4% (HS) and 2.6% (collegiate). Male athletes and HS athletes with attention deficit or learning disorders who took the online version were more likely to have at least 1 invalid indicator. Utility of additional invalidity indicators is reported. Conclusions: The online ImPACT version appeared to yield fewer invalid baseline results than did the desktop version. Identification of L-R confusion reduces the prevalence of invalid baselines (desktop only) and the potency of Impulse Control as a validity indicator. We advise test administrators to be vigilant in identifying invalid baseline results as part of routine concussion management and prevention programs. PMID:22892410
Gaudez, Clarisse; Cail, François
2016-11-01
This study compared muscular and postural stresses, performance and subject preference in women aged 18-40 years using a standard mouse, a vertical mouse and a slanted mouse in three different computer workstation positions. Four tasks were analysed: pointing, pointing-clicking, pointing-clicking-dragging and grasping-pointing the mouse after typing. Flexor digitorum superficialis (FDS) and extensor carpi radialis (ECR) activities were greater using the standard mouse compared to the vertical or slanted mouse. In all cases, the wrist position remained in the comfort zone recommended by standard ISO 11228-3. The vertical mouse was less comfortable and more difficult to use than the other two mice. FDS and ECR activities, shoulder abduction and wrist extension were greater when the mouse was placed next to the keyboard. Performance and subject preference were better with the unrestricted mouse positioning on the desktop. Grasping the mouse after typing was the task that caused the greatest stress. Practitioner Summary: In women, the slanted mouse and the unrestricted mouse positioning on the desktop provide a good blend of stresses, performance and preference. Unrestricted mouse positioning requires no keyboard, which is rare in practice. Placing the mouse in front of the keyboard, rather than next to it, reduced the physical load.
An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung
2011-01-01
In this paper, we describe an approach to integrate a Space-Time GIS data model on a high performance computing platform. The Space-Time GIS data model has been developed on a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the GIS modules handle large datasets directly via a parallel file system. Although it is an ongoing project, the authors hope this effort can inspire further discussions on the integration of GIS on high performance computing platforms.
ERIC Educational Resources Information Center
Lazerick, Beth
1990-01-01
This article describes desktop publishing and discusses the features and classroom uses of one of the newest desktop publishing programs. Several desktop publishing projects for teachers and students are suggested. (IAH)
An analysis of running skyline load path.
Ward W. Carson; Charles N. Mann
1971-01-01
This paper is intended for those who wish to prepare an algorithm to determine the load path of a running skyline. The mathematics of a simplified approach to this running skyline design problem are presented. The approach employs assumptions which reduce the complexity of the problem to the point where it can be solved on desk-top computers of limited capacities. The...
Army Communicator. Volume 34, Number 2
2009-01-01
tunneled into the NIPRNet traffic. The encryption hides the contents of the SIPRNet data through a process that randomizes the bit patterns...and technologies such as desktop applications, Virtual Private Network, Blackberry support, and the training and troubleshooting of complex computer...to your own Standing Operating Procedure and then contract for services off the backside to a local Strategic Entry Point or tunnel through
Green Desktop Computing at the University of Oxford
ERIC Educational Resources Information Center
Noble, Howard; Curtis, Daniel; Tang, Kang
2009-01-01
The government of the United Kingdom has set a target to reduce CO2 emissions by at least 34 percent from 1990 levels by 2020. The Carbon Reduction Commitment (CRC) will require all large public and private sector organizations across the U.K. to cut carbon emissions and report total CO2 emissions annually so that the data can be published in a…
NASA Technical Reports Server (NTRS)
Westmeyer, Paul A. (Inventor); Wertenberg, Russell F. (Inventor); Krage, Frederick J. (Inventor); Riegel, Jack F. (Inventor)
2017-01-01
An authentication procedure utilizes multiple independent sources of data to determine whether usage of a device, such as a desktop computer, is authorized. When a comparison indicates an anomaly from the baseline usage data, the system provides a notice that access to the first device is not authorized.
ERIC Educational Resources Information Center
Winans, Glen T.
This paper presents a descriptive review of how the Provost's Office of the College of Letters and Science at the University of California, Santa Barbara (UCSB) implemented 330 microcomputers in the 34 academic departments from July 1984 through June 1986. The decision to implement stand-alone microcomputers was based on four concerns: increasing…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-14
... Carolina. The workers are engaged in employment related to the production of desktop computers. The notice was published in the Federal Register on April 23, 2010 (75 FR 21361). The notices were amended on.... The notices were published in the Federal Register on April 19, 2010 (75 FR 20385), September 13, 2010...
ERIC Educational Resources Information Center
Gayle, Susan, Ed.
These proceedings address the appropriate uses of technology in education, including papers and summaries of presentations on the following topics: community partnerships; desktop publishing; English as a Second Language/English to Speakers of Other Languages (ESL/ESOL); cognitive issues in multimedia; higher education applications; social studies…
Defining protein electrostatic recognition processes
NASA Astrophysics Data System (ADS)
Getzoff, Elizabeth D.; Roberts, Victoria A.
The objective is to elucidate the nature of electrostatic forces controlling protein recognition processes by using a tightly coupled computational and interactive computer graphics approach. The TURNIP program was developed to determine the most favorable precollision orientations for two molecules by systematic search of all orientations and evaluation of the resulting electrostatic interactions. TURNIP was applied to the transient interaction between two electron transfer metalloproteins, plastocyanin and cytochrome c. The results suggest that the productive electron-transfer complex involves interaction of the positive region of cytochrome c with the negative patch of plastocyanin, consistent with experimental data. Application of TURNIP to the formation of the stable complex between the HyHEL-5 antibody and its protein antigen lysozyme showed that long-distance electrostatic forces guide lysozyme toward the HyHEL-5 binding site, but do not fine tune its orientation. Determination of docked antigen/antibody complexes requires including steric as well as electrostatic interactions, as was done for the U10 mutant of the anti-phosphorylcholine antibody S107. The graphics program Flex, a convenient desktop workstation program for visualizing molecular dynamics and normal mode motions, was enhanced. Flex now has a user interface and was rewritten to use standard graphics libraries, so as to run on most desktop workstations.
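TURNIP itself is not available here; purely as a sketch of the idea it describes - scoring rigid-body orientations of one molecule against a partner by summing pairwise Coulomb interactions and keeping the most favourable precollision orientation - a minimal toy version might look as follows. The charges, coordinates and rotation grid are all invented.

```python
import numpy as np

def coulomb_energy(q1, r1, q2, r2):
    """Pairwise Coulomb interaction energy (arbitrary units) between two charge sets."""
    d = np.linalg.norm(r1[:, None, :] - r2[None, :, :], axis=-1)
    return float(np.sum(q1[:, None] * q2[None, :] / d))

def rotation_z(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Toy "molecules": a fixed partner and a mobile partner rotated about its own centre.
q_fixed = np.array([+1.0, -1.0])
r_fixed = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
q_mobile = np.array([-1.0, +1.0])
r_mobile = np.array([[5.0, 0.0, 0.0], [6.0, 0.0, 0.0]])
centre = r_mobile.mean(axis=0)

best_energy, best_angle = min(
    (coulomb_energy(q_fixed, r_fixed, q_mobile, (r_mobile - centre) @ rotation_z(a).T + centre), a)
    for a in np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
)
print(f"most favourable orientation: {np.degrees(best_angle):.0f} deg, energy {best_energy:.3f}")
```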
Monte Carlo simulation of electrothermal atomization on a desktop personal computer
NASA Astrophysics Data System (ADS)
Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.
1996-07-01
Monte Carlo simulations have been applied to electrothermal atomization (ETA) using a tubular atomizer (e.g. graphite furnace) because of the complexity in the geometry, heating, molecular interactions, etc. The intense computational time needed to accurately model ETA often limited its effective implementation to the use of supercomputers. However, with the advent of more powerful desktop processors, this is no longer the case. A C-based program has been developed and can be used under Windows™ or DOS. With this program, basic parameters such as furnace dimensions, sample placement, furnace heating and kinetic parameters such as activation energies for desorption and adsorption can be varied to show the absorbance profile dependence on these parameters. Even data such as time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to permit comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files and text files.
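The C program itself is distributed with the article; the following is only a toy sketch (not the authors' code) of the general Monte Carlo idea: atoms desorb from the furnace wall at an Arrhenius rate that follows the heating ramp and then leave the observation volume with a fixed mean residence time, so the population in the volume traces an absorbance-like peak. All rates, energies and geometry below are invented.

```python
import numpy as np

K_B = 8.617e-5      # Boltzmann constant, eV/K
E_DES = 2.0         # desorption activation energy, eV (illustrative)
NU = 1.0e13         # attempt frequency, 1/s
RAMP = 1000.0       # furnace heating rate, K/s
T0 = 300.0          # initial temperature, K
RESIDENCE = 2.0e-3  # mean residence time of a free atom in the observation volume, s
DT = 1.0e-3         # time step, s
N_ATOMS = 100000

rng = np.random.default_rng(1)
adsorbed, in_volume = N_ATOMS, 0
for step in range(2500):
    temperature = T0 + RAMP * step * DT
    p_desorb = 1.0 - np.exp(-NU * np.exp(-E_DES / (K_B * temperature)) * DT)
    p_leave = 1.0 - np.exp(-DT / RESIDENCE)
    released = rng.binomial(adsorbed, p_desorb)
    lost = rng.binomial(in_volume, p_leave)
    adsorbed -= released
    in_volume += released - lost
    if step % 250 == 0:
        # in_volume is proportional to the instantaneous absorbance-like signal
        print(f"t={step * DT:4.2f} s  T={temperature:6.0f} K  atoms in volume={in_volume}")
```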
Use phase signals to promote lifetime extension for Windows PCs.
Hickey, Stewart; Fitzpatrick, Colin; O'Connell, Maurice; Johnson, Michael
2009-04-01
This paper proposes a signaling methodology for personal computers. Signaling may be viewed as an ecodesign strategy that can positively influence the consumer to consumer (C2C) market process. A number of parameters are identified that can provide the basis for signal implementation. These include operating time, operating temperature, operating voltage, power cycle counts, hard disk drive (HDD) self-monitoring, and reporting technology (SMART) attributes and operating system (OS) event information. All these parameters are currently attainable or derivable via embedded technologies in modern desktop systems. A case study detailing a technical implementation of how the development of signals can be achieved in personal computers that incorporate Microsoft Windows operating systems is presented. Collation of lifetime temperature data from a system processor is demonstrated as a possible means of characterizing a usage profile for a desktop system. In addition, event log data is utilized for devising signals indicative of OS quality. The provision of lifetime usage data in the form of intuitive signals indicative of both hardware and software quality can in conjunction with consumer education facilitate an optimal remarketing strategy for used systems. This implementation requires no additional hardware.
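The paper targets embedded Windows instrumentation (SMART attributes, event logs); as a rough, platform-neutral sketch of the final aggregation step only, the snippet below condenses a hypothetical temperature/power-cycle log into lifetime-usage signals. The file name, column names and binning are assumptions, not the authors' implementation.

```python
import csv
from collections import Counter

def usage_signal(log_path):
    """Summarise a hypothetical usage log into lifetime signals.

    Expects CSV rows with columns: timestamp_hours, cpu_temp_celsius, power_cycle (0/1).
    These column names and the 10-degree temperature bins are assumptions for illustration.
    """
    hours, cycles, temp_bins = 0.0, 0, Counter()
    last_t = None
    with open(log_path, newline="") as fh:
        for row in csv.DictReader(fh):
            t = float(row["timestamp_hours"])
            if last_t is not None:
                hours += t - last_t        # accumulate logged operating time
            last_t = t
            cycles += int(row["power_cycle"])
            temp_bins[int(float(row["cpu_temp_celsius"]) // 10) * 10] += 1
    return {"operating_hours": hours,
            "power_cycles": cycles,
            "temperature_histogram": dict(sorted(temp_bins.items()))}

# Example (assumes a log file exists at this hypothetical path):
# print(usage_signal("usage_log.csv"))
```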
Wesemann, Christian; Muallah, Jonas; Mah, James; Bumann, Axel
2017-01-01
The primary objective of this study was to compare the accuracy and time efficiency of an indirect and direct digitalization workflow with that of a three-dimensional (3D) printer in order to identify the most suitable method for orthodontic use. A master model was measured with a coordinate measuring instrument. The distances measured were the intercanine width, the intermolar width, and the dental arch length. Sixty-four scans were taken with each of the desktop scanners R900 and R700 (3Shape), the intraoral scanner TRIOS Color Pod (3Shape), and the Promax 3D Mid cone beam computed tomography (CBCT) unit (Planmeca). All scans were measured with measuring software. One scan was selected and printed 37 times on the D35 stereolithographic 3D printer (Innovation MediTech). The printed models were measured again using the coordinate measuring instrument. The most accurate results were obtained by the R900. The R700 and the TRIOS intraoral scanner showed comparable results. CBCT-3D-rendering with the Promax 3D Mid CBCT unit revealed significantly higher accuracy with regard to dental casts than dental impressions. 3D printing offered a significantly higher level of deviation than digitalization with desktop scanners or an intraoral scanner. The chairside time required for digital impressions was 27% longer than for conventional impressions. Conventional impressions, model casting, and optional digitization with desktop scanners remains the recommended workflow process. For orthodontic demands, intraoral scanners are a useful alternative for full-arch scans. For prosthodontic use, the scanning scope should be less than one quadrant and three additional teeth.
Oak Ridge Institutional Cluster Autotune Test Drive Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jibonananda, Sanyal; New, Joshua Ryan
2014-02-01
The Oak Ridge Institutional Cluster (OIC) provides general purpose computational resources for the ORNL staff to run computation-heavy jobs that are larger than desktop applications but do not quite require the scale and power of the Oak Ridge Leadership Computing Facility (OLCF). This report details the efforts made and conclusions derived in performing a short test drive of the cluster resources on Phase 5 of the OIC. EnergyPlus was used in the analysis as a candidate user program and the overall software environment was evaluated against anticipated challenges experienced with resources such as the shared-memory Nautilus (JICS) and Titan (OLCF). The OIC performed within reason and was found to be acceptable in the context of running EnergyPlus simulations. The number of cores per node and the availability of scratch space per node allow non-traditional desktop-focused applications to leverage parallel ensemble execution. Although only individual runs of EnergyPlus were executed, the software environment on the OIC appeared suitable to run ensemble simulations with some modifications to the Autotune workflow. From a standpoint of general usability, the system supports common Linux libraries, compilers, standard job scheduling software (Torque/Moab), and the OpenMPI library (the only MPI library) for MPI communications. The file system is a Panasas file system, which literature indicates to be an efficient file system.
NASA Technical Reports Server (NTRS)
Rogers, R. H. (Principal Investigator)
1980-01-01
The results achieved during the first eight months of a program to transfer LANDSAT technology to practicing professionals in the private and public sectors (grass roots) through community colleges and other locally available institutions are reported. The approach offers hands-on interactive analysis training and demonstrations through the use of color desktop computer terminals communicating with a host computer by telephone lines. The features of the terminals and associated training materials are reviewed together with plans for their use in training and demonstration projects.
Leonardi, Rosalia; Maiorana, Francesco; Giordano, Daniela
2008-06-01
Many of us use and maintain files on more than 1 computer--a desktop part of the time, and a notebook, a palmtop, or removable devices at other times. It can be easy to forget which device contains the latest version of a particular file, and time-consuming searches often ensue. One way to solve this problem is to use software that synchronizes the files. This allows users to maintain updated versions of the same file in several locations.
Designing Interaction for Next Generation Personal Computing
NASA Astrophysics Data System (ADS)
de Michelis, Giorgio; Loregian, Marco; Moderini, Claudio; Marti, Patrizia; Colombo, Cesare; Bannon, Liam; Storni, Cristiano; Susani, Marco
Over two decades of research in the field of Interaction Design and Computer Supported Cooperative Work convinced us that the current design of workstations no longer fits users’ needs. It is time to design new personal computers based on metaphors alternative to the desktop one. With this SIG, we are seeking to involve international HCI professionals into the challenges of designing products that are radically new and tackling the many different issues of modern knowledge workers. We would like to engage a wider cross-section of the community: our focus will be on issues of development and participation and the impact of different values in our work.
Scalable Molecular Dynamics with NAMD
Phillips, James C.; Braun, Rosemary; Wang, Wei; Gumbart, James; Tajkhorshid, Emad; Villa, Elizabeth; Chipot, Christophe; Skeel, Robert D.; Kalé, Laxmikant; Schulten, Klaus
2008-01-01
NAMD is a parallel molecular dynamics code designed for high-performance simulation of large biomolecular systems. NAMD scales to hundreds of processors on high-end parallel platforms, as well as tens of processors on low-cost commodity clusters, and also runs on individual desktop and laptop computers. NAMD works with AMBER and CHARMM potential functions, parameters, and file formats. This paper, directed to novices as well as experts, first introduces concepts and methods used in the NAMD program, describing the classical molecular dynamics force field, equations of motion, and integration methods along with the efficient electrostatics evaluation algorithms employed and temperature and pressure controls used. Features for steering the simulation across barriers and for calculating both alchemical and conformational free energy differences are presented. The motivations for and a roadmap to the internal design of NAMD, implemented in C++ and based on Charm++ parallel objects, are outlined. The factors affecting the serial and parallel performance of a simulation are discussed. Next, typical NAMD use is illustrated with representative applications to a small, a medium, and a large biomolecular system, highlighting particular features of NAMD, e.g., the Tcl scripting language. Finally, the paper provides a list of the key features of NAMD and discusses the benefits of combining NAMD with the molecular graphics/sequence analysis software VMD and the grid computing/collaboratory software BioCoRE. NAMD is distributed free of charge with source code at www.ks.uiuc.edu. PMID:16222654
The Cloud-Based Integrated Data Viewer (IDV)
NASA Astrophysics Data System (ADS)
Fisher, Ward
2015-04-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there are a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software which is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software API's. The advent of cloud computing has provided a solution to this problem, which was not previously practical on a large scale; Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness Application Streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved. We will also discuss the differences between local software and software-as-a-service.
Stern, E J; Westenberg, L
1995-05-01
Desktop computer hardware and software provide many new and accessible avenues for increased academic productivity, but some activities may have legal implications. The advent of technologies such as scanners, the ever-increasing number of electronic bulletin boards, and the development of the "information superhighway" affect the concept of copyright and require authors and publishers to reconsider their legal rights and obligations when they create or publish new works or modify existing ones. For example, with desktop scanners, almost any image, published or otherwise, can be copied, enhanced, and manipulated. Moreover, many radiologists have access to copyrighted digital radiologic teaching file images, such as those from the University of Iowa or the University of Washington, which are available (and "downloadable") on the Internet. Because "downloading" (or "uploading") a document or image is essentially making a copy of that document or image, copyright laws and the rights that they afford authors are involved.
NASA Astrophysics Data System (ADS)
Sauermann, Leo; Kiesel, Malte; Schumacher, Kinga; Bernardi, Ansgar
This contribution shows what the workplace of the future could look like and where the Semantic Web opens up new possibilities. To this end, approaches from the areas of the Semantic Web, knowledge representation, desktop applications and visualisation are presented that make it possible to reinterpret and reuse a user's existing data. The combination of the Semantic Web and desktop computers brings particular advantages - a paradigm known under the title Semantic Desktop. The described possibilities for application integration are not limited to the desktop, however, but can equally be used in web applications.
The prevalence of computer-related musculoskeletal complaints in female college students.
Hamilton, Audra G; Jacobs, Karen; Orsmond, Gael
2005-01-01
The purpose of this study was to determine the prevalence of computer-related musculoskeletal complaints in female college students. This research also explored whether the number of hours per day spent using a computer, type of computer used (laptop vs. desktop), or academic major was related to the presence of musculoskeletal complaints. Additionally, "job strain", a measure of job stress which can affect the physical health of an individual, was measured to determine whether students feel stress from the job of "student" and if so, whether it contributed to these complaints. Two surveys, The Boston University Computer and Health Survey and the Job Content Questionnaire [9], were distributed to 111 female college students to measure musculoskeletal complaints and job strain. Seventy-two surveys were returned. Chi-square and logistic regression were used to analyze the data. The results indicated that 80.6% of the participants reported computer-related musculoskeletal complaints in the two weeks prior to completing the survey, although none of the examined factors were associated with the complaints. It is notable, however, that 82% of the students reported spending 0-6 hours/day using a computer, with almost 28% reporting 4-6 hours/day of usage. Eleven percent of the participants reported using the computer more than 8 hours/day. Of those students who use a laptop computer for all computer use, 90.1% reported musculoskeletal complaints. The students reported that they did not experience job strain. Further studies should be performed using a survey specifically intended for college students. The majority of female college students in this study reported musculoskeletal discomfort during or after computer use. Although a statistical correlation could not be made, students using laptop computers reported a higher incidence of musculoskeletal symptoms than those using desktop computers. Additionally, female college students did not seem to experience job strain. Future research should continue on larger, more diverse samples of students to better understand the prevalence and contributors of musculoskeletal complaints, how college students experience job strain (stress), and whether these two factors are related.
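For readers unfamiliar with the analysis named in the methods, a minimal example of a chi-square test of association on a contingency table is sketched below; the counts are hypothetical and are not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table (NOT the study's data): rows = laptop vs. desktop users,
# columns = reported vs. did not report musculoskeletal complaints.
table = [[30, 3],    # laptop users
         [28, 11]]   # desktop users

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```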
High Temperature Thermoplastic Additive Manufacturing Using Low-Cost, Open-Source Hardware
NASA Technical Reports Server (NTRS)
Gardner, John M.; Stelter, Christopher J.; Yashin, Edward A.; Siochi, Emilie J.
2016-01-01
Additive manufacturing (or 3D printing) via Fused Filament Fabrication (FFF), also known as Fused Deposition Modeling (FDM), is a process where material is placed in specific locations layer-by-layer to create a complete part. Printers designed for FFF build parts by extruding a thermoplastic filament from a nozzle in a predetermined path. Originally developed for commercial printers, 3D printing via FFF has become accessible to a much larger community of users since the introduction of Reprap printers. These low-cost, desktop machines are typically used to print prototype parts or novelty items. As the adoption of desktop sized 3D printers broadens, there is increased demand for these machines to produce functional parts that can withstand harsher conditions such as high temperature and mechanical loads. Materials meeting these requirements tend to possess better mechanical properties and higher glass transition temperatures (Tg), thus requiring printers with high temperature printing capability. This report outlines the problems and solutions, and includes a detailed description of the machine design, printing parameters, and processes specific to high temperature thermoplastic 3D printing.
shinyheatmap: Ultra fast low memory heatmap web interface for big data genomics.
Khomtchouk, Bohdan B; Hennessy, James R; Wahlestedt, Claes
2017-01-01
Transcriptomics, metabolomics, metagenomics, and other various next-generation sequencing (-omics) fields are known for their production of large datasets, especially across single-cell sequencing studies. Visualizing such big data has posed technical challenges in biology, both in terms of available computational resources as well as programming acumen. Since heatmaps are used to depict high-dimensional numerical data as a colored grid of cells, efficiency and speed have often proven to be critical considerations in the process of successfully converting data into graphics. For example, rendering interactive heatmaps from large input datasets (e.g., 100k+ rows) has been computationally infeasible on both desktop computers and web browsers. In addition to memory requirements, programming skills and knowledge have frequently been barriers-to-entry for creating highly customizable heatmaps. We propose shinyheatmap: an advanced user-friendly heatmap software suite capable of efficiently creating highly customizable static and interactive biological heatmaps in a web browser. shinyheatmap is a low memory footprint program, making it particularly well-suited for the interactive visualization of extremely large datasets that cannot typically be computed in-memory due to size restrictions. Also, shinyheatmap features a built-in high performance web plug-in, fastheatmap, for rapidly plotting interactive heatmaps of datasets as large as 10^5-10^7 rows within seconds, effectively shattering previous performance benchmarks of heatmap rendering speed. shinyheatmap is hosted online as a freely available web server with an intuitive graphical user interface: http://shinyheatmap.com. The methods are implemented in R, and are available as part of the shinyheatmap project at: https://github.com/Bohdan-Khomtchouk/shinyheatmap. Users can access fastheatmap directly from within the shinyheatmap web interface, and all source code has been made publicly available on Github: https://github.com/Bohdan-Khomtchouk/fastheatmap.
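shinyheatmap and fastheatmap are R/Shiny tools; the snippet below is not their code, only a generic illustration of the underlying task - rendering a heatmap from a matrix far too large to draw cell-by-cell by first aggregating blocks of rows. The matrix, block size and colour map are arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.normal(size=(200_000, 20))   # stand-in for a large expression matrix

# Average blocks of rows before drawing so the figure stays small and responsive;
# the web tool works on the full matrix, this is only a local approximation.
block = 1000
trimmed = data[: (data.shape[0] // block) * block]
reduced = trimmed.reshape(-1, block, data.shape[1]).mean(axis=1)

plt.imshow(reduced, aspect="auto", cmap="viridis")
plt.colorbar(label="mean value per block of rows")
plt.xlabel("sample")
plt.ylabel("row block")
plt.savefig("heatmap.png", dpi=150)
```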
NASA Astrophysics Data System (ADS)
Zackay, Barak; Ofek, Eran O.
2017-01-01
Astronomical radio signals are subjected to phase dispersion while traveling through the interstellar medium. To optimally detect a short-duration signal within a frequency band, we have to precisely compensate for the unknown pulse dispersion, which is a computationally demanding task. We present the “fast dispersion measure transform” algorithm for optimal detection of such signals. Our algorithm has a low theoretical complexity of 2 N_f N_t + N_t N_Δ log_2(N_f), where N_f, N_t, and N_Δ are the numbers of frequency bins, time bins, and dispersion measure bins, respectively. Unlike previously suggested fast algorithms, our algorithm conserves the sensitivity of brute-force dedispersion. Our tests indicate that this algorithm, running on a standard desktop computer and implemented in a high-level programming language, is already faster than the state-of-the-art dedispersion codes running on graphical processing units (GPUs). We also present a variant of the algorithm that can be efficiently implemented on GPUs. The latter algorithm’s computation and data-transport requirements are similar to those of a two-dimensional fast Fourier transform, indicating that incoherent dedispersion can now be considered a nonissue while planning future surveys. We further present a fast algorithm for sensitive detection of pulses shorter than the dispersive smearing limits of incoherent dedispersion. In typical cases, this algorithm is orders of magnitude faster than enumerating dispersion measures and coherently dedispersing by convolution. We analyze the computational complexity of pulsed signal searches by radio interferometers. We conclude that, using our suggested algorithms, maximally sensitive blind searches for dispersed pulses are feasible using existing facilities. We provide an implementation of these algorithms in Python and MATLAB.
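The fast dispersion measure transform itself is not reproduced here; as a point of reference, a brute-force incoherent dedispersion search - the O(N_f N_t N_Δ) baseline the paper improves on - can be sketched in a few lines. The band, time resolution and trial dispersion measures below are invented.

```python
import numpy as np

K_DM = 4.15  # dispersion delay constant in ms, for DM in pc cm^-3 and frequency in GHz

def dedisperse(dynamic_spectrum, freqs_ghz, dm, dt_ms):
    """Brute-force incoherent dedispersion of a (frequency x time) array:
    shift every channel by its dispersion delay and sum over frequency."""
    f_ref = freqs_ghz.max()
    out = np.zeros(dynamic_spectrum.shape[1])
    for row, f in zip(dynamic_spectrum, freqs_ghz):
        shift = int(round(K_DM * dm * (f ** -2 - f_ref ** -2) / dt_ms))
        out += np.roll(row, -shift)
    return out

# Toy search over trial dispersion measures (all numbers are illustrative only).
rng = np.random.default_rng(0)
n_chan, n_time, dt_ms, true_dm, t0 = 64, 1024, 1.0, 100.0, 200
freqs = np.linspace(1.2, 1.5, n_chan)
spec = rng.normal(size=(n_chan, n_time))
for i, f in enumerate(freqs):  # inject a dispersed pulse
    spec[i, t0 + int(round(K_DM * true_dm * (f ** -2 - freqs.max() ** -2) / dt_ms))] += 10.0
best_dm = max(np.arange(0.0, 200.0, 5.0),
              key=lambda dm: dedisperse(spec, freqs, dm, dt_ms).max())
print("best trial DM:", best_dm)
```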
Choosing the Right Desktop Publisher.
ERIC Educational Resources Information Center
Eiser, Leslie
1988-01-01
Investigates the many different desktop publishing packages available today. Lists the steps to desktop publishing. Suggests which package to use with specific hardware available. Compares several packages for IBM, Mac, and Apple II based systems. (MVL)
The Printout: Desktop Publishing in the Classroom.
ERIC Educational Resources Information Center
Balajthy, Ernest; Link, Gordon
1988-01-01
Reviews software available to the classroom teacher for desktop publishing and describes specific classroom activities. Suggests using desktop publishing to produce large print texts for students with limited sight or for primary students. (NH)
Desktop Publishing: Changing Technology, Changing Occupations.
ERIC Educational Resources Information Center
Stanton, Michael
1991-01-01
Describes desktop publishing (DTP) and its place in corporations. Lists job titles of those working in desktop publishing and describes DTP as it is taught at secondary and postsecondary levels and by private trainers. (JOW)
Algorithms of GPU-enabled reactive force field (ReaxFF) molecular dynamics.
Zheng, Mo; Li, Xiaoxia; Guo, Li
2013-04-01
Reactive force field (ReaxFF), a recent and novel bond order potential, allows for reactive molecular dynamics (ReaxFF MD) simulations for modeling larger and more complex molecular systems involving chemical reactions when compared with computation intensive quantum mechanical methods. However, ReaxFF MD can be approximately 10-50 times slower than classical MD due to its explicit modeling of bond forming and breaking, the dynamic charge equilibration at each time-step, and its one order smaller time-step than the classical MD, all of which pose significant computational challenges in simulation capability to reach spatio-temporal scales of nanometers and nanoseconds. The very recent advances of graphics processing unit (GPU) provide not only highly favorable performance for GPU enabled MD programs compared with CPU implementations but also an opportunity to manage with the computing power and memory demanding nature imposed on computer hardware by ReaxFF MD. In this paper, we present the algorithms of GMD-Reax, the first GPU enabled ReaxFF MD program with significantly improved performance surpassing CPU implementations on desktop workstations. The performance of GMD-Reax has been benchmarked on a PC equipped with a NVIDIA C2050 GPU for coal pyrolysis simulation systems with atoms ranging from 1378 to 27,283. GMD-Reax achieved speedups as high as 12 times faster than Duin et al.'s FORTRAN codes in Lammps on 8 CPU cores and 6 times faster than the Lammps' C codes based on PuReMD in terms of the simulation time per time-step averaged over 100 steps. GMD-Reax could be used as a new and efficient computational tool for exploiting very complex molecular reactions via ReaxFF MD simulation on desktop workstations. Copyright © 2013 Elsevier Inc. All rights reserved.
Promises and Realities of Desktop Publishing.
ERIC Educational Resources Information Center
Thompson, Patricia A.; Craig, Robert L.
1991-01-01
Examines the underlying assumptions of the rhetoric of desktop publishing promoters. Suggests four criteria to help educators provide insights into issues and challenges concerning desktop publishing technology that design students will face on the job. (MG)
Making the Leap to Desktop Publishing.
ERIC Educational Resources Information Center
Schleifer, Neal
1986-01-01
Describes one teacher's approach to desktop publishing. Explains how the Macintosh and LaserWriter were used in the publication of a school newspaper. Guidelines are offered to teachers for the establishment of a desktop publishing lab. (ML)
ERIC Educational Resources Information Center
Ezekiel, Aaron B.
At the University of New Mexico, stakeholders from the Computer and Information Resources and Technology (CIRT) department, Financial Systems, the Health Sciences Center, and the General Libraries, were involved in deciding on the goals of a project to replace Telnet with a suite of network middleware and productivity software on campus computer…
NASA Technical Reports Server (NTRS)
2000-01-01
Kennedy Space Center's need to conduct real-time monitoring of Space Shuttle operations led to the development of Netlander Inc.'s JTouch system. The technology behind JTouch allows engineers to view Space Shuttle and ground support data from any desktop computer using a web browser. Companies can make use of JTouch to better monitor locations scattered around the world, increasing decision-making speed and reducing travel costs for site visits.
ERIC Educational Resources Information Center
Benedict, Lucille; Pence, Harry E.
2012-01-01
Increasing numbers of college students own cell phones, and many of these phones are smartphones, which include features such as still and video cameras, global positioning systems, Internet access, and computers as powerful as the desktop models of only a few years ago. A number of chemical educators are already using these devices for education.…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-21
... Carolina. The notice was published in the Federal Register on April 23, 2010 (75 FR 21361). The notice was... notice was published in the Federal Register on April 19, 2010 (75 FR 20385) At the request of the State... engaged in employment related to the production of desktop computers. New information shows that workers...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-16
... Carolina. The notice was published in the Federal Register on April 23, 2010 (75 FR 21361). The notice was... notice was published in the Federal Register on April 19, 2010 (75 FR 20385). At the request of the State... engaged in employment related to the production of desktop computers. New information shows that workers...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-13
... Carolina. The notice was published in the Federal Register on April 23, 2010 (75 FR 21361). The notice was... notice was published in the Federal Register on April 19, 2010 (75 FR 20385) At the request of the State... engaged in employment related to the production of desktop computers. New information shows that workers...
Design and Implementation of a Motor Incremental Shaft Encoder
2008-09-01
…student to make accurate predictions of voltage source converter (VSC) behavior via software simulation; these simulated results could also be… (VSC), and several other off-the-shelf components, a circuit board interface between FPGA and the power source, and a desktop computer [1]. Now, the…
All Aboard the Internet. E to Go: The Internet in Your Pocket
ERIC Educational Resources Information Center
Descy, Don E.
2006-01-01
Some remember when they accessed the Internet through the telephone on a 300-baud modem. It was slow and it was not easy, but it gave them a view of the world. Now there is the web, idiot-proof email, and practically unlimited wireless connectivity. Having gone from desktop computers to laptops, now a major push is moving connectivity into the…
Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home
ERIC Educational Resources Information Center
Zilka, Gila Cohen
2016-01-01
Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital…
Tool Integration Framework for Bio-Informatics
2007-04-01
Java NetBeans [11] based Integrated Development Environment (IDE) for developing modules and packaging computational tools. The framework is extremely...integrate an Eclipse front-end for Desktop Integration. Eclipse was chosen over Netbeans owing to a higher acceptance, better infrastructure...5.0. This version of Dashboard ran with NetBeans IDE 3.6 requiring Java Runtime 1.4 on a machine with Windows XP. The toolchain is executed by
Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments
2012-06-01
…framework of technology that allows all interested systems, inside and outside of an organization, to expose and access well-defined services, and… was established to manage the Navy's three largest enterprise networks: the OCONUS Navy Enterprise Network (ONE-NET), the Navy-Marine Corps…
Using Mosix for Wide-Area Computational Resources
Maddox, Brian G.
2004-01-01
One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.
Morris, Paul D; Silva Soto, Daniel Alejandro; Feher, Jeroen F A; Rafiroiu, Dan; Lungu, Angela; Varma, Susheel; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P
2017-08-01
Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel "pseudotransient" analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.
Computational algorithms for simulations in atmospheric optics.
Konyaev, P A; Lukin, V P
2016-04-20
A computer simulation technique for atmospheric and adaptive optics based on parallel programing is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
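The paper's modified spectral-phase method is not given in the abstract; a bare-bones version of the conventional spectral (FFT) approach to generating a 2D random phase field with a Kolmogorov-like spectrum is sketched below. The normalisation is kept schematic and the grid and Fried parameters are arbitrary.

```python
import numpy as np

def phase_screen(n, delta, r0, seed=0):
    """2D Kolmogorov-like random phase field via the spectral (FFT) method:
    complex white noise is shaped by the square root of the phase power spectrum
    and transformed to the spatial domain.  Normalisation is kept schematic."""
    rng = np.random.default_rng(seed)
    f1d = np.fft.fftfreq(n, d=delta)              # spatial frequencies, 1/m
    fx, fy = np.meshgrid(f1d, f1d)
    fr = np.hypot(fx, fy)
    fr[0, 0] = 1.0 / (n * delta)                  # avoid dividing by zero at DC
    psd = 0.023 * r0 ** (-5.0 / 3.0) * fr ** (-11.0 / 3.0)  # Kolmogorov phase PSD
    psd[0, 0] = 0.0                               # drop the (unobservable) piston term
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    df = 1.0 / (n * delta)
    return np.real(np.fft.ifft2(noise * np.sqrt(psd) * df)) * n * n

# Arbitrary grid (256 x 256, 1 cm spacing) and Fried parameter r0 = 10 cm.
screen = phase_screen(n=256, delta=0.01, r0=0.1)
print(screen.shape, round(float(screen.std()), 3))
```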
Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)
NASA Astrophysics Data System (ADS)
Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.
2017-02-01
In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.
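The 3D adjoint formulation is well beyond an abstract, but the core idea - recover the elastic parameters that make predicted displacements match measured ones - can be illustrated with a one-dimensional toy analogue: a chain of springs in series under a known load, with the stiffness profile recovered by least squares. Everything in the sketch is invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

N_SEGMENTS, FORCE = 10, 1.0
true_k = np.linspace(1.0, 3.0, N_SEGMENTS)   # "unknown" stiffness profile (made up)

def displacements(stiffness):
    """Node displacements of a serial spring chain under a constant end load."""
    return np.cumsum(FORCE / stiffness)

rng = np.random.default_rng(0)
measured = displacements(true_k) + rng.normal(scale=1e-3, size=N_SEGMENTS)

# Seek the stiffness distribution whose predicted displacements match the measurement,
# i.e. minimise the misfit subject to the (here trivial) equilibrium model.
fit = least_squares(lambda k: displacements(k) - measured,
                    x0=np.full(N_SEGMENTS, 2.0), bounds=(1e-3, np.inf))
print("recovered:", np.round(fit.x, 2))
print("true:     ", np.round(true_k, 2))
```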
Dynamic Collaboration Infrastructure for Hydrologic Science
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Castillo, C.; Yi, H.; Jiang, F.; Jones, N.; Goodall, J. L.
2016-12-01
Data and modeling infrastructure is becoming increasingly accessible to water scientists. HydroShare is a collaborative environment that currently offers water scientists the ability to access modeling and data infrastructure in support of data intensive modeling and analysis. It supports the sharing of and collaboration around "resources" which are social objects defined to include both data and models in a structured standardized format. Users collaborate around these objects via comments, ratings, and groups. HydroShare also supports web services and cloud based computation for the execution of hydrologic models and analysis and visualization of hydrologic data. However, the quantity and variety of data and modeling infrastructure available that can be accessed from environments like HydroShare is increasing. Storage infrastructure can range from one's local PC to campus or organizational storage to storage in the cloud. Modeling or computing infrastructure can range from one's desktop to departmental clusters to national HPC resources to grid and cloud computing resources. How does one orchestrate this vast number of data and computing infrastructure without needing to correspondingly learn each new system? A common limitation across these systems is the lack of efficient integration between data transport mechanisms and the corresponding high-level services to support large distributed data and compute operations. A scientist running a hydrology model from their desktop may require processing a large collection of files across the aforementioned storage and compute resources and various national databases. To address these community challenges a proof-of-concept prototype was created integrating HydroShare with RADII (Resource Aware Data-centric collaboration Infrastructure) to provide software infrastructure to enable the comprehensive and rapid dynamic deployment of what we refer to as "collaborative infrastructure." In this presentation we discuss the results of this proof-of-concept prototype which enabled HydroShare users to readily instantiate virtual infrastructure marshaling arbitrary combinations, varieties, and quantities of distributed data and computing infrastructure in addressing big problems in hydrology.
Quantifying Pollutant Emissions from Office Equipment Phase IReport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddalena, R.L.; Destaillats, H.; Hodgson, A.T.
2006-12-01
Although office equipment has been a focal point for governmental efforts to promote energy efficiency through programs such as Energy Star, little is known about the relationship between office equipment use and indoor air quality. This report provides results of the first phase (Phase I) of a study in which the primary objective is to measure emissions of organic pollutants and particulate matter from a selected set of office equipment typically used in residential and office environments. The specific aims of the overall research effort are: (1) use screening-level measurements to identify and quantify the concentrations of air pollutants of interest emitted by major categories of distributed office equipment in a controlled environment; (2) quantify the emissions of air pollutants from generally representative, individual machines within each of the major categories in a controlled chamber environment using well defined protocols; (3) characterize the effects of ageing and use on emissions for individual machines spanning several categories; (4) evaluate the importance of operational factors that can be manipulated to reduce pollutant emissions from office machines; and (5) explore the potential relationship between energy consumption and pollutant emissions for machines performing equivalent tasks. The study includes desktop computers (CPU units), computer monitors, and three categories of desktop printing devices. The printer categories are: (1) printers and multipurpose devices using color inkjet technology; (2) low- to medium output printers and multipurpose devices employing monochrome or color laser technology; and (3) high-output monochrome and color laser printers. The literature review and screening level experiments in Phase 1 were designed to identify substances of toxicological significance for more detailed study. In addition, these screening level measurements indicate the potential relative importance of different categories of office equipment with respect to human exposures. The more detailed studies of the next phase of research (Phase II) are meant to characterize changes in emissions with time and may identify factors that can be modified to reduce emissions. These measurements may identify 'win-win' situations in which low energy consumption machines have lower pollutant emissions. This information will be used to compare machines to determine if some are substantially better than their peers with respect to their emissions of pollutants.
EPA Region 8, Memo on Desktop Printer Ink Cartridges Policy & Voluntary Printer Turn-in
This memo requests EPA Region 8 users to voluntarily turn in their desktop printers and notifies users of the Region 8 policy to not provide maintenance or ink and toner cartridges for desktop printers.
MedlinePlus® Everywhere: Access from Your Phone, Tablet or Desktop
… consistent user experience from a desktop, tablet, or phone. All users, regardless of how they access MedlinePlus, …
The application of cloud computing to scientific workflows: a study of cost and performance.
Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S
2013-01-28
The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.
Shin, Soo-Yong; Kim, Taerim; Seo, Dong-Woo; Sohn, Chang Hwan; Kim, Sung-Hoon; Ryoo, Seung Mok; Lee, Yoon-Seon; Lee, Jae Ho; Kim, Won Young; Lim, Kyoung Soo
2016-01-01
Digital surveillance using internet search queries can improve both the sensitivity and timeliness of detecting a health event such as an influenza outbreak. Although mobile search volume has recently been estimated to surpass desktop search volume, and mobile search patterns differ from desktop search patterns, previous digital surveillance systems did not distinguish between mobile and desktop search queries. The purpose of this study was to compare the performance of mobile and desktop search queries for digital influenza surveillance. The study period ran from September 6, 2010 through August 30, 2014, covering four epidemiological years. Influenza-like illness (ILI) and virologic surveillance data from the Korea Centers for Disease Control and Prevention were used, together with 210 combined queries from our previous survey work. Weekly mobile and desktop search data were extracted from Naver, the largest search engine in Korea. Spearman's correlation analysis was used to examine the correlation of the mobile and desktop data with the ILI and virologic data, and a lag correlation analysis was also performed. The influenza surveillance performance of mobile search queries matched or exceeded that of desktop search queries over time: the mean correlation coefficients of mobile search queries, and the number of queries with an r-value of ≥ 0.7, equaled or exceeded those of desktop searches over the four epidemiological years, and a lag correlation analysis of up to two weeks showed similar trends. Our study shows that mobile search queries for influenza surveillance have come to equal or exceed desktop search queries over time; future development of influenza surveillance based on search queries should take this shift towards mobile search into account. PMID:27391028
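As a rough sketch of the kind of analysis described above, the snippet below computes Spearman correlations between a weekly search-volume series and an ILI series at lags of zero to two weeks. The column names, file layout, and the synthetic demo data are hypothetical; the study's actual Naver and KCDC data are not reproduced here.

```python
# Spearman correlation of weekly search volume against ILI rates,
# repeated at lags of 0-2 weeks (a minimal stand-in for the lag
# correlation analysis described in the abstract).
import pandas as pd
from scipy.stats import spearmanr

def lagged_spearman(df, query_col, ili_col, max_lag_weeks=2):
    """Return {lag: (r, p)} for the query series shifted 0..max_lag_weeks weeks."""
    results = {}
    for lag in range(max_lag_weeks + 1):
        shifted = df[query_col].shift(lag)      # query at week t-lag vs ILI at week t
        valid = shifted.notna()
        r, p = spearmanr(shifted[valid], df.loc[valid, ili_col])
        results[lag] = (r, p)
    return results

# Tiny synthetic demo, just to show the call pattern; real use would read a
# weekly CSV with, e.g., columns week, mobile_volume, desktop_volume, ili_rate.
weeks = pd.Series(range(52))
demo = pd.DataFrame({"mobile_volume": weeks * 2 + 3, "ili_rate": weeks})
print(lagged_spearman(demo, "mobile_volume", "ili_rate"))
```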
A visualization environment for supercomputing-based applications in computational mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pavlakos, C.J.; Schoof, L.A.; Mareda, J.F.
1993-06-01
In this paper, we characterize a visualization environment that has been designed and prototyped for a large community of scientists and engineers, with an emphasis on supercomputing-based computational mechanics. The proposed environment makes use of a visualization server concept to provide effective, interactive visualization at the user's desktop. Benefits of the visualization server approach are discussed, as are some thoughts on desirable features for visualization server hardware architectures. A brief discussion of the software environment is included. The paper concludes by summarizing observations we have made regarding the implementation of such visualization environments.
Real-time control system for adaptive resonator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flath, L; An, J; Brase, J
2000-07-24
Sustained operation of high-average-power solid-state lasers currently requires an adaptive resonator to produce optimal beam quality. We describe the architecture of a real-time adaptive control system for correcting intra-cavity aberrations in a heat-capacity laser. Image data collected from a wavefront sensor are processed and used to control the phase of the beam with a high-spatial-resolution deformable mirror. Our controller takes advantage of recent developments in low-cost, high-performance processor technology: a desktop-based computational engine and an object-oriented software architecture replace the high-cost rack-mount embedded computers of previous systems.
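A minimal sketch of the kind of closed-loop correction step such a system performs is given below, assuming a standard least-squares wavefront reconstructor and a leaky-integrator control law. The influence matrix, gains, and array sizes are illustrative assumptions rather than parameters of the actual heat-capacity-laser controller.

```python
# One adaptive-optics control iteration: sensor slopes in, mirror commands out.
# Geometry, gains, and the "aberration" are all assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_slopes, n_actuators = 128, 37                         # assumed sensor/mirror geometry
influence = rng.normal(size=(n_slopes, n_actuators))    # slopes per unit actuator stroke
reconstructor = np.linalg.pinv(influence)               # least-squares reconstructor
gain, leak = 0.3, 0.99                                  # loop gain and leak factor

def control_step(commands, measured_slopes):
    """Update the deformable-mirror commands from one wavefront-sensor frame."""
    residual = reconstructor @ measured_slopes           # estimated residual phase error
    return leak * commands - gain * residual             # leaky-integrator update

true_aberration = 0.1 * rng.normal(size=n_actuators)    # static intra-cavity error (fake)
commands = np.zeros(n_actuators)
for _ in range(20):                                      # stand-in for the real-time loop
    slopes = influence @ (true_aberration + commands)    # residual wavefront seen by sensor
    commands = control_step(commands, slopes)
# After a few iterations the commands approximately cancel the static aberration.
```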
Rhinoplasty perioperative database using a personal digital assistant.
Kotler, Howard S
2004-01-01
To construct a reliable, accurate, and easy-to-use handheld computer database that facilitates the point-of-care acquisition of perioperative text and image data specific to rhinoplasty. A user-modified database (Pendragon Forms [v.3.2]; Pendragon Software Corporation, Libertyville, Ill) and a graphic image program (Tealpaint [v.4.87]; Tealpaint Software, San Rafael, Calif) were used to capture text and image data, respectively, on a Palm OS (v.4.11) handheld with 8 megabytes of memory. The handheld and desktop databases were kept secure using PDASecure (v.2.0) and GoldSecure (v.3.0) (Trust Digital LLC, Fairfax, Va). The handheld data were then uploaded to a desktop database in either FileMaker Pro 5.0 (v.1) (FileMaker Inc, Santa Clara, Calif) or Microsoft Access 2000 (Microsoft Corp, Redmond, Wash). Patient data were collected from 15 patients undergoing rhinoplasty in a private practice outpatient ambulatory setting. Data integrity was assessed after 6 months' disk and hard drive storage. The handheld database facilitated data collection and accurately recorded, transferred, and reliably maintained perioperative rhinoplasty data. Query capability allowed rapid searching on a multitude of keyword terms specific to the operative maneuvers performed in rhinoplasty. Handheld computer technology provides a method of reliably recording and storing perioperative rhinoplasty information. The handheld computer facilitates the reliable and accurate storage and query of perioperative data, assisting in the retrospective review of one's own results and the enhancement of surgical skills.
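For illustration only, the sketch below models the desktop side of such a workflow as a small relational table plus a keyword query over operative maneuvers, standing in for the FileMaker Pro or Access databases named above; all field names, values, and search terms are hypothetical.

```python
# Minimal relational stand-in for the desktop rhinoplasty database:
# one table of perioperative records and a keyword search over maneuvers.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE rhinoplasty_cases (
        case_id     INTEGER PRIMARY KEY,
        op_date     TEXT,
        maneuvers   TEXT,     -- free-text list of operative maneuvers
        image_path  TEXT      -- pointer to the image captured on the handheld
    )
""")
conn.execute(
    "INSERT INTO rhinoplasty_cases (op_date, maneuvers, image_path) VALUES (?, ?, ?)",
    ("2004-03-15", "open approach; cephalic trim; columellar strut graft", "case_001.png"),
)

# Keyword query analogous to the abstract's search on operative maneuvers.
rows = conn.execute(
    "SELECT case_id, op_date FROM rhinoplasty_cases WHERE maneuvers LIKE ?",
    ("%columellar strut%",),
).fetchall()
print(rows)
```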