Competition in Defense Acquisitions
2008-05-14
NASA's Outsourcing Desktop Initiative (ODIN) transferred the responsibility for providing and managing the vast desktop assets to the private sector. Previously, NASA employees maintained desktop assets with no way to track costs, no standardization, and no tracking of service quality. ODIN goals: cut desktop computing costs, increase service quality, and achieve interoperability and standardization.
Digital Dome versus Desktop Display: Learning Outcome Assessments by Domain Experts
ERIC Educational Resources Information Center
Jacobson, Jeffery
2013-01-01
In previous publications, the author reported that students learned about Egyptian architecture and society by playing an educational game based on a virtual representation of a temple. Students played the game in a digital dome or on a standard desktop computer, and each then recorded a video tour of the temple. Those who had used the dome…
An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.
Szeto, Grace P; Lee, Raymond
2002-04-01
To evaluate and compare the postures and movements of the cervical and upper thoracic spine, typing performance, and workstation ergonomic factors when using desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students aged 20 to 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task using each of the 3 computers. Measurements during the typing task were taken at set intervals. The cervical and thoracic spine adopted a more flexed posture when the smaller computers were used. Neck movements were significantly greater when using desktop computers than with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as computer size decreased. Typing performance and subjective ratings of keyboard difficulty also differed significantly among the 3 types of computers. Computer users need to consider spinal posture and the potential risk of developing musculoskeletal discomfort when choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation
Operation of the HP2250 with the HP9000 series 200 using PASCAL 3.0
NASA Technical Reports Server (NTRS)
Perry, John; Stroud, C. W.
1986-01-01
A computer program has been written to provide an interface between the HP Series 200 desktop computers, operating under HP Standard Pascal 3.0, and the HP2250 Data Acquisition and Control System. Pascal 3.0 for the HP9000 desktop computer provides a number of procedures for handling bus communication at various levels. It is necessary, however, to reach the lowest possible level in Pascal to handle the bus protocols required by the HP2250. This makes programming extremely complex, since these protocols are not documented. The program described solves these problems and allows the user to immediately program, simply and efficiently, any measurement and control language (MCL/50) application with a few procedure calls. The complete set of procedures is available on a 5 1/4 inch diskette from COSMIC. Included in this group of procedures is an Exerciser which allows the user to exercise the HP2250 interactively. The Exerciser operates in a fashion similar to the Series 200 operating system programs, but is adapted to the requirements of the HP2250. The programs on the diskette and the user's manual assume the user is acquainted with both the MCL/50 programming language and HP Standard Pascal 3.0 for the HP Series 200 desktop computers.
Desktop publishing and validation of custom near visual acuity charts.
Marran, Lynn; Liu, Lei; Lau, George
2008-11-01
Customized visual acuity (VA) assessment is an important part of basic and clinical vision research. Desktop computer-based distance VA measurements have been utilized, and shown to be accurate and reliable, but computer-based near VA measurements have not been attempted, mainly due to the limited spatial resolution of computer monitors. In this paper, we demonstrate how to use desktop publishing to create printed custom near VA charts. We created a set of six near VA charts in a logarithmic progression, 20/20 through 20/63, with multiple lines of the same acuity level, different letter arrangements in each line, and a random noise background. This design allowed repeated measures of subjective accommodative amplitude without the potential artifact of familiarity of the optotypes. The background maintained a constant and spatial-frequency-rich peripheral stimulus for accommodation across the six different acuity levels. The paper describes in detail how pixel-wise accurate black-and-white bitmaps of Sloan optotypes were used to create the printed custom VA charts. At all acuity levels, the physical sizes of the printed custom optotypes deviated no more than 0.034 log units from that of the standard, satisfying the 0.05 log unit ISO criterion we used to demonstrate physical equivalence. Also, at all acuity levels, log unit differences in the mean target distance for which reliable recognition of letters first occurred for the printed custom optotypes compared to the standard were found to be below 0.05, satisfying the 0.05 log unit ISO criterion we used to demonstrate functional equivalence. It is possible to use desktop publishing to create custom near VA charts that are physically and functionally equivalent to standard VA charts produced by a commercial printing process.
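The chart geometry above lends itself to a worked check. The sketch below is a hypothetical Python illustration, not code from the paper: it computes the physical height of Snellen optotypes for a 20/20 through 20/63 logarithmic progression at an assumed 40 cm near-viewing distance (taking a 20/20 letter to subtend 5 arcmin) and applies the 0.05 log unit equivalence criterion. The function names and the 8% print-error figure are invented for the example.

```python
import math

def optotype_height_mm(snellen_denom, distance_m=0.40):
    """Physical letter height for a Snellen 20/x optotype at a given
    viewing distance: a 20/20 letter subtends 5 arcmin, and larger
    lines scale in proportion (denominator / 20)."""
    arcmin = 5.0 * snellen_denom / 20.0
    return distance_m * math.tan(math.radians(arcmin / 60.0)) * 1000.0

def log_unit_deviation(measured_mm, nominal_mm):
    """Size deviation between a printed letter and its nominal size,
    expressed in log10 units (the paper's equivalence metric)."""
    return abs(math.log10(measured_mm / nominal_mm))

# Six-chart logarithmic progression (~0.1 log unit steps), as in the paper.
denominators = [20, 25, 32, 40, 50, 63]
heights_mm = {d: round(optotype_height_mm(d), 3) for d in denominators}

# A printed letter passes the 0.05 log unit criterion if its size
# deviates from nominal by less than ~12% (10**0.05 ~= 1.122).
nominal = optotype_height_mm(20)
printed = nominal * 1.08   # hypothetical 8% print scaling error
assert log_unit_deviation(printed, nominal) < 0.05
```

The reported worst-case deviation of 0.034 log units corresponds to about an 8% size error, which is why it clears the 0.05 log unit threshold.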
A Platform-Independent Plugin for Navigating Online Radiology Cases.
Balkman, Jason D; Awan, Omer A
2016-06-01
Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.
Validation of tablet-based evaluation of color fundus images
Christopher, Mark; Moga, Daniela C.; Russell, Stephen R.; Folk, James C.; Scheetz, Todd; Abràmoff, Michael D.
2012-01-01
Purpose To compare diabetic retinopathy (DR) referral recommendations made by viewing fundus images using a tablet computer to recommendations made using a standard desktop display. Methods A tablet computer (iPad) and a desktop PC with a high-definition color display were compared. For each platform, two retinal specialists independently rated 1200 color fundus images from patients at risk for DR using an annotation program, Truthseeker. The specialists determined whether each image had referable DR, and also how urgently each patient should be referred for medical examination. Graders viewed and rated the randomly presented images independently and were masked to their ratings on the alternative platform. Tablet- and desktop display-based referral ratings were compared using cross-platform, intra-observer kappa as the primary outcome measure. Additionally, inter-observer kappa, sensitivity, specificity, and area under ROC (AUC) were determined. Results A high level of cross-platform, intra-observer agreement was found for the DR referral ratings between the platforms (κ=0.778), and for the two graders, (κ=0.812). Inter-observer agreement was similar for the two platforms (κ=0.544 and κ=0.625 for tablet and desktop, respectively). The tablet-based ratings achieved a sensitivity of 0.848, a specificity of 0.987, and an AUC of 0.950 compared to desktop display-based ratings. Conclusions In this pilot study, tablet-based rating of color fundus images for subjects at risk for DR was consistent with desktop display-based rating. These results indicate that tablet computers can be reliably used for clinical evaluation of fundus images for DR. PMID:22495326
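The agreement statistics above can be illustrated with a small sketch. The following Python fragment is a hypothetical example (the ratings shown are invented, not data from the study): it computes Cohen's kappa for two sets of referral ratings of the same images, as one might for the cross-platform, intra-observer comparison.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: observed agreement between two sets of categorical
    ratings, corrected for the agreement expected by chance."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from the marginal frequency of each category.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical "refer"/"no" ratings of the same 8 images made by one
# grader on a tablet and on a desktop display (invented values).
tablet  = ["refer", "refer", "no", "no", "refer", "no", "no", "refer"]
desktop = ["refer", "refer", "no", "no", "no",    "no", "no", "refer"]
print(cohens_kappa(tablet, desktop))   # prints 0.75
```

A kappa near the study's 0.778 indicates substantial agreement beyond chance, which raw percent agreement alone would overstate.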
Improved Distance Learning Environment For Marine Forces Reserve
2016-09-01
keyboard, to form a desktop computer. Laptop computers share similar components but add mobility for the user. If additional desktop computers... for stationary computing devices such as desktop PCs and laptops include the Microsoft Windows, Mac OS, and Linux families of OSs (Hopkins... opportunities to all Marines. For active duty Marines, government-provided desktops and laptops (GPDLs) typically support DL T&E or learning resource...
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2013 CFR
2013-10-01
... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2012 CFR
2012-10-01
... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...
48 CFR 352.239-71 - Standard for encryption language.
Code of Federal Regulations, 2014 CFR
2014-10-01
... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-12
... limited to) desktop computers, integrated desktop computers, laptop/notebook/netbook computers, and... computer, and 65% of U.S. households owning a notebook, laptop, or netbook computer, in 2013. Coverage... recently published studies. In these studies, the average annual energy use for a desktop computer was...
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2011 CFR
2011-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
48 CFR 352.239-70 - Standard for security configurations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...
Evaluating virtual hosted desktops for graphics-intensive astronomy
NASA Astrophysics Data System (ADS)
Meade, B. F.; Fluke, C. J.
2018-04-01
Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware at opposite ends of its useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations, typical of astronomy visualisation workflows, we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower-performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.
Dynamic provisioning of local and remote compute resources with OpenStack
NASA Astrophysics Data System (ADS)
Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.
2015-12-01
Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are left unused by common desktop usage patterns. Other important providers of compute capability are classical HPC data centers at universities and national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the use of virtualization technologies. The OpenStack project has become a widely adopted solution for virtualizing hardware and offering additional services such as storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup, and operational experiences, will be discussed.
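The idea of merging otherwise-idle desktops with remote HPC capacity behind a single point of entry can be sketched in miniature. The Python below is a conceptual toy, not the EKP/OpenStack implementation: the pool names, job names, and the desktop-first placement preference are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ResourcePool:
    name: str
    kind: str          # "desktop", "hpc", or "cloud"
    free_cores: int

def schedule(jobs, pools):
    """Toy single-point-of-entry scheduler: place each job's core request
    on whichever pool has enough free cores, preferring otherwise-idle
    desktop machines before remote HPC or cloud capacity."""
    order = {"desktop": 0, "hpc": 1, "cloud": 2}
    placements = {}
    for job, cores in jobs:
        for pool in sorted(pools, key=lambda p: order[p.kind]):
            if pool.free_cores >= cores:
                pool.free_cores -= cores
                placements[job] = pool.name
                break
        else:
            placements[job] = None   # no capacity: job stays queued
    return placements

pools = [ResourcePool("ekp-desktops", "desktop", 8),
         ResourcePool("uni-hpc", "hpc", 64)]
jobs = [("mc-sim-1", 4), ("mc-sim-2", 6), ("analysis", 32)]
print(schedule(jobs, pools))
```

In the real system, the uniform view is achieved by running identical virtual machines on each pool, so the scheduler sees one homogeneous cluster rather than heterogeneous hosts.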
36 CFR 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
36 CFR 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
36 CFR 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
36 CFR 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
Task Scheduling in Desktop Grids: Open Problems
NASA Astrophysics Data System (ADS)
Chernov, Ilya; Nikitina, Natalia; Ivashko, Evgeny
2017-12-01
We survey the areas of Desktop Grid task scheduling that seem to be insufficiently studied so far and are promising for efficiency, reliability, and quality of Desktop Grid computing. These topics include optimal task grouping, "needle in a haystack" paradigm, game-theoretical scheduling, domain-imposed approaches, special optimization of the final stage of the batch computation, and Enterprise Desktop Grids.
Align and conquer: moving toward plug-and-play color imaging
NASA Astrophysics Data System (ADS)
Lee, Ho J.
1996-03-01
The rapid evolution of the low-cost color printing and image capture markets has precipitated a huge increase in the use of color imagery by casual end users on desktop systems, as opposed to traditional professional color users working with specialized equipment. While the cost of color equipment and software has decreased dramatically, the underlying system-level problems associated with color reproduction have remained the same, and in many cases are more difficult to address in a casual environment than in a professional setting. The proliferation of color imaging technologies so far has resulted in a wide availability of component solutions which work together poorly. A similar situation in the desktop computing market has led to the various 'Plug-and-Play' standards, which provide a degree of interoperability between a range of products on disparate computing platforms. This presentation will discuss some of the underlying issues and emerging trends in the desktop and consumer digital color imaging markets.
36 CFR § 1194.26 - Desktop and portable computers.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Desktop and portable computers. § 1194.26 Section § 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...
Desktop Computing Integration Project
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1992-01-01
The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.
NASA Technical Reports Server (NTRS)
Harrison, Cecil A.
1986-01-01
The efforts to automate the electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center were examined. A battery of nine standard tests is to be integrated by means of a desktop computer-controller in order to provide near real-time data assessment, store the data acquired during testing on flexible disk, and provide computer production of the certification report.
Situated Computing: The Next Frontier for HCI Research
2002-01-01
population works and lives with information. Most individuals interact with information through a single portal: a personal desktop or laptop... of single devices, nor will one person necessarily own each device. This leap of imagination requires that human-computer interaction (HCI... wireless technologies, including Bluetooth [16], IrDA [22] (Infrared Data Association standards for infrared communications), and HomeRF™ [21...
75 FR 25185 - Broadband Initiatives Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
..., excluding desktop or laptop computers, computer hardware and software (including anti-virus, anti-spyware, and other security software), audio or video equipment, computer network components... 10 desktop or laptop computers and individual workstations to be located within the rural library...
Distributed-Memory Computing With the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)
NASA Technical Reports Server (NTRS)
Riley, Christopher J.; Cheatwood, F. McNeil
1997-01-01
The Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA), a Navier-Stokes solver, has been modified for use in a parallel, distributed-memory environment using the Message-Passing Interface (MPI) standard. A standard domain decomposition strategy is used in which the computational domain is divided into subdomains with each subdomain assigned to a processor. Performance is examined on dedicated parallel machines and a network of desktop workstations. The effect of domain decomposition and frequency of boundary updates on performance and convergence is also examined for several realistic configurations and conditions typical of large-scale computational fluid dynamic analysis.
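The domain decomposition strategy described above can be sketched independently of the flow solver. The Python below is a schematic illustration, not LAURA code: it splits a 1-D computational domain into near-equal contiguous subdomains, one per processor, and identifies the halo indices a subdomain would exchange with its neighbours at each boundary update. The function names are invented for the example.

```python
def decompose(n_cells, n_procs):
    """Split a 1-D domain of n_cells into contiguous subdomains, one per
    processor, balancing subdomain sizes to within one cell."""
    base, extra = divmod(n_cells, n_procs)
    bounds, start = [], 0
    for rank in range(n_procs):
        size = base + (1 if rank < extra else 0)
        bounds.append((start, start + size))
        start += size
    return bounds

def halo(bounds, rank):
    """Cell indices this subdomain must receive from its left and right
    neighbours at a boundary update (the ghost/halo cells); None at a
    physical boundary."""
    lo, hi = bounds[rank]
    left = lo - 1 if rank > 0 else None
    right = hi if rank < len(bounds) - 1 else None
    return left, right

bounds = decompose(1000, 6)
print(bounds[0], bounds[-1])   # first and last subdomain extents
```

Lowering the frequency of these halo exchanges reduces communication cost but can slow convergence, which is the trade-off the paper examines.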
Providing Assistive Technology Applications as a Service Through Cloud Computing.
Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio
2015-01-01
Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.
Desktop Publishing Made Simple.
ERIC Educational Resources Information Center
Wentling, Rose Mary
1989-01-01
The author discusses the types of computer hardware and software necessary to set up a desktop publishing system, both for use in educational administration and for instructional purposes. Classroom applications of desktop publishing are presented. The author also provides guidelines for preparing to teach desktop publishing. (CH)
Running GUI Applications on Peregrine from OSX | High-Performance Computing
Learn how to use Virtual Network Computing to access a Linux graphical desktop environment on Peregrine... a local port (on, e.g., your laptop)... starts a VNC server process that manages a virtual desktop... This is persistent, so remember it; you will use this password whenever accessing your virtual desktop.
ERIC Educational Resources Information Center
Hall, H. L.
1988-01-01
Reports on the advantages and disadvantages of desktop publishing, using the Apple Macintosh and "Pagemaker" software, to produce a high school yearbook. Asserts that while desktop publishing may be initially more time consuming for those unfamiliar with computers, desktop publishing gives high school journalism staffs more control over…
Managing Information Technology in Academic Medical Centers: A "Multicultural" Experience.
ERIC Educational Resources Information Center
Friedman, Charles P.; Corn, Milton; Krumrey, Arthur; Perry, David R.; Stevens, Ronald H.
1998-01-01
Examines how beliefs and concerns of academic medicine's diverse professional cultures affect management of information technology. Two scenarios, one dealing with standardization of desktop personal computers and the other with publication of syllabi on an institutional intranet, form the basis for an exercise in which four prototypical members…
Multimedia architectures: from desktop systems to portable appliances
NASA Astrophysics Data System (ADS)
Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.
1997-01-01
Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market wherein the needs and standards are changing and system manufacturers may demand a single component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general purpose processors and a more recent effort in developing single component mixed media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system also are presented.
Pages from the Desktop: Desktop Publishing Today.
ERIC Educational Resources Information Center
Crawford, Walt
1994-01-01
Discusses changes that have made desktop publishing appealing and reasonably priced. Hardware, software, and printer options for getting started and moving on, typeface developments, and the key characteristics of desktop publishing are described. The author's notes on 33 articles from the personal computing literature from January-March 1994 are…
Common Sense Wordworking III: Desktop Publishing and Desktop Typesetting.
ERIC Educational Resources Information Center
Crawford, Walt
1987-01-01
Describes current desktop publishing packages available for microcomputers and discusses the disadvantages, especially in cost, for most personal computer users. Also described is a less expensive alternative technology--desktop typesetting--which meets the requirements of users who do not need elaborate techniques for combining text and graphics.…
Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.
Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P
2017-11-01
Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater, equal, or lesser than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0% (P < .001). Participants demonstrated no significant differences in lipid-layer grade and tear meniscus height between the two environments (all P > .05). However, a relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment (P < .001), which was associated with a higher proportion of subjects reporting greater comfort relative to baseline (36% vs. 5%, P < .001). Even with a modest increase in relative humidity locally, the desktop humidifier shows potential to improve tear-film stability and subjective comfort during computer use.Trial registration no: ACTRN12617000326392.
2007-04-01
judgmental self-doubt, depression, and causal uncertainty, tend to take fewer risks, and have lower self-esteem. Results from two studies (Nygren, 2000... U.S. Army Research Institute for the Behavioral and Social Sciences, Research Report 1869: Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit...
Gaudez, Clarisse; Cail, François
2016-11-01
This study compared muscular and postural stresses, performance and subject preference in women aged 18-40 years using a standard mouse, a vertical mouse and a slanted mouse in three different computer workstation positions. Four tasks were analysed: pointing, pointing-clicking, pointing-clicking-dragging and grasping-pointing the mouse after typing. Flexor digitorum superficialis (FDS) and extensor carpi radialis (ECR) activities were greater using the standard mouse compared to the vertical or slanted mouse. In all cases, the wrist position remained in the comfort zone recommended by standard ISO 11228-3. The vertical mouse was less comfortable and more difficult to use than the other two mice. FDS and ECR activities, shoulder abduction and wrist extension were greater when the mouse was placed next to the keyboard. Performance and subject preference were better with the unrestricted mouse positioning on the desktop. Grasping the mouse after typing was the task that caused the greatest stress. Practitioner Summary: In women, the slanted mouse and the unrestricted mouse positioning on the desktop provide a good blend of stresses, performance and preference. Unrestricted mouse positioning requires no keyboard, which is rare in practice. Placing the mouse in front of the keyboard, rather than next to it, reduced the physical load.
Seat Interfaces for Aircrew Performance and Safety
2010-01-01
Quantum-II Desktop System consists of a keyboard and hardware accessories (electrodes, cables, etc.), and interfaces with a desktop computer via software...segment. Resistance and reactance data was collected to estimate blood volume changes. The Quantum-II Desktop system collected continuous data of...Approved for public release; distribution unlimited. 88 ABW Cleared 03/13/2015; 88ABW-2015-1053. mockup also included a laptop computer, a
Desktop Technology for Newspapers: Use of the Computer Tool.
ERIC Educational Resources Information Center
Wilson, Howard Alan
This work considers desktop publishing technology as a way to paginate newspapers electronically, tracing the technology's development from the beginning of desktop publishing in the mid-1980s to the 1990s. The work emphasizes how desktop publishing technology is and can be used by weekly newspapers. It reports on a Pennsylvania weekly…
What's New in Software? Mastery of the Computer through Desktop Publishing.
ERIC Educational Resources Information Center
Hedley, Carolyn N.; Ellsworth, Nancy J.
1993-01-01
Offers thoughts on the phenomenon of the underuse of classroom computers. Argues that desktop publishing is one way of overcoming the computer malaise occurring in schools, using the incentive of classroom reading and writing for mastery of many aspects of computer production, including writing, illustrating, reading, and publishing. (RS)
Quantitative Assay for Starch by Colorimetry Using a Desktop Scanner
ERIC Educational Resources Information Center
Matthews, Kurt R.; Landmark, James D.; Stickle, Douglas F.
2004-01-01
The procedure to produce a standard curve for starch concentration measurement by image analysis, using a color scanner and a computer for data acquisition and color analysis, is described. Color analysis is performed by a Visual Basic program that measures red, green, and blue (RGB) color intensities for pixels within the scanner image.
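The standard-curve idea behind this assay can be sketched in a few lines: average one color channel over each calibration well, fit a line against known concentrations, and invert the fit for unknowns. The pixel values, concentrations, and function names below are illustrative assumptions, not data or code from the study (which used Visual Basic); Python is used here for brevity.

```python
# Hypothetical sketch of a colorimetric standard curve; pixel data and the
# linear intensity-vs-concentration relationship are illustrative only.

def mean_channel(pixels, channel):
    """Mean intensity of one RGB channel over a list of (R, G, B) pixels."""
    return sum(p[channel] for p in pixels) / len(pixels)

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Synthetic calibration wells: blue intensity falls as concentration rises.
concentrations = [0.0, 1.0, 2.0, 3.0]   # mg/mL (illustrative)
well_pixels = [
    [(10, 20, 200)] * 4,
    [(10, 20, 160)] * 4,
    [(10, 20, 120)] * 4,
    [(10, 20, 80)] * 4,
]
blue = [mean_channel(px, 2) for px in well_pixels]
slope, intercept = fit_line(concentrations, blue)

def concentration_from_blue(b):
    """Invert the standard curve to read a concentration off an intensity."""
    return (b - intercept) / slope

print(round(concentration_from_blue(140), 2))  # → 1.5
```

The same least-squares inversion works for any channel; the blue channel is chosen arbitrarily here.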
Goostrey, Sonya; Treleaven, Julia; Johnston, Venerina
2014-05-01
This study evaluated the impact on neck movement and muscle activity of placing documents in three commonly used locations: in-line, flat desktop left of the keyboard and laterally placed level with the computer screen. Neck excursion during three standard head movements between the computer monitor and each document location and neck extensor and upper trapezius muscle activity during a 5 min typing task for each of the document locations was measured in 20 healthy participants. Results indicated that muscle activity and neck flexion were least when documents were placed laterally suggesting it may be the optimal location. The desktop option produced both the greatest neck movement and muscle activity in all muscle groups. The in-line document location required significantly more neck flexion but less lateral flexion and rotation than the laterally placed document. Evaluation of other holders is needed to guide decision making for this commonly used office equipment. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
48 CFR 252.204-7011 - Alternative Line Item Structure.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Unit Unit price Amount 0001 Computer, Desktop with CPU, Monitor, Keyboard and Mouse 20 EA Alternative... Unit Unit Price Amount 0001 Computer, Desktop with CPU, Keyboard and Mouse 20 EA 0002 Monitor 20 EA...
48 CFR 252.204-7011 - Alternative Line Item Structure.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Unit Unit price Amount 0001 Computer, Desktop with CPU, Monitor, Keyboard and Mouse 20 EA Alternative... Unit Unit Price Amount 0001 Computer, Desktop with CPU, Keyboard and Mouse 20 EA 0002 Monitor 20 EA...
48 CFR 252.204-7011 - Alternative Line Item Structure.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Unit Unit price Amount 0001 Computer, Desktop with CPU, Monitor, Keyboard and Mouse 20 EA Alternative... Unit Unit Price Amount 0001 Computer, Desktop with CPU, Keyboard and Mouse 20 EA 0002 Monitor 20 EA...
48 CFR 252.204-7011 - Alternative Line Item Structure.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Unit Unit price Amount 0001 Computer, Desktop with CPU, Monitor, Keyboard and Mouse 20 EA Alternative... Unit Unit Price Amount 0001 Computer, Desktop with CPU, Keyboard and Mouse 20 EA 0002 Monitor 20 EA...
Desktop Publishing: Its Impact on Community College Journalism.
ERIC Educational Resources Information Center
Grzywacz-Gray, John; And Others
1987-01-01
Illustrates the kinds of copy that can be created on Apple Macintosh computers and laser printers. Shows font and type specification options. Discusses desktop publishing costs, potential problems, and computer compatibility. Considers the use of computers in college journalism in production, graphics, accounting, advertising, and promotion. (AYC)
Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms
Hasbrouck, W.P.
1983-01-01
Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.
omniClassifier: a Desktop Grid Computing System for Big Data Prediction Modeling
Phan, John H.; Kothari, Sonal; Wang, May D.
2016-01-01
Robust prediction models are important for numerous science, engineering, and biomedical applications. However, best-practice procedures for optimizing prediction models can be computationally complex, especially when choosing models from among hundreds or thousands of parameter choices. Computational complexity has further increased with the growth of data in these fields, concurrent with the era of “Big Data”. Grid computing is a potential solution to the computational challenges of Big Data. Desktop grid computing, which uses idle CPU cycles of commodity desktop machines, coupled with commercial cloud computing resources can enable research labs to gain easier and more cost effective access to vast computing resources. We have developed omniClassifier, a multi-purpose prediction modeling application that provides researchers with a tool for conducting machine learning research within the guidelines of recommended best-practices. omniClassifier is implemented as a desktop grid computing system using the Berkeley Open Infrastructure for Network Computing (BOINC) middleware. In addition to describing implementation details, we use various gene expression datasets to demonstrate the potential scalability of omniClassifier for efficient and robust Big Data prediction modeling. A prototype of omniClassifier can be accessed at http://omniclassifier.bme.gatech.edu/. PMID:27532062
Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues
NASA Astrophysics Data System (ADS)
Chakravarthy, Srinivas R.; Rumyantsev, Alexander
2018-03-01
Cloud computing is continuing to prove its flexibility and versatility in helping industries, businesses, and academia by providing needed computing capacity. As an important alternative to cloud computing, desktop grids utilize the idle computer resources of an enterprise or community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and, at the same time, to decrease the expected latency for users. The crucial parameter for optimization in both cloud computing and desktop grids is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull distributions. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
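The latency side of the replication trade-off can be illustrated with a toy Monte Carlo: send a request to several idle servers at once and keep the fastest reply. The i.i.d. exponential service times below are an illustrative assumption (the paper considers phase-type, shifted exponential, and Weibull services), and the sketch ignores the extra load that replicas impose, which is precisely why an optimal replication level exists.

```python
import random

def mean_latency(replicas, rate=1.0, trials=20000, seed=42):
    """Monte Carlo estimate of expected latency when a request is replicated
    across `replicas` idle servers and the fastest copy wins.
    Service times are i.i.d. exponential(rate) -- an illustrative assumption."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.expovariate(rate) for _ in range(replicas))
    return total / trials

# The minimum of c i.i.d. Exp(rate) variables is Exp(c * rate), so the mean
# latency should shrink roughly as 1 / replicas.
print(mean_latency(1))  # ≈ 1.0
print(mean_latency(3))  # ≈ 0.33
```

In a loaded system the replicas compete for capacity with other requests, so the real optimization problem the paper studies is considerably richer than this idle-server sketch.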
Disaster easily averted? Data confidentiality and the hospital desktop computer.
Sethi, Neeraj; Lane, Gethin; Newton, Sophie; Egan, Philip; Ghosh, Samit
2014-05-01
We specifically identified the hospital desktop computer as a potential source of breaches in confidentiality. We aimed to evaluate if there was accessible, unprotected, confidential information stored on the desktop screen on computers in a district general hospital and if so, how a teaching intervention could improve this situation. An unannounced spot check of 59 ward computers was performed. Data were collected regarding how many had confidential information stored on the desktop screen without any password protection. An online learning module was mandated for healthcare staff and a second cycle of inspection performed. A district general hospital. Two doctors conducted the audit. Computers in clinical areas were assessed. All clinical staff with computer access underwent the online learning module. An online learning module regarding data protection and confidentiality. In the first cycle, 55% of ward computers had easily accessible patient or staff confidential information stored on their desktop screen. This included handovers, referral letters, staff sick leave lists, audits and nursing reports. The majority (85%) of computers accessed were logged in under a generic username and password. The intervention produced an improvement in the second cycle findings with only 26% of computers being found to have unprotected confidential information stored on them. The failure to comply with appropriate confidential data protection regulations is a persistent problem. Education produces some improvement but we also propose a systemic approach to solving this problem. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Development of a small-scale computer cluster
NASA Astrophysics Data System (ADS)
Wilhelm, Jay; Smith, Justin T.; Smith, James E.
2008-04-01
An increase in demand for computing power in academia has created a need for high-performance machines. The computing power of a single processor has been steadily increasing, but it lags behind the demand for fast simulations. Because a single processor has hard limits on its performance, a cluster of computers running the proper software can multiply the performance of a single machine. Cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but they are not intended for constant full-speed operation and take up more space than rack-mount servers. Specialty computers designed for cluster use meet high-availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount configuration, gaining the space savings of traditional rack-mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster from desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components, multiplying the performance of a single desktop machine while minimizing occupied space and remaining cost effective.
Campus Computing 1993. The USC National Survey of Desktop Computing in Higher Education.
ERIC Educational Resources Information Center
Green, Kenneth C.; Eastman, Skip
A national survey of desktop computing in higher education was conducted in spring and summer 1993 at over 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges and community colleges. Respondents (N=1011) were individuals specifically responsible for the operation and future…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-12
... services and manage networked resources for client devices such as desktop and laptop computers. These... include desktop or laptop computers, which are not primarily accessed via network connections. DOE seeks... Determination of Computer Servers as a Covered Consumer Product AGENCY: Office of Energy Efficiency and...
Campus Computing 1991. The EDUCOM-USC Survey of Desktop Computing in Higher Education.
ERIC Educational Resources Information Center
Green, Kenneth C.; Eastman, Skip
A national survey of desktop computing in higher education was conducted in 1991 of 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=1099) were individuals specifically responsible for the operation and future direction of academic…
Desktop Publishing for Counselors.
ERIC Educational Resources Information Center
Lucking, Robert; Mitchum, Nancy
1990-01-01
Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes that, by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes that computers present a way of streamlining the communications of a counseling…
Desktop Publishing: A Powerful Tool for Advanced Composition Courses.
ERIC Educational Resources Information Center
Sullivan, Patricia
1988-01-01
Examines the advantages of using desktop publishing in advanced writing classes. Explains how desktop publishing can spur creativity, call attention to the interaction between words and pictures, encourage the social dimensions of computing and composing, and provide students with practical skills. (MM)
The Next Generation of Lab and Classroom Computing - The Silver Lining
2016-12-01
desktop infrastructure (VDI) solution, as well as the computing solutions at three universities, was selected as the basis for comparison. The research... infrastructure, VDI, hardware cost, software cost, manpower, availability, cloud computing, private cloud, bring your own device, BYOD, thin client...virtual desktop infrastructure (VDI) solution, as well as the computing solutions at three universities, was selected as the basis for comparison. The
Campus Computing 1990: The EDUCOM/USC Survey of Desktop Computing in Higher Education.
ERIC Educational Resources Information Center
Green, Kenneth C.; Eastman, Skip
The National Survey of Desktop Computer Use in Higher Education was conducted in the spring and summer of 1990 by the Center for Scholarly Technology at the University of Southern California, in cooperation with EDUCOM and with support from 15 corporate sponsors. The survey was designed to collect information about campus planning, policies, and…
NASA Technical Reports Server (NTRS)
Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan
1998-01-01
Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.
Desk-top publishing using IBM-compatible computers.
Grencis, P W
1991-01-01
This paper sets out to describe one Medical Illustration Department's experience of the introduction of computers for desk-top publishing. In this particular case, after careful consideration of all the options open, an IBM-compatible system was installed rather than the often popular choice of an Apple Macintosh.
Video Conferencing: The Next Wave for International Business Communication.
ERIC Educational Resources Information Center
Sondak, Norman E.; Sondak, Eileen M.
This paper suggests that desktop computer-based video conferencing, with high-fidelity sound and group software support, is emerging as a major communications option. Briefly addressed are the following critical factors that are propelling the computer-based video conferencing revolution: (1) widespread availability of desktop computers…
Accelerating Monte Carlo simulations with an NVIDIA ® graphics processor
NASA Astrophysics Data System (ADS)
Martinsen, Paul; Blaschke, Johannes; Künnemeyer, Rainer; Jordan, Robert
2009-10-01
Modern graphics cards, commonly used in desktop computers, have evolved beyond a simple interface between processor and display to incorporate sophisticated calculation engines that can be applied to general purpose computing. The Monte Carlo algorithm for modelling photon transport in turbid media has been implemented on an NVIDIA 8800 GT graphics card using the CUDA toolkit. The Monte Carlo method relies on following the trajectory of millions of photons through the sample, often taking hours or days to complete. The graphics-processor implementation, processing roughly 110 million scattering events per second, was found to run more than 70 times faster than a similar, single-threaded implementation on a 2.67 GHz desktop computer.
Program summary
Program title: Phoogle-C/Phoogle-G
Catalogue identifier: AEEB_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEB_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 51 264
No. of bytes in distributed program, including test data, etc.: 2 238 805
Distribution format: tar.gz
Programming language: C++
Computer: Designed for Intel PCs. Phoogle-G requires an NVIDIA graphics card with support for CUDA 1.1.
Operating system: Windows XP
Has the code been vectorised or parallelized?: Phoogle-G is written for SIMD architectures.
RAM: 1 GB
Classification: 21.1
External routines: Charles Karney random number library, Microsoft Foundation Class library, NVIDIA CUDA library [1].
Nature of problem: The Monte Carlo technique is an effective algorithm for exploring the propagation of light in turbid media. However, accurate results require tracing the path of many photons within the media. The independence of photons naturally lends the Monte Carlo technique to implementation on parallel architectures. Generally, parallel computing can be expensive, but recent advances in consumer-grade graphics cards have opened the possibility of high-performance desktop parallel computing.
Solution method: In this pair of programs we have implemented the Monte Carlo algorithm described by Prahl et al. [2] for photon transport in infinite scattering media, to compare the performance of two readily accessible architectures: a standard desktop PC and a consumer-grade graphics card from NVIDIA.
Restrictions: The graphics-card implementation uses single-precision floating-point numbers for all calculations. Only photon transport from an isotropic point source is supported. The graphics-card version has no user interface; the simulation parameters must be set in the source code. The desktop version has a simple user interface, but some properties can only be accessed through an ActiveX client (such as Matlab).
Additional comments: The random number library used has an LGPL (http://www.gnu.org/copyleft/lesser.html) licence.
Running time: Runtime can range from minutes to months depending on the number of photons simulated and the optical properties of the medium.
References: [1] http://www.nvidia.com/object/cuda_home.html. [2] S. Prahl, M. Keijzer, S.L. Jacques, A. Welch, SPIE Institute Series 5 (1989) 102.
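The core sampling rule in this kind of photon-transport Monte Carlo is drawing a step length from the exponential attenuation law, s = -ln(ξ)/μt. The minimal single-threaded sketch below shows only that rule; the coefficient value and function name are illustrative, and the full Prahl-style simulation (direction scattering, weight absorption) is omitted.

```python
import math
import random

def mean_step_length(mu_t, photons=50000, seed=1):
    """Sample photon step lengths s = -ln(xi) / mu_t, the basic sampling rule
    in Monte Carlo photon transport through turbid media.
    mu_t is the total interaction coefficient; the value used is illustrative."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(photons):
        xi = 1.0 - rng.random()          # uniform in (0, 1], avoids log(0)
        total += -math.log(xi) / mu_t
    return total / photons

# The mean free path should approach 1 / mu_t.
print(mean_step_length(mu_t=10.0))  # ≈ 0.1
```

Because each photon history is independent, the loop parallelizes trivially, which is exactly what makes the GPU implementation described above so effective.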
Hasbrouck, W.P.
1983-01-01
Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report presents computer programs used to develop rms velocity functions and apply mute and normal moveout to a 12-trace seismogram.
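The normal-moveout correction these programs apply follows the standard hyperbolic reflection traveltime relation t(x) = sqrt(t0² + x²/v²); the correction is t(x) - t0. The sketch below shows that formula only, in Python rather than the report's BASIC, with illustrative values rather than USGS data.

```python
import math

def nmo_time(t0, offset, v_rms):
    """Hyperbolic reflection traveltime t(x) = sqrt(t0^2 + x^2 / v^2).
    The normal-moveout correction applied to a trace is t(x) - t0."""
    return math.sqrt(t0 ** 2 + (offset / v_rms) ** 2)

# Illustrative values, not from the USGS system: t0 in s, offset in m, v in m/s.
t0, offset, v = 0.5, 600.0, 2000.0
tx = nmo_time(t0, offset, v)
print(round(tx - t0, 4))  # → 0.0831 (moveout correction in seconds)
```

In practice this correction is computed per sample and per trace, using an rms-velocity function like those the report's programs develop.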
MDA-image: an environment of networked desktop computers for teleradiology/pathology.
Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P
1991-04-01
MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.
Writing Essays on a Laptop or a Desktop Computer: Does It Matter?
ERIC Educational Resources Information Center
Ling, Guangming; Bridgeman, Brent
2013-01-01
To explore the potential effect of computer type on the Test of English as a Foreign Language-Internet-Based Test (TOEFL iBT) Writing Test, a sample of 444 international students was used. The students were randomly assigned to either a laptop or a desktop computer to write two TOEFL iBT practice essays in a simulated testing environment, followed…
Neural simulations on multi-core architectures.
Eichner, Hubert; Klug, Tobias; Borst, Alexander
2009-01-01
Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing.
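One common way to do the kind of user-transparent load balancing described above is a greedy longest-processing-time assignment: sort the work items (say, groups of compartments with estimated per-step costs) and always hand the next item to the least-loaded core. This is a generic heuristic sketch under that assumption, not the paper's actual scheme, and the costs are invented.

```python
import heapq

def balance(costs, n_workers):
    """Greedy longest-processing-time assignment: sort work items by cost and
    always give the next item to the least-loaded worker. A common heuristic;
    the paper's own partitioning scheme may differ."""
    heap = [(0.0, w) for w in range(n_workers)]   # (current load, worker id)
    heapq.heapify(heap)
    assignment = {w: [] for w in range(n_workers)}
    for item, cost in sorted(enumerate(costs), key=lambda ic: -ic[1]):
        load, w = heapq.heappop(heap)
        assignment[w].append(item)
        heapq.heappush(heap, (load + cost, w))
    return assignment

costs = [5.0, 3.0, 3.0, 2.0, 2.0, 1.0]   # illustrative per-group costs
print(balance(costs, 2))                  # both workers end with load 8.0
```

For compartmental models the costs can be estimated from compartment counts and synapse densities, then the assignment recomputed whenever the model changes, keeping the balancing invisible to the user.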
Illuminator, a desktop program for mutation detection using short-read clonal sequencing.
Carr, Ian M; Morgan, Joanne E; Diggle, Christine P; Sheridan, Eamonn; Markham, Alexander F; Logan, Clare V; Inglehearn, Chris F; Taylor, Graham R; Bonthron, David T
2011-10-01
Current methods for sequencing clonal populations of DNA molecules yield several gigabases of data per day, typically comprising reads of < 100 nt. Such datasets permit widespread genome resequencing and transcriptome analysis or other quantitative tasks. However, this huge capacity can also be harnessed for the resequencing of smaller (gene-sized) target regions, through the simultaneous parallel analysis of multiple subjects, using sample "tagging" or "indexing". These methods promise to have a huge impact on diagnostic mutation analysis and candidate gene testing. Here we describe a software package developed for such studies, offering the ability to resolve pooled samples carrying barcode tags and to align reads to a reference sequence using a mutation-tolerant process. The program, Illuminator, can identify rare sequence variants, including insertions and deletions, and permits interactive data analysis on standard desktop computers. It facilitates the effective analysis of targeted clonal sequencer data without dedicated computational infrastructure or specialized training. Copyright © 2011 Elsevier Inc. All rights reserved.
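Resolving pooled samples by barcode can be illustrated with a small demultiplexer: match each read's leading bases against the known tags, tolerate a bounded number of substitutions, and set aside ambiguous reads. This is a simplified, hypothetical sketch of the general technique, not Illuminator's actual algorithm; the barcodes and reads are invented.

```python
def demultiplex(reads, barcodes, max_mismatch=1):
    """Assign each read to the sample whose barcode prefix matches with at
    most `max_mismatch` substitutions; ambiguous or unmatched reads go to
    'unassigned'. A simplified illustration, not Illuminator's algorithm."""
    def mismatches(a, b):
        return sum(x != y for x, y in zip(a, b))

    bins = {name: [] for name in barcodes}
    bins["unassigned"] = []
    for read in reads:
        hits = [name for name, bc in barcodes.items()
                if mismatches(read[:len(bc)], bc) <= max_mismatch]
        if len(hits) == 1:                       # require an unambiguous match
            bins[hits[0]].append(read[len(barcodes[hits[0]]):])
        else:
            bins["unassigned"].append(read)
    return bins

barcodes = {"s1": "ACGT", "s2": "TTAG"}          # hypothetical tags
reads = ["ACGTGGGC", "ACGAGGGC", "TTAGCCCA", "GGGGCCCA"]
print(demultiplex(reads, barcodes))
```

Here the second read carries one sequencing error in its tag but is still recovered for sample s1, while the last read matches no tag and is left unassigned; tolerant matching of the reads themselves against a reference then proceeds per bin.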
Cloud Computing and Virtual Desktop Infrastructures in Afloat Environments
2012-06-01
Institute of Standards and Technology NPS Naval Postgraduate School OCONUS Outside of the Continental United States ONE-NET OCONUS Navy Enterprise... framework of technology that allows all interested systems, inside and outside of an organization, to expose and access well-defined services, and...was established to manage the Navy's three largest enterprise networks; the OCONUS Navy Enterprise Network (ONE-NET), the Navy-Marine Corps
A comparison of the postures assumed when using laptop computers and desktop computers.
Straker, L; Jones, K J; Miller, J
1997-08-01
This study evaluated the postural implications of using a laptop computer. Laptop computer screens and keyboards are joined, and are therefore unable to be adjusted separately in terms of screen height and distance, and keyboard height and distance. The posture required for their use is likely to be constrained, as little adjustment can be made for the anthropometric differences of users. In addition to the postural constraints, the study looked at discomfort levels and performance when using laptops as compared with desktops. Statistical analysis showed significantly greater neck flexion and head tilt with laptop use. The other body angles measured (trunk, shoulder, elbow, wrist, and scapula and neck protraction/retraction) showed no statistical differences. The average discomfort experienced after using the laptop for 20 min, although appearing greater than the discomfort experienced after using the desktop, was not significantly greater. When using the laptop, subjects tended to perform better than when using the desktop, though not significantly so. Possible reasons for the results are discussed and implications of the findings outlined.
Hasbrouck, W.P.
1983-01-01
Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language used by the Tektronix 4051 Graphic System. This report presents computer programs to perform X-square/T-square analyses and to plot normal moveout lines on a seismogram overlay.
Open Radio Communications Architecture Core Framework V1.1.0 Volume 1 Software Users Manual
2005-02-01
on a PC utilizing the KDE desktop that comes with Red Hat Linux. The default desktop for most Red Hat Linux installations is the GNOME desktop. The...SCA) v2.2. The software was designed for a desktop computer running the Linux operating system (OS). It was developed in C++, uses ACE/TAO for CORBA...middleware, Xerces for the XML parser, and Red Hat Linux for the operating system. The software is referred to as Open Radio Communication
ERIC Educational Resources Information Center
Zilka, Gila
2017-01-01
The viewing and browsing habits of Israeli children aged 8-12 are the subject of this study. The participants did not have a computer at home and were given either a desktop or hybrid computer for home use. Television viewing and internet surfing habits were described, examining whether the children did so with their parents, family members, and…
Freiberger, Manuel; Egger, Herbert; Liebmann, Manfred; Scharfetter, Hermann
2011-11-01
Image reconstruction in fluorescence optical tomography is a three-dimensional nonlinear ill-posed problem governed by a system of partial differential equations. In this paper we demonstrate that a combination of state-of-the-art numerical algorithms and a careful hardware-optimized implementation makes it possible to solve this large-scale inverse problem in a few seconds on standard desktop PCs with modern graphics hardware. In particular, we present methods to solve not only the forward but also the nonlinear inverse problem by massively parallel programming on graphics processors. A comparison of optimized CPU and GPU implementations shows that the reconstruction can be accelerated by factors of about 15 through the use of the graphics hardware without compromising the accuracy of the reconstructed images.
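Why ill-posedness forces regularization can be seen in a deliberately tiny linear stand-in: Tikhonov regularization replaces A x = y with (AᵀA + αI) x = Aᵀy, trading a little bias for stability. The 2x2 system below is purely illustrative; the paper's PDE-constrained nonlinear problem is vastly larger and is solved on graphics hardware.

```python
def tikhonov_solve_2x2(A, y, alpha):
    """Solve (A^T A + alpha*I) x = A^T y for a 2x2 system by Cramer's rule.
    A toy linear stand-in for the regularized updates an ill-posed
    reconstruction requires; illustrative only."""
    # Normal-equation matrix M = A^T A + alpha*I and right-hand side b = A^T y.
    m00 = A[0][0] ** 2 + A[1][0] ** 2 + alpha
    m11 = A[0][1] ** 2 + A[1][1] ** 2 + alpha
    m01 = A[0][0] * A[0][1] + A[1][0] * A[1][1]
    b0 = A[0][0] * y[0] + A[1][0] * y[1]
    b1 = A[0][1] * y[0] + A[1][1] * y[1]
    det = m00 * m11 - m01 * m01
    return ((b0 * m11 - b1 * m01) / det, (m00 * b1 - m01 * b0) / det)

# Nearly singular forward operator: without the alpha term, tiny data noise
# would wreck the solution; a small alpha keeps the estimate stable.
A = [[1.0, 1.0], [1.0, 1.0001]]
y = [2.0, 2.0001]
print(tikhonov_solve_2x2(A, y, alpha=1e-3))  # close to (1, 1)
```

Iterating such regularized linear solves inside a Gauss-Newton loop is one standard route to the nonlinear inverse problem; each solve is exactly the kind of dense linear algebra that maps well onto GPUs.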
Creative Computer Detective: The Basics of Teaching Desktop Publishing.
ERIC Educational Resources Information Center
Slothower, Jodie
Teaching desktop publishing (dtp) in college journalism classes is most effective when the instructor integrates into specific courses four types of software--a word processor, a draw program, a paint program and a layout program. In a course on design and layout, the instructor can demonstrate with the computer how good design can be created and…
26 CFR 1.179-5 - Time and manner of making election.
Code of Federal Regulations, 2010 CFR
2010-04-01
... desktop computer costing $1,500. On Taxpayer's 2003 Federal tax return filed on April 15, 2004, Taxpayer elected to expense under section 179 the full cost of the laptop computer and the full cost of the desktop... provided by the Internal Revenue Code, the regulations under the Code, or other guidance published in the...
Application of desktop computers in nuclear engineering education
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graves, H.W. Jr.
1990-01-01
Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment: increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy Analysis Software Service have been utilized. These programs have been designed to be sufficiently structured to permit an orderly, disciplined solution to the problem being solved, and yet flexible enough to accommodate most problem solution options.
Desktop Manufacturing Technologies.
ERIC Educational Resources Information Center
Snyder, Mark
1991-01-01
Desktop manufacturing is the use of data from a computer-assisted design system to construct actual models of an object. Emerging processes are stereolithography, laser sintering, ballistic particle manufacturing, laminated object manufacturing, and photochemical machining. (SK)
Does It Matter Whether One Takes a Test on an iPad or a Desktop Computer?
ERIC Educational Resources Information Center
Ling, Guangming
2016-01-01
To investigate a possible iPad-related mode effect, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…
ERIC Educational Resources Information Center
Fryer, Wesley
2004-01-01
There has long been a power struggle between techies and teachers over classroom computer desktops. IT personnel tend to believe allowing "inept" educators to have unfettered access to their computer's hard drive is an open invitation for trouble. Conversely, teachers often perceive tech support to be "uncaring" adversaries standing in the way of…
1956-10-12
A photo of the control stick used on the Iron Cross Attitude Simulator. Although it resembled today's desktop computer flight sticks, its operation was different. As with a standard control stick, moving it back and forth raised and lowered the nose, resulting in changes in pitch. Moving the stick to the right or left raised or lowered a wing, resulting in changes in roll. This control stick had a third axis not found in standard control sticks: twisting the stick to the right or left caused the airplane's nose to move horizontally in the same direction, resulting in changes in yaw.
Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Huang, H. K.
1989-05-01
A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user interface, its popularity among the academic and medical community, and its low cost. In comparison to other microcomputer-based systems, the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system, and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using the standard TCP/IP protocol, and stored locally on magnetic disk. The use of high-resolution screens (1024 x 768 pixels x 8 bits) offers sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images, and scintigraphic images.
Computer usage and national energy consumption: Results from a field-metering study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Desroches, Louis-Benoit; Fuchs, Heidi; Greenblatt, Jeffery
The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years, and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated since they can be charged in other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflective of laptops drawing more power in On mode in addition to greater market penetration. This result for laptops, however, carries relatively higher uncertainty compared to desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses.
Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity as well as power scaling), matching the rated power of power supplies to computing needs, and improving the efficiency of individual components.
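The national totals above follow from scaling per-unit AEC by an installed-base estimate. A minimal sketch of that arithmetic; the stock figures below are illustrative assumptions chosen to reproduce the reported totals, not values taken from the study:

```python
def national_twh(unit_kwh_per_year, stock_units):
    """Scale per-unit annual energy consumption (kWh/yr) to a national
    total in TWh/yr. 1 TWh = 1e9 kWh."""
    return unit_kwh_per_year * stock_units / 1e9

# Illustrative (assumed) installed bases:
desktop_twh = national_twh(194, 103e6)  # ~103 M desktops -> ~20 TWh
laptop_twh = national_twh(75, 147e6)    # ~147 M laptops  -> ~11 TWh
```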
Accuracy of remote chest X-ray interpretation using Google Glass technology.
Spaedy, Emily; Christakopoulos, Georgios E; Tarar, Muhammad Nauman J; Christopoulos, Georgios; Rangan, Bavana V; Roesle, Michele; Ochoa, Cristhiaan D; Yarbrough, William; Banerjee, Subhash; Brilakis, Emmanouil S
2016-09-15
We sought to explore the accuracy of remote chest X-ray reading using hands-free, wearable technology (Google Glass, Google, Mountain View, California). We compared interpretation of twelve chest X-rays with 23 major cardiopulmonary findings by faculty and fellows from cardiology, radiology, and pulmonary-critical care via: (1) viewing the chest X-ray image on the Google Glass screen; (2) viewing a photograph of the chest X-ray taken using Google Glass and interpreted on a mobile device; (3) viewing the original chest X-ray on a desktop computer screen. One point was given for identification of each correct finding and a subjective rating of user experience was recorded. Fifteen physicians (5 faculty and 10 fellows) participated. The average chest X-ray reading score (maximum 23 points) as viewed through the Google Glass, Google Glass photograph on a mobile device, and the original X-ray viewed on a desktop computer was 14.1±2.2, 18.5±1.5 and 21.3±1.7, respectively (p<0.0001 between Google Glass and mobile device, p<0.0001 between Google Glass and desktop computer and p=0.0004 between mobile device and desktop computer). Of 15 physicians, 11 (73.3%) felt confident in detecting findings using the photograph taken by Google Glass as viewed on a mobile device. Remote chest X-ray interpretation using hands-free, wearable technology (Google Glass) is less accurate than interpretation using a desktop computer or a mobile device, suggesting that further technical improvements are needed before widespread application of this novel technology. Published by Elsevier Ireland Ltd.
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.; Kroutter, Paul J.
2013-01-01
This study used a cross-case analysis methodology to compare four line-of-inquiry studies of desktop virtual environments (DVEs) to examine the relationships of gender and computer gaming experience to learning performance and perceptions. Comparison was made of learning patterns in a general non-technical DVE with patterns in technically complex,…
ERIC Educational Resources Information Center
Technology & Learning, 2008
2008-01-01
When it comes to IT, there has always been an important link between data center control and client flexibility. As computing power increases, so do the potentially crippling threats to security, productivity and financial stability. This article talks about Dell's On-Demand Desktop Streaming solution which is designed to centralize complete…
Office Computer Software: A Comprehensive Review of Software Programs.
ERIC Educational Resources Information Center
Secretary, 1992
1992-01-01
Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)
Davis, J P; Akella, S; Waddell, P H
2004-01-01
Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists/statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized-one example being Felsenstein's PHYLIP code, written in C, for UPGMA and neighbor joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets-such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred, against that of PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to not only accelerate phylogenetics algorithm performance on the desktop, but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
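The UPGMA algorithm accelerated here is, at its core, iterative agglomerative clustering on a distance matrix. A minimal Python sketch for orientation (not the PHYLIP implementation or the custom-computing version described above):

```python
def upgma(dist, labels):
    """Minimal UPGMA: dist is a symmetric distance matrix (list of lists),
    labels name the taxa. Returns a nested-tuple tree. Distances from a
    merged cluster to the rest are size-weighted averages."""
    clusters = [(lab, 1) for lab in labels]          # (subtree, size)
    d = [row[:] for row in dist]
    while len(clusters) > 1:
        # find the closest pair of clusters
        i, j = min(((a, b) for a in range(len(d)) for b in range(a + 1, len(d))),
                   key=lambda p: d[p[0]][p[1]])
        (ti, ni), (tj, nj) = clusters[i], clusters[j]
        merged = ((ti, tj), ni + nj)
        # size-weighted average distance from the new cluster to the others
        new_row = [(ni * d[i][k] + nj * d[j][k]) / (ni + nj)
                   for k in range(len(d)) if k not in (i, j)]
        d = [[d[a][b] for b in range(len(d)) if b not in (i, j)]
             for a in range(len(d)) if a not in (i, j)]
        for row, v in zip(d, new_row):
            row.append(v)
        d.append(new_row + [0.0])
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]
    return clusters[0][0]

# A and B are closest, so they merge first, then join C
tree = upgma([[0, 2, 6], [2, 0, 6], [6, 6, 0]], ["A", "B", "C"])
# → ("C", ("A", "B"))
```

The repeated full-matrix pair search is what makes naive UPGMA quadratic per merge, and it is exactly this inner loop that hardware acceleration targets.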
Pinthong, Watthanai; Muangruen, Panya
2016-01-01
Development of high-throughput technologies, such as next-generation sequencing, allows thousands of experiments to be performed simultaneously while reducing resource requirements. Consequently, a massive amount of experimental data is now rapidly generated. Nevertheless, the data are not readily usable or meaningful until they are further analysed and interpreted. Due to the size of the data, a high-performance computer (HPC) is required for the analysis and interpretation. However, an HPC is expensive and difficult to access. Other means were developed to allow researchers to acquire the power of an HPC without the need to purchase and maintain one, such as cloud computing services and grid computing systems. In this study, we implemented grid computing in a computer training center environment using Berkeley Open Infrastructure for Network Computing (BOINC) as a job distributor and data manager, combining all desktop computers to virtualize the HPC. Fifty desktop computers were used for setting up a grid system during the off-hours. In order to test the performance of the grid system, we adapted the Basic Local Alignment Search Tool (BLAST) to the BOINC system. Sequencing results from the Illumina platform were aligned to the human genome database by BLAST on the grid system. The result and processing time were compared to those from a single desktop computer and an HPC. The estimated durations of BLAST analysis for 4 million sequence reads on a desktop PC, HPC and the grid system were 568, 24 and 5 days, respectively. Thus, the grid implementation of BLAST by BOINC is an efficient alternative to the HPC for sequence alignment. The grid implementation by BOINC also helped tap unused computing resources during the off-hours and could be easily modified for other available bioinformatics software. PMID:27547555
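The grid speedup comes from splitting the read set into independent work units, one per worker, since BLAST alignments of different reads do not depend on each other. A sketch of that partitioning step (BOINC's actual work-unit format is not shown; the function name is illustrative):

```python
def chunk_reads(reads, n_workers):
    """Split a list of sequence reads into near-equal work units,
    one per worker, so each BLAST job can run independently."""
    size, extra = divmod(len(reads), n_workers)
    chunks, start = [], 0
    for w in range(n_workers):
        end = start + size + (1 if w < extra else 0)  # spread the remainder
        chunks.append(reads[start:end])
        start = end
    return chunks

units = chunk_reads([f"read_{i}" for i in range(10)], 3)
# three work units of sizes 4, 3, 3
```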
Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States
ERIC Educational Resources Information Center
Sung, Eunmo; Mayer, Richard E.
2012-01-01
College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…
Briggs, Andrew; Straker, Leon; Greig, Alison
2004-06-10
The objective of this study was to quantitatively analyse the sitting posture of school children interacting with both old (book) and new (laptop and desktop computer) information technologies, to test the hypothesis that posture is affected by the type of information technology (IT) used. A mixed-model design was used to test the effect of IT type (within subjects) and age and gender (between subjects). The sitting posture of 32 children aged 4-17 years was measured whilst they read from a book, laptop, and desktop computer at a standard school chair and desk. Video images were captured and then digitized to calculate mean angles for head tilt, neck flexion, trunk flexion, and gaze angle. Posture was found to be influenced by IT type (p < 0.001), age (p < 0.001) and gender (p = 0.024) and significantly correlated with the stature of the participants. Measurement of resting posture and the maximal range of motion of the upper and lower cervical spine in the sagittal plane was also undertaken. The biophysical impact and the suitability of the three different information technologies are discussed.
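Deriving a segment angle from digitized video markers reduces to planar trigonometry. A generic sketch (the study's exact angle definitions are not reproduced here; the marker coordinates are illustrative):

```python
import math

def segment_angle(p_proximal, p_distal):
    """Angle (degrees) of a body segment relative to horizontal,
    from two digitized marker coordinates (x, y)."""
    dx = p_distal[0] - p_proximal[0]
    dy = p_distal[1] - p_proximal[1]
    return math.degrees(math.atan2(dy, dx))

# e.g. a marker pair tilted 45 degrees above horizontal
angle = segment_angle((0, 0), (1, 1))  # → 45.0
```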
Qiu, Jianfeng; Wang, Guozhu; Min, Jiao; Wang, Xiaoyan; Wang, Pengcheng
2013-12-21
Our aim was to measure the performance of desktop magnetic resonance imaging (MRI) systems using specially designed phantoms, by testing imaging parameters and analysing the imaging quality. We designed multifunction phantoms with diameters of 18 and 60 mm for desktop MRI scanners in accordance with the American Association of Physicists in Medicine (AAPM) report no. 28. We scanned the phantoms with three permanent magnet 0.5 T desktop MRI systems, measured the MRI image parameters, and analysed imaging quality by comparing the data with the AAPM criteria and Chinese national standards. Image parameters included: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, signal-to-noise ratio (SNR), and image uniformity. The image parameters of three desktop MRI machines could be measured using our specially designed phantoms, and most parameters were in line with MRI quality control criterion, including: resonance frequency, high contrast spatial resolution, low contrast object detectability, slice thickness, geometrical distortion, image uniformity and slice position accuracy. However, SNR was significantly lower than in some references. The imaging test and quality control are necessary for desktop MRI systems, and should be performed with the applicable phantom and corresponding standards.
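SNR, one of the parameters measured above, is commonly computed as the mean signal in a foreground region of interest divided by the noise spread in a background (air) region. A generic sketch, not the AAPM report's exact method; pixel values are toy data:

```python
import statistics

def snr(signal_roi, background_roi):
    """Signal-to-noise ratio: mean of a signal ROI divided by the
    standard deviation of a background (air) ROI."""
    return statistics.mean(signal_roi) / statistics.stdev(background_roi)

# toy pixel samples from a phantom image
ratio = snr([100, 102, 98, 101], [2, 1, 3, 2])
```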
Using Avizo Software on the Peregrine System | High-Performance Computing
Avizo can be run remotely from the Peregrine visualization node. First, launch a TurboVNC remote desktop. Then, from a terminal in that remote desktop:
% module load avizo
% vglrun avizo
Running Locally: Avizo can…
Desktop Publishing: A New Frontier for Instructional Technologists.
ERIC Educational Resources Information Center
Bell, Norman T.; Warner, James W.
1986-01-01
Discusses new possibilities that computers and laser printers offer instructional technologists. Includes a brief history of printed communications, a description of new technological advances referred to as "desktop publishing," and suggests the application of this technology to instructional tasks. (TW)
CIM for 300-mm semiconductor fab
NASA Astrophysics Data System (ADS)
Luk, Arthur
1997-08-01
Five years ago, factory automation (F/A) was not prevalent in the fab. Today, facing a drastically changed market and intense competition, management requests that plant-floor data be forwarded to their desktop computers. This growing demand has rapidly pushed F/A toward computer-integrated manufacturing (CIM). Through personalization, computers were successfully reduced in size so that they fit on our desktops; the PC initiated a new computing era. With the advent of the network, the network computer (NC) creates fresh problems for us. As we plan to invest more than $3 billion to build a new 300-mm fab, the next-generation technology raises a challenging bar.
Thermal Tracker: The Secret Lives of Bats and Birds Revealed
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Offshore wind developers and stakeholders can accelerate the sustainable, widespread deployment of offshore wind using a new open-source software program, called ThermalTracker. Researchers can now collect the data they need to better understand the potential effects of offshore wind turbines on bird and bat populations. This plug-and-play software can be used with any standard desktop computer, thermal camera, and statistical software to identify species and behaviors of animals in offshore locations.
Galileo battery testing and the impact of test automation
NASA Technical Reports Server (NTRS)
Pertuch, W. T.; Dils, C. T.
1985-01-01
Test complexity, changes of test specifications, and the demand for tight control of tests led to the development of automated testing used for Galileo and other projects. The use of standardized interfacing, i.e., IEEE-488, with desktop computers and test instruments, resulted in greater reliability, repeatability, and accuracy of both control and data reporting. Increased flexibility of test programming has reduced costs by permitting a wide spectrum of test requirements at one station rather than many stations.
MICROPROCESSOR-BASED DATA-ACQUISITION SYSTEM FOR A BOREHOLE RADAR.
Bradley, Jerry A.; Wright, David L.
1987-01-01
An efficient microprocessor-based system is described that permits real-time acquisition, stacking, and digital recording of data generated by a borehole radar system. Although the system digitizes, stacks, and records independently of a computer, it is interfaced to a desktop computer for program control over system parameters such as sampling interval, number of samples, and the number of times the data are stacked prior to recording on nine-track tape, and for graphics display of the digitized data. The data can be transferred to the desktop computer during recording, or played back from a tape at a later time. Using the desktop computer, the operator observes results while recording data and generates hard-copy graphics in the field. Thus, the radar operator can immediately evaluate the quality of data being obtained, modify system parameters, study the radar logs before leaving the field, and rerun borehole logs if necessary. The system has proven to be reliable in the field and has increased productivity both in the field and in the laboratory.
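"Stacking" here means averaging repeated traces sample-by-sample, which suppresses uncorrelated noise (SNR improves roughly with the square root of the stack count). A minimal sketch with toy sample values:

```python
def stack_traces(traces):
    """Average repeated radar traces sample-by-sample.
    traces: list of equal-length lists of digitized samples."""
    n = len(traces)
    return [sum(samples) / n for samples in zip(*traces)]

# three noisy repeats of the same trace average toward the true signal
stacked = stack_traces([[1, 4, 2], [3, 4, 2], [2, 4, 2]])
# → [2.0, 4.0, 2.0]
```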
Morris, Paul D; Silva Soto, Daniel Alejandro; Feher, Jeroen F A; Rafiroiu, Dan; Lungu, Angela; Varma, Susheel; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P
2017-08-01
Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel "pseudotransient" analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf >24 h for transient analysis) using a desktop PC, with <1% error relative to that of full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be accurately computed in less time than it takes to make invasive measurements.
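FFR itself is the ratio of mean distal coronary pressure to mean aortic pressure under hyperemia; the "virtual" version obtains the distal pressure from CFD rather than a pressure wire. A toy sketch of the final ratio (pressure values are illustrative; the 0.80 cutoff is the conventional clinical threshold):

```python
def ffr(p_distal_mmhg, p_aortic_mmhg):
    """Fractional flow reserve: mean distal coronary pressure over mean
    aortic pressure under hyperemia. Values <= 0.80 are conventionally
    treated as physiologically significant."""
    return p_distal_mmhg / p_aortic_mmhg

value = ffr(72, 90)  # → 0.80, i.e. at the significance threshold
```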
Desktop computer graphics for RMS/payload handling flight design
NASA Technical Reports Server (NTRS)
Homan, D. J.
1984-01-01
A computer program, the Multi-Adaptive Drawings, Renderings and Similitudes (MADRAS) program, is discussed. The modeling program, written for a desktop computer system (the Hewlett-Packard 9845/C), is written in BASIC and uses modular construction of objects while generating both wire-frame and hidden-line drawings from any viewpoint. The dimensions and placement of objects are user definable. Once the hidden-line calculations are made for a particular viewpoint, the viewpoint may be rotated in pan, tilt, and roll without further hidden-line calculations. The use and results of this program are discussed.
A VM-shared desktop virtualization system based on OpenStack
NASA Astrophysics Data System (ADS)
Liu, Xi; Zhu, Mingfa; Xiao, Limin; Jiang, Yuanjie
2018-04-01
With the increasing popularity of cloud computing, desktop virtualization has risen in recent years as a branch of virtualization technology. However, existing desktop virtualization systems are mostly designed in a one-to-one mode, in which one VM can be accessed by only one user. Meanwhile, previous desktop virtualization systems perform weakly in terms of response time and cost saving. This paper proposes a novel VM-shared desktop virtualization system based on the OpenStack platform. We modified the connecting process and the display-data transmission process of the remote display protocol SPICE to support the VM-shared function. In addition, we propose a server-push display mode to improve the interactive user experience. The experimental results show that our system performs well in response time and achieves low CPU consumption.
The Nimrod computational workbench: a case study in desktop metacomputing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abramson, D.; Sosic, R.; Foster, I.
The coordinated use of geographically distributed computers, or metacomputing, can in principle provide more accessible and cost-effective supercomputing than conventional high-performance systems. However, we lack evidence that metacomputing systems can be made easily usable, or that there exist large numbers of applications able to exploit metacomputing resources. In this paper, we present work that addresses both these concerns. The basis for this work is a system called Nimrod that provides a desktop problem-solving environment for parametric experiments. We describe how Nimrod has been extended to support the scheduling of computational resources located in a wide-area environment, and report on an experiment in which Nimrod was used to schedule a large parametric study across the Australian Internet. The experiment provided both new scientific results and insights into Nimrod capabilities. We relate the results of this experiment to lessons learned from the I-WAY distributed computing experiment, and draw conclusions as to how Nimrod and I-WAY-like computing environments should be developed to support desktop metacomputing.
Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists.
Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco
2013-01-01
Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low cost graphic cards (graphic processor units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphic card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphic card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.
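The building block of a deep belief network is the restricted Boltzmann machine, trained layer-by-layer with contrastive divergence; the matrix operations involved are exactly what GPU-backed high-level routines accelerate. A minimal NumPy sketch of one CD-1 update (sizes and learning rate are illustrative; biases are omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)

def cd1_step(v0, W, lr=0.1):
    """One contrastive-divergence (CD-1) weight update for a Bernoulli RBM.
    v0: batch of visible vectors, shape (batch, n_vis); W: (n_vis, n_hid)."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    h0 = sigmoid(v0 @ W)                          # hidden activation probabilities
    h_sample = (rng.random(h0.shape) < h0) * 1.0  # stochastic hidden states
    v1 = sigmoid(h_sample @ W.T)                  # reconstructed visibles
    h1 = sigmoid(v1 @ W)                          # hidden probs for reconstruction
    # positive-phase minus negative-phase statistics, averaged over the batch
    return W + lr * (v0.T @ h0 - v1.T @ h1) / len(v0)

W = rng.standard_normal((6, 3)) * 0.01            # small random init
v = rng.integers(0, 2, size=(4, 6)).astype(float)  # toy binary data batch
W = cd1_step(v, W)                                 # shape stays (6, 3)
```

On a GPU the same `@` products run on the card with no change to the code structure, which is the point the authors make about high-level routines.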
Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.
Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J
2017-10-30
An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction-connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web-going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
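A chemistry record stored as JSON can be sketched as below; the field names are illustrative, not the platform's actual schema:

```python
import json

# Hypothetical record for a water molecule; keys are illustrative,
# not the Open Chemistry platform's real structure.
record = {
    "formula": "H2O",
    "atoms": {
        "elements": ["O", "H", "H"],
        "coords": [[0.0, 0.0, 0.0], [0.757, 0.586, 0.0], [-0.757, 0.586, 0.0]],
    },
    "calculation": {"method": "DFT", "basis": "6-31G*", "energy_hartree": -76.4},
}

text = json.dumps(record)          # serialize for a REST response
assert json.loads(text) == record  # round-trips losslessly
```

Unlike XML, this structure parses directly into native dictionaries and lists in most languages, which is what makes it easy to process across languages and hand to a JavaScript client.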
Desktop Virtualization: Applications and Considerations
ERIC Educational Resources Information Center
Hodgman, Matthew R.
2013-01-01
As educational technology continues to rapidly become a vital part of a school district's infrastructure, desktop virtualization promises to provide cost-effective and education-enhancing solutions to school-based computer technology problems in school systems locally and abroad. This article outlines the history of and basic concepts behind…
Modeling of Heat Transfer and Ablation of Refractory Material Due to Rocket Plume Impingement
NASA Technical Reports Server (NTRS)
Harris, Michael F.; Vu, Bruce T.
2012-01-01
CR Tech's Thermal Desktop-SINDA/FLUINT software was used in the thermal analysis of a flame deflector design for Launch Complex 39B at Kennedy Space Center, Florida. The analysis of the flame deflector takes into account heat transfer due to plume impingement from expected vehicles to be launched at KSC. The heat flux from the plume was computed using computational fluid dynamics provided by Ames Research Center in Moffet Field, California. The results from the CFD solutions were mapped onto a 3-D Thermal Desktop model of the flame deflector using the boundary condition mapping capabilities in Thermal Desktop. The ablation subroutine in SINDA/FLUINT was then used to model the ablation of the refractory material.
Basics of Desktop Publishing. Teacher Edition.
ERIC Educational Resources Information Center
Beeby, Ellen
This color-coded teacher's guide contains curriculum materials designed to give students an awareness of various desktop publishing techniques before they determine their computer hardware and software needs. The guide contains six units, each of which includes some or all of the following basic components: objective sheet, suggested activities…
Desktop Virtualization in Action: Simplicity Is Power
ERIC Educational Resources Information Center
Fennell, Dustin
2010-01-01
Discover how your institution can better manage and increase access to instructional applications and desktops while providing a blended learning environment. Receive practical insight into how academic computing virtualization can be leveraged to enhance education at your institution while lowering Total Cost of Ownership (TCO) and reducing the…
Kiely, Daniel J; Stephanson, Kirk; Ross, Sue
2011-10-01
Low-cost laparoscopic box trainers built using home computers and webcams may provide residents with a useful tool for practice at home. This study set out to evaluate the image quality of low-cost laparoscopic box trainers compared with a commercially available model. Five low-cost laparoscopic box trainers including the components listed were compared in random order to one commercially available box trainer: A (high-definition USB 2.0 webcam, PC laptop), B (Firewire webcam, Mac laptop), C (high-definition USB 2.0 webcam, Mac laptop), D (standard USB webcam, PC desktop), E (Firewire webcam, PC desktop), and F (the TRLCD03 3-DMEd Standard Minimally Invasive Training System). Participants observed still image quality and performed a peg transfer task using each box trainer. Participants rated still image quality, image quality with motion, and whether the box trainer had sufficient image quality to be useful for training. Sixteen residents in obstetrics and gynecology took part in the study. The box trainers showing no statistically significant difference from the commercially available model were A, B, C, D, and E for still image quality; A for image quality with motion; and A and B for usefulness of the simulator based on image quality. The cost of the box trainers A-E is approximately $100 to $160 each, not including a computer or laparoscopic instruments. Laparoscopic box trainers built from a high-definition USB 2.0 webcam with a PC (box trainer A) or from a Firewire webcam with a Mac (box trainer B) provide image quality comparable with a commercial standard.
Desktop Security ... Now More than Ever
ERIC Educational Resources Information Center
Huber, Joe
2005-01-01
Desktop security is the foundation of an overall security plan in K-12 education. The National Educational Technology Standards (NETS) state that students at all grade levels should know how to change the default settings of an operating system and its applications.
ERIC Educational Resources Information Center
Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd
2008-01-01
This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD) based on the Computer Algebra System Mathematica was used for symbolical and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…
ERIC Educational Resources Information Center
Tenopir, Carol
2004-01-01
With wireless connectivity and small laptop computers, people are no longer tied to the desktop for online searching. Handheld personal digital assistants (PDAs) offer even greater portability. So far, the most common uses of PDAs are as calendars and address books, or to interface with a laptop or desktop machine. More advanced PDAs, like…
Designing Design into an Advanced Desktop Publishing Course (A Teaching Tip).
ERIC Educational Resources Information Center
Guthrie, Jim
1995-01-01
Describes an advanced desktop publishing course that combines instruction in a few advanced techniques for using software with extensive discussion of such design principles as consistency, proportion, asymmetry, appropriateness, contrast, and color. Describes computer hardware and software, class assignments, problems, and the rationale for such…
WEB-BASED MODELING OF A FERTILIZER SOLUTION SPILL IN THE OHIO RIVER
Environmental computer models are usually desktop models. Some web-enabled models are beginning to appear where the user can use a browser to run the models on a central web server. Several issues arise when a desktop model is transferred to a web architecture. This paper discusses...
CD-based image archival and management on a hybrid radiology intranet.
Cox, R D; Henri, C J; Bret, P M
1997-08-01
This article describes the design and implementation of a low-cost image archival and management solution on a radiology network consisting of UNIX, IBM personal computer-compatible (IBM, Purchase, NY) and Macintosh (Apple Computer, Cupertino, CA) workstations. The picture archiving and communications system (PACS) is modular and scalable, and conforms to the Digital Imaging and Communications in Medicine (DICOM) 3.0 standard for image transfer, storage and retrieval. Image data is made available on soft-copy reporting workstations by a work-flow management scheme and on desktop computers through a World Wide Web (WWW) interface. Data archival is based on recordable compact disc (CD) technology and is automated. The project has allowed the radiology department to eliminate the use of film in magnetic resonance (MR) imaging, computed tomography (CT) and ultrasonography.
Defining protein electrostatic recognition processes
NASA Astrophysics Data System (ADS)
Getzoff, Elizabeth D.; Roberts, Victoria A.
The objective is to elucidate the nature of electrostatic forces controlling protein recognition processes by using a tightly coupled computational and interactive computer graphics approach. The TURNIP program was developed to determine the most favorable precollision orientations for two molecules by systematic search of all orientations and evaluation of the resulting electrostatic interactions. TURNIP was applied to the transient interaction between two electron transfer metalloproteins, plastocyanin and cytochrome c. The results suggest that the productive electron-transfer complex involves interaction of the positive region of cytochrome c with the negative patch of plastocyanin, consistent with experimental data. Application of TURNIP to the formation of the stable complex between the HyHEL-5 antibody and its protein antigen lysozyme showed that long-distance electrostatic forces guide lysozyme toward the HyHEL-5 binding site, but do not fine-tune its orientation. Determination of docked antigen/antibody complexes requires including steric as well as electrostatic interactions, as was done for the U10 mutant of the anti-phosphorylcholine antibody S107. The graphics program Flex, a convenient desktop workstation program for visualizing molecular dynamics and normal mode motions, was enhanced. Flex now has a user interface and was rewritten to use standard graphics libraries, so as to run on most desktop workstations.
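The systematic orientation search that TURNIP performs can be illustrated with a toy two-dimensional analogue. This is not the TURNIP code: the charges and geometry below are invented, physical constants are dropped, and the search covers only one rotation axis. It only shows the core loop of enumerating orientations, evaluating the electrostatic interaction energy for each, and keeping the most favorable one:

```python
import math

# Toy analogue of a systematic orientation search: "molecule" B is
# rotated about the origin and the Coulomb interaction energy with the
# fixed "molecule" A is evaluated at each orientation.
mol_a = [(+1.0, (4.0, 0.0)), (-1.0, (5.0, 0.0))]   # (charge, position)
mol_b = [(-1.0, (1.0, 0.0)), (+1.0, (-1.0, 0.0))]  # rotates about origin

def rotate(pos, theta):
    x, y = pos
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def coulomb_energy(theta):
    e = 0.0
    for qa, pa in mol_a:
        for qb, pb in mol_b:
            xb, yb = rotate(pb, theta)
            r = math.hypot(pa[0] - xb, pa[1] - yb)
            e += qa * qb / r   # arbitrary units, no physical constants
    return e

# Systematic search over orientations, keeping the most favorable one.
best = min(range(360), key=lambda d: coulomb_energy(math.radians(d)))
print(best)
```

With these charges the search settles on the head-to-tail dipole arrangement, mirroring the idea that long-distance electrostatics steers molecules toward favorable precollision orientations.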
Generating Alternative Engineering Designs by Integrating Desktop VR with Genetic Algorithms
ERIC Educational Resources Information Center
Chandramouli, Magesh; Bertoline, Gary; Connolly, Patrick
2009-01-01
This study proposes an innovative solution to the problem of multiobjective engineering design optimization by integrating desktop VR with genetic computing. Although this study considers the case of construction design as an example to illustrate the framework, the method can readily be extended to other engineering design problems as well.…
Meaning-Making in Online Language Learner Interactions via Desktop Videoconferencing
ERIC Educational Resources Information Center
Satar, H. Müge
2016-01-01
Online language learning and teaching in multimodal contexts has been identified as one of the key research areas in computer-assisted language learning (CALL) (Lamy, 2013; White, 2014). This paper aims to explore meaning-making in online language learner interactions via desktop videoconferencing (DVC) and in doing so illustrate multimodal transcription and…
Desktop Publishing: The Effects of Computerized Formats on Reading Speed and Comprehension.
ERIC Educational Resources Information Center
Knupfer, Nancy Nelson; McIsaac, Marina Stock
1989-01-01
Describes study that was conducted to determine the effects of two electronic text variables used in desktop publishing on undergraduate students' reading speed and comprehension. Research on text variables, graphic design, instructional text design, and computer screen design is discussed, and further studies are suggested. (22 references) (LRW)
Practical Downloading to Desktop Publishing: Enhancing the Delivery of Information.
ERIC Educational Resources Information Center
Danziger, Pamela N.
This paper is addressed to librarians and information managers who, as one of the many activities they routinely perform, frequently publish information in such formats as newsletters, manuals, brochures, forms, presentations, or reports. It is argued that desktop publishing--a personal computer-based software package used to generate documents of…
Micromagnetics on high-performance workstation and mobile computational platforms
NASA Astrophysics Data System (ADS)
Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.
2015-05-01
The feasibility of using high-performance desktop and embedded mobile computational platforms for micromagnetic simulations is presented, including multi-core Intel central processing units, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite element method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of the embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.
Oak Ridge Institutional Cluster Autotune Test Drive Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jibonananda, Sanyal; New, Joshua Ryan
2014-02-01
The Oak Ridge Institutional Cluster (OIC) provides general purpose computational resources for ORNL staff to run computation-heavy jobs that are larger than desktop applications but do not quite require the scale and power of the Oak Ridge Leadership Computing Facility (OLCF). This report details the efforts made and conclusions derived in performing a short test drive of the cluster resources on Phase 5 of the OIC. EnergyPlus was used in the analysis as a candidate user program, and the overall software environment was evaluated against anticipated challenges experienced with resources such as the shared-memory Nautilus system (JICS) and Titan (OLCF). The OIC performed within reason and was found to be acceptable in the context of running EnergyPlus simulations. The number of cores per node and the availability of scratch space per node allow non-traditional, desktop-focused applications to leverage parallel ensemble execution. Although only individual runs of EnergyPlus were executed, the software environment on the OIC appeared suitable to run ensemble simulations with some modifications to the Autotune workflow. From a standpoint of general usability, the system supports common Linux libraries, compilers, standard job scheduling software (Torque/Moab), and the OpenMPI library (the only MPI library) for MPI communications. The file system is a Panasas file system, which the literature indicates to be efficient.
Lockheed Martin Idaho Technologies Company information management technology architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, M.J.; Lau, P.K.S.
1996-05-01
The Information Management Technology Architecture (TA) is driven by the business objectives of reducing costs and improving effectiveness. The strategy is to reduce the cost of computing through standardization. The Lockheed Martin Idaho Technologies Company (LMITCO) TA is a set of standards and products for use at the Idaho National Engineering Laboratory (INEL). The TA will provide direction for information management resource acquisitions, development of information systems, formulation of plans, and resolution of issues involving LMITCO computing resources. Exceptions to the preferred products may be granted by the Information Management Executive Council (IMEC). Certain implementation and deployment strategies are inherent in the design and structure of the LMITCO TA. These include: migration from centralized toward distributed computing; deployment of the networks, servers, and other information technology infrastructure components necessary for a more integrated information technology support environment; increased emphasis on standards to make it easier to link systems and to share information; and improved use of the company's investment in desktop computing resources. The intent is for the LMITCO TA to be a living document, constantly reviewed to take advantage of industry directions to reduce costs while balancing technological diversity with business flexibility.
ERIC Educational Resources Information Center
Ausburn, Lynna J.; Ausburn, Floyd B.
2004-01-01
Virtual Reality has been defined in many different ways and now means different things in various contexts. VR can range from simple environments presented on a desktop computer to fully immersive multisensory environments experienced through complex headgear and bodysuits. In all of its manifestations, VR is basically a way of simulating or…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younkin, James R; Kuhn, Michael J; Gradle, Colleen
New Brunswick Laboratory (NBL) maintains an inventory containing thousands of plutonium and uranium certified reference materials. The current manual inventory process is well established but is lengthy and requires significant oversight and double-checking to ensure correctness. Oak Ridge National Laboratory has worked with NBL to develop and deploy a new inventory system, termed the Tagged Item Inventory System (TIIS), which utilizes handheld computers with barcode scanners and radio frequency identification (RFID) readers. Certified reference materials are identified by labels that incorporate RFID tags and barcodes. The label printing and RFID tag association processes are integrated into the main desktop software application. Software on the handheld computers syncs with software on designated desktop machines and the NBL inventory database to provide a seamless inventory process. This process includes: 1) identifying items to be inventoried, 2) downloading the current inventory information to the handheld computer, 3) using the handheld to read item and location labels, and 4) syncing the handheld computer with a designated desktop machine to analyze the results, print reports, etc. The security of this inventory software has been a major concern: designated roles linked to authenticated logins control access to the desktop software, while password protection and badge verification control access to the handheld computers. The overall system design and deployment at NBL will be presented. The performance of the system will also be discussed with respect to a small piece of the overall inventory. Future work includes performing a full inventory at NBL with the Tagged Item Inventory System and comparing performance, cost, and radiation exposures to the current manual inventory process.
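The reconciliation performed after a scan (step 4 of the process above) can be sketched as a simple set comparison. The item identifiers and locations below are invented, not NBL data; the sketch only shows how a handheld's scan results might be diffed against the downloaded inventory:

```python
# Illustrative reconciliation of scanned tags against expected inventory.
expected = {          # downloaded from the inventory database (step 2)
    "CRM-001": "Vault A",
    "CRM-002": "Vault A",
    "CRM-003": "Vault B",
}
scanned = {           # item/location tags read by the handheld (step 3)
    "CRM-001": "Vault A",
    "CRM-003": "Vault A",   # found, but in the wrong location
}

# Step 4: sync and analyze -- items never seen, and items whose scanned
# location disagrees with the database.
missing = sorted(set(expected) - set(scanned))
misplaced = sorted(i for i in scanned if scanned[i] != expected[i])
print(missing, misplaced)
```

A report generated from `missing` and `misplaced` is the kind of output that would then be double-checked before the database is updated.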
Virtual network computing: cross-platform remote display and collaboration software.
Konerding, D E
1999-04-01
VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits them back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, each unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.
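The client/server model can be sketched in miniature. This is not the RFB protocol that VNC actually speaks (real VNC encodes pixel data and sends only changed screen rectangles over a socket); the toy classes below only illustrate the flow of framebuffer images out and input events back, and why several connected clients see one shared desktop:

```python
# Toy model of the VNC flow: server owns the desktop image, clients
# receive copies of it and send input events back.

class Server:
    def __init__(self, width, height):
        self.framebuffer = [[0] * width for _ in range(height)]

    def screen_update(self):
        # Real VNC sends only changed rectangles; here we copy everything.
        return [row[:] for row in self.framebuffer]

    def handle_input(self, event):
        kind, x, y, value = event
        if kind == "pointer":
            self.framebuffer[y][x] = value

class Client:
    def __init__(self, server):
        self.server = server
        self.display = server.screen_update()

    def send_event(self, event):
        self.server.handle_input(event)
        self.display = self.server.screen_update()

server = Server(4, 3)
a, b = Client(server), Client(server)   # shared desktop: collaboration
a.send_event(("pointer", 1, 2, 7))      # client a "draws" on the desktop
b.display = server.screen_update()      # client b refreshes and sees it
print(b.display[2][1])
```

Because both clients render copies of the same server-side framebuffer, any change made through one client appears to all of them, which is the mechanism behind the shared-desktop collaboration described above.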
Digitized molecular diagnostics: reading disk-based bioassays with standard computer drives.
Li, Yunchao; Ou, Lily M L; Yu, Hua-Zhong
2008-11-01
We report herein a digital signal readout protocol for screening disk-based bioassays with standard optical drives of ordinary desktop/notebook computers. Three different types of biochemical recognition reactions (biotin-streptavidin binding, DNA hybridization, and protein-protein interaction) were performed directly on a compact disk in a line array format with the help of microfluidic channel plates. Being well-correlated with the optical darkness of the binding sites (after signal enhancement by gold nanoparticle-promoted autometallography), the reading error levels of prerecorded audio files can serve as a quantitative measure of biochemical interaction. This novel readout protocol is about 1 order of magnitude more sensitive than fluorescence labeling/scanning and has the capability of examining multiplex microassays on the same disk. Because no modification to either hardware or software is needed, it promises a platform technology for rapid, low-cost, and high-throughput point-of-care biomedical diagnostics.
Acceleration of FDTD mode solver by high-performance computing techniques.
Han, Lin; Xi, Yanping; Huang, Wei-Ping
2010-06-21
A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against the benchmark finite-difference (FD) eigenmode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that the high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than 30 times improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as that of the standard finite-difference eigenmode solver, yet it requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides even when conventional eigenvalue mode solvers are no longer applicable due to memory limitations.
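The stencil structure that makes FDTD so amenable to GPU parallelization can be seen in a toy one-dimensional update loop. The paper's solver is a 2D compact FDTD; the grid size, coefficients, and source below are invented for illustration. Each half-step updates every cell from a fixed pair of neighbors, so within a half-step the inner loop parallelizes trivially across threads:

```python
import math

# Toy 1-D FDTD leapfrog update (normalized units, Courant number 0.5).
n, steps = 200, 150
ez = [0.0] * n          # electric field samples
hy = [0.0] * n          # magnetic field samples

for t in range(steps):
    for i in range(n - 1):                # H update: independent per cell
        hy[i] += 0.5 * (ez[i + 1] - ez[i])
    for i in range(1, n):                 # E update: independent per cell
        ez[i] += 0.5 * (hy[i] - hy[i - 1])
    # Gaussian soft source injected at the grid center
    ez[n // 2] += math.exp(-((t - 30) ** 2) / 100.0)

print(max(abs(v) for v in ez))
```

On a GPU each cell update becomes one thread, which is exactly the data parallelism CUDA exploits; the remaining serial dependence is only between successive half-steps.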
Modems and More: The Computer Branches Out.
ERIC Educational Resources Information Center
Dyrli, Odvard Egil
1986-01-01
Surveys new "peripherals," electronic devices that attach to computers. Devices such as videodisc players, desktop laser printers, large screen projectors, and input mechanisms that circumvent the keyboard dramatically expand the computer's instructional uses. (Author/LHW)
Clark, Toshimasa J; McNeeley, Michael F; Maki, Jeffrey H
2014-04-01
The Liver Imaging Reporting and Data System (LI-RADS) can enhance communication between radiologists and clinicians if applied consistently. We identified an institutional need to improve liver imaging report standardization and developed handheld and desktop software to serve this purpose. We developed two complementary applications that implement the LI-RADS schema. A mobile application for iOS devices, written in the Objective-C language, allows for rapid characterization of hepatic observations under a variety of circumstances. A desktop application written in the Java language allows for comprehensive observation characterization and standardized report text generation. We chose the applications' languages and feature sets based on the computing resources of target platforms, anticipated usage scenarios, and ease of application installation, deployment, and updating. Our primary results are the publication of the core source code implementing the LI-RADS algorithm and the availability of the applications for use worldwide via our website, http://www.liradsapp.com/. The Java application is free open-source software that can be integrated into nearly any vendor's reporting system. The iOS application is distributed through Apple's iTunes App Store. The observation categorizations produced by both programs have been manually validated as correct. The iOS application has been used to characterize liver tumors during multidisciplinary conferences at our institution, and several faculty members, fellows, and residents have adopted the generated text of the Java application into their diagnostic reports. Although these two applications were developed for the specific reporting requirements of our liver tumor service, we intend to apply this development model to other diseases as well. Through semiautomated structured report generation and observation characterization, we aim to improve patient care while increasing radiologist efficiency. Published by Elsevier Inc.
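Standardized report text generation of the kind described can be sketched as a small formatting function. The field names, phrasing, and example observation below are entirely hypothetical, not the applications' actual schema or output; the sketch only shows how structured observation data maps deterministically onto reproducible report text:

```python
# Hypothetical structured-report formatter; all fields are illustrative.
def observation_text(obs):
    features = ", ".join(obs["major_features"]) or "none"
    return (f"Observation {obs['id']}: {obs['size_mm']} mm lesion, "
            f"segment {obs['segment']}. Major features: {features}. "
            f"Category: {obs['category']}.")

obs = {"id": 1, "size_mm": 14, "segment": "VII",
       "major_features": ["arterial phase hyperenhancement", "washout"],
       "category": "LR-4"}
print(observation_text(obs))
```

Because the text is generated from discrete fields rather than dictated free-form, every report phrases the same findings the same way, which is the consistency benefit the abstract attributes to semiautomated report generation.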
Smart Desktops for Teachers. ECS Issue Paper: Technology.
ERIC Educational Resources Information Center
Palaich, Robert M.; Good, Dixie Griffin; Stout, Connie; Vickery, Emily
This report presents the results of a study of how emerging technologies can help educators deliver standards-based education to K-12 students. The first section of the report provides background on the new technology offerings and defines smart desktop systems. The second section lists critical questions for decisionmakers related to general…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-20
... DoD published a proposed rule in the Federal Register at 76 FR 21847 on April 19, 2011, to add DFARS..., the contract line item may be for a desktop computer, but the actual items delivered, invoiced, and..., Desktop with 20 EA CPU, Monitor, Keyboard and Mouse. Alternative line-item structure offer where monitors...
NASA Astrophysics Data System (ADS)
Druken, K. A.; Trenham, C. E.; Steer, A.; Evans, B. J. K.; Richards, C. J.; Smillie, J.; Allen, C.; Pringle, S.; Wang, J.; Wyborn, L. A.
2016-12-01
The Australian National Computational Infrastructure (NCI) provides access to petascale data in climate, weather, Earth observations, and genomics, and terascale data in astronomy, geophysics, ecology and land use, as well as social sciences. The data is centralized in a closely integrated High Performance Computing (HPC), High Performance Data (HPD) and cloud facility. Despite this, there remain significant barriers for many users to find and access the data: simply hosting a large volume of data is not helpful if researchers are unable to find, access, and use the data for their particular needs. Use cases demonstrate the need to support a diverse range of users who are increasingly crossing traditional research discipline boundaries. To support their varying experience, access needs and research workflows, NCI has implemented an integrated data platform providing a range of services that enable users to interact with our data holdings. These services include:
- A GeoNetwork catalog, built on standardized Data Management Plans, to search collection metadata and find relevant datasets;
- Web data services to download or remotely access data via OPeNDAP, WMS, WCS and other protocols;
- Virtual Desktop Infrastructure (VDI) built on a highly integrated on-site cloud with access to both the HPC peak machine and the research data collections. The VDI is a fully featured environment allowing visualization, code development and analysis to take place in an interactive desktop environment; and
- A Learning Management System (LMS) containing User Guides, Use Case examples and Jupyter Notebooks structured into courses, so that users can teach themselves how to use these facilities with examples from our system across a range of disciplines.
We will briefly present these components, and discuss how we engage with data custodians and consumers to develop standardized data structures and services that support the range of needs.
We will also highlight some key developments that have improved user experience in utilizing the services, particularly enabling transdisciplinary science. This work combines with other developments at NCI to increase the confidence of scientists from any field to undertake research and analysis on these important data collections regardless of their preferred work environment or level of skill.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-07
....usda.gov . SUPPLEMENTARY INFORMATION: A. Background A proposed rule was published in the Federal.... Computers or other technical equipment means central processing units, laptops, desktops, computer mouses...
Creating a Parallel Version of VisIt for Microsoft Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlock, B J; Biagas, K S; Rawson, P L
2011-12-07
VisIt is a popular, free interactive parallel visualization and analysis tool for scientific data. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images or movies for presentations. VisIt was designed from the ground up to work on many scales of computers, from modest desktops up to massively parallel clusters. VisIt is comprised of a set of cooperating programs. All programs can be run locally or in client/server mode, in which some run locally and some run remotely on compute clusters. The VisIt program most able to harness today's computing power is the VisIt compute engine. The compute engine is responsible for reading simulation data from disk, processing it, and sending results or images back to the VisIt viewer program. In a parallel environment, the compute engine runs several processes, coordinating using the Message Passing Interface (MPI) library. Each MPI process reads some subset of the scientific data and filters the data in various ways to create useful visualizations. By using MPI, VisIt has been able to scale well into the thousands of processors on large computers such as dawn and graph at LLNL. The advent of multicore CPUs has made parallelism the 'new' way to achieve increasing performance. With today's computers having at least 2 cores and in many cases up to 8 and beyond, it is more important than ever to deploy parallel software that can use that computing power not only on clusters but also on the desktop. We have created a parallel version of VisIt for Windows that uses Microsoft's MPI implementation (MSMPI) to process data in parallel on the Windows desktop as well as on a Windows HPC cluster running Microsoft Windows Server 2008. Initial desktop parallel support for Windows was deployed in VisIt 2.4.0. Windows HPC cluster support has been completed and will appear in the VisIt 2.5.0 release.
We plan to continue supporting parallel VisIt on Windows so our users will be able to take full advantage of their multicore resources.
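The data-parallel pattern the compute engine uses can be sketched with the MPI ranks simulated by an ordinary loop. Nothing below is VisIt code and no real MPI is involved; it only illustrates how each rank reads its own subset of the data, filters it independently, and the partial results are combined:

```python
# Simulated data-parallel pipeline: N "ranks" each process a disjoint
# slice of the dataset, then partial results are gathered and merged.
dataset = list(range(100))        # stand-in for simulation data on disk
num_ranks = 4

def filter_subset(subset):
    # stand-in for a visualization filter, e.g. a threshold operator
    return [v for v in subset if v % 7 == 0]

partial = []
for rank in range(num_ranks):
    subset = dataset[rank::num_ranks]   # each rank reads only its slice
    partial.append(filter_subset(subset))

# Gather step: in MPI this would be a collective; here, a flatten+sort.
result = sorted(v for p in partial for v in p)
print(len(result))
```

Because each slice is filtered with no communication until the final gather, adding ranks divides the per-process work, which is why the same pattern scales from a multicore desktop to thousands of cluster processors.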
An Inverse Modeling Plugin for HydroDesktop using the Method of Anchored Distributions (MAD)
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio, C.; Over, M. W.; Rubin, Y.
2011-12-01
The CUAHSI Hydrologic Information System (HIS) software stack is based on an open and extensible architecture that facilitates the addition of new functions and capabilities at both the server side (using HydroServer) and the client side (using HydroDesktop). The HydroDesktop client plugin architecture is used here to expose a new scripting based plugin that makes use of the R statistics software as a means for conducting inverse modeling using the Method of Anchored Distributions (MAD). MAD is a Bayesian inversion technique for conditioning computational model parameters on relevant field observations yielding probabilistic distributions of the model parameters, related to the spatial random variable of interest, by assimilating multi-type and multi-scale data. The implementation of a desktop software tool for using the MAD technique is expected to significantly lower the barrier to use of inverse modeling in education, research, and resource management. The HydroDesktop MAD plugin is being developed following a community-based, open-source approach that will help both its adoption and long term sustainability as a user tool. This presentation will briefly introduce MAD, HydroDesktop, and the MAD plugin and software development effort.
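The idea of conditioning a prior parameter distribution on field observations can be shown with a toy one-parameter inversion by likelihood weighting. This is not the MAD algorithm itself (MAD conditions spatial random fields on multi-type, multi-scale data via anchored distributions); the forward model, prior, and noise level below are invented purely to show prior-to-posterior updating:

```python
import math
import random

# Toy Bayesian inversion: weight prior samples by how well the forward
# model reproduces one noisy observation.
random.seed(0)

def forward_model(k):
    return 2.0 * k            # stand-in for the computational model

observation, noise_sd = 3.0, 0.5
prior = [random.uniform(0.0, 5.0) for _ in range(20000)]  # prior samples

def likelihood(k):
    r = observation - forward_model(k)
    return math.exp(-r * r / (2 * noise_sd ** 2))

weights = [likelihood(k) for k in prior]
posterior_mean = sum(k * w for k, w in zip(prior, weights)) / sum(weights)
print(posterior_mean)
```

The uniform prior over [0, 5] collapses to a posterior concentrated near k = 1.5 (where 2k matches the observation), illustrating how assimilating data yields the probabilistic parameter distributions described above.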
Bufton, Marcia J; Marklin, Richard W; Nagurka, Mark L; Simoneau, Guy G
2006-08-15
This study aimed to compare and analyse rubber-dome desktop, spring-column desktop and notebook keyboards in terms of key stiffness and fingertip typing force. The spring-column keyboard resulted in the highest mean peak contact force (0.86N), followed by the rubber-dome desktop (0.68N) and the notebook (0.59N). All these differences were statistically significant. Likewise, the spring-column keyboard registered the highest fingertip typing force and the notebook keyboard the lowest. A comparison of forces showed the notebook (rubber dome) keyboard had the highest fingertip-to-peak contact force ratio (overstrike force), and the spring-column generated the least excess force (as a ratio of peak contact force). The results of this study could aid in optimizing computer key design, which could reduce user discomfort and fatigue.
76 FR 70861 - Promoting Efficient Spending
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-15
... heads to take even more aggressive steps to ensure the Government is a good steward of taxpayer money...., mobile phones, smartphones, desktop and laptop computers, and tablet personal computers) issued to...
Torterolo, Livia; Ruffino, Francesco
2012-01-01
In the proposed demonstration we will present DCV (Desktop Cloud Visualization): a unique technology that allows users to remotely access 2D and 3D interactive applications over a standard network. This allows geographically dispersed doctors to work collaboratively and to acquire anatomical or pathological images and visualize them for further investigation.
The Printout: Computers and Reading in the United Kingdom.
ERIC Educational Resources Information Center
Ewing, James M.
1988-01-01
Offers an overview of some reading and language arts computer projects in the United Kingdom, including language teaching and intelligent knowledge-based systems, assessment of written style by computer, and desktop publishing in the primary school. (ARH)
User manual for two simple postscript output FORTRAN plotting routines
NASA Technical Reports Server (NTRS)
Nguyen, T. X.
1991-01-01
Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high-quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only a simple plotting routine is sufficient. With the PostScript language becoming popular, more and more PostScript laser printers are now available. Simple, versatile, low-cost plotting routines that can generate output on high-quality laser printers are needed, so standard FORTRAN plotting routines that produce PostScript output are a logical choice. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the PostScript language.
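The approach rests on the fact that PostScript is plain text, so a plotting routine reduces to a few formatted writes. The sketch below uses Python rather than FORTRAN for brevity; the operators emitted (moveto, lineto, stroke, showpage) are standard PostScript, while the function name and example coordinates are invented:

```python
# Emit a minimal PostScript program that draws a polyline; any
# PostScript printer or interpreter can render the resulting text.
def polyline_ps(points):
    lines = ["%!PS-Adobe-3.0", "1 setlinewidth", "newpath"]
    x0, y0 = points[0]
    lines.append(f"{x0} {y0} moveto")        # pen down at first point
    for x, y in points[1:]:
        lines.append(f"{x} {y} lineto")      # straight segment to next
    lines += ["stroke", "showpage"]          # draw the path, eject page
    return "\n".join(lines)

ps = polyline_ps([(72, 72), (200, 300), (400, 150)])
print(ps.splitlines()[0])
```

Writing these same lines from FORTRAN `WRITE` statements is all the original routines need to do, which is why they stay small and system independent.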
Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria
2018-03-01
Recently, computer, mobile phone and Internet use has increased. This study aimed to determine the possible relation between self-reported wrist and finger symptoms (aches, pain or numbness) and the use of computers and mobile phones, and to analyze how the symptoms are specifically associated with utilizing desktop computers, portable computers or mini-computers and mobile phones. A questionnaire was sent to 15,000 working-age Finns (age 18-65); 723 respondents reported wrist and finger symptoms often or more frequently with use. Of these, over 80% used mobile phones daily and fewer than 30% used desktop computers or the Internet daily at leisure; 89.8% quite often or often experienced pain, numbness or aches in the neck, and 61.3% had aches in the hips and lower back. Only 33.7% connected their symptoms to computer use. In the future, the development of new devices and Internet services should incorporate the ergonomics of the hands and wrists.
Desktop supercomputer: what can it do?
NASA Astrophysics Data System (ADS)
Bogdanov, A.; Degtyarev, A.; Korkhov, V.
2017-12-01
The paper addresses the issues of solving complex problems that require supercomputers or multiprocessor clusters, resources now available to most researchers. Efficient distribution of high-performance computing (HPC) resources according to actual application needs has been a major research topic since HPC technologies became widely available. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on lightweight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.
Farias Zuniga, Amanda M; Côté, Julie N
2017-06-01
The effects of performing a 90-minute computer task with a laptop versus a dual monitor desktop workstation were investigated in healthy young male and female adults. Work-related musculoskeletal disorders are common among computer (especially female) users. Laptops have surpassed desktop computer sales, and working with multiple monitors has also become popular. However, few studies have provided objective evidence on how they affect the musculoskeletal system in both genders. Twenty-seven healthy participants (mean age = 24.6 years; 13 males) completed a 90-minute computer task while using a laptop or dual monitor (DualMon) desktop. Electromyography (EMG) from eight upper body muscles and visual strain were measured throughout the task. Neck proprioception was tested before and after the computer task using a head-repositioning test. EMG amplitude (root mean square [RMS]), variability (coefficients of variation [CV]), and normalized mutual information (NMI) were computed. Visual strain ( p < .01) and right upper trapezius RMS ( p = .03) increased significantly over time regardless of workstation. Right cervical erector spinae RMS and cervical NMI were smaller, while degrees of overshoot (mean = 4.15°) and end position error (mean = 1.26°) were larger in DualMon regardless of time. Effects on muscle activity were more pronounced in males, whereas effects on proprioception were more pronounced in females. Results suggest that compared to laptop, DualMon work is effective in reducing cervical muscle activity, dissociating cervical connectivity, and maintaining more typical neck repositioning patterns, suggesting some health-protective effects. This evidence could be considered when deciding on computer workstation designs.
A virtual computer lab for distance biomedical technology education.
Locatis, Craig; Vega, Anibal; Bhagwat, Medha; Liu, Wei-Li; Conde, Jose
2008-03-13
The National Library of Medicine's National Center for Biotechnology Information offers mini-courses which entail applying concepts in biochemistry and genetics to search genomics databases and other information sources. They are highly interactive and involve use of 3D molecular visualization software that can be computationally taxing. Methods were devised to offer the courses at a distance so as to provide as much functionality of a computer lab as possible, the venue where they are normally taught. The methods, which can be employed with varied videoconferencing technology and desktop sharing software, were used to deliver mini-courses at a distance in pilot applications where students could see demonstrations by the instructor and the instructor could observe and interact with students working at their remote desktops. Student ratings of the learning experience and comments to open ended questions were similar to those when the courses are offered face to face. The real time interaction and the instructor's ability to access student desktops from a distance in order to provide individual assistance and feedback were considered invaluable. The technologies and methods mimic much of the functionality of computer labs and may be usefully applied in any context where content changes frequently, training needs to be offered on complex computer applications at a distance in real time, and where it is necessary for the instructor to monitor students as they work.
Comfort with Computers in the Library.
ERIC Educational Resources Information Center
Agati, Joseph
2002-01-01
Sets forth a list of do's and don't's when integrating aesthetics, functionality, and technology into college library computer workstation furniture. The article discusses workstation access for both portable computer users and for staff, whose needs involve desktop computers that are possibly networked with printers and other peripherals. (GR)
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2012 CFR
2012-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2014 CFR
2014-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2013 CFR
2013-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2011 CFR
2011-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2010 CFR
2010-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
Temporal and spatial organization of doctors' computer usage in a UK hospital department.
Martins, H M G; Nightingale, P; Jones, M R
2005-06-01
This paper describes the use of an application accessible via distributed desktop computing and wireless mobile devices in a specialist department of a UK acute hospital. Data (application logs, in-depth interviews, and ethnographic observation) were simultaneously collected to study doctors' work via this application, when and where they accessed different areas of it, and from what computing devices. These show that the application is widely used, but in significantly different ways over time and space. For example, physicians and surgeons differ in how they use the application and in their choice of mobile or desktop computing. Consultants and junior doctors in the same teams also seem to access different sources of patient information, at different times, and from different locations. Mobile technology was used almost exclusively during the morning by groups of clinicians, predominantly for ward rounds.
Towards a Taxonomy of Metaphorical Graphical User Interfaces: Demands and Implementations.
ERIC Educational Resources Information Center
Cates, Ward Mitchell
The graphical user interface (GUI) has become something of a standard for instructional programs in recent years. One type of GUI is the metaphorical type. For example, the Macintosh GUI is based on the "desktop" metaphor, where objects one manipulates within the GUI are implied to be objects one might find on a real office desktop.…
Assessment of drug information resource preferences of pharmacy students and faculty
Hanrahan, Conor T.; Cole, Sabrina W.
2014-01-01
A 39-item survey instrument was distributed to faculty and students at Wingate University School of Pharmacy to assess student and faculty drug information (DI) resource use and access preferences. The response rate was 81% (n = 289). Faculty and professional year 2 to 4 students preferred access on laptop or desktop computers (67% and 75%, respectively), followed by smartphones (27% and 22%, respectively). Most faculty and students preferred using Lexicomp Online for drug information (53% and 74%, respectively). Results indicate that DI resource use is similar between students and faculty; laptop or desktop computers are the preferred platforms for accessing drug information. PMID:24860270
Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.
2015-12-01
Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to or the know-how to take advantage of advanced high performance computing (HPC) capability? Web based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and web based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models and facilitates the management of online data and execution of models on HPC systems. This presentation will illustrate the use of web based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions that are executed completely in the cloud, on HPC resources using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples for how this approach can be extended to other models to enhance the use of web and data services in the geosciences.
Dynamic Collaboration Infrastructure for Hydrologic Science
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Castillo, C.; Yi, H.; Jiang, F.; Jones, N.; Goodall, J. L.
2016-12-01
Data and modeling infrastructure is becoming increasingly accessible to water scientists. HydroShare is a collaborative environment that currently offers water scientists the ability to access modeling and data infrastructure in support of data intensive modeling and analysis. It supports the sharing of and collaboration around "resources" which are social objects defined to include both data and models in a structured standardized format. Users collaborate around these objects via comments, ratings, and groups. HydroShare also supports web services and cloud based computation for the execution of hydrologic models and analysis and visualization of hydrologic data. However, the quantity and variety of data and modeling infrastructure available that can be accessed from environments like HydroShare is increasing. Storage infrastructure can range from one's local PC to campus or organizational storage to storage in the cloud. Modeling or computing infrastructure can range from one's desktop to departmental clusters to national HPC resources to grid and cloud computing resources. How does one orchestrate this vast number of data and computing infrastructure without needing to correspondingly learn each new system? A common limitation across these systems is the lack of efficient integration between data transport mechanisms and the corresponding high-level services to support large distributed data and compute operations. A scientist running a hydrology model from their desktop may require processing a large collection of files across the aforementioned storage and compute resources and various national databases. To address these community challenges a proof-of-concept prototype was created integrating HydroShare with RADII (Resource Aware Data-centric collaboration Infrastructure) to provide software infrastructure to enable the comprehensive and rapid dynamic deployment of what we refer to as "collaborative infrastructure." 
In this presentation we discuss the results of this proof-of-concept prototype which enabled HydroShare users to readily instantiate virtual infrastructure marshaling arbitrary combinations, varieties, and quantities of distributed data and computing infrastructure in addressing big problems in hydrology.
Where the Cloud Meets the Commons
ERIC Educational Resources Information Center
Ipri, Tom
2011-01-01
Changes presented by cloud computing--shared computing services, applications, and storage available to end users via the Internet--have the potential to seriously alter how libraries provide services, not only remotely, but also within the physical library, specifically concerning challenges facing the typical desktop computing experience.…
76 FR 43278 - Privacy Act; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... computer (PC). The Security Management Officer's office remains locked when not in use. RETENTION AND... records to include names, addresses, social security numbers, service computation dates, leave usage data... that resides on a desktop computer. RETRIEVABILITY: Records maintained in file folders are indexed and...
Graphics processing unit (GPU)-based computation of heat conduction in thermally anisotropic solids
NASA Astrophysics Data System (ADS)
Nahas, C. A.; Balasubramaniam, Krishnan; Rajagopal, Prabhu
2013-01-01
Numerical modeling of anisotropic media is a computationally intensive task, since it brings additional complexity to the field problem in that the physical properties differ in different directions. Widely used in the aerospace industry because of their light weight, composite materials are a very good example of thermally anisotropic media. With advancements in video gaming technology, parallel processors are much cheaper today, and accessibility to higher-end graphical processing devices has increased dramatically over the past couple of years. Since these massively parallel GPUs are very good at handling floating point arithmetic, they provide a new platform for engineers and scientists to accelerate their numerical models using commodity hardware. In this paper we implement a parallel finite difference model of thermal diffusion through anisotropic media using NVIDIA CUDA (Compute Unified Device Architecture). We use the NVIDIA GeForce GTX 560 Ti as our primary computing device, which consists of 384 CUDA cores clocked at 1645 MHz, with a standard desktop PC as the host platform. We compare the results against a standard CPU implementation for accuracy and speed and draw implications for simulation using the GPU paradigm.
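The paper's CUDA kernel is not given in the abstract. As a hedged CPU sketch of the stencil each GPU thread would evaluate in parallel for one grid point, the following NumPy update advances an orthotropic diffusion grid one explicit time step; the cross-derivative terms of a fully anisotropic conductivity tensor are omitted for brevity, and all names and parameters are illustrative.

```python
import numpy as np

def diffuse_step(T, kx, ky, dt=0.1, dx=1.0, dy=1.0):
    """One explicit finite-difference update of dT/dt = kx*T_xx + ky*T_yy.

    Orthotropic case: distinct conductivities along the two axes, the
    simplest form of thermal anisotropy (no cross terms). Each grid point's
    update is independent, which is exactly what maps one point per CUDA
    thread in the GPU version.
    """
    # Centered second differences along each axis (periodic via roll;
    # boundaries are overwritten below).
    Txx = (np.roll(T, 1, 0) - 2 * T + np.roll(T, -1, 0)) / dx**2
    Tyy = (np.roll(T, 1, 1) - 2 * T + np.roll(T, -1, 1)) / dy**2
    Tn = T + dt * (kx * Txx + ky * Tyy)
    Tn[0, :] = Tn[-1, :] = Tn[:, 0] = Tn[:, -1] = 0.0  # fixed cold boundary
    return Tn
```

With kx > ky, heat from a point source spreads faster along the first axis, which is the qualitative signature of anisotropic conduction; stability of the explicit scheme requires dt small relative to dx²/(kx + ky).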
Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei
2012-01-01
Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367
Pongor, Lőrinc S; Vera, Roberto; Ligeti, Balázs
2014-01-01
Next generation sequencing (NGS) of metagenomic samples is becoming a standard approach to detect individual species or pathogenic strains of microorganisms. Computer programs used in the NGS community have to balance between speed and sensitivity and as a result, species or strain level identification is often inaccurate and low abundance pathogens can sometimes be missed. We have developed Taxoner, an open source, taxon assignment pipeline that includes a fast aligner (e.g. Bowtie2) and a comprehensive DNA sequence database. We tested the program on simulated datasets as well as experimental data from Illumina, IonTorrent, and Roche 454 sequencing platforms. We found that Taxoner performs as well as, and often better than BLAST, but requires two orders of magnitude less running time, meaning that it can be run on desktop or laptop computers. Taxoner is slower than the approaches that use small marker databases but is more sensitive due to the comprehensive reference database. In addition, it can be easily tuned to specific applications using small tailored databases. When applied to metagenomic datasets, Taxoner can provide a functional summary of the genes mapped and can provide strain level identification. Taxoner is written in C for Linux operating systems. The code and documentation are available for research applications at http://code.google.com/p/taxoner.
Feasibility of video codec algorithms for software-only playback
NASA Astrophysics Data System (ADS)
Rodriguez, Arturo A.; Morse, Ken
1994-05-01
Software-only video codecs can provide good playback performance in desktop computers with a 486 or 68040 CPU running at 33 MHz without special hardware assistance. Typically, playback of compressed video can be categorized into three tasks: the actual decoding of the video stream, color conversion, and the transfer of decoded video data from system RAM to video RAM. By current standards, good playback performance is the decoding and display of video streams of 320 by 240 (or larger) compressed frames at 15 (or greater) frames per second. Software-only video codecs have evolved by modifying and tailoring existing compression methodologies to suit video playback in desktop computers. In this paper we examine the characteristics used to evaluate software-only video codec algorithms, namely: image fidelity (i.e., image quality), bandwidth (i.e., compression), ease-of-decoding (i.e., playback performance), memory consumption, compression to decompression asymmetry, scalability, and delay. We discuss the tradeoffs among these variables and the compromises that can be made to achieve low numerical complexity for software-only playback. Frame-differencing approaches are described since software-only video codecs typically employ them to enhance playback performance. To complement other papers that appear in this session of the Proceedings, we review methods derived from binary pattern image coding since these methods are amenable for software-only playback. In particular, we introduce a novel approach called pixel distribution image coding.
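As an illustration of the frame-differencing idea the abstract mentions (spend decode and transfer effort only on the blocks that changed since the previous frame), here is a toy sketch; the block size, threshold, and function name are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def frame_diff_blocks(prev, curr, block=8, thresh=4.0):
    """Toy frame-differencing pass: return coordinates of blocks whose mean
    absolute difference from the previous frame exceeds a threshold.

    A software-only codec would encode (and later redraw) only these blocks,
    reusing the rest of the previous frame, which is what makes 15+ fps
    playback feasible on modest CPUs.
    """
    h, w = curr.shape
    changed = []
    for by in range(0, h, block):
        for bx in range(0, w, block):
            d = np.abs(curr[by:by+block, bx:bx+block].astype(float) -
                       prev[by:by+block, bx:bx+block].astype(float)).mean()
            if d > thresh:
                changed.append((by, bx))
    return changed
```

In a static scene only a few blocks change per frame, so both the bandwidth and the decode cost drop roughly in proportion to the fraction of changed blocks.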
Template Interfaces for Agile Parallel Data-Intensive Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilberto Z.
Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating data sets large enough that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
Automation of electromagnetic compatibility (EMC) test facilities
NASA Technical Reports Server (NTRS)
Harrison, C. A.
1986-01-01
Efforts to automate electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center are discussed. The present facility is used to accomplish a battery of nine standard tests (with limited variations) designed to certify EMC of Shuttle payload equipment. Prior to this project, some EMC tests were partially automated, but others were performed manually. Software was developed to integrate all testing by means of a desktop computer-controller. Near real-time data reduction and onboard graphics capabilities permit immediate assessment of test results. Provisions for disk storage of test data permit computer production of the test engineer's certification report. Software flexibility permits variation in the test procedure, the ability to examine more closely those frequency bands which indicate compatibility problems, and the capability to incorporate additional test procedures.
Addressing Small Computers in the First OS Course
ERIC Educational Resources Information Center
Nutt, Gary
2006-01-01
Small computers are emerging as important components of the contemporary computing scene. Their operating systems vary from specialized software for an embedded system to the same style of OS used on a generic desktop or server computer. This article describes a course in which systems are classified by their hardware capability and the…
75 FR 32915 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-10
... used to authenticate authorized desktop and laptop computer users. Computer servers are scanned monthly... data is also used for management and statistical reports and studies. Routine uses of records... duties. The computer files are password protected with access restricted to authorized users. Records are...
Emotion scents: a method of representing user emotions on GUI widgets
NASA Astrophysics Data System (ADS)
Cernea, Daniel; Weber, Christopher; Ebert, Achim; Kerren, Andreas
2013-01-01
The world of desktop interfaces has been dominated for years by the concept of windows and standardized user interface (UI) components. Still, while supporting the interaction and information exchange between the users and the computer system, graphical user interface (GUI) widgets are rather one-sided, neglecting to capture the subjective facets of the user experience. In this paper, we propose a set of design guidelines for visualizing user emotions on standard GUI widgets (e.g., buttons, check boxes, etc.) in order to enrich the interface with a new dimension of subjective information by adding support for emotion awareness as well as post-task analysis and decision making. We highlight the use of an EEG headset for recording the various emotional states of the user while he/she is interacting with the widgets of the interface. We propose a visualization approach, called emotion scents, that allows users to view emotional reactions corresponding to different GUI widgets without influencing the layout or changing the positioning of these widgets. Our approach does not focus on highlighting the emotional experience during the interaction with an entire system, but on representing the emotional perceptions and reactions generated by the interaction with a particular UI component. Our research is motivated by enabling emotional self-awareness and subjectivity analysis through the proposed emotion-enhanced UI components for desktop interfaces. These assumptions are further supported by an evaluation of emotion scents.
Optimizing R with SparkR on a commodity cluster for biomedical research.
Sedlmayr, Martin; Würfl, Tobias; Maier, Christian; Häberle, Lothar; Fasching, Peter; Prokosch, Hans-Ulrich; Christoph, Jan
2016-12-01
Medical researchers are challenged today by the enormous amount of data collected in healthcare. Analysis methods such as genome-wide association studies (GWAS) are often computationally intensive and thus require enormous resources to be performed in a reasonable amount of time. While dedicated clusters and public clouds may deliver the desired performance, their use requires upfront financial effort or anonymized data, which is often not possible for preliminary or occasional tasks. We explored the possibilities of building a private, flexible cluster for processing scripts in R based on commodity, non-dedicated hardware in our department. For this, a GWAS calculation in R on a single desktop computer, a Message Passing Interface (MPI) cluster, and a SparkR cluster were compared with regard to performance, scalability, quality, and simplicity. The original script had a projected runtime of three years on a single desktop computer. Optimizing the script in R already yielded a significant reduction in computing time (two weeks). By using R-MPI and SparkR, we were able to parallelize the computation and reduce the time to less than three hours (2.6 h) on already available, standard office computers. While MPI is a proven approach in high-performance clusters, it requires rather static, dedicated nodes. SparkR and its Hadoop siblings allow for a dynamic, elastic environment with automated failure handling. SparkR also scales better with the number of nodes in the cluster than MPI due to optimized data communication. R is a popular environment for clinical data analysis. The new SparkR solution offers elastic resources and allows supporting big data analysis using R even on non-dedicated resources with minimal change to the original code. To unleash the full potential, additional effort should be invested to customize and improve the algorithms, especially with regard to data distribution. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
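The parallelization the abstract describes works because a GWAS decomposes into independent per-SNP tests. As a language-neutral illustration (not the authors' R code; the statistic and data here are toy stand-ins), the same map-over-SNPs pattern in Python with the standard library:

```python
# Illustrative sketch of the embarrassingly parallel structure of a GWAS:
# one independent association test per SNP, mapped over worker processes.
from concurrent.futures import ProcessPoolExecutor
import random

def assoc_stat(snp):
    """Toy per-SNP statistic: squared difference of mean allele counts
    between cases and controls (a stand-in for a real association test)."""
    cases, controls = snp
    return (sum(cases) / len(cases) - sum(controls) / len(controls)) ** 2

def run_serial(snps):
    return [assoc_stat(s) for s in snps]

def run_parallel(snps, workers=4):
    # Each SNP's test is independent of the others, so the map
    # parallelizes trivially -- the same property R-MPI and SparkR exploit.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(assoc_stat, snps, chunksize=64))

if __name__ == "__main__":
    rng = random.Random(42)
    snps = [([rng.randint(0, 2) for _ in range(50)],
             [rng.randint(0, 2) for _ in range(50)]) for _ in range(500)]
    assert run_parallel(snps) == run_serial(snps)
```

The serial and parallel runs return identical results; only the scheduling changes, which is why the authors could move from desktop R to MPI to SparkR with minimal changes to the original code.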
Code of Federal Regulations, 2012 CFR
2012-01-01
..., or stored by electronic means. E-mail means a document created or received on a computer network for... conduct of the business of a regulated entity or the Office of Finance (which business, in the case of the... is stored or located, including network servers, desktop or laptop computers and handheld computers...
Code of Federal Regulations, 2013 CFR
2013-01-01
..., or stored by electronic means. E-mail means a document created or received on a computer network for... conduct of the business of a regulated entity or the Office of Finance (which business, in the case of the... is stored or located, including network servers, desktop or laptop computers and handheld computers...
The Natural Link between Teaching History and Computer Skills.
ERIC Educational Resources Information Center
Farnworth, George M.
1992-01-01
Suggests that, because both history and computers are information-based, there is a natural link between the two. Argues that history teachers should exploit the technology to help students to understand history while they become computer literate. Points out uses for databases, word processing, desktop publishing, and telecommunications in…
Training in pathology informatics: implementation at the University of Pittsburgh.
Harrison, James H; Stewart, Jimmie
2003-08-01
Pathology informatics is generally recognized as an important component of pathology training, but the scope, form, and goals of informatics training vary substantially between pathology residency programs. The Training and Education Committee of the Association for Pathology Informatics (API TEC) has developed a standard set of knowledge and skills objectives that are recommended for inclusion in pathology informatics training and may serve to standardize and formalize training programs in this area. The University of Pittsburgh (Pittsburgh, Pa) core rotation in pathology informatics includes most of these goals and is offered as an implementation model for pathology informatics training. The core rotation in pathology informatics is a 3-week, full-time rotation including didactic sessions and hands-on laboratories. Topics include general desktop computing and the Internet, but the primary focus of the rotation is vocabulary and concepts related to enterprise and pathology information systems, pathology practice, and research. The total contact time is 63 hours, and a total of 19 faculty and staff contribute. Pretests and posttests are given at the start and end of the rotation. Performance and course evaluation data were collected for 3 years (a total of 21 residents). The rotation implements 84% of the knowledge objectives and 94% of the skills objectives recommended by the API TEC. Residents scored an average of about 20% on the pretest and about 70% on the posttest, an average increase of about 50 percentage points over the course. Posttest scores did not correlate with pretest scores or self-assessed computer skill level. The size of the pretest/posttest difference correlated negatively with the pretest scores and self-assessed computing skill level.
Pretest scores were generally low regardless of whether residents were familiar with desktop computing and productivity applications, indicating that even residents who are computer "savvy" have limited knowledge of pathology informatics topics. Posttest scores showed that all residents' knowledge increased substantially during the course and that residents who were computing novices were not disadvantaged. In fact, novices tended to have higher pretest/posttest differences, indicating that the rotation effectively supported initially less knowledgeable residents in "catching up" to their peers and achieving an appropriate competency level. This rotation provides a formal training model that implements the API TEC recommendations with demonstrated success.
NASA Astrophysics Data System (ADS)
Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.
2010-12-01
A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web-service queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review and introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series, and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit, which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages.
HydroDesktop plug-ins that are presently available or currently under development within the project and by third party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.
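As a rough illustration of what a client like HydroDesktop does after calling a Water Data Service, the sketch below parses a simplified, un-namespaced WaterML-style response. The element names are abbreviated assumptions for the example; real WaterML 1.x documents are namespaced and carry much richer metadata:

```python
# Sketch: extracting a time series from a simplified WaterML-style
# XML response. Real responses are namespaced; this structure is a
# pared-down stand-in for illustration only.
import xml.etree.ElementTree as ET

SAMPLE = """
<timeSeriesResponse>
  <timeSeries>
    <variable><variableName>Discharge</variableName></variable>
    <values>
      <value dateTime="2010-01-01T00:00:00">1.32</value>
      <value dateTime="2010-01-01T01:00:00">1.28</value>
    </values>
  </timeSeries>
</timeSeriesResponse>
"""

def extract_series(xml_text):
    """Return (variable name, [(timestamp, value), ...]) from the response."""
    root = ET.fromstring(xml_text)
    name = root.findtext(".//variableName")
    points = [(v.get("dateTime"), float(v.text)) for v in root.iter("value")]
    return name, points

name, points = extract_series(SAMPLE)
```

Once in this tabular form, the series can be handed to graphing, editing, or export plug-ins of the kind listed above.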
An Ecological Framework for Cancer Communication: Implications for Research
Intille, Stephen S; Zabinski, Marion F
2005-01-01
The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. 
This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614
NASA Astrophysics Data System (ADS)
Fisher, W. I.
2017-12-01
The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific datasets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, a netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The Docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream and outline how to use Cloudstream to run an existing desktop application in the cloud and access it remotely.
A hardware-in-the-loop simulation program for ground-based radar
NASA Astrophysics Data System (ADS)
Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna
2011-06-01
A radar system created using an embedded computer system needs testing. The way to test an embedded computer system is different from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the outputs of the radar. Often, not all of the building blocks of the radar system are available for testing. This requires the engineer to test parts of the radar system using a "black box" approach. A common way to test software code in a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations. The outputs are compared against the values that are expected. This requires the engineer to use valid test scenarios. We will present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows a rapid simulated evaluation of ground-based radar performance in a laboratory environment.
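The "artificial inputs, expected outputs" loop described above can be sketched in a few lines. This is an illustrative Python analogue, not the Embedded PCS code; the range calculation, stimuli, and tolerance are assumptions made for the example:

```python
# Sketch of black-box, cycle-by-cycle testing: inject an artificial
# input each cycle, read the unit's output, compare to expectations.
def radar_range_estimate(echo_delay_s, c=3.0e8):
    """Unit under test (toy): round-trip echo delay -> target range in metres."""
    return c * echo_delay_s / 2.0

def run_hil_cycles(stimuli, expected, tol=1e-6):
    """One simulated real-time loop; returns a list of failing cycles."""
    failures = []
    for cycle, (delay, want) in enumerate(zip(stimuli, expected)):
        got = radar_range_estimate(delay)
        if abs(got - want) > tol:
            failures.append((cycle, got, want))
    return failures

stimuli = [2.0e-6, 1.0e-5]   # artificial echo delays (seconds)
expected = [300.0, 1500.0]   # expected ranges (metres)
failures = run_hil_cycles(stimuli, expected)
```

In a real hardware-in-the-loop rig, `radar_range_estimate` would be the embedded system itself, driven over its I/O interfaces in real time rather than called as a function.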
Introduction to the Use of Computers in Libraries: A Textbook for the Non-Technical Student.
ERIC Educational Resources Information Center
Ogg, Harold C.
This book outlines computing and information science from the perspective of what librarians and educators need to do with computer technology and how it can help them perform their jobs more efficiently. It provides practical explanations and library applications for non-technical users of desktop computers and other library automation tools.…
Code of Federal Regulations, 2011 CFR
2011-01-01
... home computer systems of an employee; or (4) Whether the information is active or inactive. (k) Record... (e.g., e-mail, databases, spreadsheets, PowerPoint presentations, electronic reporting systems... information is stored or located, including network servers, desktop or laptop computers and handheld...
ERIC Educational Resources Information Center
Zhao, Jensen J.; Ray, Charles M.; Dye, Lee J.; Davis, Rodney
1998-01-01
Executives (n=63) and office-systems educators (n=88) recommended for workers the following categories of computer end-user skills: hardware, operating systems, word processing, spreadsheets, database, desktop publishing, and presentation. (SK)
Tablet PCs: A Physical Educator's New Clipboard
ERIC Educational Resources Information Center
Nye, Susan B.
2010-01-01
Computers in education have come a long way from the abacus of 5,000 years ago to the desktop and laptop computers of today. Computers have transformed the educational environment, and with each new iteration of smaller and more powerful machines come additional advantages for teaching practices. The Tablet PC is one. Tablet PCs are fully…
ERIC Educational Resources Information Center
Cornforth, David; Atkinson, John; Spennemann, Dirk H. R.
2006-01-01
Purpose: Many researchers require access to computer facilities beyond those offered by desktop workstations. Traditionally, these are offered either through partnerships, to share the cost of supercomputing facilities, or through purpose-built cluster facilities. However, funds are not always available to satisfy either of these options, and…
The Role of Wireless Computing Technology in the Design of Schools.
ERIC Educational Resources Information Center
Nair, Prakash
This document discusses integrating computers logically and affordably into a school building's infrastructure through the use of wireless technology. It begins by discussing why wireless networks using mobile computers are preferable to desktop machines in each classoom. It then explains the features of a wireless local area network (WLAN) and…
Ultra-Scale Computing for Emergency Evacuation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhaduri, Budhendra L; Nutaro, James J; Liu, Cheng
2010-01-01
Emergency evacuations are carried out in anticipation of a disaster such as hurricane landfall or flooding, and in response to a disaster that strikes without a warning. Existing emergency evacuation modeling and simulation tools are primarily designed for evacuation planning and are of limited value in operational support for real time evacuation management. In order to align with desktop computing, these models reduce the data and computational complexities through simple approximations and representations of real network conditions and traffic behaviors, which rarely represent real-world scenarios. With the emergence of high resolution physiographic, demographic, and socioeconomic data and supercomputing platforms, it is possible to develop micro-simulation based emergency evacuation models that can foster development of novel algorithms for human behavior and traffic assignments, and can simulate evacuation of millions of people over a large geographic area. However, such advances in evacuation modeling and simulations demand computational capacity beyond the desktop scales and can be supported by high performance computing platforms. This paper explores the motivation and feasibility of ultra-scale computing for increasing the speed of high resolution emergency evacuation simulations.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Definitions. 23.701... DRUG-FREE WORKPLACE Contracting for Environmentally Preferable Products and Services 23.701 Definitions. As used in this subpart— Computer monitor means a video display unit used with a computer. Desktop...
Lock It Up! Computer Security.
ERIC Educational Resources Information Center
Wodarz, Nan
1997-01-01
The data contained on desktop computer systems and networks pose security issues for virtually every district. Sensitive information can be protected by educating users, altering the physical layout, using password protection, designating access levels, backing up data, reformatting floppy disks, using antivirus software, and installing encryption…
Desktop chaotic systems: Intuition and visualization
NASA Technical Reports Server (NTRS)
Bright, Michelle M.; Melcher, Kevin J.; Qammar, Helen K.; Hartley, Tom T.
1993-01-01
This paper presents a dynamic study of the Wildwood Pendulum, a commercially available desktop system which exhibits a strange attractor. The purpose of studying this chaotic pendulum is twofold: to gain insight into the paradigmatic approach of modeling, simulating, and determining chaos in nonlinear systems; and to provide a desktop model of chaos as a visual tool. For this study, the nonlinear behavior of this chaotic pendulum is modeled, a computer simulation is performed, and experimental performance is measured. An assessment of the pendulum in the phase plane shows the strange attractor. Through the use of a box-assisted correlation dimension methodology, the attractor dimension is determined for both the model and the experimental pendulum systems. Correlation dimension results indicate that the pendulum and the model are chaotic and their fractal dimensions are similar.
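The correlation-dimension step can be illustrated compactly. The sketch below is the naive Grassberger-Procaccia estimator rather than the box-assisted variant the paper uses, applied to 1-D data for brevity; points sampled uniformly on a line segment should yield a dimension near 1:

```python
# Naive correlation-dimension estimate: the slope of log C(r) vs. log r,
# where C(r) is the fraction of point pairs closer than r.
import math
import random

def correlation_sum(points, r):
    """C(r): fraction of distinct point pairs closer than r
    (Grassberger-Procaccia correlation integral, 1-D for the demo)."""
    n, close = len(points), 0
    for i in range(n):
        for j in range(i + 1, n):
            if abs(points[i] - points[j]) < r:
                close += 1
    return 2.0 * close / (n * (n - 1))

def correlation_dimension(points, r1, r2):
    """Estimate the attractor dimension from two radii."""
    c1, c2 = correlation_sum(points, r1), correlation_sum(points, r2)
    return (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))

# Uniform points on a line: the estimated dimension should be close to 1.
rng = random.Random(0)
line = [rng.random() for _ in range(400)]
d = correlation_dimension(line, 0.01, 0.1)
```

For a real pendulum time series one would embed the data in a higher-dimensional phase space first; box-assisted bookkeeping then avoids the O(N²) pair scan used here.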
ERIC Educational Resources Information Center
Green, Kenneth C.
This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…
ERIC Educational Resources Information Center
Porec, Carol J.
1989-01-01
Describes how "The Children's Writing and Publishing Center" (a desktop publishing program for elementary students) combines word processing with computer graphics and motivates students to write letters. (MM)
ERIC Educational Resources Information Center
Brink, Dan
1987-01-01
Reviews the current state of printing software and printing hardware compatibility and capacity. Discusses the changing relationship between author and publisher resulting from the advent of desktop publishing. (LMO)
A Librarian Without Books:Systems Librarianship in Astronomy
NASA Astrophysics Data System (ADS)
Kneale, R. A.
2007-10-01
The author discusses one aspect of the changing nature of librarianship by focusing on a high-tech microcosm of an already high-tech profession, that of systems librarianship. She is the Systems Librarian for the Advanced Technology Solar Telescope (ATST) project, based in Tucson, Arizona. The project is engaged in the design and development of a 4-meter solar telescope, planned for the summit of Haleakalā, Maui, Hawai'i. Most of the day-to-day tasks at ATST involve software in one form or another; the author makes heavy use of Remote Desktop and Virtual Network Computing (VNC) to manage installations on eight different servers (four Windows, four Unix) in two states, plus staff desktops (Windows XP) from the comfy chair in front of her computer.
Collaborative visual analytics of radio surveys in the Big Data era
NASA Astrophysics Data System (ADS)
Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.
2017-06-01
Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualisation environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered by a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, and parallel data query, along with data management. Its modularity makes it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform, allowing the research process to continue wherever you are.
Suzuki, Keishiro; Hirasawa, Yukinori; Yaegashi, Yuji; Miyamoto, Hideki; Shirato, Hiroki
2009-01-01
We developed a web-based, remote radiation treatment planning system which allowed staff at an affiliated hospital to obtain support from a fully staffed central institution. Network security was based on a firewall and a virtual private network (VPN). Client computers were installed at a cancer centre, at a university hospital and at a staff home. We remotely operated the treatment planning computer using the Remote Desktop function built in to the Windows operating system. Except for the initial setup of the VPN router, no special knowledge was needed to operate the remote radiation treatment planning system. There was a time lag that seemed to depend on the volume of data traffic on the Internet, but it did not affect smooth operation. The initial cost and running cost of the system were reasonable.
LTCP 2D Graphical User Interface. Application Description and User's Guide
NASA Technical Reports Server (NTRS)
Ball, Robert; Navaz, Homayun K.
1996-01-01
A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) 2-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.
7 CFR 2.98 - Director, Management Services.
Code of Federal Regulations, 2011 CFR
2011-01-01
... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...
7 CFR 2.98 - Director, Management Services.
Code of Federal Regulations, 2013 CFR
2013-01-01
... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...
7 CFR 2.98 - Director, Management Services.
Code of Federal Regulations, 2012 CFR
2012-01-01
... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...
Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N
2017-03-01
High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
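As a hedged sketch of the kind of build step such a Jenkins-CI job might script, the snippet below assembles a headless CellProfiler invocation. The -c/-r/-p flags follow CellProfiler's documented command line, but the pipeline and directory names are hypothetical placeholders:

```python
# Sketch: building a headless CellProfiler command of the sort a
# Jenkins-CI "Execute shell" step could run on the Linux cluster.
import shlex

def cellprofiler_command(pipeline, image_dir, out_dir):
    """Assemble a headless run: -c (no GUI), -r (run the pipeline),
    -p (pipeline file), with input/output directories via -i/-o."""
    return ["cellprofiler", "-c", "-r",
            "-p", pipeline, "-i", image_dir, "-o", out_dir]

# Hypothetical paths for illustration.
cmd = cellprofiler_command("features.cppipe", "plate01/images", "plate01/out")
command_line = " ".join(shlex.quote(part) for part in cmd)
```

Centralizing such commands in Jenkins-CI is what lets pipelines authored in the desktop CellProfiler client be versioned, shared, and re-run unattended against cluster resources.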
48 CFR 52.223-16 - Acquisition of EPEAT-Registered Personal Computer Products.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) and liquid crystal display (LCD). Desktop computer means a computer where the main unit is intended to... periods of time either with or without a direct connection to an AC power source. Notebooks must utilize... products that, at the time of submission of proposals and at the time of award, were EPEAT® bronze...
Using "Audacity" and One Classroom Computer to Experiment with Timbre
ERIC Educational Resources Information Center
Smith, Kenneth H.
2011-01-01
One computer, one class, and one educator can be an effective combination to engage students as a group in music composition, performance, and analysis. Having one desktop computer and a television monitor in the music classroom is not an uncommon or new scenario, especially in a time when many school budgets are being cut. This article…
Numerical Optimization Using Desktop Computers
1980-09-11
concentrating compound parabolic trough solar collector. Thermophysical, geophysical, optical and economic analyses were used to compute a life-cycle...third computer program, NISCO, was developed to model a nonimaging concentrating compound parabolic trough solar collector using thermophysical...concentrating compound parabolic trough Solar Collector. C. OBJECTIVE The objective of this thesis was to develop a system of interactive programs for the Hewlett
Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong
2017-01-01
Abstract Background: Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line–based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. Results: We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. Conclusions: As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. PMID:28327936
Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?
Billah, Syed Masum; Ashok, Vikas; Porter, Donald E.; Ramakrishnan, IV
2017-01-01
Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access—an early forerunner of true ubiquitous access—screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments. PMID:28782061
NASA Astrophysics Data System (ADS)
Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.
2013-12-01
A new software application called MAD# has been coupled with the HTCondor high-throughput computing system to aid scientists and educators with the characterization of spatial random fields and to enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open-source desktop software application used to characterize spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times, transferring information from the indirect data to the target variable. MAD# uses two parallelization profiles according to the computational resources available: a single computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers, submitting serial or parallel jobs using scheduling policies, resource monitoring, and a job queuing mechanism. This poster will show how MAD# reduces the execution time of the characterization of random fields using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the single-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (approximately 1,200 hours for all 10 million). In the HTCondor evaluation, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor reduced the processing time for uncertainty characterization by a factor of 20 (from 1,200 hours to 60 hours).
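The single-computer, multi-core profile described above amounts to fanning a forward model out over a worker pool. A minimal sketch, where a trivial quadratic stands in for an actual HYDRUS run (for a CPU-bound model one would swap in `ProcessPoolExecutor` to use all cores; threads keep the sketch portable):

```python
from concurrent.futures import ThreadPoolExecutor

def forward_model(params):
    """Toy stand-in for one forward simulation (e.g., a HYDRUS run)."""
    a, b = params
    return a * a + b  # placeholder likelihood-style score

# Parameter sets drawn for the inversion; real MAD runs use millions.
samples = [(i, i % 7) for i in range(1000)]

with ThreadPoolExecutor(max_workers=8) as pool:  # one worker per core
    scores = list(pool.map(forward_model, samples))

print(len(scores))
```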
A hierarchical network-based algorithm for multi-scale watershed delineation
NASA Astrophysics Data System (ADS)
Castronova, Anthony M.; Goodall, Jonathan L.
2014-11-01
Watershed delineation is a process for defining a land area that contributes surface water flow to a single outlet point. It is commonly used in water resources analysis to define the domain in which hydrologic process calculations are applied. There has been a growing effort over the past decade to improve surface elevation measurements in the U.S., which has had a significant impact on the accuracy of hydrologic calculations. Traditional watershed processing on these elevation rasters, however, becomes more burdensome as data resolution increases. As a result, processing of these datasets can be troublesome on standard desktop computers. This challenge has resulted in numerous works that aim to provide high-performance computing solutions to large data, high-resolution data, or both. This work proposes an efficient watershed delineation algorithm for use in desktop computing environments that leverages existing data, the U.S. Geological Survey (USGS) National Hydrography Dataset Plus (NHD+), and open-source software tools to construct watershed boundaries. This approach makes use of U.S. national-level hydrography data that has been precomputed using raster processing algorithms coupled with quality-control routines. Our approach uses carefully arranged data and mathematical graph theory to traverse river networks and identify catchment boundaries. We demonstrate this new watershed delineation technique, compare its accuracy with traditional algorithms that derive watersheds solely from digital elevation models, and then extend our approach to address subwatershed delineation. Our findings suggest that the open-source hierarchical network-based delineation procedure presented in this work is a promising approach to watershed delineation that can be used to summarize publicly available datasets for hydrologic model input pre-processing.
Through our analysis, we explore the benefits of reusing the NHD+ datasets for watershed delineation, and find that our technique offers greater flexibility and extensibility than traditional raster algorithms.
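The network-traversal idea can be sketched as a breadth-first walk upstream over a flowline graph, accumulating the precomputed local catchments. The reach IDs and areas below are invented, not NHD+ records:

```python
from collections import deque

# Hypothetical flow network: downstream reach -> upstream reaches.
upstream = {
    "outlet": ["r1", "r2"],
    "r1": ["r3"],
    "r2": [],
    "r3": [],
}
# Precomputed local catchment area (km^2) per reach, as in NHD+.
area = {"outlet": 5.0, "r1": 3.0, "r2": 2.0, "r3": 1.5}

def delineate(outlet):
    """Collect every reach draining to `outlet` and sum catchment areas."""
    seen, queue = {outlet}, deque([outlet])
    while queue:
        reach = queue.popleft()
        for up in upstream.get(reach, []):
            if up not in seen:
                seen.add(up)
                queue.append(up)
    return seen, sum(area[r] for r in seen)

reaches, total = delineate("outlet")
print(sorted(reaches), total)
```

Because the heavy raster work was done once, nationally, each delineation is just this cheap graph traversal plus a union of precomputed catchment polygons.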
Technology in Education: Research Says!!
ERIC Educational Resources Information Center
Canuel, Ron
2011-01-01
A large amount of research existed in the field of technology in the classroom; however, almost all was focused on the impact of desktop computers and the infamous "school computer room". However, the activities in a classroom represent a multitude of behaviours and interventions, including personal dynamics, classroom management and…
USDA-ARS?s Scientific Manuscript database
Service oriented architectures allow modelling engines to be hosted over the Internet abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on user's personal computers (PCs). Migration ...
Distributing Data from Desktop to Hand-Held Computers
NASA Technical Reports Server (NTRS)
Elmore, Jason L.
2005-01-01
A system of server and client software formats and redistributes data from commercially available desktop computers to commercially available hand-held computers via both wired and wireless networks. This software is an inexpensive means of enabling engineers and technicians to gain access to current sensor data while working in locations in which such data would otherwise be inaccessible. The sensor data are first gathered by a data-acquisition server computer, then transmitted via a wired network to a data-distribution computer that executes the server portion of the present software. Data in all sensor channels -- both raw sensor outputs in millivolt units and results of conversion to engineering units -- are made available for distribution. Selected subsets of the data are transmitted to each hand-held computer via the wired and then a wireless network. The selection of the subsets and the choice of the sequences and formats for displaying the data are made by means of a user interface generated by the client portion of the software. The data displayed on the screens of hand-held units can be updated at rates from 1 to
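The per-client channel selection and millivolt-to-engineering-unit conversion described above can be sketched as follows; the channel names and linear calibration constants are illustrative assumptions:

```python
# Raw sensor outputs in millivolts, keyed by channel name (illustrative).
raw_mv = {"tank_pressure": 812.0, "pump_temp": 455.0, "flow_rate": 120.0}

# Linear calibrations: engineering value = gain * millivolts + offset.
calibration = {
    "tank_pressure": (0.5, 0.0),   # -> psi
    "pump_temp": (0.2, -20.0),     # -> deg C
    "flow_rate": (0.1, 0.0),       # -> L/s
}

def subset_for_client(channels):
    """Return raw and converted readings for the channels a hand-held user selected."""
    out = {}
    for ch in channels:
        gain, offset = calibration[ch]
        out[ch] = {"raw_mv": raw_mv[ch], "eng": gain * raw_mv[ch] + offset}
    return out

view = subset_for_client(["tank_pressure", "pump_temp"])
print(view)
```

Each hand-held client would request only its selected subset, keeping wireless traffic small.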
"Software Tools" to Improve Student Writing.
ERIC Educational Resources Information Center
Oates, Rita Haugh
1987-01-01
Reviews several software packages that analyze text readability, check for spelling and style problems, offer desktop publishing capabilities, teach interviewing skills, and teach grammar using a computer game. (SRT)
Desktop system for accounting, audit, and research in A&E.
Taylor, C J; Brain, S G; Bull, F; Crosby, A C; Ferguson, D G
1997-01-01
The development of a database for audit, research, and accounting in accident and emergency (A&E) is described. The system uses a desktop computer, an optical scanner, sophisticated optical mark reader software, and workload management data. The system is highly flexible, easy to use, and at a cost of around 16,000 pounds affordable for larger departments wishing to move towards accounting. For smaller departments, it may be an alternative to full computerisation. PMID:9132200
Bringing the medical library to the office desktop.
Brown, S R; Decker, G; Pletzke, C J
1991-01-01
This demonstration illustrates LRC Remote Computer Services, a dual operating system, multi-protocol system for delivering medical library services to the medical professional's desktop. A working model draws resources from CD-ROM and magnetic media file services, Novell and AppleTalk network protocol suites and gating, LAN and asynchronous (dial-in) access strategies, commercial applications for MS-DOS and Macintosh workstations, and custom user interfaces. The demonstration includes a discussion of issues relevant to the delivery of these services, particularly with respect to maintenance, security, training/support, staffing, software licensing, and costs.
Stress echocardiography with smartphone: real-time remote reading for regional wall motion.
Scali, Maria Chiara; de Azevedo Bellagamba, Clarissa Carmona; Ciampi, Quirino; Simova, Iana; de Castro E Silva Pretto, José Luis; Djordjevic-Dikic, Ana; Dodi, Claudio; Cortigiani, Lauro; Zagatina, Angela; Trambaiolo, Paolo; Torres, Marco R; Citro, Rodolfo; Colonna, Paolo; Paterni, Marco; Picano, Eugenio
2017-11-01
The diffusion of smartphones offers access to the best remote expertise in stress echo (SE). To evaluate the reliability of SE based on smartphone filming and reading, a set of 20 SE video-clips were read in random sequence with a multiple-choice six-answer test by ten readers from five different countries (Italy, Brazil, Serbia, Bulgaria, Russia) of the "SE2020" study network. The gold standard to assess accuracy was a core-lab expert reader in agreement with angiographic verification (0 = wrong, 1 = right). The same set of 20 SE studies were read, in random order and >2 months apart, on a desktop workstation and via smartphones by ten remote readers. Image quality was graded from 1 = poor but readable, to 3 = excellent. Kappa (κ) statistics were used to assess intra- and inter-observer agreement. The image quality was comparable in desktop workstation vs. smartphone (2.0 ± 0.5 vs. 2.4 ± 0.7, p = NS). The average reading time per case was similar for desktop versus smartphone (90 ± 39 vs. 82 ± 54 s, p = NS). The overall diagnostic accuracy of the ten readers was similar for desktop workstation vs. smartphone (84 vs. 91%, p = NS). Intra-observer agreement (desktop vs. smartphone) was good (κ = 0.81 ± 0.14). Inter-observer agreement was good and similar via desktop or smartphone (κ = 0.69 vs. κ = 0.72, p = NS). The diagnostic accuracy and consistency of SE reading among certified readers was high and similar via desktop workstation or via smartphone.
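Observer agreement of the kind reported above is commonly computed as Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with made-up binary ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over the same items."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    # Chance agreement: product of each rater's marginal frequencies.
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical readings (1 = ischemic response, 0 = normal).
desktop = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
phone   = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(desktop, phone), 2))
```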
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole-genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
SkinScan©: A PORTABLE LIBRARY FOR MELANOMA DETECTION ON HANDHELD DEVICES
Wadhawan, Tarun; Situ, Ning; Lancaster, Keith; Yuan, Xiaojing; Zouridakis, George
2011-01-01
We have developed a portable library for automated detection of melanoma termed SkinScan© that can be used on smartphones and other handheld devices. Compared to desktop computers, embedded processors have limited processing speed, memory, and power, but they have the advantage of portability and low cost. In this study we explored the feasibility of running a sophisticated application for automated skin cancer detection on an Apple iPhone 4. Our results demonstrate that the proposed library with the advanced image processing and analysis algorithms has excellent performance on handheld and desktop computers. Therefore, deployment of smartphones as screening devices for skin cancer and other skin diseases can have a significant impact on health care delivery in underserved and remote areas. PMID:21892382
The social computing room: a multi-purpose collaborative visualization environment
NASA Astrophysics Data System (ADS)
Borland, David; Conway, Michael; Coposky, Jason; Ginn, Warren; Idaszak, Ray
2010-01-01
The Social Computing Room (SCR) is a novel collaborative visualization environment for viewing and interacting with large amounts of visual data. The SCR consists of a square room with 12 projectors (3 per wall) used to display a single 360-degree desktop environment that provides a large physical real estate for arranging visual information. The SCR was designed to be cost-effective, collaborative, configurable, widely applicable, and approachable for naive users. Because the SCR displays a single desktop, a wide range of applications is easily supported, making it possible for a variety of disciplines to take advantage of the room. We provide a technical overview of the room and highlight its application to scientific visualization, arts and humanities projects, research group meetings, and virtual worlds, among other uses.
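Driving twelve projectors as one 360-degree desktop amounts to mapping a global pixel coordinate onto a wall, a projector, and a local coordinate. A sketch, assuming a uniform 1024-pixel-wide tile per projector (the actual SCR resolution is not stated in the abstract):

```python
TILE_W = 1024   # horizontal resolution of one projector tile (assumed)
PER_WALL = 3    # projectors per wall, as described
WALLS = 4       # square room

def locate(x):
    """Map a global desktop x-coordinate to (wall, projector, local_x)."""
    if not 0 <= x < TILE_W * PER_WALL * WALLS:
        raise ValueError("outside the 360-degree desktop")
    tile, local_x = divmod(x, TILE_W)
    wall, projector = divmod(tile, PER_WALL)
    return wall, projector, local_x

print(locate(0))      # first pixel: wall 0, projector 0
print(locate(5000))   # lands partway around the room
```

Because the room presents a single desktop, unmodified applications simply render into this one wide coordinate space.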
GREEN SUPERCOMPUTING IN A DESKTOP BOX
DOE Office of Scientific and Technical Information (OSTI.GOV)
HSU, CHUNG-HSING; FENG, WU-CHUN; CHING, AVERY
2007-01-17
The computer workstation, introduced by Sun Microsystems in 1982, was the tool of choice for scientists and engineers as an interactive computing environment for the development of scientific codes. However, by the mid-1990s, the performance of workstations began to lag behind high-end commodity PCs. This, coupled with the disappearance of BSD-based operating systems in workstations and the emergence of Linux as an open-source operating system for PCs, arguably led to the demise of the workstation as we knew it. Around the same time, computational scientists started to leverage PCs running Linux to create a commodity-based (Beowulf) cluster that provided dedicated computer cycles, i.e., supercomputing for the rest of us, as a cost-effective alternative to large supercomputers, i.e., supercomputing for the few. However, as the cluster movement has matured, with respect to cluster hardware and open-source software, these clusters have become much more like their large-scale supercomputing brethren - a shared (and power-hungry) datacenter resource that must reside in a machine-cooled room in order to operate properly. Consequently, the above observations, when coupled with the ever-increasing performance gap between the PC and cluster supercomputer, provide the motivation for a 'green' desktop supercomputer - a turnkey solution that provides an interactive and parallel computing environment with the approximate form factor of a Sun SPARCstation 1 'pizza box' workstation. In this paper, the authors present the hardware and software architecture of such a solution as well as its prowess as a developmental platform for parallel codes. In short, imagine a 12-node personal desktop supercomputer that achieves 14 Gflops on Linpack but sips only 185 watts of power at load, resulting in a performance-power ratio that is over 300% better than their reference SMP platform.
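The quoted efficiency can be checked with a two-line calculation from the figures in the abstract (14 Gflops on Linpack at 185 W of load power):

```python
# Green desktop supercomputer figures from the abstract.
gflops, watts = 14.0, 185.0
perf_per_watt = gflops * 1e9 / watts  # flops per watt
print(f"{perf_per_watt / 1e6:.1f} Mflops/W")
```

This works out to roughly 75.7 Mflops/W; the reference SMP platform's figure is not given in the abstract, only the claim that it is over 300% worse.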
A Survey of Students Participating in a Computer-Assisted Education Programme
ERIC Educational Resources Information Center
Yel, Elif Binboga; Korhan, Orhan
2015-01-01
This paper mainly examines anthropometric data, data regarding the habits, experiences, and attitudes of the students about their tablet/laptop/desktop computer use, in addition to self-reported musculoskeletal discomfort levels and frequencies of students participating in a tablet-assisted interactive education programme. A two-part questionnaire…
ERIC Educational Resources Information Center
Chester, Ivan
2007-01-01
CAD (Computer Aided Design) has now become an integral part of Technology Education. The recent introduction of highly sophisticated, low-cost CAD software and CAM hardware capable of running on desktop computers has accelerated this trend. There is now quite widespread introduction of solid modeling CAD software into secondary schools but how…
The Human-Computer Interaction of Cross-Cultural Gaming Strategy
ERIC Educational Resources Information Center
Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander
2015-01-01
This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are then discussed. Following this, the…
Interpretation of Coronary Angiograms Recorded Using Google Glass: A Comparative Analysis.
Duong, Thao; Wosik, Jedrek; Christakopoulos, Georgios E; Martínez Parachini, José Roberto; Karatasakis, Aris; Tarar, Muhammad Nauman Javed; Resendes, Erica; Rangan, Bavana V; Roesle, Michele; Grodin, Jerrold; Abdullah, Shuaib M; Banerjee, Subhash; Brilakis, Emmanouil S
2015-10-01
Google Glass (Google, Inc) is a voice-activated, hands-free, optical head-mounted display device capable of taking pictures, recording videos, and transmitting data via wi-fi. In the present study, we examined the accuracy of coronary angiogram interpretation, recorded using Google Glass. Google Glass was used to record 15 angiograms with 17 major findings and the participants were asked to interpret those recordings on: (1) an iPad (Apple, Inc); or (2) a desktop computer. Interpretation was compared with the original angiograms viewed on a desktop. Ten physicians (2 interventional cardiologists and 8 cardiology fellows) participated. One point was assigned for each correct finding, for a maximum of 17 points. The mean angiogram interpretation score for Google Glass angiogram recordings viewed on an iPad or a desktop vs the original angiograms viewed on a desktop was 14.9 ± 1.1, 15.2 ± 1.8, and 15.9 ± 1.1, respectively (P=.06 between the iPad and the original angiograms, P=.51 between the iPad and recordings viewed on a desktop, and P=.43 between the recordings viewed on a desktop and the original angiograms). In a post-study survey, one of the 10 physicians (10%) was "neutral" with the quality of the recordings using Google Glass, 6 physicians (60%) were "somewhat satisfied," and 3 physicians (30%) were "very satisfied." This small pilot study suggests that the quality of coronary angiogram video recordings obtained using Google Glass may be adequate for recognition of major findings, supporting its expanding use in telemedicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brun, B.
1997-07-01
Computer technology has improved tremendously in recent years, with larger media capacity, more memory, and more computational power. Visual computing, with high-performance graphics interfaces and desktop computational power, has changed the way engineers accomplish everyday tasks, development work, and safety study analyses. The emergence of parallel computing will permit simulation over a larger domain. In addition, new development methods, languages, and tools have appeared in the last several years.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., electronic tapes and back-up tapes, optical discs, CD-ROMS, and DVDs), and voicemail records; (2) Where the information is stored or located, including network servers, desktop or laptop computers and handheld...
A framework for interactive visualization of digital medical images.
Koehring, Andrew; Foo, Jung Leng; Miyano, Go; Lobe, Thom; Winer, Eliot
2008-10-01
The visualization of medical images obtained from scanning techniques such as computed tomography and magnetic resonance imaging is a well-researched field. However, advanced tools and methods to manipulate these data for surgical planning and other tasks have not seen widespread use among medical professionals. Radiologists have begun using more advanced visualization packages on desktop computer systems, but most physicians continue to work with basic two-dimensional grayscale images or do not work directly with the data at all. In addition, new display technologies that are in use in other fields have yet to be fully applied in medicine. It is our estimation that usability is the key aspect keeping this new technology from being more widely used by the medical community at large. Therefore, we have developed a software and hardware framework that not only makes use of advanced visualization techniques but also features powerful, yet simple-to-use, interfaces. A virtual reality system was created to display volume-rendered medical models in three dimensions. It was designed to run in many configurations, from a large cluster of machines powering a multiwalled display down to a single desktop computer. An augmented reality system was also created for, literally, hands-on interaction when viewing models of medical data. Last, a desktop application was designed to provide a simple visualization tool, which can be run on nearly any computer at a user's disposal. This research is directed toward improving the capabilities of medical professionals in the tasks of preoperative planning, surgical training, diagnostic assistance, and patient education.
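Volume-rendered views of scan data can be illustrated, at toy scale, by a maximum-intensity projection: for each screen pixel, keep the brightest voxel along the viewing ray. The 3×2×2 volume below is fabricated:

```python
# Tiny fabricated volume: volume[z][y][x] holds a scalar intensity.
volume = [
    [[10, 40], [5, 0]],
    [[80, 20], [5, 90]],
    [[30, 60], [15, 7]],
]

def max_intensity_projection(vol):
    """Project along z: each output pixel keeps the brightest voxel above it."""
    depth, rows, cols = len(vol), len(vol[0]), len(vol[0][0])
    return [[max(vol[z][y][x] for z in range(depth)) for x in range(cols)]
            for y in range(rows)]

print(max_intensity_projection(volume))
```

Real systems perform this sampling per ray on the GPU, with opacity transfer functions rather than a plain maximum, but the data flow is the same.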
Comparison of bias analysis strategies applied to a large data set.
Lash, Timothy L; Abrams, Barbara; Bodnar, Lisa M
2014-07-01
Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames.
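The record-weighting strategy (option 3 above) can be sketched by collapsing duplicate covariate patterns into one row plus a frequency weight, so the bias-analysis replications operate on far fewer rows. The tiny cohort below is invented, not the study's data:

```python
from collections import Counter

# Invented cohort: (bmi_category, early_preterm) per birth record.
records = [
    ("under", 1), ("under", 0), ("normal", 0), ("normal", 0),
    ("normal", 1), ("under", 1), ("over", 0), ("over", 1),
    ("normal", 0), ("under", 0),
]

# Collapse to unique covariate patterns with frequency weights.
weighted = Counter(records)

def weighted_risk(category):
    """Risk of early preterm birth in a category, from the weighted table."""
    events = sum(w for (cat, y), w in weighted.items() if cat == category and y == 1)
    total = sum(w for (cat, y), w in weighted.items() if cat == category)
    return events / total

print(len(records), "records collapsed to", len(weighted), "weighted rows")
print(round(weighted_risk("under"), 2))
```

With 773,625 births but only a modest number of distinct covariate patterns, this collapse is what cuts the analysis from terabytes to a few hundred gigabytes.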
ERIC Educational Resources Information Center
Bozzone, Meg A.
1997-01-01
Purchasing custom-made desks with durable glass tops to house computers and double as student work space solved the problem of how to squeeze in additional classroom computers at Johnson Park Elementary School in Princeton, New Jersey. This article describes a K-5 grade school's efforts to overcome barriers to integrating technology. (PEN)
Smith, W.K.
1982-01-01
The mathematical method of determining in-situ stresses by overcoring, using either the U.S. Bureau of Mines Borehole Deformation Gage or the Commonwealth Scientific and Industrial Research Organisation Hollow Inclusion Stress Cell, is summarized, and data reduction programs for each type of instrument, written in BASIC, are presented. The BASIC programs offer several advantages over previously available FORTRAN programs. They can be executed on a desk-top microcomputer at or near the field site, allowing the investigator to assess the quality of the data and make decisions on the need for additional testing while the crew is still in the field. Also, data input is much simpler than with currently available FORTRAN programs; either English or SI units can be used; and standard deviations of the principal stresses are computed as well as those of the geographic components.
Cazzaniga, Paolo; Nobile, Marco S.; Besozzi, Daniela; Bellini, Matteo; Mauri, Giancarlo
2014-01-01
The introduction of general-purpose Graphics Processing Units (GPUs) is boosting scientific applications in Bioinformatics, Systems Biology, and Computational Biology. In these fields, the use of high-performance computing solutions is motivated by the need of performing large numbers of in silico analysis to study the behavior of biological systems in different conditions, which necessitate a computing power that usually overtakes the capability of standard desktop computers. In this work we present coagSODA, a CUDA-powered computational tool that was purposely developed for the analysis of a large mechanistic model of the blood coagulation cascade (BCC), defined according to both mass-action kinetics and Hill functions. coagSODA allows the execution of parallel simulations of the dynamics of the BCC by automatically deriving the system of ordinary differential equations and then exploiting the numerical integration algorithm LSODA. We present the biological results achieved with a massive exploration of perturbed conditions of the BCC, carried out with one-dimensional and bi-dimensional parameter sweep analysis, and show that GPU-accelerated parallel simulations of this model can increase the computational performances up to a 181× speedup compared to the corresponding sequential simulations. PMID:25025072
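The mass-action part of such a model reduces to a system of ordinary differential equations. A minimal fixed-step Euler sketch for the toy reaction A + B → C (not the actual multi-species BCC model, which coagSODA integrates with LSODA):

```python
def simulate(k=0.5, dt=0.001, steps=2000):
    """Euler integration of A + B -> C with mass-action rate k*[A]*[B]."""
    a, b, c = 1.0, 0.8, 0.0   # initial concentrations (arbitrary units)
    for _ in range(steps):
        rate = k * a * b
        a -= rate * dt
        b -= rate * dt
        c += rate * dt
    return a, b, c

a, b, c = simulate()
print(round(a, 3), round(b, 3), round(c, 3))
```

Production tools use stiff-aware integrators such as LSODA rather than fixed-step Euler; the GPU speedup in the abstract comes from running many such integrations, one per parameter set, in parallel.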
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
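Telecommunication link analyses of this kind reduce to a budget in decibels: transmitted power plus antenna gains minus path loss. A sketch using the standard free-space path loss formula, with invented link parameters (these are not MMTAT's models or defaults):

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def received_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, distance_m, freq_hz):
    """Received power = transmit power + gains - path loss (other losses omitted)."""
    return (tx_dbm + tx_gain_dbi + rx_gain_dbi
            - free_space_path_loss_db(distance_m, freq_hz))

# Invented X-band link: 20 W transmitter (43 dBm), 400,000 km range.
p_rx = received_power_dbm(43.0, 35.0, 68.0, 4.0e8, 8.4e9)
print(round(p_rx, 1), "dBm")
```

Sweeping `distance_m` or `freq_hz` and plotting `p_rx` reproduces the kind of parameter-variation graphs the abstract describes.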
Scalable computing for evolutionary genomics.
Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert
2012-01-01
Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers, in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel, running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster, and pipeline, in a few steps. This allows researchers to scale-up computations from their desktop, using available hardware, anytime it is required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages, of interest to evolutionary biology, are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software. 
Where Debian Med encourages packaging free and open source bioinformatics software through one central project, BioNode encourages creating free and open source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the Cloud. Next to the downloadable BioNode images, we provide tutorials online, which empower bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives, on creating and building such images.
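The "poor man's parallelization" described above, running whole programs as independent processes under a scheduler, can be sketched in a few lines. This is an illustrative sketch only; the commands are placeholders, not BioNode specifics.

```python
# Minimal sketch of process-level parallelization: each job is a whole
# program run as a separate OS process, and a small worker pool plays the
# role of a cluster job scheduler. The echo commands are placeholders.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_job(cmd):
    """Run one whole program as a separate process; return its exit code."""
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).returncode

# Each entry stands in for an independent analysis, e.g. one alignment per file.
jobs = [f"echo processing sample_{i}.fasta" for i in range(4)]

# Two worker slots keep two processes busy at a time, like a queue with 2 slots.
with ThreadPoolExecutor(max_workers=2) as pool:
    codes = list(pool.map(run_job, jobs))

print(codes)  # [0, 0, 0, 0] when every job succeeds
```

Because each job is a self-contained process, legacy software needs no modification, which is exactly the appeal noted in the abstract.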
Streamlined, Inexpensive 3D Printing of the Brain and Skull.
Naftulin, Jason S; Kimchi, Eyal Y; Cash, Sydney S
2015-01-01
Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional (3D) data that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient-specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3-4 in consumable plastic filament as described, and the total process takes 14-17 hours, almost all of which is unsupervised (preprocessing = 4-6 hr; printing = 9-11 hr; post-processing = <30 min). Printing a matching portion of a skull costs $1-5 in consumable plastic filament and takes less than 14 hr in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient-specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes.
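The STL stage of the pipeline above can be illustrated with a minimal ASCII STL writer. This is a hedged sketch: real brain surfaces come from segmentation of DICOM volumes (e.g. via marching cubes), and the single triangle here is purely illustrative.

```python
# Minimal sketch: serializing a triangle mesh to the ASCII STL format,
# the intermediate file format named in the abstract. The triangle list
# is a placeholder for a segmented brain surface.

def write_ascii_stl(triangles, name="brain"):
    """Serialize triangles (each a tuple of 3 vertices of 3 floats) to ASCII STL text."""
    lines = [f"solid {name}"]
    for v0, v1, v2 in triangles:
        # Many slicers recompute normals, so a zero normal is acceptable here.
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in (v0, v1, v2):
            lines.append(f"      vertex {x:.6e} {y:.6e} {z:.6e}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

tri = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
stl_text = write_ascii_stl(tri)
print(stl_text.splitlines()[0])  # solid brain
```

The resulting text, written to a `.stl` file, is what a slicer converts to gcode for the printer.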
JavaScript Access to DICOM Network and Objects in Web Browser.
Drnasin, Ivan; Grgić, Mislav; Gogić, Goran
2017-10-01
The Digital Imaging and Communications in Medicine (DICOM) 3.0 standard provides the baseline for picture archiving and communication systems (PACS). The development of the Internet and various communication media created demand for non-DICOM access to PACS systems. Ever-increasing use of web browsers, laptops, and handheld devices, as opposed to desktop applications on static organizational computers, led to the development of different web technologies, which the DICOM standards body subsequently accepted as alternative access tools. This paper provides an overview of the current state of web access technology for DICOM repositories. It presents a different approach: using the HTML5 features of web browsers through the JavaScript language and the WebSocket protocol to enable real-time communication with DICOM repositories. A JavaScript DICOM network library, a DICOM-to-WebSocket proxy, and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.
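Any client that talks to DICOM repositories, browser-based or not, must recognize DICOM Part 10 files, which begin with a 128-byte preamble followed by the magic bytes "DICM". A browser-side JavaScript client would perform the same check on an ArrayBuffer; Python is used here only for brevity.

```python
# Sketch of the DICOM Part 10 file signature check: 128-byte preamble,
# then the four magic bytes b"DICM". The fake_dicom bytes are a
# constructed placeholder, not a real DICOM object.

def looks_like_dicom(data: bytes) -> bool:
    """True if the byte stream carries the DICOM Part 10 file signature."""
    return len(data) >= 132 and data[128:132] == b"DICM"

fake_dicom = bytes(128) + b"DICM" + b"\x02\x00\x00\x00"
print(looks_like_dicom(fake_dicom))   # True
print(looks_like_dicom(b"GIF89a"))    # False
```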
A Streaming Language Implementation of the Discontinuous Galerkin Method
NASA Technical Reports Server (NTRS)
Barth, Timothy; Knight, Timothy
2005-01-01
We present a Brook streaming language implementation of the 3-D discontinuous Galerkin method for compressible fluid flow on tetrahedral meshes. Efficient implementation of the discontinuous Galerkin method using the streaming model of computation introduces several algorithmic design challenges. Using a cycle-accurate simulator, performance characteristics have been obtained for the Stanford Merrimac stream processor. The current Merrimac design achieves 128 Gflops per chip and the desktop board is populated with 16 chips yielding a peak performance of 2 Teraflops. Total parts cost for the desktop board is less than $20K. Current cycle-accurate simulations for discretizations of the 3-D compressible flow equations yield approximately 40-50% of the peak performance of the Merrimac streaming processor chip. Ongoing work includes the assessment of the performance of the same algorithm on the 2 Teraflop desktop board with a target goal of achieving 1 Teraflop performance.
Utilization of KSC Present Broadband Communications Data System for Digital Video Services
NASA Technical Reports Server (NTRS)
Andrawis, Alfred S.
2002-01-01
This report covers a feasibility study of utilizing the present KSC broadband communications data system (BCDS) for digital video services. Digital video services include compressed digital TV delivery and video-on-demand. Furthermore, the study examines the possibility of providing interactive video on demand to desktop personal computers via the KSC computer network.
ERIC Educational Resources Information Center
Peng, Hsinyi; Chou, Chien; Chang, Chun-Yu
2008-01-01
Computing devices and applications are now used beyond the desktop, in diverse environments, and this trend toward ubiquitous computing is evolving. In this study, we re-visit the interactivity concept and its applications for interactive function design in a ubiquitous-learning system (ULS). Further, we compare interactivity dimensions and…
Taking the Plunge: Districts Leap into Virtualization
ERIC Educational Resources Information Center
Demski, Jennifer
2010-01-01
Moving from a traditional desktop computing environment to a virtualized solution is a daunting task. In this article, the author presents case histories of three districts that have made the conversion to virtual computing to learn about their experiences: What prompted them to make the move, and what were their objectives? Which obstacles prove…
Computer assisted yarding cost analysis.
Ronald W. Mifflin
1980-01-01
Programs for a programmable calculator and a desk-top computer are provided for quickly determining yarding cost and comparing the economics of alternative yarding systems. The programs emphasize the importance of the relationship between production rate and machine rate, which is the hourly cost of owning and operating yarding equipment. In addition to generating the...
Desktop Social Science: Coming of Age.
ERIC Educational Resources Information Center
Dwyer, David C.; And Others
Beginning in 1985, Apple Computer, Inc. and several school districts began a collaboration to examine the impact of intensive computer use on instruction and learning in K-12 classrooms. This paper follows the development of a Macintosh II-based management and retrieval system for text data undertaken to store and retrieve oral reflections of…
Refocusing the Vision: The Future of Instructional Technology
ERIC Educational Resources Information Center
Pence, Harry E.; McIntosh, Steven
2011-01-01
Two decades ago, many campuses mobilized a major effort to deal with a clear problem: faculty and students needed access to desktop computing technologies. Now the situation is much more complex. Responding to the current challenges, like mobile computing and social networking, will be more difficult but equally important. There is a clear need for…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-12
...., desktop, laptop, handheld or other computer types) containing protected personal identifiers or PHI is... as the National Indian Women's Resource Center, to conduct analytical and evaluation studies. 8... SYSTEM: STORAGE: File folders, ledgers, card files, microfiche, microfilm, computer tapes, disk packs...
Taib, Mohd Firdaus Mohd; Bahn, Sangwoo; Yun, Myung Hwan
2016-06-27
Given the popularity of mobile computing products, it is crucial to evaluate their contribution to musculoskeletal disorders during computer use in both comfortable and stressful environments. This study explores how using different computer products, combined with tasks designed to induce psychosocial stress, affects muscle activity. Fourteen male subjects performed sixteen combinations of computer tasks: four different computer products crossed with four different stress-inducing tasks. Electromyography of four muscles in the forearm, shoulder, and neck regions was recorded, along with task performance. The increase in trapezius muscle activity depended on the stress-inducing task, with higher stress producing a greater increase; this relationship was not found in the other three muscles. Compared with desktop and laptop use, the lowest activity for all muscles occurred during the use of a tablet or smartphone. The best net performance was obtained in a comfortable environment; under stressful conditions, however, the best performance came from the device the user was most comfortable or experienced with. Both the computer product and the level of stress strongly influence muscle activity during computer work, and both factors must be taken into account to reduce the occurrence of musculoskeletal disorders.
Multi-Attribute Task Battery - Applications in pilot workload and strategic behavior research
NASA Technical Reports Server (NTRS)
Arnegard, Ruth J.; Comstock, J. R., Jr.
1991-01-01
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
The multi-attribute task battery for human operator workload and strategic behavior research
NASA Technical Reports Server (NTRS)
Comstock, J. Raymond, Jr.; Arnegard, Ruth J.
1992-01-01
The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.
DataPlus - a revolutionary applications generator for DOS hand-held computers
David Dean; Linda Dean
2000-01-01
DataPlus allows the user to easily design data collection templates for DOS-based hand-held computers that mimic clipboard data sheets. The user designs and tests the application on the desktop PC and then transfers it to a DOS field computer. Other features include: error checking, missing data checks, and sensor input from RS-232 devices such as bar code wands,...
2-D Animation's Not Just for Mickey Mouse.
ERIC Educational Resources Information Center
Weinman, Lynda
1995-01-01
Discusses characteristics of two-dimensional (2-D) animation; highlights include character animation, painting issues, and motion graphics. Sidebars present Silicon Graphics animations tools and 2-D animation programs for the desktop computer. (DGM)
Gobba, Fabriziomaria
2017-01-01
The aim of this research is to study the symptoms and use of computers/mobile phones of individuals nearing retirement age (≥55 years). A questionnaire was sent to 15,000 Finns (aged 18–65). People who were ≥55 years of age were compared to the rest of the population. Six thousand one hundred and twenty-one persons responded to the questionnaire; 1226 of them were ≥55 years of age. Twenty-four percent of the ≥55-year-old respondents used desktop computers daily for leisure; 47.8% of them frequently experienced symptoms in the neck, and 38.5% in the shoulders. Workers aged ≥55 years had many more physical symptoms than younger people, except with respect to symptoms of the neck. Female daily occupational users of desktop computers had more physical symptoms in the neck. It is essential to take into account that, for people aged ≥55 years, the use of technology can be a sign of wellness. However, physical symptoms in the neck can be associated with the use of computers. PMID:28991182
2011-01-01
Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lower the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105
MICA: desktop software for comprehensive searching of DNA databases
Stokes, William A; Glick, Benjamin S
2006-01-01
Background Molecular biologists work with DNA databases that often include entire genomes. A common requirement is to search a DNA database to find exact matches for a nondegenerate or partially degenerate query. The software programs available for such purposes are normally designed to run on remote servers, but an appealing alternative is to work with DNA databases stored on local computers. We describe a desktop software program termed MICA (K-Mer Indexing with Compact Arrays) that allows large DNA databases to be searched efficiently using very little memory. Results MICA rapidly indexes a DNA database. On a Macintosh G5 computer, the complete human genome could be indexed in about 5 minutes. The indexing algorithm recognizes all 15 characters of the DNA alphabet and fully captures the information in any DNA sequence, yet for a typical sequence of length L, the index occupies only about 2L bytes. The index can be searched to return a complete list of exact matches for a nondegenerate or partially degenerate query of any length. A typical search of a long DNA sequence involves reading only a small fraction of the index into memory. As a result, searches are fast even when the available RAM is limited. Conclusion MICA is suitable as a search engine for desktop DNA analysis software. PMID:17018144
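The seed-and-verify idea behind k-mer indexing can be illustrated with a simplified sketch. This is in the spirit of MICA but without its compact-array encoding or degenerate-base support: every k-mer position is recorded in a dictionary, and a query is confirmed by extending the seed hit.

```python
# Simplified illustration of k-mer indexing for exact search: build a
# map from each k-mer to its positions, then verify candidate hits by
# extension. Sequences here are toy placeholders.
from collections import defaultdict

def build_index(seq, k=4):
    """Map every k-mer in seq to the list of positions where it starts."""
    index = defaultdict(list)
    for i in range(len(seq) - k + 1):
        index[seq[i:i + k]].append(i)
    return index

def find_exact(seq, index, query, k=4):
    """Return all start positions where query (length >= k) occurs exactly."""
    seed = query[:k]
    hits = []
    for pos in index.get(seed, []):
        if seq[pos:pos + len(query)] == query:  # extend and verify the seed hit
            hits.append(pos)
    return hits

genome = "ACGTACGTGACGT"
idx = build_index(genome)
print(find_exact(genome, idx, "ACGTG"))  # [4]
```

MICA's contribution is doing this at genome scale with roughly 2L bytes of index for a sequence of length L; the dictionary above trades that compactness for clarity.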
Johnson, Timothy C.; Versteeg, Roelof J.; Ward, Andy; Day-Lewis, Frederick D.; Revil, André
2010-01-01
Electrical geophysical methods have found wide use in the growing discipline of hydrogeophysics for characterizing the electrical properties of the subsurface and for monitoring subsurface processes in terms of the spatiotemporal changes in subsurface conductivity, chargeability, and source currents they govern. Presently, multichannel and multielectrode data collections systems can collect large data sets in relatively short periods of time. Practitioners, however, often are unable to fully utilize these large data sets and the information they contain because of standard desktop-computer processing limitations. These limitations can be addressed by utilizing the storage and processing capabilities of parallel computing environments. We have developed a parallel distributed-memory forward and inverse modeling algorithm for analyzing resistivity and time-domain induced polar-ization (IP) data. The primary components of the parallel computations include distributed computation of the pole solutions in forward mode, distributed storage and computation of the Jacobian matrix in inverse mode, and parallel execution of the inverse equation solver. We have tested the corresponding parallel code in three efforts: (1) resistivity characterization of the Hanford 300 Area Integrated Field Research Challenge site in Hanford, Washington, U.S.A., (2) resistivity characterization of a volcanic island in the southern Tyrrhenian Sea in Italy, and (3) resistivity and IP monitoring of biostimulation at a Superfund site in Brandywine, Maryland, U.S.A. Inverse analysis of each of these data sets would be limited or impossible in a standard serial computing environment, which underscores the need for parallel high-performance computing to fully utilize the potential of electrical geophysical methods in hydrogeophysical applications.
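The distributed Jacobian computation mentioned above can be sketched by assigning row blocks to workers. This is illustrative only: the paper's code uses distributed-memory parallelism on a cluster, while a thread pool on one machine merely shows the decomposition, and the toy linear forward operator stands in for a resistivity simulation.

```python
# Sketch: computing finite-difference Jacobian row blocks in parallel.
# forward() is a toy linear stand-in for a resistivity forward model.
from concurrent.futures import ThreadPoolExecutor

def forward(model):
    """Toy forward operator: datum i = sum_j model[j] * (i + j + 1)."""
    return [sum(m * (i + j + 1) for j, m in enumerate(model)) for i in range(3)]

def jacobian_row_block(args):
    """Finite-difference sensitivities for a block of data rows."""
    model, rows, eps = args
    base = forward(model)
    block = []
    for i in rows:
        block.append([
            (forward([m + (eps if k == j else 0.0)
                      for k, m in enumerate(model)])[i] - base[i]) / eps
            for j in range(len(model))
        ])
    return rows, block

model = [1.0, 2.0]
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(jacobian_row_block,
                            [(model, [0, 1], 1e-6), (model, [2], 1e-6)]))

J = [None] * 3
for rows, block in results:
    for r, row in zip(rows, block):
        J[r] = row
print(J)  # entry (i, j) is approximately i + j + 1 for this linear model
```

Partitioning by data rows is natural because each row's sensitivities are independent, which is what makes the Jacobian storage and computation distributable in the first place.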
Using the Power of Media to Communicate Science: A Question of Style?
ERIC Educational Resources Information Center
Imhof, Heidi
1991-01-01
Discusses educational effects of the style, content, and quality inherent in several multimedia and desktop-publishing products available to science teachers, including books, interactive software, videos, and computer simulations. (JJK)
... used to track you on all kinds of internet-connected devices that have browsers, such as smart phones, tablets, laptop and desktop computers. How does tracking in mobile apps occur? When you access mobile applications, companies don’t have access to ...
Greenhouse Gas Emissions Model (GEM) for Medium- and Heavy-Duty Vehicle Compliance
EPA’s Greenhouse Gas Emissions Model (GEM) is a free, desktop computer application that estimates the greenhouse gas (GHG) emissions and fuel efficiency performance of specific aspects of heavy-duty vehicles.
Demonstrate provider accessibility with desktop and online services.
2001-10-01
It's available on personal computers via CD or through Internet access. Instantly assess the accessibility of your provider network, or identify the most promising areas to establish a health service, with new GIS tools.
Model driven development of clinical information systems using openEHR.
Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, Jim
2011-01-01
openEHR and the recent international standard (ISO 13606) defined a model-driven software development methodology for health information systems. However, there is little evidence in the literature describing implementations, especially for desktop clinical applications. This paper presents an implementation pathway using .Net/C# technology for Microsoft Windows desktop platforms. An endoscopy reporting application driven by openEHR Archetypes and Templates has been developed. A set of novel GUI directives is defined and presented that guides the automatic graphical user interface generator to render widgets properly. We also reveal the development steps and important design decisions, from modelling to the final software product. This might provide guidance for other developers and form the evidence required for the adoption of these standards by vendors and national programs alike.
A web interface for easy flexible protein-protein docking with ATTRACT.
de Vries, Sjoerd J; Schindler, Christina E M; Chauvot de Beauchêne, Isaure; Zacharias, Martin
2015-02-03
Protein-protein docking programs can give valuable insights into the structure of protein complexes in the absence of an experimental complex structure. Web interfaces can facilitate the use of docking programs by structural biologists. Here, we present an easy web interface for protein-protein docking with the ATTRACT program. While aimed at nonexpert users, the web interface still covers a considerable range of docking applications. The web interface supports systematic rigid-body protein docking with the ATTRACT coarse-grained force field, as well as various kinds of protein flexibility. The execution of a docking protocol takes up to a few hours on a standard desktop computer. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Pest management in Douglas-fir seed orchards: a microcomputer decision method
James B. Hoy; Michael I. Haverty
1988-01-01
The computer program described provides a Douglas-fir seed orchard manager (user) with a quantitative method for making insect pest management decisions on a desk-top computer. The decision system uses site-specific information such as estimates of seed crop size, insect attack rates, insecticide efficacy and application costs, weather, and crop value. At sites where...
Computerized preparation of a scientific poster.
Lugo, M; Speaker, M G; Cohen, E J
1989-01-01
We prepared an attractive and effective poster using a Macintosh computer with LaserWriter and ImageWriter II printers. The advantages of preparing the poster in this fashion were increased control of the final product, decreased cost, and a sense of artistic satisfaction. Although we employed only the above-mentioned computer, the desktop publishing techniques described can be used with other systems.
Detecting Target Data in Network Traffic
2017-03-01
MS in Computer Science thesis, Naval Postgraduate School, March 2017. Approved by: Michael McCarrin, Thesis Co-Advisor; Robert Beverly, Thesis Co-Advisor; Peter Denning, Chair, Department of Computer Science. ABSTRACT: Data exfiltration over a network poses a threat...phone. Further, Guri et al. were able to use these GSM frequencies to obtain information from a desktop computer by manipulating memory to produce GSM
Desktop document delivery using portable document format (PDF) files and the Web.
Shipman, J P; Gembala, W L; Reeder, J M; Zick, B A; Rainwater, M J
1998-01-01
Desktop access to electronic full-text literature was rated one of the most desirable services in a client survey conducted by the University of Washington Libraries. The University of Washington Health Sciences Libraries (UW HSL) conducted a ten-month pilot test from August 1996 to May 1997 to determine the feasibility of delivering electronic journal articles via the Internet to remote faculty. Articles were scanned into Adobe Acrobat Portable Document Format (PDF) files and delivered to individuals using Multipurpose Internet Mail Extensions (MIME) standard e-mail attachments and the Web. Participants retrieved scanned articles and used the Adobe Acrobat Reader software to view and print files. The pilot test required a special programming effort to automate the client notification and file deletion processes. Test participants were satisfied with the pilot test despite some technical difficulties. Desktop delivery is now offered as a routine delivery method from the UW HSL. PMID:9681165
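The MIME-attachment delivery route described above can be sketched with the Python standard library's email package. The addresses and the PDF payload are placeholders; the UW HSL pilot predates this API and used its own tooling.

```python
# Sketch: building a standard MIME e-mail message with a PDF attachment,
# the delivery mechanism described in the abstract. All values are
# illustrative placeholders.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "library@example.edu"
msg["To"] = "faculty@example.edu"
msg["Subject"] = "Requested article (PDF)"
msg.set_content("The scanned article you requested is attached.")

pdf_bytes = b"%PDF-1.4 minimal placeholder"  # stand-in for a scanned article
msg.add_attachment(pdf_bytes, maintype="application", subtype="pdf",
                   filename="article.pdf")

# Adding an attachment upgrades the message to multipart/mixed.
print(msg.get_content_type())
```

Handing the resulting message to any SMTP relay (e.g. via `smtplib`) completes the desktop delivery loop, with the recipient's PDF reader taking the place of Acrobat Reader in the pilot.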
National Mobile Inventory Model (NMIM)
The National Mobile Inventory Model (NMIM) is a free, desktop computer application developed by EPA to help you develop estimates of current and future emission inventories for on-road motor vehicles and nonroad equipment. To learn more, search the archive.
Lumped Parameter Model (LPM) for Light-Duty Vehicles
EPA’s Lumped Parameter Model (LPM) is a free, desktop computer application that estimates the effectiveness (CO2 Reduction) of various technology combinations or “packages,” in a manner that accounts for synergies between technologies.
Centrifuge: rapid and sensitive classification of metagenomic sequences
Song, Li; Breitwieser, Florian P.
2016-01-01
Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. PMID:27852649
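The Burrows-Wheeler transform at the heart of Centrifuge's FM-index can be shown with a toy construction. Real implementations build the BWT via suffix arrays in compressed space; sorting full rotations, as below, is only suitable for illustration.

```python
# Toy construction of the Burrows-Wheeler transform (BWT), the basis of
# FM-index classifiers like Centrifuge. For a text of length n this
# rotation sort is O(n^2 log n): fine for a demo, not for genomes.

def bwt(text):
    """Return the BWT: the last column of the sorted cyclic rotations."""
    assert text.endswith("$"), "append a unique sentinel first"
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(row[-1] for row in rotations)

print(bwt("banana$"))  # annb$aa
```

The transform groups identical characters into runs, which is what lets the FM-index answer substring queries over a compressed representation, the property that keeps Centrifuge's index small enough for a desktop computer.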
Los Alamos radiation transport code system on desktop computing platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.
The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.
Parallel Processing of Images in Mobile Devices using BOINC
NASA Astrophysics Data System (ADS)
Curiel, Mariela; Calle, David F.; Santamaría, Alfredo S.; Suarez, David F.; Flórez, Leonardo
2018-04-01
Medical image processing helps health professionals make decisions for the diagnosis and treatment of patients. Since some algorithms for processing images require substantial amounts of resources, one could take advantage of distributed or parallel computing. A mobile grid can be an adequate computing infrastructure for this problem. A mobile grid is a grid that includes mobile devices as resource providers. In a previous step of this research, we selected BOINC as the infrastructure to build our mobile grid. However, parallel processing of images on mobile devices poses at least two important challenges: the execution of standard libraries for processing images and obtaining adequate performance when compared to desktop computer grids. By the time we started our research, the use of BOINC in mobile devices also involved two issues: a) the execution of programs on mobile devices required modifying the code to insert calls to the BOINC API, and b) the division of the image among the mobile devices, as well as its merging, required additional code in some BOINC components. This article presents answers to these four challenges.
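The work-division issue the abstract raises, splitting an image among devices and merging the results, can be sketched independently of BOINC. Plain nested lists stand in for real image buffers; the per-pixel operation is a placeholder for actual medical image processing.

```python
# Sketch of data decomposition for distributed image processing: split an
# image into contiguous row bands, process each band independently (as a
# separate device would), then merge. Values are illustrative.

def split_rows(image, n):
    """Split image (a list of rows) into at most n contiguous row bands."""
    step = (len(image) + n - 1) // n  # ceiling division
    return [image[i:i + step] for i in range(0, len(image), step)]

def merge_rows(bands):
    """Concatenate row bands back into one image."""
    return [row for band in bands for row in band]

image = [[r * 10 + c for c in range(4)] for r in range(6)]
bands = split_rows(image, 3)
processed = [[[px + 1 for px in row] for row in band] for band in bands]  # per-device work
restored = merge_rows(processed)
print(len(bands), len(restored))  # 3 6
```

Row bands need no halo exchange for pointwise operations; filters with spatial support would additionally require overlapping the band boundaries, which is part of why the merging code in BOINC components was nontrivial.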
Lemaire, Edward; Greene, G
2003-01-01
We produced continuing education material in physical rehabilitation using a variety of electronic media. We compared four methods of delivering the learning modules: in person with a computer projector, desktop videoconferencing, Web pages and CD-ROM. Health-care workers at eight community hospitals and two nursing homes were asked to participate in the project. A total of 394 questionnaires were received for all modalities: 73 for in-person sessions, 50 for desktop conferencing, 227 for Web pages and 44 for CD-ROM. This represents a 100% response rate from the in-person, desktop conferencing and CD-ROM groups; the response rate for the Web group is unknown, since the questionnaires were completed online. Almost all participants found the modules to be helpful in their work. The CD-ROM group gave significantly higher ratings than the Web page group, although all four learning modalities received high ratings. A combination of all four modalities would be required to provide the best possible learning opportunity.
Fabrication of low cost soft tissue prostheses with the desktop 3D printer
NASA Astrophysics Data System (ADS)
He, Yong; Xue, Guang-Huai; Fu, Jian-Zhong
2014-11-01
Soft tissue prostheses such as artificial ears, eyes and noses are widely used in maxillofacial rehabilitation. In this report we demonstrate how to fabricate soft prosthesis molds with a low-cost desktop 3D printer. The fabrication method is referred to as Scanning Printing Polishing Casting (SPPC). First, the anatomy is scanned with a 3D scanner; then a tissue casting mold is designed on a computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold, removing the staircase effect and producing a smooth surface. Finally, medical-grade silicone is cast into the mold. After the silicone is cured, the finished soft prosthesis can be removed from the mold. Using the SPPC method, soft prostheses with smooth surfaces and complicated structures can be fabricated at low cost. Accordingly, the total cost of fabricating an ear prosthesis is about $30, which is much lower than with current soft prosthesis fabrication methods.
Cybersickness and desktop simulations: field of view effects and user experience
NASA Astrophysics Data System (ADS)
Toet, Alexander; de Vries, Sjoerd C.; van Emmerik, Martijn L.; Bos, Jelte E.
2008-04-01
We used a desktop computer game environment to study the effect of field of view (FOV) on cybersickness. In particular, we examined the effect of differences between the internal FOV (iFOV, the FOV which the graphics generator uses to render its images) and the external FOV (eFOV, the FOV of the presented images as seen from the physical viewpoint of the observer). Somewhat counter-intuitively, we find that congruent iFOVs and eFOVs lead to a higher incidence of cybersickness. A possible explanation is that the incongruent conditions were too extreme, thereby reducing the experience of vection. We also studied the user experience (appraisal) of this virtual environment as a function of the degree of cybersickness. We find that cybersick participants experience the simulated environment as less pleasant and more arousing, and possibly also as more distressing. Our present findings have serious implications for desktop simulations used in both military and civilian training, instruction and planning applications.
Open source OCR framework using mobile devices
NASA Astrophysics Data System (ADS)
Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan
2008-02-01
Mobile phones have evolved from passive one-to-one communication devices into powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet and much more. Exciting new social applications are emerging on the mobile landscape, such as business card readers, sign detectors and translators. These applications help people quickly capture information in digital form and interpret it without needing to carry laptops or tablet PCs. However, despite all these advancements, very little open source software is available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community, including the popular open source OCR engine Tesseract for text detection and recognition and the Flite speech synthesis module for adding text-to-speech ability.
Architectures for single-chip image computing
NASA Astrophysics Data System (ADS)
Gove, Robert J.
1992-04-01
This paper will focus on the architectures of VLSI programmable processing components for image computing applications. TI, the maker of industry-leading RISC, DSP, and graphics components, has developed an architecture for a new-generation of image processors capable of implementing a plurality of image, graphics, video, and audio computing functions. We will show that the use of a single-chip heterogeneous MIMD parallel architecture best suits this class of processors--those which will dominate the desktop multimedia, document imaging, computer graphics, and visualization systems of this decade.
Practical advantages of evolutionary computation
NASA Astrophysics Data System (ADS)
Fogel, David B.
1997-10-01
Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
Hands in space: gesture interaction with augmented-reality interfaces.
Billinghurst, Mark; Piumsomboon, Tham; Huidong Bai
2014-01-01
Researchers at the Human Interface Technology Laboratory New Zealand (HIT Lab NZ) are investigating free-hand gestures for natural interaction with augmented-reality interfaces. They've applied the results to systems for desktop computers and mobile devices.
3 CFR 13589 - Executive Order 13589 of November 9, 2011. Promoting Efficient Spending
Code of Federal Regulations, 2012 CFR
2012-01-01
... more aggressive steps to ensure the Government is a good steward of taxpayer money. Sec. 2. Agency... the number of IT devices (e.g., mobile phones, smartphones, desktop and laptop computers, and tablet...
Computerized Fortune Cookies--a Classroom Treat.
ERIC Educational Resources Information Center
Reissman, Rose
1996-01-01
Discusses the use of fortune cookie fortunes in a middle school class combined with computer graphics, desktop publishing, and word processing technology to create writing assignments, games, and discussions. Topics include cultural contexts, and students creating their own fortunes. (LRW)
75 FR 20385 - Amended Certification Regarding Eligibility To Apply for Worker Adjustment Assistance
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-19
... Carolina. The notice will be published soon in the Federal Register. At the request of the State Agency... production of desktop computers. The company reports that workers leased from Staffing Solutions, South East...
ERIC Educational Resources Information Center
National Science Foundation, Washington, DC.
This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…
Use an Interactive Whiteboard: Get a Handle on How This Technology Can Spice up the Classroom
ERIC Educational Resources Information Center
Branzburg, Jeffrey
2006-01-01
Interactive whiteboards are desirable peripherals these days. When hooked up to a computer, the whiteboard's screen becomes a "live" computer desktop, which can be tapped to pull down menus, highlight, and move or open files. Users can also circle relevant sections on the projected image, draw geometric figures, and underline. Then they can save…
Information Technology: A Road to the Future? To Promote Academic Justice and Excellence Series.
ERIC Educational Resources Information Center
Gilbert, Steven W.; Green, Kenneth C.
This publication is intended to provide college faculty and staff with a guide to information technology issues in higher education. Midway through the 1990s, higher education confronts the second phase of the information technology (IT) revolution, a shift in emphasis from the computer as a desktop tool to the computer as a communications…
WinHPC System User Basics | High-Performance Computing | NREL
Guidance for starting to use this high-performance computing (HPC) system at NREL; also see WinHPC policies. Log in with your NREL.gov username/password, and remember to log out when you are finished: simply quitting Remote Desktop will keep your session active and using resources on the login node.
Powering Down from the Bottom up: Greener Client Computing
ERIC Educational Resources Information Center
O'Donnell, Tom
2009-01-01
A decade ago, people wanting to practice "green computing" recycled their printer paper, turned their personal desktop systems off from time to time, and tried their best to donate old equipment to a nonprofit instead of throwing it away. A campus IT department can shave a few watts off just about any IT process--the real trick is planning and…
Kodak's Photo CD and Proposed Photo YCC Color Standard.
ERIC Educational Resources Information Center
Urrows, Henry; Urrows, Elizabeth
1991-01-01
Describes new technology being developed by Eastman Kodak for storing 35mm color photos on compact disk (CD) and discusses its applications for desktop publishing. Benefits of photo CD and costs are examined, a proposed universal color standard that is an improved way to represent color digitally is explained, and software is discussed. (LRW)
NASA Astrophysics Data System (ADS)
Santagati, C.; Inzerillo, L.; Di Paola, F.
2013-07-01
3D reconstruction from images has undergone a revolution in the last few years. Computer vision techniques use photographs from a data set collection to rapidly build detailed 3D models. The simultaneous application of different algorithms (MVS) and different techniques of image matching, feature extraction and mesh optimization is an active field of research in computer vision. The results are promising: the obtained models are beginning to challenge the precision of laser-based reconstructions. Among the available options we can mainly distinguish desktop and web-based packages. The latter offer the opportunity to exploit the power of cloud computing to carry out semi-automatic data processing, allowing the user to fulfill other tasks on the local computer, whereas desktop systems demand long processing times and heavyweight workflows. Computer vision researchers have explored many applications to verify the visual accuracy of 3D models, but approaches to verify metric accuracy are few, and none has addressed Autodesk 123D Catch applied to architectural heritage documentation. Our approach to this challenging problem is to compare 3D models produced by Autodesk 123D Catch with 3D models produced by terrestrial LIDAR, considering objects of different size, from details (capitals, moldings, bases) to large-scale buildings, for practitioners' purposes.
A desktop 3D printer with dual extruders to produce customised electronic circuitry
NASA Astrophysics Data System (ADS)
Butt, Javaid; Onimowo, Dominic Adaoiza; Gohrabian, Mohammed; Sharma, Tinku; Shirvani, Hassan
2018-03-01
3D printing has opened new horizons for the manufacturing industry in general, and 3D printers have become tools for technological advancement. There is a huge divide between the pricing of industrial and desktop 3D printers, with the former being on the expensive side, capable of producing excellent-quality products, and the latter being on the low-cost side, with moderate-quality results. However, there is more room for improvements and enhancements in desktop systems than in industrial ones. In this paper, a desktop 3D printer called Prusa Mendel i2 has been modified and integrated with an additional extruder so that the system can work with dual extruders and produce bespoke electronic circuits. The communication between the two extruders has been established by making use of the In-Circuit Serial Programming (ICSP) port on the Arduino Uno controlling the printer. The biggest challenge is to control the flow of electric paint (to be dispensed by the new extruder), and CFD (Computational Fluid Dynamics) analysis has been carried out to ascertain the optimal conditions for proper dispensing. The final product is a customised electronic circuit with a base of plastic (from the 3D printer's extruder) and electric paint (from the additional extruder) properly dispensed to create a live circuit on a plastic platform. This low-cost enhancement to a desktop 3D printer can provide a new prospect for producing multiple-material parts, where the additional extruder can be filled with any material that can be properly dispensed from its nozzle.
GBOOST: a GPU-based tool for detecting gene-gene interactions in genome-wide case control studies.
Yung, Ling Sing; Yang, Can; Wan, Xiang; Yu, Weichuan
2011-05-01
Collecting millions of genetic variations is feasible with advanced genotyping technology. With a huge amount of genetic variation data in hand, developing efficient algorithms to carry out gene-gene interaction analysis in a timely manner has become one of the key problems in genome-wide association studies (GWAS). Boolean operation-based screening and testing (BOOST), a recent work in GWAS, completes gene-gene interaction analysis in 2.5 days on a desktop computer. Compared with central processing units (CPUs), graphics processing units (GPUs) are highly parallel hardware and provide massive computing resources. We are, therefore, motivated to use GPUs to further speed up the analysis of gene-gene interactions. We implement the BOOST method on a GPU framework and name it GBOOST. GBOOST achieves a 40-fold speedup compared with BOOST. It completes the analysis of the Wellcome Trust Case Control Consortium Type 2 Diabetes (WTCCC T2D) genome data within 1.34 h on a desktop computer equipped with an Nvidia GeForce GTX 285 display card. GBOOST code is available at http://bioinformatics.ust.hk/BOOST.html#GBOOST.
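The Boolean screening idea behind BOOST and GBOOST can be illustrated with a small sketch. The encoding and function names below are illustrative assumptions, not the published code: each SNP's genotypes are stored as three bit masks (one per genotype value 0/1/2), and the 3×3 joint-genotype contingency table needed by the interaction test is filled with a bitwise AND plus a population count, an operation that maps naturally onto parallel hardware.

```python
# Illustrative sketch (not the BOOST/GBOOST source) of Boolean
# genotype encoding and contingency-table construction.

def encode(genotypes):
    """Return three integer bit masks, one per genotype value (0, 1, 2)."""
    masks = [0, 0, 0]
    for i, g in enumerate(genotypes):
        masks[g] |= 1 << i          # set bit i in the mask for genotype g
    return masks

def contingency(masks_a, masks_b):
    """3x3 table of joint genotype counts via bitwise AND + popcount."""
    return [[bin(ma & mb).count("1") for mb in masks_b] for ma in masks_a]

snp1 = [0, 1, 2, 1, 0, 2, 1, 0]     # genotypes of 8 individuals at SNP 1
snp2 = [1, 1, 0, 2, 0, 2, 1, 1]     # genotypes of the same 8 at SNP 2
table = contingency(encode(snp1), encode(snp2))
```

On a GPU, each thread can evaluate one SNP pair this way, which is the kind of massively parallel screening the abstract describes.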
NASA Astrophysics Data System (ADS)
Pérez, Israel; Ángel Hernández Cuevas, José; Trinidad Elizalde Galindo, José
2018-05-01
We designed and developed a desktop AC susceptometer for the characterization of materials. The system consists of a lock-in amplifier, an AC function generator, a couple of coils, a sample holder, a computer system with a designed software in freeware C++ code, and an Arduino card coupled to a Bluetooth module. The Arduino/Bluetooth serial interface allows the user to have a connection to almost any computer and thus avoids the problem of connectivity between the computer and the peripherals, such as the lock-in amplifier and the function generator. The Bluetooth transmitter/receiver used is a commercial device which is robust and fast. These new features reduce the size and increase the versatility of the susceptometer, for it can be used with a simple laptop. To test our instrument, we performed measurements on magnetic materials and show that the system is reliable at both room temperature and cryogenic temperatures (77 K). The instrument is suitable for any physics or engineering laboratory either for research or academic purposes.
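The role of the lock-in amplifier in such a susceptometer can be sketched digitally. The numbers and the function below are hypothetical, not the authors' instrument software: multiplying the measured waveform by in-phase and quadrature references at the excitation frequency and averaging recovers the amplitude and phase of a weak component, even in the presence of a much larger interfering signal.

```python
# Digital analogue of lock-in detection (illustrative only): project the
# signal onto cos/sin references at the excitation frequency and average.
import math

def lock_in(signal, f_ref, f_samp):
    n = len(signal)
    x = sum(s * math.cos(2 * math.pi * f_ref * i / f_samp)
            for i, s in enumerate(signal)) * 2 / n   # in-phase component
    y = sum(s * math.sin(2 * math.pi * f_ref * i / f_samp)
            for i, s in enumerate(signal)) * 2 / n   # quadrature component
    return math.hypot(x, y), math.atan2(y, x)        # amplitude, phase angle

f_samp, f_ref = 10000.0, 100.0
# Weak 100 Hz response (amplitude 0.05) buried under a large 437 Hz interferer.
sig = [0.05 * math.sin(2 * math.pi * f_ref * i / f_samp + 0.3)
       + 1.0 * math.sin(2 * math.pi * 437.0 * i / f_samp)
       for i in range(10000)]
amp, phase = lock_in(sig, f_ref, f_samp)
```

Because the interferer completes an integer number of cycles over the averaging window, it integrates away and the recovered amplitude is the weak component's 0.05.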
Lessons from a doctoral thesis.
Peiris, A N; Mueller, R A; Sheridan, D P
1990-01-01
The production of a doctoral thesis is a time-consuming affair that until recently was done in conjunction with professional publishing services. Advances in computer technology have made many sophisticated desktop publishing techniques available to the microcomputer user. We describe the computer method used, the problems encountered, and the solutions improvised in the production of a doctoral thesis by computer. The Apple Macintosh was selected for its ease of use and intrinsic graphics capabilities. A scanner was used to incorporate text from published papers into a word processing program. The body of the text was updated and supplemented with new sections. Scanned graphics from the published papers were less suitable for publication, and the original data were replotted and modified with a graphics-drawing program. Graphics were imported and incorporated in the text. Final hard copy was produced by a laser printer and bound with both conventional and rapid new binding techniques. Microcomputer-based desktop processing methods provide a rapid and cost-effective means of communicating the written word. We anticipate that this evolving technology will have increased use by physicians in both the private and academic sectors.
What caused the breach? An examination of use of information technology and health data breaches.
Wikina, Suanu Bliss
2014-01-01
Data breaches arising from theft, loss, unauthorized access/disclosure, improper disclosure, or hacking incidents involving personal health information continue to increase every year. As of September 2013, reported breaches affecting individuals reached close to 27 million since 2009, when compilation of records on breaches began. These breaches, which involved 674 covered entities and 153 business associates, involved computer systems and networks, desktop computers, laptops, paper, e-mail, electronic health records, and removable/portable devices (CDs, USBs, x-ray films, backup tapes, etc.). Even with the increased use of health information technology by health institutions and allied businesses, theft and loss (not hacking) constitute the major types of data breaches encountered. Removable/portable devices, desktop computers, and laptops were the top sources or locations of the breached information, while the top six states-Virginia, Illinois, California, Florida, New York, and Tennessee-in terms of the number of reported breaches accounted for nearly 75 percent of the total individual breaches, 33 percent of breaches in covered entities, and about 30 percent of the total breaches involving business associates.
Yang, Yiqun; Urban, Matthew W; McGough, Robert J
2018-05-15
Shear wave calculations induced by an acoustic radiation force are very time-consuming on desktop computers, and high-performance graphics processing units (GPUs) achieve dramatic reductions in the computation time for these simulations. The acoustic radiation force is calculated using the fast near-field method and the angular spectrum approach, and then the shear waves are calculated in parallel with Green's functions on a GPU. This combination enables rapid evaluation of shear waves for push beams with different spatial samplings and for apertures with different f/#. Relative to shear wave simulations that evaluate the same algorithm on an Intel i7 desktop computer, a high-performance nVidia GPU reduces the time required for these calculations by a factor of 45 and 700 when applied to elastic and viscoelastic shear wave simulation models, respectively. These GPU-accelerated simulations were also compared to measurements in different viscoelastic phantoms, and the results are similar. For parametric evaluations and for comparisons with measured shear wave data, shear wave simulations with the Green's function approach are ideally suited for high-performance GPUs.
Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories
NASA Technical Reports Server (NTRS)
Ng, Hok Kwan; Sridhar, Banavar
2016-01-01
This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each is compared to a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs, ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to assess the potential computational enhancement through parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm to further reduce computational time through algorithm modifications, and integrates it with FACET so that time-optimal routes between worldwide airport pairs in a wind field can be calculated for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations compare computational efficiency and consider the potential applications of the optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
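The core idea of wind-optimal routing can be reduced to a toy example. The graph, numbers, and function below are illustrative assumptions, not FACET's algorithm: waypoints become graph nodes, each edge costs the flight *time* along it (a tailwind raises ground speed and shortens the leg, a headwind does the opposite), and a shortest-path search then prefers a longer detour that rides a tailwind.

```python
# Toy wind-aware routing sketch: Dijkstra over waypoints where edge cost
# is flight time at ground speed = airspeed + signed tailwind component.
import heapq

def wind_optimal_route(edges, start, goal, airspeed):
    """edges: {node: [(neighbor, distance, tailwind), ...]}; returns (time, path)."""
    best = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        t, u = heapq.heappop(heap)
        if u == goal:
            break
        if t > best.get(u, float("inf")):
            continue                       # stale heap entry
        for v, dist, tailwind in edges.get(u, []):
            ground_speed = airspeed + tailwind  # negative tailwind = headwind
            nt = t + dist / ground_speed
            if nt < best.get(v, float("inf")):
                best[v] = nt
                prev[v] = u
                heapq.heappush(heap, (nt, v))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return best[goal], path[::-1]

# Direct leg fights a headwind; the detour rides a strong tailwind.
edges = {
    "A": [("B", 1000.0, -100.0), ("C", 600.0, 150.0)],
    "C": [("B", 600.0, 150.0)],
}
time, path = wind_optimal_route(edges, "A", "B", 500.0)
```

Here the 1200-unit detour via C takes about 1.85 time units against 2.5 for the direct leg, so the search routes around the headwind, which is the qualitative behavior wind-optimal trajectory generators exploit at scale.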
NASA Technical Reports Server (NTRS)
Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil
2008-01-01
We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].
Ganalyzer: A tool for automatic galaxy image analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-05-01
Ganalyzer is a model-based tool that automatically analyzes and classifies galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large datasets of galaxy images collected by autonomous sky surveys such as SDSS, LSST or DES.
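One ingredient of the analysis above, the radial intensity plot, can be sketched in a few lines. The ring-averaged form below is an assumption made for illustration rather than code taken from Ganalyzer: average the pixel intensity in thin rings around the galaxy center, so that bright structure at a given distance from the center shows up as a peak in the profile.

```python
# Minimal radial intensity profile sketch (illustrative, not Ganalyzer's code):
# mean pixel intensity in integer-radius rings around a given center.
import numpy as np

def radial_profile(image, center):
    cy, cx = center
    y, x = np.indices(image.shape)
    r = np.sqrt((y - cy) ** 2 + (x - cx) ** 2).astype(int)  # ring index per pixel
    total = np.bincount(r.ravel(), weights=image.ravel())
    count = np.bincount(r.ravel())
    return total / np.maximum(count, 1)                     # mean intensity per ring

# Synthetic test image: a bright ring of radius 10 around the center.
size = 64
yy, xx = np.indices((size, size))
rr = np.sqrt((yy - 32) ** 2 + (xx - 32) ** 2)
image = np.where(np.abs(rr - 10) < 1.5, 1.0, 0.0)
profile = radial_profile(image, (32, 32))
```

A spirality measurement would go further, computing such intensity plots as a function of angle and tracking how the peaks shift, but the profile above is the basic building block.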
FAST SIMULATION OF SOLID TUMORS THERMAL ABLATION TREATMENTS WITH A 3D REACTION DIFFUSION MODEL *
BERTACCINI, DANIELE; CALVETTI, DANIELA
2007-01-01
An efficient computational method for near real-time simulation of thermal ablation of tumors via radio frequencies is proposed. Model simulations of the temperature field in a 3D portion of tissue containing the tumoral mass, for different patterns of source heating, can be used to design the ablation procedure. The availability of a very efficient computational scheme makes it possible to update the predicted outcome of the procedure in real time. In the algorithms proposed here, a discretization in space of the governing equations is followed by an adaptive time integration based on implicit multistep formulas. A modification of the ode15s MATLAB function, which uses Krylov-subspace iterative methods for the solution of the linear systems arising at each integration step, makes it possible to perform the simulations on a standard desktop computer for much finer grids than with the built-in ode15s. The proposed algorithm can be applied to a wide class of nonlinear parabolic differential equations. PMID:17173888
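Why an implicit integrator like ode15s spends its time in linear solves can be seen in a one-dimensional miniature. The sketch below is a simplification assumed for illustration (1D linear diffusion with Dirichlet boundaries and a dense direct solve), not the authors' 3D reaction-diffusion model: each backward Euler step requires solving (I - dt*kappa*L) u_new = u, and it is exactly this system that Krylov iterative methods make affordable on fine grids.

```python
# One implicit (backward Euler) diffusion step: assemble the tridiagonal
# system (I - dt*kappa*L) u_new = u and solve it. Illustrative 1D sketch only.
import numpy as np

def backward_euler_step(u, dt, dx, kappa):
    """Advance 1D heat equation one step; Dirichlet (absorbing) boundaries."""
    n = len(u)
    r = kappa * dt / dx ** 2
    A = np.eye(n) * (1 + 2 * r)
    A += np.diag([-r] * (n - 1), 1) + np.diag([-r] * (n - 1), -1)
    return np.linalg.solve(A, u)        # the cost Krylov methods attack

n, kappa = 51, 1.0
dx, dt = 1.0 / (n + 1), 1e-3
u = np.zeros(n)
u[25] = 1.0                             # localized "heat source" snapshot
for _ in range(10):
    u = backward_euler_step(u, dt, dx, kappa)
```

The same structure carries over to 3D reaction-diffusion models, where the matrix is far larger and sparse, so a direct dense solve like the one above becomes impractical and iterative solvers take over.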
NASA Astrophysics Data System (ADS)
Herbrechtsmeier, Stefan; Witkowski, Ulf; Rückert, Ulrich
Mobile robots are becoming more and more important in current research and education. Small ’on the table’ experiments attract particular interest, because they need no additional or special laboratory equipment. In this context, platforms are desirable that are small, simple to access and relatively easy to program. A powerful additional information processing unit is advantageous, as it simplifies the implementation of algorithms and the porting of software from desktop computers to the robot platform. In this paper we present a new versatile miniature robot that is well suited for research and education. The small size of the robot, with an edge length of about 9 cm, its robust drive and its modular structure make it a general device for single- and multi-robot experiments executed ’on the table’. For programming and evaluation, the robot can be connected wirelessly via Bluetooth or WiFi. The operating system of the robot is based on the standard Linux kernel and the GNU C standard library. A Player/Stage model eases software development and testing.
Review of Collaborative Tools for Planning and Engineering
2007-10-01
including PDAs) and Operating Systems 1 In general, should support laptops, desktops, Windows OS, Mac OS, Palm OS, Windows CE, Blackberry , Sun...better), voting (to establish operating parameters), reactor design, wind tunnel simulation Display same material on every computer, synchronisation
Hyped Type: An Exercise in Creative Typography.
ERIC Educational Resources Information Center
Osterer, Irv
2001-01-01
Provides a history of typography and discusses the effects of technology. Describes an art project in which high school students designed contemporary typographic specimen sheets. Explains that the students created their own broadsheets using Macintosh computers and QuarkXPress desktop publishing. (CMK)
Report #10-P-0194, August 23, 2010. Although EPA indicated it could avoid spending more than $115.4 million over 8.5 years by consolidating the desktop computing environment, improved management practices are needed.
Connecting to HPC VPN | High-Performance Computing | NREL
Your username and password will match your NREL network account login/password. From OS X or Linux, open a terminal. Open a Remote Desktop connection using server name WINHPC02 (this is the login node).
ERIC Educational Resources Information Center
Jordan, Jim
1988-01-01
Summarizes how infographics are produced and how they provide information graphically in high school publications. Offers suggestions concerning information gathering, graphic format, and software selection, and provides examples of computer/student designed infographics. (MM)
Designing Communication and Learning Environments.
ERIC Educational Resources Information Center
Gayeski, Diane M., Ed.
Designing and remodeling educational facilities are becoming more complex with options that include computer-based collaboration, classrooms with multimedia podiums, conference centers, and workplaces with desktop communication systems. This book provides a collection of articles that address educational facility design categorized in the…
76 FR 32146 - Procurement List; Proposed Additions
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-03
... . SUPPLEMENTARY INFORMATION: This notice is published pursuant to 41 U.S.C. 47(a)(2) and 41 CFR 51-2.3. Its... Combination Lock. NSN: 5340-00-NIB-0099--Desktop & Peripherals Locking Kit, Standard. NPA: Alphapointe...
A Geospatial Information Grid Framework for Geological Survey.
Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong
2015-01-01
The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper. PMID:26710255
14- by 22-Foot Subsonic Tunnel Laser Velocimeter Upgrade
NASA Technical Reports Server (NTRS)
Meyers, James F.; Lee, Joseph W.; Cavone, Angelo A.; Fletcher, Mark T.
2012-01-01
A long-focal length laser velocimeter constructed in the early 1980s was upgraded using current technology to improve usability, reliability and future serviceability. The original free-space optics were replaced with a state-of-the-art fiber-optic subsystem, which allowed most of the optics, including the laser, to be remote from the harsh tunnel environment. General-purpose high-speed digitizers incorporated into a standard modular data acquisition system, together with custom signal-processing software executed on a desktop computer, served as the replacement for the original signal processors. The resulting system increased optical sensitivity with real-time signal/data processing that produced measurement precisions exceeding those of the original system. Monte Carlo simulations, along with laboratory and wind tunnel investigations, were used to determine system characteristics and measurement precision.
The Modernization of a Long-Focal Length Fringe-Type Laser Velocimeter
NASA Technical Reports Server (NTRS)
Meyers, James F.; Lee, Joseph W.; Cavone, Angelo A.; Fletcher, Mark T.
2012-01-01
A long-focal length laser velocimeter constructed in the early 1980s was upgraded using current technology to improve usability, reliability and future serviceability. The original free-space optics were replaced with a state-of-the-art fiber-optic subsystem, which allowed most of the optics, including the laser, to be remote from the harsh tunnel environment. General-purpose high-speed digitizers incorporated into a standard modular data acquisition system, together with custom signal-processing software executed on a desktop computer, served as the replacement for the original signal processors. The resulting system increased optical sensitivity with real-time signal/data processing that produced measurement precisions exceeding those of the original system. Monte Carlo simulations, along with laboratory and wind tunnel investigations, were used to determine system characteristics and measurement precision.
Cloud Based Electronic Health Record Applications are Essential to Expeditionary Patient Care
2017-05-01
security and privacy concerns). Privacy/Security Risks of Cloud Computing A quantitative study based on the preceding literature review...to medical IT wherever there is a Wi-Fi connection and a computing device (desktop, laptop, tablet, phone, etc.). In 2015 the DoD launched MiCare, a...Hosting Services: a Study on Students' Acceptance," Computers in Human Behavior, 2013. Takai, Teri. DoD CIO's 10-Point Plan for IT Modernization
Time synchronized video systems
NASA Technical Reports Server (NTRS)
Burnett, Ron
1994-01-01
The idea of synchronizing multiple video recordings to some type of 'range' time has been tried with varying degrees of success in the past. Combining this requirement with existing time code standards (SMPTE) and new innovations in desktop multimedia, however, has afforded an opportunity to increase the flexibility and usefulness of such efforts without adding costs over traditional data recording and reduction systems. The concept described can use IRIG, GPS or a battery-backed internal clock as the master time source. By converting that time source to Vertical Interval Time Code or Longitudinal Time Code, both in accordance with the SMPTE standards, the user will obtain a tape that contains machine/computer-readable time code suitable for use with editing equipment that is available off-the-shelf. Accuracy on playback is then determined by the playback system chosen by the user. Accuracies of +/- 2 frames are common among inexpensive systems, and complete frame accuracy is more a matter of the user's budget than the capability of the recording system.
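SMPTE time code labels each frame as HH:MM:SS:FF. A minimal non-drop-frame conversion from an absolute frame count (the helper name is ours; the drop-frame correction needed for 29.97 fps is deliberately omitted) might look like:

```python
def frames_to_timecode(frame_count, fps=30):
    """Convert an absolute frame count to a non-drop-frame SMPTE
    timecode string (HH:MM:SS:FF). Drop-frame 29.97 fps timecode
    requires extra frame-number adjustments not shown here."""
    ff = frame_count % fps                 # frame within the second
    total_seconds = frame_count // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"
```

At 30 fps, frame 1800 is exactly one minute, so `frames_to_timecode(1800)` yields "00:01:00:00".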
ERIC Educational Resources Information Center
Griffin, Irma Amado
This study describes a pilot program utilizing various multimedia computer programs on a MacQuadra 840 AV. The target group consisted of six advanced dance students who participated in the pilot program within the dance curriculum by creating a database of dance movement using video and still photography. The students combined desktop publishing,…
Computational adaptive optics for broadband optical interferometric tomography of biological tissue.
Adie, Steven G; Graf, Benedikt W; Ahmad, Adeel; Carney, P Scott; Boppart, Stephen A
2012-05-08
Aberrations in optical microscopy reduce image resolution and contrast, and can limit imaging depth when focusing into biological samples. Static correction of aberrations may be achieved through appropriate lens design, but this approach does not offer the flexibility of simultaneously correcting aberrations for all imaging depths, nor the adaptability to correct for sample-specific aberrations for high-quality tomographic optical imaging. Incorporation of adaptive optics (AO) methods have demonstrated considerable improvement in optical image contrast and resolution in noninterferometric microscopy techniques, as well as in optical coherence tomography. Here we present a method to correct aberrations in a tomogram rather than the beam of a broadband optical interferometry system. Based on Fourier optics principles, we correct aberrations of a virtual pupil using Zernike polynomials. When used in conjunction with the computed imaging method interferometric synthetic aperture microscopy, this computational AO enables object reconstruction (within the single scattering limit) with ideal focal-plane resolution at all depths. Tomographic reconstructions of tissue phantoms containing subresolution titanium-dioxide particles and of ex vivo rat lung tissue demonstrate aberration correction in datasets acquired with a highly astigmatic illumination beam. These results also demonstrate that imaging with an aberrated astigmatic beam provides the advantage of a more uniform depth-dependent signal compared to imaging with a standard gaussian beam. With further work, computational AO could enable the replacement of complicated and expensive optical hardware components with algorithms implemented on a standard desktop computer, making high-resolution 3D interferometric tomography accessible to a wider group of users and nonspecialists.
Getting Your Name on the Right Desktops, or How to Be Found on the Internet.
ERIC Educational Resources Information Center
Fagan, Phoebe
1995-01-01
Explains aspects of an Internet gopher developed by the National Institute of Standards and Technology's (NIST) Standard Reference Data (SRD) program. The gopher lists SRD projects and data centers and enables users to find and contact the researchers associated with these projects. Delivery of scientific information is also discussed. (LRW)
Design of MPPT Controller Monitoring Software Based on QT Framework
NASA Astrophysics Data System (ADS)
Meng, X. Z.; Lu, P. G.
2017-10-01
The MPPT controller is a hardware device for tracking the maximum power point of a solar photovoltaic array, and multiple controllers can operate in networking mode over a specific communication protocol. In this article, based on C++ GUI programming with the Qt framework, we designed a desktop application for monitoring and analyzing the operational parameters of MPPT controllers. The network was built on the Modbus protocol in Remote Terminal Unit (RTU) mode, and the desktop application on the host computer was connected to all controllers in the network through RS485 communication or ZigBee wireless communication. Using this application, users can monitor controller parameters from anywhere via the internet.
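Every Modbus RTU frame ends with a CRC-16 computed over all preceding bytes, which a monitoring host must verify. A hedged sketch of that check (the function name is ours; the reflected polynomial 0xA001, initial value 0xFFFF, and low-byte-first transmission order come from the Modbus specification):

```python
def modbus_crc16(frame: bytes) -> bytes:
    """CRC-16/MODBUS over an RTU frame: init 0xFFFF, reflected
    polynomial 0xA001, result appended low byte first on the wire."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return bytes([crc & 0xFF, crc >> 8])
```

For the canonical read-holding-registers request 01 03 00 00 00 01, the appended CRC bytes are 84 0A.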
ERIC Educational Resources Information Center
Cutsinger, John
1988-01-01
Explains how a high school literary magazine staff accessed the journalism department's Apple Macintosh computers to typeset its publication. Provides examples of magazine layouts designed partially or completely by "Pagemaker" software on a Macintosh. (MM)
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
Information retrieval system of Japanese Standard Disease-Code Master Using XML Web Service is developed. XML Web Service is a new distributed processing system by standard internet technologies. With seamless remote method invocation of XML Web Service, users are able to get the latest disease code master information from their rich desktop applications or internet web sites, which refer to this service. PMID:14728364
Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments
Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu
2017-01-01
High-end graphics processing units (GPUs), such as NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied to high-performance computing fields over the past decade. These desktop GPU cards must be installed in personal computers/servers with desktop CPUs, and the cost and power consumption of constructing a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, called Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture cores (belonging to the Kepler GPU family). Jetson Tegra K1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied in several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed, and that work also proved that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio by comparing the STK platform with a desktop CPU and GPU. In this work, an embedded-based GPU cluster platform will be constructed with multiple TK1s (MTK platform). Complex system installation and setup are necessary procedures at first. Then, 2 job assignment modes are designed for the MTK platform to provide services for users. Finally, ClustalW v2.0.11 and ClustalWtk will be ported to the MTK platform. The experimental results showed that the speedup ratios achieved 5.5 and 4.8 times for ClustalW v2.0.11 and ClustalWtk, respectively, by comparing 6 TK1s with a single TK1. The MTK platform is proven to be useful for multiple sequence alignments. PMID:28835734
Streamlined, Inexpensive 3D Printing of the Brain and Skull
Cash, Sydney S.
2015-01-01
Neuroimaging technologies such as Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) collect three-dimensional data (3D) that is typically viewed on two-dimensional (2D) screens. Actual 3D models, however, allow interaction with real objects such as implantable electrode grids, potentially improving patient specific neurosurgical planning and personalized clinical education. Desktop 3D printers can now produce relatively inexpensive, good quality prints. We describe our process for reliably generating life-sized 3D brain prints from MRIs and 3D skull prints from CTs. We have integrated a standardized, primarily open-source process for 3D printing brains and skulls. We describe how to convert clinical neuroimaging Digital Imaging and Communications in Medicine (DICOM) images to stereolithography (STL) files, a common 3D object file format that can be sent to 3D printing services. We additionally share how to convert these STL files to machine instruction gcode files, for reliable in-house printing on desktop, open-source 3D printers. We have successfully printed over 19 patient brain hemispheres from 7 patients on two different open-source desktop 3D printers. Each brain hemisphere costs approximately $3–4 in consumable plastic filament as described, and the total process takes 14–17 hours, almost all of which is unsupervised (preprocessing = 4–6 hr; printing = 9–11 hr, post-processing = <30 min). Printing a matching portion of a skull costs $1–5 in consumable plastic filament and takes less than 14 hr, in total. We have developed a streamlined, cost-effective process for 3D printing brain and skull models. We surveyed healthcare providers and patients who confirmed that rapid-prototype patient specific 3D models may help interdisciplinary surgical planning and patient education. The methods we describe can be applied for other clinical, research, and educational purposes. PMID:26295459
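The pipeline above converts a segmented volume to an STL mesh before slicing to gcode. As a toy illustration of what an ASCII STL file contains (not the authors' open-source meshing toolchain; all names are ours), a binary voxel mask can be meshed face by face:

```python
def voxels_to_ascii_stl(mask, name="brain"):
    """Emit ASCII STL for the exposed faces of a 3D boolean voxel grid.
    mask[x][y][z] holds 0/1; every exposed unit-cube face becomes two
    triangles, so a solid mask yields a closed surface."""
    nx, ny, nz = len(mask), len(mask[0]), len(mask[0][0])

    def filled(x, y, z):
        return 0 <= x < nx and 0 <= y < ny and 0 <= z < nz and mask[x][y][z]

    # face normal -> the 4 corners of that face on a unit cube at the origin
    faces = {
        (-1, 0, 0): [(0, 0, 0), (0, 1, 0), (0, 1, 1), (0, 0, 1)],
        (1, 0, 0):  [(1, 0, 0), (1, 0, 1), (1, 1, 1), (1, 1, 0)],
        (0, -1, 0): [(0, 0, 0), (0, 0, 1), (1, 0, 1), (1, 0, 0)],
        (0, 1, 0):  [(0, 1, 0), (1, 1, 0), (1, 1, 1), (0, 1, 1)],
        (0, 0, -1): [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
        (0, 0, 1):  [(0, 0, 1), (0, 1, 1), (1, 1, 1), (1, 0, 1)],
    }
    out = [f"solid {name}"]
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                if not mask[x][y][z]:
                    continue
                for n, corners in faces.items():
                    if filled(x + n[0], y + n[1], z + n[2]):
                        continue  # interior face shared with a neighbor
                    v = [(x + cx, y + cy, z + cz) for cx, cy, cz in corners]
                    for tri in (v[:3], [v[0], v[2], v[3]]):  # quad -> 2 tris
                        out.append(f"  facet normal {n[0]} {n[1]} {n[2]}")
                        out.append("    outer loop")
                        for p in tri:
                            out.append(f"      vertex {p[0]} {p[1]} {p[2]}")
                        out.append("    endloop")
                        out.append("  endfacet")
    out.append(f"endsolid {name}")
    return "\n".join(out)
```

A single filled voxel exposes 6 faces, i.e. 12 triangular facets; real brain meshes come from smoother algorithms such as marching cubes, but the file syntax is the same.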
NASA Astrophysics Data System (ADS)
Min, M.
2017-10-01
Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line-lists for these molecules. The line lists available today contain for many species up to several billions of lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, of all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focusses the computation time into the strongest lines, while still maintaining the continuum contribution of the high number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (3.5 × 10^5 lines per second per core on a standard current day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
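The sampling idea can be sketched in a few lines: each line receives a sample budget proportional to its strength, and every sample deposits strength/N onto the opacity grid, so integrated line opacity is conserved even when a weak line gets a single sample. A hedged sketch (pure Lorentzian profiles stand in for full Voigt functions; all names are ours):

```python
import math
import random

def sample_lines_to_grid(lines, grid_min, grid_max, nbins, base_samples=20):
    """Monte Carlo line sampling sketch. lines is a list of
    (center, gamma, strength) Lorentzian lines; each line's sample
    count scales with its strength, and each sample deposits
    strength / n_samples, preserving the integrated opacity."""
    grid = [0.0] * nbins
    width = (grid_max - grid_min) / nbins
    max_strength = max(s for _, _, s in lines)
    for center, gamma, strength in lines:
        n = max(1, int(base_samples * strength / max_strength))
        deposit = strength / n
        for _ in range(n):
            # inverse-CDF draw from the Lorentzian line shape
            x = center + gamma * math.tan(math.pi * (random.random() - 0.5))
            if grid_min <= x < grid_max:
                grid[int((x - grid_min) / width)] += deposit
    return grid
```

Because every sample deposits strength/N, summing the returned grid recovers the total line strength (minus the far-wing mass falling outside the grid), regardless of how few samples a weak line receives.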
DOT National Transportation Integrated Search
2013-01-01
The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...
Teach Your Computer to Read: Scanners and Optical Character Recognition.
ERIC Educational Resources Information Center
Marsden, Jim
1993-01-01
Desktop scanners can be used with a software technology called optical character recognition (OCR) to convert the text on virtually any paper document into an electronic form. OCR offers educators new flexibility in incorporating text into tests, lesson plans, and other materials. (MLF)
Personal Computers and Laser Printers Are Becoming Popular Tools for Creating Documents on Campuses.
ERIC Educational Resources Information Center
DeLoughry, Thomas J.
1987-01-01
Desktop publishing techniques are bringing control over institutional newsletters, catalogues, brochures, and many other print materials directly to the author's office. The technology also has the potential for integrating campus information systems and saving much time and money. (MSE)
CWRUnet--Case History of a Campus-Wide Fiber-to-the-Desktop Network.
ERIC Educational Resources Information Center
Neff, Raymond K.; Haigh, Peter J.
1992-01-01
This article describes the development at Case Western Reserve University of an all-fiber optic communications network linking 7,300 outlets (faculty offices, student residences, classrooms, libraries, and laboratories) with computer data, television, audio, facsimile, and image information services. (Author/DB)
Centrifuge: rapid and sensitive classification of metagenomic sequences.
Kim, Daehwan; Song, Li; Breitwieser, Florian P; Salzberg, Steven L
2016-12-01
Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. © 2016 Kim et al.; Published by Cold Spring Harbor Laboratory Press.
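The BWT/FM-index backward search at the heart of Centrifuge can be illustrated at toy scale (naive rotation-sort BWT construction and dense occurrence tables, nothing like Centrifuge's space-optimized compressed index; all names are ours):

```python
from collections import Counter

def bwt(text):
    """Burrows-Wheeler transform via sorted rotations (toy scale only)."""
    text += "$"  # unique sentinel, lexicographically smallest symbol
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(r[-1] for r in rotations)

def fm_index(bw):
    """C[c] = count of symbols in bw smaller than c;
    occ[c][i] = occurrences of c in bw[:i] (the FM-index rank table)."""
    counts = Counter(bw)
    C, total = {}, 0
    for c in sorted(counts):
        C[c] = total
        total += counts[c]
    occ = {c: [0] * (len(bw) + 1) for c in counts}
    for i, ch in enumerate(bw):
        for c in occ:
            occ[c][i + 1] = occ[c][i] + (ch == c)
    return C, occ

def count_occurrences(pattern, bw, C, occ):
    """Backward search: how many times pattern occurs in the text."""
    lo, hi = 0, len(bw)
    for c in reversed(pattern):
        if c not in C:
            return 0
        lo = C[c] + occ[c][lo]
        hi = C[c] + occ[c][hi]
        if lo >= hi:
            return 0
    return hi - lo
```

For the text "banana" the transform is "annb$aa", and backward search counts both overlapping occurrences of "ana" without ever scanning the original text, which is what lets FM-index classifiers match reads against a compressed genome database.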
NASA Technical Reports Server (NTRS)
2008-01-01
NASA's advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.
Microcomputers in Libraries: The Quiet Revolution.
ERIC Educational Resources Information Center
Boss, Richard
1985-01-01
This article defines three separate categories of microcomputers--personal, desk-top, multi-user devices--and relates storage capabilities (expandability, floppy disks) to library applications. Highlights include de facto standards, operating systems, database management systems, applications software, circulation control systems, dumb and…
Propagation Environment Assessment Using UAV Electromagnetic Sensors
2018-03-01
could be added, we limit this study to two dimensions.) The computer program then processes the data and determines the existence of any atmospheric... computer to have large processing capacity, and a typical workstation desktop or laptop can perform the function. E. FLIGHT PATTERNS AND DATA...different types of flight patterns were studied, and our findings show that the vertical flight pattern using a rotary platform is more efficient
2016-09-01
and network. The computing and network hardware are identified and include routers, servers, firewalls, laptops, backup hard drives, smart phones...deployable hardware units will be necessary. This includes the use of ruggedized laptops and desktop computers, a projector system, communications system...ENGINEERING STUDY AND CONCEPT DEVELOPMENT FOR A HUMANITARIAN AID AND DISASTER RELIEF OPERATIONS MANAGEMENT PLATFORM by Julie A. Reed September
A highly efficient multi-core algorithm for clustering extremely large datasets
2010-01-01
Background In recent years, the demand for computational power in computational biology has increased due to rapidly growing data sets from microarray and other high-throughput technologies. This demand is likely to increase. Standard algorithms for analyzing data, such as cluster algorithms, need to be parallelized for fast processing. Unfortunately, most approaches for parallelizing algorithms largely rely on network communication protocols connecting and requiring multiple computers. One answer to this problem is to utilize the intrinsic capabilities in current multi-core hardware to distribute the tasks among the different cores of one computer. Results We introduce a multi-core parallelization of the k-means and k-modes cluster algorithms based on the design principles of transactional memory for clustering gene expression microarray type data and categorical SNP data. Our new shared memory parallel algorithms prove to be highly efficient. We demonstrate their computational power and show their utility in cluster stability and sensitivity analysis employing repeated runs with slightly changed parameters. Computation speed of our Java based algorithm was increased by a factor of 10 for large data sets while preserving computational accuracy compared to single-core implementations and a recently published network based parallelization. Conclusions Most desktop computers and even notebooks provide at least dual-core processors. Our multi-core algorithms show that using modern algorithmic concepts, parallelization makes it possible to perform even such laborious tasks as cluster sensitivity and cluster number estimation on the laboratory computer. PMID:20370922
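A shared-memory parallel k-means can be sketched by chunking the assignment step across a worker pool. This is a toy Python illustration (threads under CPython's GIL, quite unlike the paper's Java transactional-memory implementation; all names are ours):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def assign_chunk(points, centroids):
    """Nearest-centroid label for every point in one chunk."""
    labels = []
    for p in points:
        d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
        labels.append(d.index(min(d)))
    return labels

def kmeans_parallel(points, k, workers=4, iters=20, seed=0):
    """k-means where the (embarrassingly parallel) assignment step is
    split into chunks handed to a worker pool; the centroid update is
    the serial reduction over the gathered labels."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    chunk = (len(points) + workers - 1) // workers
    chunks = [points[i:i + chunk] for i in range(0, len(points), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(iters):
            parts = pool.map(assign_chunk, chunks, [centroids] * len(chunks))
            labels = [lab for part in parts for lab in part]
            # reduction: recompute each centroid from its members
            for j in range(k):
                members = [p for p, lab in zip(points, labels) if lab == j]
                if members:
                    centroids[j] = tuple(sum(x) / len(members)
                                         for x in zip(*members))
    return centroids, labels
```

The reduction here stays serial for brevity; the paper's contribution is making that shared-state update safe and fast across cores as well.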
2011-10-01
Fortunately, some products offer centralized management and deployment tools for local desktop implementation. Figure 5 illustrates the... implementation of a secure desktop infrastructure based on virtualization. It includes an overview of desktop virtualization, including an in-depth...environment in the data centre, whereas LHVD places it on the endpoint itself. Desktop virtualization implementation considerations and potential
Transitioning EEG experiments away from the laboratory using a Raspberry Pi 2.
Kuziek, Jonathan W P; Shienh, Axita; Mathewson, Kyle E
2017-02-01
Electroencephalography (EEG) experiments are typically performed in controlled laboratory settings to minimise noise and produce reliable measurements. These controlled conditions also reduce the applicability of the obtained results to more varied environments and may limit their relevance to everyday situations. Advances in computer portability may increase the mobility and applicability of EEG results while decreasing costs. In this experiment we show that stimulus presentation using a Raspberry Pi 2 computer provides a low cost, reliable alternative to a traditional desktop PC in the administration of EEG experimental tasks. Significant and reliable MMN and P3 activity, typical event-related potentials (ERPs) associated with an auditory oddball paradigm, were measured while experiments were administered using the Raspberry Pi 2. While latency differences in ERP triggering were observed between systems, these differences reduced power only marginally, likely due to the reduced processing power of the Raspberry Pi 2. An auditory oddball task administered using the Raspberry Pi 2 produced similar ERPs to those derived from a desktop PC in a laboratory setting. Despite temporal differences and slight increases in trials needed for similar statistical power, the Raspberry Pi 2 can be used to design and present auditory experiments comparable to a PC. Our results show that the Raspberry Pi 2 is a low cost alternative to the desktop PC when administering EEG experiments and, due to its small size and low power consumption, will enable mobile EEG experiments unconstrained by a traditional laboratory setting. Copyright © 2016 Elsevier B.V. All rights reserved.
Analysis of DoD Usage of Multimedia Technology to Determine Requirements for Standards.
1995-03-01
...developing a standard architecture for their multimedia systems. When the DoD participants were asked to identify areas where standards are lack...are limited, they will sacrifice video quality in order to sustain audio quality. In order for desktop conferencing to become a market success
Drajsajtl, Tomáš; Struk, Petr; Bednárová, Alice
2013-01-01
AsTeRICS - "The Assistive Technology Rapid Integration & Construction Set" is a construction set for assistive technologies which can be adapted to the motor abilities of end-users. AsTeRICS allows access to different devices such as PCs, cell phones and smart home devices, with all of them integrated in a platform adapted as much as possible to each user. People with motor disabilities in the upper limbs, with no cognitive impairment, no perceptual limitations (neither visual nor auditory) and with basic skills in using technologies such as PCs, cell phones, electronic agendas, etc. have available a flexible and adaptable technology which enables them to access the Human-Machine-Interfaces (HMI) on the standard desktop and beyond. AsTeRICS provides graphical model design tools, a middleware and hardware support for the creation of tailored AT-solutions involving bioelectric signal acquisition, Brain-/Neural Computer Interfaces, Computer-Vision techniques and standardized actuator and device controls and allows combining several off-the-shelf AT-devices in every desired combination. Novel, end-user ready solutions can be created and adapted via a graphical editor without additional programming efforts. The AsTeRICS open-source framework provides resources for utilization and extension of the system to developers and researchers. AsTeRICS was developed by the AsTeRICS project and was partially funded by the EC.
Artificial neural network-aided image analysis system for cell counting.
Sjöström, P J; Frydel, B R; Wahlberg, L U
1999-05-01
In histological preparations containing debris and synthetic materials, it is difficult to automate cell counting using standard image analysis tools, i.e., systems that rely on boundary contours, histogram thresholding, etc. In an attempt to mimic manual cell recognition, an automated cell counter was constructed using a combination of artificial intelligence and standard image analysis methods. Artificial neural network (ANN) methods were applied on digitized microscopy fields without pre-ANN feature extraction. A three-layer feed-forward network with extensive weight sharing in the first hidden layer was employed and trained on 1,830 examples using the error back-propagation algorithm on a Power Macintosh 7300/180 desktop computer. The optimal number of hidden neurons was determined and the trained system was validated by comparison with blinded human counts. System performance at 50x and 100x magnification was evaluated. The correlation index at 100x magnification neared person-to-person variability, while 50x magnification was not useful. The system was approximately six times faster than an experienced human. ANN-based automated cell counting in noisy histological preparations is feasible. Consistent histology and computer power are crucial for system performance. The system provides several benefits, such as speed of analysis and consistency, and frees up personnel for other tasks.
A New User Interface for On-Demand Customizable Data Products for Sensors in a SensorWeb
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Cappelaere, Pat; Frye, Stuart; Sohlberg, Rob; Ly, Vuong; Chien, Steve; Sullivan, Don
2011-01-01
A SensorWeb is a set of sensors, which can consist of ground, airborne and space-based sensors interoperating in an automated or autonomous collaborative manner. The NASA SensorWeb toolbox, developed at NASA/GSFC in collaboration with NASA/JPL, NASA/Ames and other partners, is a set of software and standards that (1) enables users to create virtual private networks of sensors over open networks; (2) provides the capability to orchestrate their actions; (3) provides the capability to customize the output data products and (4) enables automated delivery of the data products to the user's desktop. A recent addition to the SensorWeb Toolbox is a new user interface, together with web services co-resident with the sensors, to enable rapid creation, loading and execution of new algorithms for processing sensor data. The web service along with the user interface follows the Open Geospatial Consortium (OGC) standard called Web Coverage Processing Service (WCPS). This presentation will detail the prototype that was built and how the WCPS was tested against a HyspIRI flight testbed and an elastic computation cloud on the ground with EO-1 data. HyspIRI is a future NASA decadal mission. The elastic computation cloud stores EO-1 data and runs software similar to Amazon online shopping.
Development of a Wireless Computer Vision Instrument to Detect Biotic Stress in Wheat
Casanova, Joaquin J.; O'Shaughnessy, Susan A.; Evett, Steven R.; Rush, Charles M.
2014-01-01
Knowledge of crop abiotic and biotic stress is important for optimal irrigation management. While spectral reflectance and infrared thermometry provide a means to quantify crop stress remotely, these measurements can be cumbersome. Computer vision offers an inexpensive way to remotely detect crop stress independent of vegetation cover. This paper presents a technique using computer vision to detect disease stress in wheat. Digital images of differentially stressed wheat were segmented into soil and vegetation pixels using expectation maximization (EM). In the first season, the algorithm to segment vegetation from soil and distinguish between healthy and stressed wheat was developed and tested using digital images taken in the field and later processed on a desktop computer. In the second season, a wireless camera with near real-time computer vision capabilities was tested in conjunction with the conventional camera and desktop computer. For wheat irrigated at different levels and inoculated with wheat streak mosaic virus (WSMV), vegetation hue determined by the EM algorithm showed significant effects from irrigation level and infection. Unstressed wheat had a higher hue (118.32) than stressed wheat (111.34). In the second season, the hue and cover measured by the wireless computer vision sensor showed significant effects from infection (p = 0.0014), as did the conventional camera (p < 0.0001). Vegetation hue obtained through a wireless computer vision system in this study is a viable option for determining biotic crop stress in irrigation scheduling. Such a low-cost system could be suitable for use in the field in automated irrigation scheduling applications. PMID:25251410
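The expectation-maximization step described above can be illustrated with a one-dimensional two-component Gaussian mixture fitted to synthetic hue values. Only the two mean hues (118.32 unstressed, 111.34 stressed) come from the abstract; the spreads, sample sizes, and initialization are assumptions:

```python
import math, random

random.seed(1)

# Synthetic hue samples for two plant conditions; means from the abstract,
# standard deviation and counts are illustrative assumptions.
hues = ([random.gauss(118.32, 2.0) for _ in range(150)] +
        [random.gauss(111.34, 2.0) for _ in range(150)])

# EM for a two-component 1-D Gaussian mixture, deliberately poor starting means.
mu = [105.0, 125.0]
var = [25.0, 25.0]
weights = [0.5, 0.5]

def pdf(x, m, v):
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

for _ in range(50):
    # E-step: responsibility of each component for each sample.
    resp = []
    for x in hues:
        p = [weights[k] * pdf(x, mu[k], var[k]) for k in range(2)]
        s = sum(p)
        resp.append([pk / s for pk in p])
    # M-step: re-estimate weights, means, and variances.
    for k in range(2):
        nk = sum(r[k] for r in resp)
        weights[k] = nk / len(hues)
        mu[k] = sum(r[k] * x for r, x in zip(resp, hues)) / nk
        var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, hues)) / nk

mu.sort()
print(mu)  # recovered component means, stressed then unstressed
```

The fitted means should land close to the stressed and unstressed hue values used to generate the data.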
ERIC Educational Resources Information Center
Yee, Kevin; Hargis, Jace
2010-01-01
This article discusses the benefits of screencasts and their instructional uses. Well-known for some years to advanced technology users, Screen Capture Software (SCS) offers the promise of recording action on the computer desktop together with voiceover narration, all combined into a single movie file that can be shared, emailed, or uploaded.…
The Technological Evolution in Schools: Reflections and Projections.
ERIC Educational Resources Information Center
Higgins, James E.
1991-01-01
Presents a first-person account of one teacher's experiences with computer hardware and software. The article discusses various programs and applications, such as integrated learning systems, database searching via CD-ROM, desktop publishing, authoring programs, and indicates future changes in instruction with increasing use of technology. (SM)
Incorporating a Human-Computer Interaction Course into Software Development Curriculums
ERIC Educational Resources Information Center
Janicki, Thomas N.; Cummings, Jeffrey; Healy, R. Joseph
2015-01-01
Individuals have increasing options on retrieving information related to hardware and software. Specific hardware devices include desktops, tablets and smart devices. Also, the number of software applications has significantly increased the user's capability to access data. Software applications include the traditional web site, smart device…
Massive Query Resolution for Rapid Selective Dissemination of Information.
ERIC Educational Resources Information Center
Cohen, Jonathan D.
1999-01-01
Outlines an efficient approach to performing query resolution which, when matched with a keyword scanner, offers rapid selecting and routing for massive Boolean queries, and which is suitable for implementation on a desktop computer. Demonstrates the system's operation with large examples in a practical setting. (AEF)
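The core idea here, resolving many Boolean queries in a single pass of a keyword scanner, can be sketched by inverting the queries into a term index. The query representation (an AND of OR-groups) and all names below are illustrative assumptions, not the paper's data structures:

```python
# Each query is an AND of OR-groups (conjunctive normal form over keywords).
queries = {
    "q1": [{"flood", "storm"}, {"warning"}],   # (flood OR storm) AND warning
    "q2": [{"earthquake"}],
    "q3": [{"storm"}, {"coastal", "inland"}],
}

# Invert the queries: map each keyword to the (query, group) pairs it can
# satisfy, so one scan of a document touches only the relevant queries.
index = {}
for qid, groups in queries.items():
    for gi, group in enumerate(groups):
        for term in group:
            index.setdefault(term, []).append((qid, gi))

def resolve(document_terms):
    """Return the queries satisfied by one document's scanned keywords."""
    satisfied = {qid: set() for qid in queries}
    for term in document_terms:                # output of the keyword scanner
        for qid, gi in index.get(term, []):
            satisfied[qid].add(gi)
    # A query matches when every one of its OR-groups was hit at least once.
    return sorted(q for q, hit in satisfied.items()
                  if len(hit) == len(queries[q]))

print(resolve({"storm", "warning", "pressure"}))
```

Because matching work is proportional to the keywords actually found, the same scan scales to very large query sets.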
Medical Recording Tools for Biodosimetry in Radiation Incidents
2005-01-01
assistant) devices. To accomplish this, the second edition of the AFRRI handbook will be redesigned, using Adobe FrameMaker desktop publishing...Handbook will be redesigned for display on hand-held computer devices, using Adobe FrameMaker desktop publishing software. Portions of the text
TOOLS FOR PRESENTING SPATIAL AND TEMPORAL PATTERNS OF ENVIRONMENTAL MONITORING DATA
The EPA Health Effects Research Laboratory has developed this data presentation tool for use with a variety of types of data which may contain spatial and temporal patterns of interest. The technology links mainframe computing power to the new generation of "desktop publishing" ha...
Controlling Robots with Personal Computers.
ERIC Educational Resources Information Center
Singer, Andrew; Rony, Peter
1983-01-01
Discusses new robots that are mechanical arms small enough to sit on a desktop. They offer scaled-down price and performance, but are able to handle light production tasks such as spray painting or part orientation. (Available from W. C. Publications Inc., P.O. Box 1578, Montclair, NJ 07042.) (JOW)
NASA Astrophysics Data System (ADS)
Yang, Yiqun; Urban, Matthew W.; McGough, Robert J.
2018-05-01
Shear wave calculations induced by an acoustic radiation force are very time-consuming on desktop computers, and high-performance graphics processing units (GPUs) achieve dramatic reductions in the computation time for these simulations. The acoustic radiation force is calculated using the fast near field method and the angular spectrum approach, and then the shear waves are calculated in parallel with Green’s functions on a GPU. This combination enables rapid evaluation of shear waves for push beams with different spatial samplings and for apertures with different f/#. Relative to shear wave simulations that evaluate the same algorithm on an Intel i7 desktop computer, a high performance nVidia GPU reduces the time required for these calculations by a factor of 45 and 700 when applied to elastic and viscoelastic shear wave simulation models, respectively. These GPU-accelerated simulations were also compared to measurements in different viscoelastic phantoms, and the results are similar. For parametric evaluations and for comparisons with measured shear wave data, shear wave simulations with the Green’s function approach are ideally suited for high-performance GPUs.
NASA Astrophysics Data System (ADS)
Li, J.; Zhang, T.; Huang, Q.; Liu, Q.
2014-12-01
Today's climate datasets are characterized by large volume, a high degree of spatiotemporal complexity, and rapid evolution over time. As visualizing large distributed climate datasets is computationally intensive, traditional desktop-based visualization applications fail to handle the computational load. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built on ParaView, which is one of the most popular open-source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access methods are supported: accessing remote datasets provided by OPeNDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the data access method, all visualization tasks are completed at the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.
A Toolkit for ARB to Integrate Custom Databases and Externally Built Phylogenies
Essinger, Steven D.; Reichenberger, Erin; Morrison, Calvin; ...
2015-01-21
Researchers are perpetually amassing biological sequence data. The computational approaches employed by ecologists for organizing this data (e.g. alignment, phylogeny, etc.) typically scale nonlinearly in execution time with the size of the dataset. This often serves as a bottleneck for processing experimental data since many molecular studies are characterized by massive datasets. To keep up with experimental data demands, ecologists are forced to choose between continually upgrading expensive in-house computer hardware or outsourcing the most demanding computations to the cloud. Outsourcing is attractive since it is the least expensive option, but does not necessarily allow direct user interaction with the data for exploratory analysis. Desktop analytical tools such as ARB are indispensable for this purpose, but they do not necessarily offer a convenient solution for the coordination and integration of datasets between local and outsourced destinations. Therefore, researchers are currently left with an undesirable tradeoff between computational throughput and analytical capability. To mitigate this tradeoff we introduce a software package to leverage the utility of the interactive exploratory tools offered by ARB with the computational throughput of cloud-based resources. Our pipeline serves as middleware between the desktop and the cloud allowing researchers to form local custom databases containing sequences and metadata from multiple resources and a method for linking data outsourced for computation back to the local database. Furthermore, a tutorial implementation of the toolkit is provided in the supporting information, S1 Tutorial.
A Toolkit for ARB to Integrate Custom Databases and Externally Built Phylogenies
Essinger, Steven D.; Reichenberger, Erin; Morrison, Calvin; Blackwood, Christopher B.; Rosen, Gail L.
2015-01-01
Researchers are perpetually amassing biological sequence data. The computational approaches employed by ecologists for organizing this data (e.g. alignment, phylogeny, etc.) typically scale nonlinearly in execution time with the size of the dataset. This often serves as a bottleneck for processing experimental data since many molecular studies are characterized by massive datasets. To keep up with experimental data demands, ecologists are forced to choose between continually upgrading expensive in-house computer hardware or outsourcing the most demanding computations to the cloud. Outsourcing is attractive since it is the least expensive option, but does not necessarily allow direct user interaction with the data for exploratory analysis. Desktop analytical tools such as ARB are indispensable for this purpose, but they do not necessarily offer a convenient solution for the coordination and integration of datasets between local and outsourced destinations. Therefore, researchers are currently left with an undesirable tradeoff between computational throughput and analytical capability. To mitigate this tradeoff we introduce a software package to leverage the utility of the interactive exploratory tools offered by ARB with the computational throughput of cloud-based resources. Our pipeline serves as middleware between the desktop and the cloud allowing researchers to form local custom databases containing sequences and metadata from multiple resources and a method for linking data outsourced for computation back to the local database. A tutorial implementation of the toolkit is provided in the supporting information, S1 Tutorial. Availability: http://www.ece.drexel.edu/gailr/EESI/tutorial.php. PMID:25607539
The CosmicWatch Desktop Muon Detector: a self-contained, pocket sized particle detector
NASA Astrophysics Data System (ADS)
Axani, S. N.; Frankiewicz, K.; Conrad, J. M.
2018-03-01
The CosmicWatch Desktop Muon Detector is a self-contained, hand-held cosmic ray muon detector that is valuable for astro/particle physics research applications and outreach. The material cost of each detector is under $100 and it takes a novice student approximately four hours to build their first detector. The detectors are powered via a USB connection and the data can either be recorded directly to a computer or to a microSD card. Arduino- and Python-based software is provided to operate the detector, along with an online application to plot the data in real time. In this paper, we describe the various design features, evaluate the performance, and illustrate the detector's capabilities by providing several example measurements.
Undergraduate computational physics projects on quantum computing
NASA Astrophysics Data System (ADS)
Candela, D.
2015-08-01
Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
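Grover's search, one of the projects listed above, can be simulated directly on a plain state vector without any quantum library, exactly as these student projects suggest. A self-contained sketch for three qubits and a single marked item (the marked index is arbitrary):

```python
import math

# Grover's search simulated on a state vector of N = 2**n amplitudes.
n = 3
N = 2 ** n
marked = 5                                  # index of the item to find

state = [1.0 / math.sqrt(N)] * N            # uniform superposition
iterations = int(round(math.pi / 4 * math.sqrt(N)))  # optimal iteration count

for _ in range(iterations):
    state[marked] = -state[marked]          # oracle: flip the marked amplitude
    mean = sum(state) / N                   # diffusion: inversion about the mean
    state = [2 * mean - a for a in state]

probs = [a * a for a in state]
best = max(range(N), key=lambda i: probs[i])
print(best, round(probs[best], 3))
```

After the two optimal iterations for N = 8, the marked item carries nearly all of the probability, which a measurement would then return with high likelihood.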
Computing at DESY — current setup, trends and strategic directions
NASA Astrophysics Data System (ADS)
Ernst, Michael
1998-05-01
Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. Having run mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we already face today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially those addressing PC management and support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. Buying PCs at DESY at the current rate of about 30/month will otherwise absorb all available manpower in central computing and still leave hundreds of users unhappy. Though certainly not the only area, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.
Accelerated Compressed Sensing Based CT Image Reconstruction.
Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C
2015-01-01
In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.
Accelerated Compressed Sensing Based CT Image Reconstruction
Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R.; Paul, Narinder S.; Cobbold, Richard S. C.
2015-01-01
In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization. PMID:26167200
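The paper's pseudopolar Radon machinery is beyond a short sketch, but the underlying compressed-sensing step, recovering a sparse signal from fewer measurements than unknowns, can be illustrated with plain iterative soft-thresholding (ISTA). Everything below (problem size, random measurement matrix, regularization weight) is an illustrative assumption, not the authors' method:

```python
import random

random.seed(2)

m, n, iters, lam = 16, 24, 500, 0.1
# Random Gaussian measurement matrix with roughly unit-norm columns.
A = [[random.gauss(0, 1) / m ** 0.5 for _ in range(n)] for _ in range(m)]

# Ground-truth sparse signal: only two nonzero entries out of 24.
x_true = [0.0] * n
x_true[3], x_true[17] = 2.0, -1.5
y = [sum(A[i][j] * x_true[j] for j in range(n)) for i in range(m)]

def matvec(v):   # A v
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(m)]

def rmatvec(v):  # A^T v
    return [sum(A[i][j] * v[i] for i in range(m)) for j in range(n)]

# Power iteration estimates the Lipschitz constant L = ||A||^2.
v = [1.0] * n
for _ in range(50):
    w = rmatvec(matvec(v))
    L = sum(c * c for c in w) ** 0.5
    v = [c / L for c in w]
step = 1.0 / L

def soft(z, t):  # soft-thresholding: proximal map of t * |.|_1
    return z - t if z > t else z + t if z < -t else 0.0

x = [0.0] * n
for _ in range(iters):   # ISTA: gradient step on 0.5*||Ax - y||^2, then shrink
    r = [ai - yi for ai, yi in zip(matvec(x), y)]
    g = rmatvec(r)
    x = [soft(x[j] - step * g[j], step * lam) for j in range(n)]

err = max(abs(a - b) for a, b in zip(x, x_true))
nonzero = sum(1 for c in x if abs(c) > 0.5)
print(round(err, 2), nonzero)
```

With 16 measurements of a 2-sparse length-24 signal, ISTA recovers the support exactly and the coefficients up to a small regularization bias.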
LOSCAR: Long-term Ocean-atmosphere-Sediment CArbon cycle Reservoir Model
NASA Astrophysics Data System (ADS)
Zeebe, R. E.
2011-06-01
The LOSCAR model is designed to efficiently compute the partitioning of carbon between ocean, atmosphere, and sediments on time scales ranging from centuries to millions of years. While a variety of computationally inexpensive carbon cycle models are already available, many are missing a critical sediment component, which is indispensable for long-term integrations. One of LOSCAR's strengths is the coupling of ocean-atmosphere routines to a computationally efficient sediment module. This allows, for instance, adequate computation of CaCO3 dissolution, calcite compensation, and long-term carbon cycle fluxes, including weathering of carbonate and silicate rocks. The ocean component includes various biogeochemical tracers such as total carbon, alkalinity, phosphate, oxygen, and stable carbon isotopes. We have previously published applications of the model tackling future projections of ocean chemistry and weathering, pCO2 sensitivity to carbon cycle perturbations throughout the Cenozoic, and carbon/calcium cycling during the Paleocene-Eocene Thermal Maximum. The focus of the present contribution is the detailed description of the model including numerical architecture, processes and parameterizations, tuning, and examples of input and output. Typical CPU integration times of LOSCAR are of order seconds for several thousand model years on current standard desktop machines. The LOSCAR source code in C can be obtained from the author by sending a request to loscar.model@gmail.com.
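LOSCAR couples many reservoirs and tracers, but the box-model mechanism it builds on can be shown with a hypothetical two-reservoir atmosphere-ocean exchange stepped forward with Euler's method. The reservoir sizes and rate constants below are round illustrative numbers, not LOSCAR parameters:

```python
# Two carbon reservoirs (atmosphere, ocean) exchanging via first-order fluxes;
# a 1000 Pg C pulse is added to the atmosphere and relaxes toward equilibrium.
atm, ocn = 600.0 + 1000.0, 38000.0            # Pg C; sizes are assumptions
k_ao = 0.1                                     # atm -> ocean rate (1/yr), assumed
k_oa = 0.1 * 600.0 / 38000.0                   # chosen so fluxes balance pre-pulse

dt, years = 0.5, 200.0                         # Euler step and run length
t = 0.0
while t < years:
    f = k_ao * atm - k_oa * ocn                # net atmosphere -> ocean flux
    atm -= f * dt                              # carbon leaves the atmosphere...
    ocn += f * dt                              # ...and enters the ocean
    t += dt

total = atm + ocn                              # total carbon must be conserved
print(round(atm, 1), round(total, 1))
```

After two centuries the pulse has partitioned almost entirely into the much larger ocean reservoir, while total carbon is conserved to rounding error; LOSCAR adds sediments, weathering, and tracer chemistry on top of this skeleton.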
A discrete Fourier transform for virtual memory machines
NASA Technical Reports Server (NTRS)
Galant, David C.
1992-01-01
An algebraic theory of the Discrete Fourier Transform is developed in great detail. Examination of the details of the theory leads to a computationally efficient fast Fourier transform for the use on computers with virtual memory. Such an algorithm is of great use on modern desktop machines. A FORTRAN coded version of the algorithm is given for the case when the sequence of numbers to be transformed is a power of two.
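The report's algorithm is a FORTRAN FFT organized for virtual-memory locality; as a generic point of comparison, the textbook radix-2 decimation-in-time FFT for power-of-two lengths (not the report's algorithm) looks like this, checked against a direct O(n²) DFT:

```python
import cmath

def fft(a):
    """Recursive radix-2 FFT; len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return a[:]
    even, odd = fft(a[0::2]), fft(a[1::2])     # split into even/odd indices
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]   # twiddle factor
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

def dft(a):
    """Direct O(n^2) transform, used only as a correctness check."""
    n = len(a)
    return [sum(a[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

x = [complex(i % 3, 0) for i in range(16)]     # arbitrary power-of-two input
err = max(abs(p - q) for p, q in zip(fft(x), dft(x)))
print(err < 1e-9)
```

The recursion's strided slices are exactly where memory locality matters on paged machines, which is the access pattern the report's virtual-memory formulation reorganizes.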
AstroGrid: Taverna in the Virtual Observatory.
NASA Astrophysics Data System (ADS)
Benson, K. M.; Walton, N. A.
This paper reports on the implementation of the Taverna workbench by AstroGrid, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analyses over public and private data and requires computational resources as minimal as a desktop computer. Some integration issues and future work are discussed in this article.
When Everyone Is a Probe, Everyone Is a Learner
ERIC Educational Resources Information Center
Berenfeld, Boris; Krupa, Tatiana; Lebedev, Arseny; Stafeev, Sergey
2014-01-01
Most students globally have mobile devices and the Global Students Laboratory (GlobalLab) project is integrating mobility into learning. First launched in 1991, GlobalLab builds a community of learners engaged in collaborative, distributed investigations. Long relying on stationary desktop computers, or students inputting their observations by…
Software Reviews: Programs Worth a Second Look.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1989
1989-01-01
Reviews three computer software programs: (1) "The Children's Writing and Publishing Center"--writing and creative arts, grades 2-8, Apple II; (2) "Slide Shop"--graphics and desktop presentations, grades 4-12, Apple II and IBM; and (3) "Solve It"--problem solving and language arts, grades 4-12, Apple II. (MVL)
The Library Macintosh at SCIL [Small Computers in Libraries]'88.
ERIC Educational Resources Information Center
Valauskas, Edward J.; And Others
1988-01-01
The first of three papers describes the role of Macintosh workstations in a library. The second paper explains why the Macintosh was selected for end-user searching in an academic library, and the third discusses advantages and disadvantages of desktop publishing for librarians. (8 references) (MES)
Ideas without Words--Internationalizing Business Presentations.
ERIC Educational Resources Information Center
Sondak, Norman; Sondak, Eileen
This paper presents elements of the computer graphics environment including information on: Lotus 1-2-3; Apple Macintosh; Desktop Publishing; Object-Oriented Programming; and Microsoft's Windows 3. A brief scenario illustrates the use of the minimization principle in presenting a new product to a group of international financiers. A taxonomy of…
Most Social Scientists Shun Free Use of Supercomputers.
ERIC Educational Resources Information Center
Kiernan, Vincent
1998-01-01
Social scientists, who frequently complain that the federal government spends too little on them, are passing up what scholars in the physical and natural sciences see as the government's best give-aways: free access to supercomputers. Some social scientists say the supercomputers are difficult to use; others find desktop computers provide…
Classrooms for the Millennials: An Approach for the Next Generation
ERIC Educational Resources Information Center
Gerber, Lindsey N.; Ward, Debra D.
2016-01-01
The purpose of this paper is to introduce educators to three types of applets that are compatible with smartphones, tablets, and desktop computers: screencasting applets, graphing calculator applets, and student response applets. The applets discussed can be seamlessly and effectively integrated into classrooms to help facilitate lectures, collect…
Evaluating Technology Integration in the Elementary School: A Site-Based Approach.
ERIC Educational Resources Information Center
Mowe, Richard
This book enables educators at the elementary level to conduct formative evaluations of their technology programs in minimum time. Most of the technology is computer related, including word processing, graphics, desktop publishing, spreadsheets, databases, instructional software, programming, and telecommunications. The design of the book is aimed…
The Impact on Homes with a School-Provided Laptop
ERIC Educational Resources Information Center
Bu Shell, Shawna M.
2012-01-01
For decades, educational policy advocates have argued for providing technology to students to enhance their learning environments. From filmstrips (Oppenheimer, 1997) to desktop computers (Cuban, 2002) to laptops (Silvernail, 2009; Warschauer, 2006), they have attempted to change the environment and style in which students learn, and how tools can…
ERIC Educational Resources Information Center
Van Horn, Royal
2001-01-01
Several years after the first audiovisual Macintosh computer appeared, most educators are still oblivious of this technology. Almost every other economic sector (including the porn industry) makes abundant use of digital and streaming video. Desktop movie production is so easy that primary grade students can do it. Tips are provided. (MLH)
Colleges' Effort To Prepare for Y2K May Yield Benefits for Many Years.
ERIC Educational Resources Information Center
Olsen, Florence
2000-01-01
Suggests that the money spent ($100 billion) to fix the Y2K bug in the United States resulted in improved campus computer systems. Reports from campuses around the country indicate that both mainframe and desktop systems experienced fewer problems than expected. (DB)
Student Record Automating Using Desktop Computer Technologies.
ERIC Educational Resources Information Center
Almerico, Gina M.; Baker, Russell K.; Matassini, Norma
Teacher education programs nationwide are required by state and federal governments to maintain comprehensive student records of all current and graduated students in their programs. A private, mid-sized university established a faculty team to analyze record-keeping procedures to comply with these government requirements. The team's mandate was…
Coal-seismic computer programs in BASIC: Part I; Store, plot, and edit array data
Hasbrouck, Wilfred P.
1979-01-01
Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in an extended BASIC language specially augmented for acceptance by the Tektronix 4051 Graphic System. This report presents five computer programs used to store, plot, and edit array data for the line, cross, and triangle arrays commonly employed in our coal-seismic investigations. * Use of brand names in this report is for descriptive purposes only and does not constitute endorsement by the U.S. Geological Survey.
The spinal posture of computing adolescents in a real-life setting
2014-01-01
Background It is assumed that good postural alignment is associated with a lower likelihood of musculoskeletal pain symptoms. Interventions encouraging good sitting posture have not reported consequent reductions in musculoskeletal pain in school-based populations, possibly due to a lack of clear understanding of what constitutes good posture. Therefore this paper describes the variability of postural angles in a cohort of asymptomatic high-school students while working on desktop computers in a school computer classroom, and reports on the relationship between the postural angles and age, gender, height, weight and computer use. Methods The baseline data from a 12-month longitudinal study are reported. The study was conducted in South African school computer classrooms. 194 Grade 10 high-school students, from randomly selected high schools, aged 15–17 years, enrolled in Computer Application Technology for the first time, asymptomatic during the preceding month, and from whom written informed consent was obtained, participated in the study. The 3D Posture Analysis Tool captured five postural angles (head flexion, neck flexion, cranio-cervical angle, trunk flexion and head lateral bend) while the students were working on desktop computers. Height, weight and computer use were also measured. Individual and combinations of postural angles were analysed. Results 944 students were screened for eligibility, of whom the data of 194 students are reported. Trunk flexion was the most variable angle. Increased neck flexion and the combination of increased head flexion, neck flexion and trunk flexion were significantly associated with increased weight and BMI (p = 0.0001). Conclusions High-school students sit with greater ranges of trunk flexion (leaning forward or reclining) when using the classroom computer. Increased weight is significantly associated with increased sagittal plane postural angles. PMID:24950887
Volunteered Cloud Computing for Disaster Management
NASA Astrophysics Data System (ADS)
Evans, J. D.; Hao, W.; Chettri, S. R.
2014-12-01
Disaster management relies increasingly on interpreting earth observations and running numerical models; which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. 
Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects; automates reconfiguration of their virtual machines; ensures accountability for donated computing; and optimizes the use of "interstitial" computing. Initial applications include fire detection from multispectral satellite imagery and flood risk mapping through hydrological simulations.
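The "embarrassingly parallel" pattern the abstract describes can be sketched in a few lines: a coordinator splits a job into independent subtasks and farms them out to workers, then merges the results. This is a minimal illustration only, not the authors' platform; the tile-scoring function is a hypothetical stand-in for, say, fire detection on one image tile, and the "volunteers" are emulated with a local thread pool.

```python
from concurrent.futures import ThreadPoolExecutor

def score_tile(tile):
    """Hypothetical per-tile analysis: fraction of 'hot' pixels in one image tile."""
    hot = sum(1 for px in tile if px > 200)
    return hot / len(tile)

def run_volunteered(tiles, n_workers=4):
    """Coordinator: farm independent tiles out to workers and merge results in order."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(score_tile, tiles))

tiles = [[10, 250, 30, 220], [0, 0, 5, 9]]  # two independent subtasks
scores = run_volunteered(tiles)             # [0.5, 0.0]
```

A real volunteered-computing system replaces the local pool with remote participants and adds the accountability and fault tolerance the abstract mentions; the task decomposition stays the same.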
Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena
2015-04-01
To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were ten 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description and games. The automated IVR system was implemented using an off-the-shelf telephony software program and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone-collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered and an overall rating of stuttering severity using a 10-point scale. Data revealed a high level of relative reliability in terms of intra-class correlation between the video- and telephone-acquired samples on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.
Rot, Gregor; Parikh, Anup; Curk, Tomaz; Kuspa, Adam; Shaulsky, Gad; Zupan, Blaz
2009-08-25
Bioinformatics often leverages recent advancements in computer science to support biologists in their scientific discovery process. Such efforts include the development of easy-to-use web interfaces to biomedical databases. Recent advancements in interactive web technologies require us to rethink the standard submit-and-wait paradigm and craft bioinformatics web applications that share analytical and interactive power with their desktop relatives, while retaining simplicity and availability. We have developed dictyExpress, a web application that features a graphical, highly interactive explorative interface to our database, which consists of more than 1000 Dictyostelium discoideum gene expression experiments. In dictyExpress, the user can select experiments and genes, perform gene clustering, view gene expression profiles across time, view gene co-expression networks, perform analyses of Gene Ontology term enrichment, and simultaneously display expression profiles for a selected gene in various experiments. Most importantly, these tasks are achieved through web applications whose components are seamlessly interlinked and immediately respond to events triggered by the user, thus providing a powerful explorative data analysis environment. dictyExpress is a precursor for a new generation of web-based bioinformatics applications with simple but powerful interactive interfaces that resemble those of the modern desktop. While dictyExpress serves mainly the Dictyostelium research community, it is relatively easy to adapt it to other datasets. We propose that the design ideas behind dictyExpress will influence the development of similar applications for other model organisms.
Rot, Gregor; Parikh, Anup; Curk, Tomaz; Kuspa, Adam; Shaulsky, Gad; Zupan, Blaz
2009-01-01
Background Bioinformatics often leverages recent advancements in computer science to support biologists in their scientific discovery process. Such efforts include the development of easy-to-use web interfaces to biomedical databases. Recent advancements in interactive web technologies require us to rethink the standard submit-and-wait paradigm and craft bioinformatics web applications that share analytical and interactive power with their desktop relatives, while retaining simplicity and availability. Results We have developed dictyExpress, a web application that features a graphical, highly interactive explorative interface to our database, which consists of more than 1000 Dictyostelium discoideum gene expression experiments. In dictyExpress, the user can select experiments and genes, perform gene clustering, view gene expression profiles across time, view gene co-expression networks, perform analyses of Gene Ontology term enrichment, and simultaneously display expression profiles for a selected gene in various experiments. Most importantly, these tasks are achieved through web applications whose components are seamlessly interlinked and immediately respond to events triggered by the user, thus providing a powerful explorative data analysis environment. Conclusion dictyExpress is a precursor for a new generation of web-based bioinformatics applications with simple but powerful interactive interfaces that resemble those of the modern desktop. While dictyExpress serves mainly the Dictyostelium research community, it is relatively easy to adapt it to other datasets. We propose that the design ideas behind dictyExpress will influence the development of similar applications for other model organisms. PMID:19706156
On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers
NASA Astrophysics Data System (ADS)
Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.
2017-10-01
This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
Wound-healing outcomes using standardized assessment and care in clinical practice.
Bolton, Laura; McNees, Patrick; van Rijswijk, Lia; de Leon, Jean; Lyder, Courtney; Kobza, Laura; Edman, Kelly; Scheurich, Anne; Shannon, Ron; Toth, Michelle
2004-01-01
Wound-healing outcomes applying standardized protocols have typically been measured within controlled clinical trials, not natural settings. Standardized protocols of wound care have been validated for clinical use, creating an opportunity to measure the resulting outcomes. Wound-healing outcomes were explored during clinical use of standardized validated protocols of care based on patient and wound assessments. This was a prospective multicenter study of wound-healing outcomes management in real-world clinical practice. Healing outcomes from March 26 to October 31, 2001, were recorded on patients in 3 long-term care facilities, 1 long-term acute care hospital, and 12 home care agencies for wounds selected by staff to receive care based on computer-generated validated wound care algorithms. After diagnosis, wound dimensions and status were assessed using a tool adapted from the Pressure Sore Status Tool for use on all wounds. Wound, ostomy, and continence nursing professionals accessed consistent protocols of care, via telemedicine in home care or paper forms in long-term care. A physician entered assessments into a desktop computer in the wound clinic. Based on evidence that healing proceeds faster with fewer infections in environments without gauze, the protocols generally avoided gauze dressings. Most of the 767 wounds selected to receive the standardized protocols of care were stage III-IV pressure ulcers (n = 373; mean healing time 62 days) or full-thickness venous ulcers (n = 124; mean healing time 57 days). Partial-thickness wounds healed faster than same-etiology full-thickness wounds. These results provide benchmarks for natural-setting healing outcomes and help to define and address wound care challenges. Outcomes primarily using nongauze protocols of care matched or surpassed best previously published results on similar wounds using gauze-based protocols of care, including protocols applying gauze impregnated with growth factors or other agents.
2015-11-01
provided by a stand-alone desktop or handheld computing device. This introduces into the discussion a large number of mobile, tactical command...control, communications, and computer (C4) systems across the Services. A couple of examples are mobile command posts mounted on the back of an M1152... infrastructure (DCPI). This term encompasses on-site backup generators, switchgear, uninterruptible power supplies (UPS), power distribution units
Evaluating computer capabilities in a primary care practice-based research network.
Ariza, Adolfo J; Binns, Helen J; Christoffel, Katherine Kaufer
2004-01-01
We wanted to assess computer capabilities in a primary care practice-based research network and to understand how receptive the practices were to new ideas for automation of practice activities and research. This study was conducted among members of the Pediatric Practice Research Group (PPRG). A survey to assess computer capabilities was developed to explore hardware types, software programs, Internet connectivity and data transmission; views on privacy and security; and receptivity to future electronic data collection approaches. Of the 40 PPRG practices participating in the study during the autumn of 2001, all used IBM-compatible systems. Of these, 45% used stand-alone desktops, 40% had networked desktops, and approximately 15% used laptops and minicomputers. A variety of software packages were used, with most practices (82%) having software for some aspect of patient care documentation, patient accounting (90%), business support (60%), and management reports and analysis (97%). The main obstacles to expanding use of computers in patient care were insufficient staff training (63%) and privacy concerns (82%). If provided with training and support, most practices indicated they were willing to consider an array of electronic data collection options for practice-based research activities. There is wide variability in hardware and software use in the pediatric practice setting. Implementing electronic data collection in the PPRG would require a substantial start-up effort and ongoing training and support at the practice site.
Developing and validating an instrument for measuring mobile computing self-efficacy.
Wang, Yi-Shun; Wang, Hsiu-Yuan
2008-08-01
IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.
Economic analysis of cloud-based desktop virtualization implementation at a hospital
2012-01-01
Background Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computing environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. Methods This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. Results The results of a five-year cost-benefit analysis for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. Our sensitivity analysis varying the number of VMs (in terms of number of users) showed that the greater the number of adopted VMs, the more favorable the investment became. Conclusions This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting. PMID:23110661
Economic analysis of cloud-based desktop virtualization implementation at a hospital.
Yoo, Sooyoung; Kim, Seok; Kim, Taeki; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee
2012-10-30
Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computing environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR) indexes used for corporate investment decision-making were used for the economic analysis of VDI implementation. The results of a five-year cost-benefit analysis for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. Our sensitivity analysis varying the number of VMs (in terms of number of users) showed that the greater the number of adopted VMs, the more favorable the investment became. This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.
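The three investment indexes the study applies (ROI, NPV, IRR) can be sketched directly. The cash flows below are illustrative placeholders, not the hospital's figures; IRR is found here by bisection under the usual assumption of a conventional cash-flow profile (one sign change, so NPV falls as the rate rises).

```python
def npv(rate, cashflows):
    """Net present value: cashflows[t] is the net cash flow at the end of year t
    (cashflows[0] is the initial outlay, so it is negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return by bisection: the discount rate at which NPV = 0."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid  # rate too low: project still profitable at this rate
        else:
            hi = mid  # rate too high: NPV has gone negative
    return (lo + hi) / 2

# Illustrative only: a 1000-unit outlay returning 300 per year for five years.
cashflows = [-1000.0, 300.0, 300.0, 300.0, 300.0, 300.0]
rate = irr(cashflows)  # about 0.152: NPV crosses zero near a 15.2% discount rate
```

The paper's break-even in year four corresponds to the year in which the cumulative cash flow first turns positive; ROI over the same horizon is the cumulative net benefit expressed relative to the investment cost.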
Technical Writing Teachers and the Challenges of Desktop Publishing.
ERIC Educational Resources Information Center
Kalmbach, James
1988-01-01
Argues that technical writing teachers must understand desktop publishing. Discusses the strengths that technical writing teachers bring to desktop publishing, and the impact desktop publishing will have on technical writing courses and programs. (ARH)
Automated micromanipulation desktop station based on mobile piezoelectric microrobots
NASA Astrophysics Data System (ADS)
Fatikow, Sergej
1996-12-01
One of the main problems of present-day research on microsystem technology (MST) is to assemble a whole microsystem from different microcomponents. This paper presents a new concept of an automated micromanipulation desktop station including piezoelectrically driven microrobots placed on a high-precision x-y stage of a light microscope, a CCD camera as a local sensor subsystem, a laser sensor unit as a global sensor subsystem, a parallel computer system with C167 microcontrollers, and a Pentium PC equipped additionally with an optical grabber. The microrobots can perform high-precision manipulations (with an accuracy of up to 10 nm) and nondestructive transport (at a speed of about 3 cm/sec) of very small objects under the microscope. To control the desktop station automatically, an advanced control system that includes a task planning level and a real-time execution level is being developed. The main function of the task planning subsystem is to interpret the implicit action plan and to generate a sequence of explicit operations which are sent to the execution level of the control system. The main functions of the execution control level are object recognition, image processing, and feedback position control of the microrobot and the microscope stage.
Video streaming in nursing education: bringing life to online education.
Smith-Stoner, Marilyn; Willer, Ann
2003-01-01
Distance education is a standard form of instruction for many colleges of nursing. Web-based course and program content has been delivered primarily through text-based presentations such as PowerPoint slides and Web search activities. However, the rapid pace of technological innovation is making available more sophisticated forms of delivery such as video streaming. High-quality video streams, created at the instructor's desktop or in basic recording studios, can be produced that build on PowerPoint or create new media for use on the Web. The technology required to design, produce, and upload short video-streamed course content objects to the Internet is described. The preparation of materials, suggested production guidelines, and examples of information presented via desktop video methods are presented.
Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.
2015-01-01
Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240
Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L
2015-08-01
Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Three Approaches to Green Computing on Campus
ERIC Educational Resources Information Center
Thompson, John T.
2009-01-01
A "carbon footprint" is the "total set of greenhouse gas emissions caused directly and indirectly by an (individual, event, organization, and product) expressed as CO2" emissions. Since CO2 emissions are indicative of energy use, the higher the associated CO2 emissions, typically the greater the associated costs. A typical desktop PC system…
Cross-Platform User Interface of E-Learning Applications
ERIC Educational Resources Information Center
Stoces, Michal; Masner, Jan; Jarolímek, Jan; Šimek, Pavel; Vanek, Jirí; Ulman, Miloš
2015-01-01
The paper discusses the development of Web educational services for specific groups. A key feature is to allow the display and use of educational materials and training services to the widest possible set of different devices, especially in the browser classic desktop computers, notebooks, tablets, mobile phones and also on different readers for…
Blocking of Goal-Location Learning Based on Shape
ERIC Educational Resources Information Center
Alexander, Tim; Wilson, Stuart P.; Wilson, Paul N.
2009-01-01
Using desktop, computer-simulated virtual environments (VEs), the authors conducted 5 experiments to investigate blocking of learning about a goal location based on Shape B as a consequence of preliminary training to locate that goal using Shape A. The shapes were large 2-dimensional horizontal figures on the ground. Blocking of spatial learning…
High Resolution Displays In The Apple Macintosh And IBM PC Environments
NASA Astrophysics Data System (ADS)
Winegarden, Steven
1989-07-01
High resolution displays are one of the key elements that distinguish user oriented document finishing or publishing stations. A number of factors have been involved in bringing these to the desktop environment. At Sigma Designs we have concentrated on enhancing the capabilities of IBM PCs and compatibles and Apple Macintosh computer systems.
Notebooks, Handhelds, and Software in Physical Education (Grades 5-8)
ERIC Educational Resources Information Center
Mohnsen, Bonnie
2005-01-01
Heart monitors, pedometers, and now virtual reality-based equipment (e.g., Cyberbikes, "Dance Dance Revolution") have been embraced by physical educators as technologies worth using in the physical education program; however, the use of computers (be it a desktop, notebook, or handheld) in the physical education instructional program, has not been…
Epilepsy Forewarning Using A Hand-Held Device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hively, LM
2005-02-21
Over the last decade, ORNL has developed and patented a novel approach for forewarning of a large variety of machine and biomedical events. The present implementation uses desktop computers to analyze archival data. This report describes the next logical step in this effort, namely use of a hand-held device for the analysis.
Full Immersive Virtual Environment Cave[TM] in Chemistry Education
ERIC Educational Resources Information Center
Limniou, M.; Roberts, D.; Papadopoulos, N.
2008-01-01
By comparing two-dimensional (2D) chemical animations designed for the computer desktop with three-dimensional (3D) chemical animations designed for the full immersive virtual reality environment CAVE[TM], we studied how virtual reality environments could raise students' interest and motivation for learning. By using the 3ds max[TM], we can visualize…
Reading, Writing, and Documentation and Managing the Development of User Documentation.
ERIC Educational Resources Information Center
Lindberg, Wayne; Hoffman, Terrye
1987-01-01
The first of two articles addressing the issue of user documentation for computer software discusses the need to teach users how to read documentation. The second presents a guide for writing documentation that is based on the instructional systems design model, and makes suggestions for the desktop publishing of user manuals. (CLB)
Learning Computer Hardware by Doing: Are Tablets Better than Desktops?
ERIC Educational Resources Information Center
Raven, John; Qalawee, Mohamed; Atroshi, Hanar
2016-01-01
In this world of rapidly evolving technologies, educational institutions often struggle to keep up with change. Change often requires a state of readiness at both the micro and macro levels. This paper looks at a tertiary institution that undertook a significant technology change initiative by introducing tablet based components for teaching a…
WLANs for the 21st Century Library
ERIC Educational Resources Information Center
Calamari, Cal
2009-01-01
As educational and research needs have changed, libraries have changed as well. They must meet ever-increasing demand for access to online media, subscriptions to archives, video, audio, and other content. The way a user/patron accesses this information has also changed. Gone are the days of a few hardwired desktops or computer carts. While…
Infrastructure Suitability Assessment Modeling for Cloud Computing Solutions
2011-09-01
Virtualization vs. Para-Virtualization. Figure 4. Modeling alternatives in relation to model...the conceptual difference between full virtualization and para-virtualization. Figure 3. Full Virtualization vs. Para-Virtualization. XEN...Besides Microsoft's own client implementations, dubbed "Remote Desktop Connection Client" for Windows® and Apple® operating systems, various open
Redesigning a Library Space for Collaborative Learning
ERIC Educational Resources Information Center
Gabbard, Ralph B.; Kaiser, Anthony; Kaunelis, David
2007-01-01
The reference desk at Indiana State University's (ISU) library offers an excellent view of student work areas on the first floor. From this vantage point, the reference librarians noticed students, especially in the evening and on weekends, huddled together in small groups, with one student at the keyboard of a laptop or desktop computer. The…
The Promise of Zoomable User Interfaces
ERIC Educational Resources Information Center
Bederson, Benjamin B.
2011-01-01
Zoomable user interfaces (ZUIs) have received a significant amount of attention in the 18 years since they were introduced. They have enjoyed some success, and elements of ZUIs are widely used in computers today, although the grand vision of a zoomable desktop has not materialised. This paper describes the premise and promise of ZUIs along with…
Growth-simulation model for lodgepole pine in central Oregon.
Walter G. Dahms
1983-01-01
A growth-simulation model for central Oregon lodgepole pine (Pinus contorta Dougl.) has been constructed by combining data from temporary and permanent sample plots. The model is similar to a conventional yield table with the added capacity for dealing with the stand-density variable. The simulator runs on a desk-top computer.
The Influence of Textual Cues on First Impressions of an Email Sender
ERIC Educational Resources Information Center
Marlow, Shannon L.; Lacerenza, Christina N.; Iwig, Chelsea
2018-01-01
The present study experimentally manipulated the gender of an email sender, closing salutation, and sending mode (i.e., email sent via desktop computer/laptop as compared with email sent via a mobile device) to determine if these specific cues influence first impressions of the sender's competence, professionalism, positive affect, and negative…
ERIC Educational Resources Information Center
Bucknall, Ruary
1996-01-01
Overview of the interactive technologies used by the Northern Territory Secondary Correspondence School in Australia: print media utilizing desktop publishing and electronic transfer; telephone or H-F radio; interactive television; and interactive computing. More fully describes its interactive CD-ROM courses. Emphasizes that the programs are…
From Floppies to Flash--Your Guide to Removable Media
ERIC Educational Resources Information Center
Berdinka, Matthew J.
2005-01-01
Technology that once involved a scary, mysterious machine the size of a small house now fits on desktops and commonly appears in offices, schools, and homes. Computers allow for processing, storing and transmitting data between two or more people virtually anywhere in the world. They also allow users to save documents, presentations, photos and…
Speckle interferometry using fiber optic phase stepping
NASA Technical Reports Server (NTRS)
Mercer, Carolyn R.; Beheim, Glenn
1989-01-01
A system employing closed-loop phase-stepping is used to measure the out-of-plane deformation of a diffusely reflecting object. Optical fibers are used to provide reference and object beam illumination for a standard two-beam speckle interferometer, providing set-up flexibility and ease of alignment. Piezoelectric fiber-stretchers and a phase-measurement/servo system are used to provide highly accurate phase steps. Intensity data are captured with a charge-injection-device camera and converted into a phase map using a desktop computer. The closed-loop phase-stepping system provides 90 deg phase steps which are accurate to 0.02 deg, greatly improving this system relative to open-loop interferometers. The system is demonstrated on a speckle interferometer, measuring the rigid-body translation of a diffusely reflecting object with an accuracy of ±10 deg, or roughly ±15 nanometers. This accuracy is achieved without the use of a pneumatically mounted optics table.
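The abstract does not state which phase-recovery algorithm the desktop computer runs, so the sketch below uses the textbook four-step variant for 90 deg steps: with samples I_k = A + B·cos(φ + (k−1)·90°), the differences I4 − I2 = 2B·sin φ and I1 − I3 = 2B·cos φ give φ = atan2(I4 − I2, I1 − I3) per pixel.

```python
from math import atan2, cos, pi

def phase_from_steps(i1, i2, i3, i4):
    """Four-step phase recovery for 90 deg steps,
    assuming I_k = A + B*cos(phi + (k-1)*pi/2)."""
    return atan2(i4 - i2, i1 - i3)

# Synthetic fringe samples with background A, modulation B, and a known phase.
A, B, phi = 2.0, 1.0, 0.7
samples = [A + B * cos(phi + k * pi / 2) for k in range(4)]
recovered = phase_from_steps(*samples)  # recovers phi = 0.7 (modulo 2*pi)
```

This also shows why the paper's closed-loop servo matters: any error in the nominal 90 deg steps biases the arctangent directly, which is what limits open-loop interferometers.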
Transformation of personal computers and mobile phones into genetic diagnostic systems.
Walker, Faye M; Ahmad, Kareem M; Eisenstein, Michael; Soh, H Tom
2014-09-16
Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone--devices that have become readily accessible in developing countries--into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite.
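The core trick above is software control of CPU temperature, and the control decision itself can be sketched as a simple bang-bang rule: load the CPU to heat toward a setpoint, idle it to cool. Everything below is a hypothetical illustration (the function name and tolerances are ours, not the authors' software), and the real system acts on the physical CPU rather than returning strings.

```python
def thermal_cycle_step(temp_c, setpoint_c, tol_c=0.5):
    """Bang-bang control decision for one CPU-temperature sample:
    'heat' (load the CPU) below the setpoint band, 'cool' (idle it) above,
    'hold' inside the band."""
    if temp_c < setpoint_c - tol_c:
        return "heat"
    if temp_c > setpoint_c + tol_c:
        return "cool"
    return "hold"

# A conventional PCR cycle visits three setpoints: denature, anneal, extend.
PCR_SETPOINTS_C = [95.0, 55.0, 72.0]
```

Each PCR cycle would step the setpoint through this schedule, with the controller polling the CPU temperature sensor between decisions.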
3D Printing of Preoperative Simulation Models of a Splenic Artery Aneurysm: Precision and Accuracy.
Takao, Hidemasa; Amemiya, Shiori; Shibata, Eisuke; Ohtomo, Kuni
2017-05-01
Three-dimensional (3D) printing is attracting increasing attention in the medical field. This study aimed to apply 3D printing to the production of hollow splenic artery aneurysm models for use in the simulation of endovascular treatment, and to evaluate the precision and accuracy of the simulation model. From 3D computed tomography (CT) angiography data of a splenic artery aneurysm, 10 hollow models reproducing the vascular lumen were created using a fused deposition modeling-type desktop 3D printer. After filling with water, each model was scanned using T2-weighted magnetic resonance imaging for the evaluation of the lumen. All images were coregistered, binarized, and then combined to create an overlap map. The cross-sectional area of the splenic artery aneurysm and its standard deviation (SD) were calculated perpendicular to the x- and y-axes. Most voxels overlapped among the models. The cross-sectional areas were similar among the models, with SDs <0.05 cm². The mean cross-sectional areas of the splenic artery aneurysm were slightly smaller than those calculated from the original mask images. The maximum mean cross-sectional areas calculated perpendicular to the x- and y-axes were 3.90 cm² (SD, 0.02) and 4.33 cm² (SD, 0.02), whereas those calculated from the original mask images were 4.14 cm² and 4.66 cm², respectively. The mean cross-sectional areas of the afferent artery were, however, almost the same as those calculated from the original mask images. The results suggest that 3D simulation modeling of a visceral artery aneurysm using a fused deposition modeling-type desktop 3D printer and computed tomography angiography data is highly precise and accurate. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
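The precision metric used above — per-slice cross-sectional area and its SD across repeated printed models — reduces to counting lumen voxels per slice. A minimal sketch, assuming binarized masks and an in-plane voxel area chosen purely for illustration:

```python
import numpy as np

VOXEL_AREA_CM2 = 0.01  # assumed in-plane voxel area, for illustration only

def cross_section_stats(masks):
    """Per-slice area mean and SD across repeated binarized models.

    masks: array of shape (n_models, n_slices, h, w) holding 0/1 lumen masks.
    Returns (mean_area, sd_area) per slice, in cm^2.
    """
    areas = masks.sum(axis=(2, 3)) * VOXEL_AREA_CM2  # shape (n_models, n_slices)
    return areas.mean(axis=0), areas.std(axis=0, ddof=1)

# Ten identical synthetic models -> identical areas, zero spread.
m = np.zeros((10, 4, 8, 8))
m[:, :, 2:6, 2:6] = 1
mean_a, sd_a = cross_section_stats(m)
```

With real data the masks would come from coregistered, binarized MR slices, and the SD per slice is the precision figure reported in the abstract.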
Liver Volumetry Plug and Play: Do It Yourself with ImageJ
Dello, Simon A. W. G.; van Dam, Ronald M.; Slangen, Jules J. G.; van de Poll, Marcel C. G.; Bemelmans, Marc H. A.; Greve, Jan Willem W. M.; Beets-Tan, Regina G. H.; Wigmore, Stephen J.
2007-01-01
Background A small remnant liver volume is an important risk factor for posthepatectomy liver failure and can be predicted accurately by computed tomography (CT) volumetry using radiologic image analysis software. Unfortunately, this software is expensive and usually requires support by a radiologist. ImageJ is a freely downloadable image analysis software package developed by the National Institutes of Health (NIH) that brings liver volumetry to the surgeon’s desktop. We aimed to assess the accuracy of ImageJ for hepatic CT volumetry. Methods ImageJ was downloaded from http://www.rsb.info.nih.gov/ij/. Preoperative CT scans of 15 patients who underwent liver resection for colorectal cancer liver metastases were retrospectively analyzed. Scans were opened in ImageJ; and the liver, all metastases, and the intended parenchymal transection line were manually outlined on each slice. The area of each selected region, metastasis, resection specimen, and remnant liver was multiplied by the slice thickness to calculate volume. Volumes of virtual liver resection specimens measured with ImageJ were compared with specimen weights and calculated volumes obtained during pathology examination after resection. Results There was an excellent correlation between the volumes calculated with ImageJ and the actual measured weights of the resection specimens (r² = 0.98, p < 0.0001). The weight/volume ratio amounted to 0.88 ± 0.04 (standard error) and was in agreement with our earlier findings using CT-linked radiologic software. Conclusion ImageJ can be used for accurate hepatic CT volumetry on a personal computer. This application brings CT volumetry to the surgeon’s desktop at no expense and is particularly useful in cases of tertiary referred patients, who already have a proper CT scan on CD-ROM from the referring institution. Most likely the discrepancy between volume and weight results from exsanguination of the liver after resection. PMID:17726630
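The volumetry step described in Methods is simply summed slice areas times slice thickness. A minimal sketch with invented numbers (the 1760 g specimen weight below is synthetic, chosen only to reproduce the ~0.88 weight/volume ratio reported):

```python
def liver_volume_ml(slice_areas_cm2, thickness_cm):
    """ImageJ-style CT volumetry: outlined area per slice times slice thickness.

    1 cm^3 == 1 mL, so the result is directly comparable to specimen weight in g.
    """
    return sum(slice_areas_cm2) * thickness_cm

# Synthetic example: 40 slices of 100 cm^2 at 0.5 cm spacing -> 2000 mL.
vol = liver_volume_ml([100.0] * 40, 0.5)
ratio = 1760.0 / vol  # hypothetical specimen weight (g) / volume (mL)
```

In practice the per-slice areas come from the manually outlined regions in ImageJ, and the weight/volume ratio below 1.0 reflects exsanguination of the resected liver.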
NASA Astrophysics Data System (ADS)
van Leunen, J. A. J.; Dreessen, J.
1984-05-01
The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automation of data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were powerful enough, however, to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further, it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically but they can also be typed in by hand. Due to the automation we are able to implement proper archival of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two.
This was due to the much better reproducibility of the automatic optimization, which resulted in better reproducibility of the measurement result. Another advantage of the automation is that the programs that control the data handling and the automatic measurement are "user friendly". They guide the operator through the measuring procedure using information from earlier measurements of equivalent test specimens. This makes it possible to let routine measurements be done by much less skilled assistants. It also removes much of the tedious routine labour normally involved in MTF measurements. It can be concluded that automation of MTF measurements as described in the foregoing enhances the usefulness of MTF results and reduces the cost of MTF measurements.
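The archival principle argued above — never file a result without its measuring conditions — can be sketched with a stdlib append-only JSON Lines record. All field names here are hypothetical illustrations, not the authors' actual schema:

```python
import json
import datetime

def archive_mtf_result(path, conditions, mtf_values):
    """File an MTF result together with all its measuring conditions,
    so the record remains interpretable later (the abstract's key point)."""
    record = {
        "timestamp": datetime.datetime.now().isoformat(timespec="seconds"),
        "conditions": conditions,   # e.g. wavelength, field angle, focus state
        "mtf": mtf_values,          # (spatial frequency, modulation) pairs
    }
    with open(path, "a") as f:      # append-only: earlier records are kept
        f.write(json.dumps(record) + "\n")
    return record

# Hypothetical example record.
rec = archive_mtf_result(
    "/tmp/mtf_archive.jsonl",
    {"wavelength_nm": 546, "field_deg": 0.0, "orientation": "sagittal"},
    [[10, 0.92], [20, 0.81], [40, 0.55]],
)
```

One record per line keeps the archive both machine-readable and human-inspectable, which matches the uniform-filing goal the abstract describes.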
NASA Technical Reports Server (NTRS)
Smith, Jeffrey
2003-01-01
The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects.
The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force-feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of Space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-system interaction in a desktop-sized work volume.
Influence of direct computer experience on older adults' attitudes toward computers.
Jay, G M; Willis, S L
1992-07-01
This research examined whether older adults' attitudes toward computers became more positive as a function of computer experience. The sample comprised 101 community-dwelling older adults aged 57 to 87. The intervention involved a 2-week computer training program in which subjects learned to use a desktop publishing software program. A multidimensional computer attitude measure was used to assess differential attitude change and maintenance of change following training. The results indicated that older adults' computer attitudes are modifiable and that direct computer experience is an effective means of change. Attitude change as a function of training was found for the attitude dimensions targeted by the intervention program: computer comfort and efficacy. In addition, maintenance of attitude change was established for at least two weeks following training.
Desktop Publishing Choices: Making an Appropriate Decision.
ERIC Educational Resources Information Center
Crawford, Walt
1991-01-01
Discusses various choices available for desktop publishing systems. Four categories of software are described, including advanced word processing, graphics software, low-end desktop publishing, and mainstream desktop publishing; appropriate hardware is considered; and selection guidelines are offered, including current and future publishing needs,…
Integrated IMA (Information Mission Areas) IC (Information Center) Guide
1989-06-01
Excerpted contents include: computer aided design/computer aided manufacture; liquid crystal display panels; artificial intelligence applied to VI; desktop publishing; intelligent copiers; electronic alternatives to printed documents; electronic forms; optical disk storage; LCD units; image scanners; graphics software; forms generation; output devices; and work group copiers.
CERN's Common Unix and X Terminal Environment
NASA Astrophysics Data System (ADS)
Cass, Tony
The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix based Interactive Computing. The CUTE architecture relies on a distributed filesystem—currently Transarc's AFS—to enable essentially interchangeable client workstations to access both "home directory" and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture.
Data Transfers Among the HP-75, HP-86, and HP-9845 Microcomputers.
1983-01-01
AD-A139 438: Data Transfers Among the HP-75, HP-86, and HP-9845 Microcomputers (U), Air Force Inst. of Tech., Wright-Patterson AFB OH, D. P. Connor, 1983... hereafter called the "75") and the HP-86 (hereafter called the "86"). The computers are to be used for classroom instruction and research at SOC. On... the main campus another Hewlett-Packard desktop computer, the HP-9845 (hereafter called the "45"), is already in use; it controls and processes data
2006-09-01
required directional control for each thruster due to their high precision and equivalent power and computer interface requirements to those for the...Universal Serial Bus) ports, LPT (Line Printing Terminal) and KVM (Keyboard-Video- Mouse) interfaces. Additionally, power is supplied to the computer through...of the IDE cable to the Prometheus Development Kit ACC-IDEEXT. Connect a small drive power connector from the desktop ATX power supply to the ACC
Video control system for a drilling in furniture workpiece
NASA Astrophysics Data System (ADS)
Khmelev, V. L.; Satarov, R. N.; Zavyalova, K. V.
2018-05-01
Over the last 5 years, Russian industry has been moving toward robotization, which has presented scientific groups with new tasks. One of these new tasks is machine vision systems to solve the problem of automatic quality control. Systems of this type cost several thousand dollars each, a price out of reach for regional small businesses. In this article, we describe the principle and algorithm of an inexpensive video control system that uses web cameras and a notebook or desktop computer as its computing unit.
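A drilled-hole quality check of the kind this system performs can be sketched with plain thresholding: a hole appears as a dark disc in the camera frame, and its measured radius is compared with the expected drill radius. This is a generic illustration with invented thresholds, not the authors' algorithm:

```python
import numpy as np

def hole_ok(gray, cx, cy, r_expect, tol=0.25, dark=80):
    """Sketch of a web-camera drill check: threshold the dark pixels in a
    window around the expected hole center and compare the implied radius
    with the expected drill radius (tol and dark are assumed values)."""
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    window = (xx - cx) ** 2 + (yy - cy) ** 2 <= (2 * r_expect) ** 2
    area = np.count_nonzero((gray < dark) & window)
    r_measured = np.sqrt(area / np.pi)
    return abs(r_measured - r_expect) / r_expect <= tol

# Synthetic frame: bright workpiece with one dark hole of radius 6 px at (30, 20).
frame = np.full((48, 64), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:48, 0:64]
frame[(xx - 30) ** 2 + (yy - 20) ** 2 <= 36] = 20
```

A production system would add camera capture, lighting compensation, and per-hole position data from the drilling program; the area-versus-expectation test is the core idea.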
The Automated Geospatial Watershed Assessment (AGWA) tool is a desktop application that uses widely available standardized spatial datasets to derive inputs for multi-scale hydrologic models (Miller et al., 2007). The required data sets include topography (DEM data), soils, clima...
77 FR 13061 - Electronic Reporting of Toxics Release Inventory Data
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-05
...--Reporting Year SIC--Standard Industrial Code TRI--Toxics Release Inventory TRI-ME--TRI-Made Easy Desktop... EPA to ``publish a uniform toxic chemical release form for facilities covered'' by the TRI Program. 42... practicable. Similarly, EPA's Cross-Media Electronic Reporting Regulation (CROMERR) (40 CFR Part 3), published...
Computer virus information update CIAC-2301
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orvis, W.J.
1994-01-15
While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's Virus database.
NASA Astrophysics Data System (ADS)
Tibi, R.; Young, C. J.; Gonzales, A.; Ballard, S.; Encarnacao, A. V.
2016-12-01
The matched filtering technique involving the cross-correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive, and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an Approximate Nearest Neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation without requiring a complex distributed computing system. Our method begins with a projection into a reduced dimensionality space based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors is accomplished by using randomized K-dimensional trees. We used the approach to search for matches to each of 2700 analyst-reviewed signal detections reported for May 2010 for the IMS station MKAR. The template library in this case consists of a dataset of more than 200,000 analyst-reviewed signal detections for the same station from 2002-2014 (excluding May 2010). Of these signal detections, 60% are teleseismic first P, and 15% regional phases (Pn, Pg, Sn, and Lg). The analyses performed on a standard desktop computer show that the proposed approach performs the search of the large template libraries about 20 times faster than the standard full linear search, while achieving recall rates greater than 80%, with the recall rate increasing for higher correlation values. To decide whether to confirm a match, we use a hybrid method involving a cluster approach for queries with two or more matches, and the correlation score for single matches. Of the signal detections that passed our confirmation process, 52% were teleseismic first P, and 30% were regional phases.
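The core of the approach — embed every waveform by its correlation with a random subset of templates, then search neighbors in that reduced space — can be sketched as follows. For brevity a brute-force nearest-neighbor search stands in for the randomized k-d trees the study actually uses; all sizes and seeds are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(x):
    """Zero-mean, unit-norm rows, so dot products are correlation coefficients."""
    x = x - x.mean(axis=-1, keepdims=True)
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def project(waves, basis):
    """Embed waveforms by their correlation with a random template subset."""
    return normalize(waves) @ normalize(basis).T

# Toy template library: 500 random waveforms; 16 of them form the projection basis.
library = rng.standard_normal((500, 128))
basis = library[rng.choice(500, 16, replace=False)]
emb = project(library, basis)

def nearest(query):
    """Brute-force nearest neighbor in the reduced space (the paper uses
    randomized k-d trees here); returns the best-matching template index."""
    d = np.linalg.norm(emb - project(query[None, :], basis), axis=1)
    return int(d.argmin())

# A noisy copy of template 42 should still map back to template 42.
noisy = library[42] + 0.1 * rng.standard_normal(128)
```

The speedup in the study comes from searching the 16-dimensional embeddings instead of correlating against 200,000 full-length templates.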
Feedback about Astronomical Application Developments for Mobile Devices
NASA Astrophysics Data System (ADS)
Schaaff, A.; Boch, T.; Fernique, P.; Houpin, R.; Kaestlé, V.; Royer, M.; Scheffmann, J.; Weiler, A.
2013-10-01
Within a few years, Smartphones have become the standard for mobile telephony, and we are now witnessing a rapid development of Internet tablets. These mobile devices have enough powerful hardware features to run more and more complex applications. In the field of astronomy it is not only possible to use these tools to access data via a simple browser, but also to develop native applications reusing libraries (Java for Android, Objective-C for iOS) developed for desktops. We have been working for two years on mobile application development and we now have the skills in native iOS and Android development, Web development (especially HTML5, JavaScript, CSS3) and conversion tools (PhoneGap) from Web development to native applications. The biggest change comes from human-computer interaction, which is radically altered by the use of multitouch. This interaction requires a redesign of interfaces to take advantage of new features (simultaneous selections in different parts of the screen, etc.). In the case of native applications, the distribution is usually done through online stores (App Store, Google Play, etc.) which gives visibility to a wider audience. Our approach is not only to test hardware and develop prototypes, but also to build operational applications. Native application development is costly in development time, but the possibilities are broader because it is possible to use native hardware such as the gyroscope and the accelerometer, for instance to point out an object in the sky. Web development depends on the browser, and rendering and performance often differ considerably between browsers. It is also possible to convert Web developments to native applications, but currently it is better to restrict this possibility to light applications in terms of functionality. Developments in HTML5 are promising but are far behind those available on desktops.
HTML5 has the advantage of allowing development independent from the evolution of the mobile platforms (“write once, run everywhere”). The upcoming Windows 8 support on desktops and Internet tablets as well as a mobile version for smartphones will further expand the native systems family. This will enhance the interest of Web development.
Using Desk-Top Publishing to Develop Literacy.
ERIC Educational Resources Information Center
Wray, David; Medwell, Jane
1989-01-01
Examines the learning benefits which may accrue from using desk-top publishing techniques with children, especially in terms of the development of literacy skills. Analyzes desk-top publishing as an extension of word processing and describes some ways of using desk-top publishing in the classroom. (RS)
A Fine-Tuned Look at White Space Variation in Desktop Publishing.
ERIC Educational Resources Information Center
Knupfer, Nancy Nelson; McIsaac, Marina Stock
This investigation of the use of white space in print-based, computer-generated text focused on the point at which the white space interferes with reading speed and comprehension. It was hypothesized that reading speed and comprehension would be significantly greater when text was wrapped tightly around the graphic than when it had one-half inch…
Simulation Packages Expand Aircraft Design Options
NASA Technical Reports Server (NTRS)
2013-01-01
In 2001, NASA released a new approach to computational fluid dynamics that allows users to perform automated analysis on complex vehicle designs. In 2010, Palo Alto, California-based Desktop Aeronautics acquired a license from Ames Research Center to sell the technology. Today, the product assists organizations in the design of subsonic aircraft, space planes, spacecraft, and high-speed commercial jets.
iChat[TM] Do You? Using Desktop Web Conferencing in Education
ERIC Educational Resources Information Center
Bell, Randy L.; Garofalo, Joe
2006-01-01
Videoconferencing is not a new technology; it has been widely used in educational settings since the mid-1980s. Through the integration of personal computers, it has evolved into what is now referred to as Web conferencing. In the mid-1990s, Internet Protocol (IP) was introduced into the mainstream but the educational community has…
3D Printing in Technology and Engineering Education
ERIC Educational Resources Information Center
Martin, Robert L.; Bowden, Nicholas S.; Merrill, Chris
2014-01-01
In the past five years, there has been tremendous growth in the production and use of desktop 3D printers. This growth has been driven by the increasing availability of inexpensive computing and electronics technologies. The ability to rapidly share ideas and intelligence over the Internet has also played a key role in the growth. Growth is also…
The Role of Theory and Technology in Learning Video Production: The Challenge of Change
ERIC Educational Resources Information Center
Shewbridge, William; Berge, Zane L.
2004-01-01
The video production field has evolved beyond being exclusively relevant to broadcast television. The convergence of low-cost consumer cameras and desktop computer editing has led to new applications of video in a wide range of areas, including the classroom. This presents educators with an opportunity to rethink how students learn video…
ERIC Educational Resources Information Center
Gamble-Risley, Michelle
2006-01-01
In the past, projection systems were large, heavy, and unwieldy and cost $3,000 to $5,000. Setup was fraught with the challenges of multiple wires plugged into the backs of desktop computers, often causing confusion about what went where. Systems were sometimes so difficult to set up that teachers had to spend pre-class time putting them together.…
Training Learners to Use Quizlet Vocabulary Activities on Mobile Phones in Vietnam with Facebook
ERIC Educational Resources Information Center
Tran, Phuong
2016-01-01
Mobile phone ownership among university students in Vietnam has reached almost 100%, exceeding that of Internet-capable desktop computers. This has made them increasingly popular to allow learners to carry out learning activities outside of the classroom, but some studies have suggested that learners are not always willing to engage in activities…
CALLing All Foreign Language Teachers: Computer-Assisted Language Learning in the Classroom
ERIC Educational Resources Information Center
Erben, Tony, Ed.; Sarieva, Iona, Ed.
2008-01-01
This book is a comprehensive guide to help foreign language teachers use technology in their classrooms. It offers the best ways to integrate technology into teaching for student-centered learning. CALL Activities include: Email; Building a Web site; Using search engines; Powerpoint; Desktop publishing; Creating sound files; iMovie; Internet chat;…
Teacher Associations: Extending Our Advocacy Reach
ERIC Educational Resources Information Center
Boitnott, Kitty
2012-01-01
School library media specialists have seen more fundamental changes in their jobs and in their roles within their schools than any other group of education professionals. The author started her first job as a school librarian when there were no desktop computers, and card catalogs were the order of the day. Over the course of the thirty-seven…
Programs for road network planning.
Ward W. Carson; Dennis P. Dykstra
1978-01-01
This paper describes four computer programs developed to assist logging engineers to plan transportation in a forest. The objective of these programs, to be used together, is to find the shortest path through a transportation network from a point of departure to a destination. Three of the programs use the digitizing and plotting capabilities of a programmable desk-top...
Fire characteristics charts for fire behavior and U.S. fire danger rating
Faith Ann Heinsch; Pat Andrews
2010-01-01
The fire characteristics chart is a graphical method of presenting U.S. National Fire Danger Rating indices or primary surface or crown fire behavior characteristics. A desktop computer application has been developed to produce fire characteristics charts in a format suitable for inclusion in reports and presentations. Many options include change of scales, colors,...
ERIC Educational Resources Information Center
Fehn, Bruce; Johnson, Melanie; Smith, Tyson
2010-01-01
Elementary and secondary school history students demonstrate a great deal of enthusiasm for making documentary films. With free and easy-to-use software, as well as vast online, archival resources containing images and sounds, students can sit at a computer and make serious and engaging documentary productions. With students affectively engaged by…
Designing a Mobile Training System in Rural Areas with Bayesian Factor Models
ERIC Educational Resources Information Center
Omidi Najafabadi, Maryam; Mirdamadi, Seyed Mehdi; Payandeh Najafabadi, Amir Teimour
2014-01-01
The facts that the wireless technologies (1) are more convenient; and (2) need less skill than desktop computers, play a crucial role to decrease digital gap in rural areas. This study employed the Bayesian Confirmatory Factor Analysis (CFA) to design a mobile training system in rural areas of Iran. It categorized challenges, potential, and…
ERIC Educational Resources Information Center
Lawless-Reljic, Sabine Karine
2010-01-01
Growing interest of educational institutions in desktop 3D graphic virtual environments for hybrid and distance education prompts questions on the efficacy of such tools. Virtual worlds, such as Second Life[R], enable computer-mediated immersion and interactions encompassing multimodal communication channels including audio, video, and text…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-07
..., Winston-Salem, North Carolina. The notice was published in the Federal Register on April 23, 2010 (75 FR... from Staffing Solutions, South East, and Omni Resources and Recovery. The notices were published in the... firm. The workers are engaged in employment related to the production of desktop computers. New...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-14
..., APN and ICONMA, Winston-Salem, North Carolina. The notice was published in the Federal Register on... Seaton Corporation. The notices were published on the Federal Register on April 19, 2010 (75 FR 20385... workers are engaged in employment related to the production of desktop computers. New information shows...
Ethnography at a Distance: Globally Mobile Parents Choosing International Schools
ERIC Educational Resources Information Center
Forsey, Martin; Breidenstein, Georg; Krüger, Oliver; Roch, Anna
2015-01-01
The research we report on was conducted from our computer desktops. We have not met the people we have studied; they are part of what Eichhorn described as a "textual community", gathered around the threads of online conversations associated with a website servicing the needs of English-language speakers in Germany. The thread in…
When Neurons Meet Electrons: Three Trends That Are Sparking Change in Computer Publishing.
ERIC Educational Resources Information Center
Cranney, Charles
1992-01-01
Three important trends in desktop publishing include (1) use of multiple media in presentation of information; (2) networking; and (3) "hot links" (integrated file-exchange formats). It is also important for college publications professionals to be familiar with sources of information about technological change and to be able to sort out the…
ERIC Educational Resources Information Center
Anderson, Mary Alice, Ed.
This notebook is a compilation of 53 lesson plans for grades 6-12, written by various authors and focusing on the integration of technology into the curriculum. Lesson plans include topics such as online catalog searching, electronic encyclopedias, CD-ROM databases, exploring the Internet, creating a computer slide show, desktop publishing, and…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-04
... desktop computers and mobile devices, we expect significant innovation to continue in the provision of... inside of buildings.'' 47. Discussion. Publicly available reports, such as a March 2011 study from J. D... installed Wi-Fi access points, and a growing number of mobile devices (e.g., smartphones, laptops, and...
An Ethical Dilemma: Talking about Plagiarism and Academic Integrity in the Digital Age
ERIC Educational Resources Information Center
Thomas, Ebony Elizabeth; Sassi, Kelly
2011-01-01
Today, many students not only access the Internet through desktop and laptop computers at home or at school but also have copious amounts of information at their fingertips via portable devices (e.g., iPods, iPads, netbooks, smartphones). While some teachers welcome the proliferation of portable technologies and easy wireless Internet access, and…
ERIC Educational Resources Information Center
Goins, L. Keith, Ed.
This proceedings includes the following papers: "Dealing with Discipline Problems in Schools" (Allen); "Developing Global Awareness" (Arnold); "Desktop Publishing Using WordPerfect 6.0 for Windows" (Broughton); "Learn and Earn" (Cauley); "Using the Computer to Teach Merchandising Math"…
Code White: A Signed Code Protection Mechanism for Smartphones
2010-09-01
...analogous to computer security is the use of antivirus (AV) software. AV software is a brute force approach to security. ...these users, numerous malicious programs have also surfaced. And while smartphones have desktop-like capabilities to execute software, they do not...
Wong, J.; Göktepe, S.; Kuhl, E.
2014-01-01
Computational modeling of the human heart allows us to predict how chemical, electrical, and mechanical fields interact throughout a cardiac cycle. Pharmacological treatment of cardiac disease has advanced significantly over the past decades, yet it remains unclear how the local biochemistry of an individual heart cell translates into global cardiac function. Here we propose a novel, unified strategy to simulate excitable biological systems across three biological scales. To discretize the governing chemical, electrical, and mechanical equations in space, we propose a monolithic finite element scheme. We apply a highly efficient and inherently modular global-local split, in which the deformation and the transmembrane potential are introduced globally as nodal degrees of freedom, while the chemical state variables are treated locally as internal variables. To ensure unconditional algorithmic stability, we apply an implicit backward Euler finite difference scheme to discretize the resulting system in time. To increase algorithmic robustness and guarantee optimal quadratic convergence, we suggest an incremental iterative Newton-Raphson scheme. The proposed algorithm allows us to simulate the interaction of chemical, electrical, and mechanical fields during a representative cardiac cycle on a patient-specific geometry, robustly and stably, with calculation times on the order of four days on a standard desktop computer. PMID:23798328
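The time integration the abstract describes (implicit backward Euler with a Newton-Raphson solve at each step) can be sketched on a one-variable toy problem. All names below are illustrative assumptions; the paper's monolithic finite element scheme couples chemical, electrical, and mechanical fields and is not reproduced here.

```python
# Minimal sketch: backward Euler time stepping with a Newton-Raphson
# solve at every step, reduced to a single scalar ODE.

def backward_euler_newton(f, dfdx, x0, dt, steps, tol=1e-12):
    """Integrate x' = f(x); each step solves
    g(x) = x - x_prev - dt*f(x) = 0 by Newton-Raphson."""
    x = x0
    for _ in range(steps):
        x_prev = x
        x_new = x_prev                      # initial Newton guess
        for _ in range(50):
            g = x_new - x_prev - dt * f(x_new)
            dg = 1.0 - dt * dfdx(x_new)     # dg/dx
            dx = -g / dg
            x_new += dx
            if abs(dx) < tol:
                break
        x = x_new
    return x

# Toy relaxation x' = -x: the implicit scheme damps it stably for any
# step size, giving x_n = (1/(1+dt))**n exactly for this linear case.
x_end = backward_euler_newton(lambda x: -x, lambda x: -1.0, 1.0, 0.1, 100)
```

For the linear test problem Newton converges in one iteration per step, which is the quadratic-convergence behavior the abstract claims in miniature.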
Basics of Desktop Publishing. Second Edition.
ERIC Educational Resources Information Center
Beeby, Ellen; Crummett, Jerrie
This document contains teacher and student materials for a basic course in desktop publishing. Six units of instruction cover the following: (1) introduction to desktop publishing; (2) desktop publishing systems; (3) software; (4) type selection; (5) document design; and (6) layout. The teacher edition contains some or all of the following…
Desktop Publishing: A Brave New World and Publishing from the Desktop.
ERIC Educational Resources Information Center
Lormand, Robert; Rowe, Jane J.
1988-01-01
The first of two articles presents basic selection criteria for desktop publishing software packages, including discussion of expectations, required equipment, training costs, publication size, desired software features, additional equipment needed, and quality control. The second provides a brief description of desktop publishing using the Apple…
NASA Astrophysics Data System (ADS)
Roccatello, E.; Nozzi, A.; Rumor, M.
2013-05-01
This paper illustrates the key concepts behind the design and the development of a framework, based on OGC services, capable of visualizing 3D large scale geospatial data streamed over the web. WebGISes are traditionally bound to a bi-dimensional simplified representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, a lot of effort is still needed to reach desktop GIS features, like 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, allowing traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of a plug-in free OGC CityGML viewer. The resulting framework has also been integrated in existing 3D GIS software products and will be made available in the next months.
Flexible workflow sharing and execution services for e-scientists
NASA Astrophysics Data System (ADS)
Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely
2013-04-01
The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides based on the platform to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: A database where workflows and meta-data about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: A web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: A desktop environment that provides access capabilities similar to those of the SHIWA Portal; however, it runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already integrated with the execution engine of the SHIWA Portal. Other engines can be added when required. Through the SHIWA Portal one can define and run simulations on the SHIWA Virtual Organisation, an e-infrastructure that gathers computing and data resources from various DCIs, including the European Grid Infrastructure. The Portal, via third party workflow engines, provides support for the most widely used academic workflow engines, and it can be extended with other engines on demand. Such extensions translate between workflow languages and facilitate the nesting of workflows into larger workflows even when those are written in different languages and require different interpreters for execution. Through the workflow repository and the portal, individual scientists and scientific collaborations can share and offer workflows for reuse and execution. Given the integrated nature of the SHIWA Simulation Platform, the shared workflows can be executed online, without installing any special client environment or downloading workflows. The FP7 "Building a European Research Community through Interoperable Workflows and Data" (ER-flow) project disseminates the achievements of the SHIWA project and uses these achievements to build workflow user communities across Europe. ER-flow provides application support to research communities within and beyond the project consortium to develop, share and run workflows with the SHIWA Simulation Platform.
Tibi, Rigobert; Young, Christopher; Gonzales, Antonio; ...
2017-07-04
The matched filtering technique that uses the cross correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this paper, we introduce an approximate nearest neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation. Our method begins with a projection into a reduced dimensionality space, based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors for a query waveform is accomplished by iteratively comparing it with the neighbors of its immediate neighbors. We used the approach to search for matches to each of ~2300 analyst-reviewed signal detections reported in May 2010 for the International Monitoring System station MKAR. The template library in this case consists of a data set of more than 200,000 analyst-reviewed signal detections for the same station from February 2002 to July 2016 (excluding May 2010). Of these signal detections, 73% are teleseismic first P and 17% regional phases (Pn, Pg, Sn, and Lg). Finally, the analyses performed on a standard desktop computer show that the proposed ANN approach performs a search of the large template libraries about 25 times faster than the standard full linear search and achieves recall rates greater than 80%, with the recall rate increasing for higher correlation thresholds.
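The projection step described above (mapping each waveform to its correlations with a randomized subset of the template archive) can be sketched as follows. All names are assumptions, and the paper's iterative neighbors-of-neighbors search is replaced here by a plain nearest-neighbor lookup for brevity.

```python
# Sketch: each waveform becomes a short feature vector of peak
# normalized cross-correlations against a small random reference set;
# neighbor search then happens in that low-dimensional space.
import numpy as np

def normalize(w):
    w = w - w.mean()
    return w / (np.linalg.norm(w) + 1e-12)

def project(waveforms, ref_templates):
    """Feature vector = max cross-correlation with each reference."""
    refs = [normalize(t) for t in ref_templates]
    return np.array([[np.max(np.correlate(normalize(w), t, mode="full"))
                      for t in refs] for w in waveforms])

rng = np.random.default_rng(0)
library = [rng.standard_normal(100) for _ in range(50)]
refs = [library[i] for i in rng.choice(50, size=8, replace=False)]
F = project(library, refs)             # 50 x 8 instead of 50 x 100 samples
query = project([library[3]], refs)[0]
nn = int(np.argmin(np.linalg.norm(F - query, axis=1)))
```

Because the query is itself library entry 3, its feature vector matches row 3 of `F` exactly, so the reduced-space search recovers it at zero distance.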
Ignizio, Drew A.; O'Donnell, Michael S.; Talbert, Colin B.
2014-01-01
Creating compliant metadata for scientific data products is mandated for all federal Geographic Information Systems professionals and is a best practice for members of the geospatial data community. However, the complexity of The Federal Geographic Data Committee's Content Standards for Digital Geospatial Metadata, the limited availability of easy-to-use tools, and recent changes in the ESRI software environment continue to make metadata creation a challenge. Staff at the U.S. Geological Survey Fort Collins Science Center have developed a Python toolbox for ESRI ArcGIS Desktop to facilitate a semi-automated workflow to create and update metadata records in ESRI's 10.x software. The U.S. Geological Survey Metadata Wizard tool automatically populates several metadata elements: the spatial reference, spatial extent, geospatial presentation format, vector feature count or raster column/row count, native system/processing environment, and the metadata creation date. Once the software auto-populates these elements, users can easily add attribute definitions and other relevant information in a simple Graphical User Interface. The tool, which offers a simple design free of esoteric metadata language, has the potential to save many government and non-government organizations a significant amount of time and cost by facilitating the development of metadata compliant with The Federal Geographic Data Committee's Content Standards for Digital Geospatial Metadata for ESRI software users. A working version of the tool is now available for ESRI ArcGIS Desktop, versions 10.0, 10.1, and 10.2 (downloadable at http://www.sciencebase.gov/metadatawizard).
Performance of the Heavy Flavor Tracker (HFT) detector in star experiment at RHIC
NASA Astrophysics Data System (ADS)
Alruwaili, Manal
As computing technology advances, processor counts are becoming massive. Current supercomputer processing will be available on desktops in the next decade. For mass scale application software development on the massive parallel computing available on desktops, existing popular languages with large libraries have to be augmented with new constructs and paradigms that exploit massive parallel computing and distributed memory models while retaining user-friendliness. Currently available object oriented languages for massive parallel computing such as Chapel, X10 and UPC++ exploit distributed computing, data parallel computing and thread-parallelism at the process level in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) any extension for object distribution to exploit the PGAS model; 2) the flexibility of migrating or cloning an object between places to exploit load balancing; and 3) the programming paradigms that result from the integration of data and thread-level parallelism with object distribution. In the proposed thesis, I compare different languages in the PGAS model; propose new constructs that extend C++ with object distribution, object migration and object cloning; and integrate PGAS based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked and work on different elements of a distributed data set concurrently using remote method invocations. I present the new constructs, their grammar and their behavior. The new constructs are explained using simple programs that utilize them.
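The MIDD idea, several copies of one class invoked concurrently on different elements of distributed data, can be illustrated with a plain-Python stand-in. The thesis targets PGAS language extensions, so this threaded sketch and all its names are only an analogy, not the proposed constructs.

```python
# Analogy for MIDD (Multiple Invocation, Distributed Data): copies of
# one class are each bound to a different partition of a data set, and
# the same method is invoked on all of them concurrently.
from concurrent.futures import ThreadPoolExecutor

class PartitionWorker:
    """One copy of the class, owning one partition of the data."""
    def __init__(self, partition):
        self.partition = partition

    def invoke(self):
        # Same method body everywhere; different elements per copy.
        return sum(x * x for x in self.partition)

data = list(range(12))
partitions = [data[i::4] for i in range(4)]        # 4 "places"
workers = [PartitionWorker(p) for p in partitions]
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(lambda w: w.invoke(), workers))
total = sum(partials)                              # sum of squares 0..11
```

In a true PGAS setting each worker would live at a different place and `invoke` would be a remote method invocation; here threads merely mimic the concurrency.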
NASA Astrophysics Data System (ADS)
Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.
2009-05-01
It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real-time with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial, off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.
Analysis of helium-ion scattering with a desktop computer
NASA Astrophysics Data System (ADS)
Butler, J. W.
1986-04-01
This paper describes a program written in an enhanced BASIC language for a desktop computer, for simulating the energy spectra of high-energy helium ions scattered into two concurrent detectors (backward and glancing). The program is designed for 512-channel spectra from samples containing up to 8 elements and 55 user-defined layers. The program is intended to meet the needs of analyses in materials sciences, such as metallurgy, where more than a few elements may be present, where several elements may be near each other in the periodic table, and where relatively deep structure may be important. These conditions preclude the use of completely automatic procedures for obtaining the sample composition directly from the scattered ion spectrum. Therefore, efficient methods are needed for entering and editing large amounts of composition data, with many iterations and with much feedback of information from the computer to the user. The internal video screen is used exclusively for verbal and numeric communications between user and computer. The composition matrix is edited on screen with a two-dimensional forms-fill-in text editor and with many automatic procedures, such as doubling the number of layers with appropriate interpolations and extrapolations. The control center of the program is a bank of 10 keys that initiate on-event branching of program flow. The experimental and calculated spectra, including those of individual elements if desired, are displayed on an external color monitor, with an optional inset plot of the depth concentration profiles of the elements in the sample.
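Simulations of this kind rest on standard elastic-scattering kinematics. As context (this is textbook Rutherford backscattering kinematics, not code from the paper, and the example masses are arbitrary), the kinematic factor K = E1/E0 that fixes the energy axis of the spectrum can be computed as:

```python
# Elastic kinematic factor for backscattering spectrometry:
# K = ((m1*cos(theta) + sqrt(m2^2 - m1^2*sin^2(theta))) / (m1 + m2))^2
import math

def kinematic_factor(m1, m2, theta_deg):
    """K for projectile mass m1 scattering off target mass m2 at
    laboratory angle theta (backscattering assumes m2 >= m1)."""
    th = math.radians(theta_deg)
    root = math.sqrt(m2 ** 2 - (m1 * math.sin(th)) ** 2)
    return ((m1 * math.cos(th) + root) / (m1 + m2)) ** 2

# He-4 scattered from silicon (mass ~28) into a backward detector:
K = kinematic_factor(4.0, 28.0, 170.0)
```

Each element in a layer contributes an edge at its own K, which is why nearby elements in the periodic table produce overlapping spectra and motivate the interactive editing the paper describes.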
Ito, C; Satoh, I; Michiya, H; Kitayama, Y; Miyazaki, K; Ota, S; Satoh, H; Sakurai, T; Shirato, H; Miyasaka, K
1997-01-01
A computerised nursing support system (CNSS) linked to the hospital information system (HIS) was developed and has been in use for one year, in order to reduce the workload of nurses. CNSS consists of (1) a hand held computer for each nurse (2) desk-top computers in the nurses' station and doctors' rooms (3) a data server (4) an interface with the main hospital information system. Nurses enter vital signs, food intake and other information about the patients into the hand held computer at the bed-side. The information is then sent automatically to the CNSS data server, which also receives patients' details (prescribed medicines etc.) from the HIS. Nurses and doctors can see all the information on the desk-top and hand held computers. This system was introduced in May 1995 into a university hospital ward with 40 beds. A questionnaire was completed by 23 nurses before and after the introduction of CNSS. The mean time required to post vital data was significantly reduced from 121 seconds to 54 seconds (p < 0.01). After three months 30% of nurses felt CNSS had reduced their workload, while 30% felt it had complicated their work; after five months 70% noted a reduction and 0% reported that CNSS had made their work more complex. The study therefore concludes that the interface between a computerised nursing support system and the hospital information system reduced the workload of nurses.
Electronic patient data confidentiality practices among surgical trainees: questionnaire study.
Mole, Damian J; Fox, Colin; Napolitano, Giulio
2006-10-01
The objective of this work was to evaluate the safeguards implemented by surgical trainees to protect the confidentiality of electronic patient data through a structured questionnaire sent to Northern Ireland surgical trainees. A group of 32 basic and higher surgical trainees attending a meeting of the Northern Ireland Association of Surgeons-in-Training were invited to complete a questionnaire regarding their computer use, UK Data Protection Act, 1998 registration and electronic data confidentiality practices. Of these 32 trainees, 29 returned completed questionnaires, of whom 26 regularly stored sensitive patient data for audit or research purposes on a computer. Only one person was registered under the Data Protection Act, 1998. Of the computers used to store and analyse sensitive data, only 3 of 14 desktops, 8 of 19 laptops and 3 of 14 hand-held computers forced a password logon. Of the 29 trainees, 16 used the same password for all machines, and 25 of 27 passwords were less than 8 characters long. Two respondents declined to reveal details of their secure passwords. Half of all trainees had never adjusted their internet security settings, despite all 14 desktops, 16 of 19 laptops and 5 of 14 hand-helds being routinely connected to the internet. Of the 29 trainees, 28 never encrypted their sensitive data files. Ten trainees had sent unencrypted sensitive patient data over the internet, using a non-secure server. Electronic data confidentiality practices amongst Northern Ireland surgical trainees are unsafe. Simple practical measures to safeguard confidentiality are recommended.
Desktop Video Productions. ICEM Guidelines Publications No. 6.
ERIC Educational Resources Information Center
Taufour, P. A.
Desktop video consists of integrating the processing of the video signal in a microcomputer. This definition implies that desktop video can take multiple forms such as virtual editing or digital video. Desktop video, which does not imply any particular technology, has been approached in different ways in different technical fields. It remains a…
ERIC Educational Resources Information Center
Lee, Paul
This report explores the implementation of desktop publishing in the Minnesota Extension Service (MES) and provides a framework for its implementation in other organizations. The document begins with historical background on the development of desktop publishing. Criteria for deciding whether to purchase a desktop publishing system, advantages and…
Experiments with microcomputer-based artificial intelligence environments
Summers, E.G.; MacDonald, R.A.
1988-01-01
The U.S. Geological Survey (USGS) has been experimenting with the use of relatively inexpensive microcomputers as artificial intelligence (AI) development environments. Several AI languages are available that perform fairly well on desk-top personal computers, as are low-to-medium cost expert system packages. Although performance of these systems is respectable, their speed and capacity limitations are questionable for serious earth science applications foreseen by the USGS. The most capable artificial intelligence applications currently are concentrated on what is known as the "artificial intelligence computer," and include Xerox D-series, Tektronix 4400 series, Symbolics 3600, VAX, LMI, and Texas Instruments Explorer. The artificial intelligence computer runs expert system shells and Lisp, Prolog, and Smalltalk programming languages. However, these AI environments are expensive. Recently, inexpensive 32-bit hardware has become available for the IBM/AT microcomputer. USGS has acquired and recently completed Beta-testing of the Gold Hill Systems 80386 Hummingboard, which runs Common Lisp on an IBM/AT microcomputer. Hummingboard appears to have the potential to overcome many of the speed/capacity limitations observed with AI-applications on standard personal computers. USGS is a Beta-test site for the Gold Hill Systems GoldWorks expert system. GoldWorks combines some high-end expert system shell capabilities in a medium-cost package. This shell is developed in Common Lisp, runs on the 80386 Hummingboard, and provides some expert system features formerly available only on AI-computers including frame and rule-based reasoning, on-line tutorial, multiple inheritance, and object-programming. © 1988 International Association for Mathematical Geology.
Schatz, Philip; Moser, Rosemarie Scolaro; Solomon, Gary S.; Ott, Summer D.; Karpf, Robin
2012-01-01
Context: Limited data are available regarding the prevalence and nature of invalid computerized baseline neurocognitive test data. Objective: To identify the prevalence of invalid baselines on the desktop and online versions of ImPACT and to document the utility of correcting for left-right (L-R) confusion on the desktop version of ImPACT. Design: Cross-sectional study of independent samples of high school (HS) and collegiate athletes who completed the desktop or online versions of ImPACT. Patients or Other Participants: A total of 3769 HS (desktop = 1617, online = 2152) and 2130 collegiate (desktop = 742, online = 1388) athletes completed preseason baseline assessments. Main Outcome Measure(s): Prevalence of 5 ImPACT validity indicators, with correction for L-R confusion (reversing left and right mouse-click responses) on the desktop version, by test version and group. Chi-square analyses were conducted for sex and attentional or learning disorders. Results: At least 1 invalid indicator was present on 11.9% (desktop) versus 6.3% (online) of the HS baselines and 10.2% (desktop) versus 4.1% (online) of collegiate baselines; correcting for L-R confusion (desktop) decreased this overall prevalence to 8.4% (HS) and 7.5% (collegiate). Online Impulse Control scores alone yielded 0.4% (HS) and 0.9% (collegiate) invalid baselines, compared with 9.0% (HS) and 5.4% (collegiate) on the desktop version; correcting for L-R confusion (desktop) decreased the prevalence of invalid Impulse Control scores to 5.4% (HS) and 2.6% (collegiate). Male athletes and HS athletes with attention deficit or learning disorders who took the online version were more likely to have at least 1 invalid indicator. Utility of additional invalidity indicators is reported. Conclusions: The online ImPACT version appeared to yield fewer invalid baseline results than did the desktop version.
Identification of L-R confusion reduces the prevalence of invalid baselines (desktop only) and the potency of Impulse Control as a validity indicator. We advise test administrators to be vigilant in identifying invalid baseline results as part of routine concussion management and prevention programs. PMID:22892410
Help for the Help Desk: School District Technology Managers Learn to Do with Less.
ERIC Educational Resources Information Center
Kongshem, Lars
2001-01-01
Although the E-Rate has been a catalyst for school technology purchases, there are no subsidies for hiring qualified technology support staff. District technology coordinators are relying on technology support systems and shoestring survival strategies, employing standardized equipment and hard-drive configurations, desktop lockdowns, anti-virus…
Addressing the English Language Arts Technology Standard in a Secondary Reading Methodology Course.
ERIC Educational Resources Information Center
Merkley, Donna J.; Schmidt, Denise A.; Allen, Gayle
2001-01-01
Describes efforts to integrate technology into a reading methodology course for secondary English majors. Discusses the use of e-mail, multimedia, distance education for videoconferences, online discussion technology, subject-specific software, desktop publishing, a database management system, a concept mapping program, and the use of the World…
An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung
2011-01-01
In this paper, we describe an approach to integrate a Space-Time GIS data model on a high performance computing platform. The Space-Time GIS data model has been developed on a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the GIS modules handle large datasets directly via a parallel file system. Although it is an ongoing project, the authors hope this effort can inspire further discussions on the integration of GIS on high performance computing platforms.
ERIC Educational Resources Information Center
Lazerick, Beth
1990-01-01
This article describes desktop publishing and discusses the features and classroom uses of one of the newest desktop publishing programs. Several desktop publishing projects for teachers and students are suggested. (IAH)
An analysis of running skyline load path.
Ward W. Carson; Charles N. Mann
1971-01-01
This paper is intended for those who wish to prepare an algorithm to determine the load path of a running skyline. The mathematics of a simplified approach to this running skyline design problem are presented. The approach employs assumptions which reduce the complexity of the problem to the point where it can be solved on desk-top computers of limited capacities. The...
Army Communicator. Volume 34, Number 2
2009-01-01
tunneled into the NIPRNet traffic. The encryption hides the contents of the SIPRNet data through a process that randomizes the bit patterns...and technologies such as desktop applications, Virtual Private Network, Blackberry support, and the training and troubleshooting of complex computer...to your own Standing Operating Procedure and then contract for services off the backside to a local Strategic Entry Point or tunnel through
Green Desktop Computing at the University of Oxford
ERIC Educational Resources Information Center
Noble, Howard; Curtis, Daniel; Tang, Kang
2009-01-01
The government of the United Kingdom has set a target to reduce CO2 emissions by at least 34 percent from 1990 levels by 2020. The Carbon Reduction Commitment (CRC) will require all large public and private sector organizations across the U.K. to cut carbon emissions and report total CO2 emissions annually so that the data can be published in a…
NASA Technical Reports Server (NTRS)
Westmeyer, Paul A. (Inventor); Wertenberg, Russell F. (Inventor); Krage, Frederick J. (Inventor); Riegel, Jack F. (Inventor)
2017-01-01
An authentication procedure utilizes multiple independent sources of data to determine whether usage of a device, such as a desktop computer, is authorized. When a comparison indicates an anomaly from the base-line usage data, the system, provides a notice that access of the first device is not authorized.
ERIC Educational Resources Information Center
Winans, Glen T.
This paper presents a descriptive review of how the Provost's Office of the College of Letters and Science at the University of California, Santa Barbara (UCSB) implemented 330 microcomputers in the 34 academic departments from July 1984 through June 1986. The decision to implement stand-alone microcomputers was based on four concerns: increasing…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-14
... Carolina. The workers are engaged in employment related to the production of desktop computers. The notice was published in the Federal Register on April 23, 2010 (75 FR 21361). The notices were amended on.... The notices were published in the Federal Register on April 19, 2010 (75 FR 20385), September 13, 2010...
ERIC Educational Resources Information Center
Gayle, Susan, Ed.
These proceedings address the appropriate uses of technology in education, including papers and summaries of presentations on the following topics: community partnerships; desktop publishing; English as a Second Language/English to Speakers of Other Languages (ESL/ESOL); cognitive issues in multimedia; higher education applications; social studies…
Contemporary issues in HIM. The application layer--III.
Wear, L L; Pinkert, J R
1993-07-01
We have seen document preparation systems evolve from basic line editors to powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful, ranging from a simple notebook reminder to a sophisticated photograph processor.
Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS.
Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphics programs, spreadsheets, and other applications.
Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines.
Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.
Monte Carlo simulation of electrothermal atomization on a desktop personal computer
NASA Astrophysics Data System (ADS)
Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.
1996-07-01
Monte Carlo simulations have been applied to electrothermal atomization (ETA) using a tubular atomizer (e.g. graphite furnace) because of the complexity in the geometry, heating, molecular interactions, etc. The intense computational time needed to accurately model ETA often limited its effective implementation to the use of supercomputers. However, with the advent of more powerful desktop processors, this is no longer the case. A C-based program has been developed and can be used under Windows™ or DOS. With this program, basic parameters such as furnace dimensions, sample placement, and furnace heating, as well as kinetic parameters such as activation energies for desorption and adsorption, can be varied to show the absorbance profile dependence on these parameters. Even data such as the time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to permit comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files and text files.
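The ingredients of such a simulation can be sketched compactly. The toy model below is not the authors' C program; it is a hypothetical 1-D Python illustration with arbitrary parameter values, combining the same elements the abstract names: a furnace temperature ramp, Arrhenius-type desorption from the wall, and a random walk of free analyte along the tube:

```python
import math
import random

def simulate_eta(n_particles=2000, steps=300, tube_len=30, e_des=60e3,
                 heat_rate=5.0, t0=300.0, seed=1):
    """Toy 1-D Monte Carlo of electrothermal atomization.

    Particles start adsorbed at the tube centre. Each step the wall
    temperature ramps linearly, desorption occurs with an Arrhenius
    probability exp(-E/RT), and gas-phase particles random-walk until
    they exit either tube end. Returns the gas-phase particle count
    per step, a crude stand-in for the absorbance profile.
    """
    R = 8.314                      # gas constant, J/(mol K)
    rng = random.Random(seed)
    adsorbed = n_particles
    free = []                      # positions of gas-phase particles
    profile = []
    for step in range(steps):
        temp = t0 + heat_rate * step
        p_des = math.exp(-e_des / (R * temp))
        released = sum(1 for _ in range(adsorbed) if rng.random() < p_des)
        adsorbed -= released
        free.extend([tube_len / 2.0] * released)
        # unit random walk; particles leaving either end are lost
        free = [x + rng.choice((-1, 1)) for x in free]
        free = [x for x in free if 0 < x < tube_len]
        profile.append(len(free))
    return profile
```

Varying the hypothetical `e_des` or `heat_rate` shifts and reshapes the profile, which is the kind of parameter study the abstract describes.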
Use phase signals to promote lifetime extension for Windows PCs.
Hickey, Stewart; Fitzpatrick, Colin; O'Connell, Maurice; Johnson, Michael
2009-04-01
This paper proposes a signaling methodology for personal computers. Signaling may be viewed as an ecodesign strategy that can positively influence the consumer-to-consumer (C2C) market process. A number of parameters are identified that can provide the basis for signal implementation. These include operating time, operating temperature, operating voltage, power cycle counts, hard disk drive (HDD) self-monitoring, analysis, and reporting technology (SMART) attributes, and operating system (OS) event information. All these parameters are currently attainable or derivable via embedded technologies in modern desktop systems. A case study is presented detailing a technical implementation of how such signals can be developed in personal computers running Microsoft Windows operating systems. Collation of lifetime temperature data from a system processor is demonstrated as a possible means of characterizing a usage profile for a desktop system. In addition, event log data is utilized for devising signals indicative of OS quality. The provision of lifetime usage data in the form of intuitive signals indicative of both hardware and software quality can, in conjunction with consumer education, facilitate an optimal remarketing strategy for used systems. This implementation requires no additional hardware.
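As a concrete illustration of the temperature-based usage profile idea, here is a small hypothetical Python sketch (our illustration, not the authors' implementation; the function name, sampling interval, and threshold are made up). Given logged processor temperatures, it derives simple signals such as operating hours and the fraction of time spent hot:

```python
from statistics import mean

def usage_signal(temps_c, sample_minutes=5, hot_threshold=70.0):
    """Derive simple lifetime-usage signals from logged CPU temperatures.

    temps_c: temperature samples (deg C) taken every sample_minutes.
    Returns operating hours, mean temperature, and the fraction of
    samples above hot_threshold -- a crude proxy for workload intensity.
    """
    hours = len(temps_c) * sample_minutes / 60.0
    hot_fraction = sum(t > hot_threshold for t in temps_c) / len(temps_c)
    return {"operating_hours": hours,
            "mean_temp_c": round(mean(temps_c), 1),
            "hot_fraction": round(hot_fraction, 2)}
```

A remarketing listing could surface such numbers directly, e.g. "light use: 1,200 h, 4% hot time", which is the sort of intuitive signal the paper argues for.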
Wesemann, Christian; Muallah, Jonas; Mah, James; Bumann, Axel
2017-01-01
The primary objective of this study was to compare the accuracy and time efficiency of indirect and direct digitization workflows with that of a three-dimensional (3D) printer in order to identify the most suitable method for orthodontic use. A master model was measured with a coordinate measuring instrument. The distances measured were the intercanine width, the intermolar width, and the dental arch length. Sixty-four scans were taken with each of the desktop scanners R900 and R700 (3Shape), the intraoral scanner TRIOS Color Pod (3Shape), and the Promax 3D Mid cone beam computed tomography (CBCT) unit (Planmeca). All scans were measured with measuring software. One scan was selected and printed 37 times on the D35 stereolithographic 3D printer (Innovation MediTech). The printed models were measured again using the coordinate measuring instrument. The most accurate results were obtained by the R900. The R700 and the TRIOS intraoral scanner showed comparable results. CBCT 3D rendering with the Promax 3D Mid unit revealed significantly higher accuracy for dental casts than for dental impressions. 3D printing showed significantly greater deviation than digitization with desktop scanners or an intraoral scanner. The chairside time required for digital impressions was 27% longer than for conventional impressions. Conventional impressions, model casting, and optional digitization with desktop scanners remain the recommended workflow. For orthodontic demands, intraoral scanners are a useful alternative for full-arch scans. For prosthodontic use, the scanning scope should be less than one quadrant and three additional teeth.
Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo
2014-01-01
Neuroscience is an increasingly multidisciplinary and highly cooperative field in which neuroimaging plays an important role. The rapid evolution of neuroimaging demands a growing set of computing resources and skills at every lab. Typically, each group tries to set up its own servers and workstations to support its neuroimaging needs, having to learn everything from operating system management to the details of specific neuroscience software tools before any results can be obtained. This setup and learning process is replicated in every lab, even when a strong collaboration among several groups is ongoing. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services in an interactive, neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as the contribution to a high-impact paper published in the January issue of the NeuroImage journal. The ABrIL system has shown its applicability in several neuroscience projects at a relatively low cost, promoting truly collaborative actions and speeding up project results and their clinical applicability.
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj
2016-04-01
Presently, most existing software is desktop-based, designed to work on a single computer, which imposes major limitations: limited processing power, storage, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The application is developed using free and open source software, open standards, and prototype code, and presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. It is the ultimate collaborative geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The application is available at all times, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users.
The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application across additional VMs and testing the scalability and availability of its services.
NASA Astrophysics Data System (ADS)
Agrawal, Arun; Koff, David; Bak, Peter; Bender, Duane; Castelli, Jane
2015-03-01
The deployment of regional and national Electronic Health Record solutions has been a focus of many countries throughout the past decade. A major challenge for these deployments has been support for ubiquitous image viewing. More specifically, these deployments require an imaging solution that can work over the Internet, leverage any point-of-service device (desktop, tablet, or phone), and access imaging data from any source seamlessly. Whereas standards exist to enable ubiquitous image viewing, few if any solutions exist that leverage these standards and meet the challenge. Rather, most currently available web-based DI viewing solutions are either proprietary or require special plugins. We developed a true zero-footprint, browser-based DI viewing solution based on the Web Access to DICOM Objects (WADO) and Cross-enterprise Document Sharing for Imaging (XDS-I.b) standards to a) demonstrate that a truly ubiquitous image viewer can be deployed and b) identify the gaps in the current standards and the design challenges in developing such a solution. The objective was to develop a viewer that works in all modern browsers on both desktop and mobile devices. The implementation allows the basic viewing functionalities of scroll, zoom, pan, and (limited) window leveling. The major gap identified in the current DICOM WADO standard is the lack of support for any kind of 3D reconstruction or MPR views. Other design challenges explored include considerations related to optimizing the solution for response time and a low memory footprint.
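For readers unfamiliar with WADO, the request such a viewer issues for a single object can be sketched as below. The parameter names come from the WADO-URI service; the server base URL and UIDs in the usage are made up, and the helper itself is our illustration, not the authors' code:

```python
from urllib.parse import urlencode

def wado_uri(base, study_uid, series_uid, object_uid,
             content_type="image/jpeg"):
    """Build a WADO-URI request for a single DICOM object.

    requestType=WADO plus the study/series/object UIDs identify the
    instance; contentType selects a rendered (image/jpeg) or native
    (application/dicom) representation.
    """
    params = {"requestType": "WADO",
              "studyUID": study_uid,
              "seriesUID": series_uid,
              "objectUID": object_uid,
              "contentType": content_type}
    return base + "?" + urlencode(params)
```

A zero-footprint viewer can fetch such URLs with plain HTTP and draw the returned JPEG frames on a canvas, which is why no plugin is needed.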
NASA Technical Reports Server (NTRS)
Rogers, R. H. (Principal Investigator)
1980-01-01
The results achieved during the first eight months of a program to transfer LANDSAT technology to practicing professionals in the private and public sectors (grass roots) through community colleges and other locally available institutions are reported. The approach offers hands-on interactive analysis training and demonstrations through the use of color desktop computer terminals communicating with a host computer by telephone lines. The features of the terminals and associated training materials are reviewed together with plans for their use in training and demonstration projects.
Leonardi, Rosalia; Maiorana, Francesco; Giordano, Daniela
2008-06-01
Many of us use and maintain files on more than one computer--a desktop part of the time, and a notebook, a palmtop, or removable devices at other times. It can be easy to forget which device contains the latest version of a particular file, and time-consuming searches often ensue. One way to solve this problem is to use software that synchronizes the files, allowing users to maintain updated versions of the same file in several locations.
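The "which copy is newest?" question at the heart of synchronization can be sketched in a few lines. The helper below is a hypothetical Python illustration (not any particular synchronization product): it hashes each copy's contents and picks the most recently modified one:

```python
import hashlib
from pathlib import Path

def newest_copy(paths):
    """Pick the most recently modified copy of a file kept in
    several places, and report whether all copies already have
    identical contents (so no synchronization is needed)."""
    infos = []
    for p in map(Path, paths):
        digest = hashlib.sha256(p.read_bytes()).hexdigest()
        infos.append((p.stat().st_mtime, digest, p))
    infos.sort(reverse=True)                 # newest mtime first
    all_same = len({d for _, d, _ in infos}) == 1
    return infos[0][2], all_same
```

Real synchronizers add conflict detection (both copies changed since the last sync), which a timestamp alone cannot resolve.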
Ganalyzer: A Tool for Automatic Galaxy Image Analysis
NASA Astrophysics Data System (ADS)
Shamir, Lior
2011-08-01
We describe Ganalyzer, a model-based tool that can automatically analyze and classify galaxy images. Ganalyzer works by separating the galaxy pixels from the background pixels, finding the center and radius of the galaxy, generating the radial intensity plot, and then computing the slopes of the peaks detected in the radial intensity plot to measure the spirality of the galaxy and determine its morphological class. Unlike algorithms that are based on machine learning, Ganalyzer is based on measuring the spirality of the galaxy, a task that is difficult to perform manually, and in many cases can provide a more accurate analysis compared to manual observation. Ganalyzer is simple to use, and can be easily embedded into other image analysis applications. Another advantage is its speed, which allows it to analyze ~10,000,000 galaxy images in five days using a standard modern desktop computer. These capabilities can make Ganalyzer a useful tool in analyzing large data sets of galaxy images collected by autonomous sky surveys such as SDSS, LSST, or DES. The software is available for free download at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer, and the data used in the experiment are available at http://vfacstaff.ltu.edu/lshamir/downloads/ganalyzer/GalaxyImages.zip.
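The radial intensity plot at the centre of the method is straightforward to compute. The sketch below is our illustration, not Ganalyzer's actual code: it bins every pixel by its integer distance from the given centre and averages each bin:

```python
import math
from collections import defaultdict

def radial_profile(image, center):
    """Mean pixel intensity as a function of integer distance from center.

    image: 2-D list of intensities; center: (x, y) of the galaxy centre.
    This mirrors the first steps of the pipeline described above; the
    peaks of the resulting plot are what the slope analysis then uses
    to measure spirality.
    """
    cx, cy = center
    sums = defaultdict(float)
    counts = defaultdict(int)
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            r = int(math.hypot(x - cx, y - cy))
            sums[r] += value
            counts[r] += 1
    return [sums[r] / counts[r] for r in sorted(counts)]
```

For survey-scale throughput one would vectorize this (e.g. with NumPy's bincount), but the per-image logic is the same.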
Benchmarking multimedia performance
NASA Astrophysics Data System (ADS)
Zandi, Ahmad; Sudharsanan, Subramania I.
1998-03-01
With the introduction of faster processors and special instruction sets tailored to multimedia, a number of exciting applications are now feasible on the desktop. Among these is DVD playback, consisting, among other things, of MPEG-2 video and Dolby Digital audio or MPEG-2 audio. Other multimedia applications such as video conferencing and speech recognition are also becoming popular on computer systems. In view of this tremendous interest in multimedia, a group of major computer companies has formed the Multimedia Benchmarks Committee as part of the Standard Performance Evaluation Corp. to address the performance issues of multimedia applications. The approach is multi-tiered, with three tiers of fidelity from minimal to fully compliant. In each case the fidelity of the bitstream reconstruction as well as the quality of the video or audio output are measured, and the system is classified accordingly. At the next step the performance of the system is measured. In many multimedia applications, such as DVD playback, the application needs to run at a specific rate; in this case the measurement of excess processing power makes all the difference. All this makes a system-level, application-based multimedia benchmark very challenging. Several ideas and methodologies for each aspect of these problems will be presented and analyzed.
phpMs: A PHP-Based Mass Spectrometry Utilities Library.
Collins, Andrew; Jones, Andrew R
2018-03-02
The recent establishment of cloud computing, high-throughput networking, and more versatile web standards and browsers has led to a renewed interest in web-based applications. While big data has traditionally been the domain of optimized desktop and server applications, it is now possible to store vast amounts of data and perform the necessary calculations offsite with cloud storage and computing providers, with the results visualized in a high-quality cross-platform interface via a web browser. There are a number of emerging platforms for cloud-based mass spectrometry data analysis; however, there is limited pre-existing code accessible to web developers, especially those constrained to a shared hosting environment where Java and C applications are often forbidden by the hosting provider. To remedy this, we provide an open-source mass spectrometry library for one of the most commonly used web development languages, PHP. Our new library, phpMs, provides objects for storing and manipulating spectra and identification data, as well as utilities for file reading, file writing, calculations, peptide fragmentation, and protein digestion, and a software interface for controlling search engines. We provide a working demonstration of some of the capabilities at http://pgb.liv.ac.uk/phpMs.
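As an illustration of the kind of calculation utility such a library provides, here is a peptide monoisotopic mass computation, written in Python rather than PHP for brevity. This is our sketch, not phpMs code; the residue masses are the standard monoisotopic values:

```python
# Monoisotopic residue masses (Da) for the 20 standard amino acids.
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER = 18.010565  # H2O added for the peptide's terminal H and OH

def peptide_mass(sequence):
    """Monoisotopic mass (Da) of an unmodified peptide sequence."""
    return sum(RESIDUE_MASS[aa] for aa in sequence) + WATER
```

For example, `peptide_mass("PEPTIDE")` gives the familiar reference value of about 799.3600 Da. Fragment ion calculations build on the same table, adding or subtracting the appropriate terminal groups.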
Haldane, Waddington and recombinant inbred lines: extension of their work to any number of genes.
Samal, Areejit; Martin, Olivier C
2017-11-01
In the early 1930s, J. B. S. Haldane and C. H. Waddington collaborated on the consequences of genetic linkage and inbreeding. One elegant mathematical genetics problem solved by them concerns recombinant inbred lines (RILs) produced via repeated self or brother-sister mating. In this classic contribution, Haldane and Waddington derived an analytical formula for the probabilities of 2-locus and 3-locus RIL genotypes. Specifically, the Haldane-Waddington formula gives the recombination rate R in such lines as a simple function of the per generation recombination rate r. Interestingly, for more than 80 years, an extension of this result to four or more loci remained elusive. In 2015, we generalized the Haldane-Waddington self-mating result to any number of loci. Our solution used self-consistent equations of the multi-locus probabilities 'for an infinite number of generations' and solved these by simple algebraic operations. In practice, our approach provides a quantum leap in the systems that can be handled: the cases of up to six loci can be solved by hand while a computer program implementing our mathematical formalism tackles up to 20 loci on standard desktop computers.
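The two-locus selfing case is easy to check numerically. For self-mating RILs the Haldane-Waddington formula gives R = 2r/(1+2r); the Python sketch below (our illustration, with arbitrary simulation sizes) estimates R by direct simulation of repeated selfing until fixation:

```python
import random

def ril_recomb_fraction(r, n_lines=20000, max_gen=200, seed=7):
    """Monte Carlo check of the Haldane-Waddington selfing formula.

    Each line starts as an AB/ab F1 and is selfed until both loci are
    fixed; the fraction of lines fixed for a recombinant haplotype
    (Ab or aB) estimates R, predicted to equal 2r/(1+2r).
    """
    rng = random.Random(seed)

    def gamete(ind):
        # pick a chromosome at random; with probability r a crossover
        # between the two loci takes the second allele from the homolog
        c0, c1 = ind if rng.random() < 0.5 else (ind[1], ind[0])
        if rng.random() < r:
            return (c0[0], c1[1])
        return c0

    recomb = 0
    for _ in range(n_lines):
        ind = (("A", "B"), ("a", "b"))
        for _ in range(max_gen):
            ind = (gamete(ind), gamete(ind))
            if ind[0] == ind[1]:        # homozygous at both loci: fixed
                break
        recomb += ind[0] in (("A", "b"), ("a", "B"))
    return recomb / n_lines
```

With r = 0.1 the simulation converges on 2(0.1)/(1.2) = 1/6, matching the 1931 result; the authors' multi-locus formalism extends this kind of computation to genotype probabilities over many loci at once.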