Sample records for powerful desktop computers

  1. Computer usage and national energy consumption: Results from a field-metering study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desroches, Louis-Benoit; Fuchs, Heidi; Greenblatt, Jeffery

The electricity consumption of miscellaneous electronic loads (MELs) in the home has grown in recent years and is expected to continue rising. Consumer electronics, in particular, are characterized by swift technological innovation, with varying impacts on energy use. Desktop and laptop computers make up a significant share of MELs electricity consumption, but their national energy use is difficult to estimate, given uncertainties around shifting user behavior. This report analyzes usage data from 64 computers (45 desktop, 11 laptop, and 8 unknown) collected in 2012 as part of a larger field monitoring effort of 880 households in the San Francisco Bay Area, and compares our results to recent values from the literature. We find that desktop computers are used for an average of 7.3 hours per day (median = 4.2 h/d), while laptops are used for a mean of 4.8 hours per day (median = 2.1 h/d). The results for laptops are likely underestimated, since laptops can be charged from other, unmetered outlets. Average unit annual energy consumption (AEC) for desktops is estimated to be 194 kWh/yr (median = 125 kWh/yr), and for laptops 75 kWh/yr (median = 31 kWh/yr). We estimate national annual energy consumption for desktop computers to be 20 TWh. National annual energy use for laptops is estimated to be 11 TWh, markedly higher than previous estimates, likely reflecting laptops drawing more power in On mode as well as greater market penetration. This result for laptops, however, carries relatively higher uncertainty than that for desktops. Different study methodologies and definitions, changing usage patterns, and uncertainty about how consumers use computers must be considered when interpreting our results with respect to existing analyses. Finally, as energy consumption in On mode is predominant, we outline several energy savings opportunities: improved power management (defaulting to low-power modes after periods of inactivity, as well as power scaling), matching the rated power of power supplies to computing needs, and improving the efficiency of individual components.
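The national estimate above is just per-unit consumption scaled by the installed base. A minimal sketch of that arithmetic follows; the unit AEC is the abstract's figure, while the installed-base count is a hypothetical round number chosen for illustration, not a value from the report.

```python
# Back-of-envelope check of a national consumption estimate.
# unit_aec_kwh is quoted in the abstract; installed_base is an
# assumed (hypothetical) national desktop stock for illustration.
unit_aec_kwh = 194            # average desktop AEC, kWh/yr (abstract)
installed_base = 103e6        # assumed number of desktops nationally
national_twh = unit_aec_kwh * installed_base / 1e9   # kWh/yr -> TWh/yr
print(f"national consumption: {national_twh:.1f} TWh/yr")
```

Under that assumed stock, the product lands at roughly the 20 TWh the study reports for desktops.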

  2. Desktop Publishing: A Powerful Tool for Advanced Composition Courses.

    ERIC Educational Resources Information Center

    Sullivan, Patricia

    1988-01-01

    Examines the advantages of using desktop publishing in advanced writing classes. Explains how desktop publishing can spur creativity, call attention to the interaction between words and pictures, encourage the social dimensions of computing and composing, and provide students with practical skills. (MM)

  3. Development of a small-scale computer cluster

    NASA Astrophysics Data System (ADS)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

An increase in demand for computing power in academia has driven the need for high-performance machines. The computing power of a single processor has been steadily increasing, but it lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers, with the proper software, can multiply the performance of a single computer. Cluster computing has therefore become a much sought-after technology. Typical desktop computers could be used for cluster computing, but they are not intended for constant full-speed operation and take up more space than rack-mount servers. Specialty computers designed for use in clusters meet high-availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack-mount configuration, gaining the space savings of traditional rack-mount computers while remaining cost-effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components that multiplies the performance of a single desktop machine, while minimizing occupied space and remaining cost-effective.

  4. Randomized Trial of Desktop Humidifier for Dry Eye Relief in Computer Users.

    PubMed

    Wang, Michael T M; Chan, Evon; Ea, Linda; Kam, Clifford; Lu, Yvonne; Misra, Stuti L; Craig, Jennifer P

    2017-11-01

Dry eye is a frequently reported problem among computer users. Low relative humidity environments are recognized to exacerbate signs and symptoms of dry eye, yet are common in offices of computer operators. Desktop USB-powered humidifiers are available commercially, but their efficacy for dry eye relief has not been established. This study aims to evaluate the potential for a desktop USB-powered humidifier to improve tear-film parameters, ocular surface characteristics, and subjective comfort of computer users. Forty-four computer users were enrolled in a prospective, masked, randomized crossover study. On separate days, participants were randomized to 1 hour of continuous computer use, with and without exposure to a desktop humidifier. Lipid-layer grade, noninvasive tear-film breakup time, and tear meniscus height were measured before and after computer use. Following the 1-hour period, participants reported whether ocular comfort was greater than, equal to, or less than that at baseline. The desktop humidifier effected a relative difference in humidity between the two environments of +5.4 ± 5.0% (P < .001). Participants demonstrated no significant differences in lipid-layer grade and tear meniscus height between the two environments (all P > .05). However, a relative increase in the median noninvasive tear-film breakup time of +4.0 seconds was observed in the humidified environment (P < .001), which was associated with a higher proportion of subjects reporting greater comfort relative to baseline (36% vs. 5%, P < .001). Even with a modest increase in relative humidity locally, the desktop humidifier shows potential to improve tear-film stability and subjective comfort during computer use. Trial registration no.: ACTRN12617000326392.

  5. Reach a New Threshold of Freedom and Control with Dell's Flexible Computing Solution: On-Demand Desktop Streaming

    ERIC Educational Resources Information Center

    Technology & Learning, 2008

    2008-01-01

    When it comes to IT, there has always been an important link between data center control and client flexibility. As computing power increases, so do the potentially crippling threats to security, productivity and financial stability. This article talks about Dell's On-Demand Desktop Streaming solution which is designed to centralize complete…

  6. The Battle for Desktop Control: When It Comes to the Management of Classroom Computers, Educators and the Technical Staff Who Support Them Must Forge a Common Ground

    ERIC Educational Resources Information Center

    Fryer, Wesley

    2004-01-01

    There has long been a power struggle between techies and teachers over classroom computer desktops. IT personnel tend to believe allowing "inept" educators to have unfettered access to their computer's hard drive is an open invitation for trouble. Conversely, teachers often perceive tech support to be "uncaring" adversaries standing in the way of…

  7. Micromagnetics on high-performance workstation and mobile computational platforms

    NASA Astrophysics Data System (ADS)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including a multi-core Intel central processing unit, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite-element-method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of embedded mobile systems make them promising candidates for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.

  8. Desktop Virtualization in Action: Simplicity Is Power

    ERIC Educational Resources Information Center

    Fennell, Dustin

    2010-01-01

    Discover how your institution can better manage and increase access to instructional applications and desktops while providing a blended learning environment. Receive practical insight into how academic computing virtualization can be leveraged to enhance education at your institution while lowering Total Cost of Ownership (TCO) and reducing the…

  9. Desktop Virtual Reality: A Powerful New Technology for Teaching and Research in Industrial Teacher Education

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.

    2004-01-01

    Virtual Reality has been defined in many different ways and now means different things in various contexts. VR can range from simple environments presented on a desktop computer to fully immersive multisensory environments experienced through complex headgear and bodysuits. In all of its manifestations, VR is basically a way of simulating or…

  10. Current implementation and future plans on new code architecture, programming language and user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, B.

    1997-07-01

Computer technology has improved tremendously in recent years, with larger media capacity, more memory, and more computational power. Visual computing with high-performance graphic interfaces and desktop computational power has changed the way engineers accomplish everyday tasks, development work, and safety analysis studies. The emergence of parallel computing will permit simulation over larger domains. In addition, new development methods, languages, and tools have appeared in the last several years.

  11. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    PubMed

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

Having greater computational power on the desktop for processing taxa data sets has long been a dream of biologists and statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized, one example being Felsenstein's PHYLIP code, written in C, for the UPGMA and neighbor-joining algorithms. However, conventional computers still cannot process more than a few tens of taxa in a reasonable amount of time, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA execution by a factor of a hundred over PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to accelerate phylogenetics algorithm performance not only on the desktop but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
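For readers unfamiliar with the algorithm being accelerated, UPGMA is average-linkage hierarchical clustering over a distance matrix. The following is a minimal, unoptimized Python sketch of that generic algorithm (not the paper's programmable-logic implementation or PHYLIP's C code):

```python
def upgma(labels, dist):
    """Average-linkage (UPGMA) clustering.
    labels: taxon names; dist: symmetric distance matrix (list of lists).
    Returns the tree as nested tuples."""
    clusters = [(lab, 1) for lab in labels]     # (subtree, cluster size)
    d = [row[:] for row in dist]
    while len(clusters) > 1:
        # locate the closest pair of clusters
        i, j = min(((a, b) for a in range(len(d)) for b in range(a + 1, len(d))),
                   key=lambda p: d[p[0]][p[1]])
        (ti, ni), (tj, nj) = clusters[i], clusters[j]
        # size-weighted average distance from the merged cluster to the rest
        new_row = [(ni * d[i][k] + nj * d[j][k]) / (ni + nj)
                   for k in range(len(d)) if k not in (i, j)]
        # drop rows/columns i and j, then append the merged cluster
        d = [[row[b] for b in range(len(row)) if b not in (i, j)]
             for a, row in enumerate(d) if a not in (i, j)]
        for row, v in zip(d, new_row):
            row.append(v)
        d.append(new_row + [0.0])
        clusters = ([c for k, c in enumerate(clusters) if k not in (i, j)]
                    + [((ti, tj), ni + nj)])
    return clusters[0][0]
```

For example, with taxa A, B, C and distances d(A,B)=2, d(A,C)=d(B,C)=6, the sketch merges A and B first and then joins C. The repeated closest-pair search and matrix update shown here is the work the custom-logic approach accelerates.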

  12. 12 CFR 1732.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... home computer systems of an employee; or (4) Whether the information is active or inactive. (k) Record... (e.g., e-mail, databases, spreadsheets, PowerPoint presentations, electronic reporting systems... information is stored or located, including network servers, desktop or laptop computers and handheld...

  13. Tablet PCs: A Physical Educator's New Clipboard

    ERIC Educational Resources Information Center

    Nye, Susan B.

    2010-01-01

    Computers in education have come a long way from the abacus of 5,000 years ago to the desktop and laptop computers of today. Computers have transformed the educational environment, and with each new iteration of smaller and more powerful machines come additional advantages for teaching practices. The Tablet PC is one. Tablet PCs are fully…

  14. Design and Integration of a Three Degrees-of-Freedom Robotic Vehicle with Control Moment Gyro for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Testbed

    DTIC Science & Technology

    2006-09-01

    required directional control for each thruster due to their high precision and equivalent power and computer interface requirements to those for the...Universal Serial Bus) ports, LPT (Line Printing Terminal) and KVM (Keyboard-Video- Mouse) interfaces. Additionally, power is supplied to the computer through...of the IDE cable to the Prometheus Development Kit ACC-IDEEXT. Connect a small drive power connector from the desktop ATX power supply to the ACC

  15. A simple grid implementation with Berkeley Open Infrastructure for Network Computing using BLAST as a model

    PubMed Central

    Pinthong, Watthanai; Muangruen, Panya

    2016-01-01

Development of high-throughput technologies, such as next-generation sequencing, allows thousands of experiments to be performed simultaneously while reducing resource requirements. Consequently, a massive amount of experimental data is now rapidly generated. Nevertheless, the data are not readily usable or meaningful until they are further analysed and interpreted. Due to the size of the data, a high-performance computer (HPC) is required for the analysis and interpretation. However, an HPC is expensive and difficult to access. Other means have been developed to give researchers the power of an HPC without the need to purchase and maintain one, such as cloud computing services and grid computing systems. In this study, we implemented grid computing in a computer training center environment using Berkeley Open Infrastructure for Network Computing (BOINC) as a job distributor and data manager, combining the available desktop computers into a virtual HPC. Fifty desktop computers were used to set up the grid system during off-hours. To test the performance of the grid system, we adapted the Basic Local Alignment Search Tool (BLAST) to the BOINC system. Sequencing results from an Illumina platform were aligned to the human genome database by BLAST on the grid system, and the results and processing time were compared to those from a single desktop computer and an HPC. The estimated durations of BLAST analysis for 4 million sequence reads on a desktop PC, an HPC, and the grid system were 568, 24, and 5 days, respectively. Thus, the grid implementation of BLAST with BOINC is an efficient alternative to an HPC for sequence alignment. The grid implementation with BOINC also helped tap unused computing resources during off-hours and could easily be adapted to other available bioinformatics software. PMID:27547555
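The quoted durations can be restated as speedup and efficiency figures; the sketch below uses only numbers from the abstract. Note the apparent per-node efficiency comes out above 1, which suggests the 50 grid desktops were individually faster than the single reference PC (or the workload was partitioned differently), rather than true superlinear scaling.

```python
# Restate the abstract's timing figures as speedup/efficiency numbers.
# All inputs are taken from the abstract; nothing is measured here.
days = {"desktop": 568, "hpc": 24, "grid": 5}
n_nodes = 50                    # desktops in the BOINC grid
speedup = days["desktop"] / days["grid"]
efficiency = speedup / n_nodes  # apparent per-node efficiency vs. one PC
print(f"speedup ~{speedup:.0f}x, per-node efficiency ~{efficiency:.2f}")
```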

  16. GREEN SUPERCOMPUTING IN A DESKTOP BOX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HSU, CHUNG-HSING; FENG, WU-CHUN; CHING, AVERY

    2007-01-17

The computer workstation, introduced by Sun Microsystems in 1982, was the tool of choice for scientists and engineers as an interactive computing environment for the development of scientific codes. However, by the mid-1990s, the performance of workstations began to lag behind high-end commodity PCs. This, coupled with the disappearance of BSD-based operating systems in workstations and the emergence of Linux as an open-source operating system for PCs, arguably led to the demise of the workstation as we knew it. Around the same time, computational scientists started to leverage PCs running Linux to create a commodity-based (Beowulf) cluster that provided dedicated computer cycles, i.e., supercomputing for the rest of us, as a cost-effective alternative to large supercomputers, i.e., supercomputing for the few. However, as the cluster movement has matured with respect to cluster hardware and open-source software, these clusters have become much more like their large-scale supercomputing brethren: a shared (and power-hungry) datacenter resource that must reside in a machine-cooled room in order to operate properly. Consequently, the above observations, coupled with the ever-increasing performance gap between the PC and the cluster supercomputer, provide the motivation for a 'green' desktop supercomputer: a turnkey solution that provides an interactive and parallel computing environment with the approximate form factor of a Sun SPARCstation 1 'pizza box' workstation. In this paper, they present the hardware and software architecture of such a solution as well as its prowess as a developmental platform for parallel codes. In short, imagine a 12-node personal desktop supercomputer that achieves 14 Gflops on Linpack but sips only 185 watts of power at load, resulting in a performance-power ratio that is over 300% better than their reference SMP platform.
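The closing figures reduce to a single performance-per-watt number; a small sketch using only the values quoted in the abstract:

```python
# Performance-per-watt from the figures quoted in the abstract.
gflops = 14.0      # Linpack performance of the 12-node desktop box
watts = 185.0      # measured power draw at load
mflops_per_watt = gflops * 1000.0 / watts
print(f"{mflops_per_watt:.1f} Mflops/W")
```

At roughly 76 Mflops/W, and reading "over 300% better" as at least a 4x ratio, the reference SMP platform would have delivered on the order of 19 Mflops/W or less.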

  17. 48 CFR 52.223-16 - Acquisition of EPEAT-Registered Personal Computer Products.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) and liquid crystal display (LCD). Desktop computer means a computer where the main unit is intended to... periods of time either with or without a direct connection to an AC power source. Notebooks must utilize... products that, at the time of submission of proposals and at the time of award, were EPEAT® bronze...

  18. Improved Distance Learning Environment For Marine Forces Reserve

    DTIC Science & Technology

    2016-09-01

keyboard, to form a desktop computer. Laptop computers share similar components but add mobility to the user. If additional desktop computers ...for stationary computing devices such as desktop PCs and laptops include the Microsoft Windows, Mac OS, and Linux families of OSs (Hopkins...opportunities to all Marines. For active duty Marines, government-provided desktops and laptops (GPDLs) typically support DL T&E or learning resource

  19. Using the Power of Media to Communicate Science: A Question of Style?

    ERIC Educational Resources Information Center

    Imhof, Heidi

    1991-01-01

    Discusses educational effects of the style, content, and quality inherent in several multimedia and desktop-publishing products available to science teachers, including books, interactive software, videos, and computer simulations. (JJK)

  20. 78 FR 41873 - Energy Conservation Program for Consumer Products and Certain Commercial and Industrial Equipment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-12

... limited to) desktop computers, integrated desktop computers, laptop/notebook/netbook computers, and... computer, and 65% of U.S. households owning a notebook, laptop, or netbook computer, in 2013. Coverage... recently published studies. In these studies, the average annual energy use for a desktop computer was...

  1. Transitioning EEG experiments away from the laboratory using a Raspberry Pi 2.

    PubMed

    Kuziek, Jonathan W P; Shienh, Axita; Mathewson, Kyle E

    2017-02-01

Electroencephalography (EEG) experiments are typically performed in controlled laboratory settings to minimise noise and produce reliable measurements. These controlled conditions also reduce the applicability of the obtained results to more varied environments and may limit their relevance to everyday situations. Advances in computer portability may increase the mobility and applicability of EEG results while decreasing costs. In this experiment we show that stimulus presentation using a Raspberry Pi 2 computer provides a low-cost, reliable alternative to a traditional desktop PC in the administration of EEG experimental tasks. Significant and reliable MMN and P3 activity, typical event-related potentials (ERPs) associated with an auditory oddball paradigm, were measured while experiments were administered using the Raspberry Pi 2. While latency differences in ERP triggering were observed between systems, these differences reduced statistical power only marginally, likely due to the reduced processing power of the Raspberry Pi 2. An auditory oddball task administered using the Raspberry Pi 2 produced similar ERPs to those derived from a desktop PC in a laboratory setting. Despite temporal differences and slight increases in the number of trials needed for similar statistical power, the Raspberry Pi 2 can be used to design and present auditory experiments comparable to a PC. Our results show that the Raspberry Pi 2 is a low-cost alternative to the desktop PC when administering EEG experiments and, due to its small size and low power consumption, will enable mobile EEG experiments unconstrained by a traditional laboratory setting.

  2. Evaluating virtual hosted desktops for graphics-intensive astronomy

    NASA Astrophysics Data System (ADS)

    Meade, B. F.; Fluke, C. J.

    2018-04-01

Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware at opposite ends of the useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations typical of astronomy visualisation workflows, we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower-performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.

  3. Data Center Energy Efficiency Technologies and Methodologies: A Review of Commercial Technologies and Recommendations for Application to Department of Defense Systems

    DTIC Science & Technology

    2015-11-01

provided by a stand-alone desktop or hand-held computing device. This introduces into the discussion a large number of mobile, tactical command...control, communications, and computer (C4) systems across the Services. A couple of examples are mobile command posts mounted on the back of an M1152... infrastructure (DCPI). This term encompasses on-site backup generators, switchgear, uninterruptible power supplies (UPS), power distribution units

  4. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  5. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  6. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  7. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  8. A framework for interactive visualization of digital medical images.

    PubMed

    Koehring, Andrew; Foo, Jung Leng; Miyano, Go; Lobe, Thom; Winer, Eliot

    2008-10-01

The visualization of medical images obtained from scanning techniques such as computed tomography and magnetic resonance imaging is a well-researched field. However, advanced tools and methods to manipulate these data for surgical planning and other tasks have not seen widespread use among medical professionals. Radiologists have begun using more advanced visualization packages on desktop computer systems, but most physicians continue to work with basic two-dimensional grayscale images or do not work directly with the data at all. In addition, new display technologies that are in use in other fields have yet to be fully applied in medicine. It is our estimation that usability is the key aspect keeping this new technology from being more widely used by the medical community at large. Therefore, we have developed a software and hardware framework that not only makes use of advanced visualization techniques, but also features powerful, yet simple-to-use, interfaces. A virtual reality system was created to display volume-rendered medical models in three dimensions. It was designed to run in many configurations, from a large cluster of machines powering a multiwalled display down to a single desktop computer. An augmented reality system was also created for, literally, hands-on interaction when viewing models of medical data. Lastly, a desktop application was designed to provide a simple visualization tool, which can be run on nearly any computer at a user's disposal. This research is directed toward improving the capabilities of medical professionals in the tasks of preoperative planning, surgical training, diagnostic assistance, and patient education.

  9. Task Scheduling in Desktop Grids: Open Problems

    NASA Astrophysics Data System (ADS)

    Chernov, Ilya; Nikitina, Natalia; Ivashko, Evgeny

    2017-12-01

    We survey the areas of Desktop Grid task scheduling that seem to be insufficiently studied so far and are promising for efficiency, reliability, and quality of Desktop Grid computing. These topics include optimal task grouping, "needle in a haystack" paradigm, game-theoretical scheduling, domain-imposed approaches, special optimization of the final stage of the batch computation, and Enterprise Desktop Grids.

  10. Powering Down from the Bottom up: Greener Client Computing

    ERIC Educational Resources Information Center

    O'Donnell, Tom

    2009-01-01

    A decade ago, people wanting to practice "green computing" recycled their printer paper, turned their personal desktop systems off from time to time, and tried their best to donate old equipment to a nonprofit instead of throwing it away. A campus IT department can shave a few watts off just about any IT process--the real trick is planning and…

11. 36 CFR 1194.26 - Desktop and portable computers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Desktop and portable computers. 1194.26 Section 1194.26 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION... § 1194.26 Desktop and portable computers. (a) All mechanically operated controls and keys shall comply...

  12. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  13. SkinScan©: A PORTABLE LIBRARY FOR MELANOMA DETECTION ON HANDHELD DEVICES

    PubMed Central

    Wadhawan, Tarun; Situ, Ning; Lancaster, Keith; Yuan, Xiaojing; Zouridakis, George

    2011-01-01

    We have developed a portable library for automated detection of melanoma termed SkinScan© that can be used on smartphones and other handheld devices. Compared to desktop computers, embedded processors have limited processing speed, memory, and power, but they have the advantage of portability and low cost. In this study we explored the feasibility of running a sophisticated application for automated skin cancer detection on an Apple iPhone 4. Our results demonstrate that the proposed library with the advanced image processing and analysis algorithms has excellent performance on handheld and desktop computers. Therefore, deployment of smartphones as screening devices for skin cancer and other skin diseases can have a significant impact on health care delivery in underserved and remote areas. PMID:21892382

  14. TOOLS FOR PRESENTING SPATIAL AND TEMPORAL PATTERNS OF ENVIRONMENTAL MONITORING DATA

    EPA Science Inventory

The EPA Health Effects Research Laboratory has developed this data presentation tool for use with a variety of types of data which may contain spatial and temporal patterns of interest. The technology links mainframe computing power to the new generation of "desktop publishing" ha...

  15. 75 FR 25185 - Broadband Initiatives Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ..., excluding desktop or laptop computers, computer hardware and software (including anti-virus, anti-spyware, and other security software), audio or video equipment, computer network components... 10 desktop or laptop computers and individual workstations to be located within the rural library...

  16. Creating a Parallel Version of VisIt for Microsoft Windows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitlock, B J; Biagas, K S; Rawson, P L

    2011-12-07

    VisIt is a popular, free, interactive parallel visualization and analysis tool for scientific data. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images or movies for presentations. VisIt was designed from the ground up to work on many scales of computers, from modest desktops up to massively parallel clusters. VisIt comprises a set of cooperating programs, all of which can be run locally or in client/server mode, in which some run locally and some run remotely on compute clusters. The VisIt program most able to harness today's computing power is the VisIt compute engine, which is responsible for reading simulation data from disk, processing it, and sending results or images back to the VisIt viewer program. In a parallel environment, the compute engine runs several processes that coordinate using the Message Passing Interface (MPI) library. Each MPI process reads some subset of the scientific data and filters the data in various ways to create useful visualizations. By using MPI, VisIt has been able to scale well into the thousands of processors on large computers such as Dawn and Graph at LLNL. The advent of multicore CPUs has made parallelism the 'new' way to achieve increasing performance. With today's computers having at least 2 cores, and in many cases 8 or more, it is more important than ever to deploy parallel software that can use that computing power not only on clusters but also on the desktop. We have created a parallel version of VisIt for Windows that uses Microsoft's MPI implementation (MSMPI) to process data in parallel on the Windows desktop as well as on a Windows HPC cluster running Microsoft Windows Server 2008. Initial desktop parallel support for Windows was deployed in VisIt 2.4.0; Windows HPC cluster support has been completed and will appear in the VisIt 2.5.0 release.
    We plan to continue supporting parallel VisIt on Windows so our users will be able to take full advantage of their multicore resources.
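
    The data-parallel pattern described here, in which each MPI process reads its own subset of the data, can be sketched with a minimal block-decomposition helper (illustrative Python, not VisIt's actual code):

```python
def chunks_for_rank(n_chunks, n_ranks, rank):
    """Indices of the data chunks a single rank reads (block decomposition)."""
    base, extra = divmod(n_chunks, n_ranks)
    # The first `extra` ranks each take one additional chunk.
    start = rank * base + min(rank, extra)
    count = base + (1 if rank < extra else 0)
    return list(range(start, start + count))

# 10 data chunks spread over 4 ranks: [0,1,2], [3,4,5], [6,7], [8,9]
assignment = [chunks_for_rank(10, 4, r) for r in range(4)]
```

    Under MPI, each process would call this once with its own rank; the loop above merely enumerates all ranks to show the full assignment.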

  17. Desktop Publishing Made Simple.

    ERIC Educational Resources Information Center

    Wentling, Rose Mary

    1989-01-01

    The author discusses the types of computer hardware and software necessary to set up a desktop publishing system, both for use in educational administration and for instructional purposes. Classroom applications of desktop publishing are presented. The author also provides guidelines for preparing to teach desktop publishing. (CH)

  18. Running GUI Applications on Peregrine from OSX | High-Performance Computing

    Science.gov Websites

    Learn how to use Virtual Network Computing (VNC) to access a Linux graphical desktop environment on Peregrine. Forwarding a local port (on, e.g., your laptop) starts a VNC server process that manages a virtual desktop; the password you set for your virtual desktop is persistent, so remember it, as you will use this password whenever accessing the desktop.

  19. Yearbook Production: Yearbook Staffs Can Now "Blame" Strengths, Weaknesses on Computer as They Take More Control of Their Publications.

    ERIC Educational Resources Information Center

    Hall, H. L.

    1988-01-01

    Reports on the advantages and disadvantages of desktop publishing, using the Apple Macintosh and "Pagemaker" software, to produce a high school yearbook. Asserts that while desktop publishing may be initially more time consuming for those unfamiliar with computers, desktop publishing gives high school journalism staffs more control over…

  20. A discrete Fourier transform for virtual memory machines

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1992-01-01

    An algebraic theory of the Discrete Fourier Transform is developed in great detail. Examination of the details of the theory leads to a computationally efficient fast Fourier transform for use on computers with virtual memory. Such an algorithm is of great use on modern desktop machines. A FORTRAN-coded version of the algorithm is given for the case when the length of the sequence of numbers to be transformed is a power of two.
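
    For the power-of-two case the report treats, a textbook radix-2 Cooley-Tukey FFT can be sketched as follows (plain Python rather than the paper's FORTRAN; the virtual-memory blocking that motivates the paper is omitted):

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])   # transform of even-indexed samples
    odd = fft(x[1::2])    # transform of odd-indexed samples
    twiddle = [cmath.exp(-2j * cmath.pi * k / n) for k in range(n // 2)]
    return ([even[k] + twiddle[k] * odd[k] for k in range(n // 2)] +
            [even[k] - twiddle[k] * odd[k] for k in range(n // 2)])
```

    The recursion halves the problem at each level, giving the familiar O(n log n) cost; the output agrees with the direct DFT definition.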

  1. Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments

    PubMed Central

    Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu

    2017-01-01

    High-end graphics processing units (GPUs), such as the NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied to high-performance computing over the past decade. These desktop GPU cards must be installed in personal computers/servers with desktop CPUs, and the cost and power consumption of constructing a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, the Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture (CUDA) cores (belonging to the Kepler GPU family). The Jetson TK1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied to several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed, and that work also showed that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio, by comparing the STK platform with desktop CPUs and GPUs. In this work, an embedded GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup are necessary first; then, two job assignment modes are designed for the MTK platform to provide services for users. Finally, ClustalW v2.0.11 and ClustalWtk are ported to the MTK platform. The experimental results show speedup ratios of 5.5 and 4.8 for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing 6 TK1s with a single TK1. The MTK platform is thus shown to be useful for multiple sequence alignments. PMID:28835734

  2. Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments.

    PubMed

    Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu

    2017-01-01

    High-end graphics processing units (GPUs), such as the NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied to high-performance computing over the past decade. These desktop GPU cards must be installed in personal computers/servers with desktop CPUs, and the cost and power consumption of constructing a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, the Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture (CUDA) cores (belonging to the Kepler GPU family). The Jetson TK1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied to several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed, and that work also showed that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio, by comparing the STK platform with desktop CPUs and GPUs. In this work, an embedded GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup are necessary first; then, two job assignment modes are designed for the MTK platform to provide services for users. Finally, ClustalW v2.0.11 and ClustalWtk are ported to the MTK platform. The experimental results show speedup ratios of 5.5 and 4.8 for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing 6 TK1s with a single TK1. The MTK platform is thus shown to be useful for multiple sequence alignments.
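
    From the reported speedups one can back out parallel efficiency (speedup per node). The numbers below come from the abstract; the helper functions themselves are only illustrative:

```python
def speedup(t_single, t_cluster):
    """Ratio of single-node runtime to cluster runtime."""
    return t_single / t_cluster

def efficiency(s, n_nodes):
    """Parallel efficiency: speedup divided by node count."""
    return s / n_nodes

# Reported: 6 TK1s vs. 1 TK1 gives speedups of 5.5 (ClustalW) and 4.8 (ClustalWtk)
eff_clustalw = efficiency(5.5, 6)    # about 0.92
eff_clustalwtk = efficiency(4.8, 6)  # 0.80
```

    An efficiency near 0.92 on 6 boards suggests the alignment workload partitions well across the cluster.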

  3. Pages from the Desktop: Desktop Publishing Today.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1994-01-01

    Discusses changes that have made desktop publishing appealing and reasonably priced. Hardware, software, and printer options for getting started and moving on, typeface developments, and the key characteristics of desktop publishing are described. The author's notes on 33 articles from the personal computing literature from January-March 1994 are…

  4. Common Sense Wordworking III: Desktop Publishing and Desktop Typesetting.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1987-01-01

    Describes current desktop publishing packages available for microcomputers and discusses the disadvantages, especially in cost, for most personal computer users. Also described is a less expensive alternative technology--desktop typesetting--which meets the requirements of users who do not need elaborate techniques for combining text and graphics.…

  5. Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders

    DTIC Science & Technology

    2007-04-01

    ...judgmental self-doubt, depression, and causal uncertainty, tend to take fewer risks, and have lower self-esteem. Results from two studies (Nygren, 2000)... U.S. Army Research Institute for the Behavioral and Social Sciences, Research Report 1869: Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders.

  6. Dynamic provisioning of local and remote compute resources with OpenStack

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT participates in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences are discussed.
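
    The single point-of-entry idea can be illustrated with a toy scheduler that merges pools of local desktop slots and remote HPC slots. The names and dispatch rule below are hypothetical, a sketch of the concept rather than the EKP OpenStack implementation:

```python
from dataclasses import dataclass

@dataclass
class Pool:
    """One resource pool, e.g. local desktop VMs or a remote HPC share."""
    name: str
    slots: int
    running: int = 0

    def free(self):
        return self.slots - self.running

class VirtualCluster:
    """Single point of entry: dispatch each job to the pool with most free slots."""
    def __init__(self, pools):
        self.pools = pools

    def submit(self, job):
        pool = max(self.pools, key=lambda p: p.free())
        if pool.free() <= 0:
            raise RuntimeError("no free slots in any pool")
        pool.running += 1
        return pool.name
```

    A user submits against the virtual cluster only; whether the job lands on a desktop VM or a remote HPC node is invisible to them.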

  7. Design of MPPT Controller Monitoring Software Based on QT Framework

    NASA Astrophysics Data System (ADS)

    Meng, X. Z.; Lu, P. G.

    2017-10-01

    The MPPT controller is a hardware device for tracking the maximum power point of a solar photovoltaic array. Multiple controllers can operate as a network using a specific communication protocol. In this article, based on C++ GUI programming with the Qt framework, we design a desktop application for monitoring and analyzing the operational parameters of MPPT controllers. The communication protocol used to build the network is Modbus in Remote Terminal Unit (RTU) mode, and the desktop host application is connected to all controllers in the network through RS485 or ZigBee wireless communication. Using this application, users can monitor controller parameters from anywhere over the Internet.
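
    Modbus RTU frames end with a CRC-16 checksum. A sketch of that checksum (the standard CRC-16/MODBUS algorithm, independent of this particular controller) looks like:

```python
def crc16_modbus(frame):
    """CRC-16/MODBUS: reflected polynomial 0xA001, initial value 0xFFFF."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

# Standard check value: CRC-16/MODBUS of b"123456789" is 0x4B37
```

    On the wire, the 16-bit result is appended to the frame low byte first, per the Modbus serial-line convention.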

  8. Seat Interfaces for Aircrew Performance and Safety

    DTIC Science & Technology

    2010-01-01

    The Quantum-II Desktop System consists of a keyboard and hardware accessories (electrodes, cables, etc.), and interfaces with a desktop computer via software... Resistance and reactance data were collected to estimate blood volume changes. The Quantum-II Desktop system collected continuous data... Approved for public release; distribution unlimited. 88 ABW Cleared 03/13/2015; 88ABW-2015-1053. The mockup also included a laptop computer, a...

  9. Competition in Defense Acquisitions

    DTIC Science & Technology

    2008-05-14

    ...NASA employees to maintain desktop assets; no way to track costs, no standardization, no tracking of service quality. NASA's Outsourcing Desktop Initiative (ODIN) transferred the responsibility for providing and managing desktop assets to the private sector. ODIN goals: cut desktop computing costs, increase service quality, achieve interoperability and standardization...

  10. Desktop Technology for Newspapers: Use of the Computer Tool.

    ERIC Educational Resources Information Center

    Wilson, Howard Alan

    This work considers desktop publishing technology as a way to paginate newspapers electronically, tracing the technology's development from the beginning of desktop publishing in the mid-1980s to the 1990s. The work emphasizes how desktop publishing technology is and can be used by weekly newspapers, and reports on a Pennsylvania weekly…

  11. What's New in Software? Mastery of the Computer through Desktop Publishing.

    ERIC Educational Resources Information Center

    Hedley, Carolyn N.; Ellsworth, Nancy J.

    1993-01-01

    Offers thoughts on the phenomenon of the underuse of classroom computers. Argues that desktop publishing is one way of overcoming the computer malaise occurring in schools, using the incentive of classroom reading and writing for mastery of many aspects of computer production, including writing, illustrating, reading, and publishing. (RS)

  12. 48 CFR 252.204-7011 - Alternative Line Item Structure.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Item 0001: Computer, Desktop with CPU, Monitor, Keyboard and Mouse; quantity 20 EA. Alternative line item structure: Item 0001: Computer, Desktop with CPU, Keyboard and Mouse; quantity 20 EA. Item 0002: Monitor; quantity 20 EA ...

  13. 48 CFR 252.204-7011 - Alternative Line Item Structure.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Item 0001: Computer, Desktop with CPU, Monitor, Keyboard and Mouse; quantity 20 EA. Alternative line item structure: Item 0001: Computer, Desktop with CPU, Keyboard and Mouse; quantity 20 EA. Item 0002: Monitor; quantity 20 EA ...

  14. 48 CFR 252.204-7011 - Alternative Line Item Structure.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Item 0001: Computer, Desktop with CPU, Monitor, Keyboard and Mouse; quantity 20 EA. Alternative line item structure: Item 0001: Computer, Desktop with CPU, Keyboard and Mouse; quantity 20 EA. Item 0002: Monitor; quantity 20 EA ...

  15. 48 CFR 252.204-7011 - Alternative Line Item Structure.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Item 0001: Computer, Desktop with CPU, Monitor, Keyboard and Mouse; quantity 20 EA. Alternative line item structure: Item 0001: Computer, Desktop with CPU, Keyboard and Mouse; quantity 20 EA. Item 0002: Monitor; quantity 20 EA ...

  16. Desktop Publishing: Its Impact on Community College Journalism.

    ERIC Educational Resources Information Center

    Grzywacz-Gray, John; And Others

    1987-01-01

    Illustrates the kinds of copy that can be created on Apple Macintosh computers and laser printers. Shows font and type specification options. Discusses desktop publishing costs, potential problems, and computer compatibility. Considers the use of computers in college journalism in production, graphics, accounting, advertising, and promotion. (AYC)

  17. Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.

  18. omniClassifier: a Desktop Grid Computing System for Big Data Prediction Modeling

    PubMed Central

    Phan, John H.; Kothari, Sonal; Wang, May D.

    2016-01-01

    Robust prediction models are important for numerous science, engineering, and biomedical applications. However, best-practice procedures for optimizing prediction models can be computationally complex, especially when choosing models from among hundreds or thousands of parameter choices. Computational complexity has further increased with the growth of data in these fields, concurrent with the era of “Big Data”. Grid computing is a potential solution to the computational challenges of Big Data. Desktop grid computing, which uses idle CPU cycles of commodity desktop machines, coupled with commercial cloud computing resources can enable research labs to gain easier and more cost effective access to vast computing resources. We have developed omniClassifier, a multi-purpose prediction modeling application that provides researchers with a tool for conducting machine learning research within the guidelines of recommended best-practices. omniClassifier is implemented as a desktop grid computing system using the Berkeley Open Infrastructure for Network Computing (BOINC) middleware. In addition to describing implementation details, we use various gene expression datasets to demonstrate the potential scalability of omniClassifier for efficient and robust Big Data prediction modeling. A prototype of omniClassifier can be accessed at http://omniclassifier.bme.gatech.edu/. PMID:27532062
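
    The work-unit idea behind desktop grids such as BOINC can be sketched by striping a hyperparameter grid across idle workers. This is an illustration of the pattern only, not omniClassifier's actual API:

```python
from itertools import product

def make_workunits(param_grid, n_workers):
    """Split a hyperparameter grid into work units for a desktop grid (schematic)."""
    combos = [dict(zip(param_grid, values)) for values in product(*param_grid.values())]
    # Stripe combinations round-robin across workers.
    return [combos[i::n_workers] for i in range(n_workers)]

grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}
units = make_workunits(grid, 4)  # 6 parameter combinations striped across 4 workers
```

    Each worker evaluates its own slice of parameter combinations on idle CPU cycles; a coordinator then merges the results to pick the best model.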

  19. Efficient Redundancy Techniques in Cloud and Desktop Grid Systems using MAP/G/c-type Queues

    NASA Astrophysics Data System (ADS)

    Chakravarthy, Srinivas R.; Rumyantsev, Alexander

    2018-03-01

    Cloud computing is continuing to prove its flexibility and versatility in helping industries and businesses, as well as academia, as a way of providing needed computing capacity. As an important alternative to cloud computing, desktop grids make it possible to utilize the idle computer resources of an enterprise/community by means of a distributed computing system, providing a more secure and controllable environment with lower operational expenses. Further, both cloud computing and desktop grids are meant to optimize limited resources and at the same time to decrease the expected latency for users. The crucial parameter for optimization, both in cloud computing and in desktop grids, is the level of redundancy (replication) for service requests/workunits. In this paper we study optimal replication policies by considering three variations of Fork-Join systems in the context of a multi-server queueing system with a versatile point process for the arrivals. For services we consider phase-type distributions as well as shifted exponential and Weibull. We use both analytical and simulation approaches in our analysis and report some interesting qualitative results.
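
    The benefit of replication is easiest to see in the simplest case: with r independent exponential replicas of a request, the first replica to finish completes at rate r times higher, so the mean latency drops to 1/(r*rate). A seeded Monte Carlo sketch of that baseline (not the paper's MAP/G/c model, which allows far more general arrival and service processes):

```python
import random

def mean_latency_with_replication(r, rate=1.0, trials=200_000, seed=42):
    """Monte Carlo estimate of E[min of r i.i.d. exponential service times].

    The minimum of r exponentials with rate `rate` is exponential with
    rate r * rate, so the expected latency is 1 / (r * rate).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.expovariate(rate) for _ in range(r))
    return total / trials
```

    With rate 1, replication level 2 should estimate a mean latency near 0.5, half the unreplicated value; the trade-off studied in the paper is that those replicas also consume extra server capacity.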

  20. Disaster easily averted? Data confidentiality and the hospital desktop computer.

    PubMed

    Sethi, Neeraj; Lane, Gethin; Newton, Sophie; Egan, Philip; Ghosh, Samit

    2014-05-01

    We specifically identified the hospital desktop computer as a potential source of breaches in confidentiality. We aimed to evaluate if there was accessible, unprotected, confidential information stored on the desktop screen on computers in a district general hospital and if so, how a teaching intervention could improve this situation. An unannounced spot check of 59 ward computers was performed. Data were collected regarding how many had confidential information stored on the desktop screen without any password protection. An online learning module was mandated for healthcare staff and a second cycle of inspection performed. A district general hospital. Two doctors conducted the audit. Computers in clinical areas were assessed. All clinical staff with computer access underwent the online learning module. An online learning module regarding data protection and confidentiality. In the first cycle, 55% of ward computers had easily accessible patient or staff confidential information stored on their desktop screen. This included handovers, referral letters, staff sick leave lists, audits and nursing reports. The majority (85%) of computers accessed were logged in under a generic username and password. The intervention produced an improvement in the second cycle findings with only 26% of computers being found to have unprotected confidential information stored on them. The failure to comply with appropriate confidential data protection regulations is a persistent problem. Education produces some improvement but we also propose a systemic approach to solving this problem. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  1. Open source OCR framework using mobile devices

    NASA Astrophysics Data System (ADS)

    Zhou, Steven Zhiying; Gilani, Syed Omer; Winkler, Stefan

    2008-02-01

    Mobile phones have evolved from passive one-to-one communication devices to powerful handheld computing devices. Today most new mobile phones are capable of capturing images, recording video, browsing the internet, and much more. Exciting new social applications are emerging on the mobile landscape, such as business card readers, sign detectors, and translators. These applications help people quickly gather information in digital format and interpret it without the need to carry laptops or tablet PCs. With all these advancements, however, very little open source software is available for mobile phones. For instance, there are currently many open source OCR engines for the desktop platform but, to our knowledge, none available on the mobile platform. Keeping this in perspective, we propose a complete text detection and recognition system with speech synthesis ability, using existing desktop technology. In this work we developed a complete OCR framework with subsystems from the open source desktop community, including a popular open source OCR engine named Tesseract for text detection and recognition, and the Flite speech synthesis module for adding text-to-speech ability.

  2. The CosmicWatch Desktop Muon Detector: a self-contained, pocket sized particle detector

    NASA Astrophysics Data System (ADS)

    Axani, S. N.; Frankiewicz, K.; Conrad, J. M.

    2018-03-01

    The CosmicWatch Desktop Muon Detector is a self-contained, hand-held cosmic ray muon detector that is valuable for astro/particle physics research applications and outreach. The material cost of each detector is under $100, and it takes a novice student approximately four hours to build their first detector. The detectors are powered via a USB connection, and the data can be recorded either directly to a computer or to a microSD card. Arduino- and Python-based software is provided to operate the detector, along with an online application to plot the data in real time. In this paper, we describe the various design features, evaluate the performance, and illustrate the detector's capabilities by providing several example measurements.

  3. Campus Computing 1993. The USC National Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in spring and summer 1993 at over 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges and community colleges. Respondents (N=1011) were individuals specifically responsible for the operation and future…

  4. 78 FR 41868 - Energy Conservation Program for Consumer Products and Certain Commercial and Industrial Equipment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-12

    ... services and manage networked resources for client devices such as desktop and laptop computers. These... include desktop or laptop computers, which are not primarily accessed via network connections. DOE seeks... Determination of Computer Servers as a Covered Consumer Product AGENCY: Office of Energy Efficiency and...

  5. Campus Computing 1991. The EDUCOM-USC Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    A national survey of desktop computing in higher education was conducted in 1991 of 2500 institutions. Data were responses from public and private research universities, public and private four-year colleges, and community colleges. Respondents (N=1099) were individuals specifically responsible for the operation and future direction of academic…

  6. Desktop Publishing for Counselors.

    ERIC Educational Resources Information Center

    Lucking, Robert; Mitchum, Nancy

    1990-01-01

    Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes computers present a way of streamlining the communications of a counseling…

  7. HEP Computing

    Science.gov Websites

    Argonne National Laboratory, High Energy Physics Division, Windows desktops. Email on ANL Exchange: see the Windows Clients section (Outlook or Thunderbird recommended). Web browsers: web browsers for Windows desktops. Software: available...

  8. The Next Generation of Lab and Classroom Computing - The Silver Lining

    DTIC Science & Technology

    2016-12-01

    A virtual desktop infrastructure (VDI) solution, as well as the computing solutions at three universities, was selected as the basis for comparison. Keywords: virtual desktop infrastructure, VDI, hardware cost, software cost, manpower, availability, cloud computing, private cloud, bring your own device, BYOD, thin client.

  9. Campus Computing 1990: The EDUCOM/USC Survey of Desktop Computing in Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.; Eastman, Skip

    The National Survey of Desktop Computer Use in Higher Education was conducted in the spring and summer of 1990 by the Center for Scholarly Technology at the University of Southern California, in cooperation with EDUCOM and with support from 15 corporate sponsors. The survey was designed to collect information about campus planning, policies, and…

  10. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    NASA Technical Reports Server (NTRS)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  11. Teaching Chemistry Using Student-Created Videos and Photo Blogs Accessed with Smartphones and Two-Dimensional Barcodes

    ERIC Educational Resources Information Center

    Benedict, Lucille; Pence, Harry E.

    2012-01-01

    Increasing numbers of college students own cell phones, and many of these phones are smartphones, which include features such as still and video cameras, global positioning systems, Internet access, and computers as powerful as the desktop models of only a few years ago. A number of chemical educators are already using these devices for education.…

  12. Design and Implementation of a Motor Incremental Shaft Encoder

    DTIC Science & Technology

    2008-09-01

    Acronyms: SDC, Student Design Center; VHDL; Verilog Hardware Description Language; VSC, Voltage Source Converters; ZCE, Zero Crossing Event. ...student to make accurate predictions of voltage source converter (VSC) behavior via software simulation; these simulated results could also be... a VSC, and several other off-the-shelf components, a circuit board interface between the FPGA and the power source, and a desktop computer [1].

  13. Climate Ocean Modeling on a Beowulf Class System

    NASA Technical Reports Server (NTRS)

    Cheng, B. N.; Chao, Y.; Wang, P.; Bondarenko, M.

    2000-01-01

    With the growing power and shrinking cost of personal computers, the availability of fast Ethernet interconnections, and public domain software packages, it is now possible to combine them to build desktop parallel computers (named Beowulf or PC clusters) at a fraction of what it would cost to buy systems of comparable power from supercomputer companies. This led us to build and assemble our own system, specifically for climate ocean modeling. In this article, we present our experience with such a system, discuss its network performance, and provide some performance comparison data with both the HP SPP2000 and the Cray T3E for an ocean model used in present-day oceanographic research.

  14. Desk-top publishing using IBM-compatible computers.

    PubMed

    Grencis, P W

    1991-01-01

    This paper sets out to describe one Medical Illustration Department's experience of the introduction of computers for desk-top publishing. In this particular case, after careful consideration of all the options open, an IBM-compatible system was installed rather than the often popular choice of an Apple Macintosh.

  15. Video Conferencing: The Next Wave for International Business Communication.

    ERIC Educational Resources Information Center

    Sondak, Norman E.; Sondak, Eileen M.

    This paper suggests that desktop computer-based video conferencing, with high fidelity sound, and group software support, is emerging as a major communications option. Briefly addressed are the following critical factors that are propelling the computer-based video conferencing revolution: (1) widespread availability of desktop computers…

  16. Image-Based Modeling Techniques for Architectural Heritage 3d Digitalization: Limits and Potentialities

    NASA Astrophysics Data System (ADS)

    Santagati, C.; Inzerillo, L.; Di Paola, F.

    2013-07-01

    3D reconstruction from images has undergone a revolution in the last few years. Computer vision techniques use photographs from data set collections to rapidly build detailed 3D models. The simultaneous application of different multi-view stereo (MVS) algorithms and of different techniques for image matching, feature extraction, and mesh optimization is an active field of research in computer vision. The results are promising: the obtained models are beginning to challenge the precision of laser-based reconstructions. Among all the possibilities we can mainly distinguish desktop and web-based packages. The latter offer the opportunity to exploit the power of cloud computing in order to carry out semi-automatic data processing, thus allowing users to fulfill other tasks on their computers, whereas desktop systems demand long processing times and heavyweight approaches. Computer vision researchers have explored many applications to verify the visual accuracy of 3D models, but approaches to verify metric accuracy are few, and none address Autodesk 123D Catch applied to architectural heritage documentation. Our approach to this challenging problem is to compare 3D models produced by Autodesk 123D Catch with 3D models produced by terrestrial LiDAR, considering different object sizes, from details (capitals, moldings, bases) to large-scale buildings, for practitioner purposes.

  17. Global computing for bioinformatics.

    PubMed

    Loewe, Laurence

    2002-12-01

    Global computing, the collaboration of idle PCs via the Internet in a SETI@home style, emerges as a new way of massive parallel multiprocessing with potentially enormous CPU power. Its relations to the broader, fast-moving field of Grid computing are discussed without attempting a review of the latter. This review (i) includes a short table of milestones in global computing history, (ii) lists opportunities global computing offers for bioinformatics, (iii) describes the structure of problems well suited for such an approach, (iv) analyses the anatomy of successful projects and (v) points to existing software frameworks. Finally, an evaluation of the various costs shows that global computing indeed has merit, if the problem to be solved is already coded appropriately and a suitable global computing framework can be found. Then, either significant amounts of computing power can be recruited from the general public, or--if employed in an enterprise-wide Intranet for security reasons--idle desktop PCs can substitute for an expensive dedicated cluster.

  18. Coal-seismic, desktop computer programs in BASIC; Part 6, Develop rms velocity functions and apply mute and normal moveout

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report presents computer programs used to develop rms velocity functions and apply mute and normal moveout to a 12-trace seismogram.
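    The normal-moveout correction the report applies follows the standard hyperbolic relation t(x) = sqrt(t0^2 + (x / v_rms)^2): each sample recorded at the moveout time t(x) is pulled back to its zero-offset time t0. The snippet below is a minimal Python illustration of that idea, not the USGS Tektronix 4051 BASIC listing; the function name and nearest-sample indexing scheme are assumptions for illustration.

```python
import numpy as np

def nmo_correct(trace, times, offset, v_rms, dt):
    """Apply normal-moveout correction to a single trace.

    For each zero-offset time t0, the reflection arrives at
    t(x) = sqrt(t0**2 + (offset / v_rms)**2); the sample recorded
    at t(x) is moved back to t0 (nearest-sample interpolation).
    """
    corrected = np.zeros_like(trace)
    for i, t0 in enumerate(times):
        tx = np.sqrt(t0 ** 2 + (offset / v_rms[i]) ** 2)
        j = int(round(tx / dt))          # nearest recorded sample
        if j < len(trace):
            corrected[i] = trace[j]
    return corrected
```

A real implementation would interpolate between samples and apply a stretch mute at far offsets, but the mapping above is the core of the correction.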

  19. MDA-image: an environment of networked desktop computers for teleradiology/pathology.

    PubMed

    Moffitt, M E; Richli, W R; Carrasco, C H; Wallace, S; Zimmerman, S O; Ayala, A G; Benjamin, R S; Chee, S; Wood, P; Daniels, P

    1991-04-01

    MDA-Image, a project of The University of Texas M. D. Anderson Cancer Center, is an environment of networked desktop computers for teleradiology/pathology. Radiographic film is digitized with a film scanner and histopathologic slides are digitized using a red, green, and blue (RGB) video camera connected to a microscope. Digitized images are stored on a data server connected to the institution's computer communication network (Ethernet) and can be displayed from authorized desktop computers connected to Ethernet. Images are digitized for cases presented at the Bone Tumor Management Conference, a multidisciplinary conference in which treatment options are discussed among clinicians, surgeons, radiologists, pathologists, radiotherapists, and medical oncologists. These radiographic and histologic images are shown on a large screen computer monitor during the conference. They are available for later review for follow-up or representation.

  20. Writing Essays on a Laptop or a Desktop Computer: Does It Matter?

    ERIC Educational Resources Information Center

    Ling, Guangming; Bridgeman, Brent

    2013-01-01

    To explore the potential effect of computer type on the Test of English as a Foreign Language-Internet-Based Test (TOEFL iBT) Writing Test, a sample of 444 international students was used. The students were randomly assigned to either a laptop or a desktop computer to write two TOEFL iBT practice essays in a simulated testing environment, followed…

  1. A comparison of the postures assumed when using laptop computers and desktop computers.

    PubMed

    Straker, L; Jones, K J; Miller, J

    1997-08-01

    This study evaluated the postural implications of using a laptop computer. Laptop computer screens and keyboards are joined, and are therefore unable to be adjusted separately in terms of screen height and distance, and keyboard height and distance. The posture required for their use is likely to be constrained, as little adjustment can be made for the anthropometric differences of users. In addition to the postural constraints, the study looked at discomfort levels and performance when using laptops as compared with desktops. Statistical analysis showed significantly greater neck flexion and head tilt with laptop use. The other body angles measured (trunk, shoulder, elbow, wrist, and scapula and neck protraction/retraction) showed no statistical differences. The average discomfort experienced after using the laptop for 20 min, although appearing greater than the discomfort experienced after using the desktop, was not significantly greater. When using the laptop, subjects tended to perform better than when using the desktop, though not significantly so. Possible reasons for the results are discussed and implications of the findings outlined.

  2. Coal-seismic, desktop computer programs in BASIC; Part 5, Perform X-square T-square analysis and plot normal moveout lines on seismogram overlay

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language used by the Tektronix 4051 Graphic System. This report presents computer programs to perform X-square/T-square analyses and to plot normal moveout lines on a seismogram overlay.
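    The X-square/T-square analysis reduces to fitting a straight line to squared traveltimes against squared offsets, since t^2 = t0^2 + x^2 / v^2: the slope gives 1/v^2 and the intercept gives t0^2. A minimal Python sketch of that fit follows (an illustration of the technique, not the report's BASIC program; the function name is assumed).

```python
import numpy as np

def x2t2_velocity(offsets, traveltimes):
    """Least-squares fit of t^2 = t0^2 + x^2 / v^2.

    Returns (v, t0) recovered from the slope and intercept of the
    t^2-versus-x^2 line.
    """
    x2 = np.asarray(offsets, dtype=float) ** 2
    t2 = np.asarray(traveltimes, dtype=float) ** 2
    slope, intercept = np.polyfit(x2, t2, 1)   # slope = 1/v^2, intercept = t0^2
    return 1.0 / np.sqrt(slope), np.sqrt(intercept)
```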

  3. Open Radio Communications Architecture Core Framework V1.1.0 Volume 1 Software Users Manual

    DTIC Science & Technology

    2005-02-01

    on a PC utilizing the KDE desktop that comes with Red Hat Linux. The default desktop for most Red Hat Linux installations is the GNOME desktop. The...SCA) v2.2. The software was designed for a desktop computer running the Linux operating system (OS). It was developed in C++, uses ACE/TAO for CORBA...middleware, Xerces for the XML parser, and Red Hat Linux for the Operating System. The software is referred to as Open Radio Communication

  4. Children in Front of Screens: Alone or in Company? Desktop or Hybrid Computer? Children's Viewing and Browsing Habit

    ERIC Educational Resources Information Center

    Zilka, Gila

    2017-01-01

    The viewing and browsing habits of Israeli children age 8-12 are the subject of this study. The participants did not have a computer at home and were given either a desktop or hybrid computer for home use. Television viewing and internet surfing habits were described, examining whether the children did so with their parents, family members, and…

  5. Creative Computer Detective: The Basics of Teaching Desktop Publishing.

    ERIC Educational Resources Information Center

    Slothower, Jodie

    Teaching desktop publishing (dtp) in college journalism classes is most effective when the instructor integrates into specific courses four types of software--a word processor, a draw program, a paint program and a layout program. In a course on design and layout, the instructor can demonstrate with the computer how good design can be created and…

  6. 26 CFR 1.179-5 - Time and manner of making election.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... desktop computer costing $1,500. On Taxpayer's 2003 Federal tax return filed on April 15, 2004, Taxpayer elected to expense under section 179 the full cost of the laptop computer and the full cost of the desktop... provided by the Internal Revenue Code, the regulations under the Code, or other guidance published in the...

  7. Application of desktop computers in nuclear engineering education

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graves, H.W. Jr.

    1990-01-01

    Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment: increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy Analysis Software Service have been utilized. These programs have been designed to be sufficiently structured to permit an orderly, disciplined solution to the problem being solved, and yet be flexible enough to accommodate most problem solution options.

  8. Desktop Manufacturing Technologies.

    ERIC Educational Resources Information Center

    Snyder, Mark

    1991-01-01

    Desktop manufacturing is the use of data from a computer-assisted design system to construct actual models of an object. Emerging processes are stereolithography, laser sintering, ballistic particle manufacturing, laminated object manufacturing, and photochemical machining. (SK)

  9. Biologically inspired collision avoidance system for unmanned vehicles

    NASA Astrophysics Data System (ADS)

    Ortiz, Fernando E.; Graham, Brett; Spagnoli, Kyle; Kelmelis, Eric J.

    2009-05-01

    In this project, we collaborate with researchers in the neuroscience department at the University of Delaware to develop a Field Programmable Gate Array (FPGA)-based embedded computer, inspired by the brains of small vertebrates (fish). The mechanisms of object detection and avoidance in fish have been extensively studied by our Delaware collaborators. The midbrain optic tectum is a biological multimodal navigation controller capable of processing input from all senses that convey spatial information, including vision, audition, touch, and the lateral line (water current sensing in fish). Unfortunately, computational complexity makes these models too slow for use in real-time applications. These simulations are run offline on state-of-the-art desktop computers, presenting a gap between the application and the target platform: a low-power embedded device. EM Photonics has expertise in developing high-performance computers based on commodity platforms such as graphics cards (GPUs) and FPGAs. FPGAs offer (1) high computational power, low power consumption, and a small footprint (in line with typical autonomous vehicle constraints), and (2) the ability to implement massively parallel computational architectures, which can be leveraged to closely emulate biological systems. Combining UD's brain-modeling algorithms and the power of FPGAs, this computer enables autonomous navigation in complex environments, and further types of onboard neural processing in future applications.

  10. Does It Matter Whether One Takes a Test on an iPad or a Desktop Computer?

    ERIC Educational Resources Information Center

    Ling, Guangming

    2016-01-01

    To investigate possible iPad related mode effect, we tested 403 8th graders in Indiana, Maryland, and New Jersey under three mode conditions through random assignment: a desktop computer, an iPad alone, and an iPad with an external keyboard. All students had used an iPad or computer for six months or longer. The 2-hour test included reading, math,…

  11. Real-time control system for adaptive resonator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flath, L; An, J; Brase, J

    2000-07-24

    Sustained operation of high average power solid-state lasers currently requires an adaptive resonator to produce the optimal beam quality. We describe the architecture of a real-time adaptive control system for correcting intra-cavity aberrations in a heat capacity laser. Image data collected from a wavefront sensor are processed and used to control phase with a high-spatial-resolution deformable mirror. Our controller takes advantage of recent developments in low-cost, high-performance processor technology. A desktop-based computational engine and an object-oriented software architecture replace the high-cost rack-mount embedded computers of previous systems.

  12. Accuracy of remote chest X-ray interpretation using Google Glass technology.

    PubMed

    Spaedy, Emily; Christakopoulos, Georgios E; Tarar, Muhammad Nauman J; Christopoulos, Georgios; Rangan, Bavana V; Roesle, Michele; Ochoa, Cristhiaan D; Yarbrough, William; Banerjee, Subhash; Brilakis, Emmanouil S

    2016-09-15

    We sought to explore the accuracy of remote chest X-ray reading using hands-free, wearable technology (Google Glass, Google, Mountain View, California). We compared interpretation of twelve chest X-rays with 23 major cardiopulmonary findings by faculty and fellows from cardiology, radiology, and pulmonary-critical care via: (1) viewing the chest X-ray image on the Google Glass screen; (2) viewing a photograph of the chest X-ray taken using Google Glass and interpreted on a mobile device; (3) viewing the original chest X-ray on a desktop computer screen. One point was given for identification of each correct finding and a subjective rating of user experience was recorded. Fifteen physicians (5 faculty and 10 fellows) participated. The average chest X-ray reading score (maximum 23 points) as viewed through the Google Glass, Google Glass photograph on a mobile device, and the original X-ray viewed on a desktop computer was 14.1±2.2, 18.5±1.5 and 21.3±1.7, respectively (p<0.0001 between Google Glass and mobile device, p<0.0001 between Google Glass and desktop computer and p=0.0004 between mobile device and desktop computer). Of 15 physicians, 11 (73.3%) felt confident in detecting findings using the photograph taken by Google Glass as viewed on a mobile device. Remote chest X-ray interpretation using hands-free, wearable technology (Google Glass) is less accurate than interpretation using a desktop computer or a mobile device, suggesting that further technical improvements are needed before widespread application of this novel technology. Published by Elsevier Ireland Ltd.

  13. Influences of Gender and Computer Gaming Experience in Occupational Desktop Virtual Environments: A Cross-Case Analysis Study

    ERIC Educational Resources Information Center

    Ausburn, Lynna J.; Ausburn, Floyd B.; Kroutter, Paul J.

    2013-01-01

    This study used a cross-case analysis methodology to compare four line-of-inquiry studies of desktop virtual environments (DVEs) to examine the relationships of gender and computer gaming experience to learning performance and perceptions. Comparison was made of learning patterns in a general non-technical DVE with patterns in technically complex,…

  14. Capturing and analyzing wheelchair maneuvering patterns with mobile cloud computing.

    PubMed

    Fu, Jicheng; Hao, Wei; White, Travis; Yan, Yuqing; Jones, Maria; Jan, Yih-Kuen

    2013-01-01

    Power wheelchairs have been widely used to provide independent mobility to people with disabilities. Despite great advancements in power wheelchair technology, research shows that wheelchair related accidents occur frequently. To ensure safe maneuverability, capturing wheelchair maneuvering patterns is fundamental to enable other research, such as safe robotic assistance for wheelchair users. In this study, we propose to record, store, and analyze wheelchair maneuvering data by means of mobile cloud computing. Specifically, the accelerometer and gyroscope sensors in smart phones are used to record wheelchair maneuvering data in real-time. Then, the recorded data are periodically transmitted to the cloud for storage and analysis. The analyzed results are then made available to various types of users, such as mobile phone users, traditional desktop users, etc. The combination of mobile computing and cloud computing leverages the advantages of both techniques and extends the smart phone's capabilities of computing and data storage via the Internet. We performed a case study to implement the mobile cloud computing framework using Android smart phones and Google App Engine, a popular cloud computing platform. Experimental results demonstrated the feasibility of the proposed mobile cloud computing framework.

  15. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  16. Computational algorithms for simulations in atmospheric optics.

    PubMed

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programing is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
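    The spectral-phase method for generating random fields is commonly implemented by filtering complex white noise with the square root of the field's power spectrum and inverse-FFT'ing the result. The sketch below is a minimal single-threaded Python illustration of that idea for a Kolmogorov-type phase spectrum; the constants and the omission of absolute amplitude scaling are simplifications, and this is not the authors' modified or parallel algorithm.

```python
import numpy as np

def phase_screen(n, r0, dx, rng):
    """Generate a Kolmogorov-like random phase screen by the spectral
    method: complex white noise is shaped by the square root of the
    phase power spectral density and inverse-FFT'd.

    n: grid size, r0: Fried parameter (m), dx: grid spacing (m).
    Absolute amplitude scaling is omitted in this sketch.
    """
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    f = np.hypot(FX, FY)
    f[0, 0] = np.inf                      # zero out the undefined piston term
    psd = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return np.fft.ifft2(noise * np.sqrt(psd)).real
```

Time-variant screens, as in the paper, are typically obtained by shifting such screens across the aperture or by evolving the spectral coefficients between frames.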

  17. An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.

    PubMed

    Szeto, Grace P; Lee, Raymond

    2002-04-01

    To evaluate and compare the postures and movements of the cervical and upper thoracic spine, the typing performance, and workstation ergonomic factors when using desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students between ages 20 and 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task using each of the 3 computers. Measurements during the typing task were taken at set intervals. The cervical and thoracic spines adopted a more flexed posture when using the smaller-sized computers. There were significantly greater neck movements when using desktop computers compared with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as the computer size decreased. Typing performance and subjective rating of difficulty in using the keyboards were also significantly different among the 3 types of computers. Computer users need to consider the posture of the spine and the potential risk of developing musculoskeletal discomfort in choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation

  18. Using Mosix for Wide-Area Computational Resources

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.

  19. Students' Beliefs about Mobile Devices vs. Desktop Computers in South Korea and the United States

    ERIC Educational Resources Information Center

    Sung, Eunmo; Mayer, Richard E.

    2012-01-01

    College students in the United States and in South Korea completed a 28-item multidimensional scaling (MDS) questionnaire in which they rated the similarity of 28 pairs of multimedia learning materials on a 10-point scale (e.g., narrated animation on a mobile device vs. movie clip on a desktop computer) and a 56-item semantic differential…

  20. Validation of tablet-based evaluation of color fundus images

    PubMed Central

    Christopher, Mark; Moga, Daniela C.; Russell, Stephen R.; Folk, James C.; Scheetz, Todd; Abràmoff, Michael D.

    2012-01-01

    Purpose To compare diabetic retinopathy (DR) referral recommendations made by viewing fundus images using a tablet computer to recommendations made using a standard desktop display. Methods A tablet computer (iPad) and a desktop PC with a high-definition color display were compared. For each platform, two retinal specialists independently rated 1200 color fundus images from patients at risk for DR using an annotation program, Truthseeker. The specialists determined whether each image had referable DR, and also how urgently each patient should be referred for medical examination. Graders viewed and rated the randomly presented images independently and were masked to their ratings on the alternative platform. Tablet- and desktop display-based referral ratings were compared using cross-platform, intra-observer kappa as the primary outcome measure. Additionally, inter-observer kappa, sensitivity, specificity, and area under ROC (AUC) were determined. Results A high level of cross-platform, intra-observer agreement was found for the DR referral ratings between the platforms (κ=0.778), and for the two graders, (κ=0.812). Inter-observer agreement was similar for the two platforms (κ=0.544 and κ=0.625 for tablet and desktop, respectively). The tablet-based ratings achieved a sensitivity of 0.848, a specificity of 0.987, and an AUC of 0.950 compared to desktop display-based ratings. Conclusions In this pilot study, tablet-based rating of color fundus images for subjects at risk for DR was consistent with desktop display-based rating. These results indicate that tablet computers can be reliably used for clinical evaluation of fundus images for DR. PMID:22495326
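    Cohen's kappa, the primary outcome measure above, can be computed directly from the two label sequences being compared: it is the observed agreement corrected for the agreement expected by chance from each rater's marginal label frequencies. A plain-Python sketch of the statistic (an illustration, not the Truthseeker implementation):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two ratings of the same items,
    e.g. the same grader's referral calls on tablet vs. desktop."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of items rated identically.
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each category's marginal frequencies.
    pa, pb = Counter(ratings_a), Counter(ratings_b)
    expected = sum(pa[c] * pb[c] for c in pa) / n ** 2
    return (observed - expected) / (1.0 - expected)
```

Perfect agreement gives kappa = 1, and agreement at exactly the chance rate gives kappa = 0, which is the scale on which values such as 0.778 above are read.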

  1. Using Avizo Software on the Peregrine System | High-Performance Computing |

    Science.gov Websites

    Avizo can be run remotely from the Peregrine visualization node. First, launch a TurboVNC remote desktop. Then, from a terminal in that remote desktop: % module load avizo % vglrun avizo. Running Locally: Avizo can…

  2. RighTime: A real time clock correcting program for MS-DOS-based computer systems

    NASA Technical Reports Server (NTRS)

    Becker, G. Thomas

    1993-01-01

    A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, display of its history log, and which provide data for graphing of the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.
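    RighTime's core idea, learning a linear drift rate from occasional accurate time sets and correcting the clock between them, can be sketched in a few lines. This is a hypothetical Python illustration of the drift-learning step only; the real program is a memory-resident DOS utility that also disciplines the hardware CMOS RTC and distinguishes warm from cool errors.

```python
class DriftCorrector:
    """Learn a linear clock drift rate (in parts per million) from
    accurate time sets, then correct readings taken between sets."""

    def __init__(self):
        self.ref_raw = None    # raw clock value at the last accurate set
        self.ref_true = None   # true time at the last accurate set
        self.ppm = 0.0         # estimated drift, parts per million

    def time_set(self, raw_clock, true_time):
        """Record an accurate time set and refine the drift estimate."""
        if self.ref_raw is not None and raw_clock != self.ref_raw:
            elapsed_raw = raw_clock - self.ref_raw
            error = (true_time - self.ref_true) - elapsed_raw
            self.ppm = 1e6 * error / elapsed_raw
        self.ref_raw, self.ref_true = raw_clock, true_time

    def corrected(self, raw_clock):
        """Return the drift-corrected time for a raw clock reading."""
        elapsed = raw_clock - self.ref_raw
        return self.ref_true + elapsed * (1.0 + self.ppm / 1e6)
```

After two accurate sets a clock that runs uniformly fast or slow reads correctly between them, which is the "better than one part per million" regime the abstract describes.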

  3. Onward to Petaflops Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    With programs such as the US High Performance Computing and Communications Program (HPCCP), the attention of scientists and engineers worldwide has been focused on the potential of very high performance scientific computing, namely systems that are hundreds or thousands of times more powerful than those typically available in desktop systems at any given point in time. Extending the frontiers of computing in this manner has resulted in remarkable advances, both in computing technology itself and also in the various scientific and engineering disciplines that utilize these systems. Within a month or two, a sustained rate of 1 Tflop/s (also written 1 teraflops, or 10(exp 12) floating-point operations per second) is likely to be achieved by the 'ASCI Red' system at Sandia National Laboratory in New Mexico. With this objective in sight, it is reasonable to ask what lies ahead for high-end computing.

  4. Big Memory Elegance: HyperCard Information Processing and Desktop Publishing.

    ERIC Educational Resources Information Center

    Bitter, Gary G.; Gerson, Charles W., Jr.

    1991-01-01

    Discusses hardware requirements, functions, and applications of five information processing and desktop publishing software packages for the Macintosh: HyperCard, PageMaker, Cricket Presents, Power Point, and Adobe Illustrator. Benefits of these programs for schools are considered. (MES)

  5. Desktop Publishing: A New Frontier for Instructional Technologists.

    ERIC Educational Resources Information Center

    Bell, Norman T.; Warner, James W.

    1986-01-01

    Discusses new possibilities that computers and laser printers offer instructional technologists. Includes a brief history of printed communications, a description of new technological advances referred to as "desktop publishing," and suggests the application of this technology to instructional tasks. (TW)

  6. CIM for 300-mm semiconductor fab

    NASA Astrophysics Data System (ADS)

    Luk, Arthur

    1997-08-01

    Five years ago, factory automation (F/A) was not prevalent in the fab. Today, facing a drastically changed market and intense competition, management requests that plant-floor data be forwarded to their desktop computers. This growing demand has rapidly pushed F/A toward computer-integrated manufacturing (CIM). Through miniaturization, computer size was successfully reduced until systems could sit on our desktops; the PC initiated a new computing era. With the advent of the network, the network computer (NC) creates fresh problems for us. As we plan to invest more than $3 billion to build a new 300 mm fab, the next-generation technology raises a challenging bar.

  7. MICROPROCESSOR-BASED DATA-ACQUISITION SYSTEM FOR A BOREHOLE RADAR.

    USGS Publications Warehouse

    Bradley, Jerry A.; Wright, David L.

    1987-01-01

    An efficient microprocessor-based system is described that permits real-time acquisition, stacking, and digital recording of data generated by a borehole radar system. Although the system digitizes, stacks, and records independently of a computer, it is interfaced to a desktop computer for program control over system parameters such as sampling interval, number of samples, number of times the data are stacked prior to recording on nine-track tape, and for graphics display of the digitized data. The data can be transferred to the desktop computer during recording, or it can be played back from a tape at a later time. Using the desktop computer, the operator observes results while recording data and generates hard-copy graphics in the field. Thus, the radar operator can immediately evaluate the quality of data being obtained, modify system parameters, study the radar logs before leaving the field, and rerun borehole logs if necessary. The system has proven to be reliable in the field and has increased productivity both in the field and in the laboratory.

  8. The desktop muon detector: A simple, physics-motivated machine- and electronics-shop project for university students

    NASA Astrophysics Data System (ADS)

    Axani, S. N.; Conrad, J. M.; Kirby, C.

    2017-12-01

    This paper describes the construction of a desktop muon detector, an undergraduate-level physics project that develops machine-shop and electronics-shop technical skills. The desktop muon detector is a self-contained apparatus that employs a plastic scintillator as the detection medium and a silicon photomultiplier for light collection. This detector can be battery powered and is used in conjunction with the provided software. The total cost per detector is approximately $100. We describe physics experiments we have performed, and then suggest several other interesting measurements that are possible, with one or more desktop muon detectors.

  9. Desktop computer graphics for RMS/payload handling flight design

    NASA Technical Reports Server (NTRS)

    Homan, D. J.

    1984-01-01

    A computer program, the Multi-Adaptive Drawings, Renderings and Similitudes (MADRAS) program, is discussed. The modeling program, written for a desktop computer system (the Hewlett-Packard 9845/C), is written in BASIC and uses modular construction of objects while generating both wire-frame and hidden-line drawings from any viewpoint. The dimensions and placement of objects are user definable. Once the hidden-line calculations are made for a particular viewpoint, the viewpoint may be rotated in pan, tilt, and roll without further hidden-line calculations. The use and results of this program are discussed.

  10. A VM-shared desktop virtualization system based on OpenStack

    NASA Astrophysics Data System (ADS)

    Liu, Xi; Zhu, Mingfa; Xiao, Limin; Jiang, Yuanjie

    2018-04-01

    With the increasing popularity of cloud computing, desktop virtualization has risen in recent years as a branch of virtualization technology. However, existing desktop virtualization systems are mostly designed in a one-to-one mode, in which one VM can be accessed by only one user. Meanwhile, previous desktop virtualization systems perform poorly in terms of response time and cost savings. This paper proposes a novel VM-Shared desktop virtualization system based on the OpenStack platform. We modified the connecting process and the display-data transmission process of the remote display protocol SPICE to support the VM-Shared function. In addition, we propose a server-push display mode to improve the user's interactive experience. The experimental results show that our system performs well in response time and achieves low CPU consumption.

  11. The Nimrod computational workbench: a case study in desktop metacomputing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramson, D.; Sosic, R.; Foster, I.

    The coordinated use of geographically distributed computers, or metacomputing, can in principle provide more accessible and cost-effective supercomputing than conventional high-performance systems. However, we lack evidence that metacomputing systems can be made easily usable, or that there exist large numbers of applications able to exploit metacomputing resources. In this paper, we present work that addresses both these concerns. The basis for this work is a system called Nimrod that provides a desktop problem-solving environment for parametric experiments. We describe how Nimrod has been extended to support the scheduling of computational resources located in a wide-area environment, and report on an experiment in which Nimrod was used to schedule a large parametric study across the Australian Internet. The experiment provided both new scientific results and insights into Nimrod capabilities. We relate the results of this experiment to lessons learned from the I-WAY distributed computing experiment, and draw conclusions as to how Nimrod and I-WAY-like computing environments should be developed to support desktop metacomputing.
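    A parametric experiment of the kind Nimrod manages amounts to fanning the cross-product of parameter values out to workers and collecting one result per point. The toy Python sketch below illustrates that pattern with local threads where Nimrod would schedule wide-area machines; the model function and names are hypothetical, not part of Nimrod.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def run_model(params):
    # Stand-in for one simulation run at one parameter point
    # (Nimrod would dispatch a real executable here).
    x, y = params
    return params, x ** 2 + y

def parametric_sweep(xs, ys, workers=4):
    """Run the model at every point of the parameter cross-product
    and return {parameter_point: result}."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(run_model, product(xs, ys)))
```

The scheduling problem Nimrod solves is deciding which remote resource gets which point, under availability and cost constraints, rather than the fan-out itself.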

  12. ATM: The Key To Harnessing the Power of Networked Multimedia.

    ERIC Educational Resources Information Center

    Gross, Rod

    1996-01-01

    ATM (Asynchronous Transfer Mode) network technology handles the real-time continuous traffic flow necessary to support desktop multimedia applications. Describes network applications already used: desktop video collaboration, distance learning, and broadcasting video delivery. Examines the architecture of ATM technology, video delivery and sound…

  13. Contemporary issues in HIM. The application layer--III.

    PubMed

    Wear, L L; Pinkert, J R

    1993-07-01

    We have seen document preparation systems evolve from basic line editors through powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used, and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful. They will range from a simple notebook reminder to a sophisticated photograph processor. Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS. Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphic programs, spreadsheets, and other applications. Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines. Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.

  14. Monte Carlo simulation of electrothermal atomization on a desktop personal computer

    NASA Astrophysics Data System (ADS)

    Histen, Timothy E.; Güell, Oscar A.; Chavez, Iris A.; Holcombe, James A.

    1996-07-01

    Monte Carlo simulations have been applied to electrothermal atomization (ETA) using a tubular atomizer (e.g. graphite furnace) because of the complexity in the geometry, heating, molecular interactions, etc. The intense computational time needed to accurately model ETA often limited its effective implementation to the use of supercomputers. However, with the advent of more powerful desktop processors, this is no longer the case. A C-based program has been developed and can be used under Windows™ or DOS. With this program, basic parameters such as furnace dimensions, sample placement, and furnace heating, as well as kinetic parameters such as activation energies for desorption and adsorption, can be varied to show the absorbance profile dependence on these parameters. Even data such as the time-dependent spatial distribution of analyte inside the furnace can be collected. The DOS version also permits input of external temperature-time data to permit comparison of simulated profiles with experimentally obtained absorbance data. The run-time versions are provided along with the source code. This article is an electronic publication in Spectrochimica Acta Electronica (SAE), the electronic section of Spectrochimica Acta Part B (SAB). The hardcopy text is accompanied by a diskette with a program (PC format), data files and text files.
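
    The simulation approach described above can be illustrated with a minimal sketch (pure Python; all parameter values are hypothetical, not taken from the paper): atoms desorb from the furnace wall with a temperature-dependent Arrhenius probability as the furnace heats, then random-walk along the tube axis until they are lost out the ends, and the count of gas-phase atoms per time step approximates an absorbance transient.

```python
import math
import random

def simulate_eta(n_atoms=2000, e_des=5.0e4, ramp=1500.0, t0=300.0,
                 steps=200, dt=0.01, tube_half_len=10.0, seed=1):
    """Toy Monte Carlo of electrothermal atomization (illustrative only).

    Atoms start adsorbed on the wall, desorb with an Arrhenius
    probability that grows as the furnace heats linearly, then take a
    1-D random walk along the tube axis; atoms past the tube ends are
    lost. Returns in-furnace gas-phase atom counts per time step (an
    absorbance-like transient).
    """
    rng = random.Random(seed)
    R = 8.314                       # gas constant, J/(mol K)
    positions = [None] * n_atoms    # None = still adsorbed on the wall
    profile = []
    for step in range(steps):
        temp = t0 + ramp * step * dt                 # linear heating ramp
        p_des = min(1.0, math.exp(-e_des / (R * temp)))
        in_gas = 0
        for i in range(n_atoms):
            if positions[i] is None:
                if rng.random() < p_des:
                    positions[i] = 0.0               # desorbed at the wall
            elif positions[i] != "lost":
                positions[i] += rng.gauss(0.0, 0.5)  # axial diffusion step
                if abs(positions[i]) > tube_half_len:
                    positions[i] = "lost"            # swept out of the tube
            if positions[i] not in (None, "lost"):
                in_gas += 1
        profile.append(in_gas)
    return profile

profile = simulate_eta()
peak = max(profile)
```

The same skeleton accommodates the parameters the paper lets users vary: furnace dimensions map to `tube_half_len`, heating to `ramp`, and desorption kinetics to `e_des`.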

  15. Use phase signals to promote lifetime extension for Windows PCs.

    PubMed

    Hickey, Stewart; Fitzpatrick, Colin; O'Connell, Maurice; Johnson, Michael

    2009-04-01

    This paper proposes a signaling methodology for personal computers. Signaling may be viewed as an ecodesign strategy that can positively influence the consumer-to-consumer (C2C) market process. A number of parameters are identified that can provide the basis for signal implementation. These include operating time, operating temperature, operating voltage, power cycle counts, hard disk drive (HDD) self-monitoring, analysis, and reporting technology (SMART) attributes, and operating system (OS) event information. All these parameters are currently attainable or derivable via embedded technologies in modern desktop systems. A case study is presented detailing a technical implementation of how signals can be developed in personal computers that run Microsoft Windows operating systems. Collation of lifetime temperature data from a system processor is demonstrated as a possible means of characterizing a usage profile for a desktop system. In addition, event log data is utilized for devising signals indicative of OS quality. The provision of lifetime usage data in the form of intuitive signals indicative of both hardware and software quality can, in conjunction with consumer education, facilitate an optimal remarketing strategy for used systems. This implementation requires no additional hardware.
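
    In the spirit of the lifetime-temperature characterization the paper describes, a usage-profile signal can be sketched as simple statistics over a logged temperature series (the sample data and the 70 °C stress threshold below are hypothetical, not values from the study):

```python
def usage_signal(temps_c, hot_threshold=70.0):
    """Summarize a lifetime CPU-temperature log into intuitive signals.

    temps_c: periodic temperature samples in degrees Celsius.
    Returns the mean, the peak, and the fraction of samples spent above
    a (hypothetical) stress threshold -- a crude proxy for how hard the
    system was driven over its life.
    """
    if not temps_c:
        raise ValueError("empty temperature log")
    mean = sum(temps_c) / len(temps_c)
    peak = max(temps_c)
    hot_fraction = sum(1 for t in temps_c if t > hot_threshold) / len(temps_c)
    return {"mean_c": mean, "peak_c": peak, "hot_fraction": hot_fraction}

# Example: a log dominated by light use with occasional heavy load.
log = [45, 48, 50, 47, 72, 75, 46, 44, 68, 71]
sig = usage_signal(log)
```

A second-hand buyer could then compare `hot_fraction` across used systems much as the paper's C2C signaling envisions.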

  16. Desktop Virtualization: Applications and Considerations

    ERIC Educational Resources Information Center

    Hodgman, Matthew R.

    2013-01-01

    As educational technology continues to rapidly become a vital part of a school district's infrastructure, desktop virtualization promises to provide cost-effective and education-enhancing solutions to school-based computer technology problems in school systems locally and abroad. This article outlines the history of and basic concepts behind…

  17. Modeling of Heat Transfer and Ablation of Refractory Material Due to Rocket Plume Impingement

    NASA Technical Reports Server (NTRS)

    Harris, Michael F.; Vu, Bruce T.

    2012-01-01

    CR Tech's Thermal Desktop-SINDA/FLUINT software was used in the thermal analysis of a flame deflector design for Launch Complex 39B at Kennedy Space Center, Florida. The analysis of the flame deflector takes into account heat transfer due to plume impingement from expected vehicles to be launched at KSC. The heat flux from the plume was computed using computational fluid dynamics provided by Ames Research Center in Moffett Field, California. The results from the CFD solutions were mapped onto a 3-D Thermal Desktop model of the flame deflector using the boundary condition mapping capabilities in Thermal Desktop. The ablation subroutine in SINDA/FLUINT was then used to model the ablation of the refractory material.

  18. Basics of Desktop Publishing. Teacher Edition.

    ERIC Educational Resources Information Center

    Beeby, Ellen

    This color-coded teacher's guide contains curriculum materials designed to give students an awareness of various desktop publishing techniques before they determine their computer hardware and software needs. The guide contains six units, each of which includes some or all of the following basic components: objective sheet, suggested activities…

  19. Volunteered Cloud Computing for Disaster Management

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster management relies increasingly on interpreting earth observations and running numerical models, which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however, some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern/trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context.
Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects; automates reconfiguration of their virtual machines; ensures accountability for donated computing; and optimizes the use of "interstitial" computing. Initial applications include fire detection from multispectral satellite imagery and flood risk mapping through hydrological simulations.
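
    The embarrassingly parallel pattern the abstract relies on — a coordinator splits a job into independent subtasks and farms them out to whatever workers are available — can be sketched in a few lines (illustrative only; `classify_tile` and its brightness threshold are stand-ins, and the real platform adds networking, fault tolerance, and accountability):

```python
from concurrent.futures import ThreadPoolExecutor

def classify_tile(tile):
    """Stand-in for per-tile image analysis (e.g. fire detection):
    flag a tile whose mean brightness exceeds a hypothetical threshold."""
    return sum(tile) / len(tile) > 200

def run_volunteered(tiles, n_workers=4):
    """Coordinator: farm independent subtasks out to available workers.
    Each tile is self-contained, so completion order and worker count do
    not affect the result -- the embarrassingly parallel case."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(classify_tile, tiles))

tiles = [[10, 20, 30], [250, 240, 230], [100, 110, 90], [255, 200, 210]]
flags = run_volunteered(tiles)
```

Because subtasks carry no shared state, the same dispatch loop works whether workers are volunteered desktops or "interstitial" cloud instances.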

  20. Desktop Publishing: Probable Effects on University Extension.

    ERIC Educational Resources Information Center

    Misanchuk, Earl R.

    Desktop publishing (DTP) could potentially become a powerful, relatively inexpensive tool for use in university extension activities. This paper describes and explains the characteristics of DTP and examines its effects on university extension. In addition, it outlines the kinds of hardware, software, and skills needed, along with their costs; describes new…

  1. Exciting Normal Distribution

    ERIC Educational Resources Information Center

    Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd

    2008-01-01

    This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD) based on the Computer Algebra System Mathematica was used for symbolical and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…

  2. Searching on the Run

    ERIC Educational Resources Information Center

    Tenopir, Carol

    2004-01-01

    With wireless connectivity and small laptop computers, people are no longer tied to the desktop for online searching. Handheld personal digital assistants (PDAs) offer even greater portability. So far, the most common uses of PDAs are as calendars and address books, or to interface with a laptop or desktop machine. More advanced PDAs, like…

  3. Designing Design into an Advanced Desktop Publishing Course (A Teaching Tip).

    ERIC Educational Resources Information Center

    Guthrie, Jim

    1995-01-01

    Describes an advanced desktop publishing course that combines instruction in a few advanced techniques for using software with extensive discussion of such design principles as consistency, proportion, asymmetry, appropriateness, contrast, and color. Describes computer hardware and software, class assignments, problems, and the rationale for such…

  4. WEB-BASED MODELING OF A FERTILIZER SOLUTION SPILL IN THE OHIO RIVER

    EPA Science Inventory

    Environmental computer models are usually desktop models. Some web-enabled models are beginning to appear where the user can use a browser to run the models on a central web server. Several issues arise when a desktop model is transferred to a web architecture. This paper discuss...

  5. Viewpoints: A New Computer Program for Interactive Exploration of Large Multivariate Space Science and Astrophysics Data.

    NASA Astrophysics Data System (ADS)

    Levit, Creon; Gazis, P.

    2006-06-01

    The graphics processing units (GPUs) built into all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform (Windows, Linux, Apple OS X) application which leverages some of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application area is the interactive analysis of complex, multivariate space science and astrophysics data sets, with dimensionalities that may surpass 100 and sample sizes that may reach 10^6-10^8.

  6. Preparation of digital movie clips for online journal publication.

    PubMed

    Yam, Chun-Shan

    2006-07-01

    This article presents general guidelines for preparing movie clips for online journal publication. As more and more radiology journals establish an online presence, radiologists wishing to submit journal articles with movie clips need to understand the electronic submission process. Viewing a movie clip via an online journal is different from viewing one with PowerPoint using a local desktop computer because the movie file must first be downloaded onto the client computer before it can be displayed. Users thus should be cautious in selecting movie format and compression when creating movie clips for online journals. This article provides step-by-step demonstrations and general guidelines for movie format and compression selections.

  7. Practical experience with graphical user interfaces and object-oriented design in the clinical laboratory.

    PubMed

    Wells, I G; Cartwright, R Y; Farnan, L P

    1993-12-15

    The computing strategy in our laboratories evolved from research in Artificial Intelligence, and is based on powerful software tools running on high performance desktop computers with a graphical user interface. This allows most tasks to be regarded as design problems rather than implementation projects, and both rapid prototyping and an object-oriented approach to be employed during the in-house development and enhancement of the laboratory information systems. The practical application of this strategy is discussed, with particular reference to the system designer, the laboratory user and the laboratory customer. Routine operation covers five departments, and the systems are stable, flexible and well accepted by the users. Client-server computing, currently undergoing final trials, is seen as the key to further development, and this approach to Pathology computing has considerable potential for the future.

  8. Oak Ridge Institutional Cluster Autotune Test Drive Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jibonananda, Sanyal; New, Joshua Ryan

    2014-02-01

    The Oak Ridge Institutional Cluster (OIC) provides general purpose computational resources for ORNL staff to run computation-heavy jobs that are larger than desktop applications but do not quite require the scale and power of the Oak Ridge Leadership Computing Facility (OLCF). This report details the efforts made and conclusions derived in performing a short test drive of the cluster resources on Phase 5 of the OIC. EnergyPlus was used in the analysis as a candidate user program, and the overall software environment was evaluated against challenges anticipated from experience with resources such as the shared-memory Nautilus (JICS) and Titan (OLCF). The OIC performed within reason and was found to be acceptable in the context of running EnergyPlus simulations. The number of cores per node and the availability of scratch space per node allow non-traditional, desktop-focused applications to leverage parallel ensemble execution. Although only individual runs of EnergyPlus were executed, the software environment on the OIC appeared suitable for running ensemble simulations with some modifications to the Autotune workflow. From a standpoint of general usability, the system supports common Linux libraries, compilers, standard job scheduling software (Torque/Moab), and the OpenMPI library (the only MPI library) for MPI communications. The file system is a Panasas file system, which the literature indicates is efficient.

  9. A New Definition for Ground Control

    NASA Technical Reports Server (NTRS)

    2002-01-01

    LandForm(R) VisualFlight(R) blends the power of a geographic information system with the speed of a flight simulator to transform a user's desktop computer into a "virtual cockpit." The software product, which is fully compatible with all Microsoft(R) Windows(R) operating systems, provides distributed, real-time three-dimensional flight visualization over a host of networks. From a desktop, a user can immediately obtain a cockpit view, a chase-plane view, or an airborne tracker view. A customizable display also allows the user to overlay various flight parameters, including latitude, longitude, altitude, pitch, roll, and heading information. Rapid Imaging Software sought assistance from NASA, and the VisualFlight technology came to fruition under a Phase II SBIR contract with Johnson Space Center in 1998. Three years later, on December 13, 2001, Ken Ham successfully flew NASA's X-38 spacecraft from a remote, ground-based cockpit using LandForm VisualFlight as part of his primary situation awareness display in a flight test at Edwards Air Force Base, California.

  10. IMAGE EXPLORER: Astronomical Image Analysis on an HTML5-based Web Application

    NASA Astrophysics Data System (ADS)

    Gopu, A.; Hayashi, S.; Young, M. D.

    2014-05-01

    Large datasets produced by recent astronomical imagers mean that the traditional paradigm for basic visual analysis - typically downloading one's entire image dataset and using desktop clients like DS9, Aladin, etc. - no longer scales, despite advances in desktop computing power and storage. This paper describes Image Explorer, a web framework that offers several of the basic visualization and analysis functions commonly provided by tools like DS9, on any HTML5-capable web browser on various platforms. It uses a combination of the modern HTML5 canvas, JavaScript, and several layers of lossless PNG tiles produced from the FITS image data. Astronomers are able to rapidly and simultaneously open several images in their web browser; adjust the intensity min/max cutoff, its scaling function, and the zoom level; apply color-maps; view position and FITS header information; execute commonly used data reduction codes on the corresponding FITS data using the FRIAA framework; and overlay tiles for source catalog objects.
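
    The min/max cutoff and scaling-function adjustment such a viewer applies can be sketched as a mapping from raw pixel values to 8-bit tile intensities (an illustrative sketch of the general technique, not the Image Explorer source):

```python
import math

def stretch(pixels, vmin, vmax, scale="linear"):
    """Map raw pixel values to 0-255 display intensities.

    Values are clipped to [vmin, vmax], then scaled linearly or
    logarithmically -- the kind of cutoff/scaling adjustment an
    astronomical image viewer applies before rendering a tile.
    """
    out = []
    span = vmax - vmin
    for p in pixels:
        x = min(max(p, vmin), vmax) - vmin   # clip to the cutoff window
        if scale == "log":
            x = math.log1p(x) / math.log1p(span)
        else:
            x = x / span
        out.append(round(255 * x))
    return out

display = stretch([0, 500, 1000, 2000], vmin=0, vmax=1000)
```

Re-running `stretch` with new cutoffs on cached tile data is cheap, which is what makes the adjustment feel interactive in the browser.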

  11. Massive Exploration of Perturbed Conditions of the Blood Coagulation Cascade through GPU Parallelization

    PubMed Central

    Cazzaniga, Paolo; Nobile, Marco S.; Besozzi, Daniela; Bellini, Matteo; Mauri, Giancarlo

    2014-01-01

    The introduction of general-purpose Graphics Processing Units (GPUs) is boosting scientific applications in Bioinformatics, Systems Biology, and Computational Biology. In these fields, the use of high-performance computing solutions is motivated by the need to perform large numbers of in silico analyses to study the behavior of biological systems in different conditions, which requires computing power that usually exceeds the capability of standard desktop computers. In this work we present coagSODA, a CUDA-powered computational tool that was purposely developed for the analysis of a large mechanistic model of the blood coagulation cascade (BCC), defined according to both mass-action kinetics and Hill functions. coagSODA allows the execution of parallel simulations of the dynamics of the BCC by automatically deriving the system of ordinary differential equations and then exploiting the numerical integration algorithm LSODA. We present the biological results achieved with a massive exploration of perturbed conditions of the BCC, carried out with one-dimensional and bi-dimensional parameter sweep analysis, and show that GPU-accelerated parallel simulations of this model can increase computational performance by up to a 181× speedup compared to the corresponding sequential simulations. PMID:25025072
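
    A one-dimensional parameter sweep over an ODE model, of the kind coagSODA parallelizes, can be sketched with a toy mass-action reaction (a minimal illustration: the model A → B and the forward-Euler integrator are stand-ins; coagSODA itself uses the adaptive LSODA algorithm on a much larger system):

```python
def simulate(k, a0=1.0, dt=0.001, steps=1000):
    """Integrate the mass-action system A -> B (dA/dt = -k*A) with
    forward Euler, returning the final concentration of A. Euler keeps
    this sketch dependency-free; a production code would use an
    adaptive stiff/non-stiff integrator such as LSODA."""
    a = a0
    for _ in range(steps):
        a += dt * (-k * a)
    return a

def parameter_sweep(k_values):
    """One-dimensional sweep: one independent simulation per rate
    constant. The independence of runs is what lets a GPU assign one
    thread (or block) per parameter set."""
    return {k: simulate(k) for k in k_values}

final_a = parameter_sweep([0.5, 1.0, 2.0])
```

A bi-dimensional sweep is the same loop over pairs of parameters; only the number of independent runs grows.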

  12. Generating Alternative Engineering Designs by Integrating Desktop VR with Genetic Algorithms

    ERIC Educational Resources Information Center

    Chandramouli, Magesh; Bertoline, Gary; Connolly, Patrick

    2009-01-01

    This study proposes an innovative solution to the problem of multiobjective engineering design optimization by integrating desktop VR with genetic computing. Although, this study considers the case of construction design as an example to illustrate the framework, this method can very much be extended to other engineering design problems as well.…

  13. Meaning-Making in Online Language Learner Interactions via Desktop Videoconferencing

    ERIC Educational Resources Information Center

    Satar, H. Müge

    2016-01-01

    Online language learning and teaching in multimodal contexts has been identified as one of the key research areas in computer-assisted language learning (CALL) (Lamy, 2013; White, 2014). This paper aims to explore meaning-making in online language learner interactions via desktop videoconferencing (DVC) and in doing so illustrate multimodal transcription and…

  14. Desktop Publishing: The Effects of Computerized Formats on Reading Speed and Comprehension.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    1989-01-01

    Describes study that was conducted to determine the effects of two electronic text variables used in desktop publishing on undergraduate students' reading speed and comprehension. Research on text variables, graphic design, instructional text design, and computer screen design is discussed, and further studies are suggested. (22 references) (LRW)

  15. Practical Downloading to Desktop Publishing: Enhancing the Delivery of Information.

    ERIC Educational Resources Information Center

    Danziger, Pamela N.

    This paper is addressed to librarians and information managers who, as one of the many activities they routinely perform, frequently publish information in such formats as newsletters, manuals, brochures, forms, presentations, or reports. It is argued that desktop publishing--a personal computer-based software package used to generate documents of…

  16. Software tools for interactive instruction in radiologic anatomy.

    PubMed

    Alvarez, Antonio; Gold, Garry E; Tobin, Brian; Desser, Terry S

    2006-04-01

    To promote active learning in an introductory Radiologic Anatomy course through the use of computer-based exercises. DICOM datasets from our hospital PACS system were transferred to a networked cluster of desktop computers in a medical school classroom. Medical students in the Radiologic Anatomy course were divided into four small groups and assigned to work on a clinical case for 45 minutes. The groups used iPACS viewer software, a free DICOM viewer, to view images and annotate anatomic structures. The classroom instructor monitored and displayed each group's work sequentially on the master screen by running SynchronEyes, a software tool for controlling PC desktops remotely. Students were able to execute the assigned tasks using the iPACS software with minimal oversight or instruction. Course instructors displayed each group's work on the main display screen of the classroom as the students presented the rationale for their decisions. The interactive component of the course received high ratings from the students and overall course ratings were higher than in prior years when the course was given solely in lecture format. DICOM viewing software is an excellent tool for enabling students to learn radiologic anatomy from real-life clinical datasets. Interactive exercises performed in groups can be powerful tools for stimulating students to learn radiologic anatomy.

  17. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

    PubMed Central

    Banerjee, Debjani; Bellesia, Giovanni; Daigle, Bernie J.; Douglas, Geoffrey; Gu, Mengyuan; Gupta, Anand; Hellander, Stefan; Horuk, Chris; Nath, Dibyendu; Takkar, Aviral; Lötstedt, Per; Petzold, Linda R.

    2016-01-01

    We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity. PMID:27930676

  18. Miniature Heat Pipes

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Small Business Innovation Research contracts from Goddard Space Flight Center to Thermacore Inc. have fostered the company's work on devices tagged "heat pipes" for space application. To control the extreme temperature ranges in space, heat pipes are important to spacecraft. The problem was to maintain an 8-watt central processing unit (CPU) at less than 90 C in a notebook computer using no power, with very little space available and without using forced convection. Thermacore's answer was in the design of a powder metal wick that transfers CPU heat from a tightly confined spot to an area near available air flow. The heat pipe technology permits a notebook computer to be operated in any position without loss of performance. Miniature heat pipe technology has been applied successfully in, for example, Pentium processor notebook computers. The company expects its heat pipes to accommodate desktop computers as well. Cellular phones, camcorders, and other hand-held electronics are possible applications for heat pipes.

  19. Stochastic Simulation Service: Bridging the Gap between the Computational Expert and the Biologist

    DOE PAGES

    Drawert, Brian; Hellander, Andreas; Bales, Ben; ...

    2016-12-08

    We present StochSS: Stochastic Simulation as a Service, an integrated development environment for modeling and simulation of both deterministic and discrete stochastic biochemical systems in up to three dimensions. An easy to use graphical user interface enables researchers to quickly develop and simulate a biological model on a desktop or laptop, which can then be expanded to incorporate increasing levels of complexity. StochSS features state-of-the-art simulation engines. As the demand for computational power increases, StochSS can seamlessly scale computing resources in the cloud. In addition, StochSS can be deployed as a multi-user software environment where collaborators share computational resources and exchange models via a public model repository. We also demonstrate the capabilities and ease of use of StochSS with an example of model development and simulation at increasing levels of complexity.

  20. An Integrated RFID and Barcode Tagged Item Inventory System for Deployment at New Brunswick Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younkin, James R; Kuhn, Michael J; Gradle, Colleen

    New Brunswick Laboratory (NBL) has an extensive inventory containing thousands of plutonium and uranium certified reference materials. The current manual inventory process is well established but lengthy, requiring significant oversight and double-checking to ensure correctness. Oak Ridge National Laboratory has worked with NBL to develop and deploy a new inventory system, termed the Tagged Item Inventory System (TIIS), which uses handheld computers with barcode scanners and radio frequency identification (RFID) readers. Certified reference materials are identified by labels that incorporate RFID tags and barcodes. The label printing and RFID tag association processes are integrated into the main desktop software application. Software on the handheld computers syncs with software on designated desktop machines and the NBL inventory database to provide a seamless inventory process. This process includes: 1) identifying items to be inventoried, 2) downloading the current inventory information to the handheld computer, 3) using the handheld to read item and location labels, and 4) syncing the handheld computer with a designated desktop machine to analyze the results, print reports, etc. The security of this inventory software has been a major concern. Designated roles linked to authenticated logins control access to the desktop software, while password protection and badge verification control access to the handheld computers. The overall system design and deployment at NBL will be presented. The performance of the system will also be discussed with respect to a small piece of the overall inventory. Future work includes performing a full inventory at NBL with the Tagged Item Inventory System and comparing performance, cost, and radiation exposures to the current manual inventory process.

  1. Virtual network computing: cross-platform remote display and collaboration software.

    PubMed

    Konerding, D E

    1999-04-01

    VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits it back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, each unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.
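
    The heart of the model described above — the server ships only the regions of the desktop image that changed since the last update — can be sketched as a dirty-tile diff (an illustrative sketch only; real VNC uses the RFB protocol with richer encodings and pixel formats):

```python
def dirty_tiles(old, new, tile=2):
    """Compare two framebuffers (2-D lists of pixel values) and return
    the (row, col) origins of fixed-size tiles that changed. A
    VNC-style server would encode and transmit only these regions to
    the client, rather than the whole desktop image."""
    dirty = []
    rows, cols = len(old), len(old[0])
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            changed = any(
                old[r + dr][c + dc] != new[r + dr][c + dc]
                for dr in range(tile) for dc in range(tile)
                if r + dr < rows and c + dc < cols)
            if changed:
                dirty.append((r, c))
    return dirty

old = [[0] * 4 for _ in range(4)]
new = [row[:] for row in old]
new[3][3] = 9                      # one pixel changes in the bottom-right
updates = dirty_tiles(old, new)
```

Sending only the tiles in `updates` is what keeps a shared desktop usable over a slow link; input events flow the other way as small messages.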

  2. Fast neural net simulation with a DSP processor array.

    PubMed

    Muller, U A; Gunzinger, A; Guggenbuhl, W

    1995-01-01

    This paper describes the implementation of a fast neural net simulator on a novel parallel distributed-memory computer. A 60-processor system, named MUSIC (multiprocessor system with intelligent communication), is operational and runs the backpropagation algorithm at a speed of 330 million connection updates per second (continuous weight update) using 32-b floating-point precision. This is equal to 1.4 Gflops sustained performance. The complete system, with 3.8 Gflops peak performance, consumes less than 800 W of electrical power and fits into a 19-in rack. While reaching the speed of modern supercomputers, MUSIC can still be used as a personal desktop computer at a researcher's own disposal. In neural net simulation, this gives a single user computing performance that was unthinkable before. The system's real-time interfaces make it especially useful for embedded applications.

  3. Modems and More: The Computer Branches Out.

    ERIC Educational Resources Information Center

    Dyrli, Odvard Egil

    1986-01-01

    Surveys new "peripherals," electronic devices that attach to computers. Devices such as videodisc players, desktop laser printers, large screen projectors, and input mechanisms that circumvent the keyboard dramatically expand the computer's instructional uses. (Author/LHW)

  4. Algorithms of GPU-enabled reactive force field (ReaxFF) molecular dynamics.

    PubMed

    Zheng, Mo; Li, Xiaoxia; Guo, Li

    2013-04-01

    Reactive force field (ReaxFF), a recent and novel bond order potential, allows reactive molecular dynamics (ReaxFF MD) simulations to model larger and more complex molecular systems involving chemical reactions than computation-intensive quantum mechanical methods can. However, ReaxFF MD can be approximately 10-50 times slower than classical MD due to its explicit modeling of bond forming and breaking, the dynamic charge equilibration at each time-step, and its time-step being one order of magnitude smaller than that of classical MD, all of which pose significant computational challenges to reaching spatio-temporal scales of nanometers and nanoseconds. The very recent advances in graphics processing units (GPUs) provide not only highly favorable performance for GPU-enabled MD programs compared with CPU implementations but also an opportunity to cope with the computing-power and memory demands that ReaxFF MD imposes on computer hardware. In this paper, we present the algorithms of GMD-Reax, the first GPU-enabled ReaxFF MD program, with significantly improved performance surpassing CPU implementations on desktop workstations. The performance of GMD-Reax has been benchmarked on a PC equipped with a NVIDIA C2050 GPU for coal pyrolysis simulation systems with atoms ranging from 1378 to 27,283. GMD-Reax achieved speedups as high as 12 times over Duin et al.'s FORTRAN codes in Lammps on 8 CPU cores and 6 times over Lammps' C codes based on PuReMD, in terms of simulation time per time-step averaged over 100 steps. GMD-Reax could be used as a new and efficient computational tool for exploring very complex molecular reactions via ReaxFF MD simulation on desktop workstations. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. dictyExpress: a Dictyostelium discoideum gene expression database with an explorative data analysis web-based interface.

    PubMed

    Rot, Gregor; Parikh, Anup; Curk, Tomaz; Kuspa, Adam; Shaulsky, Gad; Zupan, Blaz

    2009-08-25

    Bioinformatics often leverages recent advancements in computer science to support biologists in their scientific discovery process. Such efforts include the development of easy-to-use web interfaces to biomedical databases. Recent advancements in interactive web technologies require us to rethink the standard submit-and-wait paradigm and craft bioinformatics web applications that share the analytical and interactive power of their desktop relatives while retaining simplicity and availability. We have developed dictyExpress, a web application that features a graphical, highly interactive explorative interface to our database, which consists of more than 1000 Dictyostelium discoideum gene expression experiments. In dictyExpress, the user can select experiments and genes, perform gene clustering, view gene expression profiles across time, view gene co-expression networks, perform analyses of Gene Ontology term enrichment, and simultaneously display expression profiles for a selected gene in various experiments. Most importantly, these tasks are achieved through web application components that are seamlessly interlinked and respond immediately to events triggered by the user, thus providing a powerful explorative data analysis environment. dictyExpress is a precursor of a new generation of web-based bioinformatics applications with simple but powerful interactive interfaces that resemble those of the modern desktop. While dictyExpress serves mainly the Dictyostelium research community, it is relatively easy to adapt to other datasets. We propose that the design ideas behind dictyExpress will influence the development of similar applications for other model organisms.

  6. dictyExpress: a Dictyostelium discoideum gene expression database with an explorative data analysis web-based interface

    PubMed Central

    Rot, Gregor; Parikh, Anup; Curk, Tomaz; Kuspa, Adam; Shaulsky, Gad; Zupan, Blaz

    2009-01-01

    Background Bioinformatics often leverages recent advancements in computer science to support biologists in their scientific discovery process. Such efforts include the development of easy-to-use web interfaces to biomedical databases. Recent advancements in interactive web technologies require us to rethink the standard submit-and-wait paradigm and craft bioinformatics web applications that share the analytical and interactive power of their desktop relatives while retaining simplicity and availability. Results We have developed dictyExpress, a web application that features a graphical, highly interactive explorative interface to our database, which consists of more than 1000 Dictyostelium discoideum gene expression experiments. In dictyExpress, the user can select experiments and genes, perform gene clustering, view gene expression profiles across time, view gene co-expression networks, perform analyses of Gene Ontology term enrichment, and simultaneously display expression profiles for a selected gene in various experiments. Most importantly, these tasks are achieved through web application components that are seamlessly interlinked and respond immediately to events triggered by the user, thus providing a powerful explorative data analysis environment. Conclusion dictyExpress is a precursor of a new generation of web-based bioinformatics applications with simple but powerful interactive interfaces that resemble those of the modern desktop. While dictyExpress serves mainly the Dictyostelium research community, it is relatively easy to adapt to other datasets. We propose that the design ideas behind dictyExpress will influence the development of similar applications for other model organisms. PMID:19706156

  7. 76 FR 58138 - Defense Federal Acquisition Regulation Supplement (DFARS); Alternative Line Item Structure (DFARS...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    ... DoD published a proposed rule in the Federal Register at 76 FR 21847 on April 19, 2011, to add DFARS..., the contract line item may be for a desktop computer, but the actual items delivered, invoiced, and..., Desktop with 20 EA CPU, Monitor, Keyboard and Mouse. Alternative line-item structure offer where monitors...

  8. Digital Dome versus Desktop Display: Learning Outcome Assessments by Domain Experts

    ERIC Educational Resources Information Center

    Jacobson, Jeffery

    2013-01-01

    In previous publications, the author reported that students learned about Egyptian architecture and society by playing an educational game based on a virtual representation of a temple. Students played the game in a digital dome or on a standard desktop computer, and (each) then recorded a video tour of the temple. Those who had used the dome…

  9. A compact ECG R-R interval, respiration and activity recording system.

    PubMed

    Yoshimura, Takahiro; Yonezawa, Yoshiharu; Maki, Hiromichi; Ogawa, Hidekuni; Hahn, Allen W; Thayer, Julian F; Caldwell, W Morton

    2003-01-01

    An ECG R-R interval, respiration and activity recording system has been developed for monitoring variability of heart rate and respiratory frequency during daily life. The recording system employs a variable-gain instrumentation amplifier, an accelerometer, a low-power 8-bit single-chip microcomputer and a 1024 KB EEPROM. It is constructed on three ECG chest electrodes. The R-R interval and respiration are detected from the ECG. Activity during walking and running is calculated from the accelerometer. The detected data are stored in the EEPROM and, after recording, are downloaded to a desktop computer for analysis.
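
    The downstream analysis of such a recording is straightforward once the R-R intervals reach the desktop. A minimal sketch of deriving heart rate and a common heart-rate-variability index from downloaded intervals; the interval values here are hypothetical, not data from the paper:

```python
# Sketch: post-processing downloaded R-R intervals (ms) into heart rate and a
# simple heart-rate-variability index. The interval values are hypothetical.
import statistics

def heart_rate_bpm(rr_ms):
    """Instantaneous heart rate (beats/min) for each R-R interval."""
    return [60000.0 / rr for rr in rr_ms]

def sdnn(rr_ms):
    """SDNN: sample standard deviation of R-R intervals, a common HRV measure."""
    return statistics.stdev(rr_ms)

rr = [800, 810, 790, 820, 805]  # ms, hypothetical recording
print(heart_rate_bpm(rr)[0])    # 75.0 bpm for an 800 ms interval
print(round(sdnn(rr), 2))
```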

  10. Energy Use and Power Levels in New Monitors and Personal Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberson, Judy A.; Homan, Gregory K.; Mahajan, Akshay

    2002-07-23

    Our research was conducted in support of the EPA ENERGY STAR Office Equipment program, whose goal is to reduce the amount of electricity consumed by office equipment in the U.S. The most energy-efficient models in each office equipment category are eligible for the ENERGY STAR label, which consumers can use to identify and select efficient products. As the efficiency of each category improves over time, the ENERGY STAR criteria need to be revised accordingly. The purpose of this study was to provide reliable data on the energy consumption of the newest personal computers and monitors that the EPA can use to evaluate revisions to current ENERGY STAR criteria as well as to improve the accuracy of ENERGY STAR program savings estimates. We report the results of measuring the power consumption and power management capabilities of a sample of new monitors and computers. These results will be used to improve estimates of program energy savings and carbon emission reductions, and to inform revisions of the ENERGY STAR criteria for these products. Our sample consists of 35 monitors and 26 computers manufactured between July 2000 and October 2001; it includes cathode ray tube (CRT) and liquid crystal display (LCD) monitors, Macintosh and Intel-architecture computers, desktop and laptop computers, and integrated computer systems, in which power consumption of the computer and monitor cannot be measured separately. For each machine we measured power consumption when off, on, and in each low-power level. We identify trends in, and opportunities to reduce, power consumption in new personal computers and monitors. Our results include a trend among monitor manufacturers to provide a single very low low-power level, well below the current ENERGY STAR criteria for sleep power consumption. These very low sleep power results mean that energy consumed when monitors are off or in active use has become more important in its contribution to the overall unit energy consumption (UEC). Current ENERGY STAR monitor and computer criteria do not specify off or on power, but our results suggest opportunities for saving energy in these modes. Also, significant differences between CRT and LCD technology, and between field-measured and manufacturer-reported power levels, reveal the need for standard methods and metrics for measuring and comparing monitor power consumption.
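
    The unit energy consumption (UEC) figure discussed above combines per-mode power draw with time spent in each mode. A minimal sketch of that standard bookkeeping; the wattages and hours below are hypothetical placeholders, not the paper's measurements:

```python
# Sketch of unit-energy-consumption (UEC) bookkeeping for a monitor. The
# per-mode powers and hours below are hypothetical, not the paper's data.
def annual_energy_kwh(mode_watts, mode_hours_per_day):
    """Annual unit energy consumption (kWh/yr) from per-mode power draw and
    daily usage; the modes must account for all 24 hours of the day."""
    assert abs(sum(mode_hours_per_day.values()) - 24.0) < 1e-9
    wh_per_day = sum(mode_watts[m] * mode_hours_per_day[m] for m in mode_watts)
    return wh_per_day * 365.0 / 1000.0

watts = {"on": 40.0, "sleep": 2.0, "off": 1.0}    # hypothetical monitor
hours = {"on": 4.0, "sleep": 8.0, "off": 12.0}
print(round(annual_energy_kwh(watts, hours), 2))  # 68.62 kWh/yr
```

    With sleep power this low, the on and off terms dominate the sum, which is the point the authors make about where the remaining savings lie.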

  11. 77 FR 26660 - Guidelines for the Transfer of Excess Computers or Other Technical Equipment Pursuant to Section...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ....usda.gov . SUPPLEMENTARY INFORMATION: A. Background A proposed rule was published in the Federal.... Computers or other technical equipment means central processing units, laptops, desktops, computer mouses...

  12. An Inverse Modeling Plugin for HydroDesktop using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio, C.; Over, M. W.; Rubin, Y.

    2011-12-01

    The CUAHSI Hydrologic Information System (HIS) software stack is based on an open and extensible architecture that facilitates the addition of new functions and capabilities at both the server side (using HydroServer) and the client side (using HydroDesktop). The HydroDesktop client plugin architecture is used here to expose a new scripting-based plugin that uses the R statistics software to conduct inverse modeling with the Method of Anchored Distributions (MAD). MAD is a Bayesian inversion technique that conditions computational model parameters on relevant field observations by assimilating multi-type and multi-scale data, yielding probabilistic distributions of the model parameters related to the spatial random variable of interest. The implementation of a desktop software tool for the MAD technique is expected to significantly lower the barrier to inverse modeling in education, research, and resource management. The HydroDesktop MAD plugin is being developed following a community-based, open-source approach that will help both its adoption and its long-term sustainability as a user tool. This presentation will briefly introduce MAD, HydroDesktop, and the MAD plugin and software development effort.
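
    The core idea of Bayesian conditioning, as described above, can be illustrated with a toy importance-sampling example. This is NOT the MAD algorithm or the plugin's R implementation, just a sketch of the principle: prior samples of a model parameter are re-weighted by the likelihood of a (hypothetical) field observation, giving a posterior distribution:

```python
# Toy Bayesian conditioning via importance sampling (illustrative only; not
# the MAD algorithm itself). All numbers below are hypothetical.
import math
import random

random.seed(0)

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

prior = [random.gauss(0.0, 1.0) for _ in range(20000)]  # prior parameter samples
observed, noise = 1.2, 0.5                              # hypothetical field datum

weights = [gaussian_pdf(observed, theta, noise) for theta in prior]
post_mean = sum(w * t for w, t in zip(weights, prior)) / sum(weights)
print(round(post_mean, 2))  # pulled from the prior mean (0.0) toward the data
```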

  13. Effect of keyswitch design of desktop and notebook keyboards related to key stiffness and typing force.

    PubMed

    Bufton, Marcia J; Marklin, Richard W; Nagurka, Mark L; Simoneau, Guy G

    2006-08-15

    This study compared rubber-dome desktop, spring-column desktop and notebook keyboards in terms of key stiffness and fingertip typing force. The spring-column keyboard produced the highest mean peak contact force (0.86 N), followed by the rubber-dome desktop (0.68 N) and the notebook (0.59 N); all of these differences were statistically significant. Likewise, the spring-column keyboard registered the highest fingertip typing force and the notebook keyboard the lowest. A comparison of forces showed that the notebook (rubber-dome) keyboard had the highest fingertip-to-peak contact force ratio (overstrike force), while the spring-column keyboard generated the least excess force (as a ratio of peak contact force). The results of this study could aid in optimizing computer key designs to reduce user discomfort and fatigue.
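
    The overstrike ratio compared in the study is a simple quotient. In the sketch below, only the reported mean peak contact forces (0.86, 0.68, 0.59 N) come from the abstract; the fingertip force is a hypothetical placeholder:

```python
# Sketch of the fingertip-to-peak-contact force ratio (overstrike force).
# Peak contact forces are from the abstract; the fingertip force is made up.
peak_contact_n = {"spring_column": 0.86, "rubber_dome_desktop": 0.68, "notebook": 0.59}

def overstrike_ratio(fingertip_force_n, peak_contact_force_n):
    """Fingertip-to-peak-contact force ratio; larger values mean more excess force."""
    return fingertip_force_n / peak_contact_force_n

# Hypothetical 1.5 N fingertip force applied on the notebook keyboard
print(round(overstrike_ratio(1.5, peak_contact_n["notebook"]), 2))
```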

  14. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    PubMed

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  15. 76 FR 70861 - Promoting Efficient Spending

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-15

    ... heads to take even more aggressive steps to ensure the Government is a good steward of taxpayer money...., mobile phones, smartphones, desktop and laptop computers, and tablet personal computers) issued to...

  16. What Physicists Should Know About High Performance Computing - Circa 2002

    NASA Astrophysics Data System (ADS)

    Frederick, Donald

    2002-08-01

    High Performance Computing (HPC) is a dynamic, cross-disciplinary field that traditionally has involved applied mathematicians, computer scientists, and others primarily from the disciplines that have been major users of HPC resources - physics, chemistry, engineering, with increasing use by those in the life sciences. There is a technological dynamic that is powered by economic as well as by technical innovations and developments. This talk will discuss practical ideas to be considered when developing numerical applications for research purposes. Even with the rapid pace of development in the field, the author believes that these concepts will not become obsolete for a while, and will be of use to scientists who either are considering, or who have already started down, the HPC path. These principles will be applied in particular to current parallel HPC systems, but there will also be references of value to desktop users. The talk will cover such topics as: computing hardware basics, single-CPU optimization, compilers, timing, numerical libraries, debugging and profiling tools, and the emergence of Computational Grids.

  17. The Printout: Computers and Reading in the United Kingdom.

    ERIC Educational Resources Information Center

    Ewing, James M.

    1988-01-01

    Offers an overview of some reading and language arts computer projects in the United Kingdom, including language teaching and intelligent knowledge-based systems, assessment of written style by computer, and desktop publishing in the primary school. (ARH)

  18. Self-reported wrist and finger symptoms associated with other physical/mental symptoms and use of computers/mobile phones.

    PubMed

    Korpinen, Leena; Pääkkönen, Rauno; Gobba, Fabriziomaria

    2018-03-01

    Recently, computer, mobile phone and Internet use has increased. This study aimed to determine the possible relation between self-reported wrist and finger symptoms (aches, pain or numbness) and the use of computers and mobile phones, and to analyze how the symptoms are associated specifically with utilizing desktop computers, portable computers or mini-computers and mobile phones. A questionnaire was sent to 15,000 working-age Finns (age 18-65); 723 respondents reported wrist and finger symptoms often or more frequently when using these devices. Over 80% of them used mobile phones daily, and less than 30% used desktop computers or the Internet daily at leisure. Over 89.8% quite often or often experienced pain, numbness or aches in the neck, and 61.3% had aches in the hips and lower back. Only 33.7% connected their symptoms to computer use. In the future, the development of new devices and Internet services should incorporate the ergonomics of the hands and wrists.

  19. Desktop supercomputer: what can it do?

    NASA Astrophysics Data System (ADS)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

    The paper addresses the issues of solving complex problems that require using supercomputers or multiprocessor clusters available to most researchers nowadays. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely adopted. At the same time, comfortable and transparent access to these resources was a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities to create the virtual supercomputer based on lightweight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.

  20. Tabletop computed lighting for practical digital photography.

    PubMed

    Mohan, Ankit; Bailey, Reynold; Waite, Jonathan; Tumblin, Jack; Grimm, Cindy; Bodenheimer, Bobby

    2007-01-01

    We apply simplified image-based lighting methods to reduce the equipment, cost, time, and specialized skills required for high-quality photographic lighting of desktop-sized static objects such as museum artifacts. We place the object and a computer-steered moving-head spotlight inside a simple foam-core enclosure and use a camera to record photos as the light scans the box interior. Optimization, guided by interactive user sketching, selects a small set of these photos whose weighted sum best matches the user-defined target sketch. Unlike previous image-based relighting efforts, our method requires only a single area light source, yet it can achieve high-resolution light positioning to avoid multiple sharp shadows. A reduced version uses only a handheld light and may be suitable for battery-powered field photography equipment that fits into a backpack.

  1. Effects of Dual Monitor Computer Work Versus Laptop Work on Cervical Muscular and Proprioceptive Characteristics of Males and Females.

    PubMed

    Farias Zuniga, Amanda M; Côté, Julie N

    2017-06-01

    The effects of performing a 90-minute computer task with a laptop versus a dual monitor desktop workstation were investigated in healthy young male and female adults. Work-related musculoskeletal disorders are common among computer (especially female) users. Laptops have surpassed desktop computers in sales, and working with multiple monitors has also become popular. However, few studies have provided objective evidence on how they affect the musculoskeletal system in both genders. Twenty-seven healthy participants (mean age = 24.6 years; 13 males) completed a 90-minute computer task while using a laptop or dual monitor (DualMon) desktop. Electromyography (EMG) from eight upper body muscles and visual strain were measured throughout the task. Neck proprioception was tested before and after the computer task using a head-repositioning test. EMG amplitude (root mean square [RMS]), variability (coefficients of variation [CV]), and normalized mutual information (NMI) were computed. Visual strain (p < .01) and right upper trapezius RMS (p = .03) increased significantly over time regardless of workstation. Right cervical erector spinae RMS and cervical NMI were smaller, while degrees of overshoot (mean = 4.15°) and end position error (mean = 1.26°) were larger in DualMon regardless of time. Effects on muscle activity were more pronounced in males, whereas effects on proprioception were more pronounced in females. Results suggest that, compared to laptop work, DualMon work is effective in reducing cervical muscle activity, dissociating cervical connectivity, and maintaining more typical neck repositioning patterns, suggesting some health-protective effects. This evidence could be considered when deciding on computer workstation designs.
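
    The EMG summary statistics named above (RMS and CV) are standard computations. A minimal sketch over a synthetic sample window, which is illustrative only and not data from the study:

```python
# Sketch of the EMG summary statistics used in such studies: root mean square
# (RMS) amplitude and coefficient of variation (CV). The window is synthetic.
import math
import statistics

def rms(signal):
    """Root mean square amplitude of an EMG window."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def cv_percent(signal):
    """Coefficient of variation: SD as a percentage of the mean."""
    return 100.0 * statistics.stdev(signal) / statistics.mean(signal)

emg = [0.12, 0.15, 0.11, 0.18, 0.14]  # mV, synthetic rectified EMG samples
print(round(rms(emg), 3))
print(round(cv_percent(emg), 1))
```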

  2. A virtual computer lab for distance biomedical technology education.

    PubMed

    Locatis, Craig; Vega, Anibal; Bhagwat, Medha; Liu, Wei-Li; Conde, Jose

    2008-03-13

    The National Library of Medicine's National Center for Biotechnology Information offers mini-courses which entail applying concepts in biochemistry and genetics to search genomics databases and other information sources. They are highly interactive and involve use of 3D molecular visualization software that can be computationally taxing. Methods were devised to offer the courses at a distance so as to provide as much functionality of a computer lab as possible, the venue where they are normally taught. The methods, which can be employed with varied videoconferencing technology and desktop sharing software, were used to deliver mini-courses at a distance in pilot applications where students could see demonstrations by the instructor and the instructor could observe and interact with students working at their remote desktops. Student ratings of the learning experience and comments to open ended questions were similar to those when the courses are offered face to face. The real time interaction and the instructor's ability to access student desktops from a distance in order to provide individual assistance and feedback were considered invaluable. The technologies and methods mimic much of the functionality of computer labs and may be usefully applied in any context where content changes frequently, training needs to be offered on complex computer applications at a distance in real time, and where it is necessary for the instructor to monitor students as they work.

  3. Operation of the HP2250 with the HP9000 series 200 using PASCAL 3.0

    NASA Technical Reports Server (NTRS)

    Perry, John; Stroud, C. W.

    1986-01-01

    A computer program has been written to provide an interface between the HP Series 200 desktop computers, operating under HP Standard Pascal 3.0, and the HP2250 Data Acquisition and Control System. Pascal 3.0 for the HP9000 desktop computer provides a number of procedures for handling bus communication at various levels. It is necessary, however, to reach the lowest possible level in Pascal to handle the bus protocols required by the HP2250. This makes programming extremely complex, since these protocols are not documented. The program described solves those problems and allows the user to immediately program, simply and efficiently, any measurement and control language (MCL/50) application with a few procedure calls. The complete set of procedures is available on a 5 1/4 inch diskette from COSMIC. Included in this group of procedures is an Exerciser, which allows the user to exercise the HP2250 interactively. The Exerciser operates in a fashion similar to the Series 200 operating system programs, but is adapted to the requirements of the HP2250. The programs on the diskette and the user's manual assume the user is acquainted with both the MCL/50 programming language and HP Standard Pascal 3.0 for the HP Series 200 desktop computers.

  4. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    NASA Astrophysics Data System (ADS)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same holds when utilizing a cluster of Intel's many-integrated-core (MIC) processors, or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy-efficiency requirements of general computation, a Field Programmable Gate Array (FPGA) may be a better solution when its computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  5. Comfort with Computers in the Library.

    ERIC Educational Resources Information Center

    Agati, Joseph

    2002-01-01

    Sets forth a list of dos and don'ts for integrating aesthetics, functionality, and technology into college library computer workstation furniture. The article discusses workstation access both for portable computer users and for staff, whose needs involve desktop computers that are possibly networked with printers and other peripherals. (GR)

  6. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  7. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  8. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  9. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  10. 10 CFR 727.2 - What are the definitions of the terms used in this part?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...

  11. Temporal and spatial organization of doctors' computer usage in a UK hospital department.

    PubMed

    Martins, H M G; Nightingale, P; Jones, M R

    2005-06-01

    This paper describes the use of an application accessible via distributed desktop computing and wireless mobile devices in a specialist department of a UK acute hospital. Data (application logs, in-depth interviews, and ethnographic observation) were simultaneously collected to study doctors' work via this application, when and where they accessed different areas of it, and from what computing devices. These show that the application is widely used, but in significantly different ways over time and space. For example, physicians and surgeons differ in how they use the application and in their choice of mobile or desktop computing. Consultants and junior doctors in the same teams also seem to access different sources of patient information, at different times, and from different locations. Mobile technology was used almost exclusively during the morning by groups of clinicians, predominantly for ward rounds.

  12. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch-free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit, which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these was the substantial power savings, which resulted in strong, broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, and virtualized desktops for typical office users as well as the more demanding, graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  13. Align and conquer: moving toward plug-and-play color imaging

    NASA Astrophysics Data System (ADS)

    Lee, Ho J.

    1996-03-01

    The rapid evolution of the low-cost color printing and image capture markets has precipitated a huge increase in the use of color imagery by casual end users on desktop systems, as opposed to traditional professional color users working with specialized equipment. While the cost of color equipment and software has decreased dramatically, the underlying system-level problems associated with color reproduction have remained the same, and in many cases are more difficult to address in a casual environment than in a professional setting. The proliferation of color imaging technologies so far has resulted in a wide availability of component solutions which work together poorly. A similar situation in the desktop computing market has led to the various `Plug-and-Play' standards, which provide a degree of interoperability between a range of products on disparate computing platforms. This presentation will discuss some of the underlying issues and emerging trends in the desktop and consumer digital color imaging markets.

  14. Quality user support: Supporting quality users

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woolley, T.C.

    1994-12-31

    During the past decade, fundamental changes have occurred in technical computing in the oil industry. Technical computing systems have moved from local, fragmented quantity to global, integrated quality. The compute power available to the average geoscientist at the desktop has grown exponentially. Technical computing applications have increased in integration and complexity. At the same time, there has been a significant change in the work force due to the pressures of restructuring and the increased focus on international opportunities. The profile of the user of technical computing resources has changed. Users are generally more mature, knowledgeable, and team-oriented than their predecessors. In the 1990s, computer literacy is a requirement. This paper describes the steps taken by Oryx Energy Company to address the problems and opportunities created by the explosive growth in computing power and needs, coupled with the contraction of the business. A successful user support strategy will be described. Characteristics of the program include: (1) client-driven support; (2) empowerment of highly skilled professionals to fill the support role; (3) routine and ongoing modification of the support plan; (4) utilization of the support assignment to create highly trained advocates on the line; (5) integration of the support role into the reservoir management team. Results of the plan include a highly trained work force, stakeholder teams that include support personnel, and global support from a centralized support organization.

  15. Assessment of drug information resource preferences of pharmacy students and faculty

    PubMed Central

    Hanrahan, Conor T.; Cole, Sabrina W.

    2014-01-01

    A 39-item survey instrument was distributed to faculty and students at Wingate University School of Pharmacy to assess student and faculty drug information (DI) resource use and access preferences. The response rate was 81% (n = 289). Faculty and professional year 2 to 4 students preferred access on laptop or desktop computers (67% and 75%, respectively), followed by smartphones (27% and 22%, respectively). Most faculty and students preferred using Lexicomp Online for drug information (53% and 74%, respectively). Results indicate that DI resource use is similar between students and faculty; laptop or desktop computers are the preferred platforms for accessing drug information. PMID:24860270

  16. Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.

    2015-12-01

    Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to, or the know-how to take advantage of, advanced high performance computing (HPC) capability? Web-based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and the web-based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data, including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models, and facilitates the management of online data and the execution of models on HPC systems. This presentation will illustrate the use of web-based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions executed entirely in the cloud, on HPC resources, using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples of how this approach can be extended to other models to enhance the use of web and data services in the geosciences.
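
The pattern described above — a desktop script submitting analysis jobs to cloud services — can be sketched roughly as follows. The endpoint, resource identifier, tool name, and payload fields below are all invented for illustration; they are not the real HydroShare or TauDEM web-service API.

```python
# Hypothetical sketch of a desktop client assembling a cloud job request.
# The URL, resource id, and parameter names are invented, not a real API.
import json
import urllib.request

def build_delineation_request(base_url, dem_resource_id, outlet_xy):
    """Assemble a (hypothetical) watershed-delineation job request."""
    payload = {
        "dem": dem_resource_id,                     # cloud-stored DEM resource
        "outlet": {"x": outlet_xy[0], "y": outlet_xy[1]},
        "tool": "taudem.watershed_delineation",     # assumed tool name
    }
    return urllib.request.Request(
        url=f"{base_url}/jobs",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_delineation_request("https://example.org/api", "dem-123",
                                (435250.0, 4640420.0))
print(req.get_method(), req.full_url)
```

The point of the pattern is that only a small JSON description travels from the desktop; the large DEM stays in the cloud next to the compute.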

  17. Large-scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU).

    PubMed

    Shi, Yulin; Veidenbaum, Alexander V; Nicolau, Alex; Xu, Xiangmin

    2015-01-15

    Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post hoc processing and analysis. Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22× speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. Copyright © 2014 Elsevier B.V. All rights reserved.
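
The study's GPU-CPU co-processing was implemented with MATLAB's GPU-enabled functions; as a rough stand-in, the same offload-with-fallback idea can be sketched in Python with CuPy mirroring NumPy. The detrending step and the data here are our own illustration, not the authors' pipeline.

```python
# Illustrative GPU/CPU co-processing sketch (NumPy/CuPy stand-in for the
# paper's MATLAB gpuArray approach); detrending is an invented example of a
# per-trace, embarrassingly parallel post-processing step.
import numpy as np

try:
    import cupy as xp          # GPU path when a CUDA device is present
    BACKEND = "gpu"
except ImportError:            # fall back to the CPU so the script still runs
    xp = np
    BACKEND = "cpu"

def detrend_traces(traces):
    """Remove each recorded trace's DC offset, one row at a time."""
    data = xp.asarray(traces, dtype=xp.float64)
    return data - data.mean(axis=1, keepdims=True)

traces = np.random.default_rng(0).normal(loc=3.0, size=(512, 1000))
clean = detrend_traces(traces)
print(BACKEND, clean.shape)
```

Because every row is independent, the same code maps onto thousands of GPU cores without restructuring — the property the abstract's 22× speedup relies on.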

  18. Large scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU)

    PubMed Central

    Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin

    2014-01-01

    Background Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New Method Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22x speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with Existing Method(s) To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633

  19. Microtomographic imaging in the process of bone modeling and simulation

    NASA Astrophysics Data System (ADS)

    Mueller, Ralph

    1999-09-01

    Micro-computed tomography (μCT) is an emerging technique to nondestructively image and quantify trabecular bone in three dimensions. Where early implementations of μCT focused more on technical aspects of the systems and required equipment not normally available to the general public, a more recent development emphasized practical aspects of microtomographic imaging. That system is based on a compact fan-beam type of tomograph, also referred to as desktop μCT. Desktop μCT has been used extensively for the investigation of osteoporosis-related health problems, gaining new insight into the organization of trabecular bone and the influence of osteoporotic bone loss on bone architecture and the competence of bone. Osteoporosis is a condition characterized by excessive bone loss and deterioration in bone architecture. The reduced quality of bone increases the risk of fracture. Current imaging technologies do not allow accurate in vivo measurements of bone structure over several decades or the investigation of the local remodeling stimuli at the tissue level. Therefore, computer simulations and new experimental modeling procedures are necessary for determining the long-term effects of age, menopause, and osteoporosis on bone. Microstructural bone models allow us not only to study the effects of osteoporosis on the skeleton but also to assess and monitor the effectiveness of new treatment regimens. The basis for such approaches is realistic models of bone and a sound understanding of the underlying biological and mechanical processes in bone physiology. In this article, strategies for new approaches to bone modeling and simulation in the study and treatment of osteoporosis and age-related bone loss are presented. The focus is on the bioengineering and imaging aspects of osteoporosis research. With the introduction of desktop μCT, a new generation of imaging instruments has entered the arena, allowing easy and relatively inexpensive access to the three-dimensional microstructure of bone and thereby giving bone researchers a powerful tool for the exploration of age-related bone loss and osteoporosis.

  20. The Quake Catcher Network: Cyberinfrastructure Bringing Seismology into Schools and Homes

    NASA Astrophysics Data System (ADS)

    Lawrence, J. F.; Cochran, E. S.

    2007-12-01

    We propose to implement a high density, low cost strong-motion network for rapid response and early warning by placing sensors in schools, homes, and offices. The Quake Catcher Network (QCN) will employ existing networked laptops and desktops to form the world's largest high-density, distributed computing seismic network. Costs for this network will be minimal because the QCN will use 1) strong motion sensors (accelerometers) already internal to many laptops and 2) nearly identical low-cost universal serial bus (USB) accelerometers for use with desktops. The Berkeley Open Infrastructure for Network Computing (BOINC!) provides a free, proven paradigm for involving the public in large-scale computational research projects. As evidenced by the SETI@home program and others, individuals are especially willing to donate their unused computing power to projects that they deem relevant, worthwhile, and educational. The client- and server-side software will rapidly monitor incoming seismic signals, detect the magnitudes and locations of significant earthquakes, and may even provide early warnings to other computers and users before they can feel the earthquake. The software will provide the client-user with a screen-saver displaying seismic data recorded on their laptop, recently detected earthquakes, and general information about earthquakes and the geosciences. Furthermore, this project will install USB sensors in K-12 classrooms as an educational tool for teaching science. Through a variety of interactive experiments students will learn about earthquakes and the hazards earthquakes pose. For example, students can learn how the vibrations of an earthquake decrease with distance by jumping up and down at increasing distances from the sensor and plotting the decreased amplitude of the seismic signal measured on their computer. 
We hope to include an audio component so that students can hear and better understand the difference between low and high frequency seismic signals. The QCN will provide a natural way to engage students and the public in earthquake detection and research.
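
The client-side detection step described above can be illustrated with a classic short-term/long-term average (STA/LTA) trigger, a standard seismic event detector. The abstract does not specify which algorithm QCN's software uses, so treat this as a plausible sketch on synthetic data rather than the project's actual detector.

```python
# Illustrative STA/LTA event trigger on a synthetic accelerometer trace.
# The algorithm choice, window lengths, and threshold are our assumptions.
import numpy as np

def sta_lta(signal, sta=10, lta=100):
    """Trailing short-term / long-term mean absolute amplitude ratio."""
    x = np.abs(signal)
    c = np.concatenate([[0.0], np.cumsum(x)])   # prefix sums for fast windows
    ratio = np.zeros(len(x))
    for i in range(lta, len(x) + 1):
        short = (c[i] - c[i - sta]) / sta
        long_ = (c[i] - c[i - lta]) / lta
        ratio[i - 1] = short / max(long_, 1e-12)
    return ratio

rng = np.random.default_rng(1)
trace = rng.normal(0.0, 0.1, 2000)                              # background noise
trace[1200:1260] += 2.0 * np.sin(np.linspace(0.0, 30.0, 60))    # simulated event
ratio = sta_lta(trace)
print("event detected:", bool(ratio.max() > 4.0))
```

A sudden shake inflates the short-term average long before the long-term average catches up, so the ratio spikes at the event onset — the property that makes distributed sensors useful for early warning.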

  1. Where the Cloud Meets the Commons

    ERIC Educational Resources Information Center

    Ipri, Tom

    2011-01-01

    Changes presented by cloud computing--shared computing services, applications, and storage available to end users via the Internet--have the potential to seriously alter how libraries provide services, not only remotely, but also within the physical library, specifically concerning challenges facing the typical desktop computing experience.…

  2. 76 FR 43278 - Privacy Act; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-20

    ... computer (PC). The Security Management Officer's office remains locked when not in use. RETENTION AND... records to include names, addresses, social security numbers, service computation dates, leave usage data... that resides on a desktop computer. RETRIEVABILITY: Records maintained in file folders are indexed and...

  3. A highly efficient multi-core algorithm for clustering extremely large datasets

    PubMed Central

    2010-01-01

    Background In recent years, the demand for computational power in computational biology has increased due to rapidly growing data sets from microarray and other high-throughput technologies. This demand is likely to increase. Standard algorithms for analyzing data, such as cluster algorithms, need to be parallelized for fast processing. Unfortunately, most approaches for parallelizing algorithms largely rely on network communication protocols connecting and requiring multiple computers. One answer to this problem is to utilize the intrinsic capabilities of current multi-core hardware to distribute tasks among the different cores of one computer. Results We introduce a multi-core parallelization of the k-means and k-modes cluster algorithms, based on the design principles of transactional memory, for clustering gene expression microarray type data and categorical SNP data. Our new shared-memory parallel algorithms prove to be highly efficient. We demonstrate their computational power and show their utility in cluster stability and sensitivity analysis, employing repeated runs with slightly changed parameters. Computation speed of our Java-based algorithm was increased by a factor of 10 for large data sets, while preserving computational accuracy, compared to single-core implementations and a recently published network-based parallelization. Conclusions Most desktop computers and even notebooks provide at least dual-core processors. Our multi-core algorithms show that, using modern algorithmic concepts, parallelization makes it possible to perform even such laborious tasks as cluster sensitivity and cluster number estimation on the laboratory computer. PMID:20370922
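
The authors' implementation is in Java and built on transactional-memory design principles; purely as an illustration of the shared-memory idea, the expensive assignment step of k-means can be fanned out across a thread pool as below. This simplified sketch is our own, not the paper's code.

```python
# Shared-memory parallel k-means sketch: the distance/assignment step is
# split into chunks handled by worker threads (NumPy releases the GIL in
# its inner loops, so threads genuinely overlap here).
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def assign_chunk(points, centers):
    """Label each point in a chunk with its nearest centre."""
    d = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

def parallel_kmeans(points, k, iters=10, workers=4, seed=0):
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        chunks = np.array_split(points, workers)        # independent work units
        with ThreadPoolExecutor(max_workers=workers) as pool:
            labels = np.concatenate(
                list(pool.map(assign_chunk, chunks, [centers] * workers)))
        # update step; keep the old centre if a cluster emptied out
        centers = np.array([points[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

rng = np.random.default_rng(42)
blob_a = rng.normal(0.0, 0.1, size=(50, 2))
blob_b = rng.normal(10.0, 0.1, size=(50, 2))
labels, centers = parallel_kmeans(np.vstack([blob_a, blob_b]), k=2)
```

Repeating the run with perturbed parameters, as the abstract describes for stability analysis, is then just a loop over `parallel_kmeans` calls.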

  4. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilberto Z.

    Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating data sets large enough that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE science data through a new concept of reusable "templates" that enable scientists to easily compose, run, and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.

  5. Addressing Small Computers in the First OS Course

    ERIC Educational Resources Information Center

    Nutt, Gary

    2006-01-01

    Small computers are emerging as important components of the contemporary computing scene. Their operating systems vary from specialized software for an embedded system to the same style of OS used on a generic desktop or server computer. This article describes a course in which systems are classified by their hardware capability and the…

  6. 75 FR 32915 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-10

    ... used to authenticate authorized desktop and laptop computer users. Computer servers are scanned monthly... data is also used for management and statistical reports and studies. Routine uses of records... duties. The computer files are password protected with access restricted to authorized users. Records are...

  7. 12 CFR 1235.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., or stored by electronic means. E-mail means a document created or received on a computer network for... conduct of the business of a regulated entity or the Office of Finance (which business, in the case of the... is stored or located, including network servers, desktop or laptop computers and handheld computers...

  8. 12 CFR 1235.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., or stored by electronic means. E-mail means a document created or received on a computer network for... conduct of the business of a regulated entity or the Office of Finance (which business, in the case of the... is stored or located, including network servers, desktop or laptop computers and handheld computers...

  9. The Natural Link between Teaching History and Computer Skills.

    ERIC Educational Resources Information Center

    Farnworth, George M.

    1992-01-01

    Suggests that, because both history and computers are information based, there is a natural link between the two. Argues that history teachers should exploit the technology to help students understand history while they become computer literate. Points out uses for databases, word processing, desktop publishing, and telecommunications in…

  10. The desktop interface in intelligent tutoring systems

    NASA Technical Reports Server (NTRS)

    Baudendistel, Stephen; Hua, Grace

    1987-01-01

    The interface between an Intelligent Tutoring System (ITS) and the person being tutored is critical to the success of the learning process. If the interface to the ITS is confusing or non-supportive of the tutored domain, the effectiveness of the instruction will be diminished or lost entirely. Consequently, the interface to an ITS should be highly integrated with the domain to provide a robust and semantically rich learning environment. In building an ITS for ZetaLISP on a LISP Machine, a Desktop Interface was designed to support a programming learning environment. Using the bitmapped display, windows, and mouse, three desktops were designed to support self-study and tutoring of ZetaLISP. Through organization, well-defined boundaries, and domain support facilities, the desktops provide substantial flexibility and power for the student and facilitate learning ZetaLISP programming while screening the student from the complex LISP Machine environment. The student can concentrate on learning ZetaLISP programming and not on how to operate the interface or a LISP Machine.

  11. Collaborative visual analytics of radio surveys in the Big Data era

    NASA Astrophysics Data System (ADS)

    Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.

    2017-06-01

    Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualisation environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered by a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, and parallel data query, along with data management. Its modularity makes it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform and allowing the research process to continue wherever you are.

  12. An Ecological Framework for Cancer Communication: Implications for Research

    PubMed Central

    Intille, Stephen S; Zabinski, Marion F

    2005-01-01

    The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. 
This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614

  13. A hardware-in-the-loop simulation program for ground-based radar

    NASA Astrophysics Data System (ADS)

    Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna

    2011-06-01

    A radar system created using an embedded computer system needs testing. The way to test an embedded computer system is different from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the outputs of the radar. More often than not, not all of the building blocks of the radar system are available to test, which requires the engineer to test parts of the radar system using a "black box" approach. A common way to test software code in a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations. The outputs are then compared against the expected values, which requires the engineer to use valid test scenarios. We will present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows rapid simulated evaluation of ground-based radar performance in a laboratory environment.

  14. Multimedia architectures: from desktop systems to portable appliances

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Vasudev; Konstantinides, Konstantinos; Natarajan, Balas R.

    1997-01-01

    Future desktop and portable computing systems will have as their core an integrated multimedia system. Such a system will seamlessly combine digital video, digital audio, computer animation, text, and graphics. Furthermore, such a system will allow for mixed-media creation, dissemination, and interactive access in real time. Multimedia architectures that need to support these functions have traditionally required special display and processing units for the different media types. This approach tends to be expensive and is inefficient in its use of silicon. Furthermore, such media-specific processing units are unable to cope with the fluid nature of the multimedia market wherein the needs and standards are changing and system manufacturers may demand a single component media engine across a range of products. This constraint has led to a shift towards providing a single-component multimedia specific computing engine that can be integrated easily within desktop systems, tethered consumer appliances, or portable appliances. In this paper, we review some of the recent architectural efforts in developing integrated media systems. We primarily focus on two efforts, namely the evolution of multimedia-capable general purpose processors and a more recent effort in developing single component mixed media co-processors. Design considerations that could facilitate the migration of these technologies to a portable integrated media system also are presented.

  15. Introduction to the Use of Computers in Libraries: A Textbook for the Non-Technical Student.

    ERIC Educational Resources Information Center

    Ogg, Harold C.

    This book outlines computing and information science from the perspective of what librarians and educators need to do with computer technology and how it can help them perform their jobs more efficiently. It provides practical explanations and library applications for non-technical users of desktop computers and other library automation tools.…

  16. Recommended Computer End-User Skills for Business Students by Inc. 500 Executives and Office Systems Educators.

    ERIC Educational Resources Information Center

    Zhao, Jensen J.; Ray, Charles M.; Dye, Lee J.; Davis, Rodney

    1998-01-01

    Executives (n=63) and office-systems educators (n=88) recommended for workers the following categories of computer end-user skills: hardware, operating systems, word processing, spreadsheets, database, desktop publishing, and presentation. (SK)

  17. Desktop publishing and validation of custom near visual acuity charts.

    PubMed

    Marran, Lynn; Liu, Lei; Lau, George

    2008-11-01

    Customized visual acuity (VA) assessment is an important part of basic and clinical vision research. Desktop computer based distance VA measurements have been utilized, and shown to be accurate and reliable, but computer based near VA measurements have not been attempted, mainly due to the limited spatial resolution of computer monitors. In this paper, we demonstrate how to use desktop publishing to create printed custom near VA charts. We created a set of six near VA charts in a logarithmic progression, 20/20 through 20/63, with multiple lines of the same acuity level, different letter arrangements in each line and a random noise background. This design allowed repeated measures of subjective accommodative amplitude without the potential artifact of familiarity of the optotypes. The background maintained a constant and spatial frequency rich peripheral stimulus for accommodation across the six different acuity levels. The paper describes in detail how pixel-wise accurate black and white bitmaps of Sloan optotypes were used to create the printed custom VA charts. At all acuity levels, the physical sizes of the printed custom optotypes deviated no more than 0.034 log units from that of the standard, satisfying the 0.05 log unit ISO criterion we used to demonstrate physical equivalence. Also, at all acuity levels, log unit differences in the mean target distance for which reliable recognition of letters first occurred for the printed custom optotypes compared to the standard were found to be below 0.05, satisfying the 0.05 log unit ISO criterion we used to demonstrate functional equivalence. It is possible to use desktop publishing to create custom near VA charts that are physically and functionally equivalent to standard VA charts produced by a commercial printing process.
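
The six-chart logarithmic progression (20/20 through 20/63) corresponds to 0.1 logMAR steps, and the arithmetic can be checked directly. The 40 cm working distance below is our assumption for illustration, not a value taken from the paper.

```python
# Snellen denominators and physical letter heights for a 0.1 logMAR
# progression; the 40 cm viewing distance is an assumed near-work value.
import math

DISTANCE_MM = 400.0            # assumed near working distance (40 cm)

for step in range(6):
    logmar = 0.1 * step
    snellen = 20 * 10 ** logmar                   # Snellen denominator
    # a 20/20 optotype subtends 5 arcmin; scale the angle by 10**logMAR
    angle_deg = (5.0 * 10 ** logmar) / 60.0
    height_mm = DISTANCE_MM * math.tan(math.radians(angle_deg))
    print(f"20/{round(snellen)}  logMAR {logmar:.1f}  height {height_mm:.2f} mm")
```

Rounding the denominators reproduces the paper's chart sequence 20/20, 20/25, 20/32, 20/40, 20/50, 20/63, and the sub-millimetre letter heights show why printer resolution, rather than monitor resolution, is needed for near charts.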

  18. Higher-order ice-sheet modelling accelerated by multigrid on graphics cards

    NASA Astrophysics Data System (ADS)

    Brædstrup, Christian; Egholm, David

    2013-04-01

    Higher-order ice flow modelling is a very computationally intensive process, owing primarily to the nonlinear influence of the horizontal stress coupling. When applied to simulating long-term glacial landscape evolution, ice-sheet models must consider very long time series, while both high temporal and spatial resolution are needed to resolve small effects. The use of higher-order and full-Stokes models has therefore been very limited in this field. However, recent advances in graphics card (GPU) technology for high performance computing have proven extremely efficient in accelerating many large-scale scientific computations. General purpose GPU (GPGPU) technology is cheap, has a low power consumption, and fits into a normal desktop computer. It could therefore provide a powerful tool for many glaciologists working on ice flow models. Our current research focuses on utilising the GPU as a tool in ice-sheet and glacier modelling. To this end we have implemented the Integrated Second-Order Shallow Ice Approximation (iSOSIA) equations on the device using the finite difference method. To accelerate the computations, the GPU solver uses a non-linear red-black Gauss-Seidel iterator coupled with a Full Approximation Scheme (FAS) multigrid setup to further aid convergence. The GPU finite difference implementation provides the inherent parallelization that scales from hundreds to several thousands of cores on newer cards. We demonstrate the efficiency of the GPU multigrid solver using benchmark experiments.
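
The red-black Gauss-Seidel smoother mentioned above is easy to sketch for a linear model problem. The iSOSIA system itself is nonlinear and GPU-resident; this serial NumPy version, solving the Poisson equation -∇²u = f on a unit square, only shows why the two-colour ordering is parallelizable: cells of one colour have neighbours only of the other colour, so each half-sweep can update all of its cells at once.

```python
# Red-black Gauss-Seidel for -laplace(u) = f on the unit square, u = 0 on
# the boundary. Model problem of our choosing, not the iSOSIA equations.
import numpy as np

def red_black_gauss_seidel(f, n=17, sweeps=200):
    h = 1.0 / (n - 1)
    u = np.zeros((n, n))
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    interior = (i > 0) & (i < n - 1) & (j > 0) & (j < n - 1)
    for _ in range(sweeps):
        for colour in (0, 1):                    # red half-sweep, then black
            mask = interior & ((i + j) % 2 == colour)
            nbr = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                   + np.roll(u, 1, 1) + np.roll(u, -1, 1))
            # same-colour cells never neighbour each other, so this whole
            # masked update is order-independent -- i.e. GPU-friendly
            u[mask] = 0.25 * (nbr + h * h * f)[mask]
    return u

u = red_black_gauss_seidel(np.ones((17, 17)))
print(f"max u = {u.max():.4f}")
```

In the paper's setup each half-sweep becomes one GPU kernel launch, and the FAS multigrid hierarchy wraps this smoother to handle the nonlinearity and accelerate convergence.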

  19. Configuration and Management of a Cluster Computing Facility in Undergraduate Student Computer Laboratories

    ERIC Educational Resources Information Center

    Cornforth, David; Atkinson, John; Spennemann, Dirk H. R.

    2006-01-01

    Purpose: Many researchers require access to computer facilities beyond those offered by desktop workstations. Traditionally, these are offered either through partnerships, to share the cost of supercomputing facilities, or through purpose-built cluster facilities. However, funds are not always available to satisfy either of these options, and…

  20. The Role of Wireless Computing Technology in the Design of Schools.

    ERIC Educational Resources Information Center

    Nair, Prakash

    This document discusses integrating computers logically and affordably into a school building's infrastructure through the use of wireless technology. It begins by discussing why wireless networks using mobile computers are preferable to desktop machines in each classoom. It then explains the features of a wireless local area network (WLAN) and…

  1. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    PubMed

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on the PC that a person with a disability commonly uses. However, configuring AT applications is not trivial, especially when users need to work on a PC that does not let them rely on their own AT tools (e.g., at work, at university, or in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. Using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.

  2. Ultra-Scale Computing for Emergency Evacuation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhaduri, Budhendra L; Nutaro, James J; Liu, Cheng

    2010-01-01

    Emergency evacuations are carried out in anticipation of a disaster such as hurricane landfall or flooding, and in response to a disaster that strikes without a warning. Existing emergency evacuation modeling and simulation tools are primarily designed for evacuation planning and are of limited value in operational support for real time evacuation management. In order to align with desktop computing, these models reduce the data and computational complexities through simple approximations and representations of real network conditions and traffic behaviors, which rarely represent real-world scenarios. With the emergence of high resolution physiographic, demographic, and socioeconomic data and supercomputing platforms, it is possible to develop micro-simulation based emergency evacuation models that can foster development of novel algorithms for human behavior and traffic assignments, and can simulate evacuation of millions of people over a large geographic area. However, such advances in evacuation modeling and simulations demand computational capacity beyond the desktop scales and can be supported by high performance computing platforms. This paper explores the motivation and feasibility of ultra-scale computing for increasing the speed of high resolution emergency evacuation simulations.

  3. 48 CFR 23.701 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Definitions. 23.701... DRUG-FREE WORKPLACE Contracting for Environmentally Preferable Products and Services 23.701 Definitions. As used in this subpart— Computer monitor means a video display unit used with a computer. Desktop...

  4. Lock It Up! Computer Security.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    1997-01-01

    The data contained on desktop computer systems and networks pose security issues for virtually every district. Sensitive information can be protected by educating users, altering the physical layout, using password protection, designating access levels, backing up data, reformatting floppy disks, using antivirus software, and installing encryption…

  5. Desktop chaotic systems: Intuition and visualization

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Melcher, Kevin J.; Qammar, Helen K.; Hartley, Tom T.

    1993-01-01

    This paper presents a dynamic study of the Wildwood Pendulum, a commercially available desktop system which exhibits a strange attractor. The purpose of studying this chaotic pendulum is twofold: to gain insight into the paradigmatic approach of modeling, simulating, and determining chaos in nonlinear systems; and to provide a desktop model of chaos as a visual tool. For this study, the nonlinear behavior of this chaotic pendulum is modeled, a computer simulation is performed, and experimental performance is measured. An assessment of the pendulum in the phase plane shows the strange attractor. Through the use of a box-assisted correlation dimension methodology, the attractor dimension is determined for both the model and the experimental pendulum systems. Correlation dimension results indicate that the pendulum and the model are chaotic and that their fractal dimensions are similar.

  6. PC as Physics Computer for LHC?

    NASA Astrophysics Data System (ADS)

    Jarp, Sverre; Simmins, Antony; Tang, Hong; Yaari, R.

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we argue that the same phenomenon might happen again. We describe a project, active since March this year in the Physics Data Processing group of CERN's CN division, in which ordinary desktop PCs running Windows (NT and 3.11) have been used to create an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described, together with some encouraging benchmark results when compared to existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (batch monitor, staging software, etc.) are also covered. Finally, a quick extrapolation of the commodity computing power available in the future is touched upon, to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments.

  7. Campus Computing, 1998. The Ninth National Survey of Desktop Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…

  8. Video streaming in nursing education: bringing life to online education.

    PubMed

    Smith-Stoner, Marilyn; Willer, Ann

    2003-01-01

    Distance education is a standard form of instruction for many colleges of nursing. Web-based course and program content has been delivered primarily through text-based presentations such as PowerPoint slides and Web search activities. However, the rapid pace of technological innovation is making more sophisticated forms of delivery, such as video streaming, available. High-quality video streams, created at the instructor's desktop or in basic recording studios, can build on PowerPoint or create new media for use on the Web. The technology required to design, produce, and upload short video-streamed course content objects to the Internet is described, along with the preparation of materials, suggested production guidelines, and examples of information presented via desktop video methods.

  9. Letter Writing Made Easy.

    ERIC Educational Resources Information Center

    Porec, Carol J.

    1989-01-01

    Describes how "The Children's Writing and Publishing Center" (a desktop publishing program for elementary students) combines word processing with computer graphics and motivates students to write letters. (MM)

  10. MLJ Computer Corner

    ERIC Educational Resources Information Center

    Brink, Dan

    1987-01-01

    Reviews the current state of printing software and printing hardware compatibility and capacity. Discusses the changing relationship between author and publisher resulting from the advent of desktop publishing. (LMO)

  11. A Librarian Without Books:Systems Librarianship in Astronomy

    NASA Astrophysics Data System (ADS)

    Kneale, R. A.

    2007-10-01

    The author discusses one aspect of the changing nature of librarianship by focusing on a high-tech microcosm of an already high-tech profession, that of systems librarianship. She is the Systems Librarian for the Advanced Technology Solar Telescope (ATST) project, based in Tucson, Arizona. The project is engaged in the design and development of a 4-meter solar telescope, planned for the summit of Haleakalā, Maui, Hawai'i. Most of the day-to-day tasks at ATST involve software in one form or another; the author makes heavy use of Remote Desktop and Virtual Network Computing (VNC) to manage installations on eight different servers (four Windows, four Unix) in two states, plus staff desktops (Windows XP) from the comfy chair in front of her computer.

  12. Page Recognition: Quantum Leap In Recognition Technology

    NASA Astrophysics Data System (ADS)

    Miller, Larry

    1989-07-01

    No milestone has proven as elusive as the always-approaching "year of the LAN," but the "year of the scanner" might claim the silver medal. Desktop scanners have been around almost as long as personal computers. And everyone thinks they are used for obvious desktop-publishing and business tasks like scanning business documents, magazine articles and other pages, and translating those words into files your computer understands. But, until now, the reality fell far short of the promise. Because it's true that scanners deliver an accurate image of the page to your computer, but the software to recognize this text has been woefully disappointing. Old optical-character recognition (OCR) software recognized such a limited range of pages as to be virtually useless to real users. (For example, one OCR vendor specified 12-point Courier font from an IBM Selectric typewriter: the same font in 10-point, or from a Diablo printer, was unrecognizable!) Computer dealers have told me the chasm between OCR expectations and reality is so broad and deep that nine out of ten prospects leave their stores in disgust when they learn the limitations. And this is a very important, very unfortunate gap. Because the promise of recognition -- what people want it to do -- carries with it tremendous improvements in our productivity and ability to get tons of written documents into our computers where we can do real work with it. The good news is that a revolutionary new development effort has led to the new technology of "page recognition," which actually does deliver the promise we've always wanted from OCR. I'm sure every reader appreciates the breakthrough represented by the laser printer and page-makeup software, a combination so powerful it created new reasons for buying a computer. 
A similar breakthrough is happening right now in page recognition: the Macintosh (and, I must admit, other personal computers) equipped with a moderately priced scanner and OmniPage software (from Caere Corporation) can recognize not only different fonts (omnifont recognition) but different page (omnipage) formats, as well.

  13. Electronic Health Record Logs Indicate That Physicians Split Time Evenly Between Seeing Patients And Desktop Medicine.

    PubMed

    Tai-Seale, Ming; Olson, Cliff W; Li, Jinnan; Chan, Albert S; Morikawa, Criss; Durbin, Meg; Wang, Wei; Luft, Harold S

    2017-04-01

    Time spent by physicians is a key resource in health care delivery. This study used data captured by the access time stamp functionality of an electronic health record (EHR) to examine physician work effort. This is a potentially powerful, yet unobtrusive, way to study physicians' use of time. We used data on physicians' time allocation patterns captured by over thirty-one million EHR transactions in the period 2011-14 recorded by 471 primary care physicians, who collectively worked on 765,129 patients' EHRs. Our results suggest that the physicians logged an average of 3.08 hours on office visits and 3.17 hours on desktop medicine each day. Desktop medicine consists of activities such as communicating with patients through a secure patient portal, responding to patients' online requests for prescription refills or medical advice, ordering tests, sending staff messages, and reviewing test results. Over time, log records from physicians showed a decline in the time allocated to face-to-face visits, accompanied by an increase in time allocated to desktop medicine. Staffing and scheduling in the physician's office, as well as provider payment models for primary care practice, should account for these desktop medicine efforts. Project HOPE—The People-to-People Health Foundation, Inc.

  14. Fast Ordered Sampling of DNA Sequence Variants.

    PubMed

    Greenberg, Anthony J

    2018-05-04

    Explosive growth in the amount of genomic data is matched by increasing power of consumer-grade computers. Even applications that require powerful servers can be quickly tested on desktop or laptop machines if we can generate representative samples from large data sets. I describe a fast and memory-efficient implementation of an on-line sampling method developed for tape drives 30 years ago. Focusing on genotype files, I test the performance of this technique on modern solid-state and spinning hard drives, and show that it performs well compared to a simple sampling scheme. I illustrate its utility by developing a method to quickly estimate genome-wide patterns of linkage disequilibrium (LD) decay with distance. I provide open-source software that samples loci from several variant format files, a separate program that performs LD decay estimates, and a C++ library that lets developers incorporate these methods into their own projects. Copyright © 2018 Greenberg.
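
    The single-pass, order-preserving draw described in this abstract can be illustrated with Knuth's classical selection-sampling Algorithm S, one of the on-line methods of that era. The sketch below is an illustrative assumption, not the author's C++ implementation: each record is kept with probability (items still needed) / (items still remaining), which guarantees exactly n selections while preserving file order.

```python
import random

def ordered_sample(records, n, rng=random):
    """Single-pass selection sampling (Knuth's Algorithm S): draw n
    records from a sequence of known length, preserving their original
    order. No shuffling or second pass over the file is required."""
    total = len(records)
    chosen = []
    seen = 0
    for rec in records:
        remaining = total - seen
        needed = n - len(chosen)
        # keep with probability needed / remaining; when they are
        # equal the record is always kept, so exactly n are selected
        if rng.random() * remaining < needed:
            chosen.append(rec)
        seen += 1
        if len(chosen) == n:
            break
    return chosen

sample = ordered_sample(list(range(1000)), 50, random.Random(1))
assert len(sample) == 50
assert sample == sorted(sample)  # original order preserved
```

    Because each record is visited once and only the n kept items are stored, the method is well suited to streaming large genotype files from disk.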

  15. A microcomputer-based daily living activity recording system.

    PubMed

    Matsuoka, Shingo; Yonezawa, Yoshiharu; Maki, Hiromichi; Ogawa, Hidekuni; Hahn, Allen W; Thayer, Julian F; Caldwell, W Morton

    2003-01-01

    A new daily living activity recording system has been developed for monitoring health conditions and living patterns, such as respiration, posture, activity/rest ratios and general activity level. The system employs a piezoelectric sensor, a dual axis accelerometer, two low-power active filters, a low-power 8-bit single chip microcomputer and a 128 MB compact flash memory. The piezoelectric sensor, whose electrical polarization voltage is produced by mechanical strain, detects body movements. Its high-frequency output components reflect body movements produced by walking and running activities, while the low frequency components are mainly respiratory. The dual axis accelerometer detects, from body X and Y tilt angles, whether the patient is standing, sitting or lying down (prone, supine, left side or right side). The detected respiratory, behavior and posture signals are stored by the compact flash memory. After recording, these data are downloaded to a desktop computer and analyzed.

  16. [Web-ring of sites for pathologists in the internet: a computer-mediated communication environment].

    PubMed

    Khramtsov, A I; Isianov, N N; Khorzhevskiĭ, V A

    2009-01-01

    The recently developed Web-ring of pathology-related Web-sites has transformed computer-mediated communication for Russian-speaking pathologists. Though the pathologists may be geographically dispersed, the network provides a complex of asynchronous and synchronous conferences for the purposes of diagnosis, consultation, education, communication, and collaboration in the field of pathology. This paper describes approaches to be used by participants of the pathology-related Web-ring. The approaches are analogous to the tools employed in telepathology and digital microscopy. One of the novel methodologies is the use of Web-based conferencing systems, in which whole-slide digital images of tissue microarrays were jointly reviewed online by pathologists at distant locations. By using ImageScope (Aperio Technologies) and WebEx Connect desktop management technology, they shared presentations and images and communicated in real time. In this manner, Web-based forums and conferences will be a powerful addition to telepathology.

  17. A web-based remote radiation treatment planning system using the remote desktop function of a computer operating system: a preliminary report.

    PubMed

    Suzuki, Keishiro; Hirasawa, Yukinori; Yaegashi, Yuji; Miyamoto, Hideki; Shirato, Hiroki

    2009-01-01

    We developed a web-based, remote radiation treatment planning system which allowed staff at an affiliated hospital to obtain support from a fully staffed central institution. Network security was based on a firewall and a virtual private network (VPN). Client computers were installed at a cancer centre, at a university hospital and at a staff home. We remotely operated the treatment planning computer using the Remote Desktop function built in to the Windows operating system. Except for the initial setup of the VPN router, no special knowledge was needed to operate the remote radiation treatment planning system. There was a time lag that seemed to depend on the volume of data traffic on the Internet, but it did not affect smooth operation. The initial cost and running cost of the system were reasonable.

  18. LTCP 2D Graphical User Interface. Application Description and User's Guide

    NASA Technical Reports Server (NTRS)

    Ball, Robert; Navaz, Homayun K.

    1996-01-01

    A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.

  19. 48 CFR 352.239-71 - Standard for encryption language.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...

  20. 48 CFR 352.239-71 - Standard for encryption language.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...

  1. 48 CFR 352.239-71 - Standard for encryption language.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... product has been validated under the Cryptographic Module Validation Program (see http://csrc.nist.gov... of the validation documentation to the Contracting Officer and the Contracting Officer's Technical... computers, desktop computers, and other mobile devices and portable media that store or process sensitive...

  2. 7 CFR 2.98 - Director, Management Services.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...

  3. 7 CFR 2.98 - Director, Management Services.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...

  4. 7 CFR 2.98 - Director, Management Services.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... management services; information technology services related to end user office automation, desktop computers, enterprise networking support, handheld devices and voice telecommunications; with authority to take actions...

  5. A Magneto-Inductive Sensor Based Wireless Tongue-Computer Interface

    PubMed Central

    Huo, Xueliang; Wang, Jia; Ghovanloo, Maysam

    2015-01-01

    We have developed a noninvasive, unobtrusive magnetic wireless tongue-computer interface, called “Tongue Drive,” to provide people with severe disabilities with flexible and effective computer access and environment control. A small permanent magnet secured on the tongue by implantation, piercing, or tissue adhesives is utilized as a tracer to track the tongue movements. The magnetic field variations inside and around the mouth due to the tongue movements are detected by a pair of three-axial linear magneto-inductive sensor modules mounted bilaterally on a headset near the user’s cheeks. After being wirelessly transmitted to a portable computer, the sensor output signals are processed by a differential field cancellation algorithm to eliminate the external magnetic field interference, and translated into user control commands, which can then be used to access a desktop computer, maneuver a powered wheelchair, or control other devices in the user’s environment. The system has been successfully tested on six able-bodied subjects for computer access by defining six individual commands to resemble mouse functions. Results show that the Tongue Drive system response time for 87% correctly completed commands is 0.8 s, which yields an information transfer rate of ~130 b/min. PMID:18990653
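
    The differential field cancellation idea rests on a simple observation: a distant interference source (e.g., the Earth's field or nearby electronics) appears almost identically on both sensor modules, while the tongue magnet, being much closer to one cheek than the other, produces a differential signature. A deliberately idealized one-axis sketch (the real algorithm works on calibrated three-axis vector data; all signal shapes below are invented for illustration):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 500)
external = 5.0 * np.sin(2 * np.pi * 50 * t)     # interference seen by both modules
tracer_left = 1.0 * np.sin(2 * np.pi * 3 * t)   # magnet signal at the left cheek
tracer_right = 0.4 * np.sin(2 * np.pi * 3 * t)  # weaker magnet signal at the right

left = external + tracer_left    # left sensor module reading
right = external + tracer_right  # right sensor module reading

# Subtracting the paired readings cancels the common-mode external
# field exactly in this idealization, leaving only the tracer term.
diff = left - right
assert np.allclose(diff, tracer_left - tracer_right)
```

    The residual differential signal is what the classifier maps onto the six mouse-like commands.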

  6. The changing nature of spacecraft operations: From the Vikings of the 1970's to the great observatories of the 1990's and beyond

    NASA Technical Reports Server (NTRS)

    Ledbetter, Kenneth W.

    1992-01-01

    Four trends in spacecraft flight operations are discussed which will reduce overall program costs. These trends are the use of high-speed, highly reliable data communications systems for distributing operations functions to more convenient and cost-effective sites; the improved capability for remote operation of sensors; a continued rapid increase in memory and processing speed of flight qualified computer chips; and increasingly capable ground-based hardware and software systems, notably those augmented by artificial intelligence functions. Changes reflected by these trends are reviewed starting from the NASA Viking missions of the early 70s, when mission control was conducted at one location using expensive and cumbersome mainframe computers and communications equipment. In the 1980s, powerful desktop computers and modems enabled the Magellan project team to operate the spacecraft remotely. In the 1990s, the Hubble Space Telescope project uses multiple color screens and automated sequencing software on small computers. Given a projection of current capabilities, future control centers will be even more cost-effective.

  7. Computational biomedicine: a challenge for the twenty-first century.

    PubMed

    Coveney, Peter V; Shublaq, Nour W

    2012-01-01

    With the relentless increase of computer power and the widespread availability of digital patient-specific medical data, we are now entering an era when it is becoming possible to develop predictive models of human disease and pathology, which can be used to support and enhance clinical decision-making. The approach amounts to a grand challenge for computational science insofar as we need to provide seamless yet secure access to large-scale heterogeneous personal healthcare data in a facile way, typically integrated into complex workflows (some parts of which may need to run on high-performance computers) and into clinical decision support software. In this paper, we review the state of the art in terms of case studies drawn from neurovascular pathologies and HIV/AIDS. These studies are representative of a large number of projects currently being performed within the Virtual Physiological Human initiative. They make demands of information technology at many scales, from the desktop to national and international infrastructures for data storage and processing, linked by high-performance networks.

  8. Using "Audacity" and One Classroom Computer to Experiment with Timbre

    ERIC Educational Resources Information Center

    Smith, Kenneth H.

    2011-01-01

    One computer, one class, and one educator can be an effective combination to engage students as a group in music composition, performance, and analysis. Having one desktop computer and a television monitor in the music classroom is not an uncommon or new scenario, especially in a time when many school budgets are being cut. This article…

  9. Numerical Optimization Using Desktop Computers

    DTIC Science & Technology

    1980-09-11

    concentrating compound parabolic trough solar collector. Thermophysical, geophysical, optical and economic analyses were used to compute a life-cycle... third computer program, NISCO, was developed to model a nonimaging concentrating compound parabolic trough solar collector using thermophysical... concentrating compound parabolic trough solar collector. C. OBJECTIVE: The objective of this thesis was to develop a system of interactive programs for the Hewlett

  10. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software

    PubMed Central

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong

    2017-01-01

    Abstract Background: Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line–based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. Results: We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. Conclusions: As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. PMID:28327936

  11. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software.

    PubMed

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2017-04-01

    Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.

  12. Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?

    PubMed Central

    Billah, Syed Masum; Ashok, Vikas; Porter, Donald E.; Ramakrishnan, IV

    2017-01-01

    Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access—an early forerunner of true ubiquitous access—screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments. PMID:28782061

  13. Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?

    PubMed

    Billah, Syed Masum; Ashok, Vikas; Porter, Donald E; Ramakrishnan, I V

    2017-05-01

    Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access-an early forerunner of true ubiquitous access-screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments.

  14. Parallelization of a spatial random field characterization process using the Method of Anchored Distributions and the HTCondor high throughput computing system

    NASA Astrophysics Data System (ADS)

    Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.

    2013-12-01

A new software application called MAD# has been coupled with the HTCondor high throughput computing system to aid scientists and educators with the characterization of spatial random fields and to enable understanding of the spatial distribution of parameters used in hydrogeologic and related modeling. MAD# is an open source desktop software application that characterizes spatial random fields using direct and indirect information through a Bayesian inverse modeling technique called the Method of Anchored Distributions (MAD). MAD relates indirect information to a target spatial random field via a forward simulation model. MAD# executes the inverse process by running the forward model multiple times, transferring information from the indirect data to the target variable. MAD# uses two parallelization profiles according to the computational resources available: one computer with multiple cores, or multiple computers with multiple cores through HTCondor. HTCondor is a system that manages a cluster of desktop computers, submitting serial or parallel jobs using scheduling policies, resource monitoring, and a job-queuing mechanism. This poster will show how MAD# reduces the execution time of the characterization of random fields using these two parallel approaches in different case studies. A test of the approach was conducted using a 1D problem with 400 cells to characterize the saturated conductivity, residual water content, and shape parameters of the Mualem-van Genuchten model in four materials via the HYDRUS model. The number of simulations evaluated in the inversion was 10 million. Using the one-computer approach (eight cores), 100,000 simulations were evaluated in 12 hours (approximately 1,200 hours for all 10 million). In the evaluation on HTCondor, 32 desktop computers (132 cores) were used, with a non-continuous processing time of 60 hours over five days. HTCondor reduced the processing time for uncertainty characterization by a factor of 20 (1,200 hours reduced to 60 hours).
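The single-machine, multi-core profile described above can be sketched with Python's multiprocessing module. The toy forward model and parameter grid below are illustrative stand-ins, not the HYDRUS model or MAD# internals:

```python
from multiprocessing import Pool

def forward_model(params):
    """Toy stand-in for one forward simulation (e.g., a HYDRUS run):
    maps a parameter sample to a simulated observation."""
    k, theta = params
    return k * theta ** 2

def run_sweep(samples, workers=8):
    """Evaluate the forward model over many parameter samples in parallel,
    mirroring the one-computer/multiple-cores parallelization profile."""
    with Pool(workers) as pool:
        return pool.map(forward_model, samples)

if __name__ == "__main__":
    grid = [(k, t) for k in range(10) for t in range(10)]
    results = run_sweep(grid, workers=4)
    print(len(results))  # → 100
```

The multiple-computers profile replaces the local pool with HTCondor job submission, but the map-over-samples structure is the same.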

  15. Technology in Education: Research Says!!

    ERIC Educational Resources Information Center

    Canuel, Ron

    2011-01-01

    A large amount of research existed in the field of technology in the classroom; however, almost all was focused on the impact of desktop computers and the infamous "school computer room". However, the activities in a classroom represent a multitude of behaviours and interventions, including personal dynamics, classroom management and…

  16. The cloud services innovation platform- enabling service-based environmental modelling using infrastructure-as-a-service cloud computing

    USDA-ARS?s Scientific Manuscript database

    Service oriented architectures allow modelling engines to be hosted over the Internet abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on user's personal computers (PCs). Migration ...

  17. Distributing Data from Desktop to Hand-Held Computers

    NASA Technical Reports Server (NTRS)

    Elmore, Jason L.

    2005-01-01

    A system of server and client software formats and redistributes data from commercially available desktop to commercially available hand-held computers via both wired and wireless networks. This software is an inexpensive means of enabling engineers and technicians to gain access to current sensor data while working in locations in which such data would otherwise be inaccessible. The sensor data are first gathered by a data-acquisition server computer, then transmitted via a wired network to a data-distribution computer that executes the server portion of the present software. Data in all sensor channels -- both raw sensor outputs in millivolt units and results of conversion to engineering units -- are made available for distribution. Selected subsets of the data are transmitted to each hand-held computer via the wired and then a wireless network. The selection of the subsets and the choice of the sequences and formats for displaying the data is made by means of a user interface generated by the client portion of the software. The data displayed on the screens of hand-held units can be updated at rates from 1 to

  18. "Software Tools" to Improve Student Writing.

    ERIC Educational Resources Information Center

    Oates, Rita Haugh

    1987-01-01

    Reviews several software packages that analyze text readability, check for spelling and style problems, offer desktop publishing capabilities, teach interviewing skills, and teach grammar using a computer game. (SRT)

  19. Desktop system for accounting, audit, and research in A&E.

    PubMed Central

    Taylor, C J; Brain, S G; Bull, F; Crosby, A C; Ferguson, D G

    1997-01-01

The development of a database for audit, research, and accounting in accident and emergency (A&E) is described. The system uses a desktop computer, an optical scanner, sophisticated optical mark reader software, and workload management data. The system is highly flexible, easy to use, and, at a cost of around 16,000 pounds, affordable for larger departments wishing to move towards accounting. For smaller departments, it may be an alternative to full computerisation. PMID:9132200

  20. Bringing the medical library to the office desktop.

    PubMed

    Brown, S R; Decker, G; Pletzke, C J

    1991-01-01

This demonstration illustrates LRC Remote Computer Services, a dual-operating-system, multi-protocol system for delivering medical library services to the medical professional's desktop. A working model draws on resources from CD-ROM and magnetic-media file services; Novell and AppleTalk network protocol suites and gating; LAN and asynchronous (dial-in) access strategies; commercial applications for MS-DOS and Macintosh workstations; and custom user interfaces. The demonstration includes a discussion of issues relevant to the delivery of these services, particularly with respect to maintenance, security, training/support, staffing, software licensing, and costs.

  1. Development, implementation, and analysis of desktop-scale model industrial equipment and a critical thinking rubric for use in chemical engineering education

    NASA Astrophysics Data System (ADS)

    Golter, Paul B.

    In order to address some of the challenges facing engineering education, namely the demand that students be better prepared to practice professional as well as technical skills, we have developed an intervention consisting of equipment, assessments and a novel pedagogy. The equipment consists of desktop-scale replicas of common industrial equipment. These are implemented in the form of modular cartridges that can be interchanged in a base unit containing water, power and instrumentation. These Desktop Learning Modules (DLMs) are effective at providing a hands on experience in most classroom environments without requiring either water or power hook-ups. Furthermore, the DLMs respond quickly enough that multiple experiments by multiple groups can be run in a single one hour class. We refined an existing critical thinking rubric to be more specific to the realm of engineering problem solving. By altering our pedagogy to a project based environment using the critical thinking rubric as a primary grading tool, we are able to observe and measure the critical thinking skills of student groups. This rubric is corroborated with an industrial perspective and measures constructs that are important to the students' future careers.

  2. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    PubMed

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines that are distributed pre-packaged with pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lower the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.

  3. The social computing room: a multi-purpose collaborative visualization environment

    NASA Astrophysics Data System (ADS)

    Borland, David; Conway, Michael; Coposky, Jason; Ginn, Warren; Idaszak, Ray

    2010-01-01

    The Social Computing Room (SCR) is a novel collaborative visualization environment for viewing and interacting with large amounts of visual data. The SCR consists of a square room with 12 projectors (3 per wall) used to display a single 360-degree desktop environment that provides a large physical real estate for arranging visual information. The SCR was designed to be cost-effective, collaborative, configurable, widely applicable, and approachable for naive users. Because the SCR displays a single desktop, a wide range of applications is easily supported, making it possible for a variety of disciplines to take advantage of the room. We provide a technical overview of the room and highlight its application to scientific visualization, arts and humanities projects, research group meetings, and virtual worlds, among other uses.

  4. 48 CFR 352.239-70 - Standard for security configurations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...

  5. 48 CFR 352.239-70 - Standard for security configurations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...

  6. A Survey of Students Participating in a Computer-Assisted Education Programme

    ERIC Educational Resources Information Center

    Yel, Elif Binboga; Korhan, Orhan

    2015-01-01

    This paper mainly examines anthropometric data, data regarding the habits, experiences, and attitudes of the students about their tablet/laptop/desktop computer use, in addition to self-reported musculoskeletal discomfort levels and frequencies of students participating in a tablet-assisted interactive education programme. A two-part questionnaire…

  7. Teaching for CAD Expertise

    ERIC Educational Resources Information Center

    Chester, Ivan

    2007-01-01

    CAD (Computer Aided Design) has now become an integral part of Technology Education. The recent introduction of highly sophisticated, low-cost CAD software and CAM hardware capable of running on desktop computers has accelerated this trend. There is now quite widespread introduction of solid modeling CAD software into secondary schools but how…

  8. 48 CFR 352.239-70 - Standard for security configurations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...

  9. 48 CFR 352.239-70 - Standard for security configurations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...

  10. 48 CFR 352.239-70 - Standard for security configurations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... configure its computers that contain HHS data with the applicable Federal Desktop Core Configuration (FDCC) (see http://nvd.nist.gov/fdcc/index.cfm) and ensure that its computers have and maintain the latest... technology (IT) that is used to process information on behalf of HHS. The following security configuration...

  11. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    ERIC Educational Resources Information Center

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs technology interaction is discussed. Following this, the…

  12. Interpretation of Coronary Angiograms Recorded Using Google Glass: A Comparative Analysis.

    PubMed

    Duong, Thao; Wosik, Jedrek; Christakopoulos, Georgios E; Martínez Parachini, José Roberto; Karatasakis, Aris; Tarar, Muhammad Nauman Javed; Resendes, Erica; Rangan, Bavana V; Roesle, Michele; Grodin, Jerrold; Abdullah, Shuaib M; Banerjee, Subhash; Brilakis, Emmanouil S

    2015-10-01

    Google Glass (Google, Inc) is a voice-activated, hands-free, optical head-mounted display device capable of taking pictures, recording videos, and transmitting data via wi-fi. In the present study, we examined the accuracy of coronary angiogram interpretation, recorded using Google Glass. Google Glass was used to record 15 angiograms with 17 major findings and the participants were asked to interpret those recordings on: (1) an iPad (Apple, Inc); or (2) a desktop computer. Interpretation was compared with the original angiograms viewed on a desktop. Ten physicians (2 interventional cardiologists and 8 cardiology fellows) participated. One point was assigned for each correct finding, for a maximum of 17 points. The mean angiogram interpretation score for Google Glass angiogram recordings viewed on an iPad or a desktop vs the original angiograms viewed on a desktop was 14.9 ± 1.1, 15.2 ± 1.8, and 15.9 ± 1.1, respectively (P=.06 between the iPad and the original angiograms, P=.51 between the iPad and recordings viewed on a desktop, and P=.43 between the recordings viewed on a desktop and the original angiograms). In a post-study survey, one of the 10 physicians (10%) was "neutral" with the quality of the recordings using Google Glass, 6 physicians (60%) were "somewhat satisfied," and 3 physicians (30%) were "very satisfied." This small pilot study suggests that the quality of coronary angiogram video recordings obtained using Google Glass may be adequate for recognition of major findings, supporting its expanding use in telemedicine.

  13. 12 CFR 1732.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., electronic tapes and back-up tapes, optical discs, CD-ROMS, and DVDs), and voicemail records; (2) Where the information is stored or located, including network servers, desktop or laptop computers and handheld...

  14. Comparison of bias analysis strategies applied to a large data set.

    PubMed

    Lash, Timothy L; Abrams, Barbara; Bodnar, Lisa M

    2014-07-01

    Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames.
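The record-weighting strategy (option 3) rests on the identity that an estimate computed over the full data set can be reproduced from the unique records and their frequencies, which is why it needs far less memory. A minimal sketch of that equivalence, using made-up data rather than the birth cohort:

```python
from collections import Counter

def mean_full(records):
    """Mean computed over every row of the full data set."""
    return sum(records) / len(records)

def mean_weighted(weighted):
    """Same mean from unique values weighted by their frequency,
    avoiding replication of identical rows."""
    total = sum(value * count for value, count in weighted.items())
    n = sum(weighted.values())
    return total / n

full = [1, 1, 1, 2, 2, 3]      # toy stand-in for a cohort variable
weighted = Counter(full)        # unique records with frequencies
assert mean_full(full) == mean_weighted(weighted)
```

The same weighting carries through to regression estimates via frequency weights, which is what makes the weighted analysis in the abstract equivalent to the full-cohort one.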

  15. Technology for Kids' Desktops: How One School Brought Its Computers Out of the Lab and into Classrooms.

    ERIC Educational Resources Information Center

    Bozzone, Meg A.

    1997-01-01

    Purchasing custom-made desks with durable glass tops to house computers and double as student work space solved the problem of how to squeeze in additional classroom computers at Johnson Park Elementary School in Princeton, New Jersey. This article describes a K-5 grade school's efforts to overcome barriers to integrating technology. (PEN)

  16. HTSFinder: Powerful Pipeline of DNA Signature Discovery by Parallel and Distributed Computing

    PubMed Central

    Karimi, Ramin; Hajdu, Andras

    2016-01-01

Comprehensive efforts toward low-cost sequencing in the past few years have led to the growth of complete genome databases. In parallel, fast and cost-effective methods and applications have been developed to accelerate sequence analysis, for which there is a strong need. Identification is the very first step of this task. Due to the difficulties, high costs, and computational challenges of alignment-based approaches, an alternative universal identification method is highly desirable. As an alignment-free approach, DNA signatures have provided new opportunities for the rapid identification of species. In this paper, we present an effective pipeline, HTSFinder (high-throughput signature finder), with a corresponding k-mer generator, GkmerG (genome k-mers generator). Using this pipeline, we determine the frequency of k-mers from the available complete genome databases for the detection of extensive DNA signatures in a reasonably short time. Our application can detect both unique and common signatures in arbitrarily selected target and nontarget databases. Hadoop and MapReduce are used in this pipeline as parallel and distributed computing tools running on commodity hardware. This approach brings the power of high-performance computing to ordinary desktop personal computers for discovering DNA signatures in large databases such as bacterial genomes. The considerable number of unique and common DNA signatures detected in the target database brings opportunities to improve the identification process, not only for polymerase chain reaction and microarray assays but also for more complex scenarios such as metagenomics and next-generation sequencing analysis. PMID:26884678

  17. HTSFinder: Powerful Pipeline of DNA Signature Discovery by Parallel and Distributed Computing.

    PubMed

    Karimi, Ramin; Hajdu, Andras

    2016-01-01

Comprehensive efforts toward low-cost sequencing in the past few years have led to the growth of complete genome databases. In parallel, fast and cost-effective methods and applications have been developed to accelerate sequence analysis, for which there is a strong need. Identification is the very first step of this task. Due to the difficulties, high costs, and computational challenges of alignment-based approaches, an alternative universal identification method is highly desirable. As an alignment-free approach, DNA signatures have provided new opportunities for the rapid identification of species. In this paper, we present an effective pipeline, HTSFinder (high-throughput signature finder), with a corresponding k-mer generator, GkmerG (genome k-mers generator). Using this pipeline, we determine the frequency of k-mers from the available complete genome databases for the detection of extensive DNA signatures in a reasonably short time. Our application can detect both unique and common signatures in arbitrarily selected target and nontarget databases. Hadoop and MapReduce are used in this pipeline as parallel and distributed computing tools running on commodity hardware. This approach brings the power of high-performance computing to ordinary desktop personal computers for discovering DNA signatures in large databases such as bacterial genomes. The considerable number of unique and common DNA signatures detected in the target database brings opportunities to improve the identification process, not only for polymerase chain reaction and microarray assays but also for more complex scenarios such as metagenomics and next-generation sequencing analysis.
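The core of such a pipeline, counting k-mer frequencies across sequences, can be sketched in plain Python in the same map-and-reduce shape that Hadoop parallelizes. This is a toy illustration of the general technique, not a reproduction of GkmerG or HTSFinder:

```python
from collections import Counter
from functools import reduce

def map_kmers(sequence, k):
    """Map step: count every overlapping k-mer in one sequence."""
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

def reduce_counts(counters):
    """Reduce step: merge per-sequence counts into global frequencies."""
    return reduce(lambda a, b: a + b, counters, Counter())

sequences = ["ACGTACGT", "ACGTTT"]   # toy genome fragments
counts = reduce_counts(map_kmers(s, 4) for s in sequences)
print(counts["ACGT"])  # → 3
```

In a Hadoop deployment the map step runs on many nodes over many genomes, and the framework handles shuffling and merging the partial counters; signature detection then filters for k-mers unique to the target database.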

  18. An ECG electrode-mounted heart rate, respiratory rhythm, posture and behavior recording system.

    PubMed

    Yoshimura, Takahiro; Yonezawa, Yoshiharu; Maki, Hiromichi; Ogawa, Hidekuni; Ninomiya, Ishio; Morton Caldwell, W

    2004-01-01

An R-R interval, respiration rhythm, posture, and behavior recording system has been developed for monitoring a patient's cardiovascular regulatory system in daily life. The recording system consists of three ECG chest electrodes, a variable-gain instrumentation amplifier, a dual-axis accelerometer, a low-power 8-bit single-chip microcomputer, and a 1024 KB EEPROM. The complete system is mounted on the chest electrodes. The R-R interval and respiration rhythm are calculated from the R waves detected in the ECG. Posture and behaviors such as walking and running are detected from the body movements recorded by the accelerometer. The detected data are stored in the EEPROM and, after recording, are downloaded to a desktop computer for analysis.
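The derived quantities such a recorder stores can be sketched from R-wave timestamps. The detection step itself (thresholding the amplified ECG) is omitted, and the sample times below are invented for illustration:

```python
def rr_intervals(r_times):
    """R-R intervals in seconds from successive R-wave detection times."""
    return [b - a for a, b in zip(r_times, r_times[1:])]

def heart_rate_bpm(r_times):
    """Mean heart rate in beats per minute from the R-R intervals."""
    rr = rr_intervals(r_times)
    return 60.0 / (sum(rr) / len(rr))

# Invented R-wave times (s); a constant 0.8 s interval gives 75 bpm.
r_times = [0.0, 0.8, 1.6, 2.4, 3.2]
print(round(heart_rate_bpm(r_times)))  # → 75
```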

  19. RICA: a reliable and image configurable arena for cyborg bumblebee based on CAN bus.

    PubMed

    Gong, Fan; Zheng, Nenggan; Xue, Lei; Xu, Kedi; Zheng, Xiaoxiang

    2014-01-01

In this paper, we designed a reliable and image-configurable flight arena, RICA, for developing cyborg bumblebees. To meet the spatial and temporal requirements of bumblebees, the Controller Area Network (CAN) bus is adopted to interconnect the LED display modules, ensuring the reliability and real-time performance of the arena system. Easily configurable interfaces, implemented as Python scripts on a desktop computer, are provided to transmit the visual patterns to the LED distributor online and configure RICA dynamically. The new arena system will be a powerful tool for investigating the quantitative relationship between visual inputs and induced flight behaviors, and will also be helpful to visual-motor research in other related fields.

  20. Software for Managing Parametric Studies

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; McCann, Karen M.; DeVivo, Adrian

    2003-01-01

The Information Power Grid Virtual Laboratory (ILab) is a Practical Extraction and Reporting Language (PERL) graphical-user-interface computer program that generates shell scripts to facilitate parametric studies performed on the Grid. (The Grid denotes a worldwide network of supercomputers used for scientific and engineering computations involving data sets too large to fit on desktop computers.) Heretofore, parametric studies on the Grid have been impeded by the need to create control-language scripts and edit input data files, painstaking tasks that are necessary for managing multiple jobs on multiple computers. ILab reflects an object-oriented approach to automation of these tasks: All data and operations are organized into packages in order to accelerate development and debugging. A container or document object in ILab, called an experiment, contains all the information (data and file paths) necessary to define a complex series of repeated, sequenced, and/or branching processes. For convenience and to enable reuse, this object is serialized to and from disk storage. At run time, the current ILab experiment is used to generate required input files and shell scripts, create directories, copy data files, and then both initiate and monitor the execution of all computational processes.
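The central task described, turning a parameter grid into per-job directories and shell scripts, can be sketched as follows. The job template, solver name, and parameter names are illustrative assumptions, not ILab's actual formats:

```python
import itertools
import os

# Illustrative job template; ILab's generated scripts differ.
TEMPLATE = """#!/bin/sh
./solver --mach {mach} --alpha {alpha} > result.txt
"""

def generate_jobs(base_dir, machs, alphas):
    """Create one directory and one shell script per parameter combination."""
    paths = []
    for i, (mach, alpha) in enumerate(itertools.product(machs, alphas)):
        job_dir = os.path.join(base_dir, f"job_{i:03d}")
        os.makedirs(job_dir, exist_ok=True)
        script = os.path.join(job_dir, "run.sh")
        with open(script, "w") as f:
            f.write(TEMPLATE.format(mach=mach, alpha=alpha))
        paths.append(script)
    return paths

scripts = generate_jobs("/tmp/ilab_demo", [0.8, 0.9], [0, 2, 4])
print(len(scripts))  # → 6
```

A tool like ILab additionally serializes the experiment definition and submits and monitors the generated jobs; only the script-generation step is shown here.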

  1. "EcoRadiology"--pulling the plug on wasted energy in the radiology department.

    PubMed

    McCarthy, Colin J; Gerstenmaier, Jan F; O' Neill, Ailbhe C; McEvoy, Sinead H; Hegarty, Chris; Heffernan, Eric J

    2014-12-01

We sought to evaluate the power consumption of various devices around the radiology department, audit our use of recycling, and review efforts by vendors to reduce the environmental impact of their products. Using a readily available power monitor, we calculated the power consumption of different devices around our department. In particular, we calculated the financial and environmental cost of leaving equipment on overnight and/or at weekends. When it was not possible to measure energy usage directly, we obtained and reviewed relevant technical manuals. We contacted vendors directly to document how the environmental impact of new technology, and the decommissioning of aging technology, is being addressed. We found that 29 of 43 desktop computers and 25 of 27 picture archiving and communications system (PACS) reporting stations were left on needlessly overnight and/or at weekends, resulting in estimated electricity costs of approximately $7253 per year while the machines were not in use, and CO2 emissions equivalent to the annual emissions of over 10 passenger cars. We discovered that none of our PACS reporting stations supported energy-saving modes such as "sleep" or "hibernate." Despite encouraging staff to turn off computers when not in use, a reaudit found no improvement in results. Simple steps such as turning off computers and air-conditioning units can produce very significant financial and environmental savings. Radiology can lead the way in making hospitals more energy efficient. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
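Estimates of this kind follow from a simple energy calculation. The wattage, idle hours, and tariff below are assumptions chosen for illustration, not the paper's measured figures:

```python
def idle_cost(n_machines, watts, idle_hours_per_year, dollars_per_kwh):
    """Annual energy (kWh) and cost of machines left on while not in use."""
    kwh = n_machines * watts * idle_hours_per_year / 1000.0
    return kwh, kwh * dollars_per_kwh

# Assumed figures: 29 desktops drawing 80 W each, idle ~4500 h/yr, $0.15/kWh.
kwh, dollars = idle_cost(29, 80, 4500, 0.15)
print(round(kwh), round(dollars))  # → 10440 1566
```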

  2. Scalable computing for evolutionary genomics.

    PubMed

    Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert

    2012-01-01

Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers, in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel, running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster, and pipeline, in a few steps. This allows researchers to scale up computations from their desktop, using available hardware, anytime it is required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages, of interest to evolutionary biology, are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software. Where Debian Med encourages packaging free and open source bioinformatics software through one central project, BioNode encourages creating free and open source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the Cloud. Next to the downloadable BioNode images, we provide tutorials online, which empower bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives, on creating and building such images.

  3. A Streaming Language Implementation of the Discontinuous Galerkin Method

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Knight, Timothy

    2005-01-01

    We present a Brook streaming language implementation of the 3-D discontinuous Galerkin method for compressible fluid flow on tetrahedral meshes. Efficient implementation of the discontinuous Galerkin method using the streaming model of computation introduces several algorithmic design challenges. Using a cycle-accurate simulator, performance characteristics have been obtained for the Stanford Merrimac stream processor. The current Merrimac design achieves 128 Gflops per chip and the desktop board is populated with 16 chips yielding a peak performance of 2 Teraflops. Total parts cost for the desktop board is less than $20K. Current cycle-accurate simulations for discretizations of the 3-D compressible flow equations yield approximately 40-50% of the peak performance of the Merrimac streaming processor chip. Ongoing work includes the assessment of the performance of the same algorithm on the 2 Teraflop desktop board with a target goal of achieving 1 Teraflop performance.

  4. Utilization of KSC Present Broadband Communications Data System for Digital Video Services

    NASA Technical Reports Server (NTRS)

    Andrawis, Alfred S.

    2002-01-01

    This report covers a feasibility study of utilizing the present KSC broadband communications data system (BCDS) for digital video services. Digital video services include compressed digital TV delivery and video-on-demand. Furthermore, the study examines the possibility of providing interactive video on demand to desktop personal computers via the KSC computer network.

  5. From Virtual Environments to Physical Environments: Exploring Interactivity in Ubiquitous-Learning Systems

    ERIC Educational Resources Information Center

    Peng, Hsinyi; Chou, Chien; Chang, Chun-Yu

    2008-01-01

    Computing devices and applications are now used beyond the desktop, in diverse environments, and this trend toward ubiquitous computing is evolving. In this study, we re-visit the interactivity concept and its applications for interactive function design in a ubiquitous-learning system (ULS). Further, we compare interactivity dimensions and…

  6. Taking the Plunge: Districts Leap into Virtualization

    ERIC Educational Resources Information Center

    Demski, Jennifer

    2010-01-01

    Moving from a traditional desktop computing environment to a virtualized solution is a daunting task. In this article, the author presents case histories of three districts that have made the conversion to virtual computing to learn about their experiences: What prompted them to make the move, and what were their objectives? Which obstacles prove…

  7. Computer assisted yarding cost analysis.

    Treesearch

    Ronald W. Mifflin

    1980-01-01

    Programs for a programmable calculator and a desk-top computer are provided for quickly determining yarding cost and comparing the economics of alternative yarding systems. The programs emphasize the importance of the relationship between production rate and machine rate, which is the hourly cost of owning and operating yarding equipment. In addition to generating the...

  8. Desktop Social Science: Coming of Age.

    ERIC Educational Resources Information Center

    Dwyer, David C.; And Others

    Beginning in 1985, Apple Computer, Inc. and several school districts began a collaboration to examine the impact of intensive computer use on instruction and learning in K-12 classrooms. This paper follows the development of a Macintosh II-based management and retrieval system for text data undertaken to store and retrieve oral reflections of…

  9. Refocusing the Vision: The Future of Instructional Technology

    ERIC Educational Resources Information Center

    Pence, Harry E.; McIntosh, Steven

    2011-01-01

    Two decades ago, many campuses mobilized a major effort to deal with a clear problem: faculty and students needed access to desktop computing technologies. Now the situation is much more complex. Responding to the current challenges, like mobile computing and social networking, will be more difficult but equally important. There is a clear need for…

  10. 75 FR 1625 - Privacy Act of 1974; Report of Amended or Altered System; Medical, Health and Billing Records System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-12

    ...., desktop, laptop, handheld or other computer types) containing protected personal identifiers or PHI is... as the National Indian Women's Resource Center, to conduct analytical and evaluation studies. 8... SYSTEM: STORAGE: File folders, ledgers, card files, microfiche, microfilm, computer tapes, disk packs...

  11. Utilization of KSC Present Broadband Communications Data System For Digital Video Services

    NASA Technical Reports Server (NTRS)

    Andrawis, Alfred S.

    2001-01-01

    This report covers a feasibility study of utilizing the present KSC broadband communications data system (BCDS) for digital video services. Digital video services include compressed digital TV delivery and video-on-demand. Furthermore, the study examines the possibility of providing interactive video on demand to desktop personal computers via the KSC computer network.

  12. The effect of psychosocial stress on muscle activity during computer work: Comparative study between desktop computer and mobile computing products.

    PubMed

    Taib, Mohd Firdaus Mohd; Bahn, Sangwoo; Yun, Myung Hwan

    2016-06-27

    The popularity of mobile computing products is well known. Thus, it is crucial to evaluate their contribution to musculoskeletal disorders during computer usage under both comfortable and stressful environments. This study explores the effect on muscle activity of different computer products used with different tasks designed to induce psychosocial stress. Fourteen male subjects performed computer tasks: sixteen combinations of four different computer products with four different tasks used to induce stress. Electromyography for four muscles in the forearm, shoulder, and neck regions and task performance were recorded. The increase in trapezius muscle activity depended on the task used to induce the stress, with higher levels of stress producing greater increases. However, this relationship was not found in the other three muscles. In addition, compared to desktop and laptop use, the lowest activity for all muscles was obtained during the use of a tablet or smartphone. The best net performance was obtained in a comfortable environment. However, during stressful conditions, the best performance can be obtained using the device that a user is most comfortable with or has the most experience with. Different computer products and different levels of stress play a major role in muscle activity during computer work. Both of these factors must be taken into account in order to reduce the occurrence of musculoskeletal disorders.

  13. FloorspaceJS - A New, Open Source, Web-Based Geometry Editor for Building Energy Modeling (BEM): Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie

    Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web-based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web-based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.

  14. Multi-Attribute Task Battery - Applications in pilot workload and strategic behavior research

    NASA Technical Reports Server (NTRS)

    Arnegard, Ruth J.; Comstock, J. R., Jr.

    1991-01-01

    The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.

  15. The multi-attribute task battery for human operator workload and strategic behavior research

    NASA Technical Reports Server (NTRS)

    Comstock, J. Raymond, Jr.; Arnegard, Ruth J.

    1992-01-01

    The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.

  16. Evaluation of document location during computer use in terms of neck muscle activity and neck movement.

    PubMed

    Goostrey, Sonya; Treleaven, Julia; Johnston, Venerina

    2014-05-01

    This study evaluated the impact on neck movement and muscle activity of placing documents in three commonly used locations: in-line, flat on the desktop left of the keyboard, and laterally placed level with the computer screen. Neck excursion during three standard head movements between the computer monitor and each document location, and neck extensor and upper trapezius muscle activity during a 5 min typing task for each of the document locations, were measured in 20 healthy participants. Results indicated that muscle activity and neck flexion were least when documents were placed laterally, suggesting it may be the optimal location. The desktop option produced both the greatest neck movement and the greatest muscle activity in all muscle groups. The in-line document location required significantly more neck flexion but less lateral flexion and rotation than the laterally placed document. Evaluation of other holders is needed to guide decision making for this commonly used office equipment.

  17. Advanced Stirling Radioisotope Generator Thermal Power Model in Thermal Desktop SINDA/FLUINT Analyzer

    NASA Technical Reports Server (NTRS)

    Wang, Xiao-Yen; Fabanich, William A.; Schmitz, Paul C.

    2012-01-01

    This paper presents a three-dimensional Advanced Stirling Radioisotope Generator (ASRG) thermal power model that was built using the Thermal Desktop SINDA/FLUINT thermal analyzer. The model was correlated with ASRG engineering unit (EU) test data and ASRG flight unit predictions from Lockheed Martin's Ideas TMG thermal model. ASRG performance under (1) ASC hot-end temperatures, (2) ambient temperatures, and (3) years of mission for the general purpose heat source fuel decay was predicted using this model for the flight unit. The results were compared with those reported by Lockheed Martin and showed good agreement. In addition, the model was used to study the performance of the ASRG flight unit for operations on the ground and on the surface of Titan, and the concept of using gold film to reduce thermal loss through insulation was investigated.

  18. DataPlus™ - a revolutionary applications generator for DOS hand-held computers

    Treesearch

    David Dean; Linda Dean

    2000-01-01

    DataPlus allows the user to easily design data collection templates for DOS-based hand-held computers that mimic clipboard data sheets. The user designs and tests the application on the desktop PC and then transfers it to a DOS field computer. Other features include: error checking, missing data checks, and sensor input from RS-232 devices such as bar code wands,...

  19. 2-D Animation's Not Just for Mickey Mouse.

    ERIC Educational Resources Information Center

    Weinman, Lynda

    1995-01-01

    Discusses characteristics of two-dimensional (2-D) animation; highlights include character animation, painting issues, and motion graphics. Sidebars present Silicon Graphics animations tools and 2-D animation programs for the desktop computer. (DGM)

  20. Effective Web and Desktop Retrieval with Enhanced Semantic Spaces

    NASA Astrophysics Data System (ADS)

    Daoud, Amjad M.

    We describe the design and implementation of the NETBOOK prototype system for collecting, structuring, and efficiently creating semantic vectors for concepts, noun phrases, and documents from a corpus of free full-text ebooks available on the World Wide Web. Concept maps generated automatically from correlated index terms and extracted noun phrases are used to build a powerful conceptual index of individual pages. To ensure the scalability of our system, dimension reduction is performed using Random Projection [13]. Furthermore, we present a complete evaluation of the relative effectiveness of the NETBOOK system versus the Google Desktop [8].
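The Random Projection step mentioned in the abstract can be sketched in plain Python; the dimensions and vectors below are illustrative, not taken from the NETBOOK corpus:

```python
# Sketch of dimension reduction by Random Projection: multiply
# high-dimensional vectors by a random Gaussian matrix scaled so that
# pairwise distances are approximately preserved (Johnson-Lindenstrauss).
import math, random

random.seed(0)
d_high, d_low = 1000, 200  # illustrative sizes

def random_vector(n):
    return [random.gauss(0.0, 1.0) for _ in range(n)]

# Two high-dimensional "document vectors" (synthetic, for illustration).
u, v = random_vector(d_high), random_vector(d_high)

# Random projection matrix, scaled by 1/sqrt(d_low) to preserve norms.
R = [[random.gauss(0.0, 1.0) / math.sqrt(d_low) for _ in range(d_low)]
     for _ in range(d_high)]

def project(x):
    return [sum(x[i] * R[i][j] for i in range(d_high)) for j in range(d_low)]

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

ratio = dist(project(u), project(v)) / dist(u, v)
print(round(ratio, 2))  # close to 1.0: the distance survives the projection
```

The appeal for a semantic-indexing system is that the projection is data-independent and cheap, so the reduced space can be built in a single pass over the corpus.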

  1. Interactive physically-based sound simulation

    NASA Astrophysics Data System (ADS)

    Raghuvanshi, Nikunj

    The realization of interactive, immersive virtual worlds requires the ability to present a realistic audio experience that convincingly complements their visual rendering. Physical simulation is a natural way to achieve such realism, enabling deeply immersive virtual worlds. However, physically-based sound simulation is very computationally expensive owing to the high-frequency, transient oscillations underlying audible sounds. The increasing computational power of desktop computers has served to reduce the gap between required and available computation, and it has become possible to bridge this gap further by using a combination of algorithmic improvements that exploit the physical, as well as perceptual, properties of audible sounds. My thesis is a step in this direction. My dissertation concentrates on developing real-time techniques for both sub-problems of sound simulation: synthesis and propagation. Sound synthesis is concerned with generating the sounds produced by objects due to elastic surface vibrations upon interaction with the environment, such as collisions. I present novel techniques that exploit human auditory perception to simulate scenes with hundreds of sounding objects undergoing impact and rolling in real time. Sound propagation is the complementary problem of modeling the high-order scattering and diffraction of sound in an environment as it travels from source to listener. I discuss my work on a novel numerical acoustic simulator (ARD) that is a hundred times faster and consumes ten times less memory than a high-accuracy finite-difference technique, allowing acoustic simulations on previously intractable spaces, such as a cathedral, on a desktop computer.
Lastly, I present my work on interactive sound propagation that leverages my ARD simulator to render the acoustics of arbitrary static scenes for multiple moving sources and a moving listener in real time, while accounting for scene-dependent effects such as low-pass filtering and smooth attenuation behind obstructions, reverberation, scattering from complex geometry, and sound focusing. This is enabled by a novel compact representation that takes a thousand times less memory than a direct scheme, thus reducing memory footprints to fit within available main memory. To the best of my knowledge, this is the only technique and system in existence to demonstrate auralization of physical wave-based effects in real time on large, complex 3D scenes.

  2. Near Retirement Age (≥55 Years) Self-Reported Physical Symptoms and Use of Computers/Mobile Phones at Work and at Leisure

    PubMed Central

    Gobba, Fabriziomaria

    2017-01-01

    The aim of this research is to study the symptoms and use of computers/mobile phones of individuals nearing retirement age (≥55 years). A questionnaire was sent to 15,000 Finns (aged 18–65). People who were ≥55 years of age were compared to the rest of the population. Six thousand one hundred and twenty-one persons responded to the questionnaire; 1226 of them were ≥55 years of age. Twenty-four percent of the ≥55-year-old respondents used desktop computers daily for leisure; 47.8% of them frequently experienced symptoms in the neck, and 38.5% in the shoulders. Workers aged ≥55 years had many more physical symptoms than younger people, except with respect to symptoms of the neck. Female daily occupational users of desktop computers had more physical symptoms in the neck. It is essential to take into account that, for people aged ≥55 years, the use of technology can be a sign of wellness. However, physical symptoms in the neck can be associated with the use of computers. PMID:28991182

  3. An embedded processor for real-time atmospheric compensation

    NASA Astrophysics Data System (ADS)

    Bodnar, Michael R.; Curt, Petersen F.; Ortiz, Fernando E.; Carrano, Carmen J.; Kelmelis, Eric J.

    2009-05-01

    Imaging over long distances is crucial to a number of defense and security applications, such as homeland security and launch tracking. However, the image quality obtained from current long-range optical systems can be severely degraded by the turbulent atmosphere in the path between the region under observation and the imager. While this obscured image information can be recovered using post-processing techniques, the computational complexity of such approaches has prohibited deployment in real-time scenarios. To overcome this limitation, we have coupled a state-of-the-art atmospheric compensation algorithm, the average-bispectrum speckle method, with a powerful FPGA-based embedded processing board. The end result is a lightweight, low-power image processing system that improves the quality of long-range imagery in real time, and uses modular video I/O to provide a flexible interface to most common digital and analog video transport methods. By leveraging the custom, reconfigurable nature of the FPGA, a 20x speed increase over a modern desktop PC was achieved in a form factor that is compact, low-power, and field-deployable.

  4. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    PubMed Central

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105

  5. MICA: desktop software for comprehensive searching of DNA databases

    PubMed Central

    Stokes, William A; Glick, Benjamin S

    2006-01-01

    Background Molecular biologists work with DNA databases that often include entire genomes. A common requirement is to search a DNA database to find exact matches for a nondegenerate or partially degenerate query. The software programs available for such purposes are normally designed to run on remote servers, but an appealing alternative is to work with DNA databases stored on local computers. We describe a desktop software program termed MICA (K-Mer Indexing with Compact Arrays) that allows large DNA databases to be searched efficiently using very little memory. Results MICA rapidly indexes a DNA database. On a Macintosh G5 computer, the complete human genome could be indexed in about 5 minutes. The indexing algorithm recognizes all 15 characters of the DNA alphabet and fully captures the information in any DNA sequence, yet for a typical sequence of length L, the index occupies only about 2L bytes. The index can be searched to return a complete list of exact matches for a nondegenerate or partially degenerate query of any length. A typical search of a long DNA sequence involves reading only a small fraction of the index into memory. As a result, searches are fast even when the available RAM is limited. Conclusion MICA is suitable as a search engine for desktop DNA analysis software. PMID:17018144
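The k-mer indexing idea behind MICA can be illustrated with a toy Python sketch; the real program uses compact arrays, supports the full 15-character DNA alphabet, and scales to whole genomes, none of which is attempted here:

```python
# Toy illustration of k-mer indexing for exact DNA search: record the
# position of every k-mer, then answer a query by seeding on its first
# k-mer and verifying the full match at each candidate position.
from collections import defaultdict

K = 4  # illustrative k-mer length

def build_index(seq):
    index = defaultdict(list)
    for i in range(len(seq) - K + 1):
        index[seq[i:i + K]].append(i)  # every k-mer's positions
    return index

def search(index, seq, query):
    # Seed with the query's first k-mer, then verify the whole query.
    hits = index.get(query[:K], [])
    return [i for i in hits if seq[i:i + len(query)] == query]

genome = "ACGTACGTGGTACGTA"  # toy "database" sequence
idx = build_index(genome)
print(search(idx, genome, "ACGTG"))  # [4]
```

Because only the matching k-mer bucket is consulted, a search touches a small fraction of the index, which is the property that lets the real program run in limited RAM.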

  6. Online Tracking

    MedlinePlus

    ... used to track you on all kinds of internet-connected devices that have browsers, such as smart phones, tablets, laptop and desktop computers. How does tracking in mobile apps occur? When you access mobile applications, companies don’t have access to ...

  7. Greenhouse Gas Emissions Model (GEM) for Medium- and Heavy-Duty Vehicle Compliance

    EPA Pesticide Factsheets

    EPA’s Greenhouse Gas Emissions Model (GEM) is a free, desktop computer application that estimates the greenhouse gas (GHG) emissions and fuel efficiency performance of specific aspects of heavy-duty vehicles.

  8. Demonstrate provider accessibility with desktop and online services.

    PubMed

    2001-10-01

    It's available on personal computers with a CD or through Internet access. Instantly assess the accessibility of your provider network, or the most promising areas in which to establish a health service, with new GIS tools.

  9. On the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods

    PubMed Central

    Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.

    2011-01-01

    We present a case study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups from 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data-rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276

  10. Distributed and collaborative synthetic environments

    NASA Technical Reports Server (NTRS)

    Bajaj, Chandrajit L.; Bernardini, Fausto

    1995-01-01

    Fast graphics workstations and increased computing power, together with improved interface technologies, have created new and diverse possibilities for developing and interacting with synthetic environments. A synthetic environment system is generally characterized by input/output devices that constitute the interface between the human senses and the synthetic environment generated by the computer; and a computation system running a real-time simulation of the environment. A basic need of a synthetic environment system is that of giving the user a plausible reproduction of the visual aspect of the objects with which he is interacting. The goal of our Shastra research project is to provide a substrate of geometric data structures and algorithms which allow the distributed construction and modification of the environment, efficient querying of objects attributes, collaborative interaction with the environment, fast computation of collision detection and visibility information for efficient dynamic simulation and real-time scene display. In particular, we address the following issues: (1) A geometric framework for modeling and visualizing synthetic environments and interacting with them. We highlight the functions required for the geometric engine of a synthetic environment system. (2) A distribution and collaboration substrate that supports construction, modification, and interaction with synthetic environments on networked desktop machines.

  11. Pest management in Douglas-fir seed orchards: a microcomputer decision method

    Treesearch

    James B. Hoy; Michael I. Haverty

    1988-01-01

    The computer program described provides a Douglas-fir seed orchard manager (user) with a quantitative method for making insect pest management decisions on a desk-top computer. The decision system uses site-specific information such as estimates of seed crop size, insect attack rates, insecticide efficacy and application costs, weather, and crop value. At sites where...

  12. Computerized preparation of a scientific poster.

    PubMed

    Lugo, M; Speaker, M G; Cohen, E J

    1989-01-01

    We prepared an attractive and effective poster using a Macintosh computer and LaserWriter and ImageWriter II printers. The advantages of preparing the poster in this fashion were increased control of the final product, decreased cost, and a sense of artistic satisfaction. Although we employed only the above-mentioned computer, the desktop publishing techniques described can be used with other systems.

  13. Detecting Target Data in Network Traffic

    DTIC Science & Technology

    2017-03-01

    COMPUTER SCIENCE from the NAVAL POSTGRADUATE SCHOOL March 2017 Approved by: Michael McCarrin, Thesis Co-Advisor; Robert Beverly, Thesis Co-Advisor; Peter...Denning, Chair, Department of Computer Science. ABSTRACT: Data exfiltration over a network poses a threat...phone. Further, Guri et al. were able to use these GSM frequencies to obtain information from a desktop computer by manipulating memory to produce GSM

  14. VASA: Interactive Computational Steering of Large Asynchronous Simulation Pipelines for Societal Infrastructure.

    PubMed

    Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S

    2014-12-01

    We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.

  15. Classification of Aerial Photogrammetric 3d Point Clouds

    NASA Astrophysics Data System (ADS)

    Becker, C.; Häni, N.; Rosinskaya, E.; d'Angelo, E.; Strecha, C.

    2017-05-01

    We present a powerful method to extract per-point semantic class labels from aerial photogrammetry data. Labelling this kind of data is important for tasks such as environmental modelling, object classification and scene understanding. Unlike previous point cloud classification methods that rely exclusively on geometric features, we show that incorporating color information yields a significant increase in accuracy in detecting semantic classes. We test our classification method on three real-world photogrammetry datasets that were generated with Pix4Dmapper Pro, and with varying point densities. We show that off-the-shelf machine learning techniques coupled with our new features allow us to train highly accurate classifiers that generalize well to unseen data, processing point clouds containing 10 million points in less than 3 minutes on a desktop computer.

  16. Liz Torres | NREL

    Science.gov Websites

    of Expertise: Customer service; Technically savvy; Event planning; Word processing/desktop publishing; Database management. Research Interests: Website design; Database design; Computational science. Technology Consulting, Westminster, CO (2007-2012); Administrative Assistant, Source One Management, Denver, CO (2005

  17. National Mobile Inventory Model (NMIM)

    EPA Pesticide Factsheets

    The National Mobile Inventory Model (NMIM) is a free, desktop computer application developed by EPA to help you develop estimates of current and future emission inventories for on-road motor vehicles and nonroad equipment. To learn more, search the archive.

  18. Lumped Parameter Model (LPM) for Light-Duty Vehicles

    EPA Pesticide Factsheets

    EPA’s Lumped Parameter Model (LPM) is a free, desktop computer application that estimates the effectiveness (CO2 Reduction) of various technology combinations or “packages,” in a manner that accounts for synergies between technologies.

  19. Accelerating Monte Carlo simulations with an NVIDIA ® graphics processor

    NASA Astrophysics Data System (ADS)

    Martinsen, Paul; Blaschke, Johannes; Künnemeyer, Rainer; Jordan, Robert

    2009-10-01

    Modern graphics cards, commonly used in desktop computers, have evolved beyond a simple interface between processor and display to incorporate sophisticated calculation engines that can be applied to general purpose computing. The Monte Carlo algorithm for modelling photon transport in turbid media has been implemented on an NVIDIA 8800 GT graphics card using the CUDA toolkit. The Monte Carlo method relies on following the trajectory of millions of photons through the sample, often taking hours or days to complete. The graphics-processor implementation, processing roughly 110 million scattering events per second, was found to run more than 70 times faster than a similar, single-threaded implementation on a 2.67 GHz desktop computer.
    Program summary
    Program title: Phoogle-C/Phoogle-G
    Catalogue identifier: AEEB_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEB_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 51 264
    No. of bytes in distributed program, including test data, etc.: 2 238 805
    Distribution format: tar.gz
    Programming language: C++
    Computer: Designed for Intel PCs. Phoogle-G requires an NVIDIA graphics card with support for CUDA 1.1
    Operating system: Windows XP
    Has the code been vectorised or parallelized?: Phoogle-G is written for SIMD architectures
    RAM: 1 GB
    Classification: 21.1
    External routines: Charles Karney random number library; Microsoft Foundation Class library; NVIDIA CUDA library [1].
    Nature of problem: The Monte Carlo technique is an effective algorithm for exploring the propagation of light in turbid media. However, accurate results require tracing the path of many photons within the media. The independence of photons naturally lends the Monte Carlo technique to implementation on parallel architectures.
Generally, parallel computing can be expensive, but recent advances in consumer-grade graphics cards have opened the possibility of high-performance desktop parallel computing. Solution method: In this pair of programmes we have implemented the Monte Carlo algorithm described by Prahl et al. [2] for photon transport in infinite scattering media to compare the performance of two readily accessible architectures: a standard desktop PC and a consumer-grade graphics card from NVIDIA. Restrictions: The graphics card implementation uses single-precision floating-point numbers for all calculations. Only photon transport from an isotropic point source is supported. The graphics-card version has no user interface; the simulation parameters must be set in the source code. The desktop version has a simple user interface; however, some properties can only be accessed through an ActiveX client (such as Matlab). Additional comments: The random number library used has an LGPL (http://www.gnu.org/copyleft/lesser.html) licence. Running time: Runtime can range from minutes to months depending on the number of photons simulated and the optical properties of the medium. References: [1] http://www.nvidia.com/object/cuda_home.html. [2] S. Prahl, M. Keijzer, S.L. Jacques, A. Welch, SPIE Institute Series 5 (1989) 102.
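The Monte Carlo photon-transport scheme the abstract describes can be sketched in a few lines. The following single-threaded toy version is my own (all names and parameters are hypothetical, not Phoogle's): photons from an isotropic point source take exponentially distributed steps, deposit a fraction mu_a/mu_t of their weight at each scattering event, and rescatter isotropically.

```python
import math
import random

def _isotropic(rng):
    """Uniformly distributed direction on the unit sphere."""
    cos_t = 2.0 * rng.random() - 1.0
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    phi = 2.0 * math.pi * rng.random()
    return sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t

def simulate_photons(n_photons, mu_a, mu_s, radius, seed=12345):
    """Toy Monte Carlo for photon transport in an infinite, isotropically
    scattering medium: returns the fraction of launched weight absorbed
    within `radius` of an isotropic point source at the origin."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s            # total attenuation coefficient
    absorbed = 0.0
    for _ in range(n_photons):
        x = y = z = 0.0
        ux, uy, uz = _isotropic(rng)
        w = 1.0                   # photon weight
        while w > 1e-4:           # terminate nearly exhausted photons
            # Free path length drawn from an exponential distribution.
            step = -math.log(1.0 - rng.random()) / mu_t
            x += step * ux; y += step * uy; z += step * uz
            if x * x + y * y + z * z > radius * radius:
                break             # photon leaves the region of interest
            dw = w * mu_a / mu_t  # fraction of weight absorbed here
            absorbed += dw
            w -= dw
            ux, uy, uz = _isotropic(rng)  # isotropic rescattering
    return absorbed / n_photons
```

Because photon histories are independent, the outer loop parallelizes trivially, which is exactly the property the GPU implementation exploits.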

  20. Centrifuge: rapid and sensitive classification of metagenomic sequences

    PubMed Central

    Song, Li; Breitwieser, Florian P.

    2016-01-01

    Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. PMID:27852649
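The BWT/FM-index machinery behind Centrifuge can be illustrated with a toy backward-search implementation. This sketch (all names are hypothetical; Centrifuge's real index adds heavy compression, sampling, and taxonomic bookkeeping) builds the BWT from a naive suffix array and counts pattern occurrences:

```python
def bwt_index(text):
    """Build a toy FM-index over `text`: BWT string, C table, and
    cumulative occurrence counts."""
    text += "$"  # unique terminator, lexicographically smallest
    sa = sorted(range(len(text)), key=lambda i: text[i:])
    bwt = "".join(text[i - 1] for i in sa)  # char preceding each suffix
    # C[c] = number of characters in text strictly smaller than c.
    counts = {}
    for ch in bwt:
        counts[ch] = counts.get(ch, 0) + 1
    C, total = {}, 0
    for ch in sorted(counts):
        C[ch] = total
        total += counts[ch]
    # occ[i] maps c -> occurrences of c in bwt[:i].
    occ = [dict()]
    for ch in bwt:
        row = dict(occ[-1])
        row[ch] = row.get(ch, 0) + 1
        occ.append(row)
    return bwt, C, occ

def count_matches(pattern, C, occ):
    """Backward search: number of occurrences of `pattern` in the
    indexed text, in O(len(pattern)) table lookups."""
    lo, hi = 0, len(occ) - 1
    for ch in reversed(pattern):
        if ch not in C:
            return 0
        lo = C[ch] + occ[lo].get(ch, 0)
        hi = C[ch] + occ[hi].get(ch, 0)
        if lo >= hi:
            return 0
    return hi - lo
```

The query cost is independent of the text length, which is why FM-index-based classifiers can search multi-gigabase reference databases at read-classification speed.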

  1. Los Alamos radiation transport code system on desktop computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.

    The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  2. Visualization and Interaction in Research, Teaching, and Scientific Communication

    NASA Astrophysics Data System (ADS)

    Ammon, C. J.

    2017-12-01

    Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices give those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how even a little programming ability can free scientists from the constraints of existing tools and can foster a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available on desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but they advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.

  3. Payload accommodation and development planning tools - A Desktop Resource Leveling Model (DRLM)

    NASA Technical Reports Server (NTRS)

    Hilchey, John D.; Ledbetter, Bobby; Williams, Richard C.

    1989-01-01

    The Desktop Resource Leveling Model (DRLM) has been developed as a tool to rapidly structure and manipulate accommodation, schedule, and funding profiles for any kind of experiments, payloads, facilities, and flight systems or other project hardware. The model creates detailed databases describing 'end item' parameters, such as mass, volume, power requirements or costs and schedules for payload, subsystem, or flight system elements. It automatically spreads costs by calendar quarters and sums costs or accommodation parameters by total project, payload, facility, payload launch, or program phase. Final results can be saved or printed out, automatically documenting all assumptions, inputs, and defaults.
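The kind of quarterly cost spreading the abstract attributes to DRLM might look like the following sketch. The even-spreading rule and the function name are my assumptions; the abstract does not specify the model's actual spreading rules.

```python
def spread_cost(total, start, end):
    """Spread `total` evenly over calendar quarters from `start` to
    `end` inclusive, where quarters are (year, quarter) tuples.
    A hypothetical stand-in for DRLM's cost-spreading step."""
    quarters = []
    y, q = start
    while (y, q) <= end:          # tuple comparison: year, then quarter
        quarters.append((y, q))
        q += 1
        if q == 5:                # roll over into the next year
            y, q = y + 1, 1
    share = total / len(quarters)
    return {yq: share for yq in quarters}
```

Summing such per-quarter dictionaries across end items would give the project- or program-level totals the abstract describes.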

  4. Virtual reality hardware for use in interactive 3D data fusion and visualization

    NASA Astrophysics Data System (ADS)

    Gourley, Christopher S.; Abidi, Mongi A.

    1997-09-01

    Virtual reality has become a tool for use in many areas of research. We have designed and built a VR system for use in range data fusion and visualization. One major VR tool is the CAVE. This is the ultimate visualization tool, but comes with a large price tag. Our design uses a unique CAVE whose graphics are powered by a desktop computer instead of a larger rack machine making it much less costly. The system consists of a screen eight feet tall by twenty-seven feet wide giving a variable field-of-view currently set at 160 degrees. A silicon graphics Indigo2 MaxImpact with the impact channel option is used for display. This gives the capability to drive three projectors at a resolution of 640 by 480 for use in displaying the virtual environment and one 640 by 480 display for a user control interface. This machine is also the first desktop package which has built-in hardware texture mapping. This feature allows us to quickly fuse the range and intensity data and other multi-sensory data. The final goal is a complete 3D texture mapped model of the environment. A dataglove, magnetic tracker, and spaceball are to be used for manipulation of the data and navigation through the virtual environment. This system gives several users the ability to interactively create 3D models from multiple range images.

  5. Lsiviewer 2.0 - a Client-Oriented Online Visualization Tool for Geospatial Vector Data

    NASA Astrophysics Data System (ADS)

    Manikanta, K.; Rajan, K. S.

    2017-09-01

    Geospatial data visualization has predominantly been done through applications installed and run in a desktop environment. Over the last decade, with the advent of web technologies and their adoption by the geospatial community, the server-client model, in which the server handles and renders the data and the client visualizes it, has been the most prevalent approach in Web-GIS. While client devices have become functionally more powerful over recent years, this model has largely ignored them and remains a server-dominant computing paradigm. In this paper, an attempt has been made to develop and demonstrate LSIViewer - a simple, easy-to-use and robust online geospatial data visualisation system for the user's own data that harnesses the client's capabilities for data rendering and user-interactive styling, with a reduced load on the server. The developed system can support multiple geospatial vector formats and can be integrated with other web-based systems like WMS, WFS, etc. The technology stack used to build this system is Node.js on the server side and HTML5 Canvas and JavaScript on the client side. Various tests run on a range of vector datasets, up to 35 MB, showed that the time taken to render the vector data using LSIViewer is comparable to a desktop GIS application, QGIS, on an identical system.

  6. A comparison between three electronic media and in-person learning for continuing education in physical rehabilitation.

    PubMed

    Lemaire, Edward; Greene, G

    2003-01-01

    We produced continuing education material in physical rehabilitation using a variety of electronic media. We compared four methods of delivering the learning modules: in person with a computer projector, desktop videoconferencing, Web pages and CD-ROM. Health-care workers at eight community hospitals and two nursing homes were asked to participate in the project. A total of 394 questionnaires were received for all modalities: 73 for in-person sessions, 50 for desktop conferencing, 227 for Web pages and 44 for CD-ROM. This represents a 100% response rate from the in-person, desktop conferencing and CD-ROM groups; the response rate for the Web group is unknown, since the questionnaires were completed online. Almost all participants found the modules to be helpful in their work. The CD-ROM group gave significantly higher ratings than the Web page group, although all four learning modalities received high ratings. A combination of all four modalities would be required to provide the best possible learning opportunity.

  7. Fabrication of low cost soft tissue prostheses with the desktop 3D printer

    NASA Astrophysics Data System (ADS)

    He, Yong; Xue, Guang-Huai; Fu, Jian-Zhong

    2014-11-01

    Soft tissue prostheses such as artificial ears, eyes and noses are widely used in maxillofacial rehabilitation. In this report we demonstrate how to fabricate a soft prosthesis mold with a low-cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). First, the anatomy is scanned with a 3D scanner; then a tissue casting mold is designed on a computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold, removing the staircase effect and acquiring a smooth surface. Finally, medical grade silicone is cast into the mold. After the silicone is cured, the fine soft prosthesis can be removed from the mold. Utilizing the SPPC method, soft prostheses with a smooth surface and complicated structure can be fabricated at low cost. Accordingly, the total cost of fabricating an ear prosthesis is about $30, which is much lower than with current soft prosthesis fabrication methods.

  8. Fabrication of low cost soft tissue prostheses with the desktop 3D printer

    PubMed Central

    He, Yong; Xue, Guang-huai; Fu, Jian-zhong

    2014-01-01

    Soft tissue prostheses such as artificial ears, eyes and noses are widely used in maxillofacial rehabilitation. In this report we demonstrate how to fabricate a soft prosthesis mold with a low-cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). First, the anatomy is scanned with a 3D scanner; then a tissue casting mold is designed on a computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold, removing the staircase effect and acquiring a smooth surface. Finally, medical grade silicone is cast into the mold. After the silicone is cured, the fine soft prosthesis can be removed from the mold. Utilizing the SPPC method, soft prostheses with a smooth surface and complicated structure can be fabricated at low cost. Accordingly, the total cost of fabricating an ear prosthesis is about $30, which is much lower than with current soft prosthesis fabrication methods. PMID:25427880

  9. Cybersickness and desktop simulations: field of view effects and user experience

    NASA Astrophysics Data System (ADS)

    Toet, Alexander; de Vries, Sjoerd C.; van Emmerik, Martijn L.; Bos, Jelte E.

    2008-04-01

    We used a desktop computer game environment to study the effect of Field-of-View (FOV) on cybersickness. In particular, we examined the effect of differences between the internal FOV (iFOV, the FOV which the graphics generator uses to render its images) and the external FOV (eFOV, the FOV of the presented images as seen from the physical viewpoint of the observer). Somewhat counter-intuitively, we find that congruent iFOVs and eFOVs lead to a higher incidence of cybersickness. A possible explanation is that the incongruent conditions were too extreme, thereby reducing the experience of vection. We also studied the user experience (appraisal) of this virtual environment as a function of the degree of cybersickness. We find that cybersick participants experience the simulated environment as less pleasant and more arousing, and possibly also as more distressing. Our present findings have serious implications for desktop simulations used in both military and civilian training, instruction and planning applications.

  10. Fabrication of low cost soft tissue prostheses with the desktop 3D printer.

    PubMed

    He, Yong; Xue, Guang-huai; Fu, Jian-zhong

    2014-11-27

    Soft tissue prostheses such as artificial ears, eyes and noses are widely used in maxillofacial rehabilitation. In this report we demonstrate how to fabricate a soft prosthesis mold with a low-cost desktop 3D printer. The fabrication method used is referred to as Scanning Printing Polishing Casting (SPPC). First, the anatomy is scanned with a 3D scanner; then a tissue casting mold is designed on a computer and printed with a desktop 3D printer. Subsequently, a chemical polishing method is used to polish the casting mold, removing the staircase effect and acquiring a smooth surface. Finally, medical grade silicone is cast into the mold. After the silicone is cured, the fine soft prosthesis can be removed from the mold. Utilizing the SPPC method, soft prostheses with a smooth surface and complicated structure can be fabricated at low cost. Accordingly, the total cost of fabricating an ear prosthesis is about $30, which is much lower than with current soft prosthesis fabrication methods.

  11. Architectures for single-chip image computing

    NASA Astrophysics Data System (ADS)

    Gove, Robert J.

    1992-04-01

    This paper will focus on the architectures of VLSI programmable processing components for image computing applications. TI, the maker of industry-leading RISC, DSP, and graphics components, has developed an architecture for a new-generation of image processors capable of implementing a plurality of image, graphics, video, and audio computing functions. We will show that the use of a single-chip heterogeneous MIMD parallel architecture best suits this class of processors--those which will dominate the desktop multimedia, document imaging, computer graphics, and visualization systems of this decade.

  12. Practical advantages of evolutionary computation

    NASA Astrophysics Data System (ADS)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.
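A minimal evolutionary algorithm of the sort the review discusses, including the self-adaptation of search parameters it highlights, can be sketched as follows. This is a generic (mu+lambda)-style scheme with lognormal step-size adaptation, not any specific algorithm from the paper, and all names are my own.

```python
import math
import random

def evolve(f, dim, generations=200, pop_size=20, rng=None):
    """Minimize f over R^dim with a (mu+lambda)-style evolutionary
    algorithm whose mutation step size self-adapts."""
    rng = rng or random.Random(0)
    # Each individual is (solution vector, mutation step size sigma).
    pop = [([rng.uniform(-5, 5) for _ in range(dim)], 1.0)
           for _ in range(pop_size)]
    tau = 1.0 / math.sqrt(dim)  # learning rate for sigma adaptation
    for _ in range(generations):
        children = []
        for x, sigma in pop:
            # Self-adaptation: mutate sigma first, then the vector.
            s = sigma * math.exp(tau * rng.gauss(0, 1))
            child = [xi + s * rng.gauss(0, 1) for xi in x]
            children.append((child, s))
        # Elitist selection: keep the best pop_size of parents + children.
        pop = sorted(pop + children, key=lambda ind: f(ind[0]))[:pop_size]
    return pop[0][0]

sphere = lambda v: sum(x * x for x in v)  # classic test objective
```

The point of self-adaptation is that the step size rides along with each solution, so the search tightens automatically as it closes in on an optimum, with no hand-tuned schedule.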

  13. Chemical Inventory Management at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Kraft, Shirley S.; Homan, Joseph R.; Bajorek, Michael J.; Dominguez, Manuel B.; Smith, Vanessa L.

    1997-01-01

    The Chemical Management System (CMS) is a client/server application developed with Power Builder and Sybase for the Lewis Research Center (LeRC). Power Builder is a client-server application development tool; Sybase is a relational database management system. The entire LeRC community can access the CMS from any desktop environment. The multiple functions and benefits of the CMS are addressed.

  14. Hands in space: gesture interaction with augmented-reality interfaces.

    PubMed

    Billinghurst, Mark; Piumsomboon, Tham; Huidong Bai

    2014-01-01

    Researchers at the Human Interface Technology Laboratory New Zealand (HIT Lab NZ) are investigating free-hand gestures for natural interaction with augmented-reality interfaces. They've applied the results to systems for desktop computers and mobile devices.

  15. 3 CFR 13589 - Executive Order 13589 of November 9, 2011. Promoting Efficient Spending

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... more aggressive steps to ensure the Government is a good steward of taxpayer money. Sec. 2. Agency... the number of IT devices (e.g., mobile phones, smartphones, desktop and laptop computers, and tablet...

  16. Computerized Fortune Cookies--a Classroom Treat.

    ERIC Educational Resources Information Center

    Reissman, Rose

    1996-01-01

    Discusses the use of fortune cookie fortunes in a middle school class combined with computer graphics, desktop publishing, and word processing technology to create writing assignments, games, and discussions. Topics include cultural contexts, and students creating their own fortunes. (LRW)

  17. 75 FR 20385 - Amended Certification Regarding Eligibility To Apply for Worker Adjustment Assistance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-19

    ... Carolina. The notice will be published soon in the Federal Register. At the request of the State Agency... production of desktop computers. The company reports that workers leased from Staffing Solutions, South East...

  18. Development of automated electromagnetic compatibility test facilities at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Harrison, Cecil A.

    1986-01-01

    The efforts to automate the electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center are examined. A battery of nine standard tests is to be integrated by means of a desktop computer-controller in order to provide near real-time data assessment, store the data acquired during testing on flexible disk, and provide computer production of the certification report.

  19. From Desktop to Teraflop: Exploiting the U.S. Lead in High Performance Computing. NSF Blue Ribbon Panel on High Performance Computing.

    ERIC Educational Resources Information Center

    National Science Foundation, Washington, DC.

    This report addresses an opportunity to accelerate progress in virtually every branch of science and engineering concurrently, while also boosting the American economy as business firms also learn to exploit these new capabilities. The successful rapid advancement in both science and technology creates its own challenges, four of which are…

  20. Use an Interactive Whiteboard: Get a Handle on How This Technology Can Spice up the Classroom

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2006-01-01

    Interactive whiteboards are desirable peripherals these days. When hooked up to a computer, the whiteboard's screen becomes a "live" computer desktop, which can be tapped to pull down menus, highlight, and move or open files. Users can also circle relevant sections on the projected image, draw geometric figures, and underline. Then they can save…

  1. Information Technology: A Road to the Future? To Promote Academic Justice and Excellence Series.

    ERIC Educational Resources Information Center

    Gilbert, Steven W.; Green, Kenneth C.

    This publication is intended to provide college faculty and staff with a guide to information technology issues in higher education. Midway through the 1990s, higher education confronts the second phase of the information technology (IT) revolution, a shift in emphasis from the computer as a desktop tool to the computer as a communications…

  2. WinHPC System User Basics | High-Performance Computing | NREL

    Science.gov Websites

    ... guidance for starting to use this high-performance computing (HPC) system at NREL. Also see WinHPC policies ... log out when you are finished; simply quitting Remote Desktop will keep your session active and using resources ... Log in with your NREL.gov username/password. Remember to log out when finished.

  3. Situated Computing: The Next Frontier for HCI Research

    DTIC Science & Technology

    2002-01-01

    ... population works and lives with information. Most individuals interact with information through a single portal: a personal desktop or laptop ... of single devices, nor will one person necessarily own each device. This leap of imagination requires that human-computer interaction (HCI) ... wireless technologies, including Bluetooth [16], IrDA [22] (Infrared Data Association standards for infrared communications) and HomeRF [21] ...

  4. MultiPhyl: a high-throughput phylogenomics webserver using distributed computing

    PubMed Central

    Keane, Thomas M.; Naughton, Thomas J.; McInerney, James O.

    2007-01-01

    With the number of fully sequenced genomes increasing steadily, there is greater interest in performing large-scale phylogenomic analyses from large numbers of individual gene families. Maximum likelihood (ML) has been shown repeatedly to be one of the most accurate methods for phylogenetic construction. Recently, there have been a number of algorithmic improvements in maximum-likelihood-based tree search methods. However, it can still take a long time to analyse the evolutionary history of many gene families using a single computer. Distributed computing refers to a method of combining the computing power of multiple computers in order to perform some larger overall calculation. In this article, we present the first high-throughput implementation of a distributed phylogenetics platform, MultiPhyl, capable of using the idle computational resources of many heterogeneous non-dedicated machines to form a phylogenetics supercomputer. MultiPhyl allows a user to upload hundreds or thousands of amino acid or nucleotide alignments simultaneously and perform computationally intensive tasks such as model selection, tree searching and bootstrapping of each of the alignments using many desktop machines. The program implements a set of 88 amino acid models and 56 nucleotide maximum likelihood models and a variety of statistical methods for choosing between alternative models. A MultiPhyl webserver is available for public use at: http://www.cs.nuim.ie/distributed/multiphyl.php. PMID:17553837
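The two ingredients MultiPhyl combines, farming per-alignment tasks out to many machines and choosing among candidate models, can be caricatured on a single machine. In this sketch (all names are hypothetical, and MultiPhyl's actual selection criteria and network layer are far more elaborate), each "alignment" is scored under several models and the one minimizing AIC = 2k - 2 ln L is kept:

```python
from concurrent.futures import ThreadPoolExecutor

def best_model_aic(scored_models):
    """Pick the model minimizing AIC = 2k - 2*lnL, where
    `scored_models` maps model name -> (n_params, log_likelihood)."""
    return min(scored_models,
               key=lambda m: 2 * scored_models[m][0] - 2 * scored_models[m][1])

def analyse_alignments(alignments, score_fn, workers=4):
    """Farm one model-selection task per alignment out to a worker
    pool -- a single-machine stand-in for a network of non-dedicated
    desktop machines. `score_fn(alignment)` must return the per-model
    (n_params, lnL) table; here it stands in for a real likelihood
    calculation."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(alignments,
                        pool.map(lambda a: best_model_aic(score_fn(a)),
                                 alignments)))
```

Because each alignment is an independent task, the same structure maps directly onto truly distributed workers.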

  5. A desktop 3D printer with dual extruders to produce customised electronic circuitry

    NASA Astrophysics Data System (ADS)

    Butt, Javaid; Onimowo, Dominic Adaoiza; Gohrabian, Mohammed; Sharma, Tinku; Shirvani, Hassan

    2018-03-01

    3D printing has opened new horizons for the manufacturing industry in general, and 3D printers have become the tools for technological advancements. There is a huge divide between the pricing of industrial and desktop 3D printers, with the former being expensive but capable of producing excellent quality products, and the latter being low-cost with moderate quality results. However, there is more room for improvement and enhancement in desktop systems than in industrial ones. In this paper, a desktop 3D printer called Prusa Mendel i2 has been modified and integrated with an additional extruder so that the system can work with dual extruders and produce bespoke electronic circuits. The communication between the two extruders has been established by making use of the In-Circuit Serial Programming port on the Arduino Uno controlling the printer. The biggest challenge is to control the flow of electric paint (to be dispensed by the new extruder), and CFD (Computational Fluid Dynamics) analysis has been carried out to ascertain the optimal conditions for proper dispensing. The final product is a customised electronic circuit with a base of plastic (from the 3D printer's extruder) and electronic paint (from the additional extruder) properly dispensed to create a live circuit on a plastic platform. This low-cost enhancement to a desktop 3D printer can provide a new prospect to produce multiple-material parts, where the additional extruder can be filled with any material that can be properly dispensed from its nozzle.

  6. GBOOST: a GPU-based tool for detecting gene-gene interactions in genome-wide case control studies.

    PubMed

    Yung, Ling Sing; Yang, Can; Wan, Xiang; Yu, Weichuan

    2011-05-01

    Collecting millions of genetic variations is feasible with advanced genotyping technology. With a huge amount of genetic variation data in hand, developing efficient algorithms to carry out gene-gene interaction analysis in a timely manner has become one of the key problems in genome-wide association studies (GWAS). Boolean operation-based screening and testing (BOOST), a recent work in GWAS, completes gene-gene interaction analysis in 2.5 days on a desktop computer. Compared with central processing units (CPUs), graphics processing units (GPUs) are highly parallel hardware and provide massive computing resources. We are, therefore, motivated to use GPUs to further speed up the analysis of gene-gene interactions. We implement the BOOST method on a GPU framework and name it GBOOST. GBOOST achieves a 40-fold speedup compared with BOOST. It completes the analysis of the Wellcome Trust Case Control Consortium Type 2 Diabetes (WTCCC T2D) genome data within 1.34 h on a desktop computer equipped with an Nvidia GeForce GTX 285 display card. GBOOST code is available at http://bioinformatics.ust.hk/BOOST.html#GBOOST.
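The Boolean representation that gives BOOST its speed can be sketched directly: encode each SNP's genotype column as bitsets, then build the 3x3 genotype contingency table for a SNP pair with bitwise ANDs and popcounts. A minimal illustration (names are mine, and GBOOST's GPU kernels are far more involved):

```python
def encode(genotypes):
    """Encode a genotype column (values 0/1/2, one per individual) as
    three bitsets, one per genotype value: bit i of bits[g] is set
    when individual i has genotype g."""
    bits = [0, 0, 0]
    for i, g in enumerate(genotypes):
        bits[g] |= 1 << i
    return bits

def contingency(bits_a, bits_b):
    """3x3 genotype contingency table for a SNP pair: entry (g1, g2)
    counts individuals with genotype g1 at SNP A and g2 at SNP B,
    computed by ANDing bitsets and counting set bits."""
    return [[bin(a & b).count("1") for b in bits_b] for a in bits_a]
```

Each AND-plus-popcount processes 64 individuals per machine word, which is why the Boolean formulation makes exhaustive pairwise scans tractable.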

  7. Design and construction of a desktop AC susceptometer using an Arduino and a Bluetooth for serial interface

    NASA Astrophysics Data System (ADS)

    Pérez, Israel; Ángel Hernández Cuevas, José; Trinidad Elizalde Galindo, José

    2018-05-01

    We designed and developed a desktop AC susceptometer for the characterization of materials. The system consists of a lock-in amplifier, an AC function generator, a couple of coils, a sample holder, a computer running custom software written in freeware C++, and an Arduino card coupled to a Bluetooth module. The Arduino/Bluetooth serial interface allows the user to connect from almost any computer and thus avoids the problem of connectivity between the computer and peripherals such as the lock-in amplifier and the function generator. The Bluetooth transmitter/receiver used is a commercial device which is robust and fast. These features reduce the size and increase the versatility of the susceptometer, for it can be used with a simple laptop. To test our instrument, we performed measurements on magnetic materials and show that the system is reliable at both room temperature and cryogenic temperatures (77 K). The instrument is suitable for any physics or engineering laboratory, for either research or academic purposes.
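The lock-in amplifier at the heart of the susceptometer performs synchronous demodulation, which is easy to sketch digitally: multiply the signal by in-phase and quadrature references at the excitation frequency and average. A hypothetical software analogue (not the authors' C++ code):

```python
import math

def lock_in(samples, f_ref, f_s):
    """Digital lock-in detection: demodulate `samples` (sampled at
    rate f_s) against a reference at f_ref and return the amplitude
    and phase of the signal component at f_ref. Averaging over many
    cycles rejects components at other frequencies."""
    n = len(samples)
    x = sum(s * math.cos(2 * math.pi * f_ref * i / f_s)
            for i, s in enumerate(samples)) / n   # in-phase average
    y = sum(s * math.sin(2 * math.pi * f_ref * i / f_s)
            for i, s in enumerate(samples)) / n   # quadrature average
    return 2 * math.hypot(x, y), math.atan2(y, x)
```

This is why a lock-in can pull a tiny pickup-coil voltage at the excitation frequency out of broadband noise and interference.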

  8. Lessons from a doctoral thesis.

    PubMed

    Peiris, A N; Mueller, R A; Sheridan, D P

    1990-01-01

    The production of a doctoral thesis is a time-consuming affair that until recently was done in conjunction with professional publishing services. Advances in computer technology have made many sophisticated desktop publishing techniques available to the microcomputer user. We describe the computer method used, the problems encountered, and the solutions improvised in the production of a doctoral thesis by computer. The Apple Macintosh was selected for its ease of use and intrinsic graphics capabilities. A scanner was used to incorporate text from published papers into a word processing program. The body of the text was updated and supplemented with new sections. Scanned graphics from the published papers were less suitable for publication, and the original data were replotted and modified with a graphics-drawing program. Graphics were imported and incorporated in the text. Final hard copy was produced by a laser printer and bound with both conventional and rapid new binding techniques. Microcomputer-based desktop processing methods provide a rapid and cost-effective means of communicating the written word. We anticipate that this evolving technology will have increased use by physicians in both the private and academic sectors.

  9. What caused the breach? An examination of use of information technology and health data breaches.

    PubMed

    Wikina, Suanu Bliss

    2014-01-01

    Data breaches arising from theft, loss, unauthorized access/disclosure, improper disclosure, or hacking incidents involving personal health information continue to increase every year. As of September 2013, reported breaches affecting individuals reached close to 27 million since 2009, when compilation of records on breaches began. These breaches, which involved 674 covered entities and 153 business associates, involved computer systems and networks, desktop computers, laptops, paper, e-mail, electronic health records, and removable/portable devices (CDs, USBs, x-ray films, backup tapes, etc.). Even with the increased use of health information technology by health institutions and allied businesses, theft and loss (not hacking) constitute the major types of data breaches encountered. Removable/portable devices, desktop computers, and laptops were the top sources or locations of the breached information, while the top six states-Virginia, Illinois, California, Florida, New York, and Tennessee-in terms of the number of reported breaches accounted for nearly 75 percent of the total individual breaches, 33 percent of breaches in covered entities, and about 30 percent of the total breaches involving business associates.

  10. What Caused the Breach? An Examination of Use of Information Technology and Health Data Breaches

    PubMed Central

    Wikina, Suanu Bliss

    2014-01-01

    Data breaches arising from theft, loss, unauthorized access/disclosure, improper disclosure, or hacking incidents involving personal health information continue to increase every year. As of September 2013, reported breaches affecting individuals reached close to 27 million since 2009, when compilation of records on breaches began. These breaches, which involved 674 covered entities and 153 business associates, involved computer systems and networks, desktop computers, laptops, paper, e-mail, electronic health records, and removable/portable devices (CDs, USBs, x-ray films, backup tapes, etc.). Even with the increased use of health information technology by health institutions and allied businesses, theft and loss (not hacking) constitute the major types of data breaches encountered. Removable/portable devices, desktop computers, and laptops were the top sources or locations of the breached information, while the top six states—Virginia, Illinois, California, Florida, New York, and Tennessee—in terms of the number of reported breaches accounted for nearly 75 percent of the total individual breaches, 33 percent of breaches in covered entities, and about 30 percent of the total breaches involving business associates. PMID:25593574

  11. GPU-based Green's function simulations of shear waves generated by an applied acoustic radiation force in elastic and viscoelastic models.

    PubMed

    Yang, Yiqun; Urban, Matthew W; McGough, Robert J

    2018-05-15

    Shear wave calculations induced by an acoustic radiation force are very time-consuming on desktop computers, and high-performance graphics processing units (GPUs) achieve dramatic reductions in the computation time for these simulations. The acoustic radiation force is calculated using the fast near field method and the angular spectrum approach, and then the shear waves are calculated in parallel with Green's functions on a GPU. This combination enables rapid evaluation of shear waves for push beams with different spatial samplings and for apertures with different f/#. Relative to shear wave simulations that evaluate the same algorithm on an Intel i7 desktop computer, a high performance nVidia GPU reduces the time required for these calculations by factors of 45 and 700 when applied to elastic and viscoelastic shear wave simulation models, respectively. These GPU-accelerated simulations were also compared to measurements in different viscoelastic phantoms, and the results are similar. For parametric evaluations and for comparisons with measured shear wave data, shear wave simulations with the Green's function approach are ideally suited for high-performance GPUs.
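    The superposition structure described in this abstract (each observation point and time sample sums independent Green's-function contributions from every radiation-force sample) is what maps so naturally onto one GPU thread per output sample. A toy sketch of that structure, with a hypothetical Gaussian pulse standing in for the actual elastodynamic Green's function and all parameter values illustrative rather than taken from the paper:

```python
import numpy as np

def shear_wave_at(obs, t, sources, forces, cs=3.0, width=0.01):
    """Toy Green's-function superposition (a simplification, not the
    paper's FNM/angular-spectrum pipeline): each radiation-force sample
    launches a delayed, 1/r-attenuated pulse traveling at shear speed
    cs [m/s]; contributions from all sources sum independently."""
    u = 0.0
    for src, f in zip(sources, forces):
        r = np.linalg.norm(obs - src)        # source-observer distance
        delay = r / cs                       # shear-wave travel time
        # Gaussian pulse stands in for the temporal Green's function
        u += f / max(r, 1e-6) * np.exp(-((t - delay) ** 2) / (2 * width ** 2))
    return u
```

Because every (observer, time) evaluation is independent, the loop parallelizes trivially, which is the property the GPU implementation exploits.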

  12. Venus Quadrangle Geological Mapping: Use of Geoscience Data Visualization Systems in Mapping and Training

    NASA Technical Reports Server (NTRS)

    Head, James W.; Huffman, J. N.; Forsberg, A. S.; Hurwitz, D. M.; Basilevsky, A. T.; Ivanov, M. A.; Dickson, J. L.; Kumar, P. Senthil

    2008-01-01

    We are currently investigating new technological developments in computer visualization and analysis in order to assess their importance and utility in planetary geological analysis and mapping [1,2]. Last year we reported on the range of technologies available and on our application of these to various problems in planetary mapping [3]. In this contribution we focus on the application of these techniques and tools to Venus geological mapping at the 1:5M quadrangle scale. In our current Venus mapping projects we have utilized and tested the various platforms to understand their capabilities and assess their usefulness in defining units, establishing stratigraphic relationships, mapping structures, reaching consensus on interpretations and producing map products. We are specifically assessing how computer visualization display qualities (e.g., level of immersion, stereoscopic vs. monoscopic viewing, field of view, large vs. small display size, etc.) influence performance on scientific analysis and geological mapping. We have been exploring four different environments: 1) conventional desktops (DT), 2) semi-immersive Fishtank VR (FT) (i.e., a conventional desktop with head-tracked stereo and 6DOF input), 3) tiled wall displays (TW), and 4) fully immersive virtual reality (IVR) (e.g., "Cave Automatic Virtual Environment," or Cave system). Formal studies demonstrate that fully immersive Cave environments are superior to desktop systems for many tasks [e.g., 4].

  13. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    NASA Astrophysics Data System (ADS)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited by the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced.
This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
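    The FLOPS metric questioned in this abstract can be estimated with a crude micro-benchmark. The sketch below times a dense matrix multiply (~2*n**3 floating-point operations); the sizes and the matmul choice are illustrative, and, as the dissertation argues, such a single number ignores RAM, disk, and network behavior:

```python
import time
import numpy as np

def measure_gflops(n=512, repeats=5):
    """Rough FLOPS-style micro-benchmark: best-of-`repeats` timing of an
    n x n matrix multiply, reported in GFLOPS. This captures peak
    floating-point throughput only, not memory or I/O behavior."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        a @ b                                   # ~2*n**3 flops
        best = min(best, time.perf_counter() - t0)
    return 2.0 * n**3 / best / 1e9
```

Running the same benchmark while varying problem size until the working set exceeds cache is one simple way to see the memory effects the dissertation says FLOPS alone misses.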

  14. A Parallel Finite Set Statistical Simulator for Multi-Target Detection and Tracking

    NASA Astrophysics Data System (ADS)

    Hussein, I.; MacMillan, R.

    2014-09-01

    Finite Set Statistics (FISST) is a powerful Bayesian inference tool for the joint detection, classification and tracking of multi-target environments. FISST is capable of handling phenomena such as clutter, misdetections, and target birth and decay. Implicit within the approach are solutions to the data association and target label-tracking problems. Finally, FISST provides generalized information measures that can be used for sensor allocation across different types of tasks such as: searching for new targets, and classification and tracking of known targets. These FISST capabilities have been demonstrated on several small-scale illustrative examples. However, for implementation in a large-scale system as in the Space Situational Awareness problem, these capabilities demand substantial computational power. In this paper, we implement FISST in a parallel environment for the joint detection and tracking of multi-target systems. In this implementation, false alarms and misdetections are modeled; target birth and decay are not modeled in the present paper. We demonstrate the success of the method for as many targets as a desktop parallel environment can accommodate. Performance measures include: number of targets in the simulation, certainty of detected target tracks, and computational time as a function of clutter returns and number of targets, among other factors.

  15. Artificial neural network-aided image analysis system for cell counting.

    PubMed

    Sjöström, P J; Frydel, B R; Wahlberg, L U

    1999-05-01

    In histological preparations containing debris and synthetic materials, it is difficult to automate cell counting using standard image analysis tools, i.e., systems that rely on boundary contours, histogram thresholding, etc. In an attempt to mimic manual cell recognition, an automated cell counter was constructed using a combination of artificial intelligence and standard image analysis methods. Artificial neural network (ANN) methods were applied on digitized microscopy fields without pre-ANN feature extraction. A three-layer feed-forward network with extensive weight sharing in the first hidden layer was employed and trained on 1,830 examples using the error back-propagation algorithm on a Power Macintosh 7300/180 desktop computer. The optimal number of hidden neurons was determined and the trained system was validated by comparison with blinded human counts. System performance at 50x and 100x magnification was evaluated. The correlation index at 100x magnification neared person-to-person variability, while 50x magnification was not useful. The system was approximately six times faster than an experienced human. ANN-based automated cell counting in noisy histological preparations is feasible. Consistent histology and computer power are crucial for system performance. The system provides several benefits, such as speed of analysis and consistency, and frees up personnel for other tasks.
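    The "extensive weight sharing in the first hidden layer" described above amounts to applying one small kernel across the whole digitized field, i.e., a valid-mode convolution followed by a nonlinearity. A minimal sketch of such a shared-weight layer (the kernel size and sigmoid activation are assumptions for illustration, not details from the paper):

```python
import numpy as np

def shared_weight_layer(image, kernel, bias):
    """First hidden layer with weight sharing: every hidden neuron
    applies the SAME small kernel to its local patch of the image
    (a valid-mode 2-D convolution), then a sigmoid activation."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # one shared weight set for all positions
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel) + bias
    return 1.0 / (1.0 + np.exp(-out))  # sigmoid activation
```

Weight sharing drastically reduces the number of free parameters, which matters when training on only 1,830 examples on 1990s desktop hardware.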

  16. Review of Collaborative Tools for Planning and Engineering

    DTIC Science & Technology

    2007-10-01

    including PDAs) and Operating Systems 1 In general, should support laptops, desktops, Windows OS, Mac OS, Palm OS, Windows CE, Blackberry , Sun...better), voting (to establish operating parameters), reactor design, wind tunnel simulation Display same material on every computer, synchronisation

  17. Hyped Type: An Exercise in Creative Typography.

    ERIC Educational Resources Information Center

    Osterer, Irv

    2001-01-01

    Provides a history of typography and discusses the effects of technology. Describes an art project in which high school students designed contemporary typographic specimen sheets. Explains that the students created their own broadsheets using Macintosh computers and QuarkXPress desktop publishing. (CMK)

  18. Report: EPA Needs to Improve Management Practices to Ensure a Successful Customer Technology Solutions Project

    EPA Pesticide Factsheets

    Report #10-P-0194, August 23, 2010. Although EPA indicated it could avoid spending more than $115.4 million over 8.5 years by consolidating the desktop computing environment, improved management practices are needed.

  19. Connecting to HPC VPN | High-Performance Computing | NREL

    Science.gov Websites

    and password will match your NREL network account login/password. From OS X or Linux, open a terminal. Open a Remote Desktop connection using server name WINHPC02 (this is the login node).

  20. Infographic Strategies: Publications' Staffs, Assisted by Desktop Publishing, Tell Stories in Visuals Second to None.

    ERIC Educational Resources Information Center

    Jordan, Jim

    1988-01-01

    Summarizes how infographics are produced and how they provide information graphically in high school publications. Offers suggestions concerning information gathering, graphic format, and software selection, and provides examples of computer/student designed infographics. (MM)

  1. Designing Communication and Learning Environments.

    ERIC Educational Resources Information Center

    Gayeski, Diane M., Ed.

    Designing and remodeling educational facilities are becoming more complex with options that include computer-based collaboration, classrooms with multimedia podiums, conference centers, and workplaces with desktop communication systems. This book provides a collection of articles that address educational facility design categorized in the…

  2. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications

    PubMed Central

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-01-01

    Background Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. Results The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. Conclusion The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. 
Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools. PMID:18021453

  3. The Firegoose: two-way integration of diverse data from different bioinformatics web resources with desktop applications.

    PubMed

    Bare, J Christopher; Shannon, Paul T; Schmid, Amy K; Baliga, Nitin S

    2007-11-19

    Information resources on the World Wide Web play an indispensable role in modern biology. But integrating data from multiple sources is often encumbered by the need to reformat data files, convert between naming systems, or perform ongoing maintenance of local copies of public databases. Opportunities for new ways of combining and re-using data are arising as a result of the increasing use of web protocols to transmit structured data. The Firegoose, an extension to the Mozilla Firefox web browser, enables data transfer between web sites and desktop tools. As a component of the Gaggle integration framework, Firegoose can also exchange data with Cytoscape, the R statistical package, Multiexperiment Viewer (MeV), and several other popular desktop software tools. Firegoose adds the capability to easily use local data to query KEGG, EMBL STRING, DAVID, and other widely-used bioinformatics web sites. Query results from these web sites can be transferred to desktop tools for further analysis with a few clicks. Firegoose acquires data from the web by screen scraping, microformats, embedded XML, or web services. We define a microformat, which allows structured information compatible with the Gaggle to be embedded in HTML documents. We demonstrate the capabilities of this software by performing an analysis of the genes activated in the microbe Halobacterium salinarum NRC-1 in response to anaerobic environments. Starting with microarray data, we explore functions of differentially expressed genes by combining data from several public web resources and construct an integrated view of the cellular processes involved. The Firegoose incorporates Mozilla Firefox into the Gaggle environment and enables interactive sharing of data between diverse web resources and desktop software tools without maintaining local copies. Additional web sites can be incorporated easily into the framework using the scripting platform of the Firefox browser. 
Performing data integration in the browser allows the excellent search and navigation capabilities of the browser to be used in combination with powerful desktop tools.

  4. Cloud Based Electronic Health Record Applications are Essential to Expeditionary Patient Care

    DTIC Science & Technology

    2017-05-01

    security46 and privacy concerns47). Privacy/Security Risks of Cloud Computing A quantitative study based on the preceding literature review...to medical IT wherever there is a Wi-Fi connection and a computing device (desktop, laptop , tablet, phone, etc.). In 2015 the DoD launched MiCare, a...Hosting Services: a Study on Students’ Acceptance,” Computers in Human Behavior, 2013. Takai, Teri. DoD CIO’s 10-Point Plan for IT Modernization

  5. Utilizing Computer and Multimedia Technology in Generating Choreography for the Advanced Dance Student at the High School Level.

    ERIC Educational Resources Information Center

    Griffin, Irma Amado

    This study describes a pilot program utilizing various multimedia computer programs on a MacQuadra 840 AV. The target group consisted of six advanced dance students who participated in the pilot program within the dance curriculum by creating a database of dance movement using video and still photography. The students combined desktop publishing,…

  6. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  7. Node Resource Manager: A Distributed Computing Software Framework Used for Solving Geophysical Problems

    NASA Astrophysics Data System (ADS)

    Lawry, B. J.; Encarnacao, A.; Hipp, J. R.; Chang, M.; Young, C. J.

    2011-12-01

    With the rapid growth of multi-core computing hardware, it is now possible for scientific researchers to run complex, computationally intensive software on affordable, in-house commodity hardware. Multi-core CPUs (Central Processing Unit) and GPUs (Graphics Processing Unit) are now commonplace in desktops and servers. Developers today have access to extremely powerful hardware that enables the execution of software that could previously only be run on expensive, massively-parallel systems. It is no longer cost-prohibitive for an institution to build a parallel computing cluster consisting of commodity multi-core servers. In recent years, our research team has developed a distributed, multi-core computing system and used it to construct global 3D earth models using seismic tomography. Traditionally, computational limitations forced certain assumptions and shortcuts in the calculation of tomographic models; however, with the recent rapid growth in computational hardware, including faster CPUs, increased RAM, and the development of multi-core computers, we are now able to perform seismic tomography, 3D ray tracing and seismic event location using distributed parallel algorithms running on commodity hardware, thereby eliminating the need for many of these shortcuts. We describe Node Resource Manager (NRM), a system we developed that leverages the capabilities of a parallel computing cluster. NRM is a software-based parallel computing management framework that works in tandem with the Java Parallel Processing Framework (JPPF, http://www.jppf.org/), a third party library that provides a flexible and innovative way to take advantage of modern multi-core hardware. NRM enables multiple applications to use and share a common set of networked computers, regardless of their hardware platform or operating system.
Using NRM, algorithms can be parallelized to run on multiple processing cores of a distributed computing cluster of servers and desktops, which results in a dramatic speedup in execution time. NRM is sufficiently generic to support applications in any domain, as long as the application is parallelizable (i.e., can be subdivided into multiple individual processing tasks). At present, NRM has been effective in decreasing the overall runtime of several algorithms: 1) the generation of a global 3D model of the compressional velocity distribution in the Earth using tomographic inversion, 2) the calculation of the model resolution matrix, model covariance matrix, and travel time uncertainty for the aforementioned velocity model, and 3) the correlation of waveforms with archival data on a massive scale for seismic event detection. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  8. Evaluating Mobile Graphics Processing Units (GPUs) for Real-Time Resource Constrained Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meredith, J; Conger, J; Liu, Y

    2005-11-11

    Modern graphics processing units (GPUs) can provide tremendous performance boosts for some applications beyond what a single CPU can accomplish, and their performance is growing at a rate faster than CPUs as well. Mobile GPUs available for laptops have the small form factor and low power requirements suitable for use in embedded processing. We evaluated several desktop and mobile GPUs and CPUs on traditional and non-traditional graphics tasks, as well as on the most time consuming pieces of a full hyperspectral imaging application. Accuracy remained high despite small differences in arithmetic operations like rounding. Performance improvements are summarized here relative to a desktop Pentium 4 CPU.

  9. Liquid Crystals, PIV and IR-Photography in Selected Technical and Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Stasiek, Jan; Jewartowski, Marcin

    2017-10-01

    Thermochromic liquid crystals (TLC), Particle Image Velocimetry (PIV), Infrared Imaging Thermography (IR) and True-Colour Digital Image Processing (TDIP) have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. These four desktop-computer-based tools have come together during the past two decades to produce a powerful experimental technique that yields information unobtainable by any other imaging procedure. The history of this technique is briefly reviewed, principal methods and tools are described, and some examples are presented. With this objective, a new experimental technique has been developed and applied to the study of heat and mass transfer and to biomedical diagnosis. Automated evaluation allows heat transfer and flow visualisation and locates areas of suspicious tissue in the human body.

  10. Magazine Development: Creative Arts Magazines Can Take on More Creativity through Staff Innovation, Desktop Publishing.

    ERIC Educational Resources Information Center

    Cutsinger, John

    1988-01-01

    Explains how a high school literary magazine staff accessed the journalism department's Apple Macintosh computers to typeset its publication. Provides examples of magazine layouts designed partially or completely by "Pagemaker" software on a Macintosh. (MM)

  11. Group Velocity Dispersion Curves from Wigner-Ville Distributions

    NASA Astrophysics Data System (ADS)

    Lloyd, Simon; Bokelmann, Goetz; Sucic, Victor

    2013-04-01

    With the widespread adoption of ambient noise tomography, and the increasing number of local earthquakes recorded worldwide due to dense seismic networks and many very dense temporary experiments, we consider it worthwhile to evaluate alternative methods to measure surface wave group velocity dispersion curves. Moreover, the increased computing power of even a simple desktop computer makes it feasible to routinely use methods other than the typically employed multiple filtering technique (MFT). To that end we perform tests with synthetic and observed seismograms using the Wigner-Ville distribution (WVD) frequency time analysis, and compare dispersion curves measured with WVD and MFT with each other. Initial results suggest WVD to be at least as good as MFT at measuring dispersion, albeit at a greater computational expense. We therefore need to investigate if, and under which circumstances, WVD yields better dispersion curves than MFT, before considering routinely applying the method. As both MFT and WVD generally work well for teleseismic events and at longer periods, we explore how well the WVD method performs at shorter periods and for local events with smaller epicentral distances. Such dispersion information could potentially be beneficial for improving velocity structure resolution within the crust.
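    For readers wanting to reproduce such a comparison, a minimal discrete Wigner-Ville implementation is sketched below; the analytic-signal construction and lag-FFT formulation are one standard discretization, not necessarily the authors' exact algorithm:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert construction."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

def wigner_ville(x):
    """Discrete Wigner-Ville distribution W[t, k] of a real 1-D signal.

    Using the analytic signal suppresses cross-terms between positive
    and negative frequencies. A pure tone at f0 cycles/sample
    concentrates at FFT bin k = 2*f0*N (the factor 2 comes from the
    symmetric lag kernel)."""
    z = analytic_signal(x)
    n_samp = len(z)
    w = np.zeros((n_samp, n_samp))
    for n in range(n_samp):
        mmax = min(n, n_samp - 1 - n)       # lags limited by the edges
        m = np.arange(-mmax, mmax + 1)
        r = z[n + m] * np.conj(z[n - m])    # instantaneous autocorrelation
        kernel = np.zeros(n_samp, dtype=complex)
        kernel[m % n_samp] = r
        w[n] = np.fft.fft(kernel).real      # FFT over lag -> freq slice
    return w
```

The O(N^2 log N) cost of the per-sample lag FFTs is the "greater computational expense" relative to MFT noted in the abstract, and also why desktop computing power makes routine use feasible.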

  12. Democratic population decisions result in robust policy-gradient learning: a parametric study with GPU simulations.

    PubMed

    Richmond, Paul; Buesing, Lars; Giugliano, Michele; Vasilaki, Eleni

    2011-05-04

    High performance computing on the Graphics Processing Unit (GPU) is an emerging field driven by the promise of high computational power at a low cost. However, GPU programming is a non-trivial task, and architectural limitations raise the question of whether investing effort in this direction may be worthwhile. In this work, we use GPU programming to simulate a two-layer network of Integrate-and-Fire neurons with varying degrees of recurrent connectivity and investigate its ability to learn a simplified navigation task using a policy-gradient learning rule stemming from Reinforcement Learning. The purpose of this paper is twofold. First, we want to support the use of GPUs in the field of Computational Neuroscience. Second, using GPU computing power, we investigate the conditions under which this architecture and learning rule perform best. Our work indicates that networks featuring strong Mexican-Hat-shaped recurrent connections in the top layer, where decision making is governed by the formation of a stable activity bump in the neural population (a "non-democratic" mechanism), achieve mediocre learning results at best. In absence of recurrent connections, where all neurons "vote" independently ("democratic") for a decision via population vector readout, the task is generally learned better and more robustly. Our study would have been extremely difficult on a desktop computer without the use of GPU programming. We present the routines developed for this purpose and show that they provide a speed improvement of 5x up to 42x versus optimised Python code. The higher speed is achieved when we exploit the parallelism of the GPU in the search of learning parameters. This suggests that efficient GPU programming can significantly reduce the time needed for simulating networks of spiking neurons, particularly when multiple parameter configurations are investigated.
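    The "democratic" population vector readout described in the abstract can be sketched in a few lines: each neuron votes with its firing rate along its preferred direction, and the decision is the angle of the resultant vector. The tuning shape and neuron count below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def population_vector_readout(rates, preferred):
    """'Democratic' decision: every neuron contributes a vector of
    length = firing rate along its preferred direction; the decision
    is the angle of the summed (resultant) vector."""
    vx = np.sum(rates * np.cos(preferred))
    vy = np.sum(rates * np.sin(preferred))
    return np.arctan2(vy, vx)
```

Because every neuron contributes independently, the readout degrades gracefully when individual rates are noisy, which is consistent with the abstract's finding that the "democratic" configuration learns more robustly.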

  13. Parallel workflow manager for non-parallel bioinformatic applications to solve large-scale biological problems on a supercomputer.

    PubMed

    Suplatov, Dmitry; Popova, Nina; Zhumatiy, Sergey; Voevodin, Vladimir; Švedas, Vytas

    2016-04-01

    Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes due to systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations which take at most several hours to analyze a common input on a modern desktop station; however, due to multiple invocations for a large number of subtasks, the full task requires significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a wise parallel implementation of resource-hungry methods as well as a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, new computer software, mpiWrapper, has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface has been implemented to exchange information between nodes. Two specialized threads - one for task management and communication, and another for subtask execution - are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. The mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source codes and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper .
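    The task-management/execution design described above can be illustrated with an in-process master/worker sketch. Here Python threads and queues stand in for MPI ranks and messages, so this is an analogy to mpiWrapper's design under stated assumptions, not its actual C++/MPI code:

```python
import queue
import threading

def run_workers(tasks, run_task, n_workers=4):
    """Master/worker sketch: independent subtasks are placed on a work
    queue; worker threads (standing in for MPI ranks) pull tasks until
    the queue is empty and report results back on a second queue."""
    todo = queue.Queue()
    done = queue.Queue()
    for t in tasks:
        todo.put(t)

    def worker():
        while True:
            try:
                t = todo.get_nowait()       # grab the next subtask
            except queue.Empty:
                return                      # no work left: exit thread
            done.put((t, run_task(t)))      # report (task, result)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()

    results = {}
    while not done.empty():
        t, r = done.get()
        results[t] = r
    return results
```

In mpiWrapper itself, `run_task` would launch an unmodified Linux executable on a compute node, and node failure is handled by re-queueing the subtask; neither is modeled in this sketch.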

  14. A Tropical Marine Microbial Natural Products Geobibliography as an Example of Desktop Exploration of Current Research Using Web Visualisation Tools

    PubMed Central

    Mukherjee, Joydeep; Llewellyn, Lyndon E; Evans-Illidge, Elizabeth A

    2008-01-01

    Microbial marine biodiscovery is a recent scientific endeavour developing at a time when information and other technologies are also undergoing great technical strides. Global visualisation of datasets is now becoming available to the world through powerful and readily available software such as Worldwind™, ArcGIS Explorer™ and Google Earth™. Overlaying custom information upon these tools is within the hands of every scientist and more and more scientific organisations are making data available that can also be integrated into these global visualisation tools. The integrated global view that these tools enable provides a powerful desktop exploration tool. Here we demonstrate the value of this approach to marine microbial biodiscovery by developing a geobibliography that incorporates citations on tropical and near-tropical marine microbial natural products research with Google Earth™ and additional ancillary global data sets. The tools and software used are all readily available and the reader is able to use and install the material described in this article. PMID:19172194

  15. A qualitative analysis of bus simulator training on transit incidents : a case study in Florida. [Summary].

    DOT National Transportation Integrated Search

    2013-01-01

    The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...

  16. Teach Your Computer to Read: Scanners and Optical Character Recognition.

    ERIC Educational Resources Information Center

    Marsden, Jim

    1993-01-01

    Desktop scanners can be used with a software technology called optical character recognition (OCR) to convert the text on virtually any paper document into an electronic form. OCR offers educators new flexibility in incorporating text into tests, lesson plans, and other materials. (MLF)

  17. Personal Computers and Laser Printers Are Becoming Popular Tools for Creating Documents on Campuses.

    ERIC Educational Resources Information Center

    DeLoughry, Thomas J.

    1987-01-01

    Desktop publishing techniques are bringing control over institutional newsletters, catalogues, brochures, and many other print materials directly to the author's office. The technology also has the potential for integrating campus information systems and saving much time and money. (MSE)

  18. CWRUnet--Case History of a Campus-Wide Fiber-to-the-Desktop Network.

    ERIC Educational Resources Information Center

    Neff, Raymond K.; Haigh, Peter J.

    1992-01-01

    This article describes the development at Case Western Reserve University of an all-fiber optic communications network linking 7,300 outlets (faculty offices, student residences, classrooms, libraries, and laboratories) with computer data, television, audio, facsimile, and image information services. (Author/DB)

  19. Centrifuge: rapid and sensitive classification of metagenomic sequences.

    PubMed

    Kim, Daehwan; Song, Li; Breitwieser, Florian P; Salzberg, Steven L

    2016-12-01

    Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. © 2016 Kim et al.; Published by Cold Spring Harbor Laboratory Press.
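
    The BWT/FM-index machinery that Centrifuge builds on can be illustrated at toy scale. The sketch below constructs a Burrows-Wheeler transform by naive rotation sorting and counts pattern occurrences with backward search; a real index replaces the naive Occ scan with compressed rank structures, which is what makes the small index sizes quoted above possible.

```python
def bwt(text):
    """Burrows-Wheeler transform by naive rotation sorting ('$' terminator)."""
    t = text + "$"
    table = sorted(t[i:] + t[:i] for i in range(len(t)))
    return "".join(row[-1] for row in table)

def fm_count(text, pattern):
    """Count occurrences of pattern in text via FM-index backward search."""
    b = bwt(text)
    # C[c]: number of symbols in the text lexicographically smaller than c
    C = {c: sum(1 for x in b if x < c) for c in set(b)}

    def occ(c, i):
        # occurrences of c in b[:i]; real indexes answer this in O(1)
        # with rank structures instead of rescanning
        return b[:i].count(c)

    lo, hi = 0, len(b)
    for c in reversed(pattern):
        if c not in C:
            return 0
        lo = C[c] + occ(c, lo)
        hi = C[c] + occ(c, hi)
        if lo >= hi:
            return 0
    return hi - lo
```

    Backward search narrows a suffix-array interval one pattern character at a time, so query cost depends on pattern length rather than text length.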

  20. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.

  1. Illuminator, a desktop program for mutation detection using short-read clonal sequencing.

    PubMed

    Carr, Ian M; Morgan, Joanne E; Diggle, Christine P; Sheridan, Eamonn; Markham, Alexander F; Logan, Clare V; Inglehearn, Chris F; Taylor, Graham R; Bonthron, David T

    2011-10-01

    Current methods for sequencing clonal populations of DNA molecules yield several gigabases of data per day, typically comprising reads of < 100 nt. Such datasets permit widespread genome resequencing and transcriptome analysis or other quantitative tasks. However, this huge capacity can also be harnessed for the resequencing of smaller (gene-sized) target regions, through the simultaneous parallel analysis of multiple subjects, using sample "tagging" or "indexing". These methods promise to have a huge impact on diagnostic mutation analysis and candidate gene testing. Here we describe a software package developed for such studies, offering the ability to resolve pooled samples carrying barcode tags and to align reads to a reference sequence using a mutation-tolerant process. The program, Illuminator, can identify rare sequence variants, including insertions and deletions, and permits interactive data analysis on standard desktop computers. It facilitates the effective analysis of targeted clonal sequencer data without dedicated computational infrastructure or specialized training. Copyright © 2011 Elsevier Inc. All rights reserved.
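
    The barcode-based pooling that Illuminator resolves can be illustrated with a short demultiplexing sketch. The matching rule below (leading barcode, up to one substitution) is a hypothetical convention chosen for illustration, not Illuminator's actual algorithm.

```python
def demultiplex(reads, barcodes, max_mismatch=1):
    """Assign each read to a sample pool by its leading barcode tag,
    tolerating up to max_mismatch substitutions; reads matching no
    barcode closely enough are returned separately."""
    def mismatches(a, b):
        return sum(x != y for x, y in zip(a, b))

    pools = {sample: [] for sample in barcodes}
    unassigned = []
    for read in reads:
        # best (distance, sample) over all barcodes
        d, sample = min((mismatches(read[:len(bc)], bc), s)
                        for s, bc in barcodes.items())
        if d <= max_mismatch:
            # strip the barcode before pooling the read
            pools[sample].append(read[len(barcodes[sample]):])
        else:
            unassigned.append(read)
    return pools, unassigned
```

    Mismatch tolerance in the barcode (like mutation tolerance in the alignment step) keeps sequencing errors in the tag from discarding otherwise usable reads.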

  2. Propagation Environment Assessment Using UAV Electromagnetic Sensors

    DTIC Science & Technology

    2018-03-01

    could be added, we limit this study to two dimensions.) The computer program then processes the data and determines the existence of any atmospheric... computer to have large processing capacity, and a typical workstation desktop or laptop can perform the function. E. FLIGHT PATTERNS AND DATA... different types of flight patterns were studied, and our findings show that the vertical flight pattern using a rotary platform is more efficient

  3. A System Engineering Study and Concept Development for a Humanitarian Aid and Disaster Relief Operations Management Platform

    DTIC Science & Technology

    2016-09-01

    and network. The computing and network hardware are identified and include routers, servers, firewalls, laptops, backup hard drives, smart phones... deployable hardware units will be necessary. This includes the use of ruggedized laptops and desktop computers, a projector system, communications system... ENGINEERING STUDY AND CONCEPT DEVELOPMENT FOR A HUMANITARIAN AID AND DISASTER RELIEF OPERATIONS MANAGEMENT PLATFORM by Julie A. Reed September

  4. The Virtual Desktop: Options and Challenges in Selecting a Secure Desktop Infrastructure Based on Virtualization

    DTIC Science & Technology

    2011-10-01

    Fortunately, some products offer centralized management and deployment tools for local desktop implementation. Figure 5 illustrates the... implementation of a secure desktop infrastructure based on virtualization. It includes an overview of desktop virtualization, including an in-depth... environment in the data centre, whereas LHVD places it on the endpoint itself. Desktop virtualization implementation considerations and potential

  5. Automatic mobile device synchronization and remote control system for high-performance medical applications.

    PubMed

    Constantinescu, L; Kim, J; Chan, C; Feng, D

    2007-01-01

    The field of telemedicine needs generic solutions that harness the power of small, easily carried computing devices to increase efficiency and decrease the likelihood of medical errors. Our study set out to build a framework to bridge the gap between handheld and desktop solutions by developing an automated network protocol that wirelessly propagates application data and images prepared by a powerful workstation to handheld clients for storage, display, and collaborative manipulation. To this end, we present the Mobile Active Medical Protocol (MAMP), a framework capable of linking medical workstation solutions to corresponding control interfaces on handheld devices, for remote storage, control, and display, with minimal effort. The ease of use, encapsulation, and applicability of this automated solution are designed to bring significant benefits to the rapid development of telemedical solutions. Our results demonstrate that the design of this system allows an acceptable data transfer rate, a usable frame rate for diagnostic solutions, and enough flexibility to enable its use in a wide variety of cases. We also present a large-scale multi-modality image viewer as an example application based on MAMP.

  6. Digital LED Pixels: Instructions for use and a characterization of their properties.

    PubMed

    Jones, Pete R; Garcia, Sara E; Nardini, Marko

    2016-12-01

    This article details how to control light emitting diodes (LEDs) using an ordinary desktop computer. By combining digitally addressable LEDs with an off-the-shelf microcontroller (Arduino), multiple LEDs can be controlled independently and with a high degree of temporal, chromatic, and luminance precision. The proposed solution is safe (can be powered by a 5-V battery), tested (has been used in published research), inexpensive (∼ $60 + $2 per LED), highly interoperable (can be controlled by any type of computer/operating system via a USB or Bluetooth connection), requires no prior knowledge of electrical engineering (components simply require plugging together), and uses widely available components for which established help forums already exist. MATLAB code is provided, including a 'minimal working example' suitable for beginners. Properties of the recommended LEDs are also characterized, including their response time, luminance profile, and color gamut. Based on these, it is shown that the LEDs are highly stable in terms of both luminance and chromaticity, and do not suffer from the warm-up, chromatic shift, and slow response-time issues associated with traditional CRT and LCD monitor technology.

  7. Development of a Wireless Computer Vision Instrument to Detect Biotic Stress in Wheat

    PubMed Central

    Casanova, Joaquin J.; O'Shaughnessy, Susan A.; Evett, Steven R.; Rush, Charles M.

    2014-01-01

    Knowledge of crop abiotic and biotic stress is important for optimal irrigation management. While spectral reflectance and infrared thermometry provide a means to quantify crop stress remotely, these measurements can be cumbersome. Computer vision offers an inexpensive way to remotely detect crop stress independent of vegetation cover. This paper presents a technique using computer vision to detect disease stress in wheat. Digital images of differentially stressed wheat were segmented into soil and vegetation pixels using expectation maximization (EM). In the first season, the algorithm to segment vegetation from soil and distinguish between healthy and stressed wheat was developed and tested using digital images taken in the field and later processed on a desktop computer. In the second season, a wireless camera with near real-time computer vision capabilities was tested in conjunction with the conventional camera and desktop computer. For wheat irrigated at different levels and inoculated with wheat streak mosaic virus (WSMV), vegetation hue determined by the EM algorithm showed significant effects from irrigation level and infection. Unstressed wheat had a higher hue (118.32) than stressed wheat (111.34). In the second season, the hue and cover measured by the wireless computer vision sensor showed significant effects from infection (p = 0.0014), as did the conventional camera (p < 0.0001). Vegetation hue obtained through a wireless computer vision system in this study is a viable option for determining biotic crop stress in irrigation scheduling. Such a low-cost system could be suitable for use in the field in automated irrigation scheduling applications. PMID:25251410
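
    The hue-based vegetation measure can be sketched simply. The green-dominance mask below is a crude hypothetical stand-in for the paper's EM soil/vegetation segmentation; only the hue computation itself (degrees on the color wheel, where pure green is 120) follows standard practice.

```python
import colorsys

def mean_vegetation_hue(pixels):
    """Mean hue in degrees over vegetation pixels, taking 'vegetation'
    to be pixels where green dominates red and blue (a simple stand-in
    for the EM segmentation used in the study)."""
    hues = []
    for r, g, b in pixels:
        if g > r and g > b:  # crude vegetation mask
            h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            hues.append(h * 360)  # hue in degrees; pure green = 120
    return sum(hues) / len(hues) if hues else None
```

    A drop in mean vegetation hue, as between the healthy (118.32) and stressed (111.34) values reported above, indicates a shift away from green toward yellow.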

  8. Screencasts

    ERIC Educational Resources Information Center

    Yee, Kevin; Hargis, Jace

    2010-01-01

    This article discusses the benefits of screencasts and its instructional uses. Well-known for some years to advanced technology users, Screen Capture Software (SCS) offers the promise of recording action on the computer desktop together with voiceover narration, all combined into a single movie file that can be shared, emailed, or uploaded.…

  9. The Technological Evolution in Schools: Reflections and Projections.

    ERIC Educational Resources Information Center

    Higgins, James E.

    1991-01-01

    Presents a first-person account of one teacher's experiences with computer hardware and software. The article discusses various programs and applications, such as integrated learning systems, database searching via CD-ROM, desktop publishing, authoring programs, and indicates future changes in instruction with increasing use of technology. (SM)

  10. Incorporating a Human-Computer Interaction Course into Software Development Curriculums

    ERIC Educational Resources Information Center

    Janicki, Thomas N.; Cummings, Jeffrey; Healy, R. Joseph

    2015-01-01

    Individuals have increasing options on retrieving information related to hardware and software. Specific hardware devices include desktops, tablets and smart devices. Also, the number of software applications has significantly increased the user's capability to access data. Software applications include the traditional web site, smart device…

  11. Massive Query Resolution for Rapid Selective Dissemination of Information.

    ERIC Educational Resources Information Center

    Cohen, Jonathan D.

    1999-01-01

    Outlines an efficient approach to performing query resolution which, when matched with a keyword scanner, offers rapid selecting and routing for massive Boolean queries, and which is suitable for implementation on a desktop computer. Demonstrates the system's operation with large examples in a practical setting. (AEF)

  12. Medical Recording Tools for Biodosimetry in Radiation Incidents

    DTIC Science & Technology

    2005-01-01

    assistant) devices. To accomplish this, the second edition of the AFRRI handbook will be redesigned, using Adobe FrameMaker desktop publishing... Handbook will be redesigned for display on hand-held computer devices, using Adobe FrameMaker desktop publishing software. Portions of the text

  13. Controlling Robots with Personal Computers.

    ERIC Educational Resources Information Center

    Singer, Andrew; Rony, Peter

    1983-01-01

    Discusses new robots that are mechanical arms small enough to sit on a desktop. They offer scaled-down price and performance, but are able to handle light production tasks such as spray painting or part orientation. (Available from W. C. Publications Inc., P.O. Box 1578, Montclair, NJ 07042.) (JOW)

  14. GPU-based Green’s function simulations of shear waves generated by an applied acoustic radiation force in elastic and viscoelastic models

    NASA Astrophysics Data System (ADS)

    Yang, Yiqun; Urban, Matthew W.; McGough, Robert J.

    2018-05-01

    Shear wave calculations induced by an acoustic radiation force are very time-consuming on desktop computers, and high-performance graphics processing units (GPUs) achieve dramatic reductions in the computation time for these simulations. The acoustic radiation force is calculated using the fast near field method and the angular spectrum approach, and then the shear waves are calculated in parallel with Green’s functions on a GPU. This combination enables rapid evaluation of shear waves for push beams with different spatial samplings and for apertures with different f/#. Relative to shear wave simulations that evaluate the same algorithm on an Intel i7 desktop computer, a high-performance nVidia GPU reduces the time required for these calculations by factors of 45 and 700 when applied to elastic and viscoelastic shear wave simulation models, respectively. These GPU-accelerated simulations were also compared to measurements in different viscoelastic phantoms, and the results are similar. For parametric evaluations and for comparisons with measured shear wave data, shear wave simulations with the Green’s function approach are ideally suited for high-performance GPUs.

  15. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets are characterized by large volume, a high degree of spatiotemporal complexity, and rapid evolution over time. Because visualizing large distributed climate datasets is computationally intensive, traditional desktop-based visualization applications cannot handle the load. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver the results to clients over the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our platform is built on ParaView, one of the most popular open-source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we employ cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three data access methods are supported: accessing remote datasets served by OPeNDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the access method, all visualization tasks are completed on the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework addresses the computational limitations of desktop-based visualization applications.

  16. A Toolkit for ARB to Integrate Custom Databases and Externally Built Phylogenies

    DOE PAGES

    Essinger, Steven D.; Reichenberger, Erin; Morrison, Calvin; ...

    2015-01-21

    Researchers are perpetually amassing biological sequence data. The computational approaches employed by ecologists for organizing this data (e.g. alignment, phylogeny, etc.) typically scale nonlinearly in execution time with the size of the dataset. This often serves as a bottleneck for processing experimental data since many molecular studies are characterized by massive datasets. To keep up with experimental data demands, ecologists are forced to choose between continually upgrading expensive in-house computer hardware or outsourcing the most demanding computations to the cloud. Outsourcing is attractive since it is the least expensive option, but does not necessarily allow direct user interaction with the data for exploratory analysis. Desktop analytical tools such as ARB are indispensable for this purpose, but they do not necessarily offer a convenient solution for the coordination and integration of datasets between local and outsourced destinations. Therefore, researchers are currently left with an undesirable tradeoff between computational throughput and analytical capability. To mitigate this tradeoff we introduce a software package to leverage the utility of the interactive exploratory tools offered by ARB with the computational throughput of cloud-based resources. Our pipeline serves as middleware between the desktop and the cloud allowing researchers to form local custom databases containing sequences and metadata from multiple resources and a method for linking data outsourced for computation back to the local database. Furthermore, a tutorial implementation of the toolkit is provided in the supporting information, S1 Tutorial.

  17. A Toolkit for ARB to Integrate Custom Databases and Externally Built Phylogenies

    PubMed Central

    Essinger, Steven D.; Reichenberger, Erin; Morrison, Calvin; Blackwood, Christopher B.; Rosen, Gail L.

    2015-01-01

    Researchers are perpetually amassing biological sequence data. The computational approaches employed by ecologists for organizing this data (e.g. alignment, phylogeny, etc.) typically scale nonlinearly in execution time with the size of the dataset. This often serves as a bottleneck for processing experimental data since many molecular studies are characterized by massive datasets. To keep up with experimental data demands, ecologists are forced to choose between continually upgrading expensive in-house computer hardware or outsourcing the most demanding computations to the cloud. Outsourcing is attractive since it is the least expensive option, but does not necessarily allow direct user interaction with the data for exploratory analysis. Desktop analytical tools such as ARB are indispensable for this purpose, but they do not necessarily offer a convenient solution for the coordination and integration of datasets between local and outsourced destinations. Therefore, researchers are currently left with an undesirable tradeoff between computational throughput and analytical capability. To mitigate this tradeoff we introduce a software package to leverage the utility of the interactive exploratory tools offered by ARB with the computational throughput of cloud-based resources. Our pipeline serves as middleware between the desktop and the cloud allowing researchers to form local custom databases containing sequences and metadata from multiple resources and a method for linking data outsourced for computation back to the local database. A tutorial implementation of the toolkit is provided in the supporting information, S1 Tutorial. Availability: http://www.ece.drexel.edu/gailr/EESI/tutorial.php. PMID:25607539

  18. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
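
    A minimal example in the spirit of the student-written simulators described above: Grover's search on a few qubits, with the state vector held in a plain Python list. The oracle and inversion-about-the-mean diffusion are the standard textbook steps; this is an illustrative sketch, not the article's own code.

```python
import math

def grover(n_qubits, marked):
    """Simulate Grover's search: the state vector is a plain list of
    amplitudes, the oracle is a sign flip on the marked state, and the
    diffusion operator is inversion about the mean amplitude.
    Returns the measurement probability of each basis state."""
    N = 2 ** n_qubits
    state = [1 / math.sqrt(N)] * N                 # uniform superposition
    for _ in range(int(math.pi / 4 * math.sqrt(N))):  # ~(pi/4)*sqrt(N) steps
        state[marked] = -state[marked]             # oracle
        mean = sum(state) / N
        state = [2 * mean - a for a in state]      # diffusion
    return [a * a for a in state]
```

    On three qubits, two Grover iterations concentrate over 94% of the probability on the marked state, matching the (π/4)√N scaling discussed in such courses.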

  19. Computing at DESY — current setup, trends and strategic directions

    NASA Astrophysics Data System (ADS)

    Ernst, Michael

    1998-05-01

    Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. After running mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years, in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to move ever closer to the technology edge. While DESY's central computing has multi-decade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware is drawing more and more attention, combined with the financial constraints we already face today, we quickly need concepts for the integrated support of a versatile device that has the potential to move into basically any computing area in HEP. Though commercial solutions, especially those addressing PC management and support issues, are expected to come to market in the next 2-3 years, we need to provide suitable solutions now. Buying PCs at DESY, currently at a rate of about 30 per month, will otherwise absorb any available manpower in central computing and still leave hundreds of people unhappy. Though certainly not the only area, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration - meaning sharing experience and development resources within the HEP community - is a predominant factor for us.

  20. AstroGrid: Taverna in the Virtual Observatory .

    NASA Astrophysics Data System (ADS)

    Benson, K. M.; Walton, N. A.

    This paper reports on AstroGrid's implementation of the Taverna workbench, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analysis over public and private data and can run on computational resources as minimal as a desktop computer. Some integration issues and future work are discussed in this article.

  1. Desktop Publishing: New Right Brain Documents.

    ERIC Educational Resources Information Center

    Williams, James B.; Murr, Lawrence E.

    1987-01-01

    Supporting evidence from both neurological research in brain hemisphere functions and comparisons of the use of symbols in Eastern and Western cultures are used to advance the position that the capability of graphics software for microcomputers to combine textual and visual elements makes them a powerful and revolutionary communications tool. (CLB)

  2. User's guide to noise data acquisition and analysis programs for HP9845: Nicolet analyzers

    NASA Technical Reports Server (NTRS)

    Mcgary, M. C.

    1982-01-01

    A software interface package was written for use with a desktop computer and two models of single-channel Fast Fourier analyzers. The software provides a portable measurement and analysis system with several options. Two types of interface hardware can be used with the software: either an IEEE-488 bus interface or a 16-bit parallel system. Two types of storage medium, either tape cartridge or floppy disc, can be used. Five types of data may be stored, plotted, and/or printed: time histories, narrow-band power spectra, and narrow-band, one-third-octave-band, or octave-band sound pressure levels. The data acquisition programming includes a front-panel remote control option for the FFT analyzers. Data analysis options include choice of line type and pen color for plotting.
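
    Aggregating a narrow-band spectrum into octave-band sound pressure levels, one of the data types listed above, can be sketched as follows. The band centers and the edge convention (fc/√2 to fc·√2) are illustrative assumptions, not the HP9845 package's actual definitions.

```python
import math

def octave_band_levels(freqs, levels_db, centers=(125, 250, 500, 1000)):
    """Sum narrow-band SPL lines into octave bands by energy addition.
    Band edges are taken at fc/sqrt(2) .. fc*sqrt(2); bands with no
    spectral lines report -inf."""
    bands = {}
    for fc in centers:
        lo, hi = fc / math.sqrt(2), fc * math.sqrt(2)
        # add levels on an energy (10^(L/10)) basis, not in dB directly
        energy = sum(10 ** (L / 10)
                     for f, L in zip(freqs, levels_db) if lo <= f < hi)
        bands[fc] = 10 * math.log10(energy) if energy > 0 else float("-inf")
    return bands
```

    Energy addition is why two equal 60 dB lines in the same band combine to about 63 dB rather than 120 dB.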

  3. Metallurgical Plant Optimization Through the use of Flowsheet Simulation Modelling

    NASA Astrophysics Data System (ADS)

    Kennedy, Mark William

    Modern metallurgical plants typically have complex flowsheets and operate on a continuous basis. Real-time interactions within such processes can be complex, and the impact of streams such as recycles on process efficiency and stability can be difficult to anticipate before actual operation. Current desktop computing power, combined with state-of-the-art flowsheet simulation software such as Metsim, allows thorough analysis of designs to explore the interaction between operating rate, heat and mass balances, and, in particular, the potential negative impact of recycles. Using plant information systems, it is possible to combine real plant data with simple steady-state models, using dynamic data exchange links, to allow near real-time de-bottlenecking of operations. Accurate analytical results can also be combined with detailed unit-operations models to allow feed-forward model-based control. This paper explores examples of the application of Metsim to real-world engineering and plant operational issues.
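
    The recycle-stream behavior discussed above can be illustrated with a toy steady-state mass balance solved by successive substitution, the standard tear-stream iteration in sequential-modular flowsheet simulators. The single-component mixer/splitter below is a hypothetical example, not a Metsim model.

```python
def solve_recycle(feed, split_to_recycle, tol=1e-10):
    """Steady-state mass balance around a mixer + separator with a
    recycle stream, solved by successive substitution on the torn
    recycle stream:
        unit_in = feed + recycle
        recycle = split_to_recycle * unit_in
        product = (1 - split_to_recycle) * unit_in
    """
    recycle = 0.0
    while True:
        unit_in = feed + recycle                    # mixer balance
        new_recycle = split_to_recycle * unit_in    # separator balance
        if abs(new_recycle - recycle) < tol:        # tear stream converged
            break
        recycle = new_recycle
    return unit_in, recycle, (1 - split_to_recycle) * unit_in
```

    At convergence the product equals the fresh feed, as the overall balance requires, while the recycle inflates the internal unit load by a factor of 1/(1 - split); this load amplification is exactly the kind of unexpected recycle impact the abstract warns about.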

  4. When Everyone Is a Probe, Everyone Is a Learner

    ERIC Educational Resources Information Center

    Berenfeld, Boris; Krupa, Tatiana; Lebedev, Arseny; Stafeev, Sergey

    2014-01-01

    Most students globally have mobile devices and the Global Students Laboratory (GlobalLab) project is integrating mobility into learning. First launched in 1991, GlobalLab builds a community of learners engaged in collaborative, distributed investigations. Long relying on stationary desktop computers, or students inputting their observations by…

  5. Software Reviews: Programs Worth a Second Look.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1989

    1989-01-01

    Reviews three computer software programs: (1) "The Children's Writing and Publishing Center"--writing and creative arts, grades 2-8, Apple II; (2) "Slide Shop"--graphics and desktop presentations, grades 4-12, Apple II and IBM; and (3) "Solve It"--problem solving and language arts, grades 4-12, Apple II. (MVL)

  6. Managing Information Technology in Academic Medical Centers: A "Multicultural" Experience.

    ERIC Educational Resources Information Center

    Friedman, Charles P.; Corn, Milton; Krumrey, Arthur; Perry, David R.; Stevens, Ronald H.

    1998-01-01

    Examines how beliefs and concerns of academic medicine's diverse professional cultures affect management of information technology. Two scenarios, one dealing with standardization of desktop personal computers and the other with publication of syllabi on an institutional intranet, form the basis for an exercise in which four prototypical members…

  7. The Library Macintosh at SCIL [Small Computers in Libraries]'88.

    ERIC Educational Resources Information Center

    Valauskas, Edward J.; And Others

    1988-01-01

    The first of three papers describes the role of Macintosh workstations in a library. The second paper explains why the Macintosh was selected for end-user searching in an academic library, and the third discusses advantages and disadvantages of desktop publishing for librarians. (8 references) (MES)

  8. Ideas without Words--Internationalizing Business Presentations.

    ERIC Educational Resources Information Center

    Sondak, Norman; Sondak, Eileen

    This paper presents elements of the computer graphics environment including information on: Lotus 1-2-3; Apple Macintosh; Desktop Publishing; Object-Oriented Programming; and Microsoft's Windows 3. A brief scenario illustrates the use of the minimization principle in presenting a new product to a group of international financiers. A taxonomy of…

  9. Most Social Scientists Shun Free Use of Supercomputers.

    ERIC Educational Resources Information Center

    Kiernan, Vincent

    1998-01-01

    Social scientists, who frequently complain that the federal government spends too little on them, are passing up what scholars in the physical and natural sciences see as the government's best give-aways: free access to supercomputers. Some social scientists say the supercomputers are difficult to use; others find desktop computers provide…

  10. Classrooms for the Millennials: An Approach for the Next Generation

    ERIC Educational Resources Information Center

    Gerber, Lindsey N.; Ward, Debra D.

    2016-01-01

    The purpose of this paper is to introduce educators to three types of applets that are compatible with smartphones, tablets, and desktop computers: screencasting applets, graphing calculator applets, and student response applets. The applets discussed can be seamlessly and effectively integrated into classrooms to help facilitate lectures, collect…

  11. Evaluating Technology Integration in the Elementary School: A Site-Based Approach.

    ERIC Educational Resources Information Center

    Mowe, Richard

    This book enables educators at the elementary level to conduct formative evaluations of their technology programs in minimum time. Most of the technology is computer related, including word processing, graphics, desktop publishing, spreadsheets, databases, instructional software, programming, and telecommunications. The design of the book is aimed…

  12. The Impact on Homes with a School-Provided Laptop

    ERIC Educational Resources Information Center

    Bu Shell, Shawna M.

    2012-01-01

    For decades, educational policy advocates have argued for providing technology to students to enhance their learning environments. From filmstrips (Oppenheimer, 1997) to desktop computers (Cuban, 2002) to laptops (Silvernail, 2009; Warschauer, 2006), they have attempted to change the environment and style in which students learn, and how tools can…

  13. Digital Video: Get with It!

    ERIC Educational Resources Information Center

    Van Horn, Royal

    2001-01-01

    Several years after the first audiovisual Macintosh computer appeared, most educators are still oblivious of this technology. Almost every other economic sector (including the porn industry) makes abundant use of digital and streaming video. Desktop movie production is so easy that primary grade students can do it. Tips are provided. (MLH)

  14. Colleges' Effort To Prepare for Y2K May Yield Benefits for Many Years.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2000-01-01

    Suggests that the money spent ($100 billion) to fix the Y2K bug in the United States resulted in improved campus computer systems. Reports from campuses around the country indicate that both mainframe and desktop systems experienced fewer problems than expected. (DB)

  15. Student Record Automation Using Desktop Computer Technologies.

    ERIC Educational Resources Information Center

    Almerico, Gina M.; Baker, Russell K.; Matassini, Norma

    Teacher education programs nationwide are required by state and federal governments to maintain comprehensive student records of all current and graduated students in their programs. A private, mid-sized university established a faculty team to analyze record-keeping procedures to comply with these government requirements. The team's mandate was…

  16. CUDA GPU based full-Stokes finite difference modelling of glaciers

    NASA Astrophysics Data System (ADS)

    Brædstrup, C. F.; Egholm, D. L.

    2012-04-01

    Many have stressed the limitations of using the shallow-shelf and shallow-ice approximations when modelling ice streams or surging glaciers. Using a full-Stokes approach requires either large amounts of computer power or time and is therefore seldom an option for most glaciologists. Recent advances in graphics card (GPU) technology for high-performance computing have proven extremely efficient in accelerating many large-scale scientific computations. The general-purpose GPU (GPGPU) technology is cheap, has a low power consumption, and fits into a normal desktop computer. It could therefore provide a powerful tool for many glaciologists. Our full-Stokes ice sheet model implements a red-black Gauss-Seidel iterative linear solver to solve the full Stokes equations. This technique has proven very effective when applied to the Stokes equations in geodynamics problems, and should therefore also perform well in glaciological flow problems. The Gauss-Seidel iterator is known to be robust, but several other linear solvers converge much faster. To aid convergence, the solver uses a multigrid approach in which values are interpolated and extrapolated between different grid resolutions to minimize the short-wavelength errors efficiently. This reduces the iteration count by several orders of magnitude. The run time is further reduced by using the GPGPU technology, where each card has up to 448 cores. Researchers utilizing the GPGPU technique in other areas have reported speedups of 2-11 times compared with multicore CPU implementations on similar problems. The goal of these initial investigations into the possible use of GPGPU technology in glacial modelling is to apply the enhanced resolution of a full-Stokes solver to ice streams and surging glaciers. This is an area of growing interest because ice streams are the main drainage conduits of large ice sheets. It is therefore crucial to understand this streaming behavior and its impact up-ice.
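    The red-black sweep at the heart of such a solver is easy to sketch. The fragment below applies it to a 2D Poisson problem as a stand-in for the full-Stokes system; the grid size, sweep count, and the Poisson operator itself are illustrative assumptions, not the model's actual discretisation.

```python
def red_black_gauss_seidel(u, f, h, sweeps=200):
    """In-place red-black Gauss-Seidel for the 2D Poisson problem
    (u_xx + u_yy = f) on a square grid with spacing h.

    Interior points are updated in two passes (red, then black); each
    colour depends only on the other colour, which is what makes the
    method trivially parallel on a GPU.
    """
    n = len(u)
    for _ in range(sweeps):
        for colour in (0, 1):  # 0 = red points, 1 = black points
            for i in range(1, n - 1):
                for j in range(1, n - 1):
                    if (i + j) % 2 == colour:
                        u[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j]
                                          + u[i][j - 1] + u[i][j + 1]
                                          - h * h * f[i][j])
    return u
```

    In a multigrid setting, as in the abstract, sweeps like this serve as the smoother on each grid level, with coarser grids correcting the long-wavelength error that plain Gauss-Seidel removes only slowly.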

  17. Coal-seismic computer programs in BASIC: Part I; Store, plot, and edit array data

    USGS Publications Warehouse

    Hasbrouck, Wilfred P.

    1979-01-01

    Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in an extended BASIC language specially augmented for acceptance by the Tektronix 4051 Graphic System. This report presents five computer programs used to store, plot, and edit array data for the line, cross, and triangle arrays commonly employed in our coal-seismic investigations. * Use of brand names in this report is for descriptive purposes only and does not constitute endorsement by the U.S. Geological Survey.

  18. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    PubMed Central

    Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S.; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun

    2011-01-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities. PMID:22164116
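    The null-steering principle behind such an adaptive array can be illustrated with a small projection calculation. Everything below (a 4-element uniform linear array at half-wavelength spacing, the chosen angles) is an invented toy example, not the receiver design described in the paper.

```python
import cmath
import math

def steering_vector(theta_deg, n_elements=4, spacing_wl=0.5):
    """Response of a uniform linear array to a plane wave from theta."""
    theta = math.radians(theta_deg)
    return [cmath.exp(-2j * math.pi * spacing_wl * k * math.sin(theta))
            for k in range(n_elements)]

def null_steering_weights(theta_signal_deg, theta_jammer_deg):
    """Project the desired steering vector orthogonally to the
    jammer's steering vector, placing a null toward the jammer."""
    a_s = steering_vector(theta_signal_deg)
    a_j = steering_vector(theta_jammer_deg)
    num = sum(x.conjugate() * y for x, y in zip(a_j, a_s))  # <a_j, a_s>
    den = sum(abs(x) ** 2 for x in a_j)                     # <a_j, a_j>
    return [s - (num / den) * j for s, j in zip(a_s, a_j)]

def array_gain(weights, theta_deg):
    """Magnitude of the array response |w^H a(theta)|."""
    a = steering_vector(theta_deg)
    return abs(sum(w.conjugate() * x for w, x in zip(weights, a)))
```

    With weights computed for a signal at 0 degrees and a jammer at 40 degrees, the gain toward the jammer is essentially zero while the gain toward the signal stays near the full array gain; a real receiver would recompute such weights adaptively from the sampled antenna outputs.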

  19. A real-time capable software-defined receiver using GPU for adaptive anti-jam GPS sensors.

    PubMed

    Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun

    2011-01-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities.

  20. The spinal posture of computing adolescents in a real-life setting

    PubMed Central

    2014-01-01

    Background It is assumed that good postural alignment is associated with a lower likelihood of musculoskeletal pain symptoms. Encouraging good sitting posture has not yielded consequent musculoskeletal pain reduction in school-based populations, possibly due to a lack of clear understanding of what good posture is. This paper therefore describes the variability of postural angles in a cohort of asymptomatic high-school students while working on desktop computers in a school computer classroom, and reports on the relationship between the postural angles and age, gender, height, weight and computer use. Methods The baseline data from a 12-month longitudinal study are reported. The study was conducted in South African school computer classrooms. Participants were 194 Grade 10 high-school students from randomly selected high schools, aged 15–17 years, enrolled in Computer Application Technology for the first time, asymptomatic during the preceding month, and from whom written informed consent was obtained. The 3D Posture Analysis Tool captured five postural angles (head flexion, neck flexion, cranio-cervical angle, trunk flexion and head lateral bend) while the students were working on desktop computers. Height, weight and computer use were also measured. Individual postural angles and combinations of angles were analysed. Results A total of 944 students were screened for eligibility, of whom the data of 194 students are reported. Trunk flexion was the most variable angle. Increased neck flexion, and the combination of increased head flexion, neck flexion and trunk flexion, were significantly associated with increased weight and BMI (p = 0.0001). Conclusions High-school students sit with greater ranges of trunk flexion (leaning forward or reclining) when using the classroom computer. Increased weight is significantly associated with increased sagittal-plane postural angles. PMID:24950887

  1. Evaluating computer capabilities in a primary care practice-based research network.

    PubMed

    Ariza, Adolfo J; Binns, Helen J; Christoffel, Katherine Kaufer

    2004-01-01

    We wanted to assess computer capabilities in a primary care practice-based research network and to understand how receptive the practices were to new ideas for automation of practice activities and research. This study was conducted among members of the Pediatric Practice Research Group (PPRG). A survey to assess computer capabilities was developed to explore hardware types, software programs, Internet connectivity and data transmission; views on privacy and security; and receptivity to future electronic data collection approaches. Of the 40 PPRG practices participating in the study during the autumn of 2001, all used IBM-compatible systems. Of these, 45% used stand-alone desktops, 40% had networked desktops, and approximately 15% used laptops and minicomputers. A variety of software packages were used: most practices had software for some aspect of patient care documentation (82%), patient accounting (90%), business support (60%), and management reports and analysis (97%). The main obstacles to expanding the use of computers in patient care were insufficient staff training (63%) and privacy concerns (82%). If provided with training and support, most practices indicated they were willing to consider an array of electronic data collection options for practice-based research activities. There is wide variability in hardware and software use in the pediatric practice setting. Implementing electronic data collection in the PPRG would require a substantial start-up effort and ongoing training and support at the practice site.

  2. Developing and validating an instrument for measuring mobile computing self-efficacy.

    PubMed

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.

  3. Economic analysis of cloud-based desktop virtualization implementation at a hospital

    PubMed Central

    2012-01-01

    Background Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computing environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. Methods This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR), indexes used for corporate investment decision-making, were used for the economic analysis of the VDI implementation. Results The results of a five-year cost-benefit analysis for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. Our sensitivity analysis of the number of VMs (in terms of number of users) showed that the greater the number of adopted VMs, the more attractive the investment. Conclusions This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting. PMID:23110661

  4. Economic analysis of cloud-based desktop virtualization implementation at a hospital.

    PubMed

    Yoo, Sooyoung; Kim, Seok; Kim, Taeki; Baek, Rong-Min; Suh, Chang Suk; Chung, Chin Youb; Hwang, Hee

    2012-10-30

    Cloud-based desktop virtualization infrastructure (VDI) is known for providing simplified management of applications and desktops, efficient management of physical resources, and rapid service deployment, as well as connection to the computing environment at any time, anywhere, with any device. However, the economic validity of investing in the adoption of the system at a hospital has not been established. This study computed the actual investment cost of the hospital-wide VDI implementation at the 910-bed Seoul National University Bundang Hospital in Korea and the resulting effects (i.e., reductions in PC errors and difficulties, application and operating system update time, and account management time). Return on investment (ROI), net present value (NPV), and internal rate of return (IRR), indexes used for corporate investment decision-making, were used for the economic analysis of the VDI implementation. The results of a five-year cost-benefit analysis for 400 Virtual Machines (VMs; i.e., 1,100 users in the case of SNUBH) showed that the break-even point was reached in the fourth year of the investment. At that point, the ROI was 122.6%, the NPV was approximately US$192,000, and the IRR showed an investment validity of 10.8%. Our sensitivity analysis of the number of VMs (in terms of number of users) showed that the greater the number of adopted VMs, the more attractive the investment. This study confirms that the emerging VDI can have an economic impact on hospital information system (HIS) operation and utilization in a tertiary hospital setting.
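    The ROI/NPV/IRR figures in these two records come from standard discounted-cash-flow arithmetic, which can be sketched in a few lines. The cash flows in the usage note are made-up round numbers, not the hospital's actual costs and benefits.

```python
def npv(rate, cash_flows):
    """Net present value of cash_flows[t] received at the end of year t
    (cash_flows[0] is the upfront investment, usually negative)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return: the discount rate at which NPV is zero,
    found by bisection (assumes npv changes sign between lo and hi)."""
    f_lo = npv(lo, cash_flows)
    for _ in range(200):
        mid = (lo + hi) / 2.0
        f_mid = npv(mid, cash_flows)
        if abs(f_mid) < tol:
            return mid
        if (f_lo < 0) == (f_mid < 0):
            lo, f_lo = mid, f_mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

    For example, a project costing 100 that returns 110 after one year has irr([-100.0, 110.0]) of 10%; an implementation is considered investable when its IRR exceeds the organisation's discount rate.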

  5. Technical Writing Teachers and the Challenges of Desktop Publishing.

    ERIC Educational Resources Information Center

    Kalmbach, James

    1988-01-01

    Argues that technical writing teachers must understand desktop publishing. Discusses the strengths that technical writing teachers bring to desktop publishing, and the impact desktop publishing will have on technical writing courses and programs. (ARH)

  6. High-performance image reconstruction in fluorescence tomography on desktop computers and graphics hardware.

    PubMed

    Freiberger, Manuel; Egger, Herbert; Liebmann, Manfred; Scharfetter, Hermann

    2011-11-01

    Image reconstruction in fluorescence optical tomography is a three-dimensional nonlinear ill-posed problem governed by a system of partial differential equations. In this paper we demonstrate that a combination of state-of-the-art numerical algorithms and a carefully hardware-optimized implementation makes it possible to solve this large-scale inverse problem in a few seconds on standard desktop PCs with modern graphics hardware. In particular, we present methods to solve not only the forward but also the nonlinear inverse problem by massively parallel programming on graphics processors. A comparison of optimized CPU and GPU implementations shows that the reconstruction can be accelerated by a factor of about 15 through the use of graphics hardware, without compromising the accuracy of the reconstructed images.

  7. Development of a PC-based ground support system for a small satellite instrument

    NASA Astrophysics Data System (ADS)

    Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.

    1993-11-01

    The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time from any ground location. The personal-computer-based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used in future small-satellite space programs.

  8. Automated micromanipulation desktop station based on mobile piezoelectric microrobots

    NASA Astrophysics Data System (ADS)

    Fatikow, Sergej

    1996-12-01

    One of the main problems of present-day research on microsystem technology (MST) is to assemble a whole microsystem from different microcomponents. This paper presents a new concept for an automated micromanipulation desktop station including piezoelectrically driven microrobots placed on a high-precision x-y stage of a light microscope, a CCD camera as a local sensor subsystem, a laser sensor unit as a global sensor subsystem, a parallel computer system with C167 microcontrollers, and a Pentium PC additionally equipped with an optical grabber. The microrobots can perform high-precision manipulations (with an accuracy of up to 10 nm) and nondestructive transport (at a speed of about 3 cm/sec) of very small objects under the microscope. To control the desktop station automatically, an advanced control system that includes a task-planning level and a real-time execution level is being developed. The main function of the task-planning subsystem is to interpret the implicit action plan and to generate a sequence of explicit operations which are sent to the execution level of the control system. The main functions of the execution control level are object recognition, image processing, and feedback position control of the microrobot and the microscope stage.

  9. Feedback about Astronomical Application Developments for Mobile Devices

    NASA Astrophysics Data System (ADS)

    Schaaff, A.; Boch, T.; Fernique, P.; Houpin, R.; Kaestlé, V.; Royer, M.; Scheffmann, J.; Weiler, A.

    2013-10-01

    Within a few years, smartphones have become the standard for mobile telephony, and we are now witnessing a rapid development of Internet tablets. These mobile devices have sufficiently powerful hardware to run more and more complex applications. In the field of astronomy it is not only possible to use these tools to access data via a simple browser, but also to develop native applications reusing libraries (Java for Android, Objective-C for iOS) developed for desktops. We have been working for two years on mobile application development and now have skills in native iOS and Android development, Web development (especially HTML5, JavaScript, CSS3) and conversion tools (PhoneGap) from Web development to native applications. The biggest change comes from human/computer interaction, which is radically altered by the use of multitouch. This interaction requires a redesign of interfaces to take advantage of new features (simultaneous selections in different parts of the screen, etc.). In the case of native applications, distribution is usually done through online stores (App Store, Google Play, etc.), which gives visibility to a wider audience. Our approach is not only to test hardware and develop prototypes, but also to build operational applications. Native application development is costly in development time, but the possibilities are broader because it is possible to use native hardware, such as the gyroscope and the accelerometer, to point at an object in the sky. Web development, in contrast, depends on the browser, and rendering and performance often differ considerably between browsers. It is also possible to convert Web developments to native applications, but currently it is better to restrict this option to applications that are light in terms of functionality. Developments in HTML5 are promising but are far behind those available on desktops. HTML5 has the advantage of allowing development independent of the evolution of the mobile platforms (“write once, run everywhere”). The upcoming Windows 8 support on desktops and Internet tablets, as well as a mobile version for smartphones, will further expand the native systems family. This will enhance the interest of Web development.

  10. Three Approaches to Green Computing on Campus

    ERIC Educational Resources Information Center

    Thompson, John T.

    2009-01-01

    A "carbon footprint" is the "total set of greenhouse gas emissions caused directly and indirectly by an (individual, event, organization, or product) expressed as CO2" emissions. Since CO2 emissions are indicative of energy use, the higher the associated CO2 emissions, typically the greater the associated costs. A typical desktop PC system…

  11. Cross-Platform User Interface of E-Learning Applications

    ERIC Educational Resources Information Center

    Stoces, Michal; Masner, Jan; Jarolímek, Jan; Šimek, Pavel; Vanek, Jirí; Ulman, Miloš

    2015-01-01

    The paper discusses the development of Web educational services for specific groups. A key feature is to allow the display and use of educational materials and training services to the widest possible set of different devices, especially in the browser classic desktop computers, notebooks, tablets, mobile phones and also on different readers for…

  12. Blocking of Goal-Location Learning Based on Shape

    ERIC Educational Resources Information Center

    Alexander, Tim; Wilson, Stuart P.; Wilson, Paul N.

    2009-01-01

    Using desktop, computer-simulated virtual environments (VEs), the authors conducted 5 experiments to investigate blocking of learning about a goal location based on Shape B as a consequence of preliminary training to locate that goal using Shape A. The shapes were large 2-dimensional horizontal figures on the ground. Blocking of spatial learning…

  13. High Resolution Displays In The Apple Macintosh And IBM PC Environments

    NASA Astrophysics Data System (ADS)

    Winegarden, Steven

    1989-07-01

    High resolution displays are one of the key elements that distinguish user-oriented document finishing or publishing stations. A number of factors have been involved in bringing these to the desktop environment. At Sigma Designs we have concentrated on enhancing the capabilities of IBM PCs and compatibles and Apple Macintosh computer systems.

  14. Notebooks, Handhelds, and Software in Physical Education (Grades 5-8)

    ERIC Educational Resources Information Center

    Mohnsen, Bonnie

    2005-01-01

    Heart monitors, pedometers, and now virtual reality-based equipment (e.g., Cyberbikes, "Dance Dance Revolution") have been embraced by physical educators as technologies worth using in the physical education program; however, the use of computers (be it a desktop, notebook, or handheld) in the physical education instructional program, has not been…

  15. Epilepsy Forewarning Using A Hand-Held Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hively, LM

    2005-02-21

    Over the last decade, ORNL has developed and patented a novel approach for forewarning of a large variety of machine and biomedical events. The present implementation uses desktop computers to analyze archival data. This report describes the next logical step in this effort, namely, the use of a hand-held device for the analysis.

  16. Full Immersive Virtual Environment Cave[TM] in Chemistry Education

    ERIC Educational Resources Information Center

    Limniou, M.; Roberts, D.; Papadopoulos, N.

    2008-01-01

    By comparing two-dimensional (2D) chemical animations designed for the computer desktop with three-dimensional (3D) chemical animations designed for the fully immersive virtual reality environment CAVE[TM], we studied how virtual reality environments could raise students' interest and motivation for learning. By using 3ds max[TM], we can visualize…

  17. Reading, Writing, and Documentation and Managing the Development of User Documentation.

    ERIC Educational Resources Information Center

    Lindberg, Wayne; Hoffman, Terrye

    1987-01-01

    The first of two articles addressing the issue of user documentation for computer software discusses the need to teach users how to read documentation. The second presents a guide for writing documentation that is based on the instructional systems design model, and makes suggestions for the desktop publishing of user manuals. (CLB)

  18. Learning Computer Hardware by Doing: Are Tablets Better than Desktops?

    ERIC Educational Resources Information Center

    Raven, John; Qalawee, Mohamed; Atroshi, Hanar

    2016-01-01

    In this world of rapidly evolving technologies, educational institutions often struggle to keep up with change. Change often requires a state of readiness at both the micro and macro levels. This paper looks at a tertiary institution that undertook a significant technology change initiative by introducing tablet based components for teaching a…

  19. Quantitative Assay for Starch by Colorimetry Using a Desktop Scanner

    ERIC Educational Resources Information Center

    Matthews, Kurt R.; Landmark, James D.; Stickle, Douglas F.

    2004-01-01

    The procedure for producing a standard curve for starch concentration measurement by image analysis, using a color scanner and a computer for data acquisition and color analysis, is described. Color analysis is performed by a Visual Basic program that measures red, green, and blue (RGB) color intensities for pixels within the scanner image.
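    The standard-curve step described here, fitting measured color intensity against known starch concentrations and inverting the fit for unknowns, can be sketched with ordinary least squares. The calibration numbers below are invented for illustration; the original work obtained its intensities from scanner images via a Visual Basic program.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope m and intercept b for y = m*x + b."""
    n = float(len(xs))
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

def concentration_from_intensity(intensity, m, b):
    """Invert the standard curve to estimate an unknown concentration."""
    return (intensity - b) / m
```

    Fitting hypothetical calibration points such as concentrations [0, 1, 2, 4] against intensities [200, 180, 160, 120] yields a line that can then be inverted for the measured intensity of any unknown sample.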

  20. WLANs for the 21st Century Library

    ERIC Educational Resources Information Center

    Calamari, Cal

    2009-01-01

    As educational and research needs have changed, libraries have changed as well. They must meet ever-increasing demand for access to online media, subscriptions to archives, video, audio, and other content. The way a user/patron accesses this information has also changed. Gone are the days of a few hardwired desktops or computer carts. While…

  1. Infrastructure Suitability Assessment Modeling for Cloud Computing Solutions

    DTIC Science & Technology

    2011-09-01

    Full Virtualization vs. Para-Virtualization … Figure 4. Modeling alternatives in relation to model… the conceptual difference between full virtualization and para-virtualization. Figure 3. Full Virtualization vs. Para-Virtualization. 2. XEN… Besides Microsoft's own client implementations, dubbed "Remote Desktop Connection Client" for Windows® and Apple® operating systems, various open…

  2. Redesigning a Library Space for Collaborative Learning

    ERIC Educational Resources Information Center

    Gabbard, Ralph B.; Kaiser, Anthony; Kaunelis, David

    2007-01-01

    The reference desk at Indiana State University's (ISU) library offers an excellent view of student work areas on the first floor. From this vantage point, the reference librarians noticed students, especially in the evening and on weekends, huddled together in small groups, with one student at the keyboard of a laptop or desktop computer. The…

  3. The Promise of Zoomable User Interfaces

    ERIC Educational Resources Information Center

    Bederson, Benjamin B.

    2011-01-01

    Zoomable user interfaces (ZUIs) have received a significant amount of attention in the 18 years since they were introduced. They have enjoyed some success, and elements of ZUIs are widely used in computers today, although the grand vision of a zoomable desktop has not materialised. This paper describes the premise and promise of ZUIs along with…

  4. Growth-simulation model for lodgepole pine in central Oregon.

    Treesearch

    Walter G. Dahms

    1983-01-01

    A growth-simulation model for central Oregon lodgepole pine (Pinus contorta Dougl.) has been constructed by combining data from temporary and permanent sample plots. The model is similar to a conventional yield table with the added capacity for dealing with the stand-density variable. The simulator runs on a desk-top computer.

  5. The Influence of Textual Cues on First Impressions of an Email Sender

    ERIC Educational Resources Information Center

    Marlow, Shannon L.; Lacerenza, Christina N.; Iwig, Chelsea

    2018-01-01

    The present study experimentally manipulated the gender of an email sender, closing salutation, and sending mode (i.e., email sent via desktop computer/laptop as compared with email sent via a mobile device) to determine if these specific cues influence first impressions of the sender's competence, professionalism, positive affect, and negative…

  6. Increasing Interactive Activity: Using Technology to Enhance Interaction between Teachers, Students and Learning Material.

    ERIC Educational Resources Information Center

    Bucknall, Ruary

    1996-01-01

    Overview of the interactive technologies used by the Northern Territory Secondary Correspondence School in Australia: print media utilizing desktop publishing and electronic transfer; telephone or H-F radio; interactive television; and interactive computing. More fully describes its interactive CD-ROM courses. Emphasizes that the programs are…

  7. From Floppies to Flash--Your Guide to Removable Media

    ERIC Educational Resources Information Center

    Berdinka, Matthew J.

    2005-01-01

    Technology that once involved a scary, mysterious machine the size of a small house now fits on desktops and commonly appears in offices, schools, and homes. Computers allow for processing, storing and transmitting data between two or more people virtually anywhere in the world. They also allow users to save documents, presentations, photos and…

  8. Upper quadrant postural changes of school children in response to interaction with different information technologies.

    PubMed

    Briggs, Andrew; Straker, Leon; Greig, Alison

    2004-06-10

    The objective of this study was to quantitatively analyse the sitting posture of school children interacting with both old (book) and new (laptop and desktop computers) information technologies to test the hypothesis that posture is affected by the type of information technology (IT) used. A mixed model design was used to test the effect of IT type (within subjects) and age and gender (between subjects). The sitting posture of 32 children aged 4-17 years was measured whilst they read from a book, laptop, and desktop computer at a standard school chair and desk. Video images were captured and then digitized to calculate mean angles for head tilt, neck flexion, trunk flexion, and gaze angle. Posture was found to be influenced by IT type (p < 0.001), age (p < 0.001) and gender (p = 0.024) and significantly correlated to the stature of the participants. Measurement of resting posture and the maximal range of motion of the upper and lower cervical spines in the sagittal plane was also undertaken. The biophysical impact and the suitability of the three different information technologies are discussed.

  9. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists.

    PubMed

    Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco

    2013-01-01

    Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphics cards (graphics processing units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphics card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphics card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior.
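
The workflow described here, training a generative model with high-level array routines, can be sketched with a toy restricted Boltzmann machine trained by one-step contrastive divergence (CD-1), the building block of deep belief networks. This NumPy sketch is illustrative, not the authors' code; the "no specific programming effort" point is that the same array operations run on a graphics card when `numpy` is swapped for a GPU-backed library with the same interface (e.g., CuPy).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy binary data: two repeated 4-bit patterns
data = np.array([[1, 1, 0, 0], [0, 0, 1, 1]] * 50, dtype=float)

n_visible, n_hidden = 4, 8
W = 0.1 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)
b_h = np.zeros(n_hidden)
lr = 0.1

def recon_error(v):
    """Mean squared error of a one-step reconstruction of v."""
    h = sigmoid(v @ W + b_h)
    v_rec = sigmoid(h @ W.T + b_v)
    return float(np.mean((v - v_rec) ** 2))

err_before = recon_error(data)
for _ in range(200):
    # Positive phase: hidden activations driven by the data
    h_prob = sigmoid(data @ W + b_h)
    h_state = (h_prob > rng.random(h_prob.shape)).astype(float)
    # Negative phase: one Gibbs step (CD-1)
    v_rec = sigmoid(h_state @ W.T + b_v)
    h_rec = sigmoid(v_rec @ W + b_h)
    # Contrastive-divergence parameter updates
    W += lr * ((data.T @ h_prob) - (v_rec.T @ h_rec)) / len(data)
    b_v += lr * np.mean(data - v_rec, axis=0)
    b_h += lr * np.mean(h_prob - h_rec, axis=0)
err_after = recon_error(data)
```

After training, the reconstruction error should drop well below its initial value, showing that the distributed representation has captured the two input patterns.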

  10. Deep Unsupervised Learning on a Desktop PC: A Primer for Cognitive Scientists

    PubMed Central

    Testolin, Alberto; Stoianov, Ivilin; De Filippo De Grazia, Michele; Zorzi, Marco

    2013-01-01

    Deep belief networks hold great promise for the simulation of human cognition because they show how structured and abstract representations may emerge from probabilistic unsupervised learning. These networks build a hierarchy of progressively more complex distributed representations of the sensory data by fitting a hierarchical generative model. However, learning in deep networks typically requires big datasets and it can involve millions of connection weights, which implies that simulations on standard computers are unfeasible. Developing realistic, medium-to-large-scale learning models of cognition would therefore seem to require expertise in programming parallel-computing hardware, and this might explain why the use of this promising approach is still largely confined to the machine learning community. Here we show how simulations of deep unsupervised learning can be easily performed on a desktop PC by exploiting the processors of low-cost graphics cards (graphics processing units) without any specific programming effort, thanks to the use of high-level programming routines (available in MATLAB or Python). We also show that even an entry-level graphics card can outperform a small high-performance computing cluster in terms of learning time and with no loss of learning quality. We therefore conclude that graphics card implementations pave the way for a widespread use of deep learning among cognitive scientists for modeling cognition and behavior. PMID:23653617

  11. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
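
The claim that JSON makes it "easy to process data across different languages" comes down to a lossless round trip between server and client. The snippet below sketches that with Python's standard `json` module; the record's field names are hypothetical illustrations, not the platform's actual schema.

```python
import json

# Hypothetical calculation record; field names are illustrative only.
calculation = {
    "molecule": {
        "atoms": [{"symbol": "O"}, {"symbol": "H"}, {"symbol": "H"}],
        "charge": 0,
    },
    "code": "NWChem",
    "properties": {"totalEnergy": -76.026},
}

payload = json.dumps(calculation)   # what a REST endpoint would serve
restored = json.loads(payload)      # what JSON.parse yields in a JS client
```

Because both sides agree on the same handful of JSON types (objects, arrays, strings, numbers), the restored structure is identical to the original, with no language-specific serialization layer in between.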

  12. Transformation of personal computers and mobile phones into genetic diagnostic systems.

    PubMed

    Walker, Faye M; Ahmad, Kareem M; Eisenstein, Michael; Soh, H Tom

    2014-09-16

    Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone--devices that have become readily accessible in developing countries--into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite.

  13. Transformation of Personal Computers and Mobile Phones into Genetic Diagnostic Systems

    PubMed Central

    2014-01-01

    Molecular diagnostics based on the polymerase chain reaction (PCR) offer rapid and sensitive means for detecting infectious disease, but prohibitive costs have impeded their use in resource-limited settings where such diseases are endemic. In this work, we report an innovative method for transforming a desktop computer and a mobile camera phone—devices that have become readily accessible in developing countries—into a highly sensitive DNA detection system. This transformation was achieved by converting a desktop computer into a de facto thermal cycler with software that controls the temperature of the central processing unit (CPU), allowing for highly efficient PCR. Next, we reconfigured the mobile phone into a fluorescence imager by adding a low-cost filter, which enabled us to quantitatively measure the resulting PCR amplicons. Our system is highly sensitive, achieving quantitative detection of as little as 9.6 attograms of target DNA, and we show that its performance is comparable to advanced laboratory instruments at approximately 1/500th of the cost. Finally, in order to demonstrate clinical utility, we have used our platform for the successful detection of genomic DNA from the parasite that causes Chagas disease, Trypanosoma cruzi, directly in whole, unprocessed human blood at concentrations 4-fold below the clinical titer of the parasite. PMID:25223929

  14. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  15. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  16. Simulation training tools for nonlethal weapons using gaming environments

    NASA Astrophysics Data System (ADS)

    Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry

    2006-05-01

    Modern simulation techniques have a growing role for evaluating new technologies and for developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstration of concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real-time. Computer games now allow for multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherent intensive calculations required for complex simulation scenarios. The main components of the leading game-engines have been released for user modifications, enabling game enthusiasts and amateur programmers to advance the state-of-the-art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs as end users: learn what a device does and doesn't do prior to use, understand responses to the device prior to deployment, determine if the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life game engine, for the purpose of evaluating the latest non-lethal weapon devices, and for developing training tools for such devices.

  17. Wireless live streaming video of laparoscopic surgery: a bandwidth analysis for handheld computers.

    PubMed

    Gandsas, Alex; McIntire, Katherine; George, Ivan M; Witzke, Wayne; Hoskins, James D; Park, Adrian

    2002-01-01

    Over the last six years, streaming media has emerged as a powerful tool for delivering multimedia content over networks. Concurrently, wireless technology has evolved, freeing users from desktop boundaries and wired infrastructures. At the University of Kentucky Medical Center, we have integrated these technologies to develop a system that can wirelessly transmit live surgery from the operating room to a handheld computer. This study establishes the feasibility of using our system to view surgeries and describes the effect of bandwidth on image quality. A live laparoscopic ventral hernia repair was transmitted to a single handheld computer using five encoding speeds at a constant frame rate, and the quality of the resulting streaming images was evaluated. No video images were rendered when video data were encoded at 28.8 kilobits per second (Kbps), the slowest encoding bitrate studied. The highest quality images were rendered at encoding speeds greater than or equal to 150 Kbps. Of note, a 15-second transmission delay was experienced using all four encoding schemes that rendered video images. We believe that the wireless transmission of streaming video to handheld computers has tremendous potential to enhance surgical education. For medical students and residents, the ability to view live surgeries, lectures, courses and seminars on handheld computers means a larger number of learning opportunities. In addition, we envision that wireless enabled devices may be used to telemonitor surgical procedures. However, bandwidth availability and streaming delay are major issues that must be addressed before wireless telementoring becomes a reality.
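
The reported numbers lend themselves to a back-of-envelope check: at a given encoding bitrate, the 15-second delay the authors observed corresponds to a fixed amount of buffered video data. The helper below is a hypothetical illustration (not from the study), assuming 1 Kbps = 1,000 bits per second:

```python
def buffer_bytes(bitrate_kbps: float, delay_s: float) -> int:
    """Bytes of video data accumulated during a startup delay
    (assumes 1 Kbps = 1,000 bits per second, 8 bits per byte)."""
    return int(bitrate_kbps * 1000 * delay_s / 8)

# At the lowest high-quality bitrate reported (150 Kbps), a 15 s delay
# implies roughly 280 KB of buffered data; at 28.8 Kbps, about 54 KB.
high_quality = buffer_bytes(150, 15)
lowest_rate = buffer_bytes(28.8, 15)
```

Such a calculation shows why the delay matters for telementoring: tens to hundreds of kilobytes must accumulate in the client buffer before playback begins, regardless of network speed.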

  18. Automated Data Handling And Instrument Control Using Low-Cost Desktop Computers And An IEEE 488 Compatible Version Of The ODETA V.

    NASA Astrophysics Data System (ADS)

    van Leunen, J. A. J.; Dreessen, J.

    1984-05-01

    The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automation of data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were nevertheless powerful enough to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically but they can also be typed in by hand. Due to the automation we are able to implement proper archival of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two.
This was due to the much better reproducibility of the automatic optimization, which resulted in better reproducibility of the measurement result. Another advantage of the automation is that the programs that control the data handling and the automatic measurement are "user friendly". They guide the operator through the measuring procedure using information from earlier measurements of equivalent test specimens. This makes it possible to let routine measurements be done by much less skilled assistants. It also removes much of the tedious routine labour normally involved in MTF measurements. It can be concluded that automation of MTF measurements as described in the foregoing enhances the usefulness of MTF results and reduces the cost of MTF measurements.

  19. Biological Visualization, Imaging and Simulation(Bio-VIS) at NASA Ames Research Center: Developing New Software and Technology for Astronaut Training and Biology Research in Space

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey

    2003-01-01

    The Bio-Visualization, Imaging and Simulation (BioVIS) Technology Center at NASA's Ames Research Center is dedicated to developing and applying advanced visualization, computation and simulation technologies to support NASA Space Life Sciences research and the objectives of the Fundamental Biology Program. Research ranges from high resolution 3D cell imaging and structure analysis, virtual environment simulation of fine sensory-motor tasks, computational neuroscience and biophysics to biomedical/clinical applications. Computer simulation research focuses on the development of advanced computational tools for astronaut training and education. Virtual Reality (VR) and Virtual Environment (VE) simulation systems have become important training tools in many fields from flight simulation to, more recently, surgical simulation. The type and quality of training provided by these computer-based tools ranges widely, but the value of real-time VE computer simulation as a method of preparing individuals for real-world tasks is well established. Astronauts routinely use VE systems for various training tasks, including Space Shuttle landings, robot arm manipulations and extravehicular activities (space walks). Currently, there are no VE systems to train astronauts for basic and applied research experiments which are an important part of many missions. The Virtual Glovebox (VGX) is a prototype VE system for real-time physically-based simulation of the Life Sciences Glovebox where astronauts will perform many complex tasks supporting research experiments aboard the International Space Station. The VGX consists of a physical display system utilizing dual LCD projectors and circular polarization to produce a desktop-sized 3D virtual workspace. Physically-based modeling tools (Arachi Inc.) provide real-time collision detection, rigid body dynamics, physical properties and force-based controls for objects. 
The human-computer interface consists of two magnetic tracking devices (Ascension Inc.) attached to instrumented gloves (Immersion Inc.) which co-locate the user's hands with hand/forearm representations in the virtual workspace. Force-feedback is possible in a work volume defined by a Phantom Desktop device (SensAble Inc.). Graphics are written in OpenGL. The system runs on a 2.2 GHz Pentium 4 PC. The prototype VGX provides astronauts and support personnel with a real-time physically-based VE system to simulate basic research tasks both on Earth and in the microgravity of Space. The immersive virtual environment of the VGX also makes it a useful tool for virtual engineering applications including CAD development, procedure design and simulation of human-system interactions in a desktop-sized work volume.

  20. Influence of direct computer experience on older adults' attitudes toward computers.

    PubMed

    Jay, G M; Willis, S L

    1992-07-01

    This research examined whether older adults' attitudes toward computers became more positive as a function of computer experience. The sample comprised 101 community-dwelling older adults aged 57 to 87. The intervention involved a 2-week computer training program in which subjects learned to use a desktop publishing software program. A multidimensional computer attitude measure was used to assess differential attitude change and maintenance of change following training. The results indicated that older adults' computer attitudes are modifiable and that direct computer experience is an effective means of change. Attitude change as a function of training was found for the attitude dimensions targeted by the intervention program: computer comfort and efficacy. In addition, maintenance of attitude change was established for at least two weeks following training.

  1. Desktop Publishing Choices: Making an Appropriate Decision.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1991-01-01

    Discusses various choices available for desktop publishing systems. Four categories of software are described, including advanced word processing, graphics software, low-end desktop publishing, and mainstream desktop publishing; appropriate hardware is considered; and selection guidelines are offered, including current and future publishing needs,…

  2. Integrated IMA (Information Mission Areas) IC (Information Center) Guide

    DTIC Science & Technology

    1989-06-01

    Excerpted contents include: computer-aided design / computer-aided manufacture; liquid crystal display panels; artificial intelligence applied to VI; desktop publishing; intelligent copiers; electronic alternatives to printed documents; electronic forms; optical disk storage; LCD units; image scanners; graphics and forms software; output-generation devices; and work-group intelligent copiers.

  3. CERN's Common Unix and X Terminal Environment

    NASA Astrophysics Data System (ADS)

    Cass, Tony

    The Desktop Infrastructure Group of CERN's Computing and Networks Division has developed a Common Unix and X Terminal Environment to ease the migration to Unix-based interactive computing. The CUTE architecture relies on a distributed filesystem (currently Transarc's AFS) to enable essentially interchangeable client workstations to access both "home directory" and program files transparently. Additionally, we provide a suite of programs to configure workstations for CUTE and to ensure continued compatibility. This paper describes the different components and the development of the CUTE architecture.

  4. Data Transfers Among the HP-75, HP-86, and HP-9845 Microcomputers.

    DTIC Science & Technology

    1983-01-01

    AD-A139 438. Data Transfers Among the HP-75, HP-86, and HP-9845 Microcomputers(U). Air Force Inst. of Tech., Wright-Patterson AFB, OH. D. P. Connor, 1983. ...hereafter called the "75") and the HP-86 (hereafter called the "86"). The computers are to be used for classroom instruction and research at SOC. On the main campus another Hewlett-Packard desktop computer, the HP-9845 (hereafter called the "9845"), is already in use; it controls and processes data

  5. Video control system for a drilling in furniture workpiece

    NASA Astrophysics Data System (ADS)

    Khmelev, V. L.; Satarov, R. N.; Zavyalova, K. V.

    2018-05-01

    Over the last five years Russian industry has been becoming increasingly robotic, and scientific groups have consequently received new tasks. One of these is machine vision systems that should solve the problem of automatic quality control. Systems of this type cost several thousand dollars each, a price out of reach for regional small businesses. In this article we describe the principle and algorithm of an inexpensive video control system that uses web cameras and a notebook or desktop computer as its computing unit.
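
A minimal sketch of the kind of check such a system performs: decide whether a drilled hole is present by testing whether a disc-shaped region of the frame is dark. A synthetic NumPy array stands in for a webcam frame here; the threshold values and geometry are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def hole_present(gray, center, radius, dark_thresh=60, fill_ratio=0.5):
    """Illustrative check: a drilled hole shows up as a dark disc.
    gray: 2D uint8 image; center=(row, col). Returns True when at least
    fill_ratio of the pixels inside the disc are darker than dark_thresh."""
    rr, cc = np.ogrid[:gray.shape[0], :gray.shape[1]]
    mask = (rr - center[0]) ** 2 + (cc - center[1]) ** 2 <= radius ** 2
    return float(np.mean(gray[mask] < dark_thresh)) >= fill_ratio

# Synthetic frame: light workpiece (value 200) with a dark hole at (50, 50)
frame = np.full((100, 100), 200, dtype=np.uint8)
rr, cc = np.ogrid[:100, :100]
frame[(rr - 50) ** 2 + (cc - 50) ** 2 <= 10 ** 2] = 20
```

In a real system the frame would come from a webcam capture loop and the expected hole positions from the drilling program, but the per-frame decision logic stays this simple, which is what keeps the computing-unit requirements down to a notebook or desktop PC.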

  6. Computer virus information update CIAC-2301

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orvis, W.J.

    1994-01-15

    While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's Virus database.

  7. Gallium

    USGS Publications Warehouse

    Foley, Nora K.; Jaskula, Brian W.; Kimball, Bryn E.; Schulte, Ruth F.; Schulz, Klaus J.; DeYoung, John H.; Seal, Robert R.; Bradley, Dwight C.

    2017-12-19

    Gallium is a soft, silvery metallic element with an atomic number of 31 and the chemical symbol Ga. Gallium is used in a wide variety of products that have microelectronic components containing either gallium arsenide (GaAs) or gallium nitride (GaN). GaAs is able to change electricity directly into laser light and is used in the manufacture of optoelectronic devices (laser diodes, light-emitting diodes [LEDs], photo detectors, and solar cells), which are important for aerospace and telecommunications applications and industrial and medical equipment. GaAs is also used in the production of highly specialized integrated circuits, semiconductors, and transistors; these are necessary for defense applications and high-performance computers. For example, cell phones with advanced personal computer-like functionality (smartphones) use GaAs-rich semiconductor components. GaN is used principally in the manufacture of LEDs and laser diodes, power electronics, and radio-frequency electronics. Because GaN power transistors operate at higher voltages and with a higher power density than GaAs devices, the uses for advanced GaN-based products are expected to increase in the future. Gallium technologies also have large power-handling capabilities and are used for cable television transmission, commercial wireless infrastructure, power electronics, and satellites. Gallium is also used for such familiar applications as screen backlighting for computer notebooks, flat-screen televisions, and desktop computer monitors. Gallium is dispersed in small amounts in many minerals and rocks where it substitutes for elements of similar size and charge, such as aluminum and zinc. For example, gallium is found in small amounts (about 50 parts per million) in such aluminum-bearing minerals as diaspore-boehmite and gibbsite, which form bauxite deposits, and in the zinc-sulfide mineral sphalerite, which is found in many mineral deposits. 
At the present time, gallium metal is derived mainly as a byproduct of the processing of bauxite ore for aluminum; lesser amounts of gallium metal are produced from the processing of sphalerite ore from three types of deposits (sediment-hosted, Mississippi Valley-type, and volcanogenic massive sulfide) for zinc. The United States is expected to meet its current and expected future needs for gallium through imports of primary, recycled, and refined gallium, as well as through domestic production of recycled and refined gallium. The U.S. Geological Survey estimates that world resources of gallium in bauxite exceed 1 billion kilograms, and a considerable quantity of gallium could be present in world zinc reserves.

  8. Neural simulations on multi-core architectures.

    PubMed

    Eichner, Hubert; Klug, Tobias; Borst, Alexander

    2009-01-01

    Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing.

  9. Neural Simulations on Multi-Core Architectures

    PubMed Central

    Eichner, Hubert; Klug, Tobias; Borst, Alexander

    2009-01-01

    Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high performance as well as standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e. user-transparent load balancing. PMID:19636393
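    The "user-transparent load balancing" mentioned above can be illustrated with a toy scheduler (an assumption for illustration only, not the authors' implementation): compartments with per-update costs are assigned greedily to whichever core currently has the least work, the classic longest-processing-time heuristic.

```python
import heapq

# Toy load balancer for a compartmental simulation: assign compartments,
# most expensive first, to the least-loaded core. The cost model (one cost
# number per compartment, e.g. its channel count) is invented for this sketch.
def balance(costs, n_cores):
    heap = [(0, core, []) for core in range(n_cores)]  # (load, core id, compartments)
    heapq.heapify(heap)
    for comp in sorted(range(len(costs)), key=lambda i: -costs[i]):
        load, core, comps = heapq.heappop(heap)   # core with the least work so far
        comps.append(comp)
        heapq.heappush(heap, (load + costs[comp], core, comps))
    return {core: (load, comps) for load, core, comps in heap}
```

    With six compartments of cost [5, 3, 2, 2, 1, 1] on two cores, the heuristic yields two equally loaded cores of 7 cost units each.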

  10. Distributed-Memory Computing With the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA)

    NASA Technical Reports Server (NTRS)

    Riley, Christopher J.; Cheatwood, F. McNeil

    1997-01-01

    The Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA), a Navier-Stokes solver, has been modified for use in a parallel, distributed-memory environment using the Message-Passing Interface (MPI) standard. A standard domain decomposition strategy is used in which the computational domain is divided into subdomains with each subdomain assigned to a processor. Performance is examined on dedicated parallel machines and a network of desktop workstations. The effect of domain decomposition and frequency of boundary updates on performance and convergence is also examined for several realistic configurations and conditions typical of large-scale computational fluid dynamic analysis.
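    The standard domain-decomposition strategy described above can be sketched in miniature (a hedged illustration, not LAURA's Navier-Stokes code): a 1-D Jacobi relaxation is split into subdomains padded with ghost cells, and the ghost-cell copies stand in for one MPI boundary exchange per sweep.

```python
def sweep(u):
    """One Jacobi relaxation sweep on the whole grid (fixed end boundaries)."""
    return [u[0]] + [(u[i - 1] + u[i + 1]) / 2 for i in range(1, len(u) - 1)] + [u[-1]]

def decomposed_sweep(u, n_sub):
    """Same sweep, but each subdomain works on a private copy padded with
    ghost cells, mimicking one boundary exchange per iteration."""
    n = len(u)
    out = []
    for k in range(n_sub):
        lo, hi = k * n // n_sub, (k + 1) * n // n_sub
        local = u[max(lo - 1, 0):min(hi + 1, n)]  # subdomain plus ghost cells
        res = sweep(local)
        # strip ghost entries; physical boundaries are kept as-is
        start = 1 if lo > 0 else 0
        stop = len(res) - 1 if hi < n else len(res)
        out.extend(res[start:stop])
    return out
```

    Because ghosts are refreshed every sweep, the decomposed result matches the single-domain sweep exactly; updating boundaries less frequently (as the paper studies) trades communication for slower convergence.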

  11. Using Desk-Top Publishing to Develop Literacy.

    ERIC Educational Resources Information Center

    Wray, David; Medwell, Jane

    1989-01-01

    Examines the learning benefits which may accrue from using desk-top publishing techniques with children, especially in terms of the development of literacy skills. Analyzes desk-top publishing as an extension of word processing and describes some ways of using desk-top publishing in the classroom. (RS)

  12. A large-signal dynamic simulation for the series resonant converter

    NASA Technical Reports Server (NTRS)

    King, R. J.; Stuart, T. A.

    1983-01-01

    A simple nonlinear discrete-time dynamic model for the series resonant dc-dc converter is derived using approximations appropriate to most power converters. This model is useful for the dynamic simulation of a series resonant converter using only a desktop calculator. The model is compared with a laboratory converter for a large transient event.
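    The paper's series-resonant-converter model is not reproduced here; as a generic illustration of the approach, the kind of discrete-time recurrence a desktop calculator could iterate is shown below for a simple first-order stage (all component values are invented).

```python
def simulate(v0, vin, r, c, t_step, n_steps):
    """Step a first-order RC stage with the forward-Euler recurrence
    v[k+1] = v[k] + (T / RC) * (Vin - v[k]); returns the voltage trajectory."""
    v, out = v0, []
    for _ in range(n_steps):
        v = v + (t_step / (r * c)) * (vin - v)
        out.append(v)
    return out
```

    Iterating one algebraic update per time step, as here, is what makes such dynamic simulation feasible on very modest hardware.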

  13. Beacon Editor: Capturing Signal Transduction Pathways Using the Systems Biology Graphical Notation Activity Flow Language.

    PubMed

    Elmarakeby, Haitham; Arefiyan, Mostafa; Myers, Elijah; Li, Song; Grene, Ruth; Heath, Lenwood S

    2017-12-01

    The Beacon Editor is a cross-platform desktop application for the creation and modification of signal transduction pathways using the Systems Biology Graphical Notation Activity Flow (SBGN-AF) language. Prompted by biologists' requests for enhancements, the Beacon Editor includes numerous powerful features for the benefit of creation and presentation.

  14. A Fine-Tuned Look at White Space Variation in Desktop Publishing.

    ERIC Educational Resources Information Center

    Knupfer, Nancy Nelson; McIsaac, Marina Stock

    This investigation of the use of white space in print-based, computer-generated text focused on the point at which the white space interferes with reading speed and comprehension. It was hypothesized that reading speed and comprehension would be significantly greater when text was wrapped tightly around the graphic than when it had one-half inch…

  15. Simulation Packages Expand Aircraft Design Options

    NASA Technical Reports Server (NTRS)

    2013-01-01

    In 2001, NASA released a new approach to computational fluid dynamics that allows users to perform automated analysis on complex vehicle designs. In 2010, Palo Alto, California-based Desktop Aeronautics acquired a license from Ames Research Center to sell the technology. Today, the product assists organizations in the design of subsonic aircraft, space planes, spacecraft, and high speed commercial jets.

  16. iChat[TM] Do You? Using Desktop Web Conferencing in Education

    ERIC Educational Resources Information Center

    Bell, Randy L.; Garofalo, Joe

    2006-01-01

    Videoconferencing is not a new technology; it has been widely used in educational settings since the mid-1980s. With the integration of personal computers, videoconferencing has evolved into what is now referred to as Web conferencing. In the mid-1990s, Internet Protocol (IP) was introduced into the mainstream, but the educational community has…

  17. 3D Printing in Technology and Engineering Education

    ERIC Educational Resources Information Center

    Martin, Robert L.; Bowden, Nicholas S.; Merrill, Chris

    2014-01-01

    In the past five years, there has been tremendous growth in the production and use of desktop 3D printers. This growth has been driven by the increasing availability of inexpensive computing and electronics technologies. The ability to rapidly share ideas and intelligence over the Internet has also played a key role in the growth. Growth is also…

  18. The Role of Theory and Technology in Learning Video Production: The Challenge of Change

    ERIC Educational Resources Information Center

    Shewbridge, William; Berge, Zane L.

    2004-01-01

    The video production field has evolved beyond being exclusively relevant to broadcast television. The convergence of low-cost consumer cameras and desktop computer editing has led to new applications of video in a wide range of areas, including the classroom. This presents educators with an opportunity to rethink how students learn video…

  19. Projectors: The Big Picture

    ERIC Educational Resources Information Center

    Gamble-Risley, Michelle

    2006-01-01

    In the past, projection systems were large, heavy, and unwieldy and cost $3,000 to $5,000. Setup was fraught with the challenges of multiple wires plugged into the backs of desktop computers, often causing confusion about what went where. Systems were sometimes so difficult to set up that teachers had to spend pre-class time putting them together.…

  20. Training Learners to Use Quizlet Vocabulary Activities on Mobile Phones in Vietnam with Facebook

    ERIC Educational Resources Information Center

    Tran, Phuong

    2016-01-01

    Mobile phone ownership among university students in Vietnam has reached almost 100%, exceeding that of Internet-capable desktop computers. This has made mobile phones an increasingly popular means for learners to carry out learning activities outside of the classroom, but some studies have suggested that learners are not always willing to engage in activities…

  1. CALLing All Foreign Language Teachers: Computer-Assisted Language Learning in the Classroom

    ERIC Educational Resources Information Center

    Erben, Tony, Ed.; Sarieva, Iona, Ed.

    2008-01-01

    This book is a comprehensive guide to help foreign language teachers use technology in their classrooms. It offers the best ways to integrate technology into teaching for student-centered learning. CALL Activities include: Email; Building a Web site; Using search engines; Powerpoint; Desktop publishing; Creating sound files; iMovie; Internet chat;…

  2. Teacher Associations: Extending Our Advocacy Reach

    ERIC Educational Resources Information Center

    Boitnott, Kitty

    2012-01-01

    School library media specialists have seen more fundamental changes in their jobs and in their roles within their schools than any other group of education professionals. The author started her first job as a school librarian when there were no desktop computers, and card catalogs were the order of the day. Over the course of the thirty-seven…

  3. Programs for road network planning.

    Treesearch

    Ward W. Carson; Dennis P. Dykstra

    1978-01-01

    This paper describes four computer programs developed to assist logging engineers to plan transportation in a forest. The objective of these programs, to be used together, is to find the shortest path through a transportation network from a point of departure to a destination. Three of the programs use the digitizing and plotting capabilities of a programmable desk-top...

  4. Fire characteristics charts for fire behavior and U.S. fire danger rating

    Treesearch

    Faith Ann Heinsch; Pat Andrews

    2010-01-01

    The fire characteristics chart is a graphical method of presenting U.S. National Fire Danger Rating indices or primary surface or crown fire behavior characteristics. A desktop computer application has been developed to produce fire characteristics charts in a format suitable for inclusion in reports and presentations. Many options include change of scales, colors,...

  5. Far beyond Show and Tell: Strategies for Integration of Desktop Documentary Making into History Classrooms

    ERIC Educational Resources Information Center

    Fehn, Bruce; Johnson, Melanie; Smith, Tyson

    2010-01-01

    Elementary and secondary school history students demonstrate a great deal of enthusiasm for making documentary films. With free and easy-to-use software, as well as vast online, archival resources containing images and sounds, students can sit at a computer and make serious and engaging documentary productions. With students affectively engaged by…

  6. Designing a Mobile Training System in Rural Areas with Bayesian Factor Models

    ERIC Educational Resources Information Center

    Omidi Najafabadi, Maryam; Mirdamadi, Seyed Mehdi; Payandeh Najafabadi, Amir Teimour

    2014-01-01

    The facts that wireless technologies (1) are more convenient and (2) require less skill than desktop computers play a crucial role in decreasing the digital gap in rural areas. This study employed Bayesian Confirmatory Factor Analysis (CFA) to design a mobile training system in rural areas of Iran. It categorized challenges, potential, and…

  7. The Effects of Instructor-Avatar Immediacy in Second Life, an Immersive and Interactive Three-Dimensional Virtual Environment

    ERIC Educational Resources Information Center

    Lawless-Reljic, Sabine Karine

    2010-01-01

    Growing interest of educational institutions in desktop 3D graphic virtual environments for hybrid and distance education prompts questions on the efficacy of such tools. Virtual worlds, such as Second Life[R], enable computer-mediated immersion and interactions encompassing multimodal communication channels including audio, video, and text.…

  8. 75 FR 76040 - Dell Products LP, Winston-Salem (WS-1) Division, Including On-Site Leased Workers From Adecco...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ..., Winston-Salem, North Carolina. The notice was published in the Federal Register on April 23, 2010 (75 FR... from Staffing Solutions, South East, and Omni Resources and Recovery. The notices were published in the... firm. The workers are engaged in employment related to the production of desktop computers. New...

  9. 76 FR 2710 - Dell Products LP, Winston-Salem (WS-1) Division, Including On-Site Leased Workers From Adecco...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-14

    ..., APN and ICONMA, Winston-Salem, North Carolina. The notice was published in the Federal Register on... Seaton Corporation. The notices were published on the Federal Register on April 19, 2010 (75 FR 20385... workers are engaged in employment related to the production of desktop computers. New information shows...

  10. Ethnography at a Distance: Globally Mobile Parents Choosing International Schools

    ERIC Educational Resources Information Center

    Forsey, Martin; Breidenstein, Georg; Krüger, Oliver; Roch, Anna

    2015-01-01

    The research we report on was conducted from our computer desktops. We have not met the people we have studied; they are part of what Eichhorn described as a "textual community", gathered around the threads of online conversations associated with a website servicing the needs of English-language speakers in Germany. The thread in…

  11. When Neurons Meet Electrons: Three Trends That Are Sparking Change in Computer Publishing.

    ERIC Educational Resources Information Center

    Cranney, Charles

    1992-01-01

    Three important trends in desktop publishing include (1) use of multiple media in presentation of information; (2) networking; and (3) "hot links" (integrated file-exchange formats). It is also important for college publications professionals to be familiar with sources of information about technological change and to be able to sort out the…

  12. Teaching Information Literacy Using Electronic Resources for Grades 6-12. Professional Growth Series.

    ERIC Educational Resources Information Center

    Anderson, Mary Alice, Ed.

    This notebook is a compilation of 53 lesson plans for grades 6-12, written by various authors and focusing on the integration of technology into the curriculum. Lesson plans include topics such as online catalog searching, electronic encyclopedias, CD-ROM databases, exploring the Internet, creating a computer slide show, desktop publishing, and…

  13. 76 FR 47114 - Wireless E911 Location Accuracy Requirements; E911 Requirements for IP-Enabled Service Providers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ... desktop computers and mobile devices, we expect significant innovation to continue in the provision of... inside of buildings.'' 47. Discussion. Publicly available reports, such as a March 2011 study from J. D... installed Wi-Fi access points, and a growing number of mobile devices (e.g., smartphones, laptops, and...

  14. An Ethical Dilemma: Talking about Plagiarism and Academic Integrity in the Digital Age

    ERIC Educational Resources Information Center

    Thomas, Ebony Elizabeth; Sassi, Kelly

    2011-01-01

    Today, many students not only access the Internet through desktop and laptop computers at home or at school but also have copious amounts of information at their fingertips via portable devices (e.g., iPods, iPads, netbooks, smartphones). While some teachers welcome the proliferation of portable technologies and easy wireless Internet access, and…

  15. Makin' It Happen with Business & Marketing Education. Annual Atlantic Coast Business & Marketing Education Conference Proceedings (13th, Raleigh, North Carolina, February 16-17, 1996). Volume 7.

    ERIC Educational Resources Information Center

    Goins, L. Keith, Ed.

    This proceedings includes the following papers: "Dealing with Discipline Problems in Schools" (Allen); "Developing Global Awareness" (Arnold); "Desktop Publishing Using WordPerfect 6.0 for Windows" (Broughton); "Learn and Earn" (Cauley); "Using the Computer to Teach Merchandising Math"…

  16. Code White: A Signed Code Protection Mechanism for Smartphones

    DTIC Science & Technology

    2010-09-01

    analogous to computer security is the use of antivirus (AV) software. AV software is a brute force approach to security. The software ...these users, numerous malicious programs have also surfaced. And while smartphones have desktop-like capabilities to execute software, they do not...

  17. Thermal and energy battery management optimization in electric vehicles using Pontryagin's maximum principle

    NASA Astrophysics Data System (ADS)

    Bauer, Sebastian; Suchaneck, Andre; Puente León, Fernando

    2014-01-01

    Depending on the actual battery temperature, electrical power demands in general have a varying impact on the life span of a battery. Because tempering the battery itself consumes electrical energy drawn from the battery, the question arises of how much energy should optimally be spent on tempering, and at which temperatures. The objective function to be optimized therefore combines two goals: maximizing life expectancy and minimizing the energy used to achieve it. In this paper, Pontryagin's maximum principle is used to derive a causal control strategy from such an objective function. The derivation of the causal strategy includes the determination of the major factors that govern the optimal solution calculated with the maximum principle. The optimization is calculated offline on a desktop computer for all possible vehicle parameters and major factors. For the practical implementation in the vehicle, it is sufficient to have the values of the major factors determined only roughly in advance and the offline calculation results available. This feature sidesteps the drawback of several optimization strategies that require exact knowledge of the future power demand. The resulting strategy's application is not limited to batteries in electric vehicles.
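    The offline step can be sketched abstractly (a toy under invented cost terms, dynamics, and parameter ranges, not the paper's model): for each gridded state/costate pair, the control minimizing the Hamiltonian H = running cost + costate · dynamics is found by search and stored for later in-vehicle lookup.

```python
def offline_table(temps, costates, u_grid, t_opt=25.0, t_amb=20.0,
                  energy_weight=0.1, cooling=0.05):
    """For each (temperature, costate) pair, tabulate the tempering power u
    that minimizes a toy Hamiltonian, Pontryagin-style, by grid search."""
    table = {}
    for t in temps:
        for lam in costates:
            def hamiltonian(u):
                aging = (t - t_opt) ** 2            # battery-life penalty (invented)
                d_t = u - cooling * (t - t_amb)     # tempering dynamics (invented)
                return aging + energy_weight * u * u + lam * d_t
            table[(t, lam)] = min(u_grid, key=hamiltonian)
    return table
```

    The controller then needs only the precomputed table and a rough costate estimate at run time, rather than knowledge of the future power demand.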

  18. Real-time embedded atmospheric compensation for long-range imaging using the average bispectrum speckle method

    NASA Astrophysics Data System (ADS)

    Curt, Petersen F.; Bodnar, Michael R.; Ortiz, Fernando E.; Carrano, Carmen J.; Kelmelis, Eric J.

    2009-02-01

    While imaging over long distances is critical to a number of security and defense applications, such as homeland security and launch tracking, current optical systems are limited in resolving power. This is largely a result of the turbulent atmosphere in the path between the region under observation and the imaging system, which can severely degrade captured imagery. There are a variety of post-processing techniques capable of recovering this obscured image information; however, the computational complexity of such approaches has prohibited real-time deployment and hampers the usability of these technologies in many scenarios. To overcome this limitation, we have designed and manufactured an embedded image processing system based on commodity hardware which can compensate for these atmospheric disturbances in real-time. Our system consists of a reformulation of the average bispectrum speckle method coupled with a high-end FPGA processing board, and employs modular I/O capable of interfacing with most common digital and analog video transport methods (composite, component, VGA, DVI, SDI, HD-SDI, etc.). By leveraging the custom, reconfigurable nature of the FPGA, we have achieved performance twenty times faster than a modern desktop PC, in a form-factor that is compact, low-power, and field-deployable.

  19. Analyzing Radio-Frequency Coverage for the ISS

    NASA Technical Reports Server (NTRS)

    Bolen, Steven M.; Sham, Catherine C.

    2007-01-01

    The Interactive Coverage Analysis Tool (iCAT) is an interactive desktop computer program serving to (1) support coverage planning and frequency-usage management for current and proposed radio communication systems on and near the International Space Station (ISS) and (2) enable definition of requirements for the development of future such systems. The iCAT can also be used in design trade studies for other (both outer-space and terrestrial) communication systems. A user can enter the parameters of a communication-system link budget in a table in a worksheet. The nominal (on-axis) link values for the bit-energy-to-noise ratio, received isotropic power (RIP), carrier-to-noise ratio (C/N), power flux density (PFD), and link margin of the system are calculated and displayed in the table. Plots of field gradients for the RIP, C/N, PFD, and link margin are constructed in an ISS coordinate system, at a specified link range, for both the forward and return link parameters, and are displayed in worksheets. The forward and reverse link antenna gain patterns are also constructed and displayed. Line-of-sight (LOS) obstructions can be both incorporated into the gradient plots and displayed on separate plots.
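    Link-budget arithmetic of the kind iCAT tabulates can be sketched with the standard textbook forms (iCAT's actual equations and parameter names are not reproduced here; the function names and example values below are assumptions):

```python
import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB, with distance in km and frequency in GHz
    (the 92.45 constant absorbs the unit conversions)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

def received_isotropic_power_dbw(eirp_dbw, distance_km, freq_ghz):
    """RIP: transmitter EIRP minus path loss (ideal isotropic receive antenna)."""
    return eirp_dbw - fspl_db(distance_km, freq_ghz)

def link_margin_db(rip_dbw, rx_gain_db, required_power_dbw):
    """Margin: received power plus receive antenna gain, over the requirement."""
    return rip_dbw + rx_gain_db - required_power_dbw
```

    Because every term is in decibels, the whole budget reduces to sums and differences, which is what makes the worksheet-table presentation natural.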

  20. Benchmarking multimedia performance

    NASA Astrophysics Data System (ADS)

    Zandi, Ahmad; Sudharsanan, Subramania I.

    1998-03-01

    With the introduction of faster processors and special instruction sets tailored to multimedia, a number of exciting applications are now feasible on the desktop. Among these is DVD playback, consisting, among other things, of MPEG-2 video and Dolby Digital or MPEG-2 audio. Other multimedia applications, such as video conferencing and speech recognition, are also becoming popular on computer systems. In view of this tremendous interest in multimedia, a group of major computer companies has formed the Multimedia Benchmarks Committee as part of the Standard Performance Evaluation Corp. to address the performance issues of multimedia applications. The approach is multi-tiered, with three tiers of fidelity from minimal to fully compliant. In each case the fidelity of the bitstream reconstruction, as well as the quality of the video or audio output, is measured and the system is classified accordingly. At the next step the performance of the system is measured. Many multimedia applications, such as DVD playback, must run at a specific rate; in this case the measurement of excess processing power makes all the difference. All of this makes a system-level, application-based multimedia benchmark very challenging. Several ideas and methodologies for each aspect of the problem will be presented and analyzed.

  1. AltiVec performance increases for autonomous robotics for the MARSSCAPE architecture program

    NASA Astrophysics Data System (ADS)

    Gothard, Benny M.

    2002-02-01

    One of the main tall poles that must be overcome in developing a fully autonomous vehicle is the computer's inability to understand its surrounding environment at the level required for the intended task. The military mission scenario requires a robot to interact with a complex, unstructured, dynamic environment (reference: "A High Fidelity Multi-Sensor Scene Understanding System for Autonomous Navigation"). The Mobile Autonomous Robot Software Self-Composing Adaptive Programming Environment (MarsScape) perception research addresses three aspects of the problem: sensor system design, processing architectures, and algorithm enhancements. A prototype perception system has been demonstrated on robotic High Mobility Multipurpose Wheeled Vehicle (HMMWV) and All-Terrain Vehicle (ATV) testbeds. This paper addresses the tall pole of processing requirements and the performance improvements based on the selected MarsScape processing architecture. The processor chosen is the Motorola AltiVec G4 PowerPC (PPC), a highly parallelized commercial Single Instruction Multiple Data (SIMD) processor. Both derived perception benchmarks and actual perception subsystem code are benchmarked and compared against previous Demo II/Semi-autonomous Surrogate Vehicle processing architectures, along with desktop personal computers (PCs). Performance gains are highlighted with progress to date, and lessons learned and future directions are described.

  2. Basics of Desktop Publishing. Second Edition.

    ERIC Educational Resources Information Center

    Beeby, Ellen; Crummett, Jerrie

    This document contains teacher and student materials for a basic course in desktop publishing. Six units of instruction cover the following: (1) introduction to desktop publishing; (2) desktop publishing systems; (3) software; (4) type selection; (5) document design; and (6) layout. The teacher edition contains some or all of the following…

  3. Desktop Publishing: A Brave New World and Publishing from the Desktop.

    ERIC Educational Resources Information Center

    Lormand, Robert; Rowe, Jane J.

    1988-01-01

    The first of two articles presents basic selection criteria for desktop publishing software packages, including discussion of expectations, required equipment, training costs, publication size, desired software features, additional equipment needed, and quality control. The second provides a brief description of desktop publishing using the Apple…

  4. Democratic Population Decisions Result in Robust Policy-Gradient Learning: A Parametric Study with GPU Simulations

    PubMed Central

    Richmond, Paul; Buesing, Lars; Giugliano, Michele; Vasilaki, Eleni

    2011-01-01

    High performance computing on the Graphics Processing Unit (GPU) is an emerging field driven by the promise of high computational power at a low cost. However, GPU programming is a non-trivial task and moreover architectural limitations raise the question of whether investing effort in this direction may be worthwhile. In this work, we use GPU programming to simulate a two-layer network of Integrate-and-Fire neurons with varying degrees of recurrent connectivity and investigate its ability to learn a simplified navigation task using a policy-gradient learning rule stemming from Reinforcement Learning. The purpose of this paper is twofold. First, we want to support the use of GPUs in the field of Computational Neuroscience. Second, using GPU computing power, we investigate the conditions under which the said architecture and learning rule demonstrate best performance. Our work indicates that networks featuring strong Mexican-Hat-shaped recurrent connections in the top layer, where decision making is governed by the formation of a stable activity bump in the neural population (a “non-democratic” mechanism), achieve mediocre learning results at best. In absence of recurrent connections, where all neurons “vote” independently (“democratic”) for a decision via population vector readout, the task is generally learned better and more robustly. Our study would have been extremely difficult on a desktop computer without the use of GPU programming. We present the routines developed for this purpose and show that a speed improvement of 5x up to 42x is provided versus optimised Python code. The higher speed is achieved when we exploit the parallelism of the GPU in the search of learning parameters. This suggests that efficient GPU programming can significantly reduce the time needed for simulating networks of spiking neurons, particularly when multiple parameter configurations are investigated. PMID:21572529
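    The "democratic" population-vector readout the abstract contrasts with bump-based decisions can be sketched as follows (preferred directions and rates below are illustrative, not the paper's network):

```python
import math

def population_vector(preferred_dirs, rates):
    """Decode a direction (radians) as the rate-weighted circular mean:
    each neuron votes for its preferred direction, weighted by its firing rate."""
    x = sum(r * math.cos(d) for d, r in zip(preferred_dirs, rates))
    y = sum(r * math.sin(d) for d, r in zip(preferred_dirs, rates))
    return math.atan2(y, x)
```

    Because every neuron contributes independently to the vector sum, no recurrent dynamics are needed to form the decision, which is the sense in which the readout is "democratic".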

  5. Approaches in highly parameterized inversion-PESTCommander, a graphical user interface for file and run management across networks

    USGS Publications Warehouse

    Karanovic, Marinko; Muffels, Christopher T.; Tonkin, Matthew J.; Hunt, Randall J.

    2012-01-01

    Models of environmental systems have become increasingly complex, incorporating increasingly large numbers of parameters in an effort to represent physical processes on a scale approaching that at which they occur in nature. Consequently, the inverse problem of parameter estimation (specifically, model calibration) and subsequent uncertainty analysis have become increasingly computation-intensive endeavors. Fortunately, advances in computing have made computational power equivalent to that of dozens to hundreds of desktop computers accessible through a variety of alternate means: modelers have various possibilities, ranging from traditional Local Area Networks (LANs) to cloud computing. Commonly used parameter estimation software is well suited to take advantage of the availability of such increased computing power. Unfortunately, logistical issues become increasingly important as an increasing number and variety of computers are brought to bear on the inverse problem. To facilitate efficient access to disparate computer resources, the PESTCommander program documented herein has been developed to provide a Graphical User Interface (GUI) that facilitates the management of model files ("file management") and remote launching and termination of "slave" computers across a distributed network of computers ("run management"). In version 1.0 described here, PESTCommander can access and ascertain resources across traditional Windows LANs; however, the architecture of PESTCommander has been developed with the intent that future releases will be able to access computing resources (1) via trusted domains established in Wide Area Networks (WANs) in multiple remote locations and (2) via heterogeneous networks of Windows- and Unix-based operating systems. The design of PESTCommander also makes it suitable for extension to other computational resources, such as those that are available via cloud computing.
Version 1.0 of PESTCommander was developed primarily to work with the parameter estimation software PEST; the discussion presented in this report focuses on the use of the PESTCommander together with Parallel PEST. However, PESTCommander can be used with a wide variety of programs and models that require management, distribution, and cleanup of files before or after model execution. In addition to its use with the Parallel PEST program suite, discussion is also included in this report regarding the use of PESTCommander with the Global Run Manager GENIE, which was developed simultaneously with PESTCommander.
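    The "run management" idea above, a queue of model runs dispatched to a pool of workers with results collected as they complete, can be sketched as follows (the forward model below is an invented stand-in, not PEST):

```python
from concurrent.futures import ThreadPoolExecutor

def model_run(params):
    """Placeholder forward model: sum of squared residuals for one
    parameter set {"a": slope, "b": intercept} against fixed observations."""
    observed = [1.0, 2.0, 3.0]
    simulated = [params["a"] * x + params["b"] for x in (0.0, 1.0, 2.0)]
    return sum((o - s) ** 2 for o, s in zip(observed, simulated))

def dispatch(parameter_sets, n_workers=4):
    """Fan the runs out to a worker pool; results come back in input order."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return list(pool.map(model_run, parameter_sets))
```

    A real run manager additionally stages input files to each worker and cleans up afterward, which is the file-management half of what PESTCommander automates.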

  6. Performance of the Heavy Flavor Tracker (HFT) detector in star experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Alruwaili, Manal

    As processor counts grow, today's supercomputer-class parallelism will be available on desktops within the next decade. For mass-scale application development on such machines, existing popular languages with large libraries must be augmented with new constructs and paradigms that exploit massive parallelism and distributed memory models while retaining user-friendliness. Currently available object-oriented languages for massively parallel computing, such as Chapel, X10, and UPC++, exploit distributed computing, data-parallel computing, and process-level thread parallelism in the PGAS (Partitioned Global Address Space) memory model. However, they do not incorporate: 1) extensions for object distribution to exploit the PGAS model; 2) the flexibility of migrating or cloning an object between places for load balancing; or 3) programming paradigms that integrate data- and thread-level parallelism with object distribution. In this thesis, I compare languages in the PGAS model; propose new constructs that extend C++ with object distribution, object cloning, and object migration; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class are invoked and work concurrently on different elements of distributed data via remote method invocations. The new constructs, their grammar, and their behavior are explained using simple example programs.

  7. An interactive physics-based unmanned ground vehicle simulator leveraging open source gaming technology: progress in the development and application of the virtual autonomous navigation environment (VANE) desktop

    NASA Astrophysics Data System (ADS)

    Rohde, Mitchell M.; Crawford, Justin; Toschlog, Matthew; Iagnemma, Karl D.; Kewlani, Guarav; Cummins, Christopher L.; Jones, Randolph A.; Horner, David A.

    2009-05-01

    It is widely recognized that simulation is pivotal to vehicle development, whether manned or unmanned. There are few dedicated choices, however, for those wishing to perform realistic, end-to-end simulations of unmanned ground vehicles (UGVs). The Virtual Autonomous Navigation Environment (VANE), under development by the US Army Engineer Research and Development Center (ERDC), provides such capabilities but utilizes a High Performance Computing (HPC) Computational Testbed (CTB) and is not intended for on-line, real-time performance. A product of the VANE HPC research is a real-time desktop simulation application under development by the authors that provides a portal into the HPC environment as well as interaction with wider-scope semi-automated force simulations (e.g. OneSAF). This VANE desktop application, dubbed the Autonomous Navigation Virtual Environment Laboratory (ANVEL), enables analysis and testing of autonomous vehicle dynamics and terrain/obstacle interaction in real time, with the capability to interact within the HPC constructive geo-environmental CTB for high fidelity sensor evaluations. ANVEL leverages rigorous physics-based vehicle and vehicle-terrain interaction models in conjunction with high-quality, multimedia visualization techniques to form an intuitive, accurate engineering tool. The system provides an adaptable and customizable simulation platform that allows developers a controlled, repeatable testbed for advanced simulations. ANVEL leverages several key technologies not common to traditional engineering simulators, including techniques from the commercial video-game industry. These enable ANVEL to run on inexpensive commercial off-the-shelf (COTS) hardware. In this paper, the authors describe key aspects of ANVEL and its development, as well as several initial applications of the system.

  8. Analysis of helium-ion scattering with a desktop computer

    NASA Astrophysics Data System (ADS)

    Butler, J. W.

    1986-04-01

    This paper describes a program written in an enhanced BASIC language for a desktop computer, for simulating the energy spectra of high-energy helium ions scattered into two concurrent detectors (backward and glancing). The program is designed for 512-channel spectra from samples containing up to 8 elements and 55 user-defined layers. The program is intended to meet the needs of analyses in materials sciences, such as metallurgy, where more than a few elements may be present, where several elements may be near each other in the periodic table, and where relatively deep structure may be important. These conditions preclude the use of completely automatic procedures for obtaining the sample composition directly from the scattered ion spectrum. Therefore, efficient methods are needed for entering and editing large amounts of composition data, with many iterations and with much feedback of information from the computer to the user. The internal video screen is used exclusively for verbal and numeric communications between user and computer. The composition matrix is edited on screen with a two-dimensional forms-fill-in text editor and with many automatic procedures, such as doubling the number of layers with appropriate interpolations and extrapolations. The control center of the program is a bank of 10 keys that initiate on-event branching of program flow. The experimental and calculated spectra, including those of individual elements if desired, are displayed on an external color monitor, with an optional inset plot of the depth concentration profiles of the elements in the sample.
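
    The starting point for simulating such spectra is the elastic-scattering kinematic factor, which fixes the energy of a helium ion backscattered from each element in the sample. A minimal sketch follows; the 2 MeV beam energy and 170-degree detector angle are assumed for illustration, since the paper's exact geometry is not given here:

```python
import math

def kinematic_factor(m1, m2, theta_deg):
    """Elastic (Rutherford backscattering) kinematic factor K = E1/E0
    for a projectile of mass m1 scattered through angle theta by a
    target atom of mass m2 (masses in atomic mass units)."""
    th = math.radians(theta_deg)
    root = math.sqrt(m2**2 - (m1 * math.sin(th))**2)
    return ((m1 * math.cos(th) + root) / (m1 + m2))**2

# He-4 backscattered from silicon at a 170-degree detector angle:
E0 = 2.0                                      # incident energy, MeV (assumed)
K = kinematic_factor(4.0026, 27.977, 170.0)   # ~0.565 for Si
E1 = K * E0                                   # energy of the detected ion, MeV
```

    Because K grows with target mass, neighboring elements in the periodic table produce closely spaced spectral edges, which is why the paper's interactive editing approach is needed rather than fully automatic deconvolution.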

  9. Reducing nurses' workload using a computerized nursing support system linked to the hospital information system.

    PubMed

    Ito, C; Satoh, I; Michiya, H; Kitayama, Y; Miyazaki, K; Ota, S; Satoh, H; Sakurai, T; Shirato, H; Miyasaka, K

    1997-01-01

    A computerised nursing support system (CNSS) linked to the hospital information system (HIS) was developed and has been in use for one year, in order to reduce the workload of nurses. The CNSS consists of: (1) a hand-held computer for each nurse; (2) desk-top computers in the nurses' station and doctors' rooms; (3) a data server; and (4) an interface with the main hospital information system. Nurses enter vital signs, food intake and other information about the patients into the hand-held computer at the bedside. The information is then sent automatically to the CNSS data server, which also receives patients' details (prescribed medicines etc.) from the HIS. Nurses and doctors can see all the information on the desk-top and hand-held computers. This system was introduced in May 1995 into a university hospital ward with 40 beds. A questionnaire was completed by 23 nurses before and after the introduction of the CNSS. The mean time required to post vital data was significantly reduced from 121 seconds to 54 seconds (p < 0.01). After three months, 30% of nurses felt the CNSS had reduced their workload, while 30% felt it had complicated their work; after five months, 70% noted a reduction and 0% reported that the CNSS had made their work more complex. The study therefore concludes that the interface between a computerised nursing support system and the hospital information system reduced the workload of nurses.

  10. Electronic patient data confidentiality practices among surgical trainees: questionnaire study.

    PubMed

    Mole, Damian J; Fox, Colin; Napolitano, Giulio

    2006-10-01

    The objective of this work was to evaluate the safeguards implemented by surgical trainees to protect the confidentiality of electronic patient data through a structured questionnaire sent to Northern Ireland surgical trainees. A group of 32 basic and higher surgical trainees attending a meeting of the Northern Ireland Association of Surgeons-in-Training were invited to complete a questionnaire regarding their computer use, UK Data Protection Act, 1998 registration and electronic data confidentiality practices. Of these 32 trainees, 29 returned completed questionnaires, of whom 26 regularly stored sensitive patient data for audit or research purposes on a computer. Only one person was registered under the Data Protection Act, 1998. Of the computers used to store and analyse sensitive data, only 3 of 14 desktops, 8 of 19 laptops and 3 of 14 hand-held computers forced a password logon. Of the 29 trainees, 16 used the same password for all machines, and 25 of 27 passwords were less than 8 characters long. Two respondents declined to reveal details of their secure passwords. Half of all trainees had never adjusted their internet security settings, despite all 14 desktops, 16 of 19 laptops and 5 of 14 hand-helds being routinely connected to the internet. Of the 29 trainees, 28 never encrypted their sensitive data files. Ten trainees had sent unencrypted sensitive patient data over the internet, using a non-secure server. Electronic data confidentiality practices amongst Northern Ireland surgical trainees are unsafe. Simple practical measures to safeguard confidentiality are recommended.

  11. Desktop Video Productions. ICEM Guidelines Publications No. 6.

    ERIC Educational Resources Information Center

    Taufour, P. A.

    Desktop video consists of integrating the processing of the video signal in a microcomputer. This definition implies that desktop video can take multiple forms such as virtual editing or digital video. Desktop video, which does not imply any particular technology, has been approached in different ways in different technical fields. It remains a…

  12. Desktop Publishing: Organizational Considerations for Adoption and Implementation. TDC Research Report No. 6.

    ERIC Educational Resources Information Center

    Lee, Paul

    This report explores the implementation of desktop publishing in the Minnesota Extension Service (MES) and provides a framework for its implementation in other organizations. The document begins with historical background on the development of desktop publishing. Criteria for deciding whether to purchase a desktop publishing system, advantages and…

  13. Prevalence of Invalid Computerized Baseline Neurocognitive Test Results in High School and Collegiate Athletes

    PubMed Central

    Schatz, Philip; Moser, Rosemarie Scolaro; Solomon, Gary S.; Ott, Summer D.; Karpf, Robin

    2012-01-01

    Context: Limited data are available regarding the prevalence and nature of invalid computerized baseline neurocognitive test data. Objective: To identify the prevalence of invalid baselines on the desktop and online versions of ImPACT and to document the utility of correcting for left-right (L-R) confusion on the desktop version of ImPACT. Design: Cross-sectional study of independent samples of high school (HS) and collegiate athletes who completed the desktop or online versions of ImPACT. Patients or Other Participants: A total of 3769 HS (desktop = 1617, online = 2152) and 2130 collegiate (desktop = 742, online = 1388) athletes completed preseason baseline assessments. Main Outcome Measure(s): Prevalence of 5 ImPACT validity indicators, with correction for L-R confusion (reversing left and right mouse-click responses) on the desktop version, by test version and group. Chi-square analyses were conducted for sex and attentional or learning disorders. Results: At least 1 invalid indicator was present on 11.9% (desktop) versus 6.3% (online) of the HS baselines and 10.2% (desktop) versus 4.1% (online) of collegiate baselines; correcting for L-R confusion (desktop) decreased this overall prevalence to 8.4% (HS) and 7.5% (collegiate). Online Impulse Control scores alone yielded 0.4% (HS) and 0.9% (collegiate) invalid baselines, compared with 9.0% (HS) and 5.4% (collegiate) on the desktop version; correcting for L-R confusion (desktop) decreased the prevalence of invalid Impulse Control scores to 5.4% (HS) and 2.6% (collegiate). Male athletes and HS athletes with attention deficit or learning disorders who took the online version were more likely to have at least 1 invalid indicator. The utility of additional invalidity indicators is reported. Conclusions: The online ImPACT version appeared to yield fewer invalid baseline results than did the desktop version. Identification of L-R confusion reduces the prevalence of invalid baselines (desktop only) and the potency of Impulse Control as a validity indicator. We advise test administrators to be vigilant in identifying invalid baseline results as part of routine concussion management and prevention programs. PMID:22892410
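
    The L-R confusion correction described above (reversing left and right mouse-click responses before re-scoring) can be sketched as follows. The response encoding and the 0.7 accuracy cutoff are illustrative assumptions, not ImPACT's actual validity algorithm:

```python
def correct_lr(responses):
    """Reverse left/right mouse-click responses."""
    swap = {"L": "R", "R": "L"}
    return [swap.get(r, r) for r in responses]

def accuracy(responses, answer_key):
    """Proportion of responses matching the answer key."""
    return sum(r == k for r, k in zip(responses, answer_key)) / len(answer_key)

def is_valid(responses, answer_key, cutoff=0.7):
    """Flag a baseline as valid if raw accuracy meets the cutoff, or if
    it does after correcting for L-R confusion (cutoff is assumed)."""
    if accuracy(responses, answer_key) >= cutoff:
        return True
    return accuracy(correct_lr(responses), answer_key) >= cutoff

key = ["L", "R", "R", "L", "R"]
confused = ["R", "L", "L", "R", "L"]   # perfectly L-R reversed responses
# accuracy(confused, key) is 0.0, yet is_valid(confused, key) is True
```

    An examinee who systematically reversed the mouse buttons scores near zero raw accuracy but near perfect once corrected, which is why the correction lowers the apparent prevalence of invalid desktop baselines.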

  14. Semantic Document Model to Enhance Data and Knowledge Interoperability

    NASA Astrophysics Data System (ADS)

    Nešić, Saša

    To enable document data and knowledge to be efficiently shared and reused across application, enterprise, and community boundaries, desktop documents should be completely open and queryable resources, whose data and knowledge are represented in a form understandable to both humans and machines. At the same time, these are the requirements that desktop documents need to satisfy in order to contribute to the visions of the Semantic Web. To achieve this goal, we have developed the Semantic Document Model (SDM), which turns desktop documents into Semantic Documents: uniquely identified and semantically annotated composite resources that can be instantiated into human-readable (HR) and machine-processable (MP) forms. In this paper, we present the SDM along with an RDF and ontology-based solution for the MP document instance. Moreover, on top of the proposed model, we have built the Semantic Document Management System (SDMS), which provides a set of services that exploit the model. As an application example that takes advantage of SDMS services, we have extended MS Office with a set of tools that enables users to transform MS Office documents (e.g., MS Word and MS PowerPoint) into Semantic Documents, and to search local and remote semantic document repositories for document content units (CUs) over Semantic Web protocols.
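
    The model's core idea, uniquely identified content units carrying machine-processable annotations, can be illustrated with a minimal sketch. This is plain Python with made-up URIs and predicate names (the `sdm:` and `dbpedia:` terms below are hypothetical), not the actual SDM/SDMS API or its RDF vocabulary:

```python
DOC = "urn:doc:example-report"   # hypothetical document URI

# RDF-style (subject, predicate, object) triples describing a document
# and two of its content units (CUs).
triples = [
    (DOC, "rdf:type", "sdm:SemanticDocument"),
    (f"{DOC}#cu1", "sdm:partOf", DOC),
    (f"{DOC}#cu1", "sdm:annotatedWith", "dbpedia:Hydrology"),
    (f"{DOC}#cu2", "sdm:partOf", DOC),
    (f"{DOC}#cu2", "sdm:annotatedWith", "dbpedia:Snowmelt"),
]

def query(store, predicate=None, obj=None):
    """Return subjects matching an optional predicate/object pattern."""
    return [s for s, p, o in store
            if (predicate is None or p == predicate)
            and (obj is None or o == obj)]

# find all CUs annotated with a given concept
hits = query(triples, "sdm:annotatedWith", "dbpedia:Hydrology")
```

    In the real system these triples would live in an RDF store and be queried over Semantic Web protocols (e.g. SPARQL) rather than by list comprehension.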

  15. Effects of mouse slant and desktop position on muscular and postural stresses, subject preference and performance in women aged 18-40 years.

    PubMed

    Gaudez, Clarisse; Cail, François

    2016-11-01

    This study compared muscular and postural stresses, performance and subject preference in women aged 18-40 years using a standard mouse, a vertical mouse and a slanted mouse in three different computer workstation positions. Four tasks were analysed: pointing, pointing-clicking, pointing-clicking-dragging and grasping-pointing the mouse after typing. Flexor digitorum superficialis (FDS) and extensor carpi radialis (ECR) activities were greater using the standard mouse compared to the vertical or slanted mouse. In all cases, the wrist position remained in the comfort zone recommended by standard ISO 11228-3. The vertical mouse was less comfortable and more difficult to use than the other two mice. FDS and ECR activities, shoulder abduction and wrist extension were greater when the mouse was placed next to the keyboard. Performance and subject preference were better with the unrestricted mouse positioning on the desktop. Grasping the mouse after typing was the task that caused the greatest stress. Practitioner Summary: In women, the slanted mouse and the unrestricted mouse positioning on the desktop provide a good blend of stresses, performance and preference. Unrestricted mouse positioning requires no keyboard, which is rare in practice. Placing the mouse in front of the keyboard, rather than next to it, reduced the physical load.

  16. Cloud-Based Tools to Support High-Resolution Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Swain, N.; Christensen, S.

    2013-12-01

    The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of the roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.
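
    The fan-out pattern described for stochastic simulations, dispatching many independent model runs to distributed workers, can be sketched as follows. Here Python's standard-library `concurrent.futures` stands in for HTCondor, and `run_model` is a hypothetical placeholder, not CI-WATER code:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run_model(seed):
    """Placeholder for one stochastic model realization (a Monte Carlo
    draw); a real deployment would submit an actual model executable
    to the scheduler with this seed as a parameter."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(1000)) / 1000  # mean near 0.5

def ensemble(n_runs, max_workers=4):
    """Fan n_runs independent realizations out to a worker pool and
    gather the results, as a scheduler would gather job outputs."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(run_model, range(n_runs)))

results = ensemble(8)
```

    With HTCondor, each `run_model` call would instead become a submitted job, and the gather step would collect job output files rather than return values; the seeded-run, scatter-gather structure is the same.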

  17. An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung

    2011-01-01

    In this paper, we describe an approach to integrating a Space-Time GIS data model on a high performance computing platform. The Space-Time GIS data model was developed in a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the module handles large datasets directly via a parallel file system. Although this is an ongoing project, the authors hope this effort can inspire further discussions on the integration of GIS on high performance computing platforms.

  18. Software Workshop.

    ERIC Educational Resources Information Center

    Lazerick, Beth

    1990-01-01

    This article describes desktop publishing and discusses the features and classroom uses of one of the newest desktop publishing programs. Several desktop publishing projects for teachers and students are suggested. (IAH)

  19. Time Series Data Visualization in World Wide Telescope

    NASA Astrophysics Data System (ADS)

    Fay, J.

    WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of both desktop tools for interactive, immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo displays and virtual reality headsets.

  20. An analysis of running skyline load path.

    Treesearch

    Ward W. Carson; Charles N. Mann

    1971-01-01

    This paper is intended for those who wish to prepare an algorithm to determine the load path of a running skyline. The mathematics of a simplified approach to this running skyline design problem are presented. The approach employs assumptions which reduce the complexity of the problem to the point where it can be solved on desk-top computers of limited capacities. The...
