1987-06-01
International Business Machines (IBM) Corporation compatible synchronous terminals (2780/3780/327X), and the Federal Data Corporation (FDC) has developed...the interfaces for Burroughs look-alike asynchronous and synchronous terminals. Basically, this means that the IBM and Burroughs protocols are...and other vendor computers, such as IBM, UNIVAC, and Honeywell. The Navy has developed file transfer capabilities between Tandem and Burroughs. These
International Futures (IFs): A Global Issues Simulation for Teaching and Research.
ERIC Educational Resources Information Center
Hughes, Barry B.
This paper describes the International Futures (IFs) computer-assisted simulation game for use with undergraduates. Written in Standard Fortran IV, the model currently runs on mainframe or minicomputers, but has not been adapted for micros. It has been successfully installed on Harris, Burroughs, Telefunken, CDC, Univac, IBM, and Prime machines.…
The Mathematics and Computer Science Learning Center (MLC).
ERIC Educational Resources Information Center
Abraham, Solomon T.
The Mathematics and Computer Science Learning Center (MLC) was established in the Department of Mathematics at North Carolina Central University during the fall semester of the 1982-83 academic year. The initial operations of the MLC were supported by grants to the University from the Burroughs-Wellcome Company and the Kenan Charitable Trust Fund.…
A Corrupt Medium: Stephen Burroughs and the Bridgehampton, New York, Library.
ERIC Educational Resources Information Center
Ashton, Susanna
2003-01-01
Discusses criminal Stephen Burroughs' "The Memoirs of Stephen Burroughs", a well-known rogue narrative of the 19th century, and his campaign to establish a library in Bridgehampton, New York. Topics include rationalism; the role of reading; the growth of libraries following the American Revolution; and the role of individual…
An Automated Circulation System for a Small Technical Library.
ERIC Educational Resources Information Center
Culnan, Mary J.
The traditional manually-controlled circulation records of the Burroughs Corporation Library in Goleta, California, presented problems of inaccuracies, time-consuming searches, and lack of use statistics. An automated system with the capacity to do file maintenance and statistical record-keeping was implemented on a Burroughs B1700 computer.…
VIEW NORTH, SHOWING ORIGINAL NAMEPLATE ON SOUTH END OF SOUTHEAST WINGWALL - Belleville Road Bridge, Spanning the Flat River at Burroughs Drive (replaced Burroughs Street Bridge), Lowell, Kent County, MI
VIEW NORTH, SHOWING SOUTH END OF BRIDGE LOOKING TOWARD VAN BUREN TOWNSHIP - Belleville Road Bridge, Spanning the Flat River at Burroughs Drive (replaced Burroughs Street Bridge), Lowell, Kent County, MI
A Management System for Computer Performance Evaluation.
1981-12-01
AD-A115 538, Air Force Inst of Tech, Wright-Patterson AFB OH. A Management System for Computer Performance Evaluation, Dec 81...release; distribution unlimited. PREFACE: As an installation manager of a Burroughs 3500 I encountered many problems concerning its...techniques to select, and finally, how do I organize the effort. As a manager I felt that I needed a reference or tool that would broaden my CPE
A translational registration system for LANDSAT image segments
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Erthal, G. J.; Velasco, F. R. D.; Mascarenhas, N. D. D.
1983-01-01
The use of satellite images obtained from various dates is essential for crop forecast systems. To make a multitemporal analysis possible, images from each acquisition must have pixel-wise correspondence. A system developed to obtain, register and record image segments from LANDSAT images on computer compatible tapes is described. The translational registration of the segments is performed by correlating image edges in different acquisitions. The system was constructed for the Burroughs B6800 computer in the ALGOL language.
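For readers unfamiliar with this class of method, the following is a minimal sketch of translational registration by correlating edge images. It is written with NumPy/SciPy in Python rather than the original ALGOL/B6800 system, and the Sobel edge detector, FFT-based correlation, and function names are illustrative assumptions, not details taken from the described implementation.

```python
# Hypothetical sketch: translational registration by correlating image edges.
# Not the original ALGOL/B6800 system; edge detector and correlation method
# are illustrative choices.
import numpy as np
from scipy import ndimage

def edge_map(image):
    """Gradient-magnitude edge image via Sobel filters."""
    gx = ndimage.sobel(image.astype(float), axis=1)
    gy = ndimage.sobel(image.astype(float), axis=0)
    return np.hypot(gx, gy)

def translational_shift(reference, target):
    """Return the (row, col) translation to apply to `target` so that it
    registers onto `reference`, found at the peak of the circular
    cross-correlation of the two edge maps."""
    a = edge_map(reference)
    b = edge_map(target)
    a -= a.mean()
    b -= b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Convert wrap-around peak indices to signed shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```

In practice the segments from two acquisitions would be passed as 2-D arrays; the sketch ignores wrap-around effects and sub-pixel refinement.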
How to Write About a Bumblebee--John Burroughs.
ERIC Educational Resources Information Center
Stock, Tom
1981-01-01
The personal comments of literary naturalist, John Burroughs, whose 60-year writing career began in the 1860s, can guide writing teachers today. Recommended techniques include literary walks, a fermentation process between contact with nature and writing about it, emphasis on clarity and truth, and keeping a journal. (NEC)
A Review of Software Maintenance Technology.
1980-02-01
[Table fragment listing vendors and attributes, including Honeywell, Burroughs, IBM, and Boole and Babbage, with language, reliability, and measurement columns]...Maintenance Experience: (1) Multiple Implementation. Charles Holmes (Source 2) described two attempts at McDonnell...a proprietary software monitor package distributed by Boole and Babbage, Inc., Sunnyvale, California. It has been implemented on IBM computers and is language
Can dryland geoproxy data generate Quaternary palaeoclimate and palaeoenvironmental records?
NASA Astrophysics Data System (ADS)
Thomas, David S. G.
2017-04-01
Dryland regions present many challenges for robustly reconstructing late Quaternary palaeoenvironments and palaeoclimates, not least a common deficit, or considerable spatial variability, in the availability of high resolution biological proxy data sources. Substantial advances have been made in some regions in recent years, through the exploitation of new high resolution biomarker and isotope records, for example from hyrax middens (e.g. Chase et al., 2012) and from offshore sediments (e.g. Collins et al., 2014). In other regions, however, suitable data sources for these approaches are absent, so the approaches either cannot be applied or, if data are drawn from distant sources, carry risks of excessive spatial extrapolation of records in contexts where environmental gradients are steep and variability is common (Thomas and Burrough, 2012; Thomas et al., 2013). In these contexts, geoproxy records, derived from the analysis of landforms and their associated sediments, are often utilised in dryland Quaternary research (e.g. Burrough and Thomas, 2009; Stone and Thomas, 2013; Thomas, 2013; Lancaster et al., 2015), but with a number of associated difficulties (e.g. Chase, 2009). This paper examines these difficulties and then explores different approaches to the analysis of Quaternary landform records. It is argued that geoproxies with chronometric control, usually provided by OSL dating, have considerable potential to improve data on Quaternary environmental and climate dynamics, if records are interpreted effectively and appropriately (e.g. Bailey and Thomas, 2014; Thomas and Burrough, 2016). Examples of challenges and new approaches will be drawn from aeolian and fluvial domains, and from research in Africa, Australia and Arabia. Bailey RM, Thomas DSG 2014 Earth Surf. Proc. Landf. 39, 614-631. Burrough SL, Thomas DSG 2009 Geomorphology 103, 285-298. Chase B 2009 Earth-Sci. Rev. 93, 31-45. Chase BM et al. 2010 Quat. Sci. Rev. 56, 107-125. Collins JA et al. 2014 Earth Planet. Sci. Lett. 398, 1-10. Lancaster N et al. 2016 Quat. Int. 410, 3-10. Stone AEC, Thomas DSG 2013 J. Arid Env. 93, 40-58. Thomas DSG 2013 Earth Surf. Proc. Landf. 38, 3-16. Thomas DSG, Burrough SL 2012 Quat. Int. 253, 5-17. Thomas DSG et al. 2012 J. Quat. Sci. 27, 7-12. Thomas DSG, Burrough SL 2016 Quat. Int. 410, 30-45.
1986-03-01
and universal terminal/printer interface mapping (TMAP) software. When the Burroughs HYPERchannel software package (i.e., Burroughs NETEX) provided...and terminal device and security functions placed under the control of the FDC's SAS/TMAP processes. Without processing efficiency enhancements, TAPS...FDC's SAS/TMAP processes. As was also previously indicated, the performance of TAPS II on TANDEM is poor today, and there are questions as to whether
1979-08-01
WHAT YOU DO IN YOUR PRESENT JOB. THIS IS NOT A TEST. NEITHER YOU, YOUR COMMANDER, NOR YOUR UNIT WILL BE EVALUATED, IN ANY WAY, ON THE INFORMATION YOU...ACCOUNTING MACHINE (PCAM) OPERATOR 336. EAM/PCAM SUPERVISOR 231. EAM/PCAM WIRING TECHNICIAN 338. EQUIPMENT MANAGEMENT TECHNICIAN 339. EVALUATION AND...12. BUNKER RAMO... 13. BUNKER RAMO... 14. BUNKER RAMO 1563 15. BURROUGHS 263 16. BURROUGHS B... 17. BURROUGHS B35... 18. BURROUGHS 847
Does the Weather Really Matter?
NASA Astrophysics Data System (ADS)
Burroughs, William James
1997-09-01
We talk about it endlessly, write about it copiously, and predict it badly. It influences what we do, what we wear, and how we live. Weather--how does it really impact our lives? In this compelling look at weather, author Burroughs combines historical perspective and economic and political analysis to give the impact of weather and climate change relevance and weight. He examines whether the frequency of extreme events is changing and the consequences of these changes. He looks at the chaotic nature of the climate and how this unpredictability can impose serious limits on how we plan for the future. Finally, he poses the important question: what types of serious, even less predictable changes are around the corner? In balanced and accessible prose, Burroughs works these issues into lucid analysis. This refreshing and insightful look at the impact of weather will appeal to anyone who has ever worried about forgetting an umbrella. William James Burroughs is the author of Watching the World's Weather (CUP, 1991) and Weather Cycles: Real or Imaginary? (CUP, 1994).
Spiegel, A D; Spiegel, M S
1991-01-01
In July 1865, the Harris/Burroughs trial marked the first time in a U.S. courtroom that expert medical testimony supported a plea of paroxysmal [temporary] insanity in a murder defense. Furthermore, the "medical expert" ["mad doctor"] was pitted against "common-sense" physicians. Forensic rationales and societal reactions of the 1860s appear to be remarkably similar to what happens in the 1990s. By merely changing the antebellum language, the arguments and ripostes could readily be recycled into current temporary insanity confrontations. Sociocultural aspects of the Harris/Burroughs murder case may yield clues as to the persistence of the forensic and attitudinal stances toward temporary insanity pleas by the mass media, the physicians, the legal profession and the public.
A translator and simulator for the Burroughs D machine
NASA Technical Reports Server (NTRS)
Roberts, J.
1972-01-01
The D Machine is described as a small user microprogrammable computer designed to be a versatile building block for such diverse functions as: disk file controllers, I/O controllers, and emulators. TRANSLANG is an ALGOL-like language, which allows D Machine users to write microprograms in an English-like format as opposed to creating binary bit pattern maps. The TRANSLANG translator parses TRANSLANG programs into D Machine microinstruction bit patterns which can be executed on the D Machine simulator. In addition to simulation and translation, the two programs also offer several debugging tools, such as: a full set of diagnostic error messages, register dumps, simulated memory dumps, traces on instructions and groups of instructions, and breakpoints.
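As an illustration of the kind of mapping a TRANSLANG-like translator performs, here is a minimal toy micro-assembler sketch. The field layout, widths, register names, mnemonics, and opcodes are invented for the example and do not reflect the actual D Machine microinstruction format or the real TRANSLANG syntax.

```python
# Toy micro-assembler sketch: turns mnemonic micro-operations into fixed-width
# bit patterns, loosely in the spirit of a TRANSLANG-style translator.
# Field layout, opcodes, and register codes below are hypothetical.
FIELDS = [("op", 4), ("src", 3), ("dst", 3), ("literal", 6)]  # 16-bit word
OPCODES = {"MOVE": 0b0001, "ADD": 0b0010, "JUMP": 0b0011}
REGISTERS = {"A": 0b001, "B": 0b010, "MAR": 0b011}

def assemble(op, src="A", dst="A", literal=0):
    """Pack one micro-operation into a 16-bit microinstruction word."""
    values = {"op": OPCODES[op], "src": REGISTERS[src],
              "dst": REGISTERS[dst], "literal": literal}
    word, pos = 0, 16
    for name, width in FIELDS:
        pos -= width
        assert 0 <= values[name] < (1 << width), f"{name} out of range"
        word |= values[name] << pos
    return word

# Example: the bit pattern for an "ADD A TO B" style statement.
print(format(assemble("ADD", src="A", dst="B"), "016b"))
```

A simulator in the same spirit would decode these fields back out of the word and update a small register model, which is essentially what allows tracing and breakpoints to be layered on top.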
"All in the Day's Work": Cold War Doctoring and Its Discontents in William Burroughs's Naked Lunch.
Jarvis, Michael
In Naked Lunch, the institutions and practices of science and medicine, specifically with regard to psychiatry/psychology, are symptoms of a bureaucratic system of control that shapes, constructs, defines, and makes procrustean alterations to both the mind and body of human subjects. Using sickness and junk (or heroin) as convenient metaphors for both a Cold War binary mentality and the mandatory consumption of twentieth-century capitalism, Burroughs presents modern man as fundamentally alienated from any sense of a personal self. Through policing the health of citizens, the doctors are some of the novel's most overt "Senders," or agents of capital-C Control, commodifying and exploiting the individual's humanity (mind and body) as a raw material in the generation of a knowledge that functions only in the legitimation and reinforcement of itself as authoritative.
Potential impacts of robust surface roughness indexes on DTM-based segmentation
NASA Astrophysics Data System (ADS)
Trevisani, Sebastiano; Rocca, Michele
2017-04-01
In this study, we explore the impact of robust surface texture indexes based on MAD (median absolute differences), implemented by Trevisani and Rocca (2015), on the unsupervised morphological segmentation of an alpine basin. The area was already the object of a geomorphometric analysis, consisting of a roughness-based segmentation of the landscape (Trevisani et al., 2012); the roughness indexes were calculated on a high resolution DTM derived by means of airborne Lidar, using the variogram as estimator. The calculated roughness indexes were then used for the fuzzy clustering (Odeh et al., 1992; Burrough et al., 2000) of the basin, revealing the high informative geomorphometric content of the roughness-based indexes. However, the fuzzy clustering revealed a high fuzziness and a high degree of mixing between textural classes; this was ascribed both to the morphological complexity of the basin and to the high sensitivity of the variogram to non-stationarity and signal noise. Accordingly, we explore how the newly implemented MAD-based roughness indexes affect the morphological segmentation of the studied basin. References Burrough, P.A., Van Gaans, P.F.M., MacMillan, R.A., 2000. High-resolution landform classification using fuzzy k-means. Fuzzy Sets and Systems 113, 37-52. Odeh, I.O.A., McBratney, A.B., Chittleborough, D.J., 1992. Soil pattern recognition with fuzzy-c-means: application to classification and soil-landform interrelationships. Soil Science Society of America Journal 56, 505-516. Trevisani, S., Cavalli, M. & Marchi, L. 2012, "Surface texture analysis of a high-resolution DTM: Interpreting an alpine basin", Geomorphology, vol. 161-162, pp. 26-39. Trevisani, S. & Rocca, M. 2015, "MAD: Robust image texture analysis for applications in high resolution geomorphometry", Computers and Geosciences, vol. 81, pp. 78-92.
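To make the MAD idea concrete, the following is a minimal sketch of a MAD-based roughness surface computed on a gridded DTM. The 5 x 5 window, unit lag along rows, and NumPy/SciPy implementation are illustrative assumptions, not the Trevisani and Rocca (2015) code.

```python
# Minimal sketch: MAD (median absolute differences) roughness of a DTM.
# Window size, lag, and boundary handling are illustrative, not the
# published MAD implementation.
import numpy as np
from scipy.ndimage import generic_filter

def mad_of_differences(window):
    """Median of absolute unit-lag elevation differences in a 5 x 5 patch."""
    w = window.reshape(5, 5)
    d = np.diff(w, axis=1).ravel()   # unit-lag differences along rows
    return np.median(np.abs(d))

def mad_roughness(dtm):
    """Return a roughness surface with the same shape as the input DTM."""
    return generic_filter(dtm.astype(float), mad_of_differences,
                          size=5, mode="nearest")

# Example: noisier synthetic terrain yields larger index values.
rng = np.random.default_rng(0)
smooth = np.cumsum(np.ones((50, 50)), axis=1)            # planar ramp
rough = smooth + rng.normal(scale=2.0, size=(50, 50))    # noisy ramp
print(mad_roughness(smooth).mean(), mad_roughness(rough).mean())
```

A per-cell roughness surface like this is the kind of layer that can then be fed, alongside other texture indexes, into fuzzy k-means clustering for the segmentation step.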
Partnering Research Involving Mentoring and Education (PRIME) in Prostate Cancer
2007-02-01
Nurses Symposium on Cancer in African Americans, Atlanta. 7. Price, M.M. (1994, October 28-30). “Living with Genital Herpes: Counseling the...Patient”, Paper presented and Seminar Moderator for the Burroughs Wellcome Pharmaceutical Corporation Nursing Conference on Genital Herpes, Research
Collaboration Around Research and Education (CARE) in Prostate Cancer
2010-02-01
the Oncology Nurses Symposium on Cancer in African Americans, Atlanta, GA. 7. Price, M.M. (1994, October 28-30). “Living with Genital Herpes...Counseling the Patient”. Paper presented and Seminar Moderator for the Burroughs Wellcome Pharmaceutical Corporation Nursing Conference on Genital Herpes
Collaboration around Research and Education (Care) in Prostate Cancer
2008-02-01
M.M. (1994, October 28-30). “Living with Genital Herpes: Counseling the Patient”, Paper presented and Seminar Moderator for the Burroughs Wellcome...Pharmaceutical Corporation Nursing Conference on Genital Herpes, Research Triangle Park, NC. 8. Price, M.M. (1995, April, Miami; 1995, March
Marketing of patent medicines in the nineteenth century via a corkscrew medicine spoon.
Fincham, Jack E
2010-01-01
The C. T. Williamson spoon, with manufactured products from a pharmaceutical company engraved on the bowl of the spoon, is one of the earliest examples of a manufacturer marketing products via a drug delivery device. Burroughs, Wellcome and Company, a British corporation, used first an American-patented and later a British-patented Williamson corkscrew spoon to market British-manufactured medicinal products in the U.S. and England to physicians and pharmacists in the late nineteenth and early twentieth century. Other corkscrew spoons were manufactured in this era without product-specific notations on the spoons. These corkscrew spoons, such as the Williamson and Noe patented apparatuses, helped patients more easily consume liquid medications. They were also items potentially favored by physicians and pharmacists for patients provided with liquid medications. Finally, they allowed patients to open corked containers, consume liquid dosage amounts, and hopefully comply more appropriately with necessary regimens in the late nineteenth and early twentieth century. Not surprisingly, Burroughs, Wellcome and Company used the Williamson spoon to successfully market company products to physicians, pharmacists, and patients on several continents.
Stockstill, K.R.; Vogel, T.A.; Sisson, T.W.
2002-01-01
Burroughs Mountain, situated at the northeast foot of Mount Rainier, WA, exposes a large-volume (3.4 km³) andesitic lava flow, up to 350 m thick and extending 11 km in length. Two sampling traverses from flow base to eroded top, over vertical sections of 245 and 300 m, show that the flow consists of a felsic lower unit (100 m thick) overlain sharply by a more mafic upper unit. The mafic upper unit is chemically zoned, becoming slightly more evolved upward; the lower unit is heterogeneous and unzoned. The lower unit is also more phenocryst-rich and locally contains inclusions of quenched basaltic andesite magma that are absent from the upper unit. Widespread, vuggy, gabbronorite-to-diorite inclusions may be fragments of shallow cumulates, exhumed from the Mount Rainier magmatic system. Chemically heterogeneous block-and-ash-flow deposits that conformably underlie the lava flow were the earliest products of the eruptive episode. The felsic-mafic-felsic progression in lava composition resulted from partial evacuation of a vertically-zoned magma reservoir, in which either (1) average depth of withdrawal increased, then decreased, during eruption, perhaps due to variations in effusion rate, or (2) magmatic recharge stimulated ascent of a plume that brought less evolved magma to shallow levels at an intermediate stage of the eruption. Pre-eruptive zonation resulted from combined crystallization-differentiation and intrusion(s) of less evolved magma into the partly crystallized resident magma body. The zoned lava flow at Burroughs Mountain shows that, at times, Mount Rainier's magmatic system has developed relatively large, shallow reservoirs that, despite complex recharge events, were capable of developing a felsic-upward compositional zonation similar to that inferred from large ash-flow sheets and other zoned lava flows. © 2002 Elsevier Science B.V. All rights reserved.
75 FR 79084 - Qualification of Drivers; Exemption Applications; Vision
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-17
....gpo.gov/2008/pdf/E8-785.pdf . FOR FURTHER INFORMATION CONTACT: Dr. Mary D. Gunnels, Director, Medical... are: Johnny Becerra John B. Ethridge Michael B. McClure Ross E. Burroughs Larry J. Folkerts Francis M. McMullin Lester W. Carter Paul W. Hunter Norman Mullins Christopher L. DePuy Ray P. Lenz David Triplett The...
ERIC Educational Resources Information Center
Murray, Alana D
2012-01-01
The purpose of this dissertation is to explore the development of the alternative black curriculum in social studies from 1890-1940. W.E.B. Du Bois and Carter G. Woodson worked in collaboration with women educators Nannie H. Burroughs and Anna Julia Cooper to create an alternative black curriculum that would support the intellectual growth of…
Partnering Research Involving Mentoring and Education (PRIME) in Prostate Cancer
2008-08-01
Genital Herpes, Research Triangle Park, NC. 8. Price, M.M. (1995, April, Miami; 1995, March, Washington, DC; & 1995, February, Philadelphia...Workshop during the Oncology Nurses Symposium on Cancer in African Americans, Atlanta. 7. Price, M.M. (1994, October 28-30). “Living with Genital...Herpes: Counseling the Patient”, Paper presented and Seminar Moderator for the Burroughs Wellcome Pharmaceutical Corporation Nursing Conference on
Structure, composition and thermal state of the crust in Brazil. [geomagnetic survey
NASA Technical Reports Server (NTRS)
Pacca, I. I. G. (Principal Investigator); Shukowsky, W.
1981-01-01
Efforts in support of a geomagnetic survey of the Brazilian area are described. Software to convert MAGSAT data tapes to the Burroughs/B-6700 binary format was developed and tested. A preliminary analysis of the first total intensity anomaly map was performed and methodologies for more intensive analysis were defined. The sources for correlative geological, aeromagnetic, and gravimetric data are described.
Translations on Telecommunications Policy, Research and Development, No. 59.
1978-11-14
14 COBRA Official Seeks CAPRE Criteria Applied Throughout Industry (JORNAL DO BRASIL, 4 Oct 78) 16 EMFA Supports Incentives for Digital...Terminals (O ESTADO DE SAO PAULO, 29 Sep 78) 22 CUBA Telephone Service Expansion in Holguin (JUVENTUD REBELDE, 22 Sep 78) 23 Briefs Direct...that next year it plans to be the second, overcoming Burroughs in invoicing. 12,116 CSO: 5500 17 BRAZIL EMFA SUPPORTS INCENTIVES FOR DIGITAL
Image Understanding Architecture Prototype Evaluation and Development
1993-06-01
combination of the two. There are examples where a hybrid of these approaches has been used (for example, Cedar [Gajski 83; 86], NETRA [Sharma 85], and PM4...interconnection network can be found in Cedar [Gajski 83; Gajski 86], Aquarius [Srini 85], the Los Alamos project [Trujillo 82], and the Burroughs Scientific...[Foster 76] C. C. Foster, Content Addressable Parallel Processors, Van Nostrand Reinhold Company, New York, 1976. [Gajski 83] Daniel Gajski et al
NASA Astrophysics Data System (ADS)
Lark, R. Murray
2014-05-01
Conventionally the uncertainty of a conventional soil map has been expressed in terms of the mean purity of its map units: the probability that the soil profile class examined at a site would be found to correspond to the eponymous class of the simple map unit that is delineated there (Burrough et al, 1971). This measure of uncertainty has an intuitive meaning and is used for quality control in soil survey contracts (Western, 1978). However, it may be of limited value to the manager or policy maker who wants to decide whether the map provides a basis for decision making, and whether the cost of producing a better map would be justified. In this study I extend a published analysis of the economic implications of uncertainty in a soil map (Giasson et al., 2000). A decision analysis was developed to assess the economic value of imperfect soil map information for agricultural land use planning. Random error matrices for the soil map units were then generated, subject to constraints which ensure consistency with fixed frequencies of the different soil classes. For each error matrix the mean map unit purity was computed, and the value of the implied imperfect soil information was computed by the decision analysis. An alternative measure of the uncertainty in a soil map was considered. This is the mean soil map information which is the difference between the information content of a soil observation, at a random location in the region, and the information content of a soil observation given that the map unit is known. I examined the relationship between the value of imperfect soil information and the purity and information measures of map uncertainty. In both cases there was considerable variation in the economic value of possible maps with fixed values of the uncertainty measure. However, the correlation was somewhat stronger with the information measure, and there was a clear upper bound on the value of an imperfect soil map when the mean information takes some particular value. This suggests that the information measure may be a useful one for general communication of the value of soil and similar thematic data. Burrough, P.A., Beckett, P.H.T., Jarvis, M.G., 1971. The relation between cost and utility in soil survey. J. Soil Sci. 22, 359-394. Giasson, E., van Es, C, van Wambeke, A., Bryant, R.B. 2000. Assessing the economic value of soil information using decision analysis techniques. Soil Science 165, 971-978 Western, S., 1978. Soil survey contracts and quality control. Oxford Univ. Press, Oxford.
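The two uncertainty measures compared above can be illustrated with a small numerical sketch. The 3 x 3 error matrix below is invented; mean purity is taken as the diagonal mass of the joint map-unit/soil-class matrix, and mean information as the mutual information between map unit and observed class.

```python
# Sketch: mean map-unit purity and mean information (mutual information)
# from a map-unit x soil-class error matrix of joint proportions.
# The 3 x 3 example matrix is invented for illustration.
import numpy as np

P = np.array([[0.30, 0.05, 0.05],   # rows: map units, cols: observed classes
              [0.04, 0.20, 0.06],
              [0.05, 0.05, 0.20]])
P = P / P.sum()                      # joint probabilities p(unit, class)

unit_marg = P.sum(axis=1)            # p(unit)
class_marg = P.sum(axis=0)           # p(class)

# Mean purity: probability that the observed class matches the map unit's
# eponymous class, i.e. the diagonal of the joint matrix.
mean_purity = np.trace(P)

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Mean information: entropy of the class distribution minus the mean
# conditional entropy given the map unit (the mutual information).
H_class = entropy(class_marg)
H_class_given_unit = sum(u * entropy(P[i] / u) for i, u in enumerate(unit_marg))
mean_information = H_class - H_class_given_unit

print(f"mean purity = {mean_purity:.2f}, "
      f"mean information = {mean_information:.2f} bits")
```

In the analysis described above, many random error matrices with fixed class frequencies would be generated and both measures compared against the economic value computed from the decision analysis.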
Career Benchmarks From the Burroughs Wellcome Fund's Early Faculty Career Development Awards.
McGovern, Victoria; Kramarik, Jean; Wilkins, Gary
2013-11-01
Documenting the career characteristics of a highly selective group of researchers provides some insight into how a successful career begins. This knowledge is of value to early-career faculty and those who evaluate them, as well as trainees who aspire to the professoriate and those who educate them. In 2010, the authors extracted information by hand from the curricula vitae of 196 basic scientists who have been supported by the Burroughs Wellcome Fund's early faculty career development programs from 1982 to 2010. Data were collected on awardees' education, awards and honors, funding, promotion, publication, service, and training activities. The end point for data was December 2010. Analyses quantified participants' time to terminal degree, faculty appointment, and first R01; determined their publication productivity; and calculated their rates of training graduate students and postdoctoral fellows. This group moved into jobs and gained first R01s faster than average. Surprisingly, those who train the most students and fellows do not publish the most. Women and men trained different numbers of undergraduates, PhDs, and postdocs. Women awardees had fewer publications on average than men. Researchers who are highly competitive at the early faculty career stage have generally been both timely in their arrival at important benchmarks and productive in terms of their scientific output. Newly trained researchers and the people and institutions that train them share responsibility for attaining expeditious progress, developing a substantial track record, and staking out fertile intellectual ground from which to grow an independent faculty career.
Medical supplies for the expeditions of the heroic age of Antarctic exploration: introduction.
Guly, H R
2012-06-01
During the heroic age of Antarctic exploration (1895-1922) there were at least 18 expeditions to the Antarctic lasting between 18 and 30 months. This is an introduction to a series of articles about the drugs taken and used in the Antarctic at this time. Most of the information relates to the expeditions of Robert Scott and Ernest Shackleton and the main supplier of medical equipment was Burroughs Wellcome and Co. This article also describes the medical cases that were taken to the Antarctic.
1982-12-01
recipe-menu cross reference list is dependent on the ability to sort various files. At the time the model was first implemented, a FORTRAN callable system...the printer. e. As mentioned in paragraph 2-2d(5), a FORTRAN callable system sort was not available at the time the model was first implemented, and...absence of a FORTRAN callable system sort at the time the menu planning model was placed on the Burroughs meant that most output was not displayed in
Climate Change: A Multidisciplinary Approach, Second Edition
NASA Astrophysics Data System (ADS)
Kirk-Davidoff, Daniel
2008-07-01
William Burroughs, who died in November 2007, was a wonderfully clear and evocative writer. Chapter 3 of his last work, Climate Change: A Multidisciplinary Approach, begins with the loveliest four-paragraph description of the general circulation of the Earth's atmosphere I have ever encountered. His writing also shines in his descriptions of the climate record of the past few thousand years, and in his introduction to the measurement of climate change. Unfortunately, the book is marred by inconsistencies in its treatment of climate dynamics, as well as by a number of idiosyncratic choices of emphasis that detract from the book's quality as a general introduction to the science of climate change.
Multi-cultural Aspects of Spatial Knowledge
NASA Astrophysics Data System (ADS)
Frank, Andrew U.
It is trivial to observe differences between cultures: people use different languages, have different modes of building houses and organize their cities differently, to mention only a few. Differences in the culture of different people were and still are one of the main reasons for travel to foreign countries. The question whether cultural differences are relevant for the construction of Geographic Information Systems is longstanding (Burrough et al. 1995) and is of increasing interest since geographic information is widely accessible using the web and users volunteer information to be included in the system (Goodchild 2007). The review of how the question of cultural differences was posed at different times reveals a great deal about the conceptualization of GIS at different times and makes a critical review interesting.
Recent Observations and Structural Analysis of Surge-Type Glaciers in the Glacier Bay Area
NASA Astrophysics Data System (ADS)
Mayer, H.; Herzfeld, U. C.
2003-12-01
The Chugach-St.-Elias Mountains in North America hold the largest non-polar connected glaciated area of the world. Most of its larger glaciers are surge-type glaciers. In the summer of 2003, we collected aerial photographic and GPS data over numerous glaciers in the eastern St. Elias Mountains, including the Glacier Bay area. Observed glaciers include Davidson, Casement, McBride, Riggs, Cushing, Carroll, Rendu, Tsirku, Grand Pacific, Melbern, Ferris, Margerie, Johns Hopkins, Lamplugh, Reid, Burroughs, Morse, Muir and Willard Glaciers, of which Carroll, Rendu, Ferris, Grand Pacific, Johns Hopkins and Margerie Glaciers are surge-type glaciers. Our approach utilizes a quantitative analysis of surface patterns, following the principles of structural geology for the analysis of brittle-deformation patterns (manifested in crevasses) and ductile deformation patterns (visible in folded moraines). First results will be presented.
Marooned on Mars: Mind-Spinning Books for Software Engineers
NASA Technical Reports Server (NTRS)
Clancey, William J.; Swanson, Keith (Technical Monitor)
1999-01-01
Dragonfly - NASA and the Crisis Aboard MIR (New York: HarperCollins Publishers), the story of the Russian-American misadventures on MIR. An expose with almost embarrassing detail about the inner-workings of Johnson Space Center in Houston, this book is best read with the JSC organization chart in hand. Here's the real world of engineering and life in extreme environments. It makes most other accounts of "requirements analysis" appear glib and simplistic. The book vividly portrays the sometimes harrowing experiences of the American astronauts in the web of Russian interpersonal relations and literally in the web of MIR's wiring. Burrough's exposition reveals how handling bureaucratic procedures and bulky facilities is as much a matter of moxie and goodwill as technical capability. Lessons from MIR showed NASA that getting to Mars required a different view of knowledge and improvisation-long-duration missions are not at all like the scripted and pre-engineered flights of Apollo or the Space Shuttle.
Doctors, lies and the addiction bureaucracy.
Dalrymple, Theodore
2008-04-01
Almost everything you know about heroin addiction is wrong. Not only is it wrong, but it is obviously wrong. Heroin is not highly addictive; withdrawal from it is not medically serious; addicts do not become criminals to feed their habit; addicts do not need any medical assistance to stop taking heroin; and contrary to received wisdom, heroin addiction most certainly is a moral or spiritual problem. A literary tradition dating back to De Quincey and Coleridge, and continuing up to the deeply sociopathic William Burroughs and beyond, has misled all Western societies for generations about the nature of heroin addiction. These writers' self-dramatizing and dishonest accounts of their own addiction have been accepted uncritically, and have been more influential by far in forming public attitudes than the whole of pharmacological science. As a result, a self-serving, self-perpetuating and completely useless medical bureaucracy has been set up to deal with the problem.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henderson, R.F.; Baughman, R.P.; Waide, J.J.
1995-12-01
The pathogenesis of ARDS is largely unknown, but many factors are known to predispose one to ARDS: sepsis, aspiration of gastric contents, pneumonia, fracture, multiple transfusions, cardiopulmonary bypass, burn, disseminated intravascular coagulation, pulmonary contusion, near drowning, and pancreatitis. ARDS is characterized by severe hypoxemia, diffuse pulmonary infiltrates, and decreased pulmonary compliance. Current treatment methods still result in 50% mortality. Studies are underway at the University of Cincinnati to determine if treatment with a synthetic pulmonary surfactant, Exosurf® (contains dipalmitoyl phosphatidyl choline; Burroughs-Wellcome), improves the prognosis of these patients. BALF from these patients, before and after treatment, was analyzed to determine if the treatment resulted in an increase in disaturated phospholipids (surfactant phospholipids) in the epithelial lining fluid and if the treatments reduced the concentration of markers of inflammation and toxicity in the BALF. This study indicates that the method of administering Exosurf® did not lead to an increase in surfactant lipid or protein in the bronchoalveolar region of the respiratory tract.
Scientists and K-12: Experience from The Science House
NASA Astrophysics Data System (ADS)
Haase, David G.
2003-03-01
In working with K-12 science and mathematics education, scientists may take on many different roles - from presenter to full-time partner. These roles are illustrated in the activities of The Science House, a K-12 education program of North Carolina State University, (www.science-house.org) which partners with teachers and students across the state to promote inquiry-based learning in mathematics and science. While it is important to involve scientists in K-12, most universities do not have effective means to make the connections. In our efforts to do so, which began with a few teacher workshops and now encompasses six offices across NC, we have sought to join the interests of the university (research, teaching, student recruiting) to the needs of K-12. Our programs now include teacher training workshops, student science camps and curriculum projects in several states. We are reminded that K-12 science education is interdisciplinary; local and political; and a process, not a problem to be solved and forgotten. Partially supported by NSF (CHE-9876674 and DBI-0115462), the Howard Hughes Medical Institute and the Burroughs Wellcome Fund.
Right Language to Release the River of Health in the Medical Industry.
Weeks, John
2015-11-01
Performance artist Laurie Anderson appropriated an idea from beat writer William Burroughs a few years back. Language, Anderson sings, is a virus.(1) The words we choose lock in ideas and discharge reverberations. They subtly evoke personal, professional, and societal power relationships. Language is a virus. By extension, changes of language can shift power relations. The removal of a conquest name of a former US president from the highest point in the northern part of the western hemisphere is a case in point. US President Barack Obama re-anointed Mt McKinley as Mt Denali, the name used by the indigenous people whose descendants still live in its presence.(2) The act replaced a European surname linked to cultural suppression and colonization with one that honors the first human inhabitants. The renaissance of indigenous medical practices is effecting a similar renaming of what many view to be the high point in the development of medicine. "Traditional medicine" has for decades been misused to describe biomedical and industrial practices that are less than a century old. Better to qualify this medicine with "conventional" or "bio-." Let "traditional medicine" indicate practices that carry the weight of history. These choices move toward right language.
[Tablets and tablet production - with special reference to Icelandic conditions].
Skaftason, Jóhannes F; Jóhannesson, Thorkell
2013-04-01
Modern tablet compression was instituted in England in 1844 by William Brockedon (1787-1854). The first tablets made according to Brockedon's procedures contained water-soluble salts and were most likely compressed without excipients. In the USA a watershed occurred around 1887, when starch (amylum maydis) was introduced to disperse tablets in an aqueous milieu in order to improve the bioavailability of drugs in the alimentary canal. About the same time great advances in tablet production were introduced by the British firm Burroughs Wellcome and Co. In Denmark, on the other hand, tablet production remained on a small scale until after 1920. As Icelandic pharmacies and drug firms modelled themselves mostly upon Danish firms, tablet production was first instituted in Iceland around 1930. The first tablet machines in Iceland were hand-driven. More efficient machines came after 1945. Around 1960 there were three sizeable tablet producers in Iceland; now there is only one. The number of individual tablet species (generic and proprietary) on the market rose from fewer than 10 in 1913 to 500 in 1965, with wide variations in between. Tablets have not displaced other medicinal forms for peroral use, but most new peroral drugs have been marketed in the form of tablets during the last decades.
Impulsive movements lead to high hops on sand
NASA Astrophysics Data System (ADS)
Aguilar, Jeffrey; Goldman, Daniel I.
2014-03-01
Various animals exhibit locomotive behaviors (like sprinting and hopping) involving transient bursts of actuation coupled to the ground through internal elastic elements. The performance of such maneuvers is subject to reaction forces on the feet from the environment. On substrates like dry granular media, the laws that govern these forces are not fully understood, and can vary with foot size and shape, material compaction (measured by the volume fraction ϕ) and intrusion kinematics. To gain insight into how such interactions affect jumps on granular media, we study the performance of an actuated spring mass robot. We compare performance between two jump strategies: a single-cycle sine-wave actuation (a ``single jump'') and this actuation preceded by an impulsive preload (a ``preload jump''). We vary ϕ for both strategies, and find that ϕ significantly affects performance: we observe a 200% increase in the single jump height with only a 5% increase in volume fraction using a 7.62 cm diameter flat foot. The preload jump outperforms the single jump height by 150% for all ϕ. We hypothesize that this increase in performance results from higher intrusion velocities and accelerations associated with the preload. NSF POLS CAREER, Burroughs Wellcome Fund, and ARO.
Statistical Mechanics of US Supreme Court
NASA Astrophysics Data System (ADS)
Lee, Edward; Broedersz, Chase; Bialek, William; Biophysics Theory Group Team
2014-03-01
We build simple models for the distribution of voting patterns in a group, using the Supreme Court of the United States as an example. The least structured, or maximum entropy, model that is consistent with the observed pairwise correlations among justices' votes is equivalent to an Ising spin glass. While all correlations (perhaps surprisingly) are positive, the effective pairwise interactions in the spin glass model have both signs, recovering some of our intuition that justices on opposite sides of the ideological spectrum should have a negative influence on one another. Despite the competing interactions, a strong tendency toward unanimity emerges from the model, and this agrees quantitatively with the data. The model shows that voting patterns are organized in a relatively simple ``energy landscape,'' correctly predicts the extent to which each justice is correlated with the majority, and gives us a measure of the influence that justices exert on one another. These results suggest that simple models, grounded in statistical physics, can capture essential features of collective decision making quantitatively, even in a complex political context. Funded by National Science Foundation Grants PHY-0957573 and CCF-0939370, WM Keck Foundation, Lewis-Sigler Fellowship, Burroughs Wellcome Fund, and Winston Foundation.
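A minimal sketch of the modelling approach, for illustration only: fitting a pairwise maximum-entropy (Ising) model to binary votes by exact enumeration and moment matching. The synthetic "votes", nine-member court size, learning rate, and iteration count are assumptions chosen to keep the example small; this is not the authors' procedure or data.

```python
# Sketch: pairwise maximum-entropy (Ising) fit to binary voting data by
# exact enumeration over all 2**9 patterns and moment matching.
# The random "votes" below are synthetic.
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
N = 9                                    # nine voters, 512 possible patterns
votes = rng.choice([-1, 1], size=(500, N), p=[0.3, 0.7])   # toy data

data_mean = votes.mean(axis=0)
data_corr = (votes.T @ votes) / len(votes)

states = np.array(list(product([-1, 1], repeat=N)))         # all patterns

def model_moments(h, J):
    """Means and pairwise correlations of P(s) proportional to
    exp(h.s + 0.5 * s.J.s)."""
    log_w = states @ h + np.einsum("si,ij,sj->s", states, J, states) / 2.0
    p = np.exp(log_w)
    p /= p.sum()
    mean = p @ states
    corr = np.einsum("s,si,sj->ij", p, states, states)
    return mean, corr

h = np.zeros(N)
J = np.zeros((N, N))
for step in range(2000):                 # gradient ascent on log-likelihood
    m_mean, m_corr = model_moments(h, J)
    h += 0.1 * (data_mean - m_mean)
    dJ = 0.1 * (data_corr - m_corr)
    np.fill_diagonal(dJ, 0.0)            # no self-couplings
    J = (J + dJ + (J + dJ).T) / 2.0      # keep couplings symmetric

print("max moment error:", max(np.abs(data_mean - m_mean).max(),
                               np.abs(data_corr - m_corr).max()))
```

For nine votes the partition function can be summed exactly, which is what keeps this toy fit simple; larger systems require sampling or approximate inference.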
Emission of sound from the mammalian inner ear
NASA Astrophysics Data System (ADS)
Reichenbach, Tobias; Stefanovic, Aleksandra; Nin, Fumiaki; Hudspeth, A. J.
2013-03-01
The mammalian inner ear, or cochlea, not only acts as a detector of sound but can also produce tones itself. These otoacoustic emissions are a striking manifestation of the mechanical active process that sensitizes the cochlea and sharpens its frequency discrimination. It remains uncertain how these signals propagate back to the middle ear, from which they are emitted as sound. Although reverse propagation might occur through waves on the cochlear basilar membrane, experiments suggest the existence of a second component in otoacoustic emissions. We have combined theoretical and experimental studies to show that mechanical signals can also be transmitted by waves on Reissner's membrane, a second elastic structure within the cochlea. We have developed a theoretical description of wave propagation on the parallel Reissner's and basilar membranes and its role in the emission of distortion products. By scanning laser interferometry we have measured traveling waves on Reissner's membrane in the gerbil, guinea pig, and chinchilla. The results accord with the theory and thus support a role for Reissner's membrane in otoacoustic emission. T. R. holds a Career Award at the Scientific Interface from the Burroughs Wellcome Fund; A. J. H. is an Investigator of Howard Hughes Medical Institute.
Phage-bacteria infection networks: From nestedness to modularity
NASA Astrophysics Data System (ADS)
Flores, Cesar O.; Valverde, Sergi; Weitz, Joshua S.
2013-03-01
Bacteriophages (viruses that infect bacteria) are the most abundant biological life-forms on Earth. However, very little is known regarding the structure of phage-bacteria infections. In a recent study we re-evaluated 38 prior studies and demonstrated that phage-bacteria infection networks tend to be statistically nested in small-scale communities (Flores et al. 2011). Nestedness is consistent with a hierarchy of infection and resistance within phages and bacteria, respectively. However, we predicted that at large scales, phage-bacteria infection networks should be typified by a modular structure. We evaluate and confirm this hypothesis using the most extensive study of phage-bacteria infections (Moebus and Nattkemper 1981). In this study, cross-infections were evaluated between 215 marine phages and 286 marine bacteria. We develop a novel multi-scale network analysis and find that the Moebus and Nattkemper (1981) study is highly modular (at the whole network scale), yet also exhibits nestedness and modularity at the within-module scale. We examine the role of geography in driving these modular patterns and find evidence that phage-bacteria interactions can exhibit strong similarity despite large distances between sites. CFG acknowledges the support of CONACyT Foundation. JSW holds a Career Award at the Scientific Interface from the Burroughs Wellcome Fund and acknowledges the support of the James S. McDonnell Foundation.
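For readers who want a concrete handle on nestedness, below is a minimal sketch of the NODF metric (commonly attributed to Almeida-Neto and colleagues) for a binary infection matrix. The toy matrix and the pre-sorting step are illustrative choices; this is not the multi-scale analysis used in the study.

```python
# Sketch: NODF nestedness of a binary phage x bacteria infection matrix.
# The small example matrix is invented for illustration.
import numpy as np
from itertools import combinations

def nodf(M):
    """NODF nestedness (0-100) averaged over row and column pairs,
    after sorting rows and columns by decreasing fill."""
    M = np.asarray(M, dtype=bool)
    M = M[np.argsort(-M.sum(axis=1))][:, np.argsort(-M.sum(axis=0))]

    def pair_scores(rows):
        scores = []
        for a, b in combinations(range(len(rows)), 2):
            fa, fb = rows[a].sum(), rows[b].sum()
            if fa <= fb or fb == 0:        # requires strictly decreasing fill
                scores.append(0.0)
            else:
                overlap = np.logical_and(rows[a], rows[b]).sum()
                scores.append(100.0 * overlap / fb)
        return scores

    return float(np.mean(pair_scores(M) + pair_scores(M.T)))

# Perfectly nested example: each host range is a subset of the one above it.
nested = np.array([[1, 1, 1, 1],
                   [1, 1, 1, 0],
                   [1, 1, 0, 0],
                   [1, 0, 0, 0]])
print(round(nodf(nested), 1))   # 100.0 for a perfectly nested matrix
```

A modular matrix, by contrast, would score high on a community-detection measure such as bipartite modularity while scoring low on NODF at the whole-network scale.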
MOC Views of Martian Solar Eclipses
NASA Technical Reports Server (NTRS)
1999-01-01
[figure removed for brevity, see original site]
The shadow of the martian moon, Phobos, has been captured in many recent wide angle camera views of the red planet obtained by the Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC). Designed to monitor changes in weather and surface conditions, the wide angle cameras are also proving to be a good way to spot the frequent solar eclipses caused by the passage of Phobos between Mars and the Sun.

The first figure (above) shows wide angle red (left), blue (middle), and color composite (right) views of the shadow of Phobos (elliptical feature at center of each frame) as it was cast upon western Xanthe Terra on August 26, 1999, at about 2 p.m. local time on Mars. The image covers an area about 250 kilometers (155 miles) across and is illuminated from the left. The meandering Nanedi Valles is visible in the lower right corner of the scene. Note the dark spots on three crater floors--these appear dark in the red camera image (left) but are barely distinguished in the blue image (middle), while the shadow is dark in both images. The spots on the crater floors are probably small fields of dark sand dunes.

The second figure shows three samples of MOC's global image swaths, each in this case with a shadow of Phobos visible (arrow). The first scene (left) was taken on September 1, 1999, and shows the shadow of Phobos cast upon southern Elysium Planitia. The large crater with dark markings on its floor at the lower right corner is Herschel Basin. The second scene shows the shadow of Phobos cast upon northern Lunae Planum on September 8, 1999. Kasei Valles dominates the upper right and the deep chasms of Valles Marineris dominate the lower third of the September 8 image. The picture on the right shows the shadow of Phobos near the giant volcano, Olympus Mons (upper left), on September 25, 1999. Three other major volcanoes are visible from lower-center (Arsia Mons) and right-center (Pavonis Mons) to upper-middle-right (Ascraeus Mons).

Phobos and the smaller, more distant satellite, Deimos, were discovered in 1877 by Asaph Hall, an astronomer at the United States Naval Observatory in Washington, D.C. Hall had been hunting for martian satellites for some time, and was about to abandon the search when he was encouraged by his wife to continue. In honor of her role, the largest crater on Phobos was named Stickney, her maiden name. Phobos is a tiny, potato-shaped world that is only about 13 km by 11 km by 9 km (8 mi by 7 mi by 6 mi) in size.

In 1912 Edgar Rice Burroughs published a story entitled 'Under the Moons of Mars' (printed in book form in 1917 as A Princess of Mars) in which he referred to the 'hurtling moons of Barsoom' (Barsoom being the 'native' word for Mars in the fictional account). Burroughs was inspired by the fact that Phobos, having an orbital period of slightly less than 8 hours, would appear from Mars to rise in the west and set in the east only five and a half hours later. (Despite Burroughs' phrase, the outer moon, Deimos, can hardly be said to 'hurtle' -- it takes nearly 60 hours to cross the sky from east to west, rising on one day and not setting again for over two more.)

If you could stand on Mars and watch Phobos passing overhead, you would notice that this moon appears to be only about half the size of what Earth's Moon looks like when viewed from the ground. In addition, the Sun would seem to have shrunk to about 2/3 (or nearly 1/2) of its size as seen from Earth. Martian eclipses are therefore dark but not as spectacular as total solar eclipses on Earth can be.
In compensation, the martian eclipses are thousands of times more common, occurring a few times a day somewhere on Mars whenever Phobos passes over the planet's sunlit side. Due to the changing geometry of the MGS orbit relative to that of Phobos, the shadow is actually seen in MOC global map images (like in the second figure above) about a dozen times a month.

The shadow of Phobos was seen during the Viking missions in the late 1970s, and in fact one day the shadow was observed to pass right over the Viking 1 lander. The surface of Phobos itself was first imaged by Mariner 9 in 1971, and global coverage was obtained by the Viking orbiters in 1976-80. Phobos was the target of the ill-fated Phobos 1 and Phobos 2 spacecraft, launched by the Soviet Union in 1988. Phobos 2 actually reached Mars in 1989 and obtained a few pictures of the satellite--it also captured the shadow of Phobos cast upon the martian surface using its thermal infrared imager, Termoskan. More recently, the MGS MOC observed the tiny moon four times in August and September 1998.
NASA Astrophysics Data System (ADS)
Vahey, Michael
Despite relevance to human health, the mechanisms of enveloped virus assembly remain largely mysterious. This is particularly true of influenza A virus (IAV), which (unlike viral capsids with stereotyped shape and composition) forms heterogeneous particles whose assembly cannot be described in terms of equilibrium thermodynamics. Although the ability to assemble into particles with diverse size and composition could have important implications for infectivity, understanding how virion-to-virion differences arise and how they ultimately influence virus replication has proven challenging due to the lack of available tools for studying the assembly process. To address this challenge and establish a dynamic picture of how IAV assembles, we have developed virus strains that harbor small, non-disruptive fluorescent tags on each of the virus's five major structural proteins. Using these multispectral strains, we are able to quantify the protein composition and dynamics of virions as they assemble in live infected cells - measurements that have been previously inaccessible, and which reveal subpopulations of virus that favor either the binding or destruction of host receptors. The occupancy of these different subpopulations is malleable, shifting in response to environmental stimuli, including antiviral drugs that block receptor-destruction. In complex environments like the human respiratory tract, this phenotypic diversity could act as an evolutionary hedge. We acknowledge the Burroughs Wellcome Fund and NIH NIGMS for supporting this work.
Multiscale Airflow Model and Aerosol Deposition in Healthy and Emphysematous Rat Lungs
NASA Astrophysics Data System (ADS)
Oakes, Jessica; Marsden, Alison; Grandmont, Celine; Darquenne, Chantal; Vignon-Clementel, Irene
2012-11-01
Knowledge of the fate of aerosol particles in healthy and emphysematic lungs is needed to determine the toxic or therapeutic effects of inhalable particles. In this study we used a multiscale numerical model that couples a 0D resistance and capacitance model to 3D airways generated from MR images. Airflow simulations were performed using an in-house 3D finite element solver (SimVascular, simtk.org). Seven simulations were performed: 1 healthy, 1 uniform emphysema, and 5 different cases of heterogeneous emphysema. In the heterogeneous emphysema cases the disease was confined to a single lobe. As a post-processing step, 1 micron diameter particles were tracked in the flow field using Lagrangian particle tracking. The simulation results showed that the inhaled flow distribution was equal for the healthy and uniform emphysema cases. However, in the heterogeneous emphysema cases the delivery of inhaled air was larger in the diseased lobe. Additionally, there was an increase in delivery of aerosol particles to the diseased lobe. This suggests that therapeutic particles would reach the diseased areas of the lung, while toxic particles would increasingly harm them. The 3D-0D model described here is the first of its kind to be used to study healthy and emphysematic lungs. NSF Graduate Fellowship (Oakes), Burroughs Wellcome Fund (Marsden, Oakes), 1R21HL087805-02 from NHLBI at NIH, INRIA Team Grant.
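To illustrate the 0D side of such a coupling, here is a minimal sketch of a resistance-capacitance outlet model driven by a prescribed outlet flow. The parameter values, breathing waveform, and explicit Euler update are invented for illustration and are not the SimVascular coupling scheme.

```python
# Sketch: a 0D resistance-capacitance (RC) outlet model of the kind coupled
# to 3D airway outlets. Given the flow Q(t) leaving a 3D outlet, it returns
# the pressure the 3D solver would see there. All values are illustrative.
import numpy as np

R = 0.3          # resistance downstream of the outlet (cmH2O*s/mL), assumed
C = 5.0          # compliance of the subtended lung region (mL/cmH2O), assumed
dt = 1e-3        # time step (s)
T = 4.0          # one 4-second breathing cycle

t = np.arange(0.0, T, dt)
Q = 50.0 * np.sin(2 * np.pi * t / T)      # prescribed outlet flow (mL/s)

Pc = 0.0                                  # pressure stored on the capacitor
P_outlet = np.empty_like(t)
for k, q in enumerate(Q):
    # Outlet pressure = resistive drop plus compartment (capacitor) pressure.
    P_outlet[k] = R * q + Pc
    # Explicit Euler update of the compartment: C * dPc/dt = Q.
    Pc += dt * q / C

print("peak outlet pressure (cmH2O):", round(P_outlet.max(), 2))
```

In a full 3D-0D simulation the flow at each outlet is supplied by the 3D solver at every time step, and the pressure returned by the 0D model closes the boundary condition; distributing resistance and compliance differently across outlets is one way to represent uniform versus lobar disease.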
Organic Light-Emitting Devices (OLEDS) and Their Optically Detected Magnetic Resonance (ODMR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Gang
2003-01-01
Organic Light-Emitting Devices (OLEDs), both small-molecule and polymeric, have been studied extensively since the first efficient small-molecule OLED was reported by Tang and VanSlyke in 1987. Burroughes' report on conjugated polymer-based OLEDs led to another track in OLED development. These developments have resulted in full-color, highly efficient (up to ~20% external efficiency and 60 lm/W power efficiency for green emitters), highly bright (>140,000 Cd/m² DC, ~2,000,000 Cd/m² AC), stable (>40,000 hr at 5 mA/cm²) devices. OLEDs are Lambertian emitters, which intrinsically eliminates the viewing-angle problem of liquid crystal displays (LCDs). Thus OLEDs are beginning to compete with the currently dominant LCDs in information display. Numerous companies are now active in this field, including large companies such as Pioneer, Toyota, Eastman Kodak, Philips, DuPont, Samsung, Sony, Toshiba, and Osram, and small companies like Cambridge Display Technology (CDT), Universal Display Corporation (UDC), and eMagin. The first small-molecule display for vehicular stereos was introduced in 1998, and polymer OLED displays have begun to appear in commercial products. Although displays are the major application for OLEDs at present, they are also candidates for next-generation solid-state lighting, in which case the light source usually needs to be white. Organic transistors, organic solar cells, etc. are also being developed vigorously.
Carl Sagan and the Exploration of Mars and Venus
NASA Technical Reports Server (NTRS)
Toon, Owen B.; Condon, Estelle P. (Technical Monitor)
1997-01-01
Inspired by childhood readings of books by Edgar Rice Burroughs, Carl Sagan developed his first interest in planetary science around Mars and Venus. Typical of much of his career, he was skeptical of early views about these planets. Early in this century it was thought that the Martian wave of darkening, a seasonal albedo change on the planet, was biological in origin. He suggested instead that it was due to massive dust storms, as was later shown to be the case. He was the first to recognize that Mars has huge topography gradients across its surface. During the spacecraft era, as ancient river valleys were found on the planet, he directed studies of Mars' ancient climate. He suggested that changes in the planet's orbit were involved in climate shifts on Mars, just as they are on Earth. Carl had an early interest in Venus. Contradictory observations led to a controversy about the surface temperature, and Carl was one of the first to recognize that Venus has a massive greenhouse effect at work warming its surface. His work on radiative transfer led to an algorithm that was extensively used by modelers of the Earth's climate and whose derivatives still dominate the calculation of radiative transfer in planetary atmospheres today. Carl inspired a vast number of young scientists through his enthusiasm for new ideas and discoveries, his skeptical approach, and his boundless energy. I had the privilege to work in Carl's laboratory during the peak of the era of Mars' initial exploration. It was an exciting time and place. Carl made it a wonderful experience.
Climate: Into the 21st Century
NASA Astrophysics Data System (ADS)
Burroughs, William
2003-08-01
Toward the end of the twentieth century, it became evident to professionals working within the meteorological arena that the world's climate system was showing signs of change that could not be adequately explained in terms of natural variation. Since that time there has been an increasing recognition that the climate system is changing as a result of human industries and lifestyles, and that the outcomes may prove catastrophic to the world's escalating population. Compiled by an international team formed under the auspices of the World Meteorological Organization (WMO), Climate: Into the 21st Century features an unrivalled collection of essays by the world's leading meteorological experts. These fully integrated contributions provide a perspective of the global climate system across the twentieth century, and describe some of the most arresting and extreme climatic events, and their effects, that occurred during that time. In addition, the book traces the development of our capabilities to observe and monitor the climate system, and outlines our understanding of the predictability of climate on time-scales of months and longer. It concludes with a summary of the prospects for applying the twentieth-century climate experience to benefit society in the twenty-first century. Lavishly illustrated in color, Climate is an accessible account of the challenges that climate poses at the start of the twenty-first century. Filled with fascinating facts and diagrams, it is written for a wide audience: it will captivate the general reader interested in climate issues and will be a valuable teaching resource. William Burroughs is a successful science author of books on climate, including Weather (Time Life, 2000) and, all published by Cambridge University Press, Climate Change: A Multidisciplinary Approach (2001), Does the Weather Really Matter? (1997), and The Climate Revealed (1999).
Zheng, Yu; Wang, Hai-Lin; Li, Jian-Kang; Xu, Li; Tellier, Laurent; Li, Xiao-Lin; Huang, Xiao-Yan; Li, Wei; Niu, Tong-Tong; Yang, Huan-Ming; Zhang, Jian-Guo; Liu, Dong-Ning
2018-01-01
AIM To study the genes responsible for retinitis pigmentosa. METHODS A total of 15 Chinese families with retinitis pigmentosa, containing 94 sporadically afflicted cases, were recruited. The targeted sequences were captured using the Target_Eye_365_V3 chip and sequenced using the BGISEQ-500 sequencer, according to the manufacturer's instructions. Data were aligned to UCSC Genome Browser build hg19, using the Burrows-Wheeler Aligner (BWA) MEM algorithm. Local realignment was performed with the Genome Analysis Toolkit (GATK v.3.3.0) IndelRealigner, and variants were called with the Genome Analysis Toolkit HaplotypeCaller, without any use of imputation. Variants were filtered against a panel derived from the 1000 Genomes Project, 1000G_ASN, ESP6500, ExAC, and dbSNP138. In all members of Family ONE and Family TWO with available DNA samples, the genetic variant was validated using Sanger sequencing. RESULTS A novel, pathogenic variant of retinitis pigmentosa, c.357_358delAA (p.Ser119SerfsX5), was identified in PRPF31 in 2 of 15 autosomal-dominant retinitis pigmentosa (ADRP) families, as well as in one sporadic case. Sanger sequencing was performed upon probands, as well as upon other family members. This novel, pathogenic genotype co-segregated with the retinitis pigmentosa phenotype in these two families. CONCLUSION ADRP is a subtype of retinitis pigmentosa, defined by its genotype, which accounts for 20%-40% of retinitis pigmentosa patients. Our study thus expands the spectrum of PRPF31 mutations known to occur in ADRP, and provides further demonstration of the applicability of the BGISEQ-500 sequencer for genomics research. PMID:29375987
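The workflow above maps onto a fairly standard short-read pipeline. The sketch below is illustrative only: file names, paths, and thread counts are placeholders, and the study's exact parameters and population-panel filtering are not shown; the BWA-MEM, GATK 3.x local realignment, and HaplotypeCaller invocations follow the tools' standard command-line usage rather than anything specific to this paper.

    # Minimal sketch (assumed file names and paths) of alignment to hg19, GATK 3.x
    # local realignment, and HaplotypeCaller variant calling as described above.
    # Filtering against population panels (1000 Genomes, ExAC, dbSNP) is omitted.
    import subprocess

    REF = "hg19.fa"                                    # assumed reference FASTA (UCSC hg19)
    R1, R2 = "sample_R1.fq.gz", "sample_R2.fq.gz"      # assumed paired-end reads
    GATK = "GenomeAnalysisTK.jar"                      # GATK v3.3.0 jar, as cited in the abstract

    def run(cmd):
        print("+", cmd)
        subprocess.run(cmd, shell=True, check=True)

    # 1. Align reads with BWA-MEM and produce a sorted, indexed BAM.
    run(f"bwa mem -t 8 {REF} {R1} {R2} | samtools sort -o sample.bam -")
    run("samtools index sample.bam")

    # 2. Local realignment around indels (GATK 3.x two-step workflow).
    run(f"java -jar {GATK} -T RealignerTargetCreator -R {REF} -I sample.bam -o targets.intervals")
    run(f"java -jar {GATK} -T IndelRealigner -R {REF} -I sample.bam "
        f"-targetIntervals targets.intervals -o sample.realigned.bam")

    # 3. Call variants with HaplotypeCaller (no imputation, as in the study).
    run(f"java -jar {GATK} -T HaplotypeCaller -R {REF} -I sample.realigned.bam -o sample.vcf")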
Monocyte dysregulation and systemic inflammation during pediatric falciparum malaria
Dobbs, Katherine R.; Embury, Paula; Odada, Peter S.; Rosa, Bruce A.; Mitreva, Makedonka; Kazura, James W.; Dent, Arlene E.
2017-01-01
BACKGROUND. Inflammation and monocytes are thought to be important to human malaria pathogenesis. However, the relationship of inflammation and various monocyte functions to acute malaria, recovery from acute malaria, and asymptomatic parasitemia in endemic populations is poorly understood. METHODS. We evaluated plasma cytokine levels, monocyte subsets, monocyte functional responses, and monocyte inflammatory transcriptional profiles of 1- to 10-year-old Kenyan children at the time of presentation with acute uncomplicated malaria and at recovery 6 weeks later; these results were compared with analogous data from asymptomatic children and adults in the same community. RESULTS. Acute malaria was marked by elevated levels of proinflammatory and regulatory cytokines and expansion of the inflammatory “intermediate” monocyte subset that returned to levels of healthy asymptomatic children 6 weeks later. Monocytes displayed activated phenotypes during acute malaria, with changes in surface expression of markers important to innate and adaptive immunity. Functionally, acute malaria monocytes and monocytes from asymptomatic infected children had impaired phagocytosis of P. falciparum–infected erythrocytes relative to asymptomatic children with no blood-stage infection. Monocytes from both acute malaria and recovery time points displayed strong and equivalent cytokine responsiveness to innate immune agonists that were independent of infection status. Monocyte transcriptional profiles revealed regulated and balanced proinflammatory and antiinflammatory and altered phagocytosis gene expression patterns distinct from malaria-naive monocytes. CONCLUSION. These observations provide insights into monocyte functions and the innate immune response during uncomplicated malaria and suggest that asymptomatic parasitemia in children is not clinically benign. FUNDING. Support for this work was provided by NIH/National Institute of Allergy and Infectious Diseases (R01AI095192-05), the Burroughs Wellcome Fund/American Society of Tropical Medicine and Hygiene, and the Rainbow Babies & Children’s Foundation. PMID:28931756
Rare Event Simulation for T-cell Activation
NASA Astrophysics Data System (ADS)
Lipsmeier, Florian; Baake, Ellen
2009-02-01
The problem of statistical recognition is considered, as it arises in immunobiology, namely, the discrimination of foreign antigens against a background of the body's own molecules. The precise mechanism of this foreign-self-distinction, though one of the major tasks of the immune system, continues to be a fundamental puzzle. Recent progress has been made by van den Berg, Rand, and Burroughs (J. Theor. Biol. 209:465-486, 2001), who modelled the probabilistic nature of the interaction between the relevant cell types, namely, T-cells and antigen-presenting cells (APCs). Here, the stochasticity is due to the random sample of antigens present on the surface of every APC, and to the random receptor type that characterises individual T-cells. It has been shown previously (van den Berg et al. in J. Theor. Biol. 209:465-486, 2001; Zint et al. in J. Math. Biol. 57:841-861, 2008) that this model, though highly idealised, is capable of reproducing important aspects of the recognition phenomenon, and of explaining them on the basis of stochastic rare events. These results were obtained with the help of a refined large deviation theorem and were thus asymptotic in nature. Simulations have, so far, been restricted to the straightforward simple sampling approach, which does not allow for sample sizes large enough to address more detailed questions. Building on the available large deviation results, we develop an importance sampling technique that allows for a convenient exploration of the relevant tail events by means of simulation. With its help, we investigate the mechanism of statistical recognition in some depth. In particular, we illustrate how a foreign antigen can stand out against the self background if it is present in sufficiently many copies, although no a priori difference between self and nonself is built into the model.
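The tilting idea behind such an importance-sampling scheme can be illustrated on a textbook example unrelated to the T-cell model itself. The sketch below (all values assumed) estimates a Gaussian tail probability that simple sampling essentially never hits, by drawing from an exponentially tilted law and reweighting each sample with the likelihood ratio.

    # Illustrative sketch (not the authors' T-cell model): importance sampling by
    # exponential tilting to estimate the rare tail probability P(S_n >= n*a) for a
    # sum of n i.i.d. N(0,1) variables, the kind of large-deviation event discussed above.
    import numpy as np

    rng = np.random.default_rng(0)
    n, a = 100, 0.5                  # number of i.i.d. terms and tail level (assumed)
    theta = a                        # tilting parameter; the optimal tilt for N(0,1) is theta = a
    n_sim = 100_000

    # Simple sampling (for comparison): usually records no hits at all for this event.
    S_plain = rng.normal(0.0, 1.0, size=(n_sim, n)).sum(axis=1)
    p_plain = np.mean(S_plain >= n * a)

    # Importance sampling: draw from the tilted law N(theta, 1) and reweight each
    # sample by the likelihood ratio dP/dQ = exp(-theta*S + n*theta**2/2).
    S_tilt = rng.normal(theta, 1.0, size=(n_sim, n)).sum(axis=1)
    weights = np.exp(-theta * S_tilt + n * theta**2 / 2)
    p_is = np.mean(weights * (S_tilt >= n * a))

    print(f"simple sampling:     {p_plain:.3e}")
    print(f"importance sampling: {p_is:.3e}")   # close to the exact value Phi(-a*sqrt(n)) ~ 2.9e-7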
'Saving the lives of our dogs': the development of canine distemper vaccine in interwar Britain.
Bresalier, Michael; Worboys, Michael
2014-06-01
This paper examines the successful campaign in Britain to develop canine distemper vaccine between 1922 and 1933. The campaign mobilized disparate groups around the common cause of using modern science to save the nation's dogs from a deadly disease. Spearheaded by landed patricians associated with the country journal The Field, and funded by dog owners and associations, it relied on collaborations with veterinary professionals, government scientists, the Medical Research Council (MRC) and the commercial pharmaceutical house the Burroughs Wellcome Company (BWC). The social organization of the campaign reveals a number of important, yet previously unexplored, features of interwar science and medicine in Britain. It depended on a patronage system that drew upon a large base of influential benefactors and public subscriptions. Coordinated by the Field Distemper Fund, this system was characterized by close relationships between landed elites and their social networks with senior science administrators and researchers. Relations between experts and non-experts were crucial, with high levels of public engagement in all aspects of research and vaccine development. At the same time, experimental and commercial research supported under the campaign saw dynamic interactions between animal and human medicine, which shaped the organization of the MRC's research programme and demonstrated the value of close collaboration between veterinary and medical science, with the dog as a shared object and resource. Finally, the campaign made possible the translation of 'laboratory' findings into field conditions and commercial products. Rather than a unidirectional process, translation involved negotiations over the very boundaries of the 'laboratory' and the 'field', and what constituted a viable vaccine. This paper suggests that historians reconsider standard historical accounts of the nature of patronage, the role of animals, and the interests of landed elites in interwar British science and medicine.
The Washington Biologists' Field Club : Its members and its history (1900-2006)
Perry, M.C.
2007-01-01
This book is based on the interesting one-hundred-plus-year history of the Club and its members. Plummers Island and the historic cabin on the Island have served as a common meeting area where the Club members have conducted research and held many social activities for over a century. The history has been written and revised over the years by members, and the biographical sketches also have been collected and written by the members. The Club was formed in 1900 and incorporated as a society in 1901 for scientists in the Washington, D.C., area. In recent years the Club has sponsored research by many non-member local scientists with grants totaling over $305,000. The cumulative total of 267 members represents all branches of natural science, with a strong emphasis on biology as the Club name indicates. In addition to the biologists there have been famous naturalists (e.g., John Burroughs), high-level administrators (e.g., Ira Gabrielson), and well-known artists (e.g., Roger Tory Peterson). Most members have been biological scientists, working for agencies in the Washington, D.C., area, who have published many articles and books dealing with biology and related subjects. The book is published mainly for the benefit of the living Club members and for relatives of the deceased members. The members hope that the book will find its way into libraries across the country and that, in the future, persons interested in some of the pioneer scientists, in the various professional areas of science, can obtain biographical information from a well-documented source. Most of the 542 illustrations of the members, cabin, and the Island have not been published previously. It is hoped that the biographical sketches, pictures, and other information presented in this book will generate new information for future publications and for the website of the Washington Biologists' Field Club, which is updated frequently.
Tucker, Joseph D; Hughes, Molly A; Durvasula, Ravi V; Vinetz, Joseph M; McGovern, Victoria P; Schultz, Rhonda; Dunavan, Claire Panosian; Wilson, Mary E; Milner, Danny A; LaRocque, Regina C; Calderwood, Stephen B; Guerrant, Richard L; Weller, Peter F; Taylor, Terrie E
2017-06-15
In modern academic medicine, especially in the fields of infectious diseases and global health, aspiring physician-scientists often wait years before achieving independence as basic, translational, and clinical investigators. This study employed mixed methods to evaluate the success of the Burroughs Wellcome Fund/American Society for Tropical Medicine and Hygiene (BWF/ASTMH) global health postdoctoral fellowship in promoting scientific independence. We examined quantitative data obtained from the National Institutes of Health (NIH) and qualitative data provided by the ASTMH and program participants to assess BWF/ASTMH trainees' success in earning NIH grants, publishing manuscripts, and gaining faculty positions. We also calculated the return on investment (ROI) associated with the training program by dividing direct costs of NIH research grants awarded to trainees by the direct costs invested by the BWF/ASTMH fellowship. Forty-one trainees received fellowships between 2001 and 2015. Within 3 years of completing their fellowships, 21 of 35 (60%) had received career development awards, and within 5 years, 12 of 26 (46%) had received independent research awards. Overall, 22 of 35 (63%) received 1 or more research awards. BWF/ASTMH recipients with at least 3 years of follow-up data had coauthored a mean of 36 publications (range, 2-151) and 29 of 35 (82%) held academic positions. The return on investment was 11.9 overall and 31.8 for fellowships awarded between 2001 and 2004. Between 2001 and 2015, the BWF/ASTMH postdoctoral training program successfully facilitated progress to scientific independence. This program model underscores the importance of custom-designed postdoctoral training as a bridge to NIH awards and professional autonomy. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com
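As a minimal illustration of the return-on-investment definition used above (NIH research-grant direct costs divided by fellowship direct costs), the snippet below uses placeholder dollar amounts rather than the study's actual figures.

    # Back-of-the-envelope check of the ROI definition described above. The dollar
    # figures are placeholders, not values reported in the study.
    def roi(nih_grant_direct_costs, fellowship_direct_costs):
        return nih_grant_direct_costs / fellowship_direct_costs

    print(roi(11_900_000, 1_000_000))   # hypothetical inputs that would give the reported overall ROI of 11.9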
NASA Astrophysics Data System (ADS)
Carson, Rachel L.
1991-12-01
Published in 1951, The Sea Around Us is one of the most remarkably successful books ever written about the natural world. Rachel Carson's rare ability to combine scientific insight with moving, poetic prose catapulted her book to first place on The New York Times best-seller list, where it enjoyed wide attention for thirty-one consecutive weeks. It remained on the list for more than a year and a half and ultimately sold well over a million copies, has been translated into 28 languages, inspired an Academy Award-winning documentary, and won both the 1952 National Book Award and the John Burroughs Medal. This classic work remains as fresh today as when it first appeared. Carson's writing teems with stunning, memorable images--the newly formed Earth cooling beneath an endlessly overcast sky; the centuries of nonstop rain that created the oceans; giant squids battling sperm whales hundreds of fathoms below the surface; and incredibly powerful tides moving 100 billion tons of water daily in the Bay of Fundy. Quite simply, she captures the mystery and allure of the ocean with a compelling blend of imagination and expertise. Reintroducing a classic work to a whole new generation of readers, this Special Edition features a new chapter written by Jeffrey Levinton, a leading expert in marine ecology, that brings the scientific side of The Sea Around Us completely up to date. Levinton incorporates the most recent thinking on continental drift, coral reefs, the spread of the ocean floor, the deterioration of the oceans, mass extinction of sea life, and many other topics. In addition, acclaimed nature writer Ann Zwinger has contributed a brief foreword. Today, with the oceans endangered by the dumping of medical waste and ecological disasters such as the Exxon oil spill in Alaska, this illuminating volume provides a timely reminder of both the fragility and the importance of the ocean and the life that abounds within it. Anyone who loves the sea, or who is concerned about our natural environment, will want to read this classic work.
Cusi, Kenneth; Orsak, Beverly; Bril, Fernando; Lomonaco, Romina; Hecht, Joan; Ortiz-Lopez, Carolina; Tio, Fermin; Hardies, Jean; Darland, Celia; Musi, Nicolas; Webb, Amy; Portillo-Sanchez, Paola
2016-09-06
The metabolic defects of nonalcoholic steatohepatitis (NASH) and prediabetes or type 2 diabetes mellitus (T2DM) seem to be specifically targeted by pioglitazone. However, information about its long-term use in this population is limited. To determine the efficacy and safety of long-term pioglitazone treatment in patients with NASH and prediabetes or T2DM. Randomized, double-blind, placebo-controlled trial. (ClinicalTrials.gov: NCT00994682). University hospital. Patients (n = 101) with prediabetes or T2DM and biopsy-proven NASH were recruited from the general population and outpatient clinics. All patients were prescribed a hypocaloric diet (500-kcal/d deficit from weight-maintaining caloric intake) and then randomly assigned to pioglitazone, 45 mg/d, or placebo for 18 months, followed by an 18-month open-label phase with pioglitazone treatment. The primary outcome was a reduction of at least 2 points in the nonalcoholic fatty liver disease activity score in 2 histologic categories without worsening of fibrosis. Secondary outcomes included other histologic outcomes, hepatic triglyceride content measured by magnetic resonance and proton spectroscopy, and metabolic parameters. Among patients randomly assigned to pioglitazone, 58% achieved the primary outcome (treatment difference, 41 percentage points [95% CI, 23 to 59 percentage points]) and 51% had resolution of NASH (treatment difference, 32 percentage points [CI, 13 to 51 percentage points]) (P < 0.001 for each). Pioglitazone treatment also was associated with improvement in individual histologic scores, including the fibrosis score (treatment difference, -0.5 [CI, -0.9 to 0.0]; P = 0.039); reduced hepatic triglyceride content from 19% to 7% (treatment difference, -7 percentage points [CI, -10 to -4 percentage points]; P < 0.001); and improved adipose tissue, hepatic, and muscle insulin sensitivity (P < 0.001 vs. placebo for all). All 18-month metabolic and histologic improvements persisted over 36 months of therapy. The overall rate of adverse events did not differ between groups, although weight gain was greater with pioglitazone (2.5 kg vs. placebo). Single-center study. Long-term pioglitazone treatment is safe and effective in patients with prediabetes or T2DM and NASH. Burroughs Wellcome Fund and American Diabetes Association.
Krischer, Jeffrey; Cronholm, Peter F; Burroughs, Cristina; McAlear, Carol A; Borchin, Renee; Easley, Ebony; Davis, Trocon; Kullman, Joyce; Carette, Simon; Khalidi, Nader; Koening, Curry; Langford, Carol A; Monach, Paul; Moreland, Larry; Pagnoux, Christian; Specks, Ulrich; Sreih, Antoine G; Ytterberg, Steven; Merkel, Peter A
2017-02-28
The target sample size for clinical trials often necessitates a multicenter (center of excellence, CoE) approach with associated added complexity, cost, and regulatory requirements. Alternative recruitment strategies need to be tested against this standard model. The aim of our study was to test whether a Web-based direct recruitment approach (patient-centric, PC) using social marketing strategies provides a viable option to the CoE recruitment method. PC recruitment and Web-based informed consent was compared with CoE recruitment for a randomized controlled trial (RCT) of continuing versus stopping low-dose prednisone for maintenance of remission of patients with granulomatosis with polyangiitis (GPA). The PC approach was not as successful as the CoE approach. Enrollment of those confirmed eligible by their physician was 10 of 13 (77%) and 49 of 51 (96%) in the PC and CoE arms, respectively (P=.05). The two approaches were not significantly different in terms of eligibility with 34% of potential participants in the CoE found to be ineligible as compared with 22% in the PC arm (P=.11) nor in provider acceptance, 22% versus 26% (P=.78). There was no difference in the understanding of the trial as reflected in the knowledge surveys of individuals in the PC and CoE arms. PC recruitment was substantially less successful than that achieved by the CoE approach. However, the PC approach was good at confirming eligibility and was as acceptable to providers and as understandable to patients as the CoE approach. The PC approach should be evaluated in other clinical settings to get a better sense of its potential. ©Jeffrey Krischer, Peter F Cronholm, Cristina Burroughs, Carol A McAlear, Renee Borchin, Ebony Easley, Trocon Davis, Joyce Kullman, Simon Carette, Nader Khalidi, Curry Koening, Carol A Langford, Paul Monach, Larry Moreland, Christian Pagnoux, Ulrich Specks, Antoine G Sreih, Steven Ytterberg, Peter A Merkel, Vasculitis Clinical Research Consortium. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 28.02.2017.
Alemtuzumab (Millennium/ILEX).
Dumont, F J
2001-01-01
Alemtuzumab, a lymphocyte-depleting humanized monoclonal antibody, is being developed by Millennium Pharmaceuticals Inc and ILEX Oncology for the potential treatment of chronic lymphocytic leukemia (CLL) [274580]. The utility of the compound for treating bone marrow (BM) stem cell transplantation-associated graft-versus-host disease (GVHD) [372946] and for ex vivo purging of BM to remove malignant T-cells [244056] is also being investigated. Additional potential therapeutic areas for which clinical trials are planned or ongoing include vasculitis, multiple sclerosis [288762] and organ transplantation [338304]. A Biologics License Application (BLA) was filed with the FDA in December 1999 by ILEX and Millennium [351523], [351524], [373873]. The FDA accepted the application for filing in February 2000 [355775] and returned a complete response letter in June 2000 [372172]. Millennium and ILEX submitted a response to the FDA in August 2000 [379766]. Alemtuzumab has received Fast Track designation [304771] and orphan drug status from the FDA [288762], and the drug was reviewed by the FDA's Oncologic Drugs Advisory Committee on 14 December, 2000 [387228]. The committee voted 14 to 1 to recommend accelerated approval of alemtuzumab for patients with CLL who have been treated with alkylating agents and who have failed fludarabine therapy [393778], [393894]. In March 2000, Millennium and ILEX also submitted a Marketing Authorization Application (MAA) for alemtuzumab to the European Agency for the Evaluation of Medicinal Products (EMEA) [363595]. In October 2000, EMEA accepted the MAA for alemtuzumab under the agency's centralized approval procedure [387228]. Alemtuzumab was originally synthesized by Herman Waldmann and colleagues at Cambridge University and licensed to Burroughs Wellcome (BW) via the British Technology Group (BTG) [162622]. BW conducted phase I and II trials for a broad range of indications, but then discontinued development because of disappointing results in phase II rheumatoid arthritis trials [326848]. In April 1997, LeukoSite licensed rights to the antibody from BTG for the treatment of CLL and prolymphocytic leukemia, plus an option to develop it for other indications. BW agreed to supply LeukoSite with intellectual property [244056], [326848]. In May 1997, LeukoSite entered into a joint venture with ILEX Oncology for the further development of alemtuzumab [245986]. By the end of 1999, Millennium acquired LeukoSite with commitment to pursue development of the compound through the joint venture Millennium & ILEX Partners LP [351523], [370237]. In August 1999, Schering AG and its US affiliate Berlex Laboratories obtained exclusive worldwide marketing rights for alemtuzumab, excluding Japan and East Asia. In the US, Berlex, Millennium and ILEX will divide profits from alemtuzumab sales equally [337702], [338837].
García, Patricia J; Holmes, King K; Cárcamo, César P; Garnett, Geoff P; Hughes, James P; Campos, Pablo E; Whittington, William L H
2012-03-24
Previous community-randomised trials of interventions to control sexually transmitted infections (STIs) have involved rural settings, were rarely multicomponent, and had varying results. We aimed to assess the effect of a multicomponent intervention on curable STIs in urban young adults and female sex workers (FSWs). In this community-randomised trial, baseline STI screening was done between August and November, 2002, in random household samples of young adults (aged 18-29 years) and in FSWs in Peruvian cities with more than 50,000 inhabitants. Geographically separate cities were selected, matched into pairs, and randomly allocated to intervention or control groups with an S-PLUS program. Follow-up surveys of random samples were done after 2 years and 3 years. The intervention comprised four modalities: strengthened STI syndromic management by pharmacy workers and clinicians; mobile-team outreach to FSWs for STI screening and pathogen-specific treatment; periodic presumptive treatment of FSWs for trichomoniasis; and condom promotion for FSWs and the general population. Individuals in control cities received standard care. The composite primary endpoint was infection of young adults with Chlamydia trachomatis, Trichomonas vaginalis, or Neisseria gonorrhoeae, or syphilis seroreactivity. Laboratory workers and the data analyst were masked, but fieldworkers, the Peruvian study team, and participants in the outcome surveys were not. All analyses were done by intention to treat. This trial is registered, ISRCTN43722548. We did baseline surveys of 15,261 young adults in 24 Peruvian cities. Of those, 20 geographically separate cities were matched into pairs, in each of which one city was assigned to intervention and the other to standard of care. In the 2006 follow-up survey, data for the composite primary outcome were available for 12,930 young adults. We report a non-significant reduction in prevalence of STIs in young adults, adjusted for baseline prevalence, in intervention cities compared with control cities (relative risk 0·84, 95% CI 0·69-1·02; p=0·096). In subgroup analyses, significant reductions were noted in intervention cities in young adult women and FSWs. Syndromic management of STIs, mobile-team outreach to FSWs, presumptive treatment for trichomoniasis in FSWs, and condom promotion might reduce the composite prevalence of any of the four curable STIs investigated in this trial. Wellcome Trust and Burroughs Wellcome Fund, National Institutes of Health, Center for AIDS Research, CIPRA, and USAID-Peru. Copyright © 2012 Elsevier Ltd. All rights reserved.
The development of antiretroviral therapy and its impact on the HIV-1/AIDS pandemic.
Broder, Samuel
2010-01-01
In the last 25 years, HIV-1, the retrovirus responsible for the acquired immunodeficiency syndrome (AIDS), has gone from being an "inherently untreatable" infectious agent to one eminently susceptible to a range of approved therapies. During a five-year period, starting in the mid-1980s, my group at the National Cancer Institute played a role in the discovery and development of the first generation of antiretroviral agents, starting in 1985 with Retrovir (zidovudine, AZT) in a collaboration with scientists at the Burroughs-Wellcome Company (now GlaxoSmithKline). We focused on AZT and related congeners in the dideoxynucleoside family of nucleoside reverse transcriptase inhibitors (NRTIs), taking them from the laboratory to the clinic in response to the pandemic of AIDS, then a terrifying and lethal disease. These drugs proved, above all else, that HIV-1 infection is treatable, and such proof provided momentum for new therapies from many sources, directed at a range of viral targets, at a pace that has rarely if ever been matched in modern drug development. Antiretroviral therapy has brought about a substantial decrease in the death rate due to HIV-1 infection, changing it from a rapidly lethal disease into a chronic manageable condition, compatible with very long survival. This has special implications within the classic boundaries of public health around the world, but at the same time in certain regions may also affect a cycle of economic and civil instability in which HIV-1/AIDS is both cause and consequence. Many challenges remain, including (1) the life-long duration of therapy; (2) the ultimate role of pre-exposure prophylaxis (PrEP); (3) the cardiometabolic side-effects or other toxicities of long-term therapy; (4) the emergence of drug-resistance and viral genetic diversity (non-B subtypes); (5) the specter of new cross-species transmissions from established retroviral reservoirs in apes and Old World monkeys; and (6) the continued pace of new HIV-1 infections in many parts of the world. All of these factors make refining current therapies and developing new therapeutic paradigms essential priorities, topics covered in articles within this special issue of Antiviral Research. Fortunately, there are exciting new insights into the biology of HIV-1, its interaction with cellular resistance factors, and novel points of attack for future therapies. Moreover, it is a short journey from basic research to public health benefit around the world. The current science will lead to new therapeutic strategies with far-reaching implications in the HIV-1/AIDS pandemic. This article forms part of a special issue of Antiviral Research marking the 25th anniversary of antiretroviral drug discovery and development, Vol. 85, issue 1, 2010. Copyright 2009 Elsevier B.V. All rights reserved.
The development of antiretroviral therapy and its impact on the HIV-1/AIDS pandemic
Broder, Samuel
2010-01-01
In the last 25 years, HIV-1, the retrovirus responsible for the Acquired Immunodeficiency Syndrome (AIDS), has gone from being an “inherently untreatable” infectious agent to one eminently susceptible to a range of approved therapies. During a five-year period, starting in the mid-1980s, my group at the National Cancer Institute played a role in the discovery and development of the first generation of antiretroviral agents, starting in 1985 with Retrovir® (zidovudine, AZT) in a collaboration with scientists at the Burroughs-Wellcome Company (now GlaxoSmithKline). We focused on AZT and related congeners in the dideoxynucleoside family of nucleoside reverse transcriptase inhibitors (NRTIs), taking them from the laboratory to the clinic in response to the pandemic of AIDS, then a terrifying and lethal disease. These drugs proved, above all else, that HIV-1 infection is treatable, and such proof provided momentum for new therapies from many sources, directed at a range of viral targets, at a pace that has rarely if ever been matched in modern drug development. Antiretroviral therapy has brought about a substantial decrease in the death rate due to HIV-1 infection, changing it from a rapidly lethal disease into a chronic manageable condition, compatible with very long survival. This has special implications within the classic boundaries of public health around the world, but at the same time in certain regions may also affect a cycle of economic and civil instability in which HIV-1/AIDS is both cause and consequence. Many challenges remain, including 1.) the life-long duration of therapy; 2.) the ultimate role of pre-exposure prophylaxis (PrEP); 3.) the cardiometabolic side effects or other toxicities of long-term therapy; 4.) the emergence of drug-resistance and viral genetic diversity (non-B subtypes); 5.) the specter of new cross-species transmissions from established retroviral reservoirs in apes and Old World monkeys; and 6.) the continued pace of new HIV-1 infections in many parts of the world. All of these factors make refining current therapies and developing new therapeutic paradigms essential priorities, topics covered in articles within this special issue of Antiviral Research. Fortunately, there are exciting new insights into the biology of HIV-1, its interaction with cellular resistance factors, and novel points of attack for future therapies. Moreover, it is a short journey from basic research to public health benefit around the world. The current science will lead to new therapeutic strategies with far-reaching implications in the HIV-1/AIDS pandemic. This article forms part of a special issue of Antiviral Research marking the 25th anniversary of antiretroviral drug discovery and development, Vol 85, issue 1, 2010. PMID:20018391
Cárcamo, César P; Campos, Pablo E; García, Patricia J; Hughes, James P; Garnett, Geoff P; Holmes, King K
2012-10-01
We assessed prevalences of seven sexually transmitted infections (STIs) in Peru, stratified by risk behaviours, to help to define care and prevention priorities. In a 2002 household-based survey of the general population, we enrolled randomly selected 18-29-year-old residents of 24 cities with populations greater than 50 000 people. We then surveyed female sex workers (FSWs) in these cities. We gathered data for sexual behaviour; vaginal specimens or urine for nucleic acid amplification tests for Neisseria gonorrhoeae, Chlamydia trachomatis, and Trichomonas vaginalis; and blood for serological tests for syphilis, HIV, and (in subsamples) herpes simplex virus 2 (HSV2) and human T-lymphotropic virus. This study is a registered component of the PREVEN trial, number ISRCTN43722548. 15 261 individuals from the general population and 4485 FSWs agreed to participate in our survey. Overall prevalence of infection with HSV2, weighted for city size, was 13·5% in men, 13·6% in women, and 60·6% in FSWs (all values in FSWs standardised to age composition of women in the general population). The prevalence of C trachomatis infection was 4·2% in men, 6·5% in women, and 16·4% in FSWs; of T vaginalis infection was 0·3% in men, 4·9% in women, and 7·9% in FSWs; and of syphilis was 0·5% in men, 0·4% in women, and 0·8% in FSWs. N gonorrhoeae infection had a prevalence of 0·1% in men and women, and of 1·6% in FSWs. Prevalence of HIV infection was 0·5% in men and FSWs, and 0·1% in women. Four (0·3%) of 1535 specimens were positive for human T-lymphotropic virus 1. In men, 65·0% of infections with HIV, 71·5% of N gonorrhoeae, and 41·4% of HSV2 and 60·9% of cases of syphilis were in the 13·3% who had sex with men or unprotected sex with FSWs in the past year. In women from the general population, 66·7% of infections with HIV and 16·7% of cases of syphilis were accounted for by the 4·4% who had been paid for sex by any of their past three partners. Defining of high-risk groups could guide targeting of interventions for communicable diseases-including STIs-in the general Peruvian population. Wellcome Trust-Burroughs Wellcome Fund Infectious Disease Initiative and US National Institutes of Health. Copyright © 2012 Elsevier Ltd. All rights reserved.
Liotta, Dennis C; Painter, George R
2016-01-01
The HIV/AIDS epidemic, which was first reported on in 1981, progressed in just 10 years to a disease afflicting 10 million people worldwide including 1 million in the US. In 1987, AZT was approved for treating HIV/AIDS. Unfortunately, its clinical usefulness was severely limited by associated toxicities and the emergence of resistance. Three other drugs that were approved in the early 1990s suffered from similar liabilities. In 1990, the Liotta group at Emory University developed a highly diastereoselective synthesis of racemic 3'-thia-2',3'-dideoxycytidine and 3'-thia-2',3'-5-fluorodideoxycytidine and demonstrated that these compounds exhibited excellent anti-HIV activity with no apparent cytotoxicity. Subsequently, the enantiomers of these compounds were separated using enzyme-mediated kinetic resolutions and their (-)-enantiomers (3TC and FTC, respectively) were found to have exceptionally attractive preclinical profiles. In addition to their anti-HIV activity, 3TC and FTC potently inhibit the replication of hepatitis B virus. The development of FTC, which was being carried out by Burroughs Wellcome, had many remarkable starts and stops. For example, passage studies indicated that the compound rapidly selected for a single resistant mutant, M184V, and that this strain was 500-1000-fold less sensitive to FTC than was wild-type virus. Fortunately, it was found that combinations of AZT with either 3TC or FTC were synergistic. The effectiveness of AZT-3TC combination therapy was subsequently demonstrated in four independent clinical trials, and in 1997, the FDA approved Combivir, a fixed-dose combination of AZT and 3TC. In phase 1 clinical trials, FTC was well tolerated by all subjects with no adverse events observed. However, the development of FTC was halted by the acquisition of Wellcome PLC by Glaxo PLC in January 1995. In 1996, Triangle Pharmaceuticals licensed FTC from Emory and initiated a series of phase I/II clinical studies that demonstrated the safety and efficacy of the drug. In August 1998, FTC was granted "Fast Track" status, based primarily on its potential for once daily dosing. While the outcomes of two subsequent phase III trials were positive, a third phase III clinical trial involving combinations of 3TC or FTC with stavudine and nevirapine had to be terminated due to serious liver-related adverse events. Although analysis of the data suggested that the liver toxicity was due to nevirapine, the FDA decided that the study could not be used for drug registration. Ultimately, in January 2003, Gilead Sciences acquired Triangle Pharmaceuticals and completed the development of FTC (emtricitabine), which was approved for once-a-day, oral administration in July 2003. A year later, Truvada, a once-a-day, oral, fixed-dose combination of emtricitabine and tenofovir disoproxil fumarate received FDA approval and quickly became the accepted first-line therapy when used with a third antiretroviral agent. In July 2006, the FDA approved Atripla, a once-a-day, oral, fixed-dose combination of emtricitabine, tenofovir disoproxil fumarate, and efavirenz, which represented the culmination of two decades of research that had transformed AIDS from a death sentence to a manageable chronic disease.
Collectively loading an application in a parallel computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.
Collectively loading an application in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: identifying, by a parallel computer control system, a subset of compute nodes in the parallel computer to execute a job; selecting, by the parallel computer control system, one of the subset of compute nodes in the parallel computer as a job leader compute node; retrieving, by the job leader compute node from computer memory, an application for executing the job; and broadcasting, by the job leader to the subset of compute nodes in the parallel computer, the application for executing the job.
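A hedged sketch of that broadcast pattern is shown below, written with mpi4py rather than the Blue Gene control-system interfaces the abstract refers to; the file name and the choice of rank 0 as job leader are assumptions for illustration.

    # Illustrative mpi4py sketch of the pattern described above: one compute node in
    # the job's subset acts as job leader, reads the application image from storage,
    # and broadcasts it to the other participating nodes.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    LEADER = 0                                        # assume rank 0 was selected as job leader
    app_image = None
    if rank == LEADER:
        with open("app_executable.bin", "rb") as f:   # hypothetical application file
            app_image = f.read()

    # Collective broadcast: every participating node receives the same image.
    app_image = comm.bcast(app_image, root=LEADER)

    # Each node could now write the image locally and exec it to start the job.
    print(f"rank {rank}: received {len(app_image)} bytes of application image")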
Distributing an executable job load file to compute nodes in a parallel computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gooding, Thomas M.
Distributing an executable job load file to compute nodes in a parallel computer, the parallel computer comprising a plurality of compute nodes, including: determining, by a compute node in the parallel computer, whether the compute node is participating in a job; determining, by the compute node in the parallel computer, whether a descendant compute node is participating in the job; responsive to determining that the compute node is participating in the job or that the descendant compute node is participating in the job, communicating, by the compute node to a parent compute node, an identification of a data communications link over which the compute node receives data from the parent compute node; constructing a class route for the job, wherein the class route identifies all compute nodes participating in the job; and broadcasting the executable load file for the job along the class route for the job.
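The bottom-up participation test that feeds the class route can be illustrated on a small tree, as in the sketch below (plain Python over a hypothetical node tree, not the actual collective-network routing): a node belongs on the class route if it participates in the job itself or if any of its descendants does.

    # Minimal sketch (not the Blue Gene class-route machinery) of the bottom-up step
    # described above.
    def build_class_route(tree, participating, node=0):
        """tree: dict mapping node -> list of child nodes; participating: set of node ids.
        Returns the set of nodes belonging on the class route rooted at `node`."""
        route = set()
        for child in tree.get(node, []):
            route |= build_class_route(tree, participating, child)
        if node in participating or route:
            route.add(node)            # include this node so the broadcast can pass through it
        return route

    # Hypothetical 7-node binary tree rooted at node 0, with three participating leaves.
    tree = {0: [1, 2], 1: [3, 4], 2: [5, 6]}
    print(sorted(build_class_route(tree, participating={3, 4, 6})))   # -> [0, 1, 2, 3, 4, 6]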
Organization of the secure distributed computing based on multi-agent system
NASA Astrophysics Data System (ADS)
Khovanskov, Sergey; Rumyantsev, Konstantin; Khovanskova, Vera
2018-04-01
Nowadays, the development of methods for distributed computing receives much attention. One such method is the use of multi-agent systems. Distributed computing organized over conventional networked computers can face security threats arising from the computational processes themselves. The authors have developed a unified agent algorithm for a control system that governs the operation of computing network nodes, with networked PCs serving as compute nodes. The proposed multi-agent control system makes it possible, in a short time, to harness the processing power of the computers of any existing network to solve large tasks by creating a distributed computing system. Agents deployed on a computer network can configure the distributed computing system, distribute the computational load among the agent-operated computers, and optimize the distributed computing system according to the computing power of the computers on the network. The number of computers connected to the network can be increased by adding new machines to the computing system, which raises the overall processing power. Adding a central agent to the multi-agent system increases the security of the distributed computation. This organization of the distributed computing system reduces problem-solving time and increases the fault tolerance (vitality) of computing processes in a changing computing environment (a dynamically varying number of computers on the network). The developed multi-agent system detects cases in which results of the distributed system are falsified, which could otherwise lead to wrong decisions; in addition, the system checks and corrects erroneous results.
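One piece of such a system, splitting a batch of tasks among nodes in proportion to each node's computing power, might look like the toy sketch below; the node names and power figures are invented, and the agent communication protocol itself is not modeled.

    # Toy sketch of proportional load distribution across networked compute nodes.
    def distribute_tasks(tasks, node_power):
        total = sum(node_power.values())
        shares, assigned = {}, 0
        nodes = list(node_power)
        for i, node in enumerate(nodes):
            if i == len(nodes) - 1:                    # last node takes the remainder
                count = len(tasks) - assigned
            else:
                count = round(len(tasks) * node_power[node] / total)
            shares[node] = tasks[assigned:assigned + count]
            assigned += count
        return shares

    work = list(range(100))                            # 100 independent task ids
    print({n: len(t) for n, t in
           distribute_tasks(work, {"pc-a": 4.0, "pc-b": 2.0, "pc-c": 2.0}).items()})
    # -> {'pc-a': 50, 'pc-b': 25, 'pc-c': 25}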
Aggregating job exit statuses of a plurality of compute nodes executing a parallel application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.
Aggregating job exit statuses of a plurality of compute nodes executing a parallel application, including: identifying a subset of compute nodes in the parallel computer to execute the parallel application; selecting one compute node in the subset of compute nodes in the parallel computer as a job leader compute node; initiating execution of the parallel application on the subset of compute nodes; receiving an exit status from each compute node in the subset of compute nodes, where the exit status for each compute node includes information describing execution of some portion of the parallel application by the compute node; aggregating each exit status from each compute node in the subset of compute nodes; and sending an aggregated exit status for the subset of compute nodes in the parallel computer.
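Again in mpi4py rather than the control system described in the abstract, the gather-and-aggregate pattern might be sketched as follows; the status encoding and the worst-code aggregation rule are assumptions for illustration.

    # Illustrative mpi4py sketch: each node reports its own exit status, the job
    # leader gathers them, and a single aggregated status is produced.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    LEADER = 0                                        # assume this rank is the job leader

    # Each node's exit status: here just a (rank, return-code) pair; a real status
    # would also describe which portion of the parallel application the node ran.
    my_status = (rank, 0 if rank % 2 == 0 else 1)     # fabricated return codes for the demo

    statuses = comm.gather(my_status, root=LEADER)    # leader collects every node's status
    if rank == LEADER:
        worst = max(code for _, code in statuses)     # aggregation rule: worst return code wins
        print("per-node statuses:", statuses)
        print("aggregated job exit status:", worst)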
Neural Computation and the Computational Theory of Cognition
ERIC Educational Resources Information Center
Piccinini, Gualtiero; Bahar, Sonya
2013-01-01
We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…
Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat
2010-10-01
This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally intermediate and upper level. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.
Computer Fear and Anxiety in the United States Army
1991-03-01
Subject terms: computer fear, computer anxiety, computerphobia, cyberphobia, technostress, computer aversion, computerphrenia. ... physiological and psychological disorders that impact not only on individuals, but on organizations as well. "Technostress" is a related term which is ... computers, technostress, computer anxious, computer resistance, terminal phobia, fear of technology, computer distrust, and computer aversion. Whatever ...
Cloud@Home: A New Enhanced Computing Paradigm
NASA Astrophysics Data System (ADS)
Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco
Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet computing ("… a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)), and Green computing (a new frontier of Ethical computing, starting from the assumption that in the near future energy costs will be related to environmental pollution).
ERIC Educational Resources Information Center
Erdogan, Yavuz
2009-01-01
The purpose of this paper is to compare the effects of paper-based and computer-based concept mapping on computer hardware achievement, computer anxiety, and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…
When does a physical system compute?
Horsman, Clare; Stepney, Susan; Wagner, Rob C; Kendon, Viv
2014-09-08
Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a 'computational entity', and its critical role in defining when computing is taking place in physical systems.
When does a physical system compute?
Horsman, Clare; Stepney, Susan; Wagner, Rob C.; Kendon, Viv
2014-01-01
Computing is a high-level process of a physical system. Recent interest in non-standard computing systems, including quantum and biological computers, has brought this physical basis of computing to the forefront. There has been, however, no consensus on how to tell if a given physical system is acting as a computer or not, leading to confusion over novel computational devices, and even claims that every physical event is a computation. In this paper, we introduce a formal framework that can be used to determine whether a physical system is performing a computation. We demonstrate how the abstract computational level interacts with the physical device level, in comparison with the use of mathematical models in experimental science. This powerful formulation allows a precise description of experiments, technology, computation and simulation, giving our central conclusion: physical computing is the use of a physical system to predict the outcome of an abstract evolution. We give conditions for computing, illustrated using a range of non-standard computing scenarios. The framework also covers broader computing contexts, where there is no obvious human computer user. We introduce the notion of a ‘computational entity’, and its critical role in defining when computing is taking place in physical systems. PMID:25197245
48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...
48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...
48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...
48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...
48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-841] Certain Computers and Computer Peripheral... after importation of certain computers and computer peripheral devices and components thereof and... computers and computer peripheral devices and components thereof and products containing the same that...
Code of Federal Regulations, 2010 CFR
2010-10-01
... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...
Code of Federal Regulations, 2011 CFR
2011-10-01
... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...
Code of Federal Regulations, 2012 CFR
2012-10-01
... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...
Code of Federal Regulations, 2011 CFR
2011-10-01
... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...
Code of Federal Regulations, 2014 CFR
2014-10-01
... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...
Code of Federal Regulations, 2014 CFR
2014-10-01
... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...
Code of Federal Regulations, 2010 CFR
2010-10-01
... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...
Code of Federal Regulations, 2013 CFR
2013-10-01
... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...
Code of Federal Regulations, 2014 CFR
2014-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2011 CFR
2011-10-01
... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...
Code of Federal Regulations, 2014 CFR
2014-10-01
... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...
Code of Federal Regulations, 2012 CFR
2012-10-01
... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...
Code of Federal Regulations, 2010 CFR
2010-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2012 CFR
2012-10-01
... computer software or commercial computer software documentation. 227.7202-3 Section 227.7202-3 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202-3 Rights in commercial computer software or commercial computer software documentation...
Code of Federal Regulations, 2012 CFR
2012-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2010 CFR
2010-10-01
... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...
Code of Federal Regulations, 2013 CFR
2013-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2011 CFR
2011-10-01
..., and warranty of computer software and computer software documentation. 227.7203-14 Section 227.7203-14... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...
Code of Federal Regulations, 2013 CFR
2013-10-01
... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and marking of computer software or computer software documentation to be furnished with restrictive markings...
Code of Federal Regulations, 2013 CFR
2013-10-01
... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...
The Research of the Parallel Computing Development from the Angle of Cloud Computing
NASA Astrophysics Data System (ADS)
Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun
2017-10-01
Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing brings parallel computing into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
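As an informal point of reference for the programming-model comparison in the preceding abstract, the following minimal Python sketch expresses the same summation job serially and in a map-then-reduce style using the standard multiprocessing module; the task and all names are illustrative assumptions of this note, not code from the cited paper.

    # Contrast a serial reduction with a map-reduce style reduction over
    # worker processes. Illustrative only; the paper evaluates OpenMP, MPI
    # and MapReduce themselves, not this Python code.
    from multiprocessing import Pool

    def square(x):
        # "map" step: independent per-element work
        return x * x

    def serial_sum_of_squares(data):
        return sum(x * x for x in data)

    def mapreduce_sum_of_squares(data, workers=4):
        with Pool(processes=workers) as pool:
            mapped = pool.map(square, data)   # map phase, run in parallel
        return sum(mapped)                    # reduce phase

    if __name__ == "__main__":
        data = list(range(100_000))
        assert serial_sum_of_squares(data) == mapreduce_sum_of_squares(data)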
Archer, Charles J; Faraj, Ahmad A; Inglett, Todd A; Ratterman, Joseph D
2013-04-16
Methods, apparatus, and products are disclosed for providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: receiving a network packet in a compute node, the network packet specifying a destination compute node; selecting, in dependence upon the destination compute node, at least one of the links for the compute node along which to forward the network packet toward the destination compute node; and forwarding the network packet along the selected link to the adjacent compute node connected to the compute node through the selected link.
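The per-packet routing step described in the preceding abstract, choosing which of a node's links leads toward the destination in a tree-shaped network, can be sketched in Python as follows; the topology and the breadth-first search used here are assumptions made for illustration, not the patented implementation.

    from collections import deque

    # links[node] = set of adjacent nodes (a tree-shaped combining network)
    links = {
        0: {1, 2}, 1: {0, 3, 4}, 2: {0, 5, 6},
        3: {1}, 4: {1}, 5: {2}, 6: {2},
    }

    def select_link(current, destination):
        """Return the adjacent node whose subtree contains the destination."""
        for neighbor in links[current]:
            # breadth-first search of the subtree reached through this link,
            # never walking back through the current node
            seen, queue = {current, neighbor}, deque([neighbor])
            while queue:
                node = queue.popleft()
                if node == destination:
                    return neighbor
                for nxt in links[node] - seen:
                    seen.add(nxt)
                    queue.append(nxt)
        raise ValueError("destination unreachable")

    # forwarding a packet from node 3 toward node 6 first crosses the link 3 -> 1
    assert select_link(3, 6) == 1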
Computer Anxiety: How to Measure It?
ERIC Educational Resources Information Center
McPherson, Bill
1997-01-01
Provides an overview of five scales that are used to measure computer anxiety: Computer Anxiety Index, Computer Anxiety Scale, Computer Attitude Scale, Attitudes toward Computers, and Blombert-Erickson-Lowrey Computer Attitude Task. Includes background information and scale specifics. (JOW)
Yaghmaie, Farideh; Jayasuriya, Rohan
2004-01-01
There have been many changes made to information systems in the last decade. Changes in information systems require users constantly to update their computer knowledge and skills. Computer training is a critical issue for any user because it offers them considerable new skills. The purpose of this study was to measure the effects of 'subjective computer training' and management support on attitudes to computers, computer anxiety and subjective norms to use computers. The data were collected from community health centre staff. The results of the study showed that health staff trained in computer use had more favourable attitudes to computers, less computer anxiety and more awareness of others' expectations about computer use than untrained users. However, there was no relationship between management support and computer attitude, computer anxiety or subjective norms. Lack of computer training for the majority of healthcare staff confirmed the need for more attention to this issue, particularly in health centres.
Code of Federal Regulations, 2011 CFR
2011-10-01
... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...
Code of Federal Regulations, 2010 CFR
2010-10-01
... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...
Code of Federal Regulations, 2012 CFR
2012-10-01
... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...
Code of Federal Regulations, 2012 CFR
2012-10-01
... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...
Code of Federal Regulations, 2011 CFR
2011-10-01
... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...
Code of Federal Regulations, 2014 CFR
2014-10-01
... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...
Code of Federal Regulations, 2013 CFR
2013-10-01
... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...
Code of Federal Regulations, 2010 CFR
2010-10-01
... deferred ordering of computer software and computer software documentation. 227.7203-8 Section 227.7203-8... GENERAL CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-8 Deferred delivery and deferred ordering of computer software and computer...
Code of Federal Regulations, 2013 CFR
2013-10-01
... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...
Code of Federal Regulations, 2014 CFR
2014-10-01
... computer software or computer software documentation to be furnished to the Government with restrictions on..., DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-3 Early identification of computer software or computer software documentation to be furnished to the Government with...
NASA Astrophysics Data System (ADS)
Furht, Borko
In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.
Educational Technology: Best Practices from America's Schools.
ERIC Educational Resources Information Center
Bozeman, William C.; Baumbach, Donna J.
This book begins with an overview of computer technology concepts, including computer system configurations, computer communications, and software. Instructional computer applications are then discussed; topics include computer-assisted instruction, computer-managed instruction, computer-enhanced instruction, LOGO, authoring programs, presentation…
48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...
48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...
48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...
48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...
48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...
48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...
48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...
48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...
48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...
48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...
ERIC Educational Resources Information Center
Celik, Vehbi; Yesilyurt, Etem
2013-01-01
There is a large body of research regarding computer supported education, perceptions of computer self-efficacy, computer anxiety and the technological attitudes of teachers and teacher candidates. However, no study has been conducted on the correlation between and effect of computer supported education, perceived computer self-efficacy, computer…
Proposal for grid computing for nuclear applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Idris, Faridah Mohamad; Ismail, Saaidi; Haris, Mohd Fauzi B.
2014-02-12
The use of computer clusters for the computational sciences, including computational physics, is vital as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has now become a necessity. In this paper, we describe how clusters running a specific application could use resources within the grid to speed up the computing process.
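In the spirit of the cluster and grid usage described above, here is a minimal Python sketch of farming a Monte Carlo estimate out to worker processes; the particular task (estimating pi by random sampling) and all names are assumptions of this note, not the nuclear application in the cited work.

    import random
    from multiprocessing import Pool

    def count_hits(samples):
        """Count random points that fall inside the unit quarter circle."""
        rng = random.Random()
        hits = 0
        for _ in range(samples):
            x, y = rng.random(), rng.random()
            if x * x + y * y <= 1.0:
                hits += 1
        return hits

    if __name__ == "__main__":
        workers, samples_per_worker = 4, 250_000
        with Pool(processes=workers) as pool:
            hits = sum(pool.map(count_hits, [samples_per_worker] * workers))
        pi_estimate = 4.0 * hits / (workers * samples_per_worker)
        print(f"pi is approximately {pi_estimate:.4f}")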
Demonstration of blind quantum computing.
Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip
2012-01-20
Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Charles J.; Faraj, Daniel A.; Inglett, Todd A.
Methods, apparatus, and products are disclosed for providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: receiving a network packet in a compute node, the network packet specifying a destination compute node; selecting, in dependence upon the destination compute node, at least one of the links for the compute node along which to forward the network packet toward the destination compute node; and forwarding the network packet along the selected link to the adjacent compute node connected to the compute node through the selected link.
Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, whether they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and patterns of student interaction. Female and male students did not have any major issues in using computers. In computer programming, however, female students were not heavily involved in computing activities whereas male students were. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-02
... of the Matching Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L.... 100-503, the Computer Matching and Privacy Protection Act (CMPPA) of 1988), the Office of Management... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS...
A Short History of the Computer.
ERIC Educational Resources Information Center
Leon, George
1984-01-01
Briefly traces the development of computers from the abacus, John Napier's logarithms, the first computer/calculator (known as the Difference Engine), the first computer programming via steel punched cards, the electrical analog computer, electronic digital computer, and the transistor to the microchip of today's computers. (MBR)
Code of Federal Regulations, 2011 CFR
2011-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Code of Federal Regulations, 2013 CFR
2013-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Code of Federal Regulations, 2012 CFR
2012-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Code of Federal Regulations, 2014 CFR
2014-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Code of Federal Regulations, 2010 CFR
2010-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Differences in muscle load between computer and non-computer work among office workers.
Richter, J M; Mathiassen, S E; Slijper, H P; Over, E A B; Frens, M A
2009-12-01
Introduction of more non-computer tasks has been suggested to increase exposure variation and thus reduce musculoskeletal complaints (MSC) in computer-intensive office work. This study investigated whether muscle activity did, indeed, differ between computer and non-computer activities. Whole-day logs of input device use in 30 office workers were used to identify computer and non-computer work, using a range of classification thresholds (non-computer thresholds (NCTs)). Exposure during these activities was assessed by bilateral electromyography recordings from the upper trapezius and lower arm. Contrasts in muscle activity between computer and non-computer work were distinct but small, even at the individualised, optimal NCT. Using an average group-based NCT resulted in less contrast, even in smaller subgroups defined by job function or MSC. Thus, computer activity logs should be used cautiously as proxies of biomechanical exposure. Conventional non-computer tasks may have a limited potential to increase variation in muscle activity during computer-intensive office work.
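The classification idea in the preceding abstract, labelling a stretch of the workday as non-computer work when the gap between successive input-device events exceeds a threshold (the NCT), can be sketched as follows; the event times and threshold are invented for illustration and are not the study's data.

    # Threshold-based classification of computer vs. non-computer work from an
    # input-device event log. Times are in seconds; values are made up.
    def classify_periods(event_times, nct):
        """Split the day into (start, end, label) periods using the gap rule."""
        periods = []
        for earlier, later in zip(event_times, event_times[1:]):
            gap = later - earlier
            label = "non-computer" if gap > nct else "computer"
            periods.append((earlier, later, label))
        return periods

    log = [0, 5, 12, 300, 305, 900, 903]   # hypothetical input-event timestamps
    for start, end, label in classify_periods(log, nct=60):
        print(f"{start:>4}-{end:>4}s  {label}")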
Nurses' computer literacy and attitudes towards the use of computers in health care.
Gürdaş Topkaya, Sati; Kaya, Nurten
2015-05-01
This descriptive and cross-sectional study was designed to address nurses' computer literacy and attitudes towards the use of computers in health care and to determine the correlation between these two variables. This study was conducted with the participation of 688 nurses who worked at two university-affiliated hospitals. These nurses were chosen using a stratified random sampling method. The data were collected using the Multicomponent Assessment of Computer Literacy and the Pretest for Attitudes Towards Computers in Healthcare Assessment Scale v. 2. The nurses, in general, had positive attitudes towards computers, and their computer literacy was good. Computer literacy in general had significant positive correlations with individual elements of computer competency and with attitudes towards computers. If the computer is to be an effective and beneficial part of the health-care system, it is necessary to help nurses improve their computer competency. © 2014 Wiley Publishing Asia Pty Ltd.
ERIC Educational Resources Information Center
Johnson, Donald M.; Ferguson, James A.; Lester, Melissa L.
1999-01-01
Of 175 freshmen agriculture students, 74% had prior computer courses, 62% owned computers. The number of computer topics studied predicted both computer self-efficacy and computer knowledge. A substantial positive correlation was found between self-efficacy and computer knowledge. (SK)
Impact of Classroom Computer Use on Computer Anxiety.
ERIC Educational Resources Information Center
Lambert, Matthew E.; And Others
Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…
Power throttling of collections of computing elements
Bellofatto, Ralph E [Ridgefield, CT]; Coteus, Paul W [Yorktown Heights, NY]; Crumley, Paul G [Yorktown Heights, NY]; Gara, Alan G [Mount Kisco, NY]; Giampapa, Mark E [Irvington, NY]; Gooding, Thomas M [Rochester, MN]; Haring, Rudolf A [Cortlandt Manor, NY]; Megerian, Mark G [Rochester, MN]; Ohmacht, Martin [Yorktown Heights, NY]; Reed, Don D [Mantorville, MN]; Swetz, Richard A [Mahopac, NY]; Takken, Todd [Brewster, NY]
2011-08-16
An apparatus and method for controlling power usage in a computer includes a plurality of computers communicating with a local control device, and a power source supplying power to the local control device and the computer. A plurality of sensors communicate with the computer for ascertaining power usage of the computer, and a system control device communicates with the computer for controlling power usage of the computer.
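A toy Python control loop in the spirit of the apparatus described above: a local control device polls per-computer power sensors and throttles computers when the total draw exceeds a budget. The sensor model, the per-computer saving, and the policy are all illustrative assumptions, not the patented mechanism.

    import random

    POWER_BUDGET_W = 900.0

    def read_sensor(computer_id):
        # stand-in for a real power sensor; returns watts
        return random.uniform(150.0, 260.0)

    def control_step(computer_ids, throttled):
        total = sum(read_sensor(c) for c in computer_ids)
        if total > POWER_BUDGET_W:
            # throttle computers one by one until back under budget (toy policy)
            for c in computer_ids:
                if c not in throttled:
                    throttled.add(c)
                    total -= 60.0          # assumed saving per throttled computer
                    if total <= POWER_BUDGET_W:
                        break
        elif throttled and total < 0.8 * POWER_BUDGET_W:
            throttled.pop()                # relax throttling when there is headroom
        return total, throttled

    throttled = set()
    for step in range(5):
        total, throttled = control_step(range(4), throttled)
        print(f"step {step}: {total:6.1f} W, throttled={sorted(throttled)}")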
ERIC Educational Resources Information Center
Tosun, Nilgün; Suçsuz, Nursen; Yigit, Birol
2006-01-01
The purpose of this research was to investigate the effects of computer-assisted and computer-based instructional methods on students' achievement in computer classes and on their attitudes towards using computers. The study, which was completed in 6 weeks, was carried out with 94 sophomores studying in the formal education program of Primary…
Code of Federal Regulations, 2013 CFR
2013-10-01
... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...
Code of Federal Regulations, 2012 CFR
2012-10-01
... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...
Code of Federal Regulations, 2014 CFR
2014-10-01
... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...
Code of Federal Regulations, 2011 CFR
2011-10-01
... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...
Precollege Computer Literacy: A Personal Computing Approach. Second Edition.
ERIC Educational Resources Information Center
Moursund, David
Intended for elementary and secondary teachers and curriculum specialists, this booklet discusses and defines computer literacy as a functional knowledge of computers and their effects on students and the rest of society. It analyzes personal computing and the aspects of computers that have direct impact on students. Outlining computer-assisted…
Code of Federal Regulations, 2010 CFR
2010-10-01
... computer software and noncommercial computer software documentation. 252.227-7014 Section 252.227-7014... Rights in noncommercial computer software and noncommercial computer software documentation. As prescribed in 227.7203-6(a)(1), use the following clause. Rights in Noncommercial Computer Software and...
Synchronizing compute node time bases in a parallel computer
Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip
2015-01-27
Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
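The latency-based step in the preceding abstract, setting each node's time base to the data transmission latency from the root so that the root's pulse gives all nodes a common reference, can be sketched as follows; the tree, the per-link latencies, and the pulse mechanics are assumptions made for illustration.

    # Accumulate per-link latencies down a tree; each node adopts its latency
    # from the root as its time base when the pulse arrives. Illustrative only.
    tree = {"root": ["a", "b"], "a": ["c"], "b": [], "c": []}
    link_latency = {("root", "a"): 2.0, ("root", "b"): 3.5, ("a", "c"): 1.5}

    def latency_from_root(tree, link_latency, root="root"):
        """Depth-first accumulation of link latencies from the root."""
        latency = {root: 0.0}
        stack = [root]
        while stack:
            parent = stack.pop()
            for child in tree[parent]:
                latency[child] = latency[parent] + link_latency[(parent, child)]
                stack.append(child)
        return latency

    time_base = latency_from_root(tree, link_latency)
    # on receipt of the pulse, every node adopts its latency as its time base
    print(time_base)   # {'root': 0.0, 'a': 2.0, 'b': 3.5, 'c': 3.5}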
Synchronizing compute node time bases in a parallel computer
Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip
2014-12-30
Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
Specialized computer architectures for computational aerodynamics
NASA Technical Reports Server (NTRS)
Stevenson, D. K.
1978-01-01
In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general purpose computers, a cost that is high in terms of dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.
Physarum machines: encapsulating reaction-diffusion to compute spanning tree
NASA Astrophysics Data System (ADS)
Adamatzky, Andrew
2007-12-01
The Physarum machine is a biological computing device, which employs plasmodium of Physarum polycephalum as an unconventional computing substrate. A reaction-diffusion computer is a chemical computing device that computes by propagating diffusive or excitation wave fronts. Reaction-diffusion computers, despite being computationally universal machines, are unable to construct certain classes of proximity graphs without the assistance of an external computing device. I demonstrate that the problem can be solved if the reaction-diffusion system is enclosed in a membrane with a few ‘growth points’, sites guiding the pattern propagation. Experimental approximation of spanning trees by the P. polycephalum slime mold demonstrates the feasibility of the approach. These findings advance the theory of reaction-diffusion computation by enriching it with ideas from slime mold computation.
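As a conventional reference point for the spanning trees that the plasmodium approximates, here is a short Python sketch of Prim's minimum spanning tree algorithm on a small weighted graph; the graph data are invented and the algorithm is a standard one, not the biological mechanism described above.

    import heapq

    # Prim's algorithm on a small weighted graph (made-up data).
    graph = {
        "a": {"b": 2.0, "c": 4.0},
        "b": {"a": 2.0, "c": 1.0, "d": 7.0},
        "c": {"a": 4.0, "b": 1.0, "d": 3.0},
        "d": {"b": 7.0, "c": 3.0},
    }

    def prim_spanning_tree(graph, start="a"):
        visited, tree, frontier = {start}, [], []
        for neighbor, weight in graph[start].items():
            heapq.heappush(frontier, (weight, start, neighbor))
        while frontier and len(visited) < len(graph):
            weight, u, v = heapq.heappop(frontier)
            if v in visited:
                continue
            visited.add(v)
            tree.append((u, v, weight))
            for neighbor, w in graph[v].items():
                if neighbor not in visited:
                    heapq.heappush(frontier, (w, v, neighbor))
        return tree

    print(prim_spanning_tree(graph))
    # [('a', 'b', 2.0), ('b', 'c', 1.0), ('c', 'd', 3.0)]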
Efficient universal blind quantum computation.
Giovannetti, Vittorio; Maccone, Lorenzo; Morimae, Tomoyuki; Rudolph, Terry G
2013-12-06
We give a cheat sensitive protocol for blind universal quantum computation that is efficient in terms of computational and communication resources: it allows one party to perform an arbitrary computation on a second party's quantum computer without revealing either which computation is performed, or its input and output. The first party's computational capabilities can be extremely limited: she must only be able to create and measure single-qubit superposition states. The second party is not required to use measurement-based quantum computation. The protocol requires the (optimal) exchange of O(J log2(N)) single-qubit states, where J is the computational depth and N is the number of qubits needed for the computation.
Computer Literacy Project. A General Orientation in Basic Computer Concepts and Applications.
ERIC Educational Resources Information Center
Murray, David R.
This paper proposes a two-part, basic computer literacy program for university faculty, staff, and students with no prior exposure to computers. The program described would introduce basic computer concepts and computing center service programs and resources; provide fundamental preparation for other computer courses; and orient faculty towards…
Know Your Discipline: Teaching the Philosophy of Computer Science
ERIC Educational Resources Information Center
Tedre, Matti
2007-01-01
The diversity and interdisciplinarity of computer science and the multiplicity of its uses in other sciences make it hard to define computer science and to prescribe how computer science should be carried out. The diversity of computer science also causes friction between computer scientists from different branches. Computer science curricula, as…
ERIC Educational Resources Information Center
McInerney, Valentina; And Others
This study examined the effects of increased computing experience on the computer anxiety of 101 first year preservice teacher education students at a regional university in Australia. Three instruments measuring computer anxiety and attitudes--the Computer Anxiety Rating Scale (CARS), Attitudes Towards Computers Scale (ATCS), and Computer…
Code of Federal Regulations, 2011 CFR
2011-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2010 CFR
2010-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2013 CFR
2013-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
Code of Federal Regulations, 2012 CFR
2012-07-01
... pertaining to computer shareware and donation of public domain computer software. 201.26 Section 201.26... public domain computer software. (a) General. This section prescribes the procedures for submission of legal documents pertaining to computer shareware and the deposit of public domain computer software...
ERIC Educational Resources Information Center
Brady, Corey; Orton, Kai; Weintrop, David; Anton, Gabriella; Rodriguez, Sebastian; Wilensky, Uri
2017-01-01
Computer science (CS) is becoming an increasingly diverse domain. This paper reports on an initiative designed to introduce underrepresented populations to computing using an eclectic, multifaceted approach. As part of a yearlong computing course, students engage in Maker activities, participatory simulations, and computing projects that…
ERIC Educational Resources Information Center
Rozell, E. J.; Gardner, W. L., III
1999-01-01
A model of the intrapersonal processes impacting computer-related performance was tested using data from 75 manufacturing employees in a computer training course. Gender, computer experience, and attributional style were predictive of computer attitudes, which were in turn related to computer efficacy, task-specific performance expectations, and…
Computer ergonomics: the medical practice guide to developing good computer habits.
Hills, Laura
2011-01-01
Medical practice employees are likely to use computers for at least some of their work. Some sit several hours each day at computer workstations. Therefore, it is important that members of your medical practice team develop good computer work habits and that they know how to align equipment, furniture, and their bodies to prevent strain, stress, and computer-related injuries. This article delves into the field of computer ergonomics: the design of computer workstations and work habits to reduce user fatigue, discomfort, and injury. It describes practical strategies medical practice employees can use to improve their computer work habits. Specifically, this article describes the proper use of the computer workstation chair, the ideal placement of the computer monitor and keyboard, and the best lighting for computer work areas and tasks. Moreover, this article includes computer ergonomic guidelines especially for bifocal and progressive lens wearers and offers 10 tips for proper mousing. Ergonomically correct posture, movements, positioning, and equipment are all described in detail to enable the frequent computer user in your medical practice to remain healthy, pain-free, and productive.
Increasing processor utilization during parallel computation rundown
NASA Technical Reports Server (NTRS)
Jones, W. H.
1986-01-01
Some parallel processing environments provide for asynchronous execution and completion of general purpose parallel computations from a single computational phase. When all the computations from such a phase are complete, a new parallel computational phase is begun. Depending upon the granularity of the parallel computations to be performed, there may be a shortage of available work as a particular computational phase draws to a close (computational rundown). This can result in the waste of computing resources and the delay of the overall problem. In many practical instances, strict sequential ordering of phases of parallel computation is not totally required. In such cases, the beginning of one phase can be correctly computed before the end of a previous phase is completed. This allows additional work to be generated somewhat earlier to keep computing resources busy during each computational rundown. The conditions under which this can occur are identified and the frequency of occurrence of such overlapping in an actual parallel Navier-Stokes code is reported. A language construct is suggested and possible control strategies for the management of such computational phase overlapping are discussed.
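A toy Python scheduler sketch of the overlapping idea above: when phase N's queue runs dry during rundown, independent phase N+1 tasks are released early to keep workers busy. The queues, task names, and release policy are assumptions of this note, not the Navier-Stokes code discussed in the abstract.

    from collections import deque

    phase_queues = {1: deque(["p1-t1", "p1-t2", "p1-t3"]),
                    2: deque(["p2-t1", "p2-t2"])}

    def next_task(current_phase):
        if phase_queues[current_phase]:
            return current_phase, phase_queues[current_phase].popleft()
        # rundown: the current phase's queue is empty (in a real system some of
        # its tasks may still be executing); release next-phase tasks that are
        # assumed independent of the unfinished ones
        nxt = current_phase + 1
        if nxt in phase_queues and phase_queues[nxt]:
            return nxt, phase_queues[nxt].popleft()
        return None

    phase = 1
    while (assignment := next_task(phase)) is not None:
        task_phase, task = assignment
        print(f"worker picks up {task} (phase {task_phase})")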
Gender stereotypes, aggression, and computer games: an online survey of women.
Norris, Kamala O
2004-12-01
Computer games were conceptualized as a potential mode of entry into computer-related employment for women. Computer games contain increasing levels of realism and violence, as well as biased gender portrayals. It has been suggested that aggressive personality characteristics attract people to aggressive video games, and that more women do not play computer games because they are socialized to be non-aggressive. To explore gender identity and aggressive personality in the context of computers, an online survey was conducted on women who played computer games and women who used the computer but did not play computer games. Women who played computer games perceived their online environments as less friendly but experienced less sexual harassment online, were more aggressive themselves, and did not differ in gender identity, degree of sex role stereotyping, or acceptance of sexual violence when compared to women who used the computer but did not play video games. Finally, computer gaming was associated with decreased participation in computer-related employment; however, women with high masculine gender identities were more likely to use computers at work.
Recent development on computer aided tissue engineering--a review.
Sun, Wei; Lal, Pallavi
2002-02-01
The utilization of computer-aided technologies in tissue engineering has evolved in the development of a new field of computer-aided tissue engineering (CATE). This article reviews recent development and application of enabling computer technology, imaging technology, computer-aided design and computer-aided manufacturing (CAD and CAM), and rapid prototyping (RP) technology in tissue engineering, particularly, in computer-aided tissue anatomical modeling, three-dimensional (3-D) anatomy visualization and 3-D reconstruction, CAD-based anatomical modeling, computer-aided tissue classification, computer-aided tissue implantation and prototype modeling assisted surgical planning and reconstruction.
Computer hardware fault administration
Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.
2010-09-14
Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
Aono, Masashi; Gunji, Yukio-Pegio
2003-10-01
The emergence that derives from errors is of key importance for both novel computing and novel usage of the computer. In this paper, we propose an implementable experimental plan for biological computing that elicits the emergent properties of complex systems. An individual plasmodium of the true slime mold Physarum polycephalum acts as the slime mold computer. Modifying the elementary cellular automaton so that it entails the global synchronization problem of parallel computing yields the NP-complete problem solved by the slime mold computer. The possibility of solving the problem without providing either all possible results or an explicit prescription for solution-seeking is discussed. In slime mold computing, the distributivity of the local computing logic can change dynamically, and its parallel non-distributed computing cannot be reduced to the spatial addition of multiple serial computations. A computing system based on the exhaustive absence of a super-system may produce something more than filling the vacancy.
Computer Use and Computer Anxiety in Older Korean Americans.
Yoon, Hyunwoo; Jang, Yuri; Xie, Bo
2016-09-01
Responding to the limited literature on computer use in ethnic minority older populations, the present study examined predictors of computer use and computer anxiety in older Korean Americans. Separate regression models were estimated for computer use and computer anxiety with the common sets of predictors: (a) demographic variables (age, gender, marital status, and education), (b) physical health indicators (chronic conditions, functional disability, and self-rated health), and (c) sociocultural factors (acculturation and attitudes toward aging). Approximately 60% of the participants were computer-users, and they had significantly lower levels of computer anxiety than non-users. A higher likelihood of computer use and lower levels of computer anxiety were commonly observed among individuals with younger age, male gender, advanced education, more positive ratings of health, and higher levels of acculturation. In addition, positive attitudes toward aging were found to reduce computer anxiety. Findings provide implications for developing computer training and education programs for the target population. © The Author(s) 2015.
Archer, Charles J.; Faraj, Ahmad A.; Inglett, Todd A.; Ratterman, Joseph D.
2012-10-23
Methods, apparatus, and products are disclosed for providing nearest neighbor point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: identifying each link in the global combining network for each compute node of the operational group; designating one of a plurality of point-to-point class routing identifiers for each link such that no compute node in the operational group is connected to two adjacent compute nodes in the operational group with links designated for the same class routing identifiers; and configuring each compute node of the operational group for point-to-point communications with each adjacent compute node in the global combining network through the link between that compute node and that adjacent compute node using that link's designated class routing identifier.
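The identifier-assignment constraint in the preceding abstract, that no compute node has two of its links designated with the same class routing identifier, amounts to a proper edge colouring; a greedy Python sketch follows, with a made-up topology and no claim to match the patented procedure.

    # Give each link a class routing identifier such that no node has two of
    # its links carrying the same identifier (greedy edge colouring).
    links = [("n0", "n1"), ("n0", "n2"), ("n1", "n3"), ("n1", "n4"), ("n2", "n5")]

    def assign_identifiers(links):
        used_at = {}        # node -> identifiers already used on its links
        assignment = {}
        for u, v in links:
            taken = used_at.get(u, set()) | used_at.get(v, set())
            ident = 0
            while ident in taken:      # smallest identifier free at both endpoints
                ident += 1
            assignment[(u, v)] = ident
            used_at.setdefault(u, set()).add(ident)
            used_at.setdefault(v, set()).add(ident)
        return assignment

    for link, ident in assign_identifiers(links).items():
        print(link, "-> class routing identifier", ident)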
NASA Astrophysics Data System (ADS)
Nelson, Mathew
In today’s age of exponential change and technological advancement, awareness of any gender gap in technology and computer science-related fields is crucial, but further research must be done to better understand the complex interacting factors contributing to the gender gap. This study utilized a survey to investigate specific gender differences relating to computing self-efficacy, computer usage, and the environmental factors of exposure, personal interests, and parental influence among high school students within a one-to-one computing environment in South Dakota. The population who completed the One-to-One High School Computing Survey for this study consisted of South Dakota high school seniors who had been involved in a one-to-one computing environment for two or more years. The data from the survey were analyzed using descriptive and inferential statistics for the determined variables. From the review of literature and the data analysis, several conclusions were drawn. Among them are that, overall, there was very little difference in perceived computing self-efficacy and computing anxiety between male and female students within the one-to-one computing initiative. The study supported the current research that males and females utilized computers similarly, but males spent more time using their computers to play online games. Early exposure to computers, or the age at which the student was first exposed to a computer, and the number of computers present in the home (computer ownership) impacted computing self-efficacy. The results also indicated that parental encouragement to work with computers contributed positively to both male and female students' computing self-efficacy. Finally, the study also found that both mothers and fathers encouraged their male children more than their female children to work with computing and to pursue careers in computer science fields.
Pedretti, Kevin
2008-11-18
A compute processor allocator architecture for allocating compute processors to run applications in a multiple processor computing apparatus is distributed among a subset of processors within the computing apparatus. Each processor of the subset includes a compute processor allocator. The compute processor allocators can share a common database of information pertinent to compute processor allocation. A communication path permits retrieval of information from the database independently of the compute processor allocators.
Locating hardware faults in a data communications network of a parallel computer
Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.
2010-01-12
Hardware fault location in a data communications network of a parallel computer. Such a parallel computer includes a plurality of compute nodes and a data communications network that couples the compute nodes for data communications and organizes the compute nodes as a tree. Locating hardware faults includes identifying a next compute node as a parent node and a root of a parent test tree, identifying for each child compute node of the parent node a child test tree having the child compute node as root, running a same test suite on the parent test tree and each child test tree, and identifying the parent compute node as having a defective link connected from the parent compute node to a child compute node if the test suite fails on the parent test tree and succeeds on all the child test trees.
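The parent/child test-tree logic in the preceding abstract can be sketched in a few lines of Python: if the suite fails on the tree rooted at the parent but succeeds on every child's subtree, the defect lies on a link between the parent and one of its children. The tree, the assumed defective link, and run_suite() are illustrative stand-ins.

    tree = {"p": ["c1", "c2"], "c1": ["g1"], "c2": [], "g1": []}
    defective_link = ("p", "c2")          # assumed ground truth for the demo

    def nodes_in_subtree(root):
        out, stack = [root], [root]
        while stack:
            for child in tree[stack.pop()]:
                out.append(child)
                stack.append(child)
        return out

    def run_suite(root):
        """The test 'fails' if the defective link lies inside the subtree."""
        members = set(nodes_in_subtree(root))
        return not (defective_link[0] in members and defective_link[1] in members)

    def locate(parent):
        children = tree[parent]
        if not run_suite(parent) and all(run_suite(c) for c in children):
            return f"defective link between {parent} and one of its children"
        for c in children:
            if not run_suite(c):
                return locate(c)
        return "no fault found"

    print(locate("p"))   # defective link between p and one of its children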
Broadcasting collective operation contributions throughout a parallel computer
Faraj, Ahmad [Rochester, MN
2012-02-21
Methods, systems, and products are disclosed for broadcasting collective operation contributions throughout a parallel computer. The parallel computer includes a plurality of compute nodes connected together through a data communications network. Each compute node has a plurality of processors for use in collective parallel operations on the parallel computer. Broadcasting collective operation contributions throughout a parallel computer according to embodiments of the present invention includes: transmitting, by each processor on each compute node, that processor's collective operation contribution to the other processors on that compute node using intra-node communications; and transmitting on a designated network link, by each processor on each compute node according to a serial processor transmission sequence, that processor's collective operation contribution to the other processors on the other compute nodes using inter-node communications.
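A toy Python sketch of the two-stage broadcast described above: contributions are first exchanged among the processors of each compute node, then forwarded to the other nodes in a serial processor transmission sequence. The data layout, key names, and values are assumptions of this note, not the patented mechanism.

    contributions = {                      # node -> {processor -> contribution}
        "node0": {"p0": 1, "p1": 2},
        "node1": {"p0": 3, "p1": 4},
    }

    # stage 1: intra-node exchange; every processor on a node now holds that
    # node's complete set of contributions
    local_view = {node: {f"{node}.{p}": v for p, v in procs.items()}
                  for node, procs in contributions.items()}

    # stage 2: inter-node exchange; following a serial processor transmission
    # sequence, each processor's contribution crosses the designated link to
    # the other nodes, which merge it into their view
    view = {node: dict(local) for node, local in local_view.items()}
    for src_node, procs in contributions.items():
        for proc in sorted(procs):          # the serial transmission sequence
            for dst_node in view:
                if dst_node != src_node:
                    view[dst_node][f"{src_node}.{proc}"] = procs[proc]

    print(view["node0"])   # node0 now also holds node1's contributions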
Reflections from the Computer Equity Training Project.
ERIC Educational Resources Information Center
Sanders, Jo Shuchat
This paper addresses girls' patterns of computer avoidance at the middle school and other grade levels. It reviews the evidence for a gender gap in computer use in several areas: in school, at home, in computer camps, in computer magazines, and in computer-related jobs. It compares the computer equity issue to math avoidance, and cites the middle…
Models of Computer Use in School Settings. Technical Report Series, Report No. 84.2.2.
ERIC Educational Resources Information Center
Sherwood, Robert D.
Designed to focus on student learning and to illustrate techniques that might be used with computers to facilitate that process, this paper discusses five types of computer use in educational settings: (1) learning ABOUT computers; (2) learning WITH computers; (3) learning FROM computers; (4) learning ABOUT THINKING with computers; and (5)…
Implementing Computer Technology in the Rehabilitation Process.
ERIC Educational Resources Information Center
McCollum, Paul S., Ed.; Chan, Fong, Ed.
1985-01-01
This special issue contains seven articles, addressing rehabilitation in the information age, computer-assisted rehabilitation services, computer technology in rehabilitation counseling, computer-assisted career exploration and vocational decision making, computer-assisted assessment, computer enhanced employment opportunities for persons with…
Cork, Randy D.; Detmer, William M.; Friedman, Charles P.
1998-01-01
This paper describes details of four scales of a questionnaire, “Computers in Medical Care,” measuring attributes of computer use, self-reported computer knowledge, computer feature demand, and computer optimism of academic physicians. The reliability (i.e., precision, or degree to which the scale's result is reproducible) and validity (i.e., accuracy, or degree to which the scale actually measures what it is supposed to measure) of each scale were examined by analysis of the responses of 771 full-time academic physicians across four departments at five academic medical centers in the United States. The objectives of this paper were to define the psychometric properties of the scales as the basis for a future demonstration study and, pending the results of further validity studies, to provide the questionnaire and scales to the medical informatics community as a tool for measuring the attitudes of health care providers. Methodology: The dimensionality of each scale and degree of association of each item with the attribute of interest were determined by principal components factor analysis with orthogonal varimax rotation. Weakly associated items (factor loading <.40) were deleted. The reliability of each resultant scale was computed using Cronbach's alpha coefficient. Content validity was addressed during scale construction; construct validity was examined through factor analysis and by correlational analyses. Results: Attributes of computer use, computer knowledge, and computer optimism were unidimensional, with the corresponding scales having reliabilities of .79, .91, and .86, respectively. The computer-feature demand attribute differentiated into two dimensions: the first reflecting demand for high-level functionality with reliability of .81 and the second demand for usability with reliability of .69. There were significant positive correlations between computer use, computer knowledge, and computer optimism scale scores and respondents' hands-on computer use, computer training, and self-reported computer sophistication. In addition, items posited on the computer knowledge scale to be more difficult generated significantly lower scores. Conclusion: The four scales of the questionnaire appear to measure with adequate reliability five attributes of academic physicians' attitudes toward computers in medical care: computer use, self-reported computer knowledge, demand for computer functionality, demand for computer usability, and computer optimism. Results of initial validity studies are positive, but further validation of the scales is needed. The URL of a downloadable HTML copy of the questionnaire is provided. PMID:9524349
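Since the scale reliabilities above are reported as Cronbach's alpha, a minimal computation of that coefficient from a respondents-by-items matrix is sketched below in Python; the five-respondent, four-item data are invented for illustration and are unrelated to the cited study.

    import numpy as np

    def cronbach_alpha(scores):
        """scores: respondents x items array of item scores."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                      # number of items in the scale
        item_vars = scores.var(axis=0, ddof=1)   # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)
        return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

    responses = [[4, 5, 4, 5],
                 [3, 3, 4, 3],
                 [5, 5, 5, 4],
                 [2, 2, 3, 2],
                 [4, 4, 4, 5]]
    print(round(cronbach_alpha(responses), 2))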
Computer-aided design and computer science technology
NASA Technical Reports Server (NTRS)
Fulton, R. E.; Voigt, S. J.
1976-01-01
A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.
NASA Technical Reports Server (NTRS)
Kutler, Paul; Yee, Helen
1987-01-01
Topics addressed include: numerical aerodynamic simulation; computational mechanics; supercomputers; aerospace propulsion systems; computational modeling in ballistics; turbulence modeling; computational chemistry; computational fluid dynamics; and computational astrophysics.
Publishing Trends in Educational Computing.
ERIC Educational Resources Information Center
O'Hair, Marilyn; Johnson, D. LaMont
1989-01-01
Describes results of a survey of secondary school and college teachers that was conducted to determine subject matter that should be included in educational computing journals. Areas of interest included computer applications; artificial intelligence; computer-aided instruction; computer literacy; computer-managed instruction; databases; distance…
Adaptive voting computer system
NASA Technical Reports Server (NTRS)
Koczela, L. J.; Wilgus, D. S. (Inventor)
1974-01-01
A computer system is reported that uses adaptive voting to tolerate failures and operates in a fail-operational, fail-safe manner. Each of four computers is individually connected to one of four external input/output (I/O) busses which interface with external subsystems. Each computer is connected to receive input data and commands from the other three computers and to furnish output data commands to the other three computers. An adaptive control apparatus including a voter-comparator-switch (VCS) is provided for each computer to receive signals from each of the computers and to permit adaptive voting among them, enabling fail-operational, fail-safe operation.
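(The abstract does not disclose the voting logic itself; as a loose illustration only, here is a minimal sketch of majority voting over redundant computer outputs, with disagreeing computers dropped from the active set. The cycle data are hypothetical.)

from collections import Counter

def vote(outputs, active):
    """Majority-vote the outputs of the currently active (trusted) computers.

    outputs: dict mapping computer id -> output value for this cycle
    active:  set of computer ids still considered healthy
    Returns (voted_value, updated_active_set); computers that disagree with
    the majority are dropped from the active set (adaptive voting).
    """
    votes = Counter(outputs[c] for c in active)
    value, count = votes.most_common(1)[0]
    if count <= len(active) // 2:
        raise RuntimeError("no majority among active computers")
    still_active = {c for c in active if outputs[c] == value}
    return value, still_active

# Hypothetical cycle: computer 3 produces a faulty value and is voted out.
value, active = vote({0: 42, 1: 42, 2: 42, 3: 99}, active={0, 1, 2, 3})
print(value, active)   # 42 {0, 1, 2}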
NASA Astrophysics Data System (ADS)
Cook, Perry
This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.
Paging memory from random access memory to backing storage in a parallel computer
Archer, Charles J; Blocksome, Michael A; Inglett, Todd A; Ratterman, Joseph D; Smith, Brian E
2013-05-21
Paging memory from random access memory (`RAM`) to backing storage in a parallel computer that includes a plurality of compute nodes, including: executing a data processing application on a virtual machine operating system in a virtual machine on a first compute node; providing, by a second compute node, backing storage for the contents of RAM on the first compute node; and swapping, by the virtual machine operating system in the virtual machine on the first compute node, a page of memory from RAM on the first compute node to the backing storage on the second compute node.
Virtualization and cloud computing in dentistry.
Chow, Frank; Muftu, Ali; Shorter, Richard
2014-01-01
The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer with a virtual machine (i.e., virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer is needed and running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.
Climate@Home: Crowdsourcing Climate Change Research
NASA Astrophysics Data System (ADS)
Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.
2011-12-01
Climate change deeply impacts human wellbeing. Significant amounts of resources have been invested in building super-computers that are capable of running advanced climate models, which help scientists understand climate change mechanisms, and predict its trend. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication media for effectively enlightening the public on climate change and its consequences. The Climate@Home project is devoted to connecting the two ends with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from the participants. A distributed web-based computing platform will be built to support climate computing, and the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a computer model, which is used by scientists to predict climate change. Traditionally, only super-computers could handle such a large computing processing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, investments in new super-computers, energy consumed by super-computers, and carbon release from super-computers are reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among the participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and the corresponding functionalities are designed and supported. The end users include the citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Computer climate models are defined at the server side. Climate scientists configure computer model parameters through the portal user interface. After model configuration, scientists then launch the computing task. Next, data is atomized and distributed to computing engines that are running on citizen participants' computers. Scientists will receive notifications on the completion of computing tasks, and examine modeling results via visualization modules of the portal. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built for proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects. A Facebook account has been set up to distribute messages via the most popular social networking platform. New threads are synchronized from the forums to Facebook. A mapping tool displays geographic locations of the participants and the status of tasks on each client node. A group of users have been invited to test functions such as forums, blogs, and computing resource monitoring.
Molecular Sticker Model Stimulation on Silicon for a Maximum Clique Problem
Ning, Jianguo; Li, Yanmei; Yu, Wen
2015-01-01
Molecular computers (also called DNA computers), as an alternative to traditional electronic computers, are smaller in size but more energy efficient, and have massive parallel processing capacity. However, DNA computers may not outperform electronic computers owing to their higher error rates and some limitations of the biological laboratory. The stickers model, as a typical DNA-based computer, is computationally complete and universal, and can be viewed as a bit-vertically operating machine. This makes it attractive for silicon implementation. Inspired by the information processing method on the stickers computer, we propose a novel parallel computing model called DEM (DNA Electronic Computing Model) on System-on-a-Programmable-Chip (SOPC) architecture. Except for the significant difference in the computing medium—transistor chips rather than bio-molecules—the DEM works similarly to DNA computers in immense parallel information processing. Additionally, a plasma display panel (PDP) is used to show the change of solutions, and helps us directly see the distribution of assignments. The feasibility of the DEM is tested by applying it to compute a maximum clique problem (MCP) with eight vertices. Owing to the limited computing sources on SOPC architecture, the DEM could solve moderate-size problems in polynomial time. PMID:26075867
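(As a point of reference for the problem being solved, not the DEM or sticker-model implementation itself, a minimal brute-force maximum clique search over a small, hypothetical eight-vertex graph might look like this.)

from itertools import combinations

def max_clique(n, edges):
    """Brute-force maximum clique for a small graph (n <= ~20 vertices),
    using bit masks for the adjacency test."""
    adj = [0] * n
    for u, v in edges:
        adj[u] |= 1 << v
        adj[v] |= 1 << u
    for size in range(n, 0, -1):                      # largest subsets first
        for subset in combinations(range(n), size):
            if all(adj[u] >> v & 1 for u, v in combinations(subset, 2)):
                return list(subset)                   # first hit is a maximum clique
    return []

# Hypothetical 8-vertex instance, for illustration only.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5), (3, 5), (5, 6), (6, 7)]
print(max_clique(8, edges))   # e.g. [0, 1, 2]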
Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing
NASA Astrophysics Data System (ADS)
Shi, X.
2017-10-01
Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same holds for utilizing a cluster of Intel's many-integrated-core (MIC) processors, or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, a Field Programmable Gate Array (FPGA) may offer better energy efficiency when its computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.
Evaluation of computer usage in healthcare among private practitioners of NCT Delhi.
Ganeshkumar, P; Arun Kumar, Sharma; Rajoura, O P
2011-01-01
1. To evaluate the usage and the knowledge of computers and Information and Communication Technology in health care delivery by private practitioners. 2. To understand the determinants of computer usage by them. A cross-sectional study was conducted among private practitioners practising in three districts of NCT of Delhi between November 2007 and December 2008, using a stratified random sampling method; knowledge and usage of computers in health care and the determinants of computer usage were evaluated with a pre-coded, semi-open-ended questionnaire. About 77% of the practitioners reported having a computer and had access to the internet. Computer availability and internet accessibility were highest among super-speciality practitioners. Practitioners who had attended a computer course were 13.8 times [OR: 13.8 (7.3 - 25.8)] more likely to have installed an EHR in the clinic. Technical issues were the major perceived barrier to installing a computer in the clinic. Practice speciality, previous attendance of a computer course, and the age at which they started using a computer influenced knowledge about computers. Speciality of the practice, presence of a computer professional, and gender were the determinants of computer usage.
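(The 13.8 [7.3 - 25.8] figure quoted above is an odds ratio with its 95% confidence interval; a minimal sketch of how such a value is derived from a 2x2 table follows. The cell counts below are hypothetical, not the study's data.)

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI for a 2x2 table:
       a = exposed & outcome,   b = exposed & no outcome
       c = unexposed & outcome, d = unexposed & no outcome
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)        # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, for illustration only.
print(odds_ratio_ci(a=40, b=20, c=15, d=100))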
Research on Key Technologies of Cloud Computing
NASA Astrophysics Data System (ADS)
Zhang, Shufen; Yan, Hongcan; Chen, Xuebin
With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks across a resource pool consisting of a large number of computers, so that application systems can obtain computing power, storage space, and software services according to their demand. It can concentrate all the computing resources and manage them automatically through software without human intervention. This frees application providers from tedious details so they can concentrate on their own business, which favors innovation and reduces cost. The ultimate goal of cloud computing is to provide computation, services, and applications as a public utility, so that people can use computer resources just as they use water, electricity, gas, and telephone service. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS, and IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization, and programming models.
Performing an allreduce operation on a plurality of compute nodes of a parallel computer
Faraj, Ahmad [Rochester, MN
2012-04-17
Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer. Each compute node includes at least two processing cores. Each processing core has contribution data for the allreduce operation. Performing an allreduce operation on a plurality of compute nodes of a parallel computer includes: establishing one or more logical rings among the compute nodes, each logical ring including at least one processing core from each compute node; performing, for each logical ring, a global allreduce operation using the contribution data for the processing cores included in that logical ring, yielding a global allreduce result for each processing core included in that logical ring; and performing, for each compute node, a local allreduce operation using the global allreduce results for each processing core on that compute node.
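(As a rough illustration of allreduce over a logical ring, here is a small simulation of the standard reduce-scatter/allgather ring algorithm; it is not the patented two-phase global/local scheme described above, and the node data are hypothetical.)

import numpy as np

def ring_allreduce(node_data):
    """Simulate the classic ring allreduce (reduce-scatter then allgather).

    node_data: one equal-length array per node. Returns per-node buffers;
    after the allreduce every node holds the element-wise sum.
    """
    n = len(node_data)
    chunks = [np.array_split(np.asarray(d, dtype=float).copy(), n)
              for d in node_data]

    # Phase 1: reduce-scatter. At step s, node r forwards chunk (r - s) mod n
    # to its ring neighbour, which adds it to its own copy of that chunk.
    for s in range(n - 1):
        for r in range(n):
            c = (r - s) % n
            chunks[(r + 1) % n][c] = chunks[(r + 1) % n][c] + chunks[r][c]

    # Phase 2: allgather. At step s, node r forwards its fully reduced
    # chunk (r + 1 - s) mod n, which the neighbour simply overwrites.
    for s in range(n - 1):
        for r in range(n):
            c = (r + 1 - s) % n
            chunks[(r + 1) % n][c] = chunks[r][c].copy()

    return [np.concatenate(ch) for ch in chunks]

# Four simulated nodes contributing 8-element vectors.
data = [np.full(8, rank + 1) for rank in range(4)]
for rank, buf in enumerate(ring_allreduce(data)):
    print(rank, buf)   # every node ends with the vector of 10s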
Broadcasting a message in a parallel computer
Berg, Jeremy E [Rochester, MN; Faraj, Ahmad A [Rochester, MN
2011-08-02
Methods, systems, and products are disclosed for broadcasting a message in a parallel computer. The parallel computer includes a plurality of compute nodes connected together using a data communications network. The data communications network is optimized for point-to-point data communications and is characterized by at least two dimensions. The compute nodes are organized into at least one operational group of compute nodes for collective parallel operations of the parallel computer. One compute node of the operational group is assigned to be a logical root. Broadcasting a message in a parallel computer includes: establishing a Hamiltonian path along all of the compute nodes in at least one plane of the data communications network and in the operational group; and broadcasting, by the logical root to the remaining compute nodes, the logical root's message along the established Hamiltonian path.
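(A toy simulation of the forwarding step, assuming the Hamiltonian path has already been computed for a small mesh; the path and payload below are hypothetical.)

def broadcast_along_path(path, message):
    """Simulate a pipeline broadcast: the logical root (path[0]) sends the
    message to the next node on a precomputed Hamiltonian path, and each node
    forwards it onward until every node on the path has received it."""
    received = {path[0]: message}               # the root already holds the message
    for sender, receiver in zip(path, path[1:]):
        received[receiver] = received[sender]   # one point-to-point transfer per hop
    return received

# Hypothetical Hamiltonian path over a 2x3 mesh of compute nodes.
path = [(0, 0), (0, 1), (0, 2), (1, 2), (1, 1), (1, 0)]
print(broadcast_along_path(path, "payload"))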
NASA Astrophysics Data System (ADS)
Rodriguez, Sarah L.; Lehman, Kathleen
2017-10-01
This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.
AV Programs for Computer Know-How.
ERIC Educational Resources Information Center
Mandell, Phyllis Levy
1985-01-01
Lists 44 audiovisual programs (most released between 1983 and 1984) grouped in seven categories: computers in society, introduction to computers, computer operations, languages and programing, computer graphics, robotics, computer careers. Excerpts from "School Library Journal" reviews, price, and intended grade level are included. Names…
29 CFR 541.401 - Computer manufacture and repair.
Code of Federal Regulations, 2011 CFR
2011-07-01
... DEFINING AND DELIMITING THE EXEMPTIONS FOR EXECUTIVE, ADMINISTRATIVE, PROFESSIONAL, COMPUTER AND OUTSIDE..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...
Controlling data transfers from an origin compute node to a target compute node
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN
2011-06-21
Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.
NASA Technical Reports Server (NTRS)
Tompkins, F. G.
1983-01-01
The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.
ANL statement of site strategy for computing workstations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.
1991-11-01
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
Computational Social Creativity.
Saunders, Rob; Bown, Oliver
2015-01-01
This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.
Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography
Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji
2013-01-01
OBJECTIVES To compare the accuracy of pulmonary lobar volumetry using the conventional number of segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thicknesses. We calculated the lobar volume and the emphysematous lobar volume < −950 HU of each lobe using (i) the slice-by-slice method (reference standard), (ii) the number of segments method, and (iii) semi-automatic and (iv) automatic computer-aided diagnosis. We determined Pearson correlation coefficients between the reference standard and the three other methods for lobar volumes and emphysematous lobar volumes. We also compared the relative errors among the three measurement methods. RESULTS Both semi-automatic and automatic computer-aided diagnosis results were more strongly correlated with the reference standard than the number of segments method. The correlation coefficients for automatic computer-aided diagnosis were slightly lower than those for semi-automatic computer-aided diagnosis because there was one outlier among 50 cases (2%) in the right upper lobe and two outliers among 50 cases (4%) in the other lobes. The number of segments method relative error was significantly greater than those for semi-automatic and automatic computer-aided diagnosis (P < 0.001). The computational time for automatic computer-aided diagnosis was 1/2 to 2/3 of that for semi-automatic computer-aided diagnosis. CONCLUSIONS A novel lobar volumetry computer-aided diagnosis system could more precisely measure lobar volumes than the conventional number of segments method. Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use, it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418
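(For reference, the two comparison statistics used above, Pearson correlation against the reference standard and per-case relative error, can be sketched as follows; the volume values are hypothetical, not the study's measurements.)

import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two measurement series."""
    return np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1]

def relative_error(measured, reference):
    """Per-case relative error of a method against the reference standard."""
    measured = np.asarray(measured, float)
    reference = np.asarray(reference, float)
    return np.abs(measured - reference) / reference

# Hypothetical lobar volumes (mL): reference (slice-by-slice) vs. a CAD method.
reference = [1510, 1320, 980, 1105, 1430]
cad       = [1495, 1338, 992, 1090, 1455]
print(f"r = {pearson_r(cad, reference):.3f}")
print("relative errors:", np.round(relative_error(cad, reference), 3))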
NASA Astrophysics Data System (ADS)
Anderson, Delia Marie Castro
Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in the experimental group who responded to the Use of Internet Resources Survey were positive (mean of 3.4 on the 4-point scale) toward their use of Internet resources, which included the online courseware developed by the researcher. Findings from this study suggest that (1) the digital divide with respect to gender and ethnicity may be narrowing, and (2) students who are exposed to a course that augments computer-driven courseware with traditional teaching methods appear to have less anxiety, have a clearer perception of computer usefulness, and feel that online resources enhance their learning.
Pacing a data transfer operation between compute nodes on a parallel computer
Blocksome, Michael A [Rochester, MN
2011-09-13
Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
Archer, Charles J; Blocksome, Michael A; Peters, Amanda E; Ratterman, Joseph D; Smith, Brian E
2012-10-16
Methods, apparatus, and products are disclosed for scheduling applications for execution on a plurality of compute nodes of a parallel computer to manage temperature of the plurality of compute nodes during execution that include: identifying one or more applications for execution on the plurality of compute nodes; creating a plurality of physically discontiguous node partitions in dependence upon temperature characteristics for the compute nodes and a physical topology for the compute nodes, each discontiguous node partition specifying a collection of physically adjacent compute nodes; and assigning, for each application, that application to one or more of the discontiguous node partitions for execution on the compute nodes specified by the assigned discontiguous node partitions.
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN
2012-01-10
Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Cambridge, MA; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN
2012-04-17
Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.
A novel quantum scheme for secure two-party distance computation
NASA Astrophysics Data System (ADS)
Peng, Zhen-wan; Shi, Run-hua; Zhong, Hong; Cui, Jie; Zhang, Shun
2017-12-01
Secure multiparty computational geometry is an essential field of secure multiparty computation, which solves computational geometry problems without revealing any private information of each party. Secure two-party distance computation is a primitive of secure multiparty computational geometry, which computes the distance between two points without revealing either point's location information (i.e., coordinates). Secure two-party distance computation has potential applications with high security requirements in military, business, engineering and so on. In this paper, we present a quantum solution to secure two-party distance computation by subtly using quantum private query. Compared to the related classical protocols, our quantum protocol can ensure higher security and better privacy protection because of the physical principles of quantum mechanics.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Housner, Jerrold M.
1993-01-01
Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.
Reducing power consumption during execution of an application on a plurality of compute nodes
Archer, Charles J.; Blocksome, Michael A.; Peters, Amanda E.; Ratterman, Joseph D.; Smith, Brian E.
2013-09-10
Methods, apparatus, and products are disclosed for reducing power consumption during execution of an application on a plurality of compute nodes that include: powering up, during compute node initialization, only a portion of computer memory of the compute node, including configuring an operating system for the compute node in the powered up portion of computer memory; receiving, by the operating system, an instruction to load an application for execution; allocating, by the operating system, additional portions of computer memory to the application for use during execution; powering up the additional portions of computer memory allocated for use by the application during execution; and loading, by the operating system, the application into the powered up additional portions of computer memory.
Computer-assisted learning in critical care: from ENIAC to HAL.
Tegtmeyer, K; Ibsen, L; Goldstein, B
2001-08-01
Computers are commonly used to serve many functions in today's modern intensive care unit. One of the most intriguing and perhaps most challenging applications of computers has been to attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate their use into the modern curriculum. Prior limitations of cost and complexity of computers have consistently decreased since their introduction, making it increasingly feasible to incorporate computers into medical education. Simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education with special reference to critical care medicine. In addition, we will examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer.
Ye, Nong; Li, Xiangyang; Farley, Toni
2003-01-15
Hand signs are considered one of the important ways to enter information into computers for certain tasks. Computers receive sensor data of hand signs for recognition. When using hand signs as computer inputs, we need to (1) train computer users in the sign language so that their hand signs can be easily recognized by computers, and (2) design the computer interface to avoid the use of confusing signs and so improve user input performance and user satisfaction. For user training and computer interface design, it is important to know which signs can be easily recognized by computers and which signs are not distinguishable by computers. This paper presents a data mining technique to discover distinct patterns of hand signs from sensor data. Based on these patterns, we derive a group of signs that are indistinguishable by computers. Such information can in turn assist in user training and computer interface design.
NASA Astrophysics Data System (ADS)
Cook, Perry R.
This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
ERIC Educational Resources Information Center
Zamora, Ramon M.
Alternative learning environments offering computer-related instruction are developing around the world. Storefront learning centers, museum-based computer facilities, and special theme parks are some of the new concepts. ComputerTown, USA! is a public access computer literacy project begun in 1979 to serve both adults and children in Menlo Park…
A Research Program in Computer Technology. 1982 Annual Technical Report
1983-03-01
for the Defense Advanced Research Projects Agency. The research applies computer science and technology to areas of high DoD/military impact. The ISI...implement the plan; New Computing Environment - investigation and adaptation of developing computer technologies to serve the research and military user communities; and Computer
Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A
2016-12-01
Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitude toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study, which was carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitude, limitations, and suggestions toward computer usage in medical education before and after the computer course to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than the word processing and trouble-shoot software/hardware was significantly higher after the course (P<0.001). There was a significant increase in the proportion of students who agreed on owning a computer (P=0.008), the inclusion of computer skills course in medical education, downloading lecture handouts, and computer-based exams (P<0.001) after the course. After the course, there was a significant increase in the proportion of students who agreed that the lack of central computers limited the inclusion of computer in medical education (P<0.001). Although the lack of computer labs, lack of Information Technology staff mentoring, large number of students, unclear course outline, and lack of internet access were more frequently reported before the course (P<0.001), the majority of students suggested the provision of computer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas in the university campus; all would support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical education in the university.
NASA Technical Reports Server (NTRS)
Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)
1990-01-01
This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic balance of the loads. The system comprises a plurality of computers each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, providing each computer with the ability to establish a communications link with another of the computers, bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communications between respective ones of the computers is by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers, and wherein the location of the second portion is part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of the functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message whereby collisions between messages are detected and avoided.
NASA Astrophysics Data System (ADS)
Huang, Qian
2014-09-01
Scientific computing often requires the availability of a massive number of computers for performing large-scale simulations, and computing in mineral physics is no exception. In order to investigate physical properties of minerals at extreme conditions in computational mineral physics, parallel computing technology is used to speed up the performance by utilizing multiple computer resources to process a computational task simultaneously, thereby greatly reducing computation time. Traditionally, parallel computing has been addressed by using High Performance Computing (HPC) solutions and installed facilities such as clusters and supercomputers. Today, there is tremendous growth in cloud computing. Infrastructure as a Service (IaaS), the on-demand and pay-as-you-go model, creates a flexible and cost-effective means to access computing resources. In this paper, a feasibility report of HPC on a cloud infrastructure is presented. It is found that current cloud services in the IaaS layer still need to improve performance to be useful to research projects. On the other hand, Software as a Service (SaaS), another type of cloud computing, is introduced into an HPC system for computing in mineral physics, and an application of it is developed. In this paper, an overall description of this SaaS application is presented. This contribution can promote cloud application development in computational mineral physics and cross-disciplinary studies.
CSNS computing environment Based on OpenStack
NASA Astrophysics Data System (ADS)
Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu
2017-10-01
Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization; it can also provide computing services according to real need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.
Distributed Accounting on the Grid
NASA Technical Reports Server (NTRS)
Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.
2001-01-01
By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.
The meaning of computers to a group of men who are homeless.
Miller, Kathleen Swenson; Bunch-Harrison, Stacey; Brumbaugh, Brett; Kutty, Rekha Sankaran; FitzGerald, Kathleen
2005-01-01
The purpose of this pilot study was to explore the experience with computers and the meaning of computers to a group of homeless men living in a long-term shelter. This descriptive exploratory study used semistructured interviews with seven men who had been given access to computers and had participated in individually tailored occupation based interventions through a Work Readiness Program. Three themes emerged from analyzing the interviews: access to computers, computers as a bridge to life-skill development, and changed self-perceptions as a result of connecting to technology. Because they lacked computer knowledge and feared failure, the majority of study participants had not sought out computers available through public access. The need for access to computers, the potential use of computers as a medium for intervention, and the meaning of computers to these men who represent the digital divide are described in this study.
Performing process migration with allreduce operations
Archer, Charles Jens; Peters, Amanda; Wallenfelt, Brian Paul
2010-12-14
Compute nodes perform allreduce operations that swap processes at nodes. A first allreduce operation generates a first result and uses a first process from a first compute node, a second process from a second compute node, and zeros from other compute nodes. The first compute node replaces the first process with the first result. A second allreduce operation generates a second result and uses the first result from the first compute node, the second process from the second compute node, and zeros from others. The second compute node replaces the second process with the second result, which is the first process. A third allreduce operation generates a third result and uses the first result from the first compute node, the second result from the second compute node, and zeros from others. The first compute node replaces the first result with the third result, which is the second process.
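(The abstract does not name the reduction operator; if the operator is taken to be bitwise exclusive-or, the three allreduce operations behave like the classic XOR swap, which the following toy simulation illustrates. The process images are stand-in integers, and the swap routine is an illustrative interpretation, not the patented implementation.)

from functools import reduce
from operator import xor

def allreduce_xor(contributions):
    """Simulated allreduce: every node receives the XOR of all contributions."""
    return reduce(xor, contributions)

def swap_processes(p1, p2, n_other_nodes=2):
    """Three-allreduce swap of two process images, following the pattern in the
    abstract; idle nodes contribute zeros."""
    zeros = [0] * n_other_nodes

    r1 = allreduce_xor([p1, p2] + zeros)   # node 1 now holds r1 = p1 ^ p2
    r2 = allreduce_xor([r1, p2] + zeros)   # node 2 now holds r2 = p1
    r3 = allreduce_xor([r1, r2] + zeros)   # node 1 now holds r3 = p2
    return r3, r2                          # (node 1's process, node 2's process)

print(swap_processes(0b1010, 0b0110))      # processes exchanged: (6, 10)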
Cloud Computing with iPlant Atmosphere.
McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos
2013-10-15
Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.
Link failure detection in a parallel computer
Archer, Charles J.; Blocksome, Michael A.; Megerian, Mark G.; Smith, Brian E.
2010-11-09
Methods, apparatus, and products are disclosed for link failure detection in a parallel computer including compute nodes connected in a rectangular mesh network, each pair of adjacent compute nodes in the rectangular mesh network connected together using a pair of links, that includes: assigning each compute node to either a first group or a second group such that adjacent compute nodes in the rectangular mesh network are assigned to different groups; sending, by each of the compute nodes assigned to the first group, a first test message to each adjacent compute node assigned to the second group; determining, by each of the compute nodes assigned to the second group, whether the first test message was received from each adjacent compute node assigned to the first group; and notifying a user, by each of the compute nodes assigned to the second group, whether the first test message was received.
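(A toy sketch of the checkerboard test described above, simplified to a single link per adjacent pair and a small mesh; the broken link is hypothetical.)

def detect_link_failures(width, height, working_links):
    """Checkerboard link test on a rectangular mesh: nodes are split into two
    groups so neighbours are in different groups, group-1 nodes are assumed to
    send a test message on every link, and group-2 nodes report any neighbour
    whose message never arrived.

    working_links: set of frozenset({node_a, node_b}) pairs that still work.
    Returns the list of (receiver, missing_neighbour) failures to notify about.
    """
    def group(node):                 # parity colouring: adjacent nodes differ
        x, y = node
        return (x + y) % 2

    failures = []
    for x in range(width):
        for y in range(height):
            node = (x, y)
            if group(node) != 1:     # only second-group nodes act as receivers
                continue
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nbr = (x + dx, y + dy)
                if not (0 <= nbr[0] < width and 0 <= nbr[1] < height):
                    continue
                if frozenset({node, nbr}) not in working_links:
                    failures.append((node, nbr))
    return failures

# 3x3 mesh with one broken link between (0, 0) and (0, 1).
all_links = {frozenset({(x, y), (x + dx, y + dy)})
             for x in range(3) for y in range(3)
             for dx, dy in ((1, 0), (0, 1))
             if x + dx < 3 and y + dy < 3}
print(detect_link_failures(3, 3, all_links - {frozenset({(0, 0), (0, 1)})}))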
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.
'I'm good, but not that good': digitally-skilled young people's identity in computing
NASA Astrophysics Data System (ADS)
Wong, Billy
2016-12-01
Computers and information technology are fast becoming a part of young people's everyday life. However, there remains a difference between the majority who can use computers and the minority who are computer scientists or professionals. Drawing on 32 semi-structured interviews with digitally skilled young people (aged 13-19), we explore their views and aspirations in computing, with a focus on the identities and discourses that these youngsters articulate in relation to this field. Our findings suggest that, even among digitally skilled young people, traditional identities of computing as people who are clever but antisocial still prevail, which can be unattractive for youths, especially girls. Digitally skilled youths identify with computing in different ways and for different reasons. Most enjoy doing computing but few aspired to being a computer person. Implications of our findings for computing education are discussed especially the continued need to broaden identities in computing, even for the digitally skilled.
Internode data communications in a parallel computer
Archer, Charles J.; Blocksome, Michael A.; Miller, Douglas R.; Parker, Jeffrey J.; Ratterman, Joseph D.; Smith, Brian E.
2013-09-03
Internode data communications in a parallel computer that includes compute nodes that each include main memory and a messaging unit, the messaging unit including computer memory and coupling compute nodes for data communications, in which, for each compute node at compute node boot time: a messaging unit allocates, in the messaging unit's computer memory, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; receives, prior to initialization of a particular process on the compute node, a data communications message intended for the particular process; and stores the data communications message in the message buffer associated with the particular process. Upon initialization of the particular process, the process establishes a messaging buffer in main memory of the compute node and copies the data communications message from the message buffer of the messaging unit into the message buffer of main memory.
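(A toy model of the boot-time buffering idea, with hypothetical process ranks and messages; this is not the messaging unit's actual interface.)

class MessagingUnit:
    """Toy model of the boot-time message buffers described above: one buffer
    is reserved per expected process, and messages that arrive before the
    process is initialized are parked in its buffer until it claims them."""

    def __init__(self, expected_processes):
        self.buffers = {pid: [] for pid in expected_processes}  # pre-allocated

    def receive(self, pid, message):
        self.buffers[pid].append(message)      # store even if pid not yet started

    def claim(self, pid):
        """Called by the process once initialized: hand over the parked
        messages (to be copied into the process's buffer in main memory)."""
        return self.buffers.pop(pid)

mu = MessagingUnit(expected_processes=[0, 1, 2, 3])
mu.receive(2, "early data for rank 2")         # arrives before rank 2 starts
print(mu.claim(2))                             # rank 2 initializes and claims it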
Internode data communications in a parallel computer
Archer, Charles J; Blocksome, Michael A; Miller, Douglas R; Parker, Jeffrey J; Ratterman, Joseph D; Smith, Brian E
2014-02-11
Internode data communications in a parallel computer that includes compute nodes that each include main memory and a messaging unit, the messaging unit including computer memory and coupling compute nodes for data communications, in which, for each compute node at compute node boot time: a messaging unit allocates, in the messaging unit's computer memory, a predefined number of message buffers, each message buffer associated with a process to be initialized on the compute node; receives, prior to initialization of a particular process on the compute node, a data communications message intended for the particular process; and stores the data communications message in the message buffer associated with the particular process. Upon initialization of the particular process, the process establishes a messaging buffer in main memory of the compute node and copies the data communications message from the message buffer of the messaging unit into the message buffer of main memory.
Computers in Undergraduate Science Education. Conference Proceedings.
ERIC Educational Resources Information Center
Blum, Ronald, Ed.
Six areas of computer use in undergraduate education, particularly in the fields of mathematics and physics, are discussed in these proceedings. The areas included are: the computational mode; computer graphics; the simulation mode; analog computing; computer-assisted instruction; and the current politics and management of college level computer…
ERIC Educational Resources Information Center
Angier, Natalie
1983-01-01
The Computer Museum in Marlborough, Massachusetts houses old and not-so-old calculators, famous old computers and parts of computers, photographs and assorted memorabilia, computer-generated murals, and even a computer made of Tinkertoys that plays tick-tack-toe. The development of the museum and selected exhibits is described. (Author/JN)
37 CFR 201.40 - Exemption to prohibition against circumvention.
Code of Federal Regulations, 2012 CFR
2012-07-01
... security of the owner or operator of a computer, computer system, or computer network; and (ii) The... film and media studies students; (ii) Documentary filmmaking; (iii) Noncommercial videos. (2) Computer... lawfully obtained, with computer programs on the telephone handset. (3) Computer programs, in the form of...
Computers in aeronautics and space research at the Lewis Research Center
NASA Technical Reports Server (NTRS)
1991-01-01
This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.
NASA Astrophysics Data System (ADS)
Antoine, Marilyn V.
2011-12-01
The purpose of this research was to extend earlier research on sources of selfefficacy (Lent, Lopez, & Biechke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information---mastery experience, vicarious experience, verbal persuasion, and physiological states---shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations. Vicarious experience had the strongest correlation to performance outcome expectations. (4) Mastery experience and physiological states had statistically significant relationships to computer anxiety, while vicarious experience and verbal persuasion had non-significant relationships. Physiological states had the strongest correlation to computer anxiety. (5) Mastery experience, vicarious experience, and physiological states had statistically significant relationships to intention to use computers, while verbal persuasion had a non-significant relationship. Mastery experience had the strongest correlation to intention to use computers. Gender-related findings indicate that females reported higher average mastery experience, vicarious experience, physiological states, and intention to use computers than males. Females reported lower average general computer self-efficacy, computer anxiety, verbal persuasion, personal outcome expectations, and performance outcome expectations than males. The results of this study can be used to develop strategies for increasing general computer self-efficacy, outcome expectations, and intention to use computers. The results can also be used to develop strategies for reducing computer anxiety.
The use of wireless laptop computers for computer-assisted learning in pharmacokinetics.
Munar, Myrna Y; Singh, Harleen; Belle, Donna; Brackett, Carolyn C; Earle, Sandra B
2006-02-15
To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students' attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy.
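To make the modelling exercise concrete, the following is a minimal sketch of the kind of model the students built, assuming a one-compartment intravenous-bolus model with first-order elimination; the drug, doses, and parameter values are invented for illustration and are not from the course.

# Minimal one-compartment pharmacokinetic sketch (illustrative parameters only):
# C(t) = (Dose / V) * exp(-k * t) after each bolus, superposed over repeated doses.
import numpy as np

def concentration(t_hours, dose_mg, volume_L, k_per_h, tau_h, n_doses):
    # Superpose one first-order decay curve per dose given every tau_h hours.
    c = np.zeros_like(t_hours, dtype=float)
    for i in range(n_doses):
        t_since = t_hours - i * tau_h
        mask = t_since >= 0
        c[mask] += (dose_mg / volume_L) * np.exp(-k_per_h * t_since[mask])
    return c

t = np.linspace(0, 48, 97)   # 48 hours in half-hour steps
low = concentration(t, dose_mg=250, volume_L=30, k_per_h=0.173, tau_h=12, n_doses=4)
high = concentration(t, dose_mg=500, volume_L=30, k_per_h=0.173, tau_h=12, n_doses=4)
print("peak concentration, 250 mg every 12 h: %.1f mg/L" % low.max())
print("peak concentration, 500 mg every 12 h: %.1f mg/L" % high.max())

Changing k_per_h, volume_L, or tau_h and re-plotting the resulting curve is exactly the kind of parameter exploration the workshops describe.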
A Position on a Computer Literacy Course.
ERIC Educational Resources Information Center
Self, Charles C.
A position is put forth on the appropriate content of a computer literacy course and the role of computer literacy in the community college. First, various definitions of computer literacy are examined, including the programming, computer awareness, and comprehensive approaches. Next, five essential components of a computer literacy course are…
Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.
ERIC Educational Resources Information Center
Rosenberg, R.C.; And Others
These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…
Computational Science News | Computational Science | NREL
February 28, 2018: NREL Launches New Website for High-Performance Computing System Users. The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) system.
High-Performance Computing and Visualization | Energy Systems Integration Facility | NREL
High-performance computing (HPC) and visualization at NREL propel technology innovation. NREL is home to Peregrine, the largest high-performance computing system…
Computer Training for Seniors: An Academic-Community Partnership
ERIC Educational Resources Information Center
Sanders, Martha J.; O'Sullivan, Beth; DeBurra, Katherine; Fedner, Alesha
2013-01-01
Computer technology is integral to information retrieval, social communication, and social interaction. However, only 47% of seniors aged 65 and older use computers. The purpose of this study was to determine the impact of a client-centered computer program on computer skills, attitudes toward computer use, and generativity in novice senior…
Computing Education in Korea--Current Issues and Endeavors
ERIC Educational Resources Information Center
Choi, Jeongwon; An, Sangjin; Lee, Youngjun
2015-01-01
Computer education has been provided for a long period of time in Korea. Starting as a vocational program, the content of computer education for students evolved to include content on computer literacy, Information Communication Technology (ICT) literacy, and brand-new computer science. While a new curriculum related to computer science was…
Computers in Electrical Engineering Education at Virginia Polytechnic Institute.
ERIC Educational Resources Information Center
Bennett, A. Wayne
1982-01-01
Discusses use of computers in Electrical Engineering (EE) at Virginia Polytechnic Institute. Topics include: departmental background, level of computing power using large scale systems, mini and microcomputers, use of digital logic trainers and analog/hybrid computers, comments on integrating computers into EE curricula, and computer use in…
CAA: Computer Assisted Athletics.
ERIC Educational Resources Information Center
Hall, John H.
Computers have been used in a variety of applications for athletics since the late 1950's. These have ranged from computer-controlled electric scoreboards to computer-designed pole vaulting poles. Described in this paper are a computer-based athletic injury reporting system and a computer-assisted football scouting system. The injury reporting…
An Implemented Strategy for Campus Connectivity and Cooperative Computing.
ERIC Educational Resources Information Center
Halaris, Antony S.; Sloan, Lynda W.
1989-01-01
ConnectPac, a software package developed at Iona College to allow a computer user to access all services from a single personal computer, is described. ConnectPac uses mainframe computing to support a campus computing network, integrating personal and centralized computing into a menu-driven user environment. (Author/MLW)
ERIC Educational Resources Information Center
King, Kenneth M.
1988-01-01
Discussion of the recent computer virus attacks on computers with vulnerable operating systems focuses on the values of educational computer networks. The need for computer security procedures is emphasized, and the ethical use of computer hardware and software is discussed. (LRW)
ERIC Educational Resources Information Center
Sinn, John W.
This instructional manual contains five learning activity packets for use in a workshop on computer numerical control for computer-aided manufacturing. The lessons cover the following topics: introduction to computer-aided manufacturing, understanding the lathe, using the computer, computer numerically controlled part programming, and executing a…
29 CFR 541.402 - Executive and administrative computer employees.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 3 2014-07-01 2014-07-01 false Executive and administrative computer employees. 541.402..., COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer employees. Computer employees within the scope of this exemption, as well as those employees not within its...
29 CFR 541.402 - Executive and administrative computer employees.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 3 2012-07-01 2012-07-01 false Executive and administrative computer employees. 541.402..., COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer employees. Computer employees within the scope of this exemption, as well as those employees not within its...
29 CFR 541.402 - Executive and administrative computer employees.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 3 2013-07-01 2013-07-01 false Executive and administrative computer employees. 541.402..., COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer employees. Computer employees within the scope of this exemption, as well as those employees not within its...
Computer Connections for Gifted Children and Youth.
ERIC Educational Resources Information Center
Nazzaro, Jean N., Ed.
Written by computer specialists, teachers, parents, and students, the 23 articles emphasize the role computers play in the development of thinking, problem solving, and creativity in gifted and talented students. Articles have the following titles and authors: "Computers and Computer Cultures" (S. Papert); "Classroom Computers--Beyond the 3 R's"…
Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities
ERIC Educational Resources Information Center
Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David
2005-01-01
Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratory facilities into a distributed high-performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…
48 CFR 552.216-72 - Placement of Orders.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...
48 CFR 552.216-72 - Placement of Orders.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...
48 CFR 552.216-72 - Placement of Orders.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...
48 CFR 552.216-72 - Placement of Orders.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...
48 CFR 552.216-72 - Placement of Orders.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Acquisition Service (FAS) will place all orders by EDI using computer-to-computer EDI. If computer-to-computer... EDI. (d) When computer-to-computer EDI procedures will be used to place orders, the Contractor shall... electronic orders are placed, the transaction sets used, security procedures, and guidelines for...
A Call for Computational Thinking in Undergraduate Psychology
ERIC Educational Resources Information Center
Anderson, Nicole D.
2016-01-01
Computational thinking is an approach to problem solving that is typically employed by computer programmers. The advantage of this approach is that solutions can be generated through algorithms that can be implemented as computer code. Although computational thinking has historically been a skill that is exclusively taught within computer science,…
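As a concrete (and invented) example of what such algorithmic problem solving looks like in a psychology setting, trimming and summarizing reaction-time data can be written once as a small reusable function instead of being repeated by hand in a spreadsheet.

# Hypothetical illustration of computational thinking: a reusable algorithm for
# summarizing reaction-time data (the trimming rule is chosen only for the example).
def summarize_reaction_times(times_ms, cutoff_ms=2000):
    # Drop implausible trials, then return count, mean, and range.
    kept = [t for t in times_ms if 0 < t <= cutoff_ms]
    mean = sum(kept) / len(kept) if kept else float("nan")
    return {"n": len(kept), "mean_ms": mean,
            "min_ms": min(kept) if kept else None,
            "max_ms": max(kept) if kept else None}

print(summarize_reaction_times([512, 430, 2950, 610, 488]))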
29 CFR 541.402 - Executive and administrative computer employees.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 29 Labor 3 2011-07-01 2011-07-01 false Executive and administrative computer employees. 541.402..., COMPUTER AND OUTSIDE SALES EMPLOYEES Computer Employees § 541.402 Executive and administrative computer employees. Computer employees within the scope of this exemption, as well as those employees not within its...
An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation
NASA Technical Reports Server (NTRS)
Bartos, R. D.
1993-01-01
Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which the program predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article presents a description of the modified computer program, the theory upon which the computer program computations are based, computer simulation results, and a discussion of the computer simulation results.
Application Portable Parallel Library
NASA Technical Reports Server (NTRS)
Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott
1995-01-01
Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" also include heterogeneous collection of networked computers). Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.
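APPL's own calls are not listed in this summary, so as a stand-in the sketch below shows the same portable send/receive pattern using the standard MPI interface through the mpi4py package; it illustrates the idea of a consistent message-passing interface across machines rather than APPL's actual API.

# Not APPL itself: the same portable message-passing pattern via standard MPI (mpi4py).
# Run with: mpiexec -n 2 python demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if comm.Get_size() < 2:
    print("run with at least two processes, e.g.: mpiexec -n 2 python demo.py")
elif rank == 0:
    comm.send({"step": 1, "payload": [1.0, 2.0, 3.0]}, dest=1, tag=11)   # node 0 sends
elif rank == 1:
    msg = comm.recv(source=0, tag=11)                                    # node 1 receives
    print("node 1 received:", msg)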
Quantum simulations with noisy quantum computers
NASA Astrophysics Data System (ADS)
Gambetta, Jay
Quantum computing is a new computational paradigm that is expected to lie beyond the standard model of computation. This implies a quantum computer can solve problems that can't be solved by a conventional computer with tractable overhead. To fully harness this power we need a universal fault-tolerant quantum computer. However the overhead in building such a machine is high and a full solution appears to be many years away. Nevertheless, we believe that we can build machines in the near term that cannot be emulated by a conventional computer. It is then interesting to ask what these can be used for. In this talk we will present our advances in simulating complex quantum systems with noisy quantum computers. We will show experimental implementations of this on some small quantum computers.
Ranasinghe, Priyanga; Wickramasinghe, Sashimali A; Pieris, Wa Rasanga; Karunathilake, Indika; Constantine, Godwin R
2012-09-14
The use of computer assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem solving skills and increases student satisfaction. The study evaluates computer literacy among first year medical students in Sri Lanka. The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka, between August and September 2008. First year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge on 6 domains: common software packages, operating systems, database management and the usage of internet and E-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable and other independent covariates. The sample size was 181 (response rate 95.3%); 49.7% were males. The majority of students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge by a formal training programme (64.1%), self learning (63.0%) or peer learning (49.2%). The students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of students (75.7%) expressed their willingness for a formal computer training programme at the faculty. The mean score for the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). Of the students, 47.9% scored less than 50% on the computer literacy questionnaire. Students from Colombo district, students from Western Province, and students owning a computer had a significantly higher mean score in comparison to other students (p < 0.001). In the linear regression analysis, formal computer training was the strongest predictor of computer literacy (β = 13.034), followed by using an internet facility, being from Western Province, using computers for web browsing and computer programming, computer ownership, and having taken IT (Information Technology) as a subject in the GCE (A/L) examination. Sri Lankan medical undergraduates had a low-to-intermediate level of computer literacy. There is a need to improve computer literacy by increasing computer training in schools or by introducing computer training in the initial stages of the undergraduate programme. These two options require improvements in infrastructure and other resources.
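The abstract does not give the design matrix, so the following sketch only illustrates the form of such a regression with invented data: a literacy score modelled on indicators for formal training and computer ownership, fitted by ordinary least squares.

# Illustrative only: OLS on invented data, mirroring the form
# "literacy score ~ formal training + computer ownership" described above.
import numpy as np

rng = np.random.default_rng(0)
n = 181
training = rng.integers(0, 2, n)     # 1 = had formal computer training
owns_pc = rng.integers(0, 2, n)      # 1 = owns a computer
score = 35 + 13 * training + 6 * owns_pc + rng.normal(0, 15, n)

X = np.column_stack([np.ones(n), training, owns_pc])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print("intercept, beta_training, beta_ownership:", np.round(beta, 2))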
Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
2003-01-01
The accelerating pace of the computing technology development shows no signs of abating. Computing power reaching 100 Tflop/s is likely to be reached by 2004 and Pflop/s (10(exp 15) Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing-paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms grid, ubiquitous and autonomic computing. The third part lists future computing alternatives and the characteristics of future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.
Flow Ambiguity: A Path Towards Classically Driven Blind Quantum Computation
NASA Astrophysics Data System (ADS)
Mantri, Atul; Demarie, Tommaso F.; Menicucci, Nicolas C.; Fitzsimons, Joseph F.
2017-07-01
Blind quantum computation protocols allow a user to delegate a computation to a remote quantum computer in such a way that the privacy of their computation is preserved, even from the device implementing the computation. To date, such protocols are only known for settings involving at least two quantum devices: either a user with some quantum capabilities and a remote quantum server or two or more entangled but noncommunicating servers. In this work, we take the first step towards the construction of a blind quantum computing protocol with a completely classical client and single quantum server. Specifically, we show how a classical client can exploit the ambiguity in the flow of information in measurement-based quantum computing to construct a protocol for hiding critical aspects of a computation delegated to a remote quantum computer. This ambiguity arises due to the fact that, for a fixed graph, there exist multiple choices of the input and output vertex sets that result in deterministic measurement patterns consistent with the same fixed total ordering of vertices. This allows a classical user, computing only measurement angles, to drive a measurement-based computation performed on a remote device while hiding critical aspects of the computation.
Low latency, high bandwidth data communications between compute nodes in a parallel computer
Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.
2010-11-02
Methods, parallel computers, and computer program products are disclosed for low latency, high bandwidth data communications between compute nodes in a parallel computer. Embodiments include receiving, by an origin direct memory access (`DMA`) engine of an origin compute node, data for transfer to a target compute node; sending, by the origin DMA engine of the origin compute node to a target DMA engine on the target compute node, a request to send (`RTS`) message; transferring, by the origin DMA engine, a predetermined portion of the data to the target compute node using a memory FIFO operation; determining, by the origin DMA engine, whether an acknowledgement of the RTS message has been received from the target DMA engine; if an acknowledgement of the RTS message has not been received, transferring, by the origin DMA engine, another predetermined portion of the data to the target compute node using a memory FIFO operation; and if the acknowledgement of the RTS message has been received by the origin DMA engine, transferring, by the origin DMA engine, any remaining portion of the data to the target compute node using a direct put operation.
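A toy restatement of the claimed control flow may help; the function names, chunk size, and acknowledgement schedule below are placeholders, not the patented implementation.

# Toy restatement: eager memory-FIFO chunks until the RTS is acknowledged,
# then a single direct put for whatever remains (all names are placeholders).
def send_with_rts(data, chunk=4096, ack_received=lambda: False):
    fifo_bytes, offset = 0, 0
    print("send RTS to target DMA engine")
    while not ack_received() and offset < len(data):
        fifo_bytes += min(chunk, len(data) - offset)   # one eager memory-FIFO chunk
        offset += chunk
    direct_put_bytes = max(len(data) - offset, 0)      # remainder sent with a direct put
    return fifo_bytes, direct_put_bytes

# Acknowledge the RTS after two checks, so part goes by FIFO and the rest by direct put.
acks = iter([False, False, True])
print(send_with_rts(b"x" * 10000, ack_received=lambda: next(acks)))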
2010-04-29
Cloud Computing (Bingue and Cook, STSC 2010). "The answer, my friend, is blowing in the wind." Objectives: define the cloud; risks of cloud computing; essence of cloud computing; deployed clouds in DoD. Definitions of cloud computing: cloud computing is a model for enabling…
Analysis on the security of cloud computing
NASA Astrophysics Data System (ADS)
He, Zhonglin; He, Yuhua
2011-02-01
Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it will lead a revolution in IT and the information field. In cloud computing, however, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy; the resulting security problems are the main difficulty in improving the quality of cloud services. This paper briefly introduces the concept of cloud computing and, considering its characteristics, constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspective of cloud computing users and service providers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced, which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.
Computer applications in remote sensing education
NASA Technical Reports Server (NTRS)
Danielson, R. L.
1980-01-01
Computer applications to instruction in any field may be divided into two broad generic classes: computer-managed instruction and computer-assisted instruction. The division is based on how frequently the computer affects the instructional process and how active a role the computer takes in actually providing instruction. There are no inherent characteristics of remote sensing education to preclude the use of one or both of these techniques, depending on the computer facilities available to the instructor. The characteristics of the two classes are summarized, potential applications to remote sensing education are discussed, and the advantages and disadvantages of computer applications to the instructional process are considered.
Removing the center from computing: biology's new mode of digital knowledge production.
November, Joseph
2011-06-01
This article shows how the USA's National Institutes of Health (NIH) helped to bring about a major shift in the way computers are used to produce knowledge and in the design of computers themselves as a consequence of its early 1960s efforts to introduce information technology to biologists. Starting in 1960 the NIH sought to reform the life sciences by encouraging researchers to make use of digital electronic computers, but despite generous federal support biologists generally did not embrace the new technology. Initially the blame fell on biologists' lack of appropriate (i.e. digital) data for computers to process. However, when the NIH consulted MIT computer architect Wesley Clark about this problem, he argued that the computer's quality as a device that was centralized posed an even greater challenge to potential biologist users than did the computer's need for digital data. Clark convinced the NIH that if the agency hoped to effectively computerize biology, it would need to satisfy biologists' experimental and institutional needs by providing them the means to use a computer without going to a computing center. With NIH support, Clark developed the 1963 Laboratory Instrument Computer (LINC), a small, real-time interactive computer intended to be used inside the laboratory and controlled entirely by its biologist users. Once built, the LINC provided a viable alternative to the 1960s norm of large computers housed in computing centers. As such, the LINC not only became popular among biologists, but also served in later decades as an important precursor of today's computing norm in the sciences and far beyond, the personal computer.
Reconfigurable Computing for Computational Science: A New Focus in High Performance Computing
2006-11-01
in the past decade. Researchers are regularly employing the power of large computing systems and parallel processing to tackle larger and more...complex problems in all of the physical sciences. For the past decade or so, most of this growth in computing power has been “free” with increased...the scientific computing community as a means to continued growth in computing capability. This paper offers a glimpse of the hardware and
ERIC Educational Resources Information Center
Paisley, William; Butler, Matilda
This study of the computer/user interface investigated the role of the computer in performing information tasks that users now perform without computer assistance. Users' perceptual/cognitive processes are to be accelerated or augmented by the computer; a long term goal is to delegate information tasks entirely to the computer. Cybernetic and…
Human Expertise Helps Computer Classify Images
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.
1991-01-01
Two-domain method of computational classification of images requires less computation than other methods for computational recognition, matching, or classification of images or patterns. Does not require explicit computational matching of features, and incorporates human expertise without requiring translation of mental processes of classification into language comprehensible to computer. Conceived to "train" computer to analyze photomicrographs of microscope-slide specimens of leucocytes from human peripheral blood to distinguish between specimens from healthy and specimens from traumatized patients.
29 CFR 541.401 - Computer manufacture and repair.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...
29 CFR 541.401 - Computer manufacture and repair.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...
29 CFR 541.401 - Computer manufacture and repair.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...
29 CFR 541.401 - Computer manufacture and repair.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., the use of computers and computer software programs (e.g., engineers, drafters and others skilled in computer-aided design software), but who are not primarily engaged in computer systems analysis and...
Cost Considerations in Nonlinear Finite-Element Computing
NASA Technical Reports Server (NTRS)
Utku, S.; Melosh, R. J.; Islam, M.; Salama, M.
1985-01-01
Conference paper discusses computational requirements for finite-element analysis using a quasi-linear approach to nonlinear problems. Paper evaluates computational efficiency of different computer architectural types in terms of relative cost and computing time.
Computing Literacy in the University of the Future.
ERIC Educational Resources Information Center
Gantt, Vernon W.
In exploring the impact of microcomputers and the future of the university in 1985 and beyond, a distinction should be made between computing literacy--the ability to use a computer--and computer literacy, which goes beyond successful computer use to include knowing how to program in various computer languages and understanding what goes on…
ERIC Educational Resources Information Center
McAnear, Anita
2006-01-01
When we planned the editorial calendar with the topic ubiquitous computing, we were thinking of ubiquitous computing as the one-to-one ratio of computers to students and teachers and 24/7 access to electronic resources. At the time, we were aware that ubiquitous computing in the computer science field had more to do with wearable computers. Our…
Democratizing Computer Science
ERIC Educational Resources Information Center
Margolis, Jane; Goode, Joanna; Ryoo, Jean J.
2015-01-01
Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…
ERIC Educational Resources Information Center
Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu
2013-01-01
With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…
Suggested Approaches to the Measurement of Computer Anxiety.
ERIC Educational Resources Information Center
Toris, Carol
Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…
Student Perceived Importance and Correlations of Selected Computer Literacy Course Topics
ERIC Educational Resources Information Center
Ciampa, Mark
2013-01-01
Traditional college-level courses designed to teach computer literacy are in a state of flux. Today's students have high rates of access to computing technology and computer ownership, leading many policy decision makers to conclude that students already are computer literate and thus computer literacy courses are dinosaurs in a modern digital…
The Gender Factor in Computer Anxiety and Interest among Some Australian High School Students.
ERIC Educational Resources Information Center
Okebukola, Peter Akinsola
1993-01-01
Western Australia eleventh graders (142 boys, 139 girls) were compared on such variables as computers at home, computer classes, experience with computers, and socioeconomic status. Girls had higher anxiety levels, boys higher computer interest. Possible causes included social beliefs about computer use, teacher sex bias, and software (games) more…
Computer Programming Languages and Expertise Needed by Practicing Engineers.
ERIC Educational Resources Information Center
Doelling, Irvin
1980-01-01
Discussed is the present engineering computer environment of a large aerospace company recognized as a leader in the application and development of computer-aided design and computer-aided manufacturing techniques. A review is given of the exposure spectrum of engineers to the world of computing, the computer languages used, and the career impacts…
CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.
ERIC Educational Resources Information Center
Skrein, Dale
1994-01-01
CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
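CPU SIM's instruction set is not described in this summary; as a flavor of the kind of exercise such a simulator supports, here is a minimal fetch-decode-execute loop for a made-up accumulator machine.

# A made-up three-instruction accumulator machine, illustrating the
# fetch-decode-execute cycle (not CPU SIM's actual instruction set).
def run(program):
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]        # fetch
        pc += 1
        if op == "LOAD":             # decode and execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            print("ACC =", acc)
        else:
            raise ValueError("unknown opcode: " + op)
    return acc

run([("LOAD", 5), ("ADD", 7), ("PRINT", None)])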
Computer-Mediated Communication in a High School: The Users Shape the Medium--Part 1.
ERIC Educational Resources Information Center
Bresler, Liora
1990-01-01
This field study represents a departure from structured, or directed, computer-mediated communication as used in its natural environment, the computer lab. Using observations, interviews, and the computer medium itself, the investigators report how high school students interact with computers and create their own agendas for computer usage and…
Computer-Game Construction: A Gender-Neutral Attractor to Computing Science
ERIC Educational Resources Information Center
Carbonaro, Mike; Szafron, Duane; Cutumisu, Maria; Schaeffer, Jonathan
2010-01-01
Enrollment in Computing Science university programs is at a dangerously low level. A major reason for this is the general lack of interest in Computing Science by females. In this paper, we discuss our experience with using a computer game construction environment as a vehicle to encourage female participation in Computing Science. Experiments…
ERIC Educational Resources Information Center
Conn, Samuel S.; Reichgelt, Han
2013-01-01
Cloud computing represents an architecture and paradigm of computing designed to deliver infrastructure, platforms, and software as constructible computing resources on demand to networked users. As campuses are challenged to better accommodate academic needs for applications and computing environments, cloud computing can provide an accommodating…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-12
... limited to) desktop computers, integrated desktop computers, laptop/notebook/ netbook computers, and... computer, and 65% of U.S. households owning a notebook, laptop, or netbook computer, in 2013.\\4\\ Coverage... recently published studies. In these studies, the average annual energy use for a desktop computer was...
Elementary EFL Teachers' Computer Phobia and Computer Self-Efficacy in Taiwan
ERIC Educational Resources Information Center
Chen, Kate Tzuching
2012-01-01
The advent and application of computer and information technology has increased the overall success of EFL teaching; however, such success is hard to assess, and teachers prone to computer avoidance face negative consequences. Two major obstacles are high computer phobia and low computer self-efficacy. However, little research has been carried out…
Computer use changes generalization of movement learning.
Wei, Kunlin; Yan, Xiang; Kong, Gaiqing; Yin, Cong; Zhang, Fan; Wang, Qining; Kording, Konrad Paul
2014-01-06
Over the past few decades, one of the most salient lifestyle changes for us has been the use of computers. For many of us, manual interaction with a computer occupies a large portion of our working time. Through neural plasticity, this extensive movement training should change our representation of movements (e.g., [1-3]), just like search engines affect memory [4]. However, how computer use affects motor learning is largely understudied. Additionally, as virtually all participants in studies of perception and actions are computer users, a legitimate question is whether insights from these studies bear the signature of computer-use experience. We compared non-computer users with age- and education-matched computer users in standard motor learning experiments. We found that people learned equally fast but that non-computer users generalized significantly less across space, a difference negated by two weeks of intensive computer training. Our findings suggest that computer-use experience shaped our basic sensorimotor behaviors, and this influence should be considered whenever computer users are recruited as study participants. Copyright © 2014 Elsevier Ltd. All rights reserved.
NACA Computer at the Lewis Flight Propulsion Laboratory
1951-02-21
A female computer at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory with a slide rule and Friden adding machine to make computations. The computer staff was introduced during World War II to relieve short-handed research engineers of some of the tedious computational work. The Computing Section was staffed by “computers,” young female employees, who often worked overnight when most of the tests were run. The computers obtained test data from the manometers and other instruments, made the initial computations, and plotted the data graphically. Researchers then analyzed the data and summarized the findings in a report or made modifications and ran the test again. There were over 400 female employees at the laboratory in 1944, including 100 computers. The use of computers was originally planned only for the duration of the war. The system was so successful that it was extended into the 1960s. The computers and analysts were located in the Altitude Wind Tunnel Shop and Office Building office wing during the 1940s and transferred to the new 8- by 6-Foot Supersonic Wind Tunnel in 1948.
Conjunctival impression cytology in computer users.
Kumar, S; Bansal, R; Khare, A; Malik, K P S; Malik, V K; Jain, K; Jain, C
2013-01-01
It is known that the computer users develop the features of dry eye. To study the cytological changes in the conjunctiva using conjunctival impression cytology in computer users and a control group. Fifteen eyes of computer users who had used computers for more than one year and ten eyes of an age-and-sex matched control group (those who had not used computers) were studied by conjunctival impression cytology. Conjunctival impression cytology (CIC) results in the control group were of stage 0 and stage I while the computer user group showed CIC results between stages II to stage IV. Among the computer users, the majority ( > 90 %) showed stage III and stage IV changes. We found that those who used computers daily for long hours developed more CIC changes than those who worked at the computer for a shorter daily duration. © NEPjOPH.
Experiments in computing: a survey.
Tedre, Matti; Moisseinen, Nella
2014-01-01
Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.
Ray, N J; Hannigan, A
1999-05-01
As dental practice management becomes more computer-based, the efficient functioning of the dentist will become dependent on adequate computer literacy. A survey was carried out into the computer literacy of a cohort of 140 undergraduate dental students at a University Dental School in Ireland (years 1-5) in the academic year 1997-98. Aspects investigated by anonymous questionnaire were: (1) keyboard skills; (2) computer skills; (3) access to computer facilities; (4) software competencies and (5) use of medical library computer facilities. The students are relatively unfamiliar with basic computer hardware and software: 51.1% considered their expertise with computers as "poor"; 34.3% had taken a formal typewriting or computer keyboarding course; 7.9% had taken a formal computer course at university level and 67.2% were without access to computer facilities at their term-time residences. A majority of students had never used either word-processing, spreadsheet, or graphics programs. Programs relating to "informatics" were more popular, such as literature searching, accessing the Internet and the use of e-mail, which represent the major use of the computers in the medical library. The lack of experience with computers may be addressed by including suitable computing courses at the secondary level (age 13-18 years) and/or tertiary level (FE/HE) education programmes. Such training may promote greater use of generic software, particularly in the library, with a more electronic approach to data handling.
Human-computer interaction in multitask situations
NASA Technical Reports Server (NTRS)
Rouse, W. B.
1977-01-01
Human-computer interaction in multitask decisionmaking situations is considered, and it is proposed that humans and computers have overlapping responsibilities. Queueing theory is employed to model this dynamic approach to the allocation of responsibility between human and computer. Results of simulation experiments are used to illustrate the effects of several system variables including number of tasks, mean time between arrivals of action-evoking events, human-computer speed mismatch, probability of computer error, probability of human error, and the level of feedback between human and computer. Current experimental efforts are discussed and the practical issues involved in designing human-computer systems for multitask situations are considered.
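The paper's queueing model is only summarized above; as a rough illustration under invented rates, the Monte Carlo sketch below lets a human server take arriving events when free and hands overflow events to the computer.

# Toy Monte Carlo version of shared responsibility (all rates invented): events
# arrive at random; the human serves an event if free, otherwise the computer does.
import random

random.seed(1)
t, human_busy_until = 0.0, 0.0
served_by_human, served_by_computer = 0, 0
for _ in range(10000):
    t += random.expovariate(1.0)                 # mean inter-arrival time = 1.0
    if t >= human_busy_until:
        human_busy_until = t + random.expovariate(1 / 0.8)   # human service, mean 0.8
        served_by_human += 1
    else:
        served_by_computer += 1                  # overflow allocated to the computer
print("human:", served_by_human, "computer:", served_by_computer)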
Performing a global barrier operation in a parallel computer
Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E
2014-12-09
Executing computing tasks on a parallel computer that includes compute nodes coupled for data communications, where each compute node executes tasks, with one task on each compute node designated as a master task, including: for each task on each compute node until all master tasks have joined a global barrier: determining whether the task is a master task; if the task is not a master task, joining a single local barrier; if the task is a master task, joining the global barrier and the single local barrier only after all other tasks on the compute node have joined the single local barrier.
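As a loose software illustration of the claimed barrier structure (node and task counts invented), Python threads can stand in for tasks, with threading.Barrier playing the role of both the per-node local barriers and the global barrier joined only by master tasks.

# Threads stand in for tasks; threading.Barrier stands in for the local barriers
# and the global barrier described above (counts are invented for the sketch).
import threading

TASKS_PER_NODE, NODES = 4, 2
local_barriers = [threading.Barrier(TASKS_PER_NODE) for _ in range(NODES)]
global_barrier = threading.Barrier(NODES)        # one master task per node joins

def task(node, is_master):
    local_barriers[node].wait()                  # every task joins its local barrier
    if is_master:                                # only master tasks join the global barrier
        global_barrier.wait()
        print(f"node {node}: master passed the global barrier")

threads = [threading.Thread(target=task, args=(n, i == 0))
           for n in range(NODES) for i in range(TASKS_PER_NODE)]
for th in threads:
    th.start()
for th in threads:
    th.join()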
NASA Astrophysics Data System (ADS)
Ukawa, Akira
1998-05-01
The CP-PACS computer is a massively parallel computer consisting of 2048 processing units and having a peak speed of 614 GFLOPS and 128 GByte of main memory. It was developed over the four years from 1992 to 1996 at the Center for Computational Physics, University of Tsukuba, for large-scale numerical simulations in computational physics, especially those of lattice QCD. The CP-PACS computer has been in full operation for physics computations since October 1996. In this article we describe the chronology of the development, the hardware and software characteristics of the computer, and its performance for lattice QCD simulations.
Advanced flight computers for planetary exploration
NASA Technical Reports Server (NTRS)
Stephenson, R. Rhoads
1988-01-01
Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between commercial ground computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
Khan, Asaduzzaman; Western, Mark
The purpose of this study was to explore factors that facilitate or hinder effective use of computers in Australian general medical practice. This study is based on data extracted from a national telephone survey of 480 general practitioners (GPs) across Australia. Clinical functions performed by GPs using computers were examined using zero-inflated Poisson (ZIP) regression modelling. About 17% of GPs were not using a computer for any clinical function, while 18% reported using computers for all clinical functions. The ZIP model showed that computer anxiety was negatively associated with effective computer use, while practitioners' belief about the usefulness of computers was positively associated with effective computer use. Being a female GP or working in a partnership or group practice increased the odds of effectively using computers for clinical functions. To fully capitalise on the benefits of computer technology, GPs need to be convinced that this technology is useful and can make a difference.
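For readers unfamiliar with ZIP models, the simulation below (all parameters invented) shows the two processes such a model separates: structural zeros from GPs who never use the computer clinically, and a Poisson count of clinical functions for the rest.

# Invented parameters: simulate the two processes a zero-inflated Poisson model
# separates: structural zeros (never-users) and Poisson counts for the remaining GPs.
import numpy as np

rng = np.random.default_rng(42)
n, p_never_user, mean_functions = 480, 0.17, 4.0
never_user = rng.random(n) < p_never_user
counts = np.where(never_user, 0, rng.poisson(mean_functions, n))
print("share of zeros:", round(float(np.mean(counts == 0)), 2),
      "| mean functions among users:", round(float(counts[~never_user].mean()), 2))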
Administering truncated receive functions in a parallel messaging interface
Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E
2014-12-09
Administering truncated receive functions in a parallel messaging interface (`PMI`) of a parallel computer comprising a plurality of compute nodes coupled for data communications through the PMI and through a data communications network, including: sending, through the PMI on a source compute node, a quantity of data from the source compute node to a destination compute node; specifying, by an application on the destination compute node, a portion of the quantity of data to be received by the application on the destination compute node and a portion of the quantity of data to be discarded; receiving, by the PMI on the destination compute node, all of the quantity of data; providing, by the PMI on the destination compute node to the application on the destination compute node, only the portion of the quantity of data to be received by the application; and discarding, by the PMI on the destination compute node, the portion of the quantity of data to be discarded.
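A toy restatement of the truncated receive, with placeholder names rather than the actual PMI calls: the messaging layer accepts the entire quantity of data, delivers only the portion the application asked for, and discards the rest.

# Toy restatement of a truncated receive (placeholder names, not the PMI API).
def truncated_receive(incoming_bytes, keep_bytes):
    received = bytes(incoming_bytes)              # the PMI receives all of the data
    delivered = received[:keep_bytes]             # portion the application asked to receive
    discarded = len(received) - len(delivered)    # portion to be discarded
    return delivered, discarded

payload, dropped = truncated_receive(b"ABCDEFGHIJ", keep_bytes=4)
print(payload, "| bytes discarded:", dropped)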
Hardware based redundant multi-threading inside a GPU for improved reliability
Sridharan, Vilas; Gurumurthi, Sudhanva
2015-05-05
A system and method for verifying computation output using computer hardware are provided. Instances of computation are generated and processed on hardware-based processors. As instances of computation are processed, each instance of computation receives a load accessible to other instances of computation. Instances of output are generated by processing the instances of computation. The instances of output are verified against each other in a hardware based processor to ensure accuracy of the output.
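The abstract describes a hardware mechanism inside a GPU; purely as a software analogy, the sketch below runs two instances of the same computation on the same load and verifies that their outputs agree.

# Software analogy only (the abstract concerns hardware): run two instances of the
# same computation on the same load and verify the outputs against each other.
from concurrent.futures import ThreadPoolExecutor

def computation(load):
    return sum(x * x for x in load)

load = list(range(1000))
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(computation, load), pool.submit(computation, load)]
    out_a, out_b = (f.result() for f in futures)
print("outputs agree:", out_a == out_b)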
Study on the application of mobile internet cloud computing platform
NASA Astrophysics Data System (ADS)
Gong, Songchun; Fu, Songyin; Chen, Zheng
2012-04-01
The innovative development of computer technology promotes the application of the cloud computing platform, which is in essence a resource-service model that, after adjustment in multiple respects, meets users' needs for different resources. Cloud computing offers advantages in many respects: it reduces the difficulty of operating the system and makes it easy for users to search for, acquire and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization and promotion of computer technology drive the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing distributes computations across a large number of distributed computers and thereby implements a connected service over multiple computers. Digital libraries, as a typical representative of cloud computing applications, can therefore be used to analyze the key technologies of cloud computing.
Computers and Children: Problems and Possibilities.
ERIC Educational Resources Information Center
Siegfried, Pat
1983-01-01
Discusses the use of computers by children, highlighting a definition of computer literacy, computer education in schools, computer software, microcomputers, programming languages, and public library involvement. Seven references and a 40-item bibliography are included. (EJS)
Digital computer technique for setup and checkout of an analog computer
NASA Technical Reports Server (NTRS)
Ambaruch, R.
1968-01-01
Computer program technique, called Analog Computer Check-Out Routine Digitally /ACCORD/, generates complete setup and checkout data for an analog computer. In addition, the correctness of the analog program implementation is validated.
ERIC Educational Resources Information Center
Murphy, Harry J., Ed.
The following 25 papers (with their authors) are presented from a conference on computer technology in special education and rehabilitation: "Computers for Business--Computers for Life" (I. Keith Austin); "Rehabilitation and the Computer: How to Find What You Need" (Thomas Backer); "Computer Access Alternatives for Visually Impaired People in…
Simple proof of equivalence between adiabatic quantum computation and the circuit model.
Mizel, Ari; Lidar, Daniel A; Mitchell, Morgan
2007-08-17
We prove the equivalence between adiabatic quantum computation and quantum computation in the circuit model. An explicit adiabatic computation procedure is given that generates a ground state from which the answer can be extracted. The amount of time needed is evaluated by computing the gap. We show that the procedure is computationally efficient.
ERIC Educational Resources Information Center
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-01-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…
5 CFR 838.805 - OPM computation of formulas in computing the designated base.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false OPM computation of formulas in computing... computing the designated base. (a) A court order awarding a former spouse survivor annuity is not a court...) To provide sufficient instructions and information for OPM to compute the amount of a former spouse...
ERIC Educational Resources Information Center
Bailey, Suzanne Powers; Jeffers, Marcia
Eighteen interrelated, sequential lesson plans and supporting materials for teaching computer literacy at the elementary and secondary levels are presented. The activities, intended to be infused into the regular curriculum, do not require the use of a computer. The introduction presents background information on computer literacy, suggests a…
ERIC Educational Resources Information Center
Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.
2017-01-01
The study investigated attitudes towards computer and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-05
.... Description of the Matching Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub... 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching Program Match No. 1312...). ACTION: Notice of Computer Matching Program (CMP). SUMMARY: In accordance with the requirements of the...
A Brief Analysis of Development Situations and Trend of Cloud Computing
NASA Astrophysics Data System (ADS)
Yang, Wenyan
2017-12-01
In recent years, the rapid development of Internet technology has radically changed how people work, learn and live, and more and more activities are completed with computers and networks. The volume of information and data generated grows daily, and people's increasing reliance on computers means that the computing power of a single machine can no longer meet their demands for accuracy and speed. Cloud computing technology has developed rapidly and is widely applied in the computer industry because of its high precision, fast computation and ease of use, and it has become a focus of current information research. This paper analyzes the development situation and trends of cloud computing.
Archer, Charles J.; Inglett, Todd A.; Ratterman, Joseph D.; Smith, Brian E.
2010-03-02
Methods, apparatus, and products are disclosed for configuring compute nodes of a parallel computer in an operational group into a plurality of independent non-overlapping collective networks, the compute nodes in the operational group connected together for data communications through a global combining network, that include: partitioning the compute nodes in the operational group into a plurality of non-overlapping subgroups; designating one compute node from each of the non-overlapping subgroups as a master node; and assigning, to the compute nodes in each of the non-overlapping subgroups, class routing instructions that organize the compute nodes in that non-overlapping subgroup as a collective network such that the master node is a physical root.
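A minimal sketch (illustrative only, not the patented implementation) of the partition-and-designate step described in this abstract: compute node ranks are split into non-overlapping subgroups, and one member of each subgroup is designated as the master (physical root). The function names and the round-robin split are assumptions for illustration.

```python
# Minimal sketch: partition an operational group of compute node ranks into
# non-overlapping subgroups, designate one master (physical root) per subgroup,
# and record per-node routing info for the resulting collective network.
from typing import Dict, List

def partition_into_collectives(ranks: List[int], num_subgroups: int) -> List[Dict]:
    """Split `ranks` into `num_subgroups` non-overlapping subgroups."""
    subgroups = [ranks[i::num_subgroups] for i in range(num_subgroups)]
    collectives = []
    for members in subgroups:
        master = members[0]  # designate the first member as the physical root
        routing = {rank: {"root": master, "members": members} for rank in members}
        collectives.append({"master": master, "members": members, "routing": routing})
    return collectives

if __name__ == "__main__":
    nodes = list(range(16))            # 16 hypothetical compute node ranks
    for c in partition_into_collectives(nodes, 4):
        print("root", c["master"], "members", c["members"])
```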
Computing technology in the 1980's. [computers
NASA Technical Reports Server (NTRS)
Stone, H. S.
1978-01-01
Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.
On teaching computer ethics within a computer science department.
Quinn, Michael J
2006-04-01
The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.
How to Build a Quantum Computer
NASA Astrophysics Data System (ADS)
Sanders, Barry C.
2017-11-01
Quantum computer technology is progressing rapidly with dozens of qubits and hundreds of quantum logic gates now possible. Although current quantum computer technology is distant from being able to solve computational problems beyond the reach of non-quantum computers, experiments have progressed well beyond simply demonstrating the requisite components. We can now operate small quantum logic processors with connected networks of qubits and quantum logic gates, which is a great stride towards functioning quantum computers. This book aims to be accessible to a broad audience with basic knowledge of computers, electronics and physics. The goal is to convey key notions relevant to building quantum computers and to present state-of-the-art quantum-computer research in various media such as trapped ions, superconducting circuits, photonics and beyond.
Method for transferring data from an unsecured computer to a secured computer
Nilsen, Curt A.
1997-01-01
A method is described for transferring data from an unsecured computer to a secured computer. The method includes transmitting the data and then receiving the data. Next, the data is retransmitted and rereceived. Then, it is determined if errors were introduced when the data was transmitted by the unsecured computer or received by the secured computer. Similarly, it is determined if errors were introduced when the data was retransmitted by the unsecured computer or rereceived by the secured computer. A warning signal is emitted from a warning device coupled to the secured computer if (i) an error was introduced when the data was transmitted or received, and (ii) an error was introduced when the data was retransmitted or rereceived.
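A minimal sketch of the double-transmission check described above, assuming the sender appends an error-detecting digest to each frame; the digest scheme and function names are illustrative assumptions, not part of the patented method.

```python
# Minimal sketch: the data is sent twice over the unsecured->secured link; the
# receiver checks each copy against its embedded digest and raises a warning
# only if BOTH the transmission and the retransmission were corrupted.
import hashlib

def pack(payload: bytes) -> bytes:
    """Sender side: append a SHA-256 digest so the receiver can detect errors."""
    return payload + hashlib.sha256(payload).digest()

def unpack(frame: bytes):
    """Receiver side: return (payload, ok) where ok means the digest matched."""
    payload, digest = frame[:-32], frame[-32:]
    return payload, hashlib.sha256(payload).digest() == digest

def receive_with_double_check(first_frame: bytes, second_frame: bytes) -> bytes:
    payload1, ok1 = unpack(first_frame)
    payload2, ok2 = unpack(second_frame)
    if not ok1 and not ok2:
        raise RuntimeError("warning: errors in both the transmission and the retransmission")
    return payload1 if ok1 else payload2

if __name__ == "__main__":
    frame = pack(b"telemetry record")             # error-free channel for the demo
    print(receive_with_double_check(frame, frame))
```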
Computer literacy among first year medical students in a developing country: A cross sectional study
2012-01-01
Background The use of computer assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem solving skills and increases student satisfaction. The study evaluates computer literacy among first year medical students in Sri Lanka. Methods The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August and September 2008. First year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge on 6 domains: common software packages, operating systems, database management and the usage of internet and E-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable and other independent covariates. Results Sample size was 181 (response rate 95.3%); 49.7% were males. The majority of the students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge through a formal training programme (64.1%), self learning (63.0%) or peer learning (49.2%). The students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of the students (75.7%) expressed their willingness for a formal computer training programme at the faculty. The mean score for the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). 47.9% of students had a score less than 50% on the computer literacy questionnaire. Students from Colombo district, Western Province and students owning a computer had a significantly higher mean score than other students (p < 0.001). In the linear regression analysis, formal computer training was the strongest predictor of computer literacy (β = 13.034), followed by using an internet facility, being from Western Province, using computers for web browsing and computer programming, computer ownership and doing IT (Information Technology) as a subject in the GCE (A/L) examination. Conclusion Sri Lankan medical undergraduates had a low-intermediate level of computer literacy. There is a need to improve computer literacy by increasing computer training in schools or by introducing computer training in the initial stages of the undergraduate programme. These two options require improvement in infrastructure and other resources. PMID:22980096
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use is dependent on perceived abilities at using computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender are highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and gender were related to their use of computer-related applications/tools during class and the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science class.
Distributed GPU Computing in GIScience
NASA Astrophysics Data System (ADS)
Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.
2013-12-01
Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discovery. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges posed by the increasing amount of data from different domains, such as social media, earth observation and environmental sensing (Li et al., 2013), and CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared with the traditional microprocessor, the modern GPU, as a compelling alternative microprocessor, offers outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) on each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be combined into a virtual supercomputer to support CPU-based and GPU-based computing in a distributed environment; and 3) GPUs, as graphics-targeted devices, are used to greatly improve rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. References: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. Visualization and Computer Graphics, IEEE Transactions on, 9(3), 378-394. 2. Li, J., Jiang, Y., Yang, C., Huang, Q., & Rice, M. (2013). Visualizing 3D/4D Environmental Data Using Many-core Graphics Processing Units (GPUs) and Multi-core Central Processing Units (CPUs). Computers & Geosciences, 59(9), 78-89. 3. Owens, J. D., Houston, M., Luebke, D., Green, S., Stone, J. E., & Phillips, J. C. (2008). GPU computing. Proceedings of the IEEE, 96(5), 879-899.
O'Donnell, Michael
2015-01-01
State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and ecological processes that drive community conditions as well as the transitions between these states. However, as the need to model larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks of computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximate 96.6% decrease in computing time. With a single multicore compute node (bottom result), the computing time decreased by 81.8% relative to serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
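A minimal sketch of the embarrassingly parallel pattern the study exploits: independent Monte Carlo replicates of a toy state-and-transition model dispatched across local cores with Python's multiprocessing. The states, transition probabilities and replicate counts are invented for illustration; on a high-throughput cluster each replicate would instead be submitted as a separate job.

```python
# Minimal sketch: each Monte Carlo replicate runs independently, so replicates
# can be farmed out across cores (or cluster jobs) with no communication.
import random
from multiprocessing import Pool

STATES = ["grass", "shrub", "burned"]
TRANSITIONS = {  # hypothetical annual transition probabilities
    "grass": [("grass", 0.90), ("shrub", 0.08), ("burned", 0.02)],
    "shrub": [("shrub", 0.85), ("burned", 0.15)],
    "burned": [("grass", 1.0)],
}

def run_replicate(args):
    seed, years = args
    rng = random.Random(seed)
    state = "grass"
    for _ in range(years):
        choices, weights = zip(*TRANSITIONS[state])
        state = rng.choices(choices, weights=weights)[0]
    return state

if __name__ == "__main__":
    jobs = [(seed, 50) for seed in range(1000)]   # 1000 replicates, 50 years each
    with Pool() as pool:
        final_states = pool.map(run_replicate, jobs)
    print({s: final_states.count(s) for s in STATES})
```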
The impact of individual factors on healthcare staff's computer use in psychiatric hospitals.
Koivunen, Marita; Välimäki, Maritta; Koskinen, Anita; Staggers, Nancy; Katajisto, Jouko
2009-04-01
The study examines whether individual factors of healthcare staff are associated with computer use in psychiatric hospitals. In addition, factors inhibiting staff's optimal use of computers were explored. Computer applications have developed the content of clinical practice and changed patterns of professional working. Healthcare staff need new capacities to work in clinical practice, including the basic computers skills. Computer use amongst healthcare staff has widely been studied in general, but cogent information is still lacking in psychiatric care. Staff's computer use was assessed using a structured questionnaire (The Staggers Nursing Computer Experience Questionnaire). The study population was healthcare staff working in two psychiatric hospitals in Finland (n = 470, response rate = 59%). The data were analysed with descriptive statistics and manova with main effects and two-way interaction effects of six individual factors. Nurses who had more experience of computer use or of the implementation processes of computer systems were more motivated to use computers than those who had less experience of these issues. Males and administrative personnel who were younger had also participated more often than women in implementation processes of computer systems. The most significant factor inhibiting the use of computers was lack of interest in them. In psychiatric hospitals, more direct attention should focus on staff's capacities to use computers and to increase their understanding of the benefits in clinical care, especially for women and ageing staff working in psychiatric hospitals. To avoid exclusion amongst healthcare personnel in information society and to ensure that they have capacities to guide patients on how to use computers or to evaluate the quality of health information on the web, staff's capacities and motivation to use computers in mental health and psychiatric nursing should be ensured.
Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.
Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei
2018-06-15
Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
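A minimal sketch of a finite-improvement-style iteration of the kind the COD algorithm relies on: each sensor unilaterally switches between local execution and offloading only if that lowers its own cost, until no sensor can improve. The cost model and all parameters below are illustrative assumptions, not those of the cited paper.

```python
# Minimal sketch: decentralized best-response updates in a congestion-style
# offloading game. The loop terminates because such games have the finite
# improvement property (every unilateral improvement decreases a potential).

def cost(i, decisions, local_cost, offload_base, congestion):
    if decisions[i] == 0:                       # execute locally
        return local_cost[i]
    offloaders = sum(decisions)                 # contention on cloudlet / AP
    return offload_base[i] + congestion * offloaders

def compute_offloading_equilibrium(local_cost, offload_base, congestion=1.0):
    n = len(local_cost)
    decisions = [0] * n                         # start with everyone local
    improved = True
    while improved:
        improved = False
        for i in range(n):
            current = cost(i, decisions, local_cost, offload_base, congestion)
            decisions[i] ^= 1                   # try the other strategy
            alternative = cost(i, decisions, local_cost, offload_base, congestion)
            if alternative < current:
                improved = True                 # keep the better strategy
            else:
                decisions[i] ^= 1               # revert
    return decisions

if __name__ == "__main__":
    print(compute_offloading_equilibrium(
        local_cost=[10, 4, 9, 3], offload_base=[2, 2, 2, 2]))
```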
Development of a small-scale computer cluster
NASA Astrophysics Data System (ADS)
Wilhelm, Jay; Smith, Justin T.; Smith, James E.
2008-04-01
An increase in demand for computing power in academia has created a need for high performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers, with the proper software, can multiply the performance of a single computer. Cluster computing has therefore become a much sought after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full speed operation and take up more space than rack mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom built desktop computers can be arranged in a rack mount configuration, gaining the space saving of traditional rack mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components that multiplies the performance of a single desktop machine, while minimizing occupied space and remaining cost effective.
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet and apply it to running large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computing resources to help run advanced hydrological models and simulations. A web-based system lets users start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is used for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
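A minimal server-side sketch of the queue idea described above, assuming a relational table of small work units that volunteer clients poll and report back to; the schema, chunking and field names are hypothetical.

```python
# Minimal sketch: simulation work is cut into small chunks, stored in a
# relational table, handed out to volunteer clients as they poll, and marked
# done when results come back.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE chunks (id INTEGER PRIMARY KEY, params TEXT, "
           "status TEXT DEFAULT 'pending', result TEXT)")
db.executemany("INSERT INTO chunks (params) VALUES (?)",
               [(f"subbasin={i}",) for i in range(8)])   # 8 small work units

def next_chunk():
    row = db.execute("SELECT id, params FROM chunks "
                     "WHERE status = 'pending' LIMIT 1").fetchone()
    if row:
        db.execute("UPDATE chunks SET status = 'assigned' WHERE id = ?", (row[0],))
    return row

def report(chunk_id, result):
    db.execute("UPDATE chunks SET status = 'done', result = ? WHERE id = ?",
               (result, chunk_id))

if __name__ == "__main__":
    while (chunk := next_chunk()) is not None:   # stand-in for volunteer clients
        report(chunk[0], f"runoff for {chunk[1]}")
    print(db.execute("SELECT COUNT(*) FROM chunks WHERE status='done'").fetchone()[0])
```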
2013-01-01
Objective. This study compared the relationship between computer experience and performance on computerized cognitive tests and a traditional paper-and-pencil cognitive test in a sample of older adults (N = 634). Method. Participants completed computer experience and computer attitudes questionnaires, three computerized cognitive tests (Useful Field of View (UFOV) Test, Road Sign Test, and Stroop task) and a paper-and-pencil cognitive measure (Trail Making Test). Multivariate analysis of covariance was used to examine differences in cognitive performance across the four measures between those with and without computer experience after adjusting for confounding variables. Results. Although computer experience had a significant main effect across all cognitive measures, the effect sizes were similar. After controlling for computer attitudes, the relationship between computer experience and UFOV was fully attenuated. Discussion. Findings suggest that computer experience is not uniquely related to performance on computerized cognitive measures compared with paper-and-pencil measures. Because the relationship between computer experience and UFOV was fully attenuated by computer attitudes, this may imply that motivational factors are more influential to UFOV performance than computer experience. Our findings support the hypothesis that computer use is related to cognitive performance, and this relationship is not stronger for computerized cognitive measures. Implications and directions for future research are provided. PMID:22929395
Computer and internet use in a community health clinic population.
Peterson, Neeraja B; Dwyer, Kathleen A; Mulvaney, Shelagh A
2009-01-01
To determine if patients from a community health clinic have access to computers and/or the Internet and if they believe a computer is useful in their medical care. A convenience sample of 100 subjects, aged 50 years and older, from a community health clinic in Nashville, Tennessee, completed a structured interview and a health literacy assessment. Of the 100 participants, 40 did not have any computer access, 27 had computer but not Internet access, and 33 had Internet access. Participants with computer access (with or without Internet) had higher incomes, higher educational status, and higher literacy status than those without computer access. Of participants reporting current computer use (n = 54), 33% reported never using their computer to look up health and medical information. Of those who "never'' used their computer for this activity, 54% reported they did not have Internet connectivity, whereas 31% reported they did not know how to use the Internet. Although this group of individuals reported that they were comfortable using a computer (77%), they reported being uncomfortable with accessing the Internet (53%). Not only does access to computers and the Internet need to be improved before widespread use by patients, but computer users will need to be instructed on how to navigate the Internet.
Advanced Computing for Medicine.
ERIC Educational Resources Information Center
Rennels, Glenn D.; Shortliffe, Edward H.
1987-01-01
Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)
ERIC Educational Resources Information Center
Hagge, John
1986-01-01
Focuses on problems encountered with computer-aided writing instruction. Discusses conflicts caused by the computer classroom concept, some general paradoxes and ethical implications of computer-aided instruction. (EL)
Blind topological measurement-based quantum computation.
Morimae, Tomoyuki; Fujii, Keisuke
2012-01-01
Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10^-3, which is comparable to that (7.5 × 10^-3) of non-blind topological quantum computation. As an error per gate of the order of 10^-3 was already achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.
Cloud Computing for radiologists.
Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit
2012-07-01
Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.
Using NCLab-karel to improve computational thinking skill of junior high school students
NASA Astrophysics Data System (ADS)
Kusnendar, J.; Prabawa, H. W.
2018-05-01
Increasing human interaction with technology and the growing complexity of the digital world make computer science education an interesting theme to study. Previous studies on computer literacy and competency reveal that Indonesian teachers in general have fairly high computational skill, but their use of that skill is limited to a few applications, which leads to limited and minimal computer-related learning for students. On the other hand, computer science education is often considered unrelated to real-world solutions. This paper addresses the use of NCLab-Karel in shaping students' computational thinking, which is believed to help students learn about technology. Implementation of Karel shows that it is able to increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computational mindset in students.
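A minimal Karel-style sketch of the kind of exercise such environments pose, written as a toy Python class; the class and its commands are assumptions for illustration, not the actual NCLab-Karel API.

```python
# Minimal sketch: a robot on a grid executes simple commands (move, turn_left,
# put_beeper) that students compose into small algorithms.
class Karel:
    DIRS = [(1, 0), (0, 1), (-1, 0), (0, -1)]   # east, north, west, south

    def __init__(self, width=5, height=5):
        self.x, self.y, self.facing = 0, 0, 0
        self.width, self.height = width, height
        self.beepers = set()

    def move(self):
        dx, dy = self.DIRS[self.facing]
        nx, ny = self.x + dx, self.y + dy
        if 0 <= nx < self.width and 0 <= ny < self.height:
            self.x, self.y = nx, ny

    def turn_left(self):
        self.facing = (self.facing + 1) % 4

    def put_beeper(self):
        self.beepers.add((self.x, self.y))

if __name__ == "__main__":
    karel = Karel()
    for _ in range(4):                  # simple algorithm: beeper at each corner
        for _ in range(4):
            if karel.x in (0, 4) and karel.y in (0, 4):
                karel.put_beeper()
            karel.move()
        karel.turn_left()
    print(sorted(karel.beepers))
```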
Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation
NASA Technical Reports Server (NTRS)
Stocker, John C.; Golomb, Andrew M.
2011-01-01
Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
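A minimal discrete event simulation sketch in the spirit of the model described above: service requests arrive randomly, compete for a fixed pool of virtual servers, and the mean waiting time is measured. All rates and pool sizes are illustrative, not drawn from the cited study.

```python
# Minimal sketch: event-driven simulation of randomly arriving service requests
# contending for a constrained pool of servers, tracking request wait times.
import heapq, random

def simulate(num_servers=4, arrival_rate=3.0, service_rate=1.0, horizon=1000.0, seed=1):
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]
    free = num_servers
    queue = []            # arrival times of waiting requests
    waits = []
    t = 0.0
    while events and t < horizon:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
            if free:
                free -= 1
                waits.append(0.0)
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
            else:
                queue.append(t)
        else:  # departure: either start the next waiting request or free a server
            if queue:
                waits.append(t - queue.pop(0))
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
            else:
                free += 1
    return sum(waits) / len(waits)

if __name__ == "__main__":
    print(f"mean wait: {simulate():.3f}")
```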
Blind topological measurement-based quantum computation
NASA Astrophysics Data System (ADS)
Morimae, Tomoyuki; Fujii, Keisuke
2012-09-01
Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3×10^-3, which is comparable to that (7.5×10^-3) of non-blind topological quantum computation. As an error per gate of the order of 10^-3 was already achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.
Line-plane broadcasting in a data communications network of a parallel computer
Archer, Charles J.; Berg, Jeremy E.; Blocksome, Michael A.; Smith, Brian E.
2010-06-08
Methods, apparatus, and products are disclosed for line-plane broadcasting in a data communications network of a parallel computer, the parallel computer comprising a plurality of compute nodes connected together through the network, the network optimized for point to point data communications and characterized by at least a first dimension, a second dimension, and a third dimension, that include: initiating, by a broadcasting compute node, a broadcast operation, including sending a message to all of the compute nodes along an axis of the first dimension for the network; sending, by each compute node along the axis of the first dimension, the message to all of the compute nodes along an axis of the second dimension for the network; and sending, by each compute node along the axis of the second dimension, the message to all of the compute nodes along an axis of the third dimension for the network.
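A minimal sketch of the three-phase line-plane broadcast pattern described above, simulated on a 3-D grid of node coordinates rather than on real network hardware; dimensions and the root position are arbitrary examples.

```python
# Minimal sketch: the root sends along its x-axis line, every node on that line
# relays along its y-axis line, and every node in that plane relays along its
# z-axis line, reaching all nodes of the 3-D grid in three phases.
def line_plane_broadcast(dims, root, message):
    X, Y, Z = dims
    rx, ry, rz = root
    received = {root: message}
    # Phase 1: along the first dimension (the root's x-axis line)
    for x in range(X):
        received[(x, ry, rz)] = message
    # Phase 2: each line node relays along the second dimension
    for x in range(X):
        for y in range(Y):
            received[(x, y, rz)] = message
    # Phase 3: each plane node relays along the third dimension
    for x in range(X):
        for y in range(Y):
            for z in range(Z):
                received[(x, y, z)] = message
    return received

if __name__ == "__main__":
    nodes = line_plane_broadcast((4, 4, 4), root=(0, 0, 0), message="payload")
    print(len(nodes), "of", 4 * 4 * 4, "nodes reached")
```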
Computer conferencing: Choices and strategies
NASA Technical Reports Server (NTRS)
Smith, Jill Y.
1991-01-01
Computer conferencing permits meeting through the computer while sharing a common file. The primary advantages of computer conferencing are that participants may (1) meet simultaneously or nonsimultaneously, and (2) contribute across geographic distance and time zones. Because of these features, computer conferencing offers a viable meeting option for distributed business teams. Past research and practice are summarized, noting practical uses of computer conferencing as well as types of meeting activities ill suited to the medium. Additionally, effective team strategies are outlined which maximize the benefits of computer conferencing.
A view of Kanerva's sparse distributed memory
NASA Technical Reports Server (NTRS)
Denning, P. J.
1986-01-01
Pentti Kanerva is working on a new class of computers, which are called pattern computers. Pattern computers may close the gap between capabilities of biological organisms to recognize and act on patterns (visual, auditory, tactile, or olfactory) and capabilities of modern computers. Combinations of numeric, symbolic, and pattern computers may one day be capable of sustaining robots. The overview of the requirements for a pattern computer, a summary of Kanerva's Sparse Distributed Memory (SDM), and examples of tasks this computer can be expected to perform well are given.
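A minimal sketch of a Kanerva-style sparse distributed memory: random hard locations, activation of every location within a Hamming radius of the address, counter updates on write, and thresholded sums on read. The sizes and radius below are small illustrative choices, not Kanerva's original parameters.

```python
# Minimal sketch: write by adding the data vector (in +/-1 form) to the counters
# of every hard location within a Hamming radius of the address; read by summing
# the active locations' counters and thresholding at zero.
import numpy as np

class SDM:
    def __init__(self, n_bits=256, n_locations=2000, radius=111, seed=0):
        rng = np.random.default_rng(seed)
        self.hard = rng.integers(0, 2, size=(n_locations, n_bits), dtype=np.int8)
        self.counters = np.zeros((n_locations, n_bits), dtype=np.int32)
        self.radius = radius

    def _active(self, address):
        return np.count_nonzero(self.hard != address, axis=1) <= self.radius

    def write(self, address, data):
        self.counters[self._active(address)] += 2 * data.astype(np.int32) - 1

    def read(self, address):
        return (self.counters[self._active(address)].sum(axis=0) > 0).astype(np.int8)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    mem = SDM()
    pattern = rng.integers(0, 2, size=256, dtype=np.int8)
    mem.write(pattern, pattern)                       # autoassociative store
    noisy = pattern.copy()
    noisy[:20] ^= 1                                   # flip 20 bits of the cue
    print("bits recovered:", int((mem.read(noisy) == pattern).sum()), "/ 256")
```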
Robb, P; Pawlowski, B
1990-05-01
The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark, which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast as or faster than mainframe computers in compute-bound situations.
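A minimal sketch of the kind of estimate such a correlation enables: fit a proportional relation between LINPACK figures and measured ray trace speeds on a few machines, then predict a new machine's ray trace speed from its LINPACK score alone. All numbers below are invented for illustration, not the paper's measurements.

```python
# Minimal sketch: least-squares fit through the origin, ray_speed ~= k * linpack.
measured = [
    (1.2, 0.9),    # (LINPACK MFLOPS, measured ray trace speed), machine A (hypothetical)
    (4.0, 3.1),    # machine B (hypothetical)
    (12.5, 9.8),   # machine C (hypothetical)
]

k = sum(l * r for l, r in measured) / sum(l * l for l, _ in measured)

def predict_ray_speed(linpack_mflops: float) -> float:
    """Estimate ray trace speed from a published LINPACK figure."""
    return k * linpack_mflops

if __name__ == "__main__":
    print(f"k = {k:.3f}, predicted speed at 8.0 MFLOPS: {predict_ray_speed(8.0):.2f}")
```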
Recursive computer architecture for VLSI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Treleaven, P.C.; Hopkins, R.P.
1982-01-01
A general-purpose computer architecture based on the concept of recursion and suitable for VLSI computer systems built from replicated (lego-like) computing elements is presented. The recursive computer architecture is defined by presenting a program organisation, a machine organisation and an experimental machine implementation oriented to VLSI. The experimental implementation is being restricted to simple, identical microcomputers, each containing a memory, a processor and a communications capability. This future generation of lego-like computer systems is termed fifth generation computers by the Japanese. 30 references.
Augmented Computer Mouse Would Measure Applied Force
NASA Technical Reports Server (NTRS)
Li, Larry C. H.
1993-01-01
Proposed computer mouse measures force of contact applied by user. Adds another dimension to two-dimensional-position-measuring capability of conventional computer mouse; force measurement designated to represent any desired continuously variable function of time and position, such as control force, acceleration, velocity, or position along axis perpendicular to computer video display. Proposed mouse enhances sense of realism and intuition in interaction between operator and computer. Useful in such applications as three-dimensional computer graphics, computer games, and mathematical modeling of dynamics.
Hypercluster Parallel Processor
NASA Technical Reports Server (NTRS)
Blech, Richard A.; Cole, Gary L.; Milner, Edward J.; Quealy, Angela
1992-01-01
Hypercluster computer system includes multiple digital processors, operation of which coordinated through specialized software. Configurable according to various parallel-computing architectures of shared-memory or distributed-memory class, including scalar computer, vector computer, reduced-instruction-set computer, and complex-instruction-set computer. Designed as flexible, relatively inexpensive system that provides single programming and operating environment within which one can investigate effects of various parallel-computing architectures and combinations on performance in solution of complicated problems like those of three-dimensional flows in turbomachines. Hypercluster software and architectural concepts are in public domain.
A lightweight distributed framework for computational offloading in mobile cloud computing.
Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul
2014-01-01
The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by the limited processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Mobile Cloud Computing (MCC) therefore leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. A number of computational offloading frameworks have been proposed for MCC in which the intensive components of an application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. This paper therefore presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment, and its lightweight nature is validated by performing computational offloading with the proposed framework and with the latest existing frameworks. Analysis shows that with the proposed framework the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% compared to existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and offers a lightweight solution for computational offloading in MCC.
Wilkinson, Ann; While, Alison E; Roberts, Julia
2009-04-01
This paper is a report of a review to describe and discuss the psychometric properties of instruments used in healthcare education settings measuring experience and attitudes of healthcare students regarding their information and communication technology skills and their use of computers and the Internet for education. Healthcare professionals are expected to be computer and information literate at registration. A previous review of evaluative studies of computer-based learning suggests that methods of measuring learners' attitudes to computers and computer aided learning are problematic. A search of eight health and social science databases located 49 papers, the majority published between 1995 and January 2007, focusing on the experience and attitudes of students in the healthcare professions towards computers and e-learning. An integrative approach was adopted, with narrative description of findings. Criteria for inclusion were quantitative studies using survey tools with samples of healthcare students and concerning computer and information literacy skills, access to computers, experience with computers and use of computers and the Internet for education purposes. Since the 1980s a number of instruments have been developed, mostly in the United States of America, to measure attitudes to computers, anxiety about computer use, information and communication technology skills, satisfaction and more recently attitudes to the Internet and computers for education. The psychometric properties are poorly described. Advances in computers and technology mean that many earlier tools are no longer valid. Measures of the experience and attitudes of healthcare students to the increased use of e-learning require development in line with computer and technology advances.
The Role of Parents and Related Factors on Adolescent Computer Use
Epstein, Jennifer A.
2012-01-01
Background Research has suggested the importance of parents in their adolescents' computer activity. Spending too much time on the computer for recreational purposes in particular has been found to be related to areas of public health concern in children/adolescents, including obesity and substance use. Design and Methods The goal of the research was to determine the association between recreational computer use and potentially linked factors (parental monitoring, social influences to use computers including parents, age of first computer use, self-control, and particular internet activities). Participants (aged 13-17 years and residing in the United States) were recruited via the Internet to complete an anonymous survey online using a survey tool. The target sample of 200 participants who completed the survey was achieved. The sample's average age was 16 years, and 63% were girls. Results A set of regressions with recreational computer use as the dependent variable was run. Conclusions Less parental monitoring, younger age at first computer use, listening to or downloading music from the internet more frequently, using the internet for educational purposes less frequently, and parents' use of the computer for pleasure were related to spending a greater percentage of time on non-school computer use. These findings suggest the importance of parental monitoring and parental computer use on their children's own computer use, and the influence of some internet activities on adolescent computer use. Finally, programs aimed at parents to help them increase the age when their children start using computers and learn how to place limits on recreational computer use are needed. PMID:25170449
Fluid/Structure Interaction Studies of Aircraft Using High Fidelity Equations on Parallel Computers
NASA Technical Reports Server (NTRS)
Guruswamy, Guru; VanDalsem, William (Technical Monitor)
1994-01-01
Aeroelasticity, which involves strong coupling of fluids, structures and controls, is an important element in designing an aircraft. Computational aeroelasticity using low fidelity methods, such as the linear aerodynamic flow equations coupled with the modal structural equations, is well advanced. Though these low fidelity approaches are computationally less intensive, they are not adequate for the analysis of modern aircraft such as the High Speed Civil Transport (HSCT) and the Advanced Subsonic Transport (AST), which can experience complex flow/structure interactions. HSCT can experience vortex induced aeroelastic oscillations, whereas AST can experience structural oscillations associated with transonic buffet. Both aircraft may experience a dip in the flutter speed in the transonic regime. For accurate aeroelastic computations in these complex fluid/structure interaction situations, high fidelity equations such as the Navier-Stokes equations for fluids and the finite-element equations for structures are needed. Computations using these high fidelity equations require large computational resources both in memory and speed. Current conventional supercomputers have reached their limitations in both memory and speed. As a result, parallel computers have evolved to overcome the limitations of conventional computers. This paper addresses the transition that is taking place in computational aeroelasticity from conventional computers to parallel computers, along with the special techniques needed to take advantage of the architecture of new parallel computers. Results are illustrated from computations made on iPSC/860 and IBM SP2 computers using the ENSAERO code, which directly couples the Euler/Navier-Stokes flow equations with high resolution finite-element structural equations.
Computer Literacy for Teachers.
ERIC Educational Resources Information Center
Sarapin, Marvin I.; Post, Paul E.
Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…
High Tech: A Place in Our Lives and in Our Schools.
ERIC Educational Resources Information Center
Roach, John V.
1986-01-01
Discusses various aspects of high technology: computers in cars, computer-assisted design and manufacturing, computers in telephones, video recorders, laser technology, home computers, job training, computer education, and the challenge to the technology teacher. (CT)
48 CFR 227.7202-4 - Contract clause.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Software and Computer Software Documentation 227.7202-4 Contract clause. A specific contract clause governing the Government's rights in commercial computer software or commercial computer software..., release, perform, display, or disclose computer software or computer software documentation shall be...
48 CFR 227.7202-4 - Contract clause.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Software and Computer Software Documentation 227.7202-4 Contract clause. A specific contract clause governing the Government's rights in commercial computer software or commercial computer software..., release, perform, display, or disclose computer software or computer software documentation shall be...
48 CFR 227.7202-4 - Contract clause.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Software and Computer Software Documentation 227.7202-4 Contract clause. A specific contract clause governing the Government's rights in commercial computer software or commercial computer software..., release, perform, display, or disclose computer software or computer software documentation shall be...
48 CFR 227.7202-4 - Contract clause.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Software and Computer Software Documentation 227.7202-4 Contract clause. A specific contract clause governing the Government's rights in commercial computer software or commercial computer software..., release, perform, display, or disclose computer software or computer software documentation shall be...
48 CFR 227.7202-4 - Contract clause.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Software and Computer Software Documentation 227.7202-4 Contract clause. A specific contract clause governing the Government's rights in commercial computer software or commercial computer software..., release, perform, display, or disclose computer software or computer software documentation shall be...
Bridging Social and Semantic Computing - Design and Evaluation of User Interfaces for Hybrid Systems
ERIC Educational Resources Information Center
Bostandjiev, Svetlin Alex I.
2012-01-01
The evolution of the Web brought new interesting problems to computer scientists that we loosely classify in the fields of social and semantic computing. Social computing is related to two major paradigms: computations carried out by a large amount of people in a collective intelligence fashion (i.e. wikis), and performing computations on social…
ERIC Educational Resources Information Center
Berkant, Hasan Güner
2016-01-01
This study investigates faculty of education students' computer self-efficacy beliefs and their attitudes towards computers and implementing computer supported education. This study is descriptive and based on a correlational survey model. The final sample consisted of 414 students studying in the faculty of education of a Turkish university. The…
The Computer Revolution. An Introduction to Computers. A Good Apple Activity Book for Grades 4-8.
ERIC Educational Resources Information Center
Colgren, John
This booklet is designed to introduce computers to children. A letter to parents is provided, explaining that a unit on computers will be taught which will discuss the major parts of the computer and programming in the computer language BASIC. Suggestions for teachers provide information on starting, the binary system, base two worksheet, binary…
Implicit Theories of Creativity in Computer Science in the United States and China
ERIC Educational Resources Information Center
Tang, Chaoying; Baer, John; Kaufman, James C.
2015-01-01
To study implicit concepts of creativity in computer science in the United States and mainland China, we first asked 308 Chinese computer scientists for adjectives that would describe a creative computer scientist. Computer scientists and non-computer scientists from China (N = 1069) and the United States (N = 971) then rated how well those…
ERIC Educational Resources Information Center
Binkley, Zachary Wayne McClellan
2017-01-01
This study investigates computer self-efficacy and computer anxiety within 61 students across two academic majors, Aviation and Sports and Exercise Science, while investigating the impact residential status, age, and gender has on those two psychological constructs. The purpose of the study is to find if computer self-efficacy and computer anxiety…
ERIC Educational Resources Information Center
Lee, Eun-Ju; Nass, Clifford
2002-01-01
Presents two experiments to address the questions of if and how normative social influence operates in anonymous computer-mediated communication and human-computer interaction. Finds that the perception of interaction partner (human vs. computer) moderated the group conformity effect such that the undergraduate student subjects expressed greater…
ERIC Educational Resources Information Center
Vekiri, Ioanna; Chronaki, Anna
2008-01-01
In this study, we examined relations between outside school computer experiences, perceived social support for using computers, and self-efficacy and value beliefs about computer learning for 340 Greek elementary school boys and girls. Participants responded to a questionnaire about their access to computer use outside school (e.g. frequency of…
ERIC Educational Resources Information Center
Guzdial, Mark; Ericson, Barbara; Mcklin, Tom; Engelman, Shelly
2014-01-01
Georgia Computes! ("GaComputes") was a six-year (2006-2012) project to improve computing education across the state of Georgia in the United States, funded by the National Science Foundation. The goal of GaComputes was to broaden participation in computing and especially to engage more members of underrepresented groups which includes…
ERIC Educational Resources Information Center
Johnson, Donald M.; Ferguson, James A.; Lester, Melissa L.
2000-01-01
Of 169 agriculture students surveyed, 79% had computer training, 66% owned computers; they had slightly above average computer self-efficacy, especially in word processing, electronic mail, and Internet use. However, 72.7% scored 60% or less on a test of computer knowledge. There was little correlation between self-efficacy and computer knowledge.…
ERIC Educational Resources Information Center
Shade, Daniel D.
1994-01-01
Provides advice and suggestions for educators or parents who are trying to decide what type of computer to buy to run the latest computer software for children. Suggests that purchasers should buy a computer with as large a hard drive as possible, at least 10 megabytes of RAM, and a CD-ROM drive. (MDM)
ERIC Educational Resources Information Center
Friedman, Batya
This study examines the relationship between societal forces and school computer use in the context of two issues surrounding computer technology: computer property and computer privacy. Four types of data were collected from district administrators, principals, computer teachers, and students over a 9-month period in a high school with a broad,…
ERIC Educational Resources Information Center
Kieren, Thomas E.
This last paper in a set of four reviews research on a wide variety of computer applications in the mathematics classroom. It covers computer-based instruction, especially drill-and-practice and tutorial modes; computer-managed instruction; and computer-augmented problem-solving. Analytical comments on the findings and status of the research are…
Hyperswitch Communication Network Computer
NASA Technical Reports Server (NTRS)
Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.
1993-01-01
Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among the following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.
NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
Small Universal Bacteria and Plasmid Computing Systems.
Wang, Xun; Zheng, Pan; Ma, Tongmao; Song, Tao
2018-05-29
Bacterial computing is a known candidate in natural computing, the aim being to construct "bacterial computers" for solving complex problems. In this paper, a new kind of bacterial computing system, named the bacteria and plasmid computing system (BP system), is proposed. We investigate the computational power of BP systems with finite numbers of bacteria and plasmids. Specifically, we show constructively that a BP system with 2 bacteria and 34 plasmids is Turing universal. The results provide a theoretical cornerstone for constructing powerful bacterial computers and demonstrate that such devices can be built with a "reasonable" number of bacteria and plasmids.
Computational System For Rapid CFD Analysis In Engineering
NASA Technical Reports Server (NTRS)
Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.
1995-01-01
Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.
Identifying failure in a tree network of a parallel computer
Archer, Charles J.; Pinnow, Kurt W.; Wallenfelt, Brian P.
2010-08-24
Methods, parallel computers, and products are provided for identifying failure in a tree network of a parallel computer. The parallel computer includes one or more processing sets including an I/O node and a plurality of compute nodes. For each processing set embodiments include selecting a set of test compute nodes, the test compute nodes being a subset of the compute nodes of the processing set; measuring the performance of the I/O node of the processing set; measuring the performance of the selected set of test compute nodes; calculating a current test value in dependence upon the measured performance of the I/O node of the processing set, the measured performance of the set of test compute nodes, and a predetermined value for I/O node performance; and comparing the current test value with a predetermined tree performance threshold. If the current test value is below the predetermined tree performance threshold, embodiments include selecting another set of test compute nodes. If the current test value is not below the predetermined tree performance threshold, embodiments include selecting from the test compute nodes one or more potential problem nodes and testing individually potential problem nodes and links to potential problem nodes.
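A rough illustration may help fix the flow of this procedure. The sketch below is a hypothetical rendering of the test-value check in plain Python; the scoring formula, the subset selection, and all names are illustrative assumptions, not taken from the patent text.

# Hypothetical sketch of the tree-health check described above; the scoring
# formula and the way problem nodes are flagged are illustrative assumptions.

def current_test_value(io_perf, test_perfs, expected_io_perf):
    # The abstract says the value depends on these three inputs; a simple
    # ratio-times-average combination is assumed here.
    return (io_perf / expected_io_perf) * (sum(test_perfs) / len(test_perfs))

def check_processing_set(io_perf, node_perfs, expected_io_perf, threshold,
                         subset_size=4):
    # Return indices of suspected problem nodes, or [] if none were flagged.
    node_ids = list(range(len(node_perfs)))
    for start in range(0, len(node_ids), subset_size):
        subset = node_ids[start:start + subset_size]
        perfs = [node_perfs[i] for i in subset]
        value = current_test_value(io_perf, perfs, expected_io_perf)
        if value < threshold:
            continue                      # below threshold: try another subset
        mean_perf = sum(perfs) / len(perfs)
        # Not below threshold: flag below-average nodes for individual testing.
        return [i for i in subset if node_perfs[i] < mean_perf]
    return []

# Example: node 2 performs far worse than its peers, so it is flagged.
print(check_processing_set(io_perf=0.95, node_perfs=[1.0, 1.1, 0.2, 1.05],
                           expected_io_perf=1.0, threshold=0.5))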
Comparisons of Physicians' and Nurses' Attitudes towards Computers.
Brumini, Gordana; Ković, Ivor; Zombori, Dejvid; Lulić, Ileana; Bilic-Zulle, Lidija; Petrovecki, Mladen
2005-01-01
Before starting the implementation of integrated hospital information systems, physicians' and nurses' attitudes towards computers were measured by means of a questionnaire. The study was conducted in Dubrava University Hospital, Zagreb, Croatia. Of 194 respondents, 141 were nurses and 53 physicians, randomly selected. They were surveyed using an anonymous questionnaire consisting of 8 closed questions about demographic data, computer science education and computer usage, and 30 statements on attitudes towards computers. The statements were rated on a Likert-type scale. Differences in attitudes towards computers between groups were compared using the Kruskal-Wallis test, with the Mann-Whitney test for post-hoc analysis. The total score represented attitudes towards computers. The physicians' total score was 130 (97-144), while the nurses' total score was 123 (88-141). This indicates that the average answer to all statements was between "agree" and "strongly agree", and these high total scores reflected positive attitudes. Age, computer science education and computer usage were important factors that enhanced the total score. Younger physicians and nurses with computer science education and previous computer experience had more positive attitudes towards computers than others. Our results are important for the planning and implementation of integrated hospital information systems in Croatia.
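For readers unfamiliar with the tests named above, the following is a minimal sketch of the non-parametric comparison (Kruskal-Wallis across groups, Mann-Whitney for the post-hoc pair) using SciPy; the score lists are made-up placeholders, not the study's data.

# Minimal sketch of the group comparison used in the study; the scores below
# are invented placeholders for illustration only.
from scipy.stats import kruskal, mannwhitneyu

physicians = [130, 128, 135, 122, 140, 118]
nurses     = [123, 119, 127, 110, 131, 125]

h, p_kw = kruskal(physicians, nurses)
u, p_mw = mannwhitneyu(physicians, nurses, alternative="two-sided")
print(f"Kruskal-Wallis p={p_kw:.3f}, Mann-Whitney p={p_mw:.3f}")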
Utility Computing: Reality and Beyond
NASA Astrophysics Data System (ADS)
Ivanov, Ivan I.
Utility Computing is not a new concept. It involves organizing and providing a wide range of computing-related services as public utilities. Much like water, gas, electricity and telecommunications, the concept of computing as a public utility was announced in 1955. Utility Computing remained a concept for nearly 50 years. Now some models and forms of Utility Computing are emerging, such as storage and server virtualization, grid computing, and automated provisioning. Recent trends in Utility Computing as a complex technology involve business procedures that could profoundly transform the nature of companies' IT services, organizational IT strategies and technology infrastructure, and business models. In the ultimate Utility Computing models, organizations will be able to acquire as many IT services as they need, whenever and wherever they need them. Based on networked businesses and new secure online applications, Utility Computing would facilitate "agility-integration" of IT resources and services within and between virtual companies. With the application of Utility Computing there could be concealment of the complexity of IT, reduction of operational expenses, and conversion of IT costs to variable `on-demand' services. How far should technology, business and society go to adopt Utility Computing forms, modes and models?
Some Thoughts Regarding Practical Quantum Computing
NASA Astrophysics Data System (ADS)
Ghoshal, Debabrata; Gomez, Richard; Lanzagorta, Marco; Uhlmann, Jeffrey
2006-03-01
Quantum computing has become an important area of research in computer science because of its potential to provide more efficient algorithmic solutions to certain problems than are possible with classical computing. The ability to perform parallel operations over an exponentially large computational space has proved to be the main advantage of the quantum computing model. In this regard, we are particularly interested in the potential applications of quantum computers to enhance real software systems of interest to the defense, industrial, scientific and financial communities. However, while much has been written in the popular and scientific literature about the benefits of the quantum computational model, several of the problems associated with the practical implementation of real-life complex software systems on quantum computers are often ignored. In this presentation we will argue that practical quantum computation is not as straightforward as commonly advertised, even if the technological problems associated with the manufacturing and engineering of large-scale quantum registers were solved overnight. We will discuss some of the frequently overlooked difficulties that plague quantum computing in the areas of memories, I/O, addressing schemes, compilers, oracles, approximate information copying, logical debugging, error correction and fault-tolerant computing protocols.
Panchabhai, T S; Dangayach, N S; Mehta, V S; Patankar, C V; Rege, N N
2011-01-01
Computer usage capabilities of medical students for introduction of computer-aided learning have not been adequately assessed. Cross-sectional study to evaluate computer literacy among medical students. Tertiary care teaching hospital in Mumbai, India. Participants were administered a 52-question questionnaire, designed to study their background, computer resources, computer usage, activities enhancing computer skills, and attitudes toward computer-aided learning (CAL). The data was classified on the basis of sex, native place, and year of medical school, and the computer resources were compared. The computer usage and attitudes toward computer-based learning were assessed on a five-point Likert scale, to calculate Computer usage score (CUS - maximum 55, minimum 11) and Attitude score (AS - maximum 60, minimum 12). The quartile distribution among the groups with respect to the CUS and AS was compared by chi-squared tests. The correlation between CUS and AS was then tested. Eight hundred and seventy-five students agreed to participate in the study and 832 completed the questionnaire. One hundred and twenty eight questionnaires were excluded and 704 were analyzed. Outstation students had significantly lesser computer resources as compared to local students (P<0.0001). The mean CUS for local students (27.0±9.2, Mean±SD) was significantly higher than outstation students (23.2±9.05). No such difference was observed for the AS. The means of CUS and AS did not differ between males and females. The CUS and AS had positive, but weak correlations for all subgroups. The weak correlation between AS and CUS for all students could be explained by the lack of computer resources or inadequate training to use computers for learning. Providing additional resources would benefit the subset of outstation students with lesser computer resources. This weak correlation between the attitudes and practices of all students needs to be investigated. We believe that this gap can be bridged with a structured computer learning program.
Software Systems for High-performance Quantum Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humble, Travis S; Britt, Keith A
Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.
Influence of direct computer experience on older adults' attitudes toward computers.
Jay, G M; Willis, S L
1992-07-01
This research examined whether older adults' attitudes toward computers became more positive as a function of computer experience. The sample comprised 101 community-dwelling older adults aged 57 to 87. The intervention involved a 2-week computer training program in which subjects learned to use a desktop publishing software program. A multidimensional computer attitude measure was used to assess differential attitude change and maintenance of change following training. The results indicated that older adults' computer attitudes are modifiable and that direct computer experience is an effective means of change. Attitude change as a function of training was found for the attitude dimensions targeted by the intervention program: computer comfort and efficacy. In addition, maintenance of attitude change was established for at least two weeks following training.
The RANDOM computer program: A linear congruential random number generator
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1986-01-01
The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
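As a quick illustration of the generator form discussed in the report, here is a minimal LCG sketch in Python; the multiplier, increment, and modulus are common textbook values, not the parameters selected in the report.

# Minimal linear congruential generator: X_{n+1} = (a*X_n + c) mod m.
# Parameters are illustrative textbook values, not those chosen in the report.

def lcg(seed, a=1103515245, c=12345, m=2**31):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=42)
print([next(gen) / 2**31 for _ in range(5)])   # five pseudo-random values in [0, 1)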
Changing from computing grid to knowledge grid in life-science grid.
Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy
2009-09-01
Grid computing has great potential to become a standard cyberinfrastructure for life sciences, which often require high-performance computing and large data handling that exceed the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to large amounts of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in libraries and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations - for example, grid computing. This survey reviews the latest grid technologies from the viewpoints of the computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation aimed at sharing tacit knowledge. By extending the concept of the grid from computing grid to knowledge grid, a grid can serve not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Asymptotic density and effective negligibility
NASA Astrophysics Data System (ADS)
Astor, Eric P.
In this thesis, we join the study of asymptotic computability, a project attempting to capture the idea that an algorithm might work correctly in all but a vanishing fraction of cases. In collaboration with Hirschfeldt and Jockusch, broadening the original investigation of Jockusch and Schupp, we introduce dense computation, the weakest notion of asymptotic computability (requiring only that the correct answer is produced on a set of density 1), and effective dense computation, where every computation halts with either the correct answer or (on a set of density 0) a symbol denoting uncertainty. A few results make more precise the relationship between these notions and work already done with Jockusch and Schupp's original definitions of coarse and generic computability. For all four types of asymptotic computation, including generic computation, we demonstrate that non-trivial upper cones have measure 0, building on recent work of Hirschfeldt, Jockusch, Kuyper, and Schupp in which they establish this for coarse computation. Their result transfers to yield a minimal pair for relative coarse computation; we generalize their method and extract a similar result for relative dense computation (and thus for its corresponding reducibility). However, all of these notions of near-computation treat a set as negligible iff it has asymptotic density 0. Noting that this definition is not computably invariant, this produces some failures of intuition and a break with standard expectations in computability theory. For instance, as shown by Hamkins and Miasnikov, the halting problem is (in some formulations) effectively densely computable, even in polynomial time---yet this result appears fragile, as indicated by Rybalov. In independent work, we respond to this by strengthening the approach of Jockusch and Schupp to avoid such phenomena; specifically, we introduce a new notion of intrinsic asymptotic density, invariant under computable permutation, with rich relations to both randomness and classical computability theory. For instance, we prove that the stochasticities corresponding to permutation randomness and injection randomness coincide, and identify said stochasticity as intrinsic density 1/2. We then define sets of intrinsic density 0 to be effectively negligible, and classify this as a new immunity property, determining its position in the standard hierarchy from immune to cohesive for both general and Delta02 sets. We further characterize the Turing degrees of effectively negligible sets as those which are either high (a' ≥T 0") or compute a DNC (diagonally non-computable) function. In fact, this result holds over RCA0, demonstrating the reverse-mathematical equivalence of the principles ID0 and DOM \\sext DNR. . Replacing Jockusch and Schupp's negligibility (density 0) by effective negligibility (intrinsic density 0), we then obtain new notions of intrinsically dense computation. Finally, we generalize Rice's Theorem to all forms of intrinsic dense computation, showing that no set that is 1-equivalent to a non-trivial index set is intrinsically densely computable; in particular, in contrast to ordinary dense computation, we see that the halting problem cannot be intrinsically densely computable.
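For readers unfamiliar with the underlying notion, the asymptotic density this work builds on can be stated as follows; this is the standard formulation, not a quotation from the thesis.

\[
  \rho_n(A) \;=\; \frac{|A \cap \{0, 1, \dots, n-1\}|}{n},
  \qquad
  \rho(A) \;=\; \lim_{n \to \infty} \rho_n(A) \quad \text{(when the limit exists).}
\]

A set is negligible in the Jockusch-Schupp sense when \rho(A) = 0; intrinsic density additionally demands that this value be preserved under every computable permutation of the natural numbers.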
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
48 CFR 12.212 - Computer software.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...
76 FR 41234 - Advanced Scientific Computing Advisory Committee Charter Renewal
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-13
... Secretariat, General Services Administration, notice is hereby given that the Advanced Scientific Computing... advice and recommendations concerning the Advanced Scientific Computing program in response only to... Advanced Scientific Computing Research program and recommendations based thereon; --Advice on the computing...
Getting that Computer into Your School.
ERIC Educational Resources Information Center
Arnold, Anne Jurmu
1982-01-01
Tips for obtaining computers for educational use are presented in this article, covering grants and foundations, sources of free computers, and offers from the computer companies Apple, Atari, and Tandy/Radio Shack. Also discussed are Commodore Business Machines, Osborne Computing Corporation, and Texas Instruments. (CJ)
Scrap computer recycling in Taiwan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C.H.; Chang, S.L.; Wang, K.M.
1999-07-01
It is estimated that approximately 700,000 scrap personal computers are generated each year in Taiwan. The disposal of such a huge number of scrap computers presents a difficult task for the island due to the scarcity of landfills and incineration facilities available locally. Also, the hazardous materials contained in scrap computers may cause serious pollution to the environment if they are not properly disposed of. Thus, the EPA of Taiwan declared scrap personal computers a producer-responsibility recycling product in July 1997, mandating that manufacturers, importers and sellers of personal computers recover and recycle their scrap computers properly. Beginning on June 1, 1998, a scrap computer recycling plan was officially implemented on the island. Under this plan, consumers can deliver their unwanted personal computers to designated collection points to receive reward money. Currently, only six items are mandated to be recycled under this plan: notebooks, and the monitor, hard disk, power supply, printed circuit board and shell of the personal computer's main frame. This paper presents the current scrap computer recycling system in Taiwan.
Kendon, Vivien M; Nemoto, Kae; Munro, William J
2010-08-13
We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
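The precision point above can be made concrete with a short calculation: a direct state-vector encoding of n qubits stores 2^n complex amplitudes, so each extra qubit (or extra bit of simulated precision) doubles the memory. A minimal Python illustration, not tied to any particular simulator:

import numpy as np

def state_vector_bytes(n_qubits, dtype=np.complex128):
    # A direct encoding holds 2**n amplitudes, each of itemsize bytes.
    return (2 ** n_qubits) * np.dtype(dtype).itemsize

for n in (10, 20, 30, 40):
    print(f"{n:2d} qubits -> {state_vector_bytes(n) / 2**30:,.3f} GiB")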
High school computer science education paves the way for higher education: the Israeli case
NASA Astrophysics Data System (ADS)
Armoni, Michal; Gal-Ezer, Judith
2014-07-01
The gap between enrollments in higher education computing programs and the high-tech industry's demands is widely reported, and is especially prominent for women. Increasing the availability of computer science education in high school is one of the strategies suggested in order to address this gap. We look at the connection between exposure to computer science in high school and pursuing computing in higher education. We also examine the gender gap, in the context of high school computer science education. We show that in Israel, students who took the high-level computer science matriculation exam were more likely to pursue computing in higher education. Regarding the issue of gender, we will show that, in general, in Israel the difference between males and females who take computer science in high school is relatively small, and a larger, though still not very large difference exists only for the highest exam level. In addition, exposing females to high-level computer science in high school has more relative impact on pursuing higher education in computing.
OASIS connections: results from an evaluation study.
Czaja, Sara J; Lee, Chin Chin; Branham, Janice; Remis, Peggy
2012-10-01
The objectives of this study were to evaluate a community-based basic computer and Internet training program designed for older adults, provide recommendations for program refinement, and gather preliminary information on program sustainability. The program was developed by the OASIS Institute, a nonprofit agency serving older adults and implemented in 4 cities by community trainers across the United States. One hundred and ninety-six adults aged 40-90 years were assigned to the training or a wait-list control group. Knowledge of computers and the Internet, attitudes toward computers, and computer/Internet use were assessed at baseline, posttraining, and 3 months posttraining. The program was successful in increasing the computer/Internet skills of the trainees. The data indicated a significant increase in computer and Internet knowledge and comfort with computers among those who received the training. Further, those who completed the course reported an increase in both computer and Internet use 3 months posttraining. The findings indicate that a community-based computer and Internet training program delivered by community instructors can be effective in terms of increasing computer and Internet skills and comfort with computer technology among older adults.
State of the Art of Network Security Perspectives in Cloud Computing
NASA Astrophysics Data System (ADS)
Oh, Tae Hwan; Lim, Shinyoung; Choi, Young B.; Park, Kwang-Roh; Lee, Heejo; Choi, Hyunsang
Cloud computing is now regarded as a social phenomenon that satisfies customers' needs. Customers' needs and the primary principle of economics - gaining maximum benefit from minimum investment - are reflected in the realization of cloud computing. We live in a connected society with a flood of information; without computers connected to the Internet, our daily activities and work would be impossible. Cloud computing is able to provide customers with custom-tailored features of application software and user environments based on the customer's needs by adopting on-demand outsourcing of computing resources through the Internet. It also provides cloud computing users with high-end computing power and expensive application software packages, and accordingly users access their data and application software located on remote systems. As the cloud computing system is connected to the Internet, network security issues of cloud computing must be considered before real-world service. In this paper, a survey of network security issues in cloud computing is presented from the perspective of real-world service environments.
Boevé, Anja J; Meijer, Rob R; Albers, Casper J; Beetsma, Yta; Bosker, Roel J
2015-01-01
The introduction of computer-based testing in high-stakes examining in higher education is developing rather slowly due to institutional barriers (the need for extra facilities, ensuring test security) and teacher and student acceptance. From the existing literature it is unclear whether computer-based exams will yield results similar to paper-based exams and whether student acceptance can change as a result of administering computer-based exams. In this study, we compared results from a computer-based and a paper-based exam in a sample of psychology students and found no differences in total scores across the two modes. Furthermore, we investigated student acceptance and change in acceptance of computer-based examining. After taking the computer-based exam, fifty percent of the students preferred paper-and-pencil exams over computer-based exams and about a quarter preferred a computer-based exam. We conclude that computer-based exam total scores are similar to paper-based exam scores, but that for the acceptance of high-stakes computer-based exams it is important that students practice and become familiar with this new mode of test administration.
Straker, L; Maslen, B; Burgess-Limerick, R; Johnson, P; Dennerlein, J
2010-04-01
Computer use by children is common and there is concern over the potential impact of this exposure on child physical development. Recently principles for child-specific evidence-based guidelines for wise use of computers have been published and these included one concerning the facilitation of appropriate physical development. This paper reviews the evidence and presents detailed guidelines for this principle. The guidelines include encouraging a mix of sedentary and whole body movement tasks, encouraging reasonable postures during computing tasks through workstation, chair, desk, display and input device selection and adjustment and special issues regarding notebook computer use and carriage, computing skills and responding to discomfort. The evidence limitations highlight opportunities for future research. The guidelines themselves can inform parents and teachers, equipment designers and suppliers and form the basis of content for teaching children the wise use of computers. STATEMENT OF RELEVANCE: Many children use computers and computer-use habits formed in childhood may track into adulthood. Therefore child-computer interaction needs to be carefully managed. These guidelines inform those responsible for children to assist in the wise use of computers.
A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing
2016-12-08
project and scientific advances made towards each of the research thrusts throughout the project duration. 1 Project Objectives Cloud computing enables...possibilities that the cloud enables is computation outsourcing, when the client can utilize any necessary computing resources for its computational task...Security considerations, however, stand in the way of harnessing the full benefits of cloud computing and prevent clients from
ERIC Educational Resources Information Center
Dennis, J. Richard; Thomson, David
This paper is concerned with a low cost alternative for providing computer experience to secondary school students. The brief discussion covers the programmable calculator and its relevance for teaching the concepts and the rudiments of computer programming and for computer problem solving. A list of twenty-five programming activities related to…
2010-07-01
Cloud computing, an emerging form of computing in which users have access to scalable, on-demand capabilities that are provided through Internet... cloud computing, (2) the information security implications of using cloud computing services in the Federal Government, and (3) federal guidance and...efforts to address information security when using cloud computing. The complete report is titled Information Security: Federal Guidance Needed to
ERIC Educational Resources Information Center
Osunwusi, Adeyinka Olumuyiwa; Abifarin, Michael Segun
2013-01-01
The aim of this study was to conduct a comparative assessment of computer literacy of private and public secondary school students. Although the definition of computer literacy varies widely, this study treated computer literacy in terms of access to, and use of, computers and the internet, basic knowledge and skills required to use computers and…
Coal-seismic, desktop computer programs in BASIC; Part 7, Display and compute shear-pair seismograms
Hasbrouck, W.P.
1983-01-01
Processing of geophysical data taken with the U.S. Geological Survey's coal-seismic system is done with a desk-top, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report discusses and presents five computer programs used to display and compute shear-pair seismograms.
Cognitive Model Exploration and Optimization: A New Challenge for Computational Science
2010-03-01
the generation and analysis of computational cognitive models to explain various aspects of cognition. Typically the behavior of these models...computational scale of a workstation, so we have turned to high performance computing (HPC) clusters and volunteer computing for large-scale...computational resources. The majority of applications on the Department of Defense HPC clusters focus on solving partial differential equations (Post
[Dutch computer domestication, 1975-1990].
Veraart, Frank
2008-01-01
A computer seems an indispensable tool in twenty-first-century households. Computers, however, did not come as manna from heaven. The domestication and appropriation of computers in Dutch households was a result of activities by various intermediary actors. Computers became household commodities only gradually. Technophile computer hobbyists imported the first computers into the Netherlands from the USA and started small businesses from 1975 onwards. They developed a social network in which computer technology was made available for use by individuals. This network extended itself via shops, clubs, magazines, and other means of acquiring and exchanging computer hardware and software. Hobbyist culture established the software-copying habits of private computer users as well as their ambivalence toward commercial software. They also made the computer into a game machine. Under the impulse of a national policy that aimed at transforming society into an 'Information Society', clubs and other actors extended their activities and tailored them to this new agenda. Hobby clubs presented themselves as consumer organizations and transformed into intermediary actors that filled the gap between suppliers and a growing group of users. They worked hard to give meaning to the (proper) use of computers. A second impulse to the increasing use of computers in the household came from so-called 'private-PC' projects in the late 1980s. In these projects, employers financially aided employees in purchasing their own private PCs. The initially important intermediary actors, such as hobby clubs, lost control, and the agenda for personal computers shifted to interoperability with office equipment. IBM-compatible PCs flooded households. In the household, the new equipment blended with established uses, such as gaming. The copying habits, together with the PC standard, created a risky combination in which computer viruses could spread easily. New roles arose for intermediary actors in guiding and educating computer users. The activities of intermediaries had a lasting influence on contemporary computer use and user preferences. Technical choices and the nature of Dutch computer use in households can be explained by analyzing the historical development of intermediaries and users.
Bringing MapReduce Closer To Data With Active Drives
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Prathapan, S.; Warmka, R.; Wyatt, B.; Halem, M.; Trantham, J. D.; Markey, C. A.
2017-12-01
Moving computation closer to the data location has been a much theorized improvement to computation for decades. The increase in processor performance, the decrease in processor size and power requirement combined with the increase in data intensive computing has created a push to move computation as close to data as possible. We will show the next logical step in this evolution in computing: moving computation directly to storage. Hypothetical systems, known as Active Drives, have been proposed as early as 1998. These Active Drives would have a general-purpose CPU on each disk allowing for computations to be performed on them without the need to transfer the data to the computer over the system bus or via a network. We will utilize Seagate's Active Drives to perform general purpose parallel computing using the MapReduce programming model directly on each drive. We will detail how the MapReduce programming model can be adapted to the Active Drive compute model to perform general purpose computing with comparable results to traditional MapReduce computations performed via Hadoop. We will show how an Active Drive based approach significantly reduces the amount of data leaving the drive when performing several common algorithms: subsetting and gridding. We will show that an Active Drive based design significantly improves data transfer speeds into and out of drives compared to Hadoop's HDFS while at the same time keeping comparable compute speeds as Hadoop.
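As a concrete hint of what "subsetting" looks like in the MapReduce model mentioned above, here is a hypothetical, plain-Python sketch of a map phase that filters records to a bounding box and a reduce phase that averages the subset; no vendor Active Drive API is used or implied, and the records are illustrative.

# Hypothetical MapReduce-style spatial subsetting; plain Python only.
from functools import reduce

records = [
    {"lat": 10.2, "lon": -40.1, "value": 3.0},
    {"lat": 55.7, "lon": 12.6,  "value": 7.5},
    {"lat": 11.0, "lon": -39.0, "value": 4.2},
]

def map_subset(record, bbox):
    # Emit the record only if it falls inside the bounding box (lat0, lat1, lon0, lon1).
    lat0, lat1, lon0, lon1 = bbox
    if lat0 <= record["lat"] <= lat1 and lon0 <= record["lon"] <= lon1:
        yield record

def reduce_mean(acc, record):
    # Accumulate a running (count, sum) pair for the subset.
    count, total = acc
    return count + 1, total + record["value"]

bbox = (5.0, 15.0, -45.0, -35.0)
mapped = (r for rec in records for r in map_subset(rec, bbox))
count, total = reduce(reduce_mean, mapped, (0, 0.0))
print(count, total / count if count else float("nan"))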
78 FR 53237 - Airworthiness Directives; Airbus Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-29
... control secondary computers (FCSCs), rather than flight control primary computers (FCPCs). This document... control primary computers (FCPCs); modifying two flight control secondary computers (FCSCs); revising the... the AD, which specify FCSCs, instead of flight control primary computers FCPCs. No other part of the...
Applying IRSS Theory: The Clark Atlanta University Exemplar
ERIC Educational Resources Information Center
Payton, Fay Cobb; Suarez-Brown, Tiki L.; Smith Lamar, Courtney
2012-01-01
The percentage of underrepresented minorities (African-American, Hispanic, Native Americans) that have obtained graduate level degrees within computing disciplines (computer science, computer information systems, computer engineering, and information technology) is dismal at best. Despite the fact that academia, the computing workforce,…
ERIC Educational Resources Information Center
Birken, Marvin N.
1967-01-01
Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economics, physical, and esthetic characteristics, and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…
Research Trends in Computational Linguistics. Conference Report.
ERIC Educational Resources Information Center
Center for Applied Linguistics, Washington, DC.
This document contains the reports summarizing the main discussion held during the March 1972 Computational Linguistics Conference. The first report, "Computational Linguistics and Linguistics," helps to establish definitions and an understanding of the scope of computational linguistics. "Integrated Computer Systems for Language" and…
Computer Learning for Young Children.
ERIC Educational Resources Information Center
Choy, Anita Y.
1995-01-01
Computer activities that combine education and entertainment make learning easy and fun for preschoolers. Computers encourage social skills, language and literacy skills, cognitive development, problem solving, and eye-hand coordination. The paper describes one teacher's experiences setting up a computer center and using computers with…
Why Don't All Professors Use Computers?
ERIC Educational Resources Information Center
Drew, David Eli
1989-01-01
Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…
ERIC Educational Resources Information Center
T.H.E. Journal, 2004
2004-01-01
The use of instructional technology has evolved over the last two decades. Initially, instructional technology had two uses: learning about computers and using computers to increase basic skills. Learning about computers morphed into computer literacy, which is typically defined as the history, terminology and background of computing, using…
48 CFR 52.227-14 - Rights in Data-General.
Code of Federal Regulations, 2010 CFR
2010-10-01
... software. Computer software—(1) Means (i) Computer programs that comprise a series of instructions, rules... or computer software documentation. Computer software documentation means owner's manuals, user's... medium, that explain the capabilities of the computer software or provide instructions for using the...
48 CFR 53.105 - Computer generation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...
48 CFR 53.105 - Computer generation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...
Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM
2011-01-18
Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.
Ludwig-Beymer, Patti; Williams, Phillip; Stimac, Ellen
2012-01-01
This research examined bedside medication verification administration in 2 adult critical care units, using portable computers and permanent bedside computers. There were no differences in the number of near-miss errors, the time to administer the medications, or nurse perception of ease of medication administration, care of patients, or reliability of technology. The percentage of medications scanned was significantly higher with the use of permanent bedside computers, and nurses using permanent bedside computers were more likely to agree that the computer was always available.
Cloud Computing Security Issue: Survey
NASA Astrophysics Data System (ADS)
Kamal, Shailza; Kaur, Rajpreet
2011-12-01
Cloud computing has been a growing field in the IT industry since 2007, when it was proposed by IBM. Other companies such as Google, Amazon, and Microsoft provide further cloud computing products. Cloud computing is Internet-based computing that shares resources and information on demand. It provides services such as SaaS, IaaS and PaaS. The services and resources are shared through virtualization, which runs multiple applications on the cloud. This discussion surveys the challenges and security issues in cloud computing and describes some standards and protocols that show how security can be managed.
Operating Dedicated Data Centers - Is It Cost-Effective?
NASA Astrophysics Data System (ADS)
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
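To give a sense of what such a comparison involves, the sketch below parameterizes a dedicated-cluster cost per core-hour against an on-demand price; every number here is a placeholder assumption, not a figure from the RACF analysis.

# Parametric sketch of a dedicated-vs-cloud comparison; all values are placeholders.
def dedicated_cost_per_core_hour(capex, lifetime_years, cores,
                                 annual_opex, utilization):
    hours = lifetime_years * 8760 * cores * utilization
    return (capex + annual_opex * lifetime_years) / hours

dedicated = dedicated_cost_per_core_hour(capex=500_000, lifetime_years=4,
                                         cores=2_000, annual_opex=150_000,
                                         utilization=0.85)
cloud = 0.05        # assumed on-demand price per core-hour, also a placeholder
print(f"dedicated ~${dedicated:.3f}/core-hour vs cloud ~${cloud:.3f}/core-hour")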
[Measurement of intracranial hematoma volume by personal computer].
DU, Wanping; Tan, Lihua; Zhai, Ning; Zhou, Shunke; Wang, Rui; Xue, Gongshi; Xiao, An
2011-01-01
To explore a method for measuring intracranial hematoma volume with a personal computer. Forty cases of various intracranial hematomas were measured by computed tomography with quantitative software and by a personal computer with Photoshop CS3 software, respectively. The data from the two methods were analyzed and compared. There was no difference between the data from the computed tomography and the personal computer measurements (P>0.05). A personal computer with Photoshop CS3 software can measure the volume of various intracranial hematomas precisely, rapidly and simply, and is recommended for clinical medicolegal identification.
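The measurement reduces to a slice-summation estimate: count hematoma pixels on each CT slice, convert to area, multiply by slice thickness, and sum. A hypothetical sketch, with illustrative pixel size and slice thickness rather than values from the study:

# Slice-summation volume estimate; pixel area and thickness are illustrative.
def hematoma_volume_ml(pixel_counts, pixel_area_mm2=0.25, slice_thickness_mm=5.0):
    # Volume in millilitres from per-slice hematoma pixel counts.
    volume_mm3 = sum(n * pixel_area_mm2 * slice_thickness_mm for n in pixel_counts)
    return volume_mm3 / 1000.0        # 1 mL = 1000 mm^3

print(hematoma_volume_ml([1200, 1850, 2100, 1600, 900]))   # ~9.6 mL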
Consulting room computers and their effect on general practitioner-patient communication.
Noordman, Janneke; Verhaak, Peter; van Beljouw, Ilse; van Dulmen, Sandra
2010-12-01
In the Western medical world, computers form part of the standard equipment in the consulting rooms of most GPs. As the use of a computer requires time and attention from GPs, this may well interfere with the communication process. Yet, the information accessed on the computer may also enhance communication. The present study affords insight into the relationship between computer use and GP-patient communication recorded by the same GPs over two periods. Videotaped GP consultations collected in 2001 and 2008 were used to observe computer use and GP-patient communication. In addition, patients' questionnaires about their experiences with communication by the GP were analysed using multilevel models with patients (Level 1) nested within GPs (Level 2). Both in 2008 and in 2001, GPs used their computer in almost every consultation. Still, our study showed a change in computer use by the GPs over time. In addition, the results indicate that computer use is negatively related to some communication aspects: the patient-directed gaze of the GP and the amount of information given by GPs. There is also a negative association between computer use and the body posture of the GP. Computer use by GPs is not associated with other (analysed) non-verbal and verbal behaviour of GPs and patients. Moreover, computer use is scarcely related to patients' experiences with the communication behaviour of the GP. GPs showed greater reluctance to use computers in 2008 compared to 2001. Computer use can indeed affect the communication between GPs and patients. Therefore, GPs ought to remain aware of their computer use during consultations and at the same time keep the interaction with the patient alive.
Undergraduate computational physics projects on quantum computing
NASA Astrophysics Data System (ADS)
Candela, D.
2015-08-01
Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
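In the spirit of the first projects described, the following is a minimal state-vector sketch of one Grover iteration on 2 qubits (4 items) in Python with NumPy; it is a generic illustration, not code from the article, and one iteration already suffices for N = 4.

# Grover's search on 2 qubits: oracle sign-flip followed by inversion about the mean.
import numpy as np

n = 2
N = 2 ** n
marked = 3                               # index of the "marked" item

state = np.full(N, 1 / np.sqrt(N))       # uniform superposition

oracle = np.eye(N)
oracle[marked, marked] = -1              # flip the sign of the marked amplitude

diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

state = diffusion @ (oracle @ state)     # one Grover iteration
print(np.abs(state) ** 2)                # probability ~1.0 on the marked index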
NASA Technical Reports Server (NTRS)
Smith, Paul H.
1988-01-01
The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.
Adaptation of a program for nonlinear finite element analysis to the CDC STAR 100 computer
NASA Technical Reports Server (NTRS)
Pifko, A. B.; Ogilvie, P. L.
1978-01-01
The conversion of a nonlinear finite element program to the CDC STAR 100 pipeline computer is discussed. The program, called DYCAST, was developed for the crash simulation of structures. Initial results with the STAR 100 computer indicated that significant gains in computation time are possible for operations on global arrays. However, for element-level computations that do not lend themselves easily to long vector processing, the STAR 100 was slower than comparable scalar computers. On this basis it is concluded that, in order for pipeline computers to affect the economic feasibility of large nonlinear analyses, it is absolutely essential that algorithms be devised to improve the efficiency of element-level computations.
Understanding and preventing computer vision syndrome.
Loh, K. Y.; Reddy, S. C.
2008-01-01
The invention of the computer and advances in information technology have revolutionized and benefited society, but at the same time have caused usage-related symptoms such as ocular strain, irritation, redness, dryness, blurred vision and double vision. This cluster of symptoms is known as computer vision syndrome, characterized by the visual symptoms that result from interaction with a computer display or its environment. Three major mechanisms lead to computer vision syndrome: the extraocular mechanism, the accommodative mechanism and the ocular surface mechanism. Display characteristics such as brightness, resolution, glare and quality are all known factors that contribute to computer vision syndrome. Prevention is the most important strategy in managing computer vision syndrome. Modification of the ergonomics of the working environment, patient education and proper eye care are crucial in managing computer vision syndrome.
Multiple node remote messaging
Blumrich, Matthias A.; Chen, Dong; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Ohmacht, Martin; Salapura, Valentina; Steinmacher-Burow, Burkhard; Vranas, Pavlos
2010-08-31
A method for passing remote messages in a parallel computer system formed as a network of interconnected compute nodes includes that a first compute node (A) sends a single remote message to a remote second compute node (B) in order to control the remote second compute node (B) to send at least one remote message. The method includes various steps including controlling a DMA engine at first compute node (A) to prepare the single remote message to include a first message descriptor and at least one remote message descriptor for controlling the remote second compute node (B) to send at least one remote message, including putting the first message descriptor into an injection FIFO at the first compute node (A) and sending the single remote message and the at least one remote message descriptor to the second compute node (B).
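As a rough illustration only (this is a patent abstract, and the sketch below is not the patented implementation), the remote-messaging idea can be mimicked in Python, with illustrative MessageDescriptor and ComputeNode classes and a deque standing in for the DMA engine's injection FIFO:

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class MessageDescriptor:
    # Illustrative fields only: who should receive the payload, and the payload itself.
    dest_node: int
    payload: bytes

@dataclass
class RemoteSendRequest:
    """Single message from node A that tells node B to send further messages."""
    first_descriptor: MessageDescriptor                      # describes the A -> B transfer itself
    remote_descriptors: list = field(default_factory=list)   # descriptors B will inject

class ComputeNode:
    def __init__(self, node_id):
        self.node_id = node_id
        self.injection_fifo = deque()   # stands in for the DMA engine's injection FIFO

    def send_remote(self, request):
        # Node A: enqueue the first descriptor locally, then ship the request to B.
        self.injection_fifo.append(request.first_descriptor)
        return request

    def handle_remote(self, request):
        # Node B: inject the carried descriptors so its own DMA engine sends them on.
        for desc in request.remote_descriptors:
            self.injection_fifo.append(desc)

# Usage: A asks B to forward a payload to a third node (id 2).
node_a, node_b = ComputeNode(0), ComputeNode(1)
req = RemoteSendRequest(
    first_descriptor=MessageDescriptor(dest_node=1, payload=b"control"),
    remote_descriptors=[MessageDescriptor(dest_node=2, payload=b"data")],
)
node_b.handle_remote(node_a.send_remote(req))
print(len(node_a.injection_fifo), len(node_b.injection_fifo))  # 1 1
```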
Cloud computing applications for biomedical science: A perspective.
Navale, Vivek; Bourne, Philip E
2018-06-01
Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Kristan D.; Faraj, Daniel A.
In a parallel computer, a largest logical plane from a plurality of logical planes formed of compute nodes of a subcommunicator may be identified by: identifying, by each compute node of the subcommunicator, all logical planes that include the compute node; calculating, by each compute node for each identified logical plane that includes the compute node, an area of the identified logical plane; initiating, by a root node of the subcommunicator, a gather operation; receiving, by the root node from each compute node of the subcommunicator, each node's calculated areas as contribution data to the gather operation; and identifying, by the root node in dependence upon the received calculated areas, a logical plane of the subcommunicator having the greatest area.
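The gather-based pattern described above can be sketched, purely as an illustration and not as the disclosed method, with mpi4py; the plane identifiers and areas below are fabricated placeholders:

```python
# Run with e.g.: mpiexec -n 4 python largest_plane.py
from mpi4py import MPI

comm = MPI.COMM_WORLD            # stands in for the subcommunicator
rank = comm.Get_rank()

# Fabricated per-node data: each node pretends it belongs to two logical planes
# and has already calculated their areas, as (plane id, area) pairs.
my_areas = [(2 * rank, rank + 1), (2 * rank + 1, 2 * (rank + 1))]

# Root gathers every node's calculated areas as contribution data.
all_areas = comm.gather(my_areas, root=0)

if rank == 0:
    flat = [pair for node_areas in all_areas for pair in node_areas]
    plane_id, area = max(flat, key=lambda p: p[1])
    print(f"largest logical plane: id {plane_id}, area {area}")
```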
Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years cloud computing technology has developed rapidly, especially open source cloud computing, which has attracted a large user base through its openness and low cost and has now reached large-scale promotion and application. In this paper we first briefly introduce the main functions and architecture of the open source cloud platform OpenStack, and then discuss the core problems of computer labs in colleges and universities. Building on this analysis, we describe the specific application and deployment of OpenStack in a university computer room. The experimental results show that OpenStack can be used to deploy a university computer-room cloud efficiently and conveniently, with stable performance and good functional value.
Computers in medicine: patients' attitudes
Cruickshank, P. J.
1984-01-01
Data are presented from two surveys where a 26-item questionnaire was used to measure patients' attitudes to diagnostic computers and to medical computers in general. The first group of respondents were 229 patients who had been given outpatient appointments at a hospital general medical clinic specializing in gastrointestinal problems, where some had experienced a diagnostic computer in use. The second group of respondents were 416 patients attending a group general practice where there was no computer. Patients who had experience of the diagnostic computer or a personal computer had more favourable attitudes to computers in medicine as did younger people and males. The two samples of patients showed broadly similar attitudes, and a notable finding was that over half of each group believed that, with a computer around, the personal touch of the doctor would be lost. PMID:6471021
Integrating Computational Science Tools into a Thermodynamics Course
NASA Astrophysics Data System (ADS)
Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew
2018-01-01
Computational tools and methods have permeated multiple science and engineering disciplines because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.
Universal blind quantum computation for hybrid system
NASA Astrophysics Data System (ADS)
Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang
2017-08-01
As progress toward building quantum computers continues, first-generation practical quantum computers will be available to ordinary users in the cloud, much as IBM's Quantum Experience is today. Clients will be able to access the quantum servers remotely using simple devices, so it is of prime importance to protect the security of the client's information. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any privacy. To date, blind quantum computation has been considered only for an individual quantum system. However, a practical universal quantum computer is likely to be a hybrid system. Here, we take the first step toward constructing a framework of blind quantum computation for hybrid systems, which provides a more feasible path to scalable blind quantum computation.
Blind Quantum Signature with Blind Quantum Computation
NASA Astrophysics Data System (ADS)
Li, Wei; Shi, Ronghua; Guo, Ying
2017-04-01
Blind quantum computation allows a client without quantum abilities to interact with a quantum server to perform an unconditionally secure computation while protecting the client's privacy. Motivated by the confidentiality of blind quantum computation, a blind quantum signature scheme with a compact structure is designed. Unlike traditional signature schemes, the signing and verifying operations are performed through measurement-based quantum computation. Inputs to the blind quantum computation are securely controlled with multi-qubit entangled states. The unique signature of the transmitted message is generated by the signer without leaking information over imperfect channels, and the receiver can verify the validity of the signature using the quantum matching algorithm. Security is guaranteed by the entanglement of the quantum system used for blind quantum computation. The scheme provides a potential practical application for e-commerce in cloud computing and first-generation quantum computation.
High-Performance Computing Systems and Operations | Computational Science | NREL
NREL operates high-performance computing (HPC) systems dedicated to advancing energy efficiency and renewable energy technologies; its capabilities include high-performance computing systems and the operations that support them.
The Mathematics of Computer Error.
ERIC Educational Resources Information Center
Wood, Eric
1988-01-01
Why a computer error occurred is considered by analyzing the binary system and decimal fractions. How the computer stores numbers is then described. Knowledge of the mathematics behind computer operation is important if one wishes to understand and have confidence in the results of computer calculations. (MNS)
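The kind of error the article analyzes comes from decimal fractions having no exact binary representation; a short Python illustration (not from the article itself):

```python
from decimal import Decimal

# 0.1 has no finite binary expansion, so the stored double is only approximate.
print(Decimal(0.1))          # 0.1000000000000000055511151231257827...
print(0.1 + 0.2 == 0.3)      # False: each side carries a different rounding error
print(0.1 + 0.2)             # 0.30000000000000004

# Summing many copies of 0.1 drifts away from the exact answer.
total = sum(0.1 for _ in range(1000))
print(total)                 # close to, but not exactly, 100
```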
Employment Opportunities for the Handicapped in Programmable Automation.
ERIC Educational Resources Information Center
Swift, Richard; Leneway, Robert
A Computer Integrated Manufacturing System may make it possible for severely disabled people to custom design, machine, and manufacture either wood or metal parts. Programmable automation merges computer aided design, computer aided manufacturing, computer aided engineering, and computer integrated manufacturing systems with automated production…
ERIC Educational Resources Information Center
Peelle, Howard A.
Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2013 CFR
2013-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2010 CFR
2010-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2014 CFR
2014-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2011 CFR
2011-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
48 CFR 227.7203-17 - Overseas contracts with foreign sources.
Code of Federal Regulations, 2012 CFR
2012-10-01
... COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-17 Overseas contracts with foreign sources. (a) The clause at 252.227-7032, Rights in Technical Data and Computer Software... Noncommercial Computer Software and Noncommercial Computer Software Documentation, when the Government requires...
Pervasive Computing Goes to School
ERIC Educational Resources Information Center
Plymale, William O.
2005-01-01
In 1991 Mark Weiser introduced the idea of ubiquitous computing: a world in which computers and associated technologies become invisible, and thus indistinguishable from everyday life. This invisible computing is accomplished by means of "embodied virtuality," the process of drawing computers into the physical world. Weiser proposed that…
48 CFR 352.227-14 - Rights in Data-Exceptional Circumstances.
Code of Federal Regulations, 2014 CFR
2014-10-01
....] Computer database or database means a collection of recorded information in a form capable of, and for the... databases or computer software documentation. Computer software documentation means owner's manuals, user's... nature (including computer databases and computer software documentation). This term does not include...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 2 2013-07-01 2013-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...
21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 2014-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 2012-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
Engineering and Computing Portal to Solve Environmental Problems
NASA Astrophysics Data System (ADS)
Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.
2018-01-01
This paper describes the architecture and services of the Engineering and Computing Portal, a comprehensive solution that provides access to high-performance computing resources and enables users to carry out computational experiments, teach parallel technologies, and solve computing tasks, including those related to technogenic safety.
Information Sources on Computer Literacy.
ERIC Educational Resources Information Center
Ossman, Marian R.
1984-01-01
Cites books, journals, articles, and speeches covering the gamut from computer literacy as a national crisis to a current listing of popular computer camps, educational computing, library role, and staff training. Primary focus is on microcomputers, but several less recent articles are oriented to computers in general. (MBR)
Computer-Generated Feedback on Student Writing
ERIC Educational Resources Information Center
Ware, Paige
2011-01-01
A distinction must be made between "computer-generated scoring" and "computer-generated feedback". Computer-generated scoring refers to the provision of automated scores derived from mathematical models built on organizational, syntactic, and mechanical aspects of writing. In contrast, computer-generated feedback, the focus of this article, refers…
Student Computer Dialogs Without Special Purpose Languages.
ERIC Educational Resources Information Center
Bork, Alfred
The phrase "student computer dialogs" refers to interactive sessions between the student and the computer. Rather than using programing languages specifically designed for computer assisted instruction (CAI), existing general purpose languages should be emphasized in the future development of student computer dialogs, as the power and…
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 2 2010-07-01 2010-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow parameter...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 2 2011-07-01 2011-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
Davidson, R W
1985-01-01
The increasing need to communicate and exchange data can be handled by personal microcomputers. The need to transfer information stored on one type of personal computer to another is often encountered when integrating multiple sources of information stored on different, incompatible computers in medical research and practice. A practical example is demonstrated with two relatively inexpensive, commonly used computers, the IBM PCjr and the Apple IIe. The basic input/output (I/O) interface chips for serial communication on each computer are joined using a null-modem connector and cable to form a communications link. Using the BASIC (Beginner's All-purpose Symbolic Instruction Code) language and the Disk Operating System (DOS), the communications handshaking protocol and file transfer are established between the two computers. The BASIC dialects used are Applesoft (Apple personal computer) and PC BASIC (IBM personal computer).
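A modern analogue of this serial link can be sketched with the pyserial package; the port name, handshake strings, and file name below are hypothetical, and this is not the original Applesoft/PC BASIC code:

```python
import serial  # pyserial; the port name below is hypothetical

# Open the serial port that the null-modem cable is attached to.
port = serial.Serial("/dev/ttyS0", baudrate=9600, bytesize=8,
                     parity=serial.PARITY_NONE, stopbits=1, timeout=5)

# Simple handshake: announce the file, wait for the receiver's acknowledgement.
port.write(b"BEGIN report.txt\r\n")
ack = port.readline()
if ack.strip() == b"ACK":
    with open("report.txt", "rb") as f:
        port.write(f.read())          # push the file contents down the line
    port.write(b"\r\nEND\r\n")
port.close()
```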
Blind topological measurement-based quantum computation
Morimae, Tomoyuki; Fujii, Keisuke
2012-01-01
Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf–Harrington–Goyal scheme. The error threshold of our scheme is 4.3×10^-3, which is comparable to that (7.5×10^-3) of non-blind topological quantum computation. As an error per gate of the order of 10^-3 has already been achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach. PMID:22948818
Nofre, David; Priestley, Mark; Alberts, Gerard
2014-01-01
Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.
Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing
NASA Astrophysics Data System (ADS)
Klems, Markus; Nimis, Jens; Tai, Stefan
On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has long been an objective of distributed computing research and industry. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability of Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measuring the costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and comparing these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real-world scenarios.
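To illustrate the kind of comparison such a framework supports, here is a toy calculation with entirely hypothetical cost figures (not drawn from the paper):

```python
# Hypothetical figures: compare pay-per-use cloud cost with a fixed on-premise server.
hourly_rate = 1.50          # $ per instance-hour in the cloud
hours_per_month = 730
utilization = 0.35          # fraction of the month the workload actually runs

cloud_monthly = hourly_rate * hours_per_month * utilization

server_purchase = 6000.0    # on-premise hardware, amortized over 36 months
admin_and_power = 180.0     # $ per month
onprem_monthly = server_purchase / 36 + admin_and_power

print(f"cloud:   ${cloud_monthly:8.2f}/month")
print(f"on-prem: ${onprem_monthly:8.2f}/month")

# Utilization at which the pay-per-use cloud cost equals the fixed on-premise cost.
breakeven = onprem_monthly / (hourly_rate * hours_per_month)
print(f"break-even utilization: {breakeven:.0%}")
```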
[The effects of computer-use on adolescents].
Stefănescu, C; Chele, Gabriela; Chiriţă, V; Chiriţă, Roxana; Mavros, M; Macarie, G; Ilinca, M
2005-01-01
Computers continue to play a vital role for today's generation, and the need for information about the effects of computers on their users is increasing. The purpose of this study is to investigate how children and adolescents use computers and to explore the beneficial and harmful effects of computer use on children's mental and physical health. The study group comprised 69 subjects, aged between 13 and 18 years, who answered a questionnaire; the parents of the children answered another questionnaire on the same subject. Data were statistically processed using SPSS. Results were obtained on patterns of computer use, and pathological use was identified. Some children spend a great deal of time on computers; 4% spend more than five hours per day. 41% of the parents believe that computer use is favorable to children's mental and physical health and development, while 49% believe that the computer may be harmful. 1.4% of the children showed pathological computer use.
High-Productivity Computing in Computational Physics Education
NASA Astrophysics Data System (ADS)
Tel-Zur, Guy
2011-03-01
We describe the development of a new course in Computational Physics at Ben-Gurion University. This elective course for third-year undergraduates and M.Sc. students is taught over one semester. Computational Physics is by now well accepted as the Third Pillar of Science. This paper's claim is that modern Computational Physics education should also address High-Productivity Computing. The traditional approach to teaching Computational Physics emphasizes ``Correctness'' and then ``Accuracy''; we add ``Performance.'' Along with topics in Mathematical Methods and case studies in Physics, the course devotes a significant amount of time to ``Mini-Courses'' on topics such as: High-Throughput Computing with Condor, Parallel Programming with MPI and OpenMP, How to Build a Beowulf, Visualization, and Grid and Cloud Computing. The course is not intended to teach new physics or new mathematics; it is focused on an integrated approach to solving problems, starting from the physics problem, through the corresponding mathematical solution and the numerical scheme, to writing an efficient computer code and finally analysis and visualization.
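In the spirit of the MPI mini-course mentioned above, a minimal mpi4py example is sketched below (an assumption of this sketch, not material from the course): it estimates pi by midpoint-rule quadrature split across ranks.

```python
# Run with e.g.: mpiexec -n 4 python pi_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Midpoint-rule quadrature of 4/(1+x^2) on [0,1]; each rank takes every size-th slice.
n = 1_000_000
h = 1.0 / n
local = sum(4.0 / (1.0 + ((i + 0.5) * h) ** 2) for i in range(rank, n, size)) * h

pi = comm.reduce(local, op=MPI.SUM, root=0)
if rank == 0:
    print(f"pi ~= {pi:.12f}")
```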
Quantum computation for solving linear systems
NASA Astrophysics Data System (ADS)
Cao, Yudong
Quantum computation is a subject born out of the combination of physics and computer science. It studies how the laws of quantum mechanics can be exploited to perform computations much more efficiently than current computers (termed classical computers, as opposed to quantum computers). The thesis starts by introducing ideas from quantum physics and theoretical computer science and, based on these ideas, introduces the basic concepts of quantum computing. These introductory discussions are intended for non-specialists to obtain the essential knowledge needed for understanding the new results presented in the subsequent chapters. After introducing the basics of quantum computing, we focus on the recently proposed quantum algorithm for linear systems. The new results include (i) special instances of quantum circuits that can be implemented using current experimental resources, and (ii) detailed quantum algorithms that are suitable for a broader class of linear systems. We show that for some particular problems the quantum algorithm is able to achieve exponential speedup over its classical counterparts.
Computational Aeroelastic Modeling of Airframes and TurboMachinery: Progress and Challenges
NASA Technical Reports Server (NTRS)
Bartels, R. E.; Sayma, A. I.
2006-01-01
Computational analyses such as computational fluid dynamics and computational structural dynamics have made major advances toward maturity as engineering tools. Computational aeroelasticity is the integration of these disciplines. As computational aeroelasticity matures it too finds an increasing role in the design and analysis of aerospace vehicles. This paper presents a survey of the current state of computational aeroelasticity with a discussion of recent research, success and continuing challenges in its progressive integration into multidisciplinary aerospace design. This paper approaches computational aeroelasticity from the perspective of the two main areas of application: airframe and turbomachinery design. An overview will be presented of the different prediction methods used for each field of application. Differing levels of nonlinear modeling will be discussed with insight into accuracy versus complexity and computational requirements. Subjects will include current advanced methods (linear and nonlinear), nonlinear flow models, use of order reduction techniques and future trends in incorporating structural nonlinearity. Examples in which computational aeroelasticity is currently being integrated into the design of airframes and turbomachinery will be presented.
Effects of portable computing devices on posture, muscle activation levels and efficiency.
Werth, Abigail; Babski-Reeves, Kari
2014-11-01
Very little research exists on ergonomic exposures when using portable computing devices. This study quantified muscle activity (forearm and neck), posture (wrist, forearm and neck), and performance (gross typing speed and error rates) differences across three portable computing devices (laptop, netbook, and slate computer) and two work settings (desk and sofa) during data entry tasks. Twelve participants completed test sessions on a single computer using a test-rest-test protocol (30 min of work at one work setting, 15 min of rest, 30 min of work at the other work setting). The slate computer resulted in significantly more non-neutral wrist, elbow and neck postures, particularly when working on the sofa. Performance on the slate computer was four times lower than on the other computers, though lower muscle activity levels were also found. The potential for injury or illness may be elevated when working on smaller, portable computers in non-traditional work settings. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Computation of Standard Errors
Dowd, Bryan E; Greene, William H; Norton, Edward C
2014-01-01
Objectives: We discuss the problem of computing the standard errors of functions involving estimated parameters and provide the relevant computer code for three different computational approaches using two popular computer packages. Study Design: We show how to compute the standard errors of several functions of interest: the predicted value of the dependent variable for a particular subject, and the effect of a change in an explanatory variable on the predicted value of the dependent variable, both for an individual subject and averaged over a sample of subjects. Empirical Application: Using a publicly available dataset, we explain three different methods of computing standard errors: the delta method, Krinsky-Robb, and bootstrapping. We provide computer code for Stata 12 and LIMDEP 10/NLOGIT 5. Conclusions: In most applications, the choice of computational method for standard errors of functions of estimated parameters is a matter of convenience. However, when computing standard errors of the sample average of functions that involve both estimated parameters and nonstochastic explanatory variables, it is important to consider the sources of variation in the function's values. PMID:24800304
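Of the three methods discussed, bootstrapping is the easiest to sketch outside Stata or LIMDEP; the following Python example uses fabricated data and an illustrative statistic (the sample mean divided by the sample standard deviation), not the paper's empirical application:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fabricated sample: we want the standard error of a nonlinear function of
# estimated parameters -- here, the ratio of the sample mean to the sample SD.
y = rng.normal(loc=5.0, scale=2.0, size=200)

def statistic(sample):
    return sample.mean() / sample.std(ddof=1)

# Nonparametric bootstrap: resample with replacement, recompute the statistic.
B = 2000
boot = np.array([statistic(rng.choice(y, size=y.size, replace=True))
                 for _ in range(B)])

print(f"point estimate: {statistic(y):.4f}")
print(f"bootstrap SE:   {boot.std(ddof=1):.4f}")
```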
Measurement-only verifiable blind quantum computing with quantum input verification
NASA Astrophysics Data System (ADS)
Morimae, Tomoyuki
2016-10-01
Verifiable blind quantum computing is a secure delegated quantum computing protocol in which a client with limited quantum technology delegates her quantum computing to a server who has a universal quantum computer. The client's privacy is protected (blindness), and the correctness of the computation is verifiable by the client despite her limited quantum technology (verifiability). There are mainly two types of protocols for verifiable blind quantum computing: protocols where the client has only to generate single-qubit states, and protocols where the client needs only the ability to perform single-qubit measurements. The latter is called measurement-only verifiable blind quantum computing. If the input of the client's quantum computation is a quantum state whose efficient classical description is not known to the client, there was previously no way for the measurement-only client to verify the correctness of the input. Here we introduce a protocol of measurement-only verifiable blind quantum computing where the correctness of the quantum input is also verifiable.
Lanczos eigensolution method for high-performance computers
NASA Technical Reports Server (NTRS)
Bostic, Susan W.
1991-01-01
The theory, computational analysis, and applications of a Lanczos algorithm on high-performance computers are presented. The computationally intensive steps of the algorithm are identified as the matrix factorization, the forward/backward equation solution, and the matrix-vector multiplications. These computational steps are optimized to exploit the vector and parallel capabilities of high-performance computers. The savings in computational time from applying optimization techniques such as variable-band and sparse data storage and access, loop unrolling, use of local memory, and compiler directives are presented. Two large-scale structural analysis applications are described: the buckling of a composite blade-stiffened panel with a cutout, and the vibration analysis of a high-speed civil transport. The sequential computational time of 181.6 seconds for the panel problem executed on a CONVEX computer was decreased to 14.1 seconds with the optimized vector algorithm. The best computational time of 23 seconds for the transport problem, with 17,000 degrees of freedom, was obtained on the Cray Y-MP using an average of 3.63 processors.
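The core of the method is the Lanczos three-term recurrence; a bare-bones, unoptimized Python sketch (without the vectorization, reorthogonalization, or sparse storage discussed in the paper) looks like this:

```python
import numpy as np

def lanczos(A, k, rng=np.random.default_rng(0)):
    """Basic Lanczos: build a k-step tridiagonal approximation T of symmetric A."""
    n = A.shape[0]
    Q = np.zeros((n, k))
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    q = rng.standard_normal(n)
    Q[:, 0] = q / np.linalg.norm(q)
    for j in range(k):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            Q[:, j + 1] = w / beta[j]
    return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

# Ritz values (eigenvalues of T) approximate A's extreme eigenvalues first.
A = np.diag(np.arange(1.0, 101.0))        # symmetric test matrix
T = lanczos(A, k=30)
print(np.linalg.eigvalsh(T)[-3:])          # largest Ritz values approach A's largest eigenvalues
```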
ABSENTEE COMPUTATIONS IN A MULTIPLE-ACCESS COMPUTER SYSTEM.
Some computations do not require user interaction, and the user may therefore want to run them 'absentee' (that is, with the user not present). A mechanism is presented which provides for the handling of absentee computations in a multiple-access computer system. The design is intended to be implementation-independent. Some novel features of the system's design are: a user can switch computations from interactive to absentee (and vice versa), the system can
ERIC Educational Resources Information Center
Sarfo, Frederick Kwaku; Amankwah, Francis; Konin, Daniel
2017-01-01
The study is aimed at investigating (1) the level of computer self-efficacy among public senior high school (SHS) teachers in Ghana and (2) the influence of teachers' age, gender, and computer experience on their computer self-efficacy. Four hundred and seven (407) SHS teachers participated in the study. The "Computer Self-Efficacy"…
Infrastructures for Distributed Computing: the case of BESIII
NASA Astrophysics Data System (ADS)
Pellegrino, J.
2018-05-01
BESIII is an electron-positron collision experiment hosted at BEPCII in Beijing and aimed at investigating tau-charm physics. BESIII has now been running for several years and has gathered more than 1 PB of raw data. In order to analyze these data and perform massive Monte Carlo simulations, a large amount of computing and storage resources is needed. The distributed computing system is based upon DIRAC and has been in production since 2012. It integrates computing and storage resources from different institutes and a variety of resource types such as cluster, grid, cloud and volunteer computing. About 15 sites from the BESIII Collaboration all over the world have joined this distributed computing infrastructure, making a significant contribution to the IHEP computing facility. Nowadays cloud computing plays a key role in the HEP computing field, due to its scalability and elasticity. Cloud infrastructures take advantage of several tools, such as VMDirac, to manage virtual machines through cloud managers according to the job requirements. With the virtually unlimited resources of commercial clouds, computing capacity can scale accordingly to deal with burst demands. General computing models discussed in the talk are addressed herewith, with particular focus on the BESIII infrastructure. Moreover, new computing tools and upcoming infrastructures are also addressed.
Asah, Flora
2013-04-01
This study discusses factors inhibiting computer usage for work-related tasks among computer-literate professional nurses within rural healthcare facilities in South Africa. In the past two decades computer literacy courses have not been part of the nursing curricula; computer courses are offered by the State Information Technology Agency. Despite this, there seems to be limited use of computers by professional nurses in the rural context. Focus group interviews were held with 40 professional nurses from three government hospitals in northern KwaZulu-Natal. Contributing factors were found to be a lack of information technology infrastructure, restricted access to computers, and deficits in technical and nursing-management support. The physical location of computers within the health-care facilities and a lack of relevant software emerged as specific obstacles to usage. Provision of continuous and active support from nursing management could positively influence computer usage among professional nurses. A closer integration of information technology and computer literacy skills into existing nursing curricula would foster a positive attitude towards computer usage through early exposure. Responses indicated that a change of mindset may be needed on the part of nursing management so that they begin to actively promote ready access to computers as a means of creating greater professionalism and collegiality. © 2011 Blackwell Publishing Ltd.
Partitioning sparse matrices with eigenvectors of graphs
NASA Technical Reports Server (NTRS)
Pothen, Alex; Simon, Horst D.; Liou, Kang-Pu
1990-01-01
The problem of computing a small vertex separator in a graph arises in the context of computing a good ordering for the parallel factorization of sparse, symmetric matrices. An algebraic approach for computing vertex separators is considered in this paper. It is shown that lower bounds on separator sizes can be obtained in terms of the eigenvalues of the Laplacian matrix associated with a graph. The Laplacian eigenvectors of grid graphs can be computed from Kronecker products involving the eigenvectors of path graphs, and these eigenvectors can be used to compute good separators in grid graphs. A heuristic algorithm is designed to compute a vertex separator in a general graph by first computing an edge separator in the graph from an eigenvector of the Laplacian matrix, and then using a maximum matching in a subgraph to compute the vertex separator. Results on the quality of the separators computed by the spectral algorithm are presented, and these are compared with separators obtained from other algorithms for computing separators. Finally, the time required to compute the Laplacian eigenvector is reported, and the accuracy with which the eigenvector must be computed to obtain good separators is considered. The spectral algorithm has the advantage that it can be implemented on a medium-size multiprocessor in a straightforward manner.
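The spectral idea of computing an edge separator from a Laplacian eigenvector (the Fiedler vector) can be sketched for a small grid graph in a few lines of NumPy; this illustrates the general technique, not the paper's heuristic vertex-separator algorithm:

```python
import numpy as np

def laplacian_of_grid(rows, cols):
    """Graph Laplacian of a rows x cols grid graph (4-neighbor connectivity)."""
    n = rows * cols
    L = np.zeros((n, n))
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            for dr, dc in ((0, 1), (1, 0)):
                rr, cc = r + dr, c + dc
                if rr < rows and cc < cols:
                    j = rr * cols + cc
                    L[i, i] += 1; L[j, j] += 1
                    L[i, j] -= 1; L[j, i] -= 1
    return L

L = laplacian_of_grid(4, 8)
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]                      # eigenvector of the second-smallest eigenvalue

# Edge separator: split vertices at the median value of the Fiedler vector.
part = fiedler > np.median(fiedler)
print("partition sizes:", part.sum(), (~part).sum())
```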
NASA Astrophysics Data System (ADS)
Clementi, Enrico
2012-06-01
This is the introductory chapter to the AIP Proceedings volume "Theory and Applications of Computational Chemistry: The First Decade of the Second Millennium," in which we discuss the evolution of computational chemistry. Very early variational computational chemistry developments are reported in Sections 1 to 7, 11, and 12 by recalling some of the computational chemistry contributions by the author and his collaborators (from the late 1950s to the mid 1990s); perturbation techniques are not considered in this already extended work. Present-day computational chemistry is partly considered in Sections 8 to 10, where more recent studies by the author and his collaborators are discussed, including the Hartree-Fock-Heitler-London method; a more general discussion of present-day computational chemistry is presented in Section 14. The following chapters of this AIP volume provide a view of modern computational chemistry. Future computational chemistry developments can be extrapolated from the chapters of this AIP volume; further, Sections 13 and 15 present an overall analysis of computational chemistry, obtained from the Global Simulation approach, by considering the evolution of scientific knowledge confronted with the opportunities offered by modern computers.
Defining Computational Thinking for Mathematics and Science Classrooms
NASA Astrophysics Data System (ADS)
Weintrop, David; Beheshti, Elham; Horn, Michael; Orton, Kai; Jona, Kemi; Trouille, Laura; Wilensky, Uri
2016-02-01
Science and mathematics are becoming computational endeavors. This fact is reflected in the recently released Next Generation Science Standards and the decision to include "computational thinking" as a core scientific practice. With this addition, and the increased presence of computation in mathematics and scientific contexts, a new urgency has come to the challenge of defining computational thinking and providing a theoretical grounding for what form it should take in school science and mathematics classrooms. This paper presents a response to this challenge by proposing a definition of computational thinking for mathematics and science in the form of a taxonomy consisting of four main categories: data practices, modeling and simulation practices, computational problem solving practices, and systems thinking practices. In formulating this taxonomy, we draw on the existing computational thinking literature, interviews with mathematicians and scientists, and exemplary computational thinking instructional materials. This work was undertaken as part of a larger effort to infuse computational thinking into high school science and mathematics curricular materials. In this paper, we argue for the approach of embedding computational thinking in mathematics and science contexts, present the taxonomy, and discuss how we envision the taxonomy being used to bring current educational efforts in line with the increasingly computational nature of modern science and mathematics.
Richard Feynman and computation
NASA Astrophysics Data System (ADS)
Hey, Tony
1999-04-01
The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.
Computer literacy and attitudes towards e-learning among first year medical students
Link, Thomas Michael; Marz, Richard
2006-01-01
Background: At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. Methods: The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. Results: While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Conclusion: Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes. PMID:16784524
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
The Second SIAM Conference on Computational Science and Engineering was held in San Diego from February 10-12, 2003. Total conference attendance was 553. This is a 23% increase in attendance over the first conference. The focus of this conference was to draw attention to the tremendous range of major computational efforts on large problems in science and engineering, to promote the interdisciplinary culture required to meet these large-scale challenges, and to encourage the training of the next generation of computational scientists. Computational Science & Engineering (CS&E) is now widely accepted, along with theory and experiment, as a crucial third mode of scientific investigation and engineering design. Aerospace, automotive, biological, chemical, semiconductor, and other industrial sectors now rely on simulation for technical decision support. For federal agencies also, CS&E has become an essential support for decisions on resources, transportation, and defense. CS&E is, by nature, interdisciplinary. It grows out of physical applications and it depends on computer architecture, but at its heart are powerful numerical algorithms and sophisticated computer science techniques. From an applied mathematics perspective, much of CS&E has involved analysis, but the future surely includes optimization and design, especially in the presence of uncertainty. Another mathematical frontier is the assimilation of very large data sets through such techniques as adaptive multi-resolution, automated feature search, and low-dimensional parameterization. The themes of the 2003 conference included, but were not limited to: Advanced Discretization Methods; Computational Biology and Bioinformatics; Computational Chemistry and Chemical Engineering; Computational Earth and Atmospheric Sciences; Computational Electromagnetics; Computational Fluid Dynamics; Computational Medicine and Bioengineering; Computational Physics and Astrophysics; Computational Solid Mechanics and Materials; CS&E Education; Meshing and Adaptivity; Multiscale and Multiphysics Problems; Numerical Algorithms for CS&E; Discrete and Combinatorial Algorithms for CS&E; Inverse Problems; Optimal Design, Optimal Control, and Inverse Problems; Parallel and Distributed Computing; Problem-Solving Environments; Software and Middleware Systems; Uncertainty Estimation and Sensitivity Analysis; and Visualization and Computer Graphics.
Volunteered Cloud Computing for Disaster Management
NASA Astrophysics Data System (ADS)
Evans, J. D.; Hao, W.; Chettri, S. R.
2014-12-01
Disaster management relies increasingly on interpreting earth observations and running numerical models; which require significant computing capacity - usually on short notice and at irregular intervals. Peak computing demand during event detection, hazard assessment, or incident response may exceed agency budgets; however some of it can be met through volunteered computing, which distributes subtasks to participating computers via the Internet. This approach has enabled large projects in mathematics, basic science, and climate research to harness the slack computing capacity of thousands of desktop computers. This capacity is likely to diminish as desktops give way to battery-powered mobile devices (laptops, smartphones, tablets) in the consumer market; but as cloud computing becomes commonplace, it may offer significant slack capacity -- if its users are given an easy, trustworthy mechanism for participating. Such a "volunteered cloud computing" mechanism would also offer several advantages over traditional volunteered computing: tasks distributed within a cloud have fewer bandwidth limitations; granular billing mechanisms allow small slices of "interstitial" computing at no marginal cost; and virtual storage volumes allow in-depth, reversible machine reconfiguration. Volunteered cloud computing is especially suitable for "embarrassingly parallel" tasks, including ones requiring large data volumes: examples in disaster management include near-real-time image interpretation, pattern / trend detection, or large model ensembles. In the context of a major disaster, we estimate that cloud users (if suitably informed) might volunteer hundreds to thousands of CPU cores across a large provider such as Amazon Web Services. To explore this potential, we are building a volunteered cloud computing platform and targeting it to a disaster management context. Using a lightweight, fault-tolerant network protocol, this platform helps cloud users join parallel computing projects; automates reconfiguration of their virtual machines; ensures accountability for donated computing; and optimizes the use of "interstitial" computing. Initial applications include fire detection from multispectral satellite imagery and flood risk mapping through hydrological simulations.
Investigation of a computer virus outbreak in the pharmacy of a tertiary care teaching hospital.
Bailey, T C; Reichley, R M
1992-10-01
A computer virus outbreak was recognized, verified, defined, investigated, and controlled using an infection control approach. The pathogenesis and epidemiology of computer virus infection are reviewed. Case-control study. Pharmacy of a tertiary care teaching institution. On October 28, 1991, 2 personal computers in the drug information center manifested symptoms consistent with the "Jerusalem" virus infection. The same day, a departmental personal computer began playing "Yankee Doodle," a sign of "Doodle" virus infection. An investigation of all departmental personal computers identified the "Stoned" virus in an additional personal computer. Controls were functioning virus-free personal computers within the department. Cases were associated with users who brought diskettes from outside the department (5/5 cases versus 5/13 controls, p = .04) and with College of Pharmacy student users (3/5 cases versus 0/13 controls, p = .012). The detection of a virus-infected diskette or personal computer was associated with the number of 5 1/4-inch diskettes in the files of personal computers, a surrogate for rate of media exchange (mean = 17.4 versus 152.5, p = .018, Wilcoxon rank sum test). After education of departmental personal computer users regarding appropriate computer hygiene and installation of virus protection software, no further spread of personal computer viruses occurred, although 2 additional Stoned-infected and 1 Jerusalem-infected diskettes were detected. We recommend that virus detection software be installed on personal computers where the interchange of diskettes among computers is necessary, that write-protect tabs be placed on all program master diskettes and data diskettes where data are being read and not written, that in the event of a computer virus outbreak, all available diskettes be quarantined and scanned by virus detection software, and to facilitate quarantine and scanning in an outbreak, that diskettes be stored in organized files.
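The case-control comparison of outside-diskette use can be approximated with a Fisher exact test on the reported 2x2 counts; the sketch below assumes SciPy and is not part of the original study:

```python
from scipy.stats import fisher_exact

# 2x2 table from the case-control comparison of outside-diskette use:
# rows = cases, controls; columns = brought outside diskettes, did not.
table = [[5, 0],
         [5, 8]]
odds_ratio, p = fisher_exact(table)
print(f"two-sided p = {p:.3f}")   # roughly 0.04, consistent with the reported association
```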
Evaluating Computer Technology Integration in a Centralized School System
ERIC Educational Resources Information Center
Eteokleous, N.
2008-01-01
The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…
ERIC Educational Resources Information Center
Guelph Univ. (Ontario).
This 21-paper collection examines various issues in electronic networking and conferencing with computers, including design issues, conferencing in education, electronic messaging, computer conferencing applications, social issues of computer conferencing, and distributed computer conferencing. In addition to a keynote address, "Computer…