Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-07
... technology, to include computer telecommunications or other electronic means, that the lead agency is... assess the capacity and resources of the public to utilize and maintain an electronic- or computer... the technology, to include computer telecommunications or other electronic means, that the lead agency...
Controlling user access to electronic resources without password
Smith, Fred Hewitt
2015-06-16
Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource proximal environmental information. In at least some embodiments, the process further includes comparing user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
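The granting logic this abstract describes, comparing user-proximal environmental indicia against the indicia pre-associated with the resource and optionally adding a biometric check, can be sketched roughly as below. This is a minimal illustration only: the similarity threshold, field names, and helper functions are assumptions, not the patent's implementation.

```python
# Rough sketch of the comparison-based access decision described in the abstract.
# Threshold, field names, and the similarity measure are illustrative assumptions.
SIMILARITY_THRESHOLD = 0.9

def similarity(user_env: dict, resource_env: dict) -> float:
    """Fraction of pre-associated environmental indicia matched by the user's report."""
    if not resource_env:
        return 0.0
    matches = sum(1 for k, v in resource_env.items() if user_env.get(k) == v)
    return matches / len(resource_env)

def grant_access(user_env, resource_env, user_biometric=None, enrolled_biometric=None):
    """Grant only on a sufficiently similar environment and (if enrolled) a matching biometric."""
    env_ok = similarity(user_env, resource_env) >= SIMILARITY_THRESHOLD
    bio_ok = enrolled_biometric is None or user_biometric == enrolled_biometric
    return env_ok and bio_ok

resource_env = {"wifi_ssid": "LAB-3F", "ambient_light": "low", "gps_cell": "A17"}
print(grant_access({"wifi_ssid": "LAB-3F", "ambient_light": "low", "gps_cell": "A17"},
                   resource_env))   # True: environments match, no biometric enrolled
```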
Mediagraphy: Print and Nonprint Resources.
ERIC Educational Resources Information Center
Educational Media and Technology Yearbook, 1998
1998-01-01
Lists educational media-related journals, books, ERIC documents, journal articles, and nonprint resources classified by Artificial Intelligence, Robotics, Electronic Performance Support Systems; Computer-Assisted Instruction; Distance Education; Educational Research; Educational Technology; Electronic Publishing; Information Science and…
Using Personal Computers To Acquire Special Education Information. Revised. ERIC Digest #429.
ERIC Educational Resources Information Center
ERIC Clearinghouse on Handicapped and Gifted Children, Reston, VA.
This digest offers basic information about resources, available to users of personal computers, in the area of professional development in special education. Two types of resources are described: those that can be purchased on computer diskettes and those made available by linking personal computers through electronic telephone networks. Resources…
Electronic Advocacy and Social Welfare Policy Education
ERIC Educational Resources Information Center
Moon, Sung Seek; DeWeaver, Kevin L.
2005-01-01
The rapid increase in the number of low-cost computers, the proliferation of user-friendly software, and the development of electronic networks have created the "informatics era." The Internet is a rapidly growing communication resource that is becoming mainstream in American society. Computer-based electronic political advocacy by social…
Murphy, Andrea L; Fleming, Mark; Martin-Misener, Ruth; Sketris, Ingrid S; MacCara, Mary; Gass, David
2006-01-01
Background Keeping current with drug therapy information is challenging for health care practitioners. Technologies are often implemented to facilitate access to current and credible drug information sources. In the Canadian province of Nova Scotia, legislation was passed in 2002 to allow nurse practitioners (NPs) to practice collaboratively with physician partners. The purpose of this study was to determine the current utilization patterns of information technologies by these groups of practitioners. Methods Nurse practitioners and their collaborating physician partners in Nova Scotia were sent a survey in February 2005 to determine the frequency of use, usefulness, accessibility, credibility, and current/timeliness of personal digital assistant (PDA), computer, and print drug information resources. Two surveys were developed (one for PDA users and one for computer users) and revised based on a literature search, stakeholder consultation, and pilot-testing results. A second distribution to nonresponders occurred two weeks following the first. Data were entered and analysed with SPSS. Results Twenty-seven (14 NPs and 13 physicians) of 36 (75%) recipients responded. 22% (6) returned personal digital assistant (PDA) surveys. Respondents reported print, health professionals, and online/electronic resources as the most to least preferred means to access drug information, respectively. 37% and 35% of respondents reported using "both print and electronic but print more than electronic" and "print only", respectively, to search monograph-related drug information queries whereas 4% reported using "PDA only". Analysis of respondent ratings for all resources in the categories print, health professionals and other, and online/electronic resources, indicated that the Compendium of Pharmaceuticals and Specialties and pharmacists ranked highly for frequency of use, usefulness, accessibility, credibility, and current/timeliness by both groups of practitioners. Respondents' preferences and resource ratings were consistent with self-reported methods for conducting drug information queries. Few differences existed between NP and physician rankings of resources. Conclusion The use of computers and PDAs remains limited, which is also consistent with preferred and frequent use of print resources. Education for these practitioners regarding available electronic drug information resources may facilitate future computer and PDA use. Further research is needed to determine methods to increase computer and PDA use and whether these technologies affect prescribing and patient outcomes. PMID:16822323
Hybrid quantum-classical hierarchy for mitigation of decoherence and determination of excited states
DOE Office of Scientific and Technical Information (OSTI.GOV)
McClean, Jarrod R.; Kimchi-Schwartz, Mollie E.; Carter, Jonathan
Using quantum devices supported by classical computational resources is a promising approach to quantum-enabled computation. One powerful example of such a hybrid quantum-classical approach optimized for classically intractable eigenvalue problems is the variational quantum eigensolver, built to utilize quantum resources for the solution of eigenvalue problems and optimizations with minimal coherence time requirements by leveraging classical computational resources. These algorithms have been placed as leaders among the candidates for the first to achieve supremacy over classical computation. Here, we provide evidence for the conjecture that variational approaches can automatically suppress even nonsystematic decoherence errors by introducing an exactly solvable channel model of variational state preparation. Moreover, we develop a more general hierarchy of measurement and classical computation that allows one to obtain increasingly accurate solutions by leveraging additional measurements and classical resources. In conclusion, we demonstrate numerically on a sample electronic system that this method both allows for the accurate determination of excited electronic states as well as reduces the impact of decoherence, without using any additional quantum coherence time or formal error-correction codes.
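As a rough illustration of the hybrid quantum-classical loop described above (a classical optimizer steering a parameterized state preparation whose energy a quantum device would estimate), here is a minimal sketch. The toy 2x2 Hamiltonian, the single-parameter ansatz, and the use of scipy.optimize are assumptions standing in for a real quantum backend, not details from the paper.

```python
# Minimal sketch of a variational quantum eigensolver loop (illustrative only).
# A real implementation estimates the energy by repeated measurements on a quantum
# device; here a classical stand-in for a single-qubit problem is used instead.
import numpy as np
from scipy.optimize import minimize

H = np.array([[1.0, 0.5],
              [0.5, -1.0]])              # toy 2x2 Hamiltonian (assumed example)

def ansatz_state(theta):
    """Parameterized trial state |psi(theta)> = (cos(theta/2), sin(theta/2))."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(params):
    """Expectation value <psi|H|psi>, the quantity the quantum device would return."""
    psi = ansatz_state(params[0])
    return float(psi @ H @ psi)

result = minimize(energy, x0=[0.1], method="COBYLA")     # classical outer loop
print("variational estimate:", result.fun)
print("exact lowest eigenvalue:", np.linalg.eigvalsh(H)[0])
```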
Electronics Environmental Benefits Calculator
The Electronics Environmental Benefits Calculator (EEBC) was developed to assist organizations in estimating the environmental benefits of greening their purchase, use and disposal of electronics. The EEBC estimates the environmental and economic benefits of: purchasing Electronic Product Environmental Assessment Tool (EPEAT)-registered products; enabling power management features on computers and monitors above default percentages; extending the life of equipment beyond baseline values; reusing computers, monitors and cell phones; and recycling computers, monitors, cell phones and loads of mixed electronic products. The EEBC may be downloaded as a Microsoft Excel spreadsheet. See https://www.federalelectronicschallenge.net/resources/bencalc.htm for more details.
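The kind of tally a calculator such as the EEBC performs reduces to multiplying counts of greening actions by per-unit benefit factors and summing. The sketch below illustrates that structure only; the factor values and category names are hypothetical placeholders, not EEBC figures.

```python
# Illustrative sketch of the benefit tally a calculator like the EEBC performs.
# The per-unit factors are hypothetical placeholders, not EEBC values.
BENEFIT_FACTORS_KG_CO2E = {              # assumed: kg CO2-equivalent avoided per unit
    "epeat_computer_purchased": 100.0,
    "computer_power_managed": 50.0,
    "computer_reused": 200.0,
    "computer_recycled": 25.0,
}

def estimated_benefit(action_counts: dict) -> float:
    """Sum avoided emissions over each greening action (count times per-unit factor)."""
    return sum(BENEFIT_FACTORS_KG_CO2E[action] * count
               for action, count in action_counts.items())

print(estimated_benefit({"epeat_computer_purchased": 120,
                         "computer_recycled": 80}))      # 14000.0 kg CO2e (hypothetical)
```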
JSC earth resources data analysis capabilities available to EOD revision B
NASA Technical Reports Server (NTRS)
1974-01-01
A list and summary description of all Johnson Space Center electronic laboratory and photographic laboratory capabilities available to earth resources division personnel for processing earth resources data are provided. The electronic capabilities pertain to those facilities and systems that use electronic and/or photographic products as output. The photographic capabilities pertain to equipment that uses photographic images as input and produces electronic and/or photographic products as output; a table summarizes the processing steps. A general hardware description is presented for each of the data processing systems, and the titles of computer programs are used to identify the capabilities and data flow.
Electronic Journals in Academic Libraries: A Comparison of ARL and Non-ARL Libraries.
ERIC Educational Resources Information Center
Shemberg, Marian; Grossman, Cheryl
1999-01-01
Describes a survey dealing with academic library provision of electronic journals and other electronic resources that compared ARL (Association of Research Libraries) members to non-ARL members. Highlights include full-text electronic journals; computers in libraries; online public access catalogs; interlibrary loan and electronic reserves; access…
Document Delivery: An Annotated Selective Bibliography.
ERIC Educational Resources Information Center
Khalil, Mounir A.; Katz, Suzanne R.
1992-01-01
Presents a selective annotated bibliography of 61 items that deal with topics related to document delivery, including networks; hypertext; interlibrary loan; computer security; electronic publishing; copyright; online catalogs; resource sharing; electronic mail; electronic libraries; optical character recognition; microcomputers; liability issues;…
ERIC Educational Resources Information Center
McAnear, Anita
2006-01-01
When we planned the editorial calendar with the topic ubiquitous computing, we were thinking of ubiquitous computing as the one-to-one ratio of computers to students and teachers and 24/7 access to electronic resources. At the time, we were aware that ubiquitous computing in the computer science field had more to do with wearable computers. Our…
Relate@IU>>>Share@IU: A New and Different Computer-Based Communications Paradigm.
ERIC Educational Resources Information Center
Frick, Theodore W.; Roberto, Joseph; Korkmaz, Ali; Oh, Jeong-En; Twal, Riad
The purpose of this study was to examine problems with the current computer-based electronic communication systems and to initially test and revise a new and different paradigm for e-collaboration, Relate@IU. Understanding the concept of sending links to resources, rather than sending the resource itself, is at the core of how Relate@IU differs…
Reconfigurable Computing Concepts for Space Missions: Universal Modular Spares
NASA Technical Reports Server (NTRS)
Patrick, M. Clinton
2007-01-01
Computing hardware for control, data collection, and other purposes will prove, many times over, to be a crucial resource in NASA's upcoming space missions. The ability to provide these resources within mission payload requirements, with the hardiness to operate for extended periods under potentially harsh conditions in off-World environments, is daunting enough without considering the possibility of doing so with conventional electronics. This paper examines some ideas and options, and proposes some initial approaches, for logical design of reconfigurable computing resources offering true modularity, universal compatibility, and unprecedented flexibility to service all forms and needs of mission infrastructure.
Stiltner, G.J.
1990-01-01
In 1987, the Water Resources Division of the U.S. Geological Survey undertook three pilot projects to evaluate electronic report processing systems as a means to improve the quality and timeliness of reports pertaining to water resources investigations. The three projects selected for study included the use of the following configuration of software and hardware: Ventura Publisher software on an IBM model AT personal computer, PageMaker software on a Macintosh computer, and FrameMaker software on a Sun Microsystems workstation. The following assessment criteria were to be addressed in the pilot studies: The combined use of text, tables, and graphics; analysis of time; ease of learning; compatibility with the existing minicomputer system; and technical limitations. It was considered essential that the camera-ready copy produced be in a format suitable for publication. Visual improvement alone was not a consideration. This report consolidates and summarizes the findings of the electronic report processing pilot projects. Text and table files originating on the existing minicomputer system were successfully transformed to the electronic report processing systems in American Standard Code for Information Interchange (ASCII) format. Graphics prepared using a proprietary graphics software package were transferred to all the electronic report processing software through the use of Computer Graphic Metafiles. Graphics from other sources were entered into the systems by scanning paper images. Comparative analysis of time needed to process text and tables by the electronic report processing systems and by conventional methods indicated that, although more time is invested in creating the original page composition for an electronically processed report, substantial time is saved in producing subsequent reports because the format can be stored and re-used by electronic means as a template. Because of the more compact page layouts, costs of printing the reports were 15% to 25% less than costs of printing the reports prepared by conventional methods. Because the largest report workload in the offices conducting water resources investigations is preparation of Water-Resources Investigations Reports, Open-File Reports, and annual State Data Reports, the pilot studies only involved these projects. (USGS)
Mediagraphy: Print and Nonprint Resources.
ERIC Educational Resources Information Center
Educational Media and Technology Yearbook, 1999
1999-01-01
Provides annotated listings for current journals, books, ERIC documents, articles, and nonprint resources in the following categories: artificial intelligence/robotics/electronic performance support systems; computer-assisted instruction; distance education; educational research; educational technology; information science and technology;…
Controlling user access to electronic resources without password
Smith, Fred Hewitt
2017-08-22
Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes obtaining an image from a communication device of a user. An individual and a landmark are identified within the image. Determinations are made that the individual is the user and that the landmark is a predetermined landmark. Access to a restricted computing resource is granted based on the determining that the individual is the user and that the landmark is the predetermined landmark. Other embodiments are disclosed.
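A rough sketch of the decision flow in this abstract, granting access only when both the individual and the landmark in a user-supplied image are recognized, is given below; the recognizer functions are hypothetical stubs, not the patented method.

```python
# Illustrative decision flow only; identify_person and identify_landmark stand in
# for whatever recognition techniques an actual implementation would use.
def identify_person(image: dict) -> str:
    return image.get("person", "")            # hypothetical stub

def identify_landmark(image: dict) -> str:
    return image.get("landmark", "")          # hypothetical stub

def grant_access(image: dict, expected_user: str, predetermined_landmark: str) -> bool:
    """Grant only when both the individual and the landmark checks pass."""
    return (identify_person(image) == expected_user
            and identify_landmark(image) == predetermined_landmark)

image = {"person": "alice", "landmark": "office_mural"}
print(grant_access(image, "alice", "office_mural"))      # True
print(grant_access(image, "alice", "front_gate"))        # False: wrong landmark
```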
Mediagraphy: Print and Nonprint Resources.
ERIC Educational Resources Information Center
Educational Media and Technology Yearbook, 1996
1996-01-01
This annotated list includes media-related resources classified under the following headings: artificial intelligence and robotics, CD-ROM, computer-assisted instruction, databases and online searching, distance education, educational research, educational technology, electronic publishing, information science and technology, instructional design…
Mediagraphy: Print and Nonprint Resources.
ERIC Educational Resources Information Center
Educational Media and Technology Yearbook, 1997
1997-01-01
This annotated list includes media-related resources classified under the following headings: artificial intelligence and robotics, CD-ROM, computer-assisted instruction, databases and online searching, distance education, educational research, educational technology, electronic publishing, information science and technology, instructional design…
ERIC Educational Resources Information Center
Blodgett, Teresa; Repman, Judi
1995-01-01
Addresses the necessity of incorporating new computer technologies into school library resource centers and notes some administrative challenges. An extensive checklist is provided for assessing equipment and furniture needs, physical facilities, and rewiring needs. A glossary of 20 terms and 11 additional resources is included. (AEF)
ERIC Educational Resources Information Center
Henninger, Jessamyn; Aber, Susan Ward
2010-01-01
Systems Architects and Information Technology administrators working in higher education help faculty, staff, and student computer users. Yet, who helps them? What resources do these professionals value? A case study was conducted using purposeful sampling and data collection through electronic interview to gather the preferred information-seeking…
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud.
Cianfrocco, Michael A; Leschziner, Andres E
2015-05-08
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available 'off-the-shelf' computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16-480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM.
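To make the quoted $50 to $1500 per structure concrete, the underlying arithmetic is simply instances x hours x hourly price; the numbers in the sketch below are hypothetical, not figures from the paper.

```python
# Hypothetical cost arithmetic for a cloud cryo-EM reconstruction; the hourly price
# and wall-clock time are placeholders, not values reported in the study.
def cloud_cost(n_instances: int, hours: float, price_per_instance_hour: float) -> float:
    """Total spend = number of instances x wall-clock hours x hourly price."""
    return n_instances * hours * price_per_instance_hour

print(cloud_cost(n_instances=4, hours=24, price_per_instance_hour=1.50))   # 144.0 dollars
```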
HyperCard K-12: Classroom Computer Learning Special Supplement Sponsored by Apple Computer.
ERIC Educational Resources Information Center
Classroom Computer Learning, 1989
1989-01-01
Follows the development of hypertext which is the electronic movement of large amounts of text. Probes the use of the Macintosh HyperCard and its applications in education. Notes programs are stackable in the computer. Provides tool, resource, and stack directory along with tips for using HyperCard. (MVL)
ERIC Educational Resources Information Center
Dillon, Martin; And Others
The Online Computer Library Center Internet Resource project focused on the nature of electronic textual information available through remote access using the Internet and the problems associated with creating machine-readable cataloging (MARC) records for these objects using current USMARC format for computer files and "Anglo-American…
The 3d International Workshop on Computational Electronics
NASA Astrophysics Data System (ADS)
Goodnick, Stephen M.
1994-09-01
The Third International Workshop on Computational Electronics (IWCE) was held at the Benson Hotel in downtown Portland, Oregon, on May 18, 19, and 20, 1994. The workshop was devoted to a broad range of topics in computational electronics related to the simulation of electronic transport in semiconductors and semiconductor devices, particularly those which use large computational resources. The workshop was supported by the National Science Foundation (NSF), the Office of Naval Research, and the Army Research Office, with local support from the Oregon Joint Graduate Schools of Engineering and the Oregon Center for Advanced Technology Education. There were over 100 participants in the Portland workshop, of whom more than one quarter represented research groups outside the United States, from Austria, Canada, France, Germany, Italy, Japan, Switzerland, and the United Kingdom. A total of 81 papers were presented at the workshop: 9 invited talks, 26 oral presentations, and 46 poster presentations. The emphasis of the contributions reflected the interdisciplinary nature of computational electronics, with researchers from the Chemistry, Computer Science, Mathematics, Engineering, and Physics communities participating in the workshop.
Sarpeshkar, R
2014-03-28
We analyse the pros and cons of analog versus digital computation in living cells. Our analysis is based on fundamental laws of noise in gene and protein expression, which set limits on the energy, time, space, molecular count and part-count resources needed to compute at a given level of precision. We conclude that analog computation is significantly more efficient in its use of resources than deterministic digital computation even at relatively high levels of precision in the cell. Based on this analysis, we conclude that synthetic biology must use analog, collective analog, probabilistic and hybrid analog-digital computational approaches; otherwise, even relatively simple synthetic computations in cells such as addition will exceed energy and molecular-count budgets. We present schematics for efficiently representing analog DNA-protein computation in cells. Analog electronic flow in subthreshold transistors and analog molecular flux in chemical reactions obey Boltzmann exponential laws of thermodynamics and are described by astoundingly similar logarithmic electrochemical potentials. Therefore, cytomorphic circuits can help to map circuit designs between electronic and biochemical domains. We review recent work that uses positive-feedback linearization circuits to architect wide-dynamic-range logarithmic analog computation in Escherichia coli using three transcription factors, nearly two orders of magnitude more efficient in parts than prior digital implementations.
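The "astoundingly similar logarithmic electrochemical potentials" referred to above can be made explicit with two standard relations, quoted here from general device physics and thermodynamics rather than from the paper: the subthreshold transistor current and the chemical potential of a dilute species.

```latex
% Subthreshold (weak-inversion) drain current is exponential in gate voltage,
% so the voltage is logarithmic in the current:
I_D \approx I_0\, e^{\kappa V_{GS}/U_T}, \qquad U_T = \frac{k_B T}{q}, \qquad
V_{GS} \approx \frac{U_T}{\kappa}\,\ln\frac{I_D}{I_0}.
% Chemical potential of a dilute species at concentration c (reference c_0):
\mu = \mu_0 + k_B T\,\ln\frac{c}{c_0}.
```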
Sarpeshkar, R.
2014-01-01
We analyse the pros and cons of analog versus digital computation in living cells. Our analysis is based on fundamental laws of noise in gene and protein expression, which set limits on the energy, time, space, molecular count and part-count resources needed to compute at a given level of precision. We conclude that analog computation is significantly more efficient in its use of resources than deterministic digital computation even at relatively high levels of precision in the cell. Based on this analysis, we conclude that synthetic biology must use analog, collective analog, probabilistic and hybrid analog–digital computational approaches; otherwise, even relatively simple synthetic computations in cells such as addition will exceed energy and molecular-count budgets. We present schematics for efficiently representing analog DNA–protein computation in cells. Analog electronic flow in subthreshold transistors and analog molecular flux in chemical reactions obey Boltzmann exponential laws of thermodynamics and are described by astoundingly similar logarithmic electrochemical potentials. Therefore, cytomorphic circuits can help to map circuit designs between electronic and biochemical domains. We review recent work that uses positive-feedback linearization circuits to architect wide-dynamic-range logarithmic analog computation in Escherichia coli using three transcription factors, nearly two orders of magnitude more efficient in parts than prior digital implementations. PMID:24567476
Psychiatrists’ Comfort Using Computers and Other Electronic Devices in Clinical Practice
Fochtmann, Laura J.; Clarke, Diana E.; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K.; Plovnick, Robert M.
2015-01-01
This report highlights findings from the Study of Psychiatrists’ Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists’ comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted. PMID:26667248
Psychiatrists' Comfort Using Computers and Other Electronic Devices in Clinical Practice.
Duffy, Farifteh F; Fochtmann, Laura J; Clarke, Diana E; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K; Plovnick, Robert M
2016-09-01
This report highlights findings from the Study of Psychiatrists' Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists' comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted.
The transforming effect of handheld computers on nursing practice.
Thompson, Brent W
2005-01-01
Handheld computers have the power to transform nursing care. The roots of this power are the shift to decentralization of communication, electronic health records, and nurses' greater need for information at the point of care. This article discusses the effects of handheld resources, calculators, databases, electronic health records, and communication devices on nursing practice. The US government has articulated the necessity of implementing the use of handheld computers in healthcare. Nurse administrators need to encourage and promote the diffusion of this technology, which can reduce costs and improve care.
Infrastructures for Distributed Computing: the case of BESIII
NASA Astrophysics Data System (ADS)
Pellegrino, J.
2018-05-01
BESIII is an electron-positron collision experiment hosted at BEPCII in Beijing and aimed at investigating Tau-Charm physics. BESIII has now been running for several years and has gathered more than 1 PB of raw data. In order to analyze these data and perform massive Monte Carlo simulations, a large amount of computing and storage resources is needed. The distributed computing system is based on DIRAC and has been in production since 2012. It integrates computing and storage resources from different institutes and a variety of resource types such as cluster, grid, cloud, and volunteer computing. About 15 sites from the BESIII Collaboration from all over the world have joined this distributed computing infrastructure, giving a significant contribution to the IHEP computing facility. Nowadays cloud computing is playing a key role in the HEP computing field, due to its scalability and elasticity. Cloud infrastructures take advantage of several tools, such as VMDirac, to manage virtual machines through cloud managers according to the job requirements. With the virtually unlimited resources available from commercial clouds, computing capacity can scale accordingly to deal with any burst in demand. General computing models are discussed here, with particular focus on the BESIII infrastructure; new computing tools and upcoming infrastructures are also addressed.
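The elasticity mentioned above reduces to a simple control rule: start cloud virtual machines when queued jobs exceed running capacity and release them when the queue drains. The sketch below illustrates the rule only; it is not the VMDirac interface, and the function and parameter names are invented for illustration.

```python
# Toy elastic-scaling rule, illustrative of the idea only (not the VMDirac API).
def vms_to_start(queued_jobs: int, running_slots: int,
                 slots_per_vm: int = 8, max_vms: int = 100) -> int:
    """Extra virtual machines needed so that capacity covers the queued workload."""
    deficit = queued_jobs - running_slots
    if deficit <= 0:
        return 0                              # queue fits in current capacity
    needed = -(-deficit // slots_per_vm)      # ceiling division
    return min(needed, max_vms)

print(vms_to_start(queued_jobs=250, running_slots=120))   # 17 extra VMs (deficit of 130 slots)
```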
Teachers and Electronic Mail: Networking on the Network.
ERIC Educational Resources Information Center
Broholm, John R.; Aust, Ronald
1994-01-01
Describes a study that examined the communication patterns of teachers who used UNITE (Unified Network for Informatics in Teacher Education), an electronic mail system designed to encourage curricular collaboration and resource sharing. Highlights include computer-mediated communication, use of UNITE by librarians, and recommendations for…
Progress in Computational Electron-Molecule Collisions
NASA Astrophysics Data System (ADS)
Rescigno, T. N.
1997-10-01
The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud
Cianfrocco, Michael A; Leschziner, Andres E
2015-01-01
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available ‘off-the-shelf’ computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16–480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM. DOI: http://dx.doi.org/10.7554/eLife.06664.001 PMID:25955969
Make It and Take It: Computer-Based Resources for Lesson Planning.
ERIC Educational Resources Information Center
Brown, Tasha; Cargill, Debby; Hostetler, Jan; Joyner, Susan; Phillips, Vanessa
This document is part lesson planner and idea resource and part annotated bibliography of electronic resources. The lesson planner is divided into four parts. Part one, "Tables to Go," contains different tables that can be used for a variety of exercises at all levels of the English-as-a-Second-Language (ESL) classroom. Part two, "Exploring the…
The Electron Microscopy Outreach Program: A Web-based resource for research and education.
Sosinsky, G E; Baker, T S; Hand, G; Ellisman, M H
1999-01-01
We have developed a centralized World Wide Web (WWW)-based environment that serves as a resource of software tools and expertise for biological electron microscopy. A major focus is molecular electron microscopy, but the site also includes information and links on structural biology at all levels of resolution. This site serves to help integrate or link structural biology techniques in accordance with user needs. The WWW site, called the Electron Microscopy (EM) Outreach Program (URL: http://emoutreach.sdsc.edu), provides scientists with computational and educational tools for their research and edification. In particular, we have set up a centralized resource containing course notes, references, and links to image analysis and three-dimensional reconstruction software for investigators wanting to learn about EM techniques either within or outside of their fields of expertise. Copyright 1999 Academic Press.
[Hygienic Regulation of the Use of Electronic Educational Resources in the Modern School].
Stepanova, M I; Aleksandrova, I E; Sazanyuk, Z I; Voronova, B Z; Lashneva, L P; Shumkova, T V; Berezina, N O
2015-01-01
We studied the effect of academic lessons conducted with a notebook computer and an interactive whiteboard on the functional state of schoolchildren. Using a complex of hygienic and physiological methods, we established that regulation of students' computer activity must take into account not only its duration but also its intensity. Design features of the notebook computer were shown both to impede maintaining an optimal working posture in primary school children and to increase the risk of developing disorders of vision and the musculoskeletal system. The interactive whiteboard was found to have an activating influence on performance and to be accompanied by favorable dynamics in indices of the students' functional state, provided the optimal density of the lesson and limits on the duration of its use were maintained. Safety regulations for schoolchildren's work with electronic resources in the educational process are determined.
Electronic Data Interchange: Selected Issues and Trends.
ERIC Educational Resources Information Center
Wigand, Rolf T.; And Others
1993-01-01
Describes electronic data interchange (EDI) as the application-to-application exchange of business documents in a computer-readable format. Topics discussed include EDI in various industries, EDI in finance and banking, organizational impacts of EDI, future EDI markets and organizations, and implications for information resources management.…
Crocodile Technology. [CD-ROM].
ERIC Educational Resources Information Center
2000
This high school physics computer software resource is a systems and control simulator that covers the topics of electricity, electronics, mechanics, and programming. Circuits can easily be simulated on the screen and electronic and mechanical components can be combined. In addition to those provided in Crocodile Technology, a student can create…
Cloud Computing and Your Library
ERIC Educational Resources Information Center
Mitchell, Erik T.
2010-01-01
One of the first big shifts in how libraries manage resources was the move from print-journal purchasing models to database-subscription and electronic-journal purchasing models. Libraries found that this transition helped them scale their resources and provide better service just by thinking a bit differently about their services. Likewise,…
Habib, Komal; Parajuly, Keshav; Wenzel, Henrik
2015-10-20
Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, and the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products, such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system. It is illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that out of the 244 kg of HDDs treated, 212 kg, consisting mainly of aluminum and steel, can ultimately be recovered through the metallurgical process. They also demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products can be a preferable option to shredding. However, it remains a technological and logistic challenge for the existing system.
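For context, the bulk figure quoted above corresponds to roughly 87% recovery by mass (212 kg of 244 kg); the remaining mass, which includes the NdFeB magnet fraction, is where the rare earths are lost. A one-line check:

```python
# One-line check of the bulk mass recovery reported above.
recovered_kg, treated_kg = 212, 244
print(f"bulk mass recovery: {recovered_kg / treated_kg:.1%}")   # ~86.9%
```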
The Internet and World-Wide-Web: Potential Benefits to Rural Schools.
ERIC Educational Resources Information Center
Barker, Bruce O.
The Internet is a decentralized collection of computer networks managed by separate groups using a common set of technical standards. The Internet has tremendous potential as an educational resource by providing access to networking through worldwide electronic mail, various databases, and electronic bulletin boards; collaborative investigation…
Electronic Resources and the Education of History Professionals
ERIC Educational Resources Information Center
Mulligan, William H., Jr.
2001-01-01
The transforming effects of the tremendous advances in technology that have reshaped the economy and many other elements of American society have had an equally profound impact on historical agencies. The personal computer, the Internet, and associated electronic communications developments have already transformed the museum and historical agency…
Integrating Computing Resources: A Shared Distributed Architecture for Academics and Administrators.
ERIC Educational Resources Information Center
Beltrametti, Monica; English, Will
1994-01-01
Development and implementation of a shared distributed computing architecture at the University of Alberta (Canada) are described. Aspects discussed include design of the architecture, users' views of the electronic environment, technical and managerial challenges, and the campuswide human infrastructures needed to manage such an integrated…
NASA Technical Reports Server (NTRS)
Blake, Jean A.
1988-01-01
The NASA Spacelink is an electronic information service operated by the Marshall Space Flight Center. The Spacelink contains extensive NASA news and educational resources that can be accessed by a computer and modem. Updates and information are provided on: current NASA news; aeronautics; space exploration: before the Shuttle; space exploration: the Shuttle and beyond; NASA installations; NASA educational services; materials for classroom use; and space program spinoffs.
NASA Astrophysics Data System (ADS)
Wang, Xi Vincent; Wang, Lihui
2017-08-01
Cloud computing is the new enabling technology that offers centralised computing, flexible data storage and scalable services. In the manufacturing context, it is possible to utilise the Cloud technology to integrate and provide industrial resources and capabilities in terms of Cloud services. In this paper, a function block-based integration mechanism is developed to connect various types of production resources. A Cloud-based architecture is also deployed to offer a service pool which maintains these resources as production services. The proposed system provides a flexible and integrated information environment for the Cloud-based production system. As a specific type of manufacturing, Waste Electrical and Electronic Equipment (WEEE) remanufacturing experiences difficulties in system integration, information exchange and resource management. In this research, WEEE is selected as the example of Internet of Things to demonstrate how the obstacles and bottlenecks are overcome with the help of Cloud-based informatics approach. In the case studies, the WEEE recycle/recovery capabilities are also integrated and deployed as flexible Cloud services. Supporting mechanisms and technologies are presented and evaluated towards the end of the paper.
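As a rough illustration of the function block idea described above, wrapping a production or recycling resource behind a uniform interface and registering it in a Cloud service pool, here is a minimal sketch; the class and method names are invented for illustration and do not reflect the authors' implementation.

```python
# Minimal sketch of a function-block-style wrapper and a Cloud service pool.
# Names and structure are illustrative assumptions, not the paper's system.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class FunctionBlock:
    name: str
    capability: str                        # e.g. "WEEE disassembly", "magnet recovery"
    execute: Callable[[dict], dict]        # event in -> result out

@dataclass
class ServicePool:
    services: Dict[str, FunctionBlock] = field(default_factory=dict)

    def register(self, block: FunctionBlock) -> None:
        self.services[block.name] = block

    def request(self, capability: str, job: dict) -> dict:
        """Dispatch the job to any registered block offering the requested capability."""
        for block in self.services.values():
            if block.capability == capability:
                return block.execute(job)
        raise LookupError(f"no registered service offers {capability!r}")

pool = ServicePool()
pool.register(FunctionBlock("site_A_disassembly", "WEEE disassembly",
                            lambda job: {"status": "disassembled", **job}))
print(pool.request("WEEE disassembly", {"item": "hard disk drive"}))
```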
NASA Technical Reports Server (NTRS)
Thakoor, Anil
1990-01-01
Viewgraphs on electronic neural networks for space station are presented. Topics covered include: electronic neural networks; electronic implementations; VLSI/thin film hybrid hardware for neurocomputing; computations with analog parallel processing; features of neuroprocessors; applications of neuroprocessors; neural network hardware for terrain trafficability determination; a dedicated processor for path planning; neural network system interface; neural network for robotic control; error backpropagation algorithm for learning; resource allocation matrix; global optimization neuroprocessor; and electrically programmable read only thin-film synaptic array.
ERIC Educational Resources Information Center
Bayram, Servet
2005-01-01
The concept of Electronic Performance Support Systems (EPSS) encompasses multimedia or computer-based instruction components that improve human performance by providing process simplification, performance information, and decision support. EPSS has become a hot topic for organizational development, human resources, performance technology,…
... a treatment is discovered, help is available through low-vision aids, including optical, electronic, and computer-based devices. ...
Defense Automation Resources Management Manual
1988-09-01
Electronic Command Signals Programmer, Plugboard Programmers Punch, Card Punch, Paper Tape Reader, Character Reader-Generator, Time Cards Reader...Multiplexor-Shift Register Group Multiplier Panel Control, Plugboard Panel, Interconnection, Digital Computer Panel, Meter-Attenuator, Tape Recorder PC Cards...Perforator, Tape Plug-In Unit Potentiometer, Coefficient, Analog Computer Programmer, Plugboard Punch, Paper Tape Racks Reader, Time Code Reader
Educators and the Internet: What's out There, How to Get Some of It.
ERIC Educational Resources Information Center
Rosenbaum, Howard
1994-01-01
Argues that K-12 media educators and librarians should become vocal advocates for Internet connection in their elementary and secondary schools. Topics include K-12 uses of the Internet; Internet resources; FrEdMail (Free Educational Electronic Mail); and a BITNET computer conference. Appendices list computer conferences, networking contracts, and…
A Collaborative Model for Teaching E-Resources: Northwestern University's Graduate Training Day
ERIC Educational Resources Information Center
Lightman, Harriet; Reingold, Ruth N.
2005-01-01
The authors report on the planning, execution, and future of Northwestern University's Introduction to Electronic Resources/Humanities Computing Training Day, a mandatory one-day set of classes for first-year doctoral students in humanities disciplines. The project is a collaborative effort among the Office of the Dean of the Weinberg College of…
Electronic processing and control system with programmable hardware
NASA Technical Reports Server (NTRS)
Alkalaj, Leon (Inventor); Fang, Wai-Chi (Inventor); Newell, Michael A. (Inventor)
1998-01-01
A computer system with reprogrammable hardware allows dynamic allocation of hardware resources to different functions and adaptability to different processors and different operating platforms. All hardware resources are physically partitioned into system-user hardware and application-user hardware depending on the specific operation requirements. A reprogrammable interface preferably interconnects the system-user hardware and application-user hardware.
Analog Processor To Solve Optimization Problems
NASA Technical Reports Server (NTRS)
Duong, Tuan A.; Eberhardt, Silvio P.; Thakoor, Anil P.
1993-01-01
Proposed analog processor solves "traveling-salesman" problem, considered paradigm of global-optimization problems involving routing or allocation of resources. Includes electronic neural network and auxiliary circuitry based partly on concepts described in "Neural-Network Processor Would Allocate Resources" (NPO-17781) and "Neural Network Solves 'Traveling-Salesman' Problem" (NPO-17807). Processor based on highly parallel computing solves problem in significantly less time.
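Neural-network TSP solvers of this kind typically map the tour onto an energy function of the Hopfield-Tank form, which the analog circuitry then minimizes. The expression below is that standard textbook formulation, given for orientation rather than taken from the cited NASA briefs.

```latex
% Hopfield-Tank energy for an n-city tour; V_{xi} = 1 when city x occupies tour
% position i. The A, B, C terms enforce a valid permutation; the D term is tour length.
E = \frac{A}{2}\sum_{x}\sum_{i}\sum_{j\neq i} V_{xi}V_{xj}
  + \frac{B}{2}\sum_{i}\sum_{x}\sum_{y\neq x} V_{xi}V_{yi}
  + \frac{C}{2}\Bigl(\sum_{x}\sum_{i} V_{xi} - n\Bigr)^{2}
  + \frac{D}{2}\sum_{x}\sum_{y\neq x}\sum_{i} d_{xy}\,V_{xi}\bigl(V_{y,i+1}+V_{y,i-1}\bigr)
```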
Electronic Job Search Revolution. Win with the New Technology that's Reshaping Today's Job Market.
ERIC Educational Resources Information Center
Kennedy, Joyce Lain; Morrow, Thomas J.
This book contains information about the resources available to merge new technology and the search for employment. It offers suggestions from human resource specialists, software authors, and database experts. Chapter 1 is an overview of how the computer has become indispensable in a job search. Chapter 2 focuses on external, third-party resume…
Five Ways to Hack and Cheat with Bring-Your-Own-Device Electronic Examinations
ERIC Educational Resources Information Center
Dawson, Phillip
2016-01-01
Bring-your-own-device electronic examinations (BYOD e-exams) are a relatively new type of assessment where students sit an in-person exam under invigilated conditions with their own laptop. Special software restricts student access to prohibited computer functions and files, and provides access to any resources or software the examiner approves.…
NASA Tech Briefs, May 1995. Volume 19, No. 5
NASA Technical Reports Server (NTRS)
1995-01-01
This issue features a resource report on the Jet Propulsion Laboratory and a special focus on advanced composites and plastics. It also contains articles on electronic components and circuits, electronic systems, physical sciences, computer programs, mechanics, machinery, manufacturing and fabrication, mathematics and information sciences, and life sciences. This issue also contains a supplement on federal laboratory test and measurements.
ERIC Educational Resources Information Center
Anderson, Mary Alice, Ed.
This notebook is a compilation of 53 lesson plans for grades 6-12, written by various authors and focusing on the integration of technology into the curriculum. Lesson plans include topics such as online catalog searching, electronic encyclopedias, CD-ROM databases, exploring the Internet, creating a computer slide show, desktop publishing, and…
ERIC Educational Resources Information Center
Tennant, Roy
1992-01-01
Explains how users can find and access information resources available on the Internet. Highlights include network information centers (NICs); lists, both formal and informal; computer networking protocols, including international standards; electronic mail; remote log-in; and file transfer. (LRW)
ERIC Educational Resources Information Center
Hernandez, Nicolas, Jr.
1988-01-01
Traces the origin of ISAAC (Information System for Advanced Academic Computing) and the development of a languages and linguistics "room" at the University of Washington-Seattle. ISAAC, a free, valuable resource, consists of two databases and an electronic bulletin board spanning broad areas of pedagogical and research fields. (Author/CB)
Hathaway, R.M.; McNellis, J.M.
1989-01-01
Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously. The new approaches and expanded use of computers will require substantial increases in the quantity and sophistication of the Division's computer resources. The requirements presented in this report will be used to develop technical specifications that describe the computer resources needed during the 1990's. (USGS)
Ergonomics in the electronic library.
Thibodeau, P L; Melamut, S J
1995-01-01
New technologies are changing the face of information services and how those services are delivered. Libraries spend a great deal of time planning the hardware and software implementations of electronic information services, but the human factors are often overlooked. Computers and electronic tools have changed the nature of many librarians' daily work, creating new problems, including stress, fatigue, and cumulative trauma disorders. Ergonomic issues need to be considered when designing or redesigning facilities for electronic resources and services. Libraries can prevent some of the common problems that appear in the digital workplace by paying attention to basic ergonomic issues when designing workstations and work areas. Proper monitor placement, lighting, workstation setup, and seating prevent many of the common occupational problems associated with computers. Staff training will further reduce the likelihood of ergonomic problems in the electronic workplace. PMID:7581189
Potential resource and toxicity impacts from metals in waste electronic devices.
Woo, Seung H; Lee, Dae Sung; Lim, Seong-Rin
2016-04-01
As a result of the continuous release of new electronic devices, existing electronic devices are quickly made obsolete and rapidly become electronic waste (e-waste). Because e-waste contains a variety of metals, information about those metals with the potential for substantial environmental impact should be provided to manufacturers, recyclers, and disposers to proactively reduce this impact. This study assesses the resource and toxicity (i.e., cancer, noncancer, and ecotoxicity) potentials of various heavy metals commonly found in e-waste from laptop computers, liquid-crystal display (LCD) monitors, LCD TVs, plasma TVs, color cathode ray tube (CRT) TVs, and cell phones and then evaluates such potentials using life cycle impact-based methods. Resource potentials derive primarily from Cu, Sb, Ag, and Pb. Toxicity potentials derive primarily from Pb, Ni, and Hg for cancer toxicity; from Pb, Hg, Zn, and As for noncancer toxicity; and from Cu, Pb, Hg, and Zn for ecotoxicity. Therefore, managing these heavy metals should be a high priority in the design, recycling, and disposal stages of electronic devices. © 2015 SETAC.
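The life cycle impact-based evaluation mentioned above follows the standard characterization step of life cycle impact assessment: each impact potential is a mass-weighted sum of characterization factors. The general relation, standard LCIA practice rather than a formula reproduced from the paper, is:

```latex
% Impact potential for category c (resource depletion, cancer, non-cancer, ecotoxicity)
% of a device containing mass m_i of metal i, with characterization factor CF_{c,i}:
P_c = \sum_i m_i \cdot \mathrm{CF}_{c,i}
```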
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lincoln, Don
The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.
High-performance scientific computing in the cloud
NASA Astrophysics Data System (ADS)
Jorissen, Kevin; Vila, Fernando; Rehr, John
2011-03-01
Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.
Lincoln, Don
2018-01-16
The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.
Will Allis Prize Talk: Electron Collisions - Experiment, Theory and Applications
NASA Astrophysics Data System (ADS)
Bartschat, Klaus
2016-05-01
Electron collisions with atoms, ions, and molecules represent one of the very early topics of quantum mechanics. In spite of the field's maturity, a number of recent developments in detector technology (e.g., the ``reaction microscope'' or the ``magnetic-angle changer'') and the rapid increase in computational resources have resulted in significant progress in the measurement, understanding, and theoretical/computational description of few-body Coulomb problems. Close collaborations between experimentalists and theorists worldwide continue to produce high-quality benchmark data, which allow for thoroughly testing and further developing a variety of theoretical approaches. As a result, it has now become possible to reliably calculate the vast amount of atomic data needed for detailed modelling of the physics and chemistry of planetary atmospheres, the interpretation of astrophysical data, optimizing the energy transport in reactive plasmas, and many other topics - including light-driven processes, in which electrons are produced by continuous or short-pulse ultra-intense electromagnetic radiation. In this talk, I will highlight some of the recent developments that have had a major impact on the field. This will be followed by showcasing examples, in which accurate electron collision data enabled applications in fields beyond traditional AMO physics. Finally, open problems and challenges for the future will be outlined. I am very grateful for fruitful scientific collaborations with many colleagues, and the long-term financial support by the NSF through the Theoretical AMO and Computational Physics programs, as well as supercomputer resources through TeraGrid and XSEDE.
NASA Astrophysics Data System (ADS)
Bartschat, Klaus
2016-09-01
Electron collisions with atoms, ions, and molecules represent one of the very early topics of quantum mechanics. In spite of the field's maturity, a number of recent developments in detector technology (e.g., the ``reaction microscope'' or the ``magnetic-angle changer'') and the rapid increase in computational resources have resulted in significant progress in the measurement, understanding, and theoretical/computational description of few-body Coulomb problems. Close collaborations between experimentalists and theorists worldwide continue to produce high-quality benchmark data, which allow for thoroughly testing and further developing a variety of theoretical approaches. As a result, it has now become possible to reliably calculate the vast amount of atomic data needed for detailed modelling of the physics and chemistry of planetary atmospheres, the interpretation of astrophysical data, optimizing the energy transport in reactive plasmas, and many other topics - including light-driven processes, in which electrons are produced by continuous or short-pulse ultra-intense electromagnetic radiation. I will highlight some of the recent developments that have had a major impact on the field. This will be followed by showcasing examples, in which accurate electron collision data enabled applications in fields beyond traditional AMO physics. Finally, open problems and challenges for the future will be outlined. I am very grateful for fruitful scientific collaborations with many colleagues, and the long-term financial support by the NSF through the Theoretical AMO and Computational Physics programs, as well as supercomputer resources through TeraGrid and XSEDE.
Towards prediction of correlated material properties using quantum Monte Carlo methods
NASA Astrophysics Data System (ADS)
Wagner, Lucas
Correlated electron systems offer a richness of physics far beyond noninteracting systems. If we would like to pursue the dream of designer correlated materials, or, even to set a more modest goal, to explain in detail the properties and effective physics of known materials, then accurate simulation methods are required. Using modern computational resources, quantum Monte Carlo (QMC) techniques offer a way to directly simulate electron correlations. I will show some recent results on a few extremely challenging materials including the metal-insulator transition of VO2, the ground state of the doped cuprates, and the pressure dependence of magnetic properties in FeSe. By using a relatively simple implementation of QMC, at least some properties of these materials can be described truly from first principles, without any adjustable parameters. Using the QMC platform, we have developed a way of systematically deriving effective lattice models from the simulation. This procedure is particularly attractive for correlated electron systems because the QMC methods treat the one-body and many-body components of the wave function and Hamiltonian on completely equal footing. I will show some examples of using this downfolding technique and the high accuracy of QMC to connect our intuitive ideas about interacting electron systems with high fidelity simulations. The work in this presentation was supported in part by NSF DMR 1206242, the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program under Award Number FG02-12ER46875, and the Center for Emergent Superconductivity, Department of Energy Frontier Research Center under Grant No. DEAC0298CH1088. Computing resources were provided by a Blue Waters Illinois grant and INCITE PhotSuper and SuperMatSim allocations.
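To make the stochastic-sampling idea behind QMC concrete, here is a minimal variational Monte Carlo sketch in Python for a one-dimensional harmonic oscillator with a Gaussian trial wavefunction. It illustrates only the generic workflow (sample configurations with Metropolis moves, average a local energy); it is not the authors' code, and the toy Hamiltonian and trial function are assumptions for illustration.

```python
# Minimal variational Monte Carlo (VMC) sketch -- a generic illustration of the
# stochastic sampling idea behind QMC, not the downfolding workflow of the talk.
# Toy problem: 1D harmonic oscillator, trial wavefunction psi_a(x) = exp(-a*x^2).
import numpy as np

def local_energy(x, a):
    # E_L(x) = -(1/2) psi''/psi + (1/2) x^2 for psi = exp(-a x^2)
    return a + x**2 * (0.5 - 2.0 * a**2)

def vmc_energy(a, n_steps=200_000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x, energies = 0.0, []
    for i in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Metropolis acceptance with probability |psi(x_new)/psi(x)|^2
        if rng.random() < np.exp(-2.0 * a * (x_new**2 - x**2)):
            x = x_new
        if i > n_steps // 10:          # discard equilibration steps
            energies.append(local_energy(x, a))
    return np.mean(energies), np.std(energies) / np.sqrt(len(energies))

for a in (0.3, 0.5, 0.7):              # a = 0.5 is the exact ground state (E = 0.5)
    e, err = vmc_energy(a)
    print(f"a = {a:.1f}:  E = {e:.4f} +/- {err:.4f}")
```

The energy is minimized at the exact variational parameter, which is the same optimization logic that full many-electron QMC codes apply to far richer wavefunctions.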
Robertson, Jane; Moxey, Annette J; Newby, David A; Gillies, Malcolm B; Williamson, Margaret; Pearson, Sallie-Anne
2011-01-01
Background. Investments in eHealth worldwide have been mirrored in Australia, with >90% of general practices computerized. Recent eHealth incentives promote the use of up to date electronic information sources relevant to general practice with flexibility in mode of access. Objective. To determine GPs’ access to and use of electronic information sources and computerized clinical decision support systems (CDSSs) for prescribing. Methods. Semi-structured interviews were conducted with 18 experienced GPs and nine GP trainees in New South Wales, Australia in 2008. A thematic analysis of interview transcripts was undertaken. Results. Information needs varied with clinical experience, and people resources (specialists, GP peers and supervisors for trainees) were often preferred over written formats. Experienced GPs used a small number of electronic resources and accessed them infrequently. Familiarity from training and early clinical practice and easy access were dominant influences on resource use. Practice time constraints meant relevant information needed to be readily accessible during consultations, requiring integration or direct access from prescribing software. Quality of electronic resource content was assumed and cost a barrier for some GPs. Conclusions. The current Australian practice incentives do not prescribe which information resources GPs should use. Without integration into practice computing systems, uptake and routine use seem unlikely. CDSS developments must recognize the time pressures of practice, preference for integration and cost concerns. Minimum standards are required to ensure that high-quality information resources are integrated and regularly updated. Without standards, the anticipated benefits of computerization on patient safety and health outcomes will be uncertain. PMID:21109619
ERIC Educational Resources Information Center
Vail, Kathleen
2003-01-01
Practitioners and researchers in the education technology field asked to give their vision of the future list laptop computers, personal digital assistants, electronic testing, wireless networking, and multimedia technology among the technology advances headed soon for schools. A sidebar lists 12 online resources. (MLF)
ERIC Educational Resources Information Center
Online-Offline, 1998
1998-01-01
Focuses on technology, on advances in such areas as aeronautics, electronics, physics, the space sciences, as well as computers and the attendant progress in medicine, robotics, and artificial intelligence. Describes educational resources for elementary and middle school students, including Web sites, CD-ROMs and software, videotapes, books,…
Introducing Electronic Encyclopedias to Young Children.
ERIC Educational Resources Information Center
Human, Suzanne
1997-01-01
To teach computer skills to kindergartners, classroom teachers and library media specialists can take them on a multimedia field trip to the zoo. Provides a lesson plan that lists library media objectives, resources, instructional roles, activity and procedures for completion, class instructions, evaluation, and follow-up. (PEN)
ERIC Educational Resources Information Center
Center for Best Practices in Early Childhood Education, 2005
2005-01-01
The toolkit contains print and electronic resources, including (1) "eMERGing Literacy and Technology: Working Together", a 492-page curriculum guide; (2) "LitTECH Interactive Presents: The Beginning of Literacy", a DVD that provides an overview linking technology to the concepts of emerging literacy; (3) "Your Preschool Classroom Computer Center:…
An Ongoing Revolution: Resource Sharing and OCLC.
ERIC Educational Resources Information Center
Nevins, Kate
1998-01-01
Discusses early developments in the Online Computer Library Center (OCLC) interlibrary loan, including use of OCLC for verification and request transmittal, improved service to patrons, internal cost control, and effect on work flow and borrowing patterns. Describes advances in OCLC, including internationalization, electronic information access,…
Evaluation of School Library Media Centers: Demonstrating Quality.
ERIC Educational Resources Information Center
Everhart, Nancy
2003-01-01
Discusses ways to evaluate school library media programs and how to demonstrate quality. Topics include how principals evaluate programs; sources of evaluative data; national, state, and local instruments; surveys and interviews; Colorado benchmarks; evaluating the use of electronic resources; and computer reporting options. (LRW)
Implementation and Student Assessment of Intranet-Based Learning Resources.
ERIC Educational Resources Information Center
Sosabowski, Michael H.; Herson, Katie; Lloyd, Andrew W.
1998-01-01
The University of Brighton (England) pharmacy and biomedical sciences school developed an institutional intranet providing course information, Internet links, lecture notes, links to computer-assisted instructional packages, and worksheets. Electronic monitoring of usage and subsequent questionnaire-based evaluation showed the intranet to be a…
ERIC Educational Resources Information Center
Hazari, Sunil I.
1991-01-01
Local area networks (LANs) are systems of computers and peripherals connected together for the purposes of electronic mail and the convenience of sharing information and expensive resources. In planning the design of such a system, the components to consider are hardware, software, transmission media, topology, operating systems, and protocols.…
NASA Technical Reports Server (NTRS)
Rinker, Nancy A.
1994-01-01
The role of librarians today is drastically influenced by the changing nature of information and library services. The museum-like libraries of yesterday are a thing of the past: today's libraries are bustling with life, activity, and the sounds of new technologies. Libraries are replacing their paper card catalogs with state-of-the-art online systems, which provide faster and more comprehensive search capabilities. Even the resources themselves are changing. New formats for information, such as CD-ROM's, are becoming popular for all types of publications, from bibliographic tools to encyclopedias to electronic journals, even replacing print materials completely in some cases. Today it is almost impossible to walk into a library and find the information you need without coming into contact with at least one computer system. Librarians are not only struggling to keep up with the technological advancements of the day, but they are becoming information intermediaries: they must teach library users how to use all of the new systems and electronic resources. Not surprisingly, bibliographic instruction itself has taken on a new look and feel in these electronically advanced libraries. Many libraries are experimenting with the development of expert systems and other computer aided instruction interfaces for teaching patrons how to use the library and its resources. One popular type of interface in library instruction programs is hypertext, which utilizes 'stacks' or linked pages of information. Hypertext stacks can incorporate color graphics along with text to provide a more interesting interface and entice users into trying out the system. Another advantage of hypertext is that it is generally easy to use, even for those unfamiliar with computers. As such, it lends itself well to application in libraries, which often serve a broad range of clientele. This paper will discuss the design, development, and implementation of a hypertext library tour in a special library setting. The library featured in the electronic library tour is the National Aeronautics and Space Administration's Technical Library at Langley Research Center in Hampton, Virginia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Massimo, F., E-mail: francesco.massimo@ensta-paristech.fr; Dipartimento SBAI, Università di Roma “La Sapienza“, Via A. Scarpa 14, 00161 Roma; Atzeni, S.
Architect, a time-explicit hybrid code designed to perform quick simulations for electron-driven plasma wakefield acceleration, is described. In order to obtain beam quality acceptable for applications, control of the beam-plasma dynamics is necessary. Particle-in-Cell (PIC) codes represent the state-of-the-art technique to investigate the underlying physics and possible experimental scenarios; however, PIC codes demand heavy computational resources. The Architect code substantially reduces the need for computational resources by using a hybrid approach: relativistic electron bunches are treated kinetically as in a PIC code and the background plasma as a fluid. Cylindrical symmetry is assumed for the solution of the electromagnetic fields and fluid equations. In this paper both the underlying algorithms and a comparison with a fully three-dimensional particle-in-cell code are reported. The comparison highlights the good agreement between the two models up to the weakly non-linear regimes. In highly non-linear regimes the two models only disagree in a localized region, where the plasma electrons expelled by the bunch close up at the end of the first plasma oscillation.
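For readers unfamiliar with the kinetic side of such hybrid schemes, the following is a minimal one-dimensional electrostatic particle-in-cell loop (charge deposition, Poisson solve, field gather, leapfrog push). It is a generic textbook-style sketch under simplifying assumptions, not the Architect algorithm: there is no fluid background, no bunch driver, and no cylindrical field solver.

```python
# Minimal 1D electrostatic PIC loop -- a generic sketch of the kinetic particle
# treatment that hybrid codes retain for the electron bunch; normalized units.
import numpy as np

ng, L, np_part, dt, steps = 64, 2 * np.pi, 10_000, 0.1, 100
dx = L / ng
rng = np.random.default_rng(1)

x = rng.uniform(0, L, np_part)                    # particle positions
v = 0.1 * np.sin(2 * np.pi * x / L)               # small velocity perturbation
q_per_particle = L / np_part                      # normalized so mean density = 1

k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)

for _ in range(steps):
    # 1) Cloud-in-cell charge deposition onto the grid
    gi = (x / dx).astype(int) % ng
    w = x / dx - (x / dx).astype(int)
    rho = np.zeros(ng)
    np.add.at(rho, gi, (1 - w) * q_per_particle / dx)
    np.add.at(rho, (gi + 1) % ng, w * q_per_particle / dx)
    rho -= rho.mean()                             # neutralizing ion background

    # 2) Solve Poisson's equation d^2 phi / dx^2 = -rho with an FFT
    rho_k = np.fft.rfft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:]**2
    E = -np.gradient(np.fft.irfft(phi_k, n=ng), dx)

    # 3) Gather the field at particle positions and push (electrons: q = -1, m = 1)
    E_part = (1 - w) * E[gi] + w * E[(gi + 1) % ng]
    v -= E_part * dt
    x = (x + v * dt) % L
```

A hybrid code keeps exactly this kind of kinetic loop for the bunch particles, while replacing the plasma macroparticles with fluid equations on the grid, which is where the computational savings come from.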
Papež, Václav; Denaxas, Spiros; Hemingway, Harry
2017-01-01
Electronic Health Records are electronic data generated during or as a byproduct of routine patient care. Structured, semi-structured and unstructured EHR offer researchers unprecedented phenotypic breadth and depth and have the potential to accelerate the development of precision medicine approaches at scale. A main EHR use-case is defining phenotyping algorithms that identify disease status, onset and severity. Phenotyping algorithms utilize diagnoses, prescriptions, laboratory tests, symptoms and other elements in order to identify patients with or without a specific trait. No common standardized, structured, computable format exists for storing phenotyping algorithms. The majority of algorithms are stored as human-readable descriptive text documents, which makes their translation to code challenging due to their inherent complexity and hinders their sharing and re-use across the community. In this paper, we evaluate two key Semantic Web technologies, the Web Ontology Language and the Resource Description Framework, for enabling computable representations of EHR-driven phenotyping algorithms.
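As a concrete illustration of the kind of representation being evaluated, the sketch below (assuming the rdflib package is available) encodes a toy phenotyping rule as RDF triples. The namespace, class, and property names are hypothetical and are not the vocabulary proposed in the paper.

```python
# A small sketch of expressing a phenotyping rule as RDF triples with rdflib.
# The http://example.org/phenotype# vocabulary below is invented for illustration.
from rdflib import Graph, Literal, Namespace, RDF

PHE = Namespace("http://example.org/phenotype#")

g = Graph()
g.bind("phe", PHE)

algo = PHE["Type2DiabetesAlgorithm"]
rule = PHE["Rule1"]

g.add((algo, RDF.type, PHE.PhenotypingAlgorithm))
g.add((algo, PHE.hasRule, rule))
g.add((rule, RDF.type, PHE.DiagnosisCriterion))
g.add((rule, PHE.codingSystem, Literal("ICD-10")))
g.add((rule, PHE.code, Literal("E11")))          # type 2 diabetes mellitus codes
g.add((rule, PHE.minimumOccurrences, Literal(2)))

print(g.serialize(format="turtle"))
```

Because the rule is now a graph rather than free text, it can be queried, validated against an ontology, and shared in a machine-readable form, which is the core argument for OWL/RDF-based phenotype definitions.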
P3 DESIGN OF A NATIONAL ELECTRONICS PRODUCT REUSE AND RECYCLING SYSTEM
Material and resource conservation are critical to sustainability; and, the ability to efficiently and effectively recover old products for reuse and recycle is an essential element in these conservation efforts. In California alone, it has been estimated that 10,000 computers a...
Autonomous self-organizing resource manager for multiple networked platforms
NASA Astrophysics Data System (ADS)
Smith, James F., III
2002-08-01
A fuzzy logic based expert system for resource management has been developed that automatically allocates electronic attack (EA) resources in real-time over many dissimilar autonomous naval platforms defending their group against attackers. The platforms can be very general, e.g., ships, planes, robots, land based facilities, etc. Potential foes the platforms deal with can also be general. This paper provides an overview of the resource manager, including the four fuzzy decision trees that make up the resource manager; the fuzzy EA model; genetic algorithm based optimization; co-evolutionary data mining through gaming; and mathematical, computational and hardware based validation. Methods of automatically designing new multi-platform EA techniques are considered. The expert system runs on each defending platform, rendering it an autonomous system requiring no human intervention. There is no commanding platform. Instead the platforms work cooperatively as a function of battlespace geometry; sensor data such as range, bearing, ID, and uncertainty measures for sensor output; intelligence reports; etc. Computational experiments show the defending networked platforms' ability to self-organize. The platforms' ability to self-organize is illustrated through the output of the scenario generator, a software package that automates the underlying data mining problem and creates a computer movie of the platforms' interaction for evaluation.
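The toy Python sketch below conveys the general flavour of fuzzy-rule evaluation in such a resource manager (membership functions, min/max rule combination, defuzzification). The rules, variables, and thresholds are invented for illustration and do not reproduce the paper's four fuzzy decision trees.

```python
# A generic Mamdani-style fuzzy inference sketch -- a toy illustration of the kind of
# rule a fuzzy resource manager might evaluate ("IF threat is close AND its ID is
# certain THEN jamming priority is high"); not the paper's design.
def clamp01(x):
    return max(0.0, min(1.0, x))

def jamming_priority(range_km, id_confidence):
    # Membership degrees (piecewise-linear fuzzy sets)
    close = clamp01(1.0 - range_km / 50.0)            # 1 at 0 km, 0 beyond 50 km
    far = clamp01((range_km - 30.0) / 70.0)           # 0 below 30 km, 1 beyond 100 km
    certain = clamp01((id_confidence - 0.5) / 0.5)    # 0 below 0.5, 1 at 1.0

    # Fuzzy rules: AND -> min, OR -> max
    high = min(close, certain)                        # rule 1: close AND certain
    low = max(far, 1.0 - certain)                     # rule 2: far OR uncertain

    # Defuzzify by a weighted average of the consequents (high = 1, low = 0)
    total = high + low
    return high / total if total > 0 else 0.0

print(jamming_priority(range_km=20, id_confidence=0.9))   # close, confident -> ~0.75
print(jamming_priority(range_km=90, id_confidence=0.4))   # far, uncertain   -> 0.0
```

In the full system, many such rules are organized into decision trees and their parameters are tuned by the genetic-algorithm and data-mining stages mentioned in the abstract.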
Two Quantum Protocols for Oblivious Set-member Decision Problem
NASA Astrophysics Data System (ADS)
Shi, Run-Hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2015-10-01
In this paper, we defined a new secure multi-party computation problem, called the Oblivious Set-member Decision problem, which allows one party to decide whether a secret of another party belongs to his private set in an oblivious manner. There are many important applications of the Oblivious Set-member Decision problem in multi-party collaborative computation that protects the privacy of users, such as private set intersection and union, anonymous authentication, electronic voting, and electronic auction. Furthermore, we presented two quantum protocols to solve the Oblivious Set-member Decision problem. Protocol I takes advantage of powerful quantum oracle operations, so it needs lower communication and computation costs, while Protocol II takes photons as quantum resources and only performs simple single-particle projective measurements, and thus is more feasible with present technology.
Two Quantum Protocols for Oblivious Set-member Decision Problem
Shi, Run-hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2015-01-01
In this paper, we defined a new secure multi-party computation problem, called the Oblivious Set-member Decision problem, which allows one party to decide whether a secret of another party belongs to his private set in an oblivious manner. There are many important applications of the Oblivious Set-member Decision problem in multi-party collaborative computation that protects the privacy of users, such as private set intersection and union, anonymous authentication, electronic voting, and electronic auction. Furthermore, we presented two quantum protocols to solve the Oblivious Set-member Decision problem. Protocol I takes advantage of powerful quantum oracle operations, so it needs lower communication and computation costs, while Protocol II takes photons as quantum resources and only performs simple single-particle projective measurements, and thus is more feasible with present technology. PMID:26514668
Two Quantum Protocols for Oblivious Set-member Decision Problem.
Shi, Run-Hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2015-10-30
In this paper, we defined a new secure multi-party computation problem, called the Oblivious Set-member Decision problem, which allows one party to decide whether a secret of another party belongs to his private set in an oblivious manner. There are many important applications of the Oblivious Set-member Decision problem in multi-party collaborative computation that protects the privacy of users, such as private set intersection and union, anonymous authentication, electronic voting, and electronic auction. Furthermore, we presented two quantum protocols to solve the Oblivious Set-member Decision problem. Protocol I takes advantage of powerful quantum oracle operations, so it needs lower communication and computation costs, while Protocol II takes photons as quantum resources and only performs simple single-particle projective measurements, and thus is more feasible with present technology.
1991-09-01
Report fragment (text and table-of-contents excerpt): E-mail can be sent to an SNA network through a Softswitch gateway, but at a very slow rate, as discussed in Chapter III. Contents include: Communication Protocols; New Infrastructures: CALS Test Network (CTN), Industrial Networks, FTS-2000 and ISDN, and CALS Operational Resource.
Building a Library Network from Scratch: Eric & Veronica's Excellent Adventure.
ERIC Educational Resources Information Center
Sisler, Eric; Smith, Veronica
2000-01-01
Describes library automation issues during the planning and construction of College Hill Library (Colorado), a joint-use facility shared by a community college and a public library. Discuses computer networks; hardware selection; public access to catalogs and electronic resources; classification schemes and bibliographic data; children's…
Instructional Technology and Higher Education: Rewards, Rights, and Responsibilities.
ERIC Educational Resources Information Center
Albright, Michael J.
This keynote address seeks to establish a definition for "instructional technology" that does not emphasize computer hardware and software but instead focuses on human skills, resource management, problem solving, and educational settings. Also discussed are ways in which technology like electronic mail and the world wide web has…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oguchi, Masahiro, E-mail: oguchi.masahiro@nies.go.jp; Murakami, Shinsuke; Sakanakura, Hirofumi
2011-09-15
Highlights: > End-of-life electrical and electronic equipment (EEE) as secondary metal resources. > The content and the total amount of metals in specific equipment are both important. > We categorized 21 EEE types from contents and total amounts of various metals. > Important equipment types as secondary resources were listed for each metal kind. > Collectability and possible collection systems of various EEE types were discussed. - Abstract: End-of-life electrical and electronic equipment (EEE) has recently received attention as a secondary source of metals. This study examined characteristics of end-of-life EEE as secondary metal resources to consider efficient collection and metal recovery systems according to the specific metals and types of EEE. We constructed an analogy between natural resource development and metal recovery from end-of-life EEE and found that metal content and total annual amount of metal contained in each type of end-of-life EEE should be considered in secondary resource development, as well as the collectability of the end-of-life products. We then categorized 21 EEE types into five groups and discussed their potential as secondary metal resources. Refrigerators, washing machines, air conditioners, and CRT TVs were evaluated as the most important sources of common metals, and personal computers, mobile phones, and video games were evaluated as the most important sources of precious metals. Several types of small digital equipment were also identified as important sources of precious metals; however, mid-size information and communication technology (ICT) equipment (e.g., printers and fax machines) and audio/video equipment were shown to be more important as a source of a variety of less common metals. The physical collectability of each type of EEE was roughly characterized by unit size and number of end-of-life products generated annually. Current collection systems in Japan were examined and potentially appropriate collection methods were suggested for equipment types that currently have no specific collection systems in Japan, particularly for video games, notebook computers, and mid-size ICT and audio/video equipment.
Mesa-Gutiérrez, J C; Bardají, C; Brun, N; Núñez, B; Sánchez, B; Sanvicente, B; Obiols, P; Rigol, S
2012-04-01
New tools from the web are a complete breakthrough in the management of information. The aim of this paper is to present different resources in a friendly way, with apps and examples for the different phases of knowledge management for the paediatric surgeon: search, filtering, reception, classification, sharing, collaborative work, and publication. We are witnessing a real revolution in how knowledge and information are managed, whose main characteristics are immediacy, a social component, growing interaction, and ease of use. Every physician has clinical questions, and the Internet gives us more and more resources to make searches easier. Along with these, we need electronic resources to filter information for quality and to ease the transfer of knowledge to clinical practice. Cloud computing is in continuous development and makes it possible to share information among different users and computers. The main feature of apps from the Internet is their social component, which makes interaction, sharing, and collaborative work possible.
The precision-processing subsystem for the Earth Resources Technology Satellite.
NASA Technical Reports Server (NTRS)
Chapelle, W. E.; Bybee, J. E.; Bedross, G. M.
1972-01-01
Description of the precision processor, a subsystem in the image-processing system for the Earth Resources Technology Satellite (ERTS). This processor is a special-purpose image-measurement and printing system, designed to process user-selected bulk images to produce 1:1,000,000-scale film outputs and digital image data, presented in a Universal-Transverse-Mercator (UTM) projection. The system will remove geometric and radiometric errors introduced by the ERTS multispectral sensors and by the bulk-processor electron-beam recorder. The geometric transformations required for each input scene are determined by resection computations based on reseau measurements and image comparisons with a special ground-control base contained within the system; the images are then printed and digitized by electronic image-transfer techniques.
Whalen, Christopher J; Donnell, Deborah; Tartakovsky, Michael
2014-01-01
As information and communication technology infrastructure becomes more reliable, new methods of electronic data capture, data marts/data warehouses, and mobile computing provide platforms for rapid coordination of international research projects and multisite studies. However, despite the increasing availability of Internet connectivity and communication systems in remote regions of the world, there are still significant obstacles. Sites with poor infrastructure face serious challenges participating in modern clinical and basic research, particularly that relying on electronic data capture and Internet communication technologies. This report discusses our experiences in supporting research in resource-limited settings. We describe examples of the practical and ethical/regulatory challenges raised by the use of these newer technologies for data collection in multisite clinical studies.
Onboard Short Term Plan Viewer
NASA Technical Reports Server (NTRS)
Hall, Tim; LeBlanc, Troy; Ulman, Brian; McDonald, Aaron; Gramm, Paul; Chang, Li-Min; Keerthi, Suman; Kivlovitz, Dov; Hadlock, Jason
2011-01-01
Onboard Short Term Plan Viewer (OSTPV) is a computer program for electronic display of mission plans and timelines, both aboard the International Space Station (ISS) and in ISS ground control stations located in several countries. OSTPV was specifically designed both (1) for use within the limited ISS computing environment and (2) to be compatible with computers used in ground control stations. OSTPV supplants a prior system in which, aboard the ISS, timelines were printed on paper and incorporated into files that also contained other paper documents. Hence, the introduction of OSTPV has both reduced the consumption of resources and saved time in updating plans and timelines. OSTPV accepts, as input, the mission timeline output of a legacy, print-oriented, UNIX-based program called "Consolidated Planning System" and converts the timeline information for display in an interactive, dynamic, Windows Web-based graphical user interface that is used by both the ISS crew and ground control teams in real time. OSTPV enables the ISS crew to electronically indicate execution of timeline steps, launch electronic procedures, and efficiently report to ground control teams on the statuses of ISS activities, all by use of laptop computers aboard the ISS.
Dynamic electronic institutions in agent oriented cloud robotic systems.
Nagrath, Vineet; Morel, Olivier; Malik, Aamir; Saad, Naufal; Meriaudeau, Fabrice
2015-01-01
The dot-com bubble burst in the year 2000, followed by a swift movement towards resource virtualization and the cloud computing business model. Cloud computing emerged not as a new form of computing or network technology but as a mere remoulding of existing technologies to suit a new business model. Cloud robotics is understood as the adaptation of cloud computing ideas to robotic applications. Current efforts in cloud robotics stress developing robots that utilize the computing and service infrastructure of the cloud, without debating the underlying business model. HTM5 is an OMG MDA-based meta-model for agent-oriented development of cloud robotic systems. The trade-view of HTM5 promotes peer-to-peer trade amongst software agents. HTM5 agents represent various cloud entities and implement their business logic on cloud interactions. Trade in a peer-to-peer cloud robotic system is based on relationships and contracts amongst several agent subsets. Electronic Institutions are associations of heterogeneous intelligent agents which interact with each other following predefined norms. In Dynamic Electronic Institutions, the process of formation, reformation and dissolution of institutions is automated, leading to run-time adaptations in groups of agents. DEIs in agent-oriented cloud robotic ecosystems bring order and group intellect. This article presents DEI implementations through the HTM5 methodology.
Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation
NASA Astrophysics Data System (ADS)
Anisenkov, A. V.
2018-03-01
In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment performed at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For processing and storage of such super-large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the Worldwide LHC Computing Grid (WLCG) infrastructure and is able to meet the requirements of the experiment for processing huge data sets and provide a high degree of their accessibility (hundreds of petabytes). The paper considers the ATLAS grid information system (AGIS) used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure, to configure and connect the high-level software systems of computer centers, and to describe and store all possible parameters, control, configuration, and other auxiliary information required for the effective operation of the ATLAS distributed computing applications and services. The role of the AGIS system in unifying the description of the computing resources provided by grid sites, supercomputer centers, and cloud computing into a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and integrate supercomputers and cloud computing platforms into the software components of the production and distributed analysis workload management system (PanDA, ATLAS).
Public Health Surveillance and Meaningful Use Regulations: A Crisis of Opportunity
Sundwall, David N.
2012-01-01
The Health Information Technology for Economic and Clinical Health Act is intended to enhance reimbursement of health care providers for meaningful use of electronic health records systems. This presents both opportunities and challenges for public health departments. To earn incentive payments, clinical providers must exchange specified types of data with the public health system, such as immunization and syndromic surveillance data and notifiable disease reporting. However, a crisis looms because public health’s information technology systems largely lack the capabilities to accept the types of data proposed for exchange. Cloud computing may be a solution for public health information systems. Through shared computing resources, public health departments could reap the benefits of electronic reporting within federal funding constraints. PMID:22390523
Public health surveillance and meaningful use regulations: a crisis of opportunity.
Lenert, Leslie; Sundwall, David N
2012-03-01
The Health Information Technology for Economic and Clinical Health Act is intended to enhance reimbursement of health care providers for meaningful use of electronic health records systems. This presents both opportunities and challenges for public health departments. To earn incentive payments, clinical providers must exchange specified types of data with the public health system, such as immunization and syndromic surveillance data and notifiable disease reporting. However, a crisis looms because public health's information technology systems largely lack the capabilities to accept the types of data proposed for exchange. Cloud computing may be a solution for public health information systems. Through shared computing resources, public health departments could reap the benefits of electronic reporting within federal funding constraints.
The Full Monty: Locating Resources, Creating, and Presenting a Web Enhanced History Course.
ERIC Educational Resources Information Center
Bazillion, Richard J.; Braun, Connie L.
2001-01-01
Discusses how to develop a history course using the World Wide Web; course development software; full text digitized articles, electronic books, primary documents, images, and audio files; and computer equipment such as LCD projectors and interactive whiteboards. Addresses the importance of support for faculty using technology in teaching. (PAL)
The Application of Large-Scale Hypermedia Information Systems to Training.
ERIC Educational Resources Information Center
Crowder, Richard; And Others
1995-01-01
Discusses the use of hypermedia in electronic information systems that support maintenance operations in large-scale industrial plants. Findings show that after establishing an information system, the same resource base can be used to train personnel how to use the computer system and how to perform operational and maintenance tasks. (Author/JMV)
Academic Honesty through Technology
ERIC Educational Resources Information Center
Lecher, Mark
2005-01-01
Over the past two decades, technology use has increased in the classroom. What started out as a single computer in a classroom has evolved into a laptop or handheld for every student, with a wireless connection to the Internet and other network resources. Cell phones, PDAs, and other electronic tools have opened up new horizons for utilizing…
Working for America. Career Schools: A Tremendous Resource for Employers.
ERIC Educational Resources Information Center
Career Education, 1992
1992-01-01
Discusses industries that are vital to the nation's economy and the numbers of skilled workers they will need to keep moving ahead. Industries profiled are aviation, automotive, allied health, trucking, paralegal, electronics, and computer-aided drafting. Also looks at proprietary schools that are educating the work force of the future. (JOW)
An Assessment of Remote Laboratory Experiments in Radio Communication
ERIC Educational Resources Information Center
Gampe, Andreas; Melkonyan, Arsen; Pontual, Murillo; Akopian, David
2014-01-01
Today's electrical and computer engineering graduates need marketable skills to work with electronic devices. Hands-on experiments prepare students to deal with real-world problems and help them to comprehend theoretical concepts and relate these to practical tasks. However, shortage of equipment, high costs, and a lack of human resources for…
Computers and Mental Health Care Delivery. A Resource Guide to Federal Information.
ERIC Educational Resources Information Center
Levy, Louise
Prepared for the mental health professional or administrator who is involved in the planning, development, or implementation of an automated information system in a mental health environment, this guide is limited to the electronic processing and storage of information for management and clinical functions. Management application areas include…
The CAMILLE Project: Espana Interactiva (The CAMILLE Project: Interactive Spanish).
ERIC Educational Resources Information Center
Gimeno, Ana; Ingraham, Bruce
CAMILLE's primary objective is to exploit recent developments in multimedia computing to create a flexible, student-centered, electronic language learning environment to support the acquisition of a second language. The consortium's first target was to produce a learning resource for beginners of Spanish and another for beginners of Dutch, as well…
Authentication of Radio Frequency Identification Devices Using Electronic Characteristics
ERIC Educational Resources Information Center
Chinnappa Gounder Periaswamy, Senthilkumar
2010-01-01
Radio frequency identification (RFID) tags are low-cost devices that are used to uniquely identify the objects to which they are attached. Due to the low cost and size that is driving the technology, a tag has limited computational capabilities and resources. This limitation makes the implementation of conventional security protocols to prevent…
Proper Resonance Depiction of Acylium Cation: A High-Level and Student Computational Investigation
ERIC Educational Resources Information Center
Esselman, Brian J.; Hill, Nicholas J.
2015-01-01
The electronic and molecular structure of the acylium cation ([CH3CO]+, 1) receives varied treatment in undergraduate textbooks and online resources. The overall structure of 1 is typically represented as an equal combination of resonance structures containing C-O triple and double bonds, the latter structure occasionally…
Testing a computer-based ostomy care training resource for staff nurses.
Bales, Isabel
2010-05-01
Fragmented teaching and ostomy care provided by nonspecialized clinicians unfamiliar with state-of-the-art care and products have been identified as problems in teaching ostomy care to the new ostomate. After conducting a literature review of theories and concepts related to the impact of nurse behaviors and confidence on ostomy care, the author developed a computer-based learning resource and assessed its effect on staff nurse confidence. Of 189 staff nurses with a minimum of 1 year acute-care experience employed in the acute care, emergency, and rehabilitation departments of an acute care facility in the Midwestern US, 103 agreed to participate and returned completed pre- and post-tests, each comprising the same eight statements about providing ostomy care. F and P values were computed for differences between pre- and post-test scores. Based on a scale where 1 = totally disagree and 5 = totally agree with the statement, baseline confidence and perceived mean knowledge scores averaged 3.8; after viewing the resource program, post-test mean scores averaged 4.51, a statistically significant improvement (P = 0.000). The largest difference between pre- and post-test scores involved feeling confident in having the resources to learn ostomy skills independently. The availability of an electronic ostomy care resource was rated highly in both pre- and post-testing. Studies to assess the effects of increased confidence and knowledge on the quality and provision of care are warranted.
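For readers who want to reproduce this kind of pre/post comparison, the sketch below (assuming NumPy and SciPy) runs a paired t-test on made-up Likert-style scores. The data are hypothetical, and the study's exact statistical procedure is not specified beyond the reported F and P values; for two repeated measures, F equals the square of the paired t statistic.

```python
# A minimal sketch of testing a pre/post difference in mean confidence scores.
# The scores below are fabricated illustration data, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 103                                            # respondents in the study
pre = np.clip(rng.normal(3.8, 0.6, n), 1, 5)       # 1-5 scale scores (hypothetical)
post = np.clip(pre + rng.normal(0.7, 0.5, n), 1, 5)

res = stats.ttest_rel(post, pre)                   # paired t-test on the same nurses
print(f"paired t = {res.statistic:.2f}, F = {res.statistic**2:.2f}, P = {res.pvalue:.4f}")
print(f"mean pre = {pre.mean():.2f}, mean post = {post.mean():.2f}")
```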
Accurate quantum chemical calculations
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.
1989-01-01
An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.
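The following toy example conveys the spirit of an exact benchmark within a finite basis: it diagonalizes the two-site Hubbard model at half filling and compares with the analytic ground-state energy. It is a generic illustration of exact diagonalization in a small Hilbert space, not the FCI machinery described above.

```python
# Exact diagonalization of the two-site Hubbard model (Sz = 0 sector) -- a toy
# analogue of benchmarking approximate methods against an exact finite-basis result.
import numpy as np

def two_site_hubbard(t=1.0, U=4.0):
    # Basis: |up down, 0>, |0, up down>, |up, down>, |down, up>
    H = np.array([[U,  0, -t,  t],
                  [0,  U, -t,  t],
                  [-t, -t, 0,  0],
                  [t,  t,  0,  0]], dtype=float)
    return np.linalg.eigvalsh(H)

t, U = 1.0, 4.0
exact_ground = 0.5 * (U - np.sqrt(U**2 + 16 * t**2))
print("diagonalized spectrum:", np.round(two_site_hubbard(t, U), 6))
print("analytic ground-state energy:", round(exact_ground, 6))
```

Real FCI calculations follow the same logic, but the dimension of the many-electron basis grows combinatorially with electrons and orbitals, which is why they remained out of reach until sufficient computational resources became available.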
NASA Astrophysics Data System (ADS)
Schulthess, Thomas C.
2013-03-01
The continued thousand-fold improvement in sustained application performance per decade on modern supercomputers keeps opening new opportunities for scientific simulations. But supercomputers have become very complex machines, built with thousands or tens of thousands of complex nodes consisting of multiple CPU cores or, most recently, a combination of CPU and GPU processors. Efficient simulations on such high-end computing systems require tailored algorithms that optimally map numerical methods to particular architectures. These intricacies will be illustrated with simulations of strongly correlated electron systems, where the development of quantum cluster methods, Monte Carlo techniques, as well as their optimal implementation by means of algorithms with improved data locality and high arithmetic density have gone hand in hand with evolving computer architectures. The present work would not have been possible without continued access to computing resources at the National Center for Computational Science of Oak Ridge National Laboratory, which is funded by the Facilities Division of the Office of Advanced Scientific Computing Research, and the Swiss National Supercomputing Center (CSCS) that is funded by ETH Zurich.
Introduction to the LaRC central scientific computing complex
NASA Technical Reports Server (NTRS)
Shoosmith, John N.
1993-01-01
The computers and associated equipment that make up the Central Scientific Computing Complex of the Langley Research Center are briefly described. The electronic networks that provide access to the various components of the complex and a number of areas that can be used by Langley and contractor staff for special applications (scientific visualization, image processing, software engineering, and grid generation) are also described. Flight simulation facilities that use the central computers are described. Management of the complex, procedures for its use, and available services and resources are discussed. This document is intended for new users of the complex, for current users who wish to keep apprised of changes, and for visitors who need to understand the role of central scientific computers at Langley.
NASA Astrophysics Data System (ADS)
Zwart, Christine M.; Venkatesan, Ragav; Frakes, David H.
2012-10-01
Interpolation is an essential and broadly employed function of signal processing. Accordingly, considerable development has focused on advancing interpolation algorithms toward optimal accuracy. Such development has motivated a clear shift in the state-of-the-art from classical interpolation to more intelligent and resourceful approaches, registration-based interpolation for example. As a natural result, many of the most accurate current algorithms are highly complex, specific, and computationally demanding. However, the diverse hardware destinations for interpolation algorithms present unique constraints that often preclude use of the most accurate available options. For example, while computationally demanding interpolators may be suitable for highly equipped image processing platforms (e.g., computer workstations and clusters), only more efficient interpolators may be practical for less well equipped platforms (e.g., smartphones and tablet computers). The latter examples of consumer electronics present a design tradeoff in this regard: high accuracy interpolation benefits the consumer experience but computing capabilities are limited. It follows that interpolators with favorable combinations of accuracy and efficiency are of great practical value to the consumer electronics industry. We address multidimensional interpolation-based image processing problems that are common to consumer electronic devices through a decomposition approach. The multidimensional problems are first broken down into multiple, independent, one-dimensional (1-D) interpolation steps that are then executed with a newly modified registration-based one-dimensional control grid interpolator. The proposed approach, decomposed multidimensional control grid interpolation (DMCGI), combines the accuracy of registration-based interpolation with the simplicity, flexibility, and computational efficiency of a 1-D interpolation framework. Results demonstrate that DMCGI provides improved interpolation accuracy (and other benefits) in image resizing, color sample demosaicing, and video deinterlacing applications, at a computational cost that is manageable or reduced in comparison to popular alternatives.
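The decomposition idea can be illustrated with ordinary linear interpolation standing in for the modified registration-based 1-D control grid interpolator: a 2-D image is resized by an independent 1-D interpolation pass along rows and then along columns. The sketch below (NumPy only) is a stand-in under that assumption, not the DMCGI algorithm itself.

```python
# Separable resizing: run a 1-D interpolator along each row, then along each column.
# np.interp (linear) stands in for the registration-based 1-D interpolator of the paper.
import numpy as np

def resize_separable(img, new_h, new_w):
    h, w = img.shape
    # Pass 1: 1-D interpolation along each row (width direction)
    x_old, x_new = np.arange(w), np.linspace(0, w - 1, new_w)
    tmp = np.array([np.interp(x_new, x_old, row) for row in img])
    # Pass 2: 1-D interpolation along each column (height direction)
    y_old, y_new = np.arange(h), np.linspace(0, h - 1, new_h)
    out = np.array([np.interp(y_new, y_old, col) for col in tmp.T]).T
    return out

img = np.add.outer(np.arange(8), np.arange(8)).astype(float)   # simple 8x8 ramp
print(resize_separable(img, 16, 16).shape)                     # -> (16, 16)
```

Swapping a smarter 1-D interpolator into the same two-pass structure is what allows accuracy to improve while the cost stays close to that of a classical separable resize.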
Computationally efficient methods for modelling laser wakefield acceleration in the blowout regime
NASA Astrophysics Data System (ADS)
Cowan, B. M.; Kalmykov, S. Y.; Beck, A.; Davoine, X.; Bunkers, K.; Lifschitz, A. F.; Lefebvre, E.; Bruhwiler, D. L.; Shadwick, B. A.; Umstadter, D. P.
2012-08-01
Electron self-injection and acceleration until dephasing in the blowout regime is studied for a set of initial conditions typical of recent experiments with 100-terawatt-class lasers. Two different approaches to computationally efficient, fully explicit, 3D particle-in-cell modelling are examined. First, the Cartesian code vorpal (Nieter, C. and Cary, J. R. 2004 VORPAL: a versatile plasma simulation code. J. Comput. Phys. 196, 538) using a perfect-dispersion electromagnetic solver precisely describes the laser pulse and bubble dynamics, taking advantage of coarser resolution in the propagation direction, with a proportionally larger time step. Using third-order splines for macroparticles helps suppress the sampling noise while keeping the usage of computational resources modest. The second way to reduce the simulation load is using reduced-geometry codes. In our case, the quasi-cylindrical code calder-circ (Lifschitz, A. F. et al. 2009 Particle-in-cell modelling of laser-plasma interaction using Fourier decomposition. J. Comput. Phys. 228(5), 1803-1814) uses decomposition of fields and currents into a set of poloidal modes, while the macroparticles move in the Cartesian 3D space. Cylindrical symmetry of the interaction allows using just two modes, reducing the computational load to roughly that of a planar Cartesian simulation while preserving the 3D nature of the interaction. This significant economy of resources allows using fine resolution in the direction of propagation and a small time step, making numerical dispersion vanishingly small, together with a large number of particles per cell, enabling good particle statistics. Quantitative agreement of two simulations indicates that these are free of numerical artefacts. Both approaches thus retrieve the physically correct evolution of the plasma bubble, recovering the intrinsic connection of electron self-injection to the nonlinear optical evolution of the driver.
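The economy of the quasi-cylindrical approach can be seen in a few lines of NumPy: a field composed of an axisymmetric part plus a linearly polarized (cos θ) part has only m = 0 and m = 1 azimuthal Fourier content, so two modes represent it exactly. This is a generic illustration of the mode decomposition, not the calder-circ implementation.

```python
# Azimuthal Fourier decomposition of a field on a cylindrical (r, theta) grid.
# A plasma-like axisymmetric term plus a laser-like cos(theta) term needs only
# the m = 0 and m = 1 modes, which is the premise of quasi-cylindrical PIC codes.
import numpy as np

nr, ntheta = 32, 64
r = np.linspace(0.0, 1.0, nr)
theta = np.linspace(0.0, 2 * np.pi, ntheta, endpoint=False)

F = np.exp(-r[:, None] ** 2) + 0.5 * r[:, None] * np.cos(theta[None, :])

modes = np.fft.rfft(F, axis=1) / ntheta          # azimuthal decomposition at each radius
amplitude = np.max(np.abs(modes), axis=0)        # strongest amplitude of each mode m

for m, a in enumerate(amplitude[:5]):
    print(f"m = {m}: max |F_m| = {a:.3e}")       # only m = 0 and m = 1 are non-negligible
```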
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
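A toy version of the distribution step might look like the following (assuming mpi4py is installed): each rank runs an independent batch of histories with its own random stream, and the tallies are reduced onto the root rank. The "transport" here is only exponential attenuation through a slab, standing in for the EGS5 physics, and the script is a hypothetical sketch rather than the authors' launcher.

```python
# Toy master/worker Monte Carlo with MPI: every rank simulates a share of the photon
# histories and the transmission tally is reduced onto rank 0.
# Run with:  mpiexec -n 4 python cloud_mc_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_total = 1_000_000                     # total photon histories
n_local = n_total // size               # histories handled by this rank
mu, thickness = 0.2, 5.0                # attenuation coefficient (1/cm), slab depth (cm)

rng = np.random.default_rng(seed=rank)  # independent random stream per worker
path_lengths = rng.exponential(scale=1.0 / mu, size=n_local)
local_transmitted = int(np.count_nonzero(path_lengths > thickness))

total_transmitted = comm.reduce(local_transmitted, op=MPI.SUM, root=0)
if rank == 0:
    estimate = total_transmitted / (n_local * size)
    print(f"MC transmission = {estimate:.4f}, analytic = {np.exp(-mu * thickness):.4f}")
```

Because the histories are statistically independent, the run time scales inversely with the number of workers, which is the property the cloud deployment described above exploits.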
Pallen, M
1995-11-25
The benefits to medical practitioners of using the Internet are growing rapidly as the Internet becomes easier to use and ever more biomedical resources become available on line. The Internet is the largest computer network in the world; it is also a virtual community, larger than many nation states, with its own rules of behaviour or "netiquette." There are several types of Internet connection and various ways of acquiring a connection. Once connected, you can obtain, free of charge, programs that allow easy use of the Internet's resources and help on how to use these resources; you can access many of these resources through the hypertext references in the on line version of this series (go to http://www.bmj.com/bmj/ to reach the electronic version). You can then explore the various methods for accessing, manipulating, or disseminating data on the Internet, such as electronic mail, telnet, file transfer protocol, and the world wide web. Results from a search of the world wide web for information on the rare condition of Recklinghausen's neurofibromatosis illustrate the breadth of medical information available on the Internet.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-01
... a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray... equipment from solid waste landfills in the United States. EPA does, however, control how cathode ray tube... cell phone and computers/laptops or recover valuable resources, such as precious metals, plastics or...
Connexions: An Open Educational Resource for the 21st Century
ERIC Educational Resources Information Center
Burrus, C. Sidney
2007-01-01
The technology for information organization, communication, storage, and use today is the book. It has evolved over 3000 years (in its modern form over 500 years) to the mature object we currently enjoy. The book is now the primary technology used in education. But with the development of the computer and the Web, a new electronic information…
Touch and Gesture-Based Language Learning: Some Possible Avenues for Research and Classroom Practice
ERIC Educational Resources Information Center
Reinders, Hayo
2014-01-01
Our interaction with digital resources is becoming increasingly based on touch, gestures, and now also eye movement. Many everyday consumer electronics products already include touch-based interfaces, from e-book readers to tablets, and from the latest personal computers to the GPS system in your car. What implications do these new forms of…
Exploiting the Potential of CD-ROM Databases: Staff Induction at the University of East Anglia.
ERIC Educational Resources Information Center
Guillot, Marie-Noelle; Kenning, Marie-Madeleine
1995-01-01
Overviews a project exploring the possibility of using CD-ROM applications and the design of exploratory didactic materials to introduce academic staff to the field of computer-assisted instruction. The project heightened the staff's awareness of electronic resources and their potential as research, teaching, and learning aids, with particular…
An evaluation of immunization education resources by family medicine residency directors.
Nowalk, Mary Patricia; Zimmerman, Richard K; Middleton, Donald B; Sherwood, Roger A; Ko, Feng-Shou; Kimmel, Sanford R; Troy, Judith A
2007-01-01
Immunization is a rapidly evolving field, and teachers of family medicine are responsible for ensuring that they and their students are knowledgeable about the latest vaccine recommendations. A survey was mailed to 456 family medicine residency directors across the United States to obtain their evaluation of immunization resources developed by the Society of Teachers of Family Medicine's Group on Immunization Education. Frequencies, measures of central tendency, and differences between responses from 2001 to 2005 were analyzed. Directors of 261 (57%) family medicine residencies responded, with >80% reporting satisfaction with immunization teaching resources. The popularity of bound resources decreased from 2001 to 2005, while immunization Web sites increased in importance. The journal supplement, "Vaccines Across the Lifespan, 2005" was less frequently read in 2005 than its predecessor published in 2001, but quality ratings remained high. Use of the Web site, www.ImmunizationEd.org, and the Shots software for both desktop and handheld computers has increased since their creation. Electronic immunization teaching resources are increasingly popular among family medicine residencies. As the field continues to change, the use of electronic resources is expected to continue, since they are easily updated and, in the case of www.ImmunizationEd.org and Shots software, are available free of charge.
Oguchi, Masahiro; Murakami, Shinsuke; Sakanakura, Hirofumi; Kida, Akiko; Kameya, Takashi
2011-01-01
End-of-life electrical and electronic equipment (EEE) has recently received attention as a secondary source of metals. This study examined characteristics of end-of-life EEE as secondary metal resources to consider efficient collection and metal recovery systems according to the specific metals and types of EEE. We constructed an analogy between natural resource development and metal recovery from end-of-life EEE and found that metal content and total annual amount of metal contained in each type of end-of-life EEE should be considered in secondary resource development, as well as the collectability of the end-of-life products. We then categorized 21 EEE types into five groups and discussed their potential as secondary metal resources. Refrigerators, washing machines, air conditioners, and CRT TVs were evaluated as the most important sources of common metals, and personal computers, mobile phones, and video games were evaluated as the most important sources of precious metals. Several types of small digital equipment were also identified as important sources of precious metals; however, mid-size information and communication technology (ICT) equipment (e.g., printers and fax machines) and audio/video equipment were shown to be more important as a source of a variety of less common metals. The physical collectability of each type of EEE was roughly characterized by unit size and number of end-of-life products generated annually. Current collection systems in Japan were examined and potentially appropriate collection methods were suggested for equipment types that currently have no specific collection systems in Japan, particularly for video games, notebook computers, and mid-size ICT and audio/video equipment. Copyright © 2011 Elsevier Ltd. All rights reserved.
Are Technology Interruptions Impacting Your Bottom Line? An Innovative Proposal for Change.
Ledbetter, Tamera; Shultz, Sarah; Beckham, Roxanne
2017-10-01
Nursing interruptions are a costly and dangerous variable in acute care hospitals. Malfunctioning technology equipment interrupts nursing care and prevents full utilization of computer safety systems to prevent patient care errors. This paper identifies an innovative approach to nursing interruptions related to computer and computer cart malfunctions. The impact on human resources is defined and outcome measures are proposed. A multifaceted proposal, based on a literature review, aimed at reducing nursing interruptions is presented. This proposal is expected to increase patient safety, as well as patient and nurse satisfaction. The setting comprises acute care hospitals utilizing electronic medical records and bar-coded medication administration technology; participants include nurses, information technology staff, nursing informatics staff, and all leadership teams affected by technology problems and their proposed solutions. Literature from multiple fields was reviewed to evaluate research related to computer/computer cart failures and the approaches used to resolve these issues. Outcome measures address strategic goals related to patient safety and nurse and patient satisfaction. Specific help desk metrics will demonstrate the effect of interventions. This paper addresses a gap in the literature and proposes practical and innovative solutions. A comprehensive computer and computer cart repair program is essential for patient safety, financial stewardship, and utilization of resources. © 2015 Wiley Periodicals, Inc.
Higher-order adaptive finite-element methods for Kohn–Sham density functional theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Motamarri, P.; Nowak, M.R.; Leiter, K.
2013-11-15
We present an efficient computational approach to perform real-space electronic structure calculations using an adaptive higher-order finite-element discretization of Kohn–Sham density-functional theory (DFT). To this end, we develop an a priori mesh-adaption technique to construct a close to optimal finite-element discretization of the problem. We further propose an efficient solution strategy for solving the discrete eigenvalue problem by using spectral finite-elements in conjunction with Gauss–Lobatto quadrature, and a Chebyshev acceleration technique for computing the occupied eigenspace. The proposed approach has been observed to provide a staggering 100–200-fold computational advantage over the solution of a generalized eigenvalue problem. Using the proposed solution procedure, we investigate the computational efficiency afforded by higher-order finite-element discretizations of the Kohn–Sham DFT problem. Our studies suggest that staggering computational savings, of the order of 1000-fold, relative to linear finite-elements can be realized, for both all-electron and local pseudopotential calculations, by using higher-order finite-element discretizations. On all the benchmark systems studied, we observe diminishing returns in computational savings beyond the sixth-order for accuracies commensurate with chemical accuracy, suggesting that the hexic spectral-element may be an optimal choice for the finite-element discretization of the Kohn–Sham DFT problem. A comparative study of the computational efficiency of the proposed higher-order finite-element discretizations suggests that the performance of the finite-element basis is competitive with the plane-wave discretization for non-periodic local pseudopotential calculations, and compares to the Gaussian basis for all-electron calculations to within an order of magnitude. Further, we demonstrate the capability of the proposed approach to compute the electronic structure of a metallic system containing 1688 atoms using modest computational resources, and good scalability of the present implementation up to 192 processors.
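As a rough illustration of the Chebyshev acceleration idea mentioned above (polynomial filtering to extract the occupied eigenspace), the following is a minimal dense NumPy sketch of Chebyshev-filtered subspace iteration. It is not the adaptive finite-element machinery of the paper; the matrix, filter degree, spectral bounds, and iteration counts are illustrative assumptions.

```python
# Minimal dense sketch of Chebyshev-filtered subspace iteration for the
# lowest eigenpairs of a symmetric matrix (a toy stand-in for a discretized
# Kohn-Sham Hamiltonian; sizes, degree, and bounds are assumptions).
import numpy as np

def chebyshev_filter(H, X, degree, a, b):
    """Damp eigencomponents of X lying in [a, b] (the unwanted spectrum)."""
    e = (b - a) / 2.0          # half-width of the filtered interval
    c = (b + a) / 2.0          # center of the filtered interval
    Y = (H @ X - c * X) / e
    for _ in range(2, degree + 1):
        Y_new = 2.0 * (H @ Y - c * Y) / e - X
        X, Y = Y, Y_new
    return Y

rng = np.random.default_rng(0)
n, n_occ = 400, 20
A = rng.standard_normal((n, n))
H = (A + A.T) / 2.0                      # toy symmetric "Hamiltonian"

X = rng.standard_normal((n, n_occ))      # initial guess for occupied space
evals = None
for _ in range(30):
    lam_max = np.linalg.eigvalsh(H)[-1]  # cheap here; estimated in practice
    a = evals[-1] + 0.1 if evals is not None else 0.0
    X = chebyshev_filter(H, X, degree=8, a=a, b=lam_max)
    Q, _ = np.linalg.qr(X)               # orthonormalize the filtered block
    Hs = Q.T @ H @ Q                     # projected (subspace) Hamiltonian
    evals, V = np.linalg.eigh(Hs)
    X = Q @ V                            # Ritz vectors

print("lowest Ritz values:", evals[:5])
print("reference         :", np.linalg.eigvalsh(H)[:5])
```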
Womack, James C; Anton, Lucian; Dziedzic, Jacek; Hasnip, Phil J; Probert, Matt I J; Skylaris, Chris-Kriton
2018-03-13
The solution of the Poisson equation is a crucial step in electronic structure calculations, yielding the electrostatic potential, a key component of the quantum mechanical Hamiltonian. In recent decades, theoretical advances and increases in computer performance have made it possible to simulate the electronic structure of extended systems in complex environments. This requires the solution of more complicated variants of the Poisson equation, featuring nonhomogeneous dielectric permittivities, ionic concentrations with nonlinear dependencies, and diverse boundary conditions. The analytic solutions generally used to solve the Poisson equation in vacuum (or with homogeneous permittivity) are not applicable in these circumstances, and numerical methods must be used. In this work, we present DL_MG, a flexible, scalable, and accurate solver library, developed specifically to tackle the challenges of solving the Poisson equation in modern large-scale electronic structure calculations on parallel computers. Our solver is based on the multigrid approach and uses an iterative high-order defect correction method to improve the accuracy of solutions. Using two chemically relevant model systems, we tested the accuracy and computational performance of DL_MG when solving the generalized Poisson and Poisson-Boltzmann equations, demonstrating excellent agreement with analytic solutions and efficient scaling to ~10^9 unknowns and 100s of CPU cores. We also applied DL_MG in actual large-scale electronic structure calculations, using the ONETEP linear-scaling electronic structure package to study a 2615 atom protein-ligand complex with routinely available computational resources. In these calculations, the overall execution time with DL_MG was not significantly greater than the time required for calculations using a conventional FFT-based solver.
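The iterative high-order defect correction mentioned in the abstract can be illustrated with a small one-dimensional sketch: a second-order Poisson solve (standing in for the multigrid V-cycle) is applied repeatedly to reduce the residual of a higher-order discretization. This is a hedged sketch under assumed grids and operators, not DL_MG's actual interface or boundary treatment.

```python
# A minimal 1-D sketch of iterative high-order defect correction, assuming a
# 2nd-order Poisson solve (stand-in for a multigrid V-cycle) and a 4th-order
# target discretization; grid, operators, and right-hand side are illustrative.
import numpy as np

n = 127                       # interior grid points on (0, 1)
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# 2nd-order operator for -d2/dx2 with homogeneous Dirichlet boundaries.
A2 = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

# Higher-order operator: 4th-order 5-point stencil in the interior,
# falling back to the 2nd-order stencil in the rows next to each boundary.
A4 = A2.copy()
for i in range(2, n - 2):
    A4[i, i - 2:i + 3] = np.array([1.0, -16.0, 30.0, -16.0, 1.0]) / (12.0 * h**2)

f = np.pi**2 * np.sin(np.pi * x)      # right-hand side for -u'' = f

u = np.zeros(n)
for k in range(8):
    defect = f - A4 @ u               # residual of the high-order operator
    u += np.linalg.solve(A2, defect)  # low-order solve corrects the defect
    print(f"iteration {k}: high-order residual = {np.linalg.norm(defect):.3e}")
```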
Implementation of a World Wide Web server for the oil and gas industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, R.E.; Martin, F.D.; Emery, R.
1995-12-31
The Gas and Oil Technology Exchange and Communication Highway (GO-TECH) provides an electronic information system for the petroleum community for the purpose of exchanging ideas, data, and technology. The personal computer-based system fosters communication and discussion by linking oil and gas producers with resource centers, government agencies, consulting firms, service companies, national laboratories, academic research groups, and universities throughout the world. The oil and gas producers are provided access to the GO-TECH World Wide Web home page via modem links, as well as the Internet. Future GO-TECH applications will include the establishment of "virtual corporations" consisting of consortiums of small companies, consultants, and service companies linked by electronic information systems. These virtual corporations will have the resources and expertise previously found only in major corporations.
ERIC Educational Resources Information Center
Maron, Nancy L.; Smith, K. Kirby
2008-01-01
As electronic resources for scholarship proliferate, more and more scholars turn to their computers rather than to print sources to conduct their research. The decentralized distribution of these new model works can make it difficult to fully appreciate their scope and number, even for university librarians tasked with knowing about valuable…
2011-01-01
[Extraction residue from a 2011 high-performance computing newsletter; recoverable fragments include: "Simulating Satellite Tracking Using Parallel Computing" by Andrew Lindstrom, University of Hawaii at Hilo (mentor: Carl Holmberg, Maui High Performance...); mentions of RDECOM Deputy Director Gary Martin, ARL Director John Miller, and Communications-Electronics Research, Development...; and "...Saves Resources" by Mike Knowles, ARL DSRC Site Lead, Lockheed Martin, on the first phase of an EAS effort using a low-power mode instead of full power down.]
First-Principles Study of Superconductivity in Ultra- thin Pb Films
NASA Astrophysics Data System (ADS)
Noffsinger, Jesse; Cohen, Marvin L.
2010-03-01
Recently, superconductivity in ultrathin layered Pb has been confirmed in samples with as few as two atomic layers [S. Qin, J. Kim, Q. Niu, and C.-K. Shih, Science 2009]. Interestingly, the prototypical strong-coupling superconductor exhibits different Tc's for differing surface reconstructions in samples with only two monolayers. Additionally, Tc is seen to oscillate as the number of atomic layers is increased. Using first principles techniques based on Wannier functions, we analyze the electronic structure, lattice dynamics and electron-phonon coupling for varying thicknesses and surface reconstructions of layered Pb. We discuss results as they relate to superconductivity in the bulk, for which accurate calculations of superconducting properties can be compared to experiment [W. L. McMillan and J.M. Rowell, PRL 1965]. This work was supported by National Science Foundation Grant No. DMR07-05941, the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Computational resources have been provided by the Lawrencium computational cluster resource provided by the IT Division at the Lawrence Berkeley National Laboratory (Supported by the Director, Office of Science, Office of Basic Energy Sciences, of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231)
Correlation energy extrapolation by many-body expansion
Boschen, Jeffery S.; Theis, Daniel; Ruedenberg, Klaus; ...
2017-01-09
Accounting for electron correlation is required for high accuracy calculations of molecular energies. The full configuration interaction (CI) approach can fully capture the electron correlation within a given basis, but it does so at a computational expense that is impractical for all but the smallest chemical systems. In this work, a new methodology is presented to approximate configuration interaction calculations at a reduced computational expense and memory requirement, namely, the correlation energy extrapolation by many-body expansion (CEEMBE). This method combines an MBE approximation of the CI energy with an extrapolated correction obtained from CI calculations using subsets of the virtual orbitals. The extrapolation approach is inspired by, and analogous to, the method of correlation energy extrapolation by intrinsic scaling. Benchmark calculations of the new method are performed on diatomic fluorine and ozone. Finally, the method consistently achieves agreement with CI calculations to within a few millihartree and often achieves agreement to within ~1 millihartree or less, while requiring significantly less computational resources.
An ontology-based telemedicine tasks management system architecture.
Nageba, Ebrahim; Fayn, Jocelyne; Rubel, Paul
2008-01-01
The recent developments in ambient intelligence and ubiquitous computing offer new opportunities for the design of advanced telemedicine systems providing high quality services, anywhere, anytime. In this paper we present an approach for building an ontology-based, task-driven telemedicine system. The architecture is composed of a task management server, a communication server, and a knowledge base that enables decision making which takes account of different telemedical concepts such as actors, resources, services, and the Electronic Health Record. The final objective is to provide intelligent management of the different types of available human, material, and communication resources.
Accelerating electron tomography reconstruction algorithm ICON with GPU.
Chen, Yu; Wang, Zihao; Zhang, Jingrong; Li, Lun; Wan, Xiaohua; Sun, Fei; Zhang, Fa
2017-01-01
Electron tomography (ET) plays an important role in studying in situ cell ultrastructure in three-dimensional space. Due to limited tilt angles, ET reconstruction always suffers from the "missing wedge" problem. With a validation procedure, iterative compressed-sensing optimized NUFFT reconstruction (ICON) demonstrates its power in the restoration of validated missing information for low-SNR biological ET datasets. However, the huge computational demand has become a major problem for the application of ICON. In this work, we analyzed the framework of ICON and classified the operations of the major steps of ICON reconstruction into three types. Accordingly, we designed parallel strategies and implemented them on graphics processing units (GPU) to generate a parallel program, ICON-GPU. With high accuracy, ICON-GPU achieves a large acceleration compared to its CPU version, up to 83.7×, greatly relieving ICON's dependence on computing resources.
Javan Amoli, Amir Hossein; Maserat, Elham; Safdari, Reza; Zali, Mohammad Reza
2015-01-01
Decision making modalities for screening for many cancer conditions and different stages have become increasingly complex. Computer-based risk assessment systems facilitate scheduling and decision making and support the delivery of cancer screening services. The aim of this article was to survey electronic risk assessment systems as an appropriate tool for the prevention of cancer. A qualitative design was used involving 21 face-to-face interviews. Interviewing involved asking questions of, and getting answers from, managers of cancer screening exclusively. Of the participants 6 were female and 15 were male, and ages ranged from 32 to 78 years. The study was based on a grounded theory approach and the tool was a semi-structured interview. Researchers studied 5 dimensions, comprising electronic guideline standards of colorectal cancer screening, work flow of clinical and genetic activities, pathways of colorectal cancer screening, and functionality of computer-based guidelines and barriers. Electronic guideline standards of colorectal cancer screening were described in the three categories of content standards, telecommunications and technical standards, and nomenclature and classification standards. According to the participants' views, workflow and genetic pathways of colorectal cancer screening were identified. The study demonstrated an effective role of computer-guided consultation for screening management. Electronic-based systems facilitate real-time decision making during a clinical interaction. Electronic pathways have been applied for clinical and genetic decision support, workflow management, update recommendation, and resource estimates. A suitable technical and clinical infrastructure is an integral part of a clinical practice guideline for screening. In conclusion, it is recommended to consider the necessity of architecture assessment and also integration standards.
NASA Tech Briefs, April 1995. Volume 19, No. 4
NASA Technical Reports Server (NTRS)
1995-01-01
This issue of the NASA Tech Briefs has a special focus section on video and imaging, a feature on the NASA invention of the year, and a resource report on the Dryden Flight Research Center. The issue also contains articles on electronic components and circuits, electronic systems, physical sciences, materials, computer programs, mechanics, machinery, manufacturing/fabrication, mathematics and information sciences, and life sciences. In addition to the standard articles, this issue contains a supplement entitled "Laser Tech Briefs," which features an article on the National Ignition Facility and other articles on the use of lasers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krasheninnikov, Sergei I.; Angus, Justin; Lee, Wonjae
The goal of the Edge Simulation Laboratory (ESL) multi-institutional project is to advance scientific understanding of the edge plasma region of magnetic fusion devices via a coordinated effort utilizing modern computing resources, advanced algorithms, and ongoing theoretical development. The UCSD team was involved in the development of the COGENT code for kinetic studies across a magnetic separatrix. This work included a kinetic treatment of electrons and multiple ion species (impurities) and accurate collision operators.
Mineral resource of the month: cobalt
Shedd, Kim B.
2009-01-01
Cobalt is a metal used in numerous commercial, industrial and military applications. On a global basis, the leading use of cobalt is in rechargeable lithium-ion, nickel-cadmium and nickel-metal hydride battery electrodes. Cobalt use has grown rapidly since the early 1990s, with the development of new battery technologies and an increase in demand for portable electronics such as cell phones, laptop computers and cordless power tools.
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure
NASA Astrophysics Data System (ADS)
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-01
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
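The master/worker pattern described above (distributing histories to worker nodes via the message passing interface and aggregating results on a master) can be sketched with mpi4py. The tally below is a dummy stand-in for EGS5 transport, and the history counts, random streams, and "dose" definition are illustrative assumptions rather than the authors' implementation.

```python
# Toy stand-in for master/worker Monte Carlo distribution over MPI (mpi4py).
# Each rank simulates an equal share of histories and the results are reduced
# on rank 0; EGS5, the cloud provisioning script, and the physics are not
# reproduced here -- the tally is a dummy.
# Run with e.g.:  mpiexec -n 4 python mc_mpi_sketch.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

total_histories = 1_000_000                 # assumed workload
local_n = total_histories // size
rng = np.random.default_rng(seed=rank)      # independent stream per worker

# Dummy "transport": exponential path lengths and a tally of energy deposited
# inside a slab of unit thickness (purely illustrative, not EGS5 physics).
depth = rng.exponential(scale=0.3, size=local_n)
energy = rng.uniform(0.1, 6.0, size=local_n)     # assumed spectrum, MeV
local_dose = float(np.sum(energy[depth < 1.0]))

total_dose = comm.reduce(local_dose, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{size} workers, {total_histories} histories, "
          f"aggregated tally = {total_dose:.3e} (arbitrary units)")
```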
Updating and expanding the library of materials on NASA Spacelink electronic information system
NASA Technical Reports Server (NTRS)
Blake, Jean A.
1992-01-01
NASA Spacelink, a proven resource medium, may be accessed over telephone lines or via the Internet by teachers or anyone with a computer or modem. It is a collection of historical and current information on NASA programs and activities. Included in this library is information on a variety of NASA programs, updates on Shuttle status, news releases, aeronautics, space exploration, classroom materials, NASA Educational Services, and computer programs and graphics. The material stored in Spacelink has found widespread use by teachers and others, and is being used to stimulate students, particularly in the area of aerospace science.
ERIC Educational Resources Information Center
Illinois State Board of Higher Education, Springfield.
This proposal calls on the state of Illinois to initiate a statewide computing and telecommunications network that would give its residents access to higher education, advanced training, and electronic information resources. The proposed network, entitled Illinois Century Network, would link all higher education institutions in the state to…
Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas
2008-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.
NASA Technical Reports Server (NTRS)
Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.
1975-01-01
The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.
ANL statement of site strategy for computing workstations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.
1991-11-01
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
Spectroscopy of organic semiconductors from first principles
NASA Astrophysics Data System (ADS)
Sharifzadeh, Sahar; Biller, Ariel; Kronik, Leeor; Neaton, Jeffery
2011-03-01
Advances in organic optoelectronic materials rely on an accurate understanding of their spectroscopy, motivating the development of predictive theoretical methods that accurately describe the excited states of organic semiconductors. In this work, we use density functional theory and many-body perturbation theory (GW/BSE) to compute the electronic and optical properties of two well-studied organic semiconductors, pentacene and PTCDA. We carefully compare our calculations of the bulk density of states with available photoemission spectra, accounting for the role of finite temperature and surface effects in experiment, and examining the influence of our main approximations -- e.g. the GW starting point and the application of the generalized plasmon-pole model -- on the predicted electronic structure. Moreover, our predictions for the nature of the exciton and its binding energy are discussed and compared against optical absorption data. We acknowledge DOE, NSF, and BASF for financial support and NERSC for computational resources.
Rhodes, Penny; Small, Neil; Rowley, Emma; Langdon, Mark; Ariss, Steven; Wright, John
2008-09-01
Two routine consultations in primary care diabetes clinics are compared using extracts from video recordings of interactions between nurses and patients. The consultations were chosen to present different styles of interaction, in which the nurse's gaze was either primarily toward the computer screen or directed more toward the patient. Using conversation analysis, the ways in which nurses shift both gaze and body orientation between the computer screen and patient to influence the style, pace, content, and structure of the consultation were investigated. By examining the effects of different levels of engagement between the electronic medical record and the embodied patient in the consultation room, we argue for the need to consider the contingent nature of the interface of technology and the person in the consultation. Policy initiatives designed to deliver what is considered best-evidenced practice are modified in the micro context of the interactions of the consultation.
Zhang, Zhuhua; Liu, Xiaofei; Yu, Jin; Hang, Yang; Li, Yao; Guo, Yufeng; Xu, Ying; Sun, Xu; Zhou, Jianxin; Guo, Wanlin
2016-01-01
Low-dimensional materials exhibit many exceptional properties and functionalities which can be efficiently tuned by externally applied force or fields. Here we review the current status of research on tuning the electronic and magnetic properties of low-dimensional carbon, boron nitride, metal-dichalcogenides, phosphorene nanomaterials by applied engineering strain, external electric field and interaction with substrates, etc, with particular focus on the progress of computational methods and studies. We highlight the similarities and differences of the property modulation among one- and two-dimensional nanomaterials. Recent breakthroughs in experimental demonstration of the tunable functionalities in typical nanostructures are also presented. Finally, prospective and challenges for applying the tunable properties into functional devices are discussed. WIREs Comput Mol Sci 2016, 6:324-350. doi: 10.1002/wcms.1251 For further resources related to this article, please visit the WIREs website. The authors have declared no conflicts of interest for this article.
The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath
Ellisman, M.; Hutton, T.; Kirkland, A.; Lin, A.; Lin, C.; Molina, T.; Peltier, S.; Singh, R.; Tang, K.; Trefethen, A.E.; Wallom, D.C.H.; Xiong, X.
2009-01-01
The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients. PMID:19487201
Do you feel you know how to write an e-mail?
NASA Astrophysics Data System (ADS)
Leonova, Anna O.
2003-10-01
Computers have opened the door to a new era of telecommunication. Electronic mail is becoming very popular in different spheres of professional activity and everyday life of people all over the world, as it provides people with an excellent opportunity for real, natural communication. The use of e-mail and the Internet involves a whole range of skills, including knowing how to use a personal computer, knowing how to navigate the immense resources of cyberspace, and becoming familiar with the special register of e-mail communication (which lies somewhere between the formality of traditional writing and the spontaneity of speech). Conferencing via e-mail, or communicating with partners through networked computers, offers many opportunities in the scientific community. E-mail allows us to collaborate easily with thousands of colleagues, sharing new ideas, resources, and materials. It can provide the information, contacts, and stimulation that can make our research work more effective and enjoyable. English is widely accepted as the lingua franca of the Internet and of intercultural communication. This brings us to the necessity of introducing some ideas on e-mail writing.
From transistor to trapped-ion computers for quantum chemistry.
Yung, M-H; Casanova, J; Mezzacapo, A; McClean, J; Lamata, L; Aspuru-Guzik, A; Solano, E
2014-01-07
Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially-growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers, but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor to a near-future trapped-ion-based technology.
Information processing using a single dynamical node as complex system
Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.
2011-01-01
Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
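The single-node-with-delayed-feedback architecture can be emulated in a simplified discrete-time form: one nonlinear node, a random input mask that time-multiplexes N virtual nodes across the delay loop, and a ridge-regression readout. The sketch below uses a toy one-step-ahead prediction task; the parameters, mask, and task are assumptions, not the electronic implementation reported in the paper.

```python
# Simplified discrete-time emulation of reservoir computing with a single
# nonlinear node and delayed feedback: N "virtual nodes" are obtained by
# time-multiplexing the input with a random mask; a linear (ridge) readout
# is trained for one-step-ahead prediction of a noisy sine (toy task).
import numpy as np

rng = np.random.default_rng(1)
N = 100                  # virtual nodes per delay loop
alpha, beta = 0.75, 0.5  # feedback strength, input scaling
mask = rng.uniform(-1.0, 1.0, N)

T = 2000
u = np.sin(0.2 * np.arange(T)) + 0.1 * rng.standard_normal(T)

states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    # One pass around the delay loop: each virtual node sees the masked
    # input plus its own state from the previous loop (delayed feedback).
    x = np.tanh(alpha * x + beta * mask * u[t])
    states[t] = x

# Ridge-regression readout: train on the first half, test on the rest.
washout, split = 100, T // 2
X_tr, y_tr = states[washout:split - 1], u[washout + 1:split]
X_te, y_te = states[split:-1], u[split + 1:]
lam = 1e-6
W = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(N), X_tr.T @ y_tr)
pred = X_te @ W
nmse = np.mean((pred - y_te) ** 2) / np.var(y_te)
print(f"one-step-ahead prediction NMSE: {nmse:.4f}")
```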
Computer-assisted qualitative data analysis software.
Cope, Diane G
2014-05-01
Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.
NASA Technical Reports Server (NTRS)
Matijevic, Jacob R.; Zimmerman, Wayne F.; Dolinsky, Shlomo
1990-01-01
An assembly of electromechanical and electronic equipment (including computers) constitutes a test bed for the development of advanced robotic systems for remote manipulation. It combines features not found in commercial systems, and its architecture allows easy growth in complexity and level of automation. The system is a national resource for validation of new telerobotic technology. Although intended primarily for robots used in outer space, the test bed has been adapted to the development of advanced terrestrial telerobotic systems for handling radioactive materials, dangerous chemicals, and explosives.
NASA Astrophysics Data System (ADS)
Collins, Robert J.; Donaldon, Ross J.; Dunjko, Vedran; Wallden, Petros; Clarke, Patrick J.; Andersson, Erika; Jeffers, John; Buller, Gerald S.
2014-10-01
Classical digital signatures are commonly used in e-mail, electronic financial transactions and other forms of electronic communications to ensure that messages have not been tampered with in transit, and that messages are transferrable. The security of commonly used classical digital signature schemes relies on the computational difficulty of inverting certain mathematical functions. However, at present, there are no such one-way functions which have been proven to be hard to invert. With enough computational resources, certain implementations of classical public key cryptosystems can be, and have been, broken with current technology. It is nevertheless possible to construct information-theoretically secure signature schemes, including quantum digital signature schemes. Quantum signature schemes can be made information-theoretically secure based on the laws of quantum mechanics, while comparable classical protocols require additional resources such as secret communication and a trusted authority. Early demonstrations of quantum digital signatures required quantum memory, rendering them impractical at present. Our present implementation is based on a protocol that does not require quantum memory. It also uses the new technique of unambiguous quantum state elimination. Here we report experimental results for a test-bed system, recorded with a variety of different operating parameters, along with a discussion of aspects of the system security.
Life on the line: the therapeutic potentials of computer-mediated conversation.
Miller, J K; Gergen, K J
1998-04-01
In what ways are computer networking practices comparable to face-to-face therapy? With the exponential increase in computer-mediated communication and the increasing numbers of people joining topically based computer networks, the potential for grass-roots therapeutic (or antitherapeutic) interchange is greatly augmented. Here we report the results of research into exchanges on an electronic bulletin board devoted to the topic of suicide. Over an 11-month period participants offered each other valuable resources in terms of validation of experience, sympathy, acceptance, and encouragement. They also asked provocative questions and furnished broad-ranging advice. Hostile entries were rare. However, there were few communiques that parallel the change-inducing practices more frequent within many therapeutic settings. In effect, on-line dialogues seemed more sustaining than transforming. Further limits and potentials of on-line communication are explored.
Rich in resources/deficient in dollars! Which titles do reference departments really need?
Fishman, D L; DelBaglivo, M
1998-10-01
Budget pressures, combined with the growing availability of resources, dictate careful examination of reference use. Two studies were conducted at the University of Maryland Health Sciences Library to examine this issue. A twelve-month reshelving study determined use by title and discipline; a simultaneous study analyzed print abstract and index use in an electronic environment. Staff electronically recorded statistics for unshelved reference books, coded the collection by discipline, and tracked use by school. Oral surveys administered to reference room abstract and index users focused on title usage, user demographics, and stated reason for use. Sixty-five and a half percent of reference collection titles were used. Medical titles received the most use, but, in the context of collection size, dentistry and nursing titles used the greatest percentage of their collections. At an individual title level, medical textbooks and drug handbooks were most used. Users of abstracts and indexes were primarily campus nursing and medical students who preferred print resources. The monograph data will guide reference expenditures in canceling little-used standing orders, expanding most-used portions of the collection, and analyzing underused sections. The abstract and index survey identified the following needs: targeting instruction, contacting faculty who assign print resources, increasing the number of computer workstations, and installing signs linking databases to print equivalents.
Evaluation of a patient centered e-nursing and caring system.
Tsai, Lai-Yin; Shan, Huang; Mei-Bei, Lin
2006-01-01
This study aims to develop an electronic nursing and caring system to manage patients' information and provide patients with safe and efficient services. By transmitting data among wireless cards, an optical network, and a mainframe computer, nursing care can be delivered more systematically and patient-safety-centered care more efficiently and effectively. With this system, manual record-keeping time was reduced and relevant nursing and caring information was linked. With the development of an electronic nursing system, nurses were able to make the best use of Internet resources, integrate information management systematically, and improve the quality of nursing and caring services.
NASA Astrophysics Data System (ADS)
Bui, Francis Minhthang; Hatzinakos, Dimitrios
2007-12-01
As electronic communications become more prevalent, mobile and universal, the threats of data compromises also accordingly loom larger. In the context of a body sensor network (BSN), which permits pervasive monitoring of potentially sensitive medical data, security and privacy concerns are particularly important. It is a challenge to implement traditional security infrastructures in these types of lightweight networks since they are by design limited in both computational and communication resources. A key enabling technology for secure communications in BSN's has emerged to be biometrics. In this work, we present two complementary approaches which exploit physiological signals to address security issues: (1) a resource-efficient key management system for generating and distributing cryptographic keys to constituent sensors in a BSN; (2) a novel data scrambling method, based on interpolation and random sampling, that is envisioned as a potential alternative to conventional symmetric encryption algorithms for certain types of data. The former targets the resource constraints in BSN's, while the latter addresses the fuzzy variability of biometric signals, which has largely precluded the direct application of conventional encryption. Using electrocardiogram (ECG) signals as biometrics, the resulting computer simulations demonstrate the feasibility and efficacy of these methods for delivering secure communications in BSN's.
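A toy version of the second idea (scrambling by interpolation and random sampling) might look like the following: the secret sample instants and shuffling permutation act as the shared key, and only the re-ordered sample values are transmitted. This is a hedged sketch of the general concept with an artificial ECG-like signal, not the authors' algorithm; the sampling density and signal model are assumptions.

```python
# Toy illustration of scrambling by random sampling + interpolation: the
# sender keeps only values of the signal at secret, randomly chosen times
# (the shared key) and transmits them in shuffled order; the receiver who
# knows the key restores the ordering and re-interpolates onto a uniform
# grid. A sketch of the general idea only, not the authors' scheme.
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 500)
ecg_like = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 7.0 * t)

# Shared secret "key": the sample instants and the shuffling permutation.
n_samples = 120
key_times = np.sort(rng.choice(t, size=n_samples, replace=False))
perm = rng.permutation(n_samples)

# Sender: sample at the secret instants and shuffle before transmission.
samples = np.interp(key_times, t, ecg_like)
transmitted = samples[perm]                 # what an eavesdropper sees

# Receiver: undo the permutation with the key, then interpolate back.
restored_samples = np.empty_like(transmitted)
restored_samples[perm] = transmitted
recovered = np.interp(t, key_times, restored_samples)

err = np.max(np.abs(recovered - ecg_like))
print(f"max reconstruction error with the key: {err:.3f}")
```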
Laboratory Computing Resource Center
The frequency spectrum crisis - Issues and answers
NASA Astrophysics Data System (ADS)
Armes, G. L.
The frequency spectrum represents a unique resource which can be overtaxed. In the present investigation, an attempt is made to evaluate the demand for satellite and microwave services. Dimensions of increased demand are discussed, taking into account developments related to the introduction of the personal computer, the activities of the computer and communications industries in preparation for the office of the future, and electronic publishing. Attention is given to common carrier spectrum congestion, common carrier microwave, satellite communications, teleports, international implications, satellite frequency bands, satellite spectrum implications, alternatives regarding the utilization of microwave frequency bands, U.S. Government spectrum utilization, and the impact at C-band.
The performance of low-cost commercial cloud computing as an alternative in computational chemistry.
Thackston, Russell; Fortenberry, Ryan C
2015-05-05
The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon World Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.
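For context, the kind of single-point job one might time across cloud instance types can be set up with PSI4's Python API as below. The molecule, method, memory, and thread settings are assumptions for illustration and do not reproduce the paper's excited-state benchmark set.

```python
# The kind of single-point job one might time on different cloud instance
# types (a stand-in for the paper's benchmark set, which used excited-state
# energies). Molecule, method, memory, and thread count are assumptions.
import time
import psi4

psi4.set_memory("4 GB")
psi4.set_num_threads(4)          # compare e.g. 2, 4, 8 vCPUs across instances
psi4.core.set_output_file("benchmark.out", False)

mol = psi4.geometry("""
0 1
O   0.000000   0.000000   0.117790
H   0.000000   0.755453  -0.471161
H   0.000000  -0.755453  -0.471161
""")

t0 = time.perf_counter()
e = psi4.energy("ccsd(t)/cc-pvtz")
elapsed = time.perf_counter() - t0
print(f"CCSD(T)/cc-pVTZ energy: {e:.8f} Eh  ({elapsed:.1f} s wall time)")
```

Timing the same script on each instance type and dividing by the hourly price gives the cost-per-computation comparison the study describes.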
Research on elastic resource management for multi-queue under cloud computing environment
NASA Astrophysics Data System (ADS)
CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang
2017-10-01
As a new approach to managing computing resources, virtualization technology is increasingly widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system under a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the job queues in HTCondor, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. Practical runs show that virtual computing resources dynamically expand or shrink as computing requirements change. Additionally, the CPU utilization ratio of computing resources was significantly increased compared with traditional resource management. The system also performs well when there are multiple HTCondor schedulers and multiple job queues.
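The dual-threshold decision logic described above can be sketched as follows: each experiment's queue is expanded when queued demand is high and its quota allows, and shrunk when utilization falls below a lower threshold. The names, thresholds, and quotas are assumptions; a production system would query HTCondor and drive OpenStack rather than operate on in-memory records.

```python
# Minimal sketch of dual-threshold elastic scaling per job queue: expand a
# queue's virtual-machine pool when queued demand is high (within its quota),
# shrink it when utilization is low. Names, thresholds, and quotas are
# assumptions; a real system would query HTCondor and call OpenStack here.
from dataclasses import dataclass

@dataclass
class QueueState:
    name: str
    idle_jobs: int        # jobs waiting in the HTCondor queue
    running_vms: int      # VMs currently serving this queue
    busy_vms: int         # VMs actually running jobs
    quota: int            # maximum VMs this experiment may hold

UPPER, LOWER = 0.8, 0.3   # assumed dual thresholds (demand / utilization)
STEP = 5                  # VMs added or removed per decision cycle

def scaling_decision(q: QueueState) -> int:
    """Return the signed number of VMs to add (+) or remove (-)."""
    utilization = q.busy_vms / q.running_vms if q.running_vms else 0.0
    demand_pressure = q.idle_jobs / max(q.running_vms, 1)
    if demand_pressure > UPPER and q.running_vms < q.quota:
        return min(STEP, q.quota - q.running_vms)          # expand
    if utilization < LOWER and q.running_vms > 0:
        return -min(STEP, q.running_vms - q.busy_vms)      # shrink idle VMs
    return 0

queues = [
    QueueState("juno", idle_jobs=40, running_vms=20, busy_vms=19, quota=60),
    QueueState("lhaaso", idle_jobs=0, running_vms=30, busy_vms=5, quota=80),
]
for q in queues:
    print(q.name, "->", scaling_decision(q), "VMs")
```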
Technical accomplishments of the NASA Lewis Research Center, 1989
NASA Technical Reports Server (NTRS)
1990-01-01
Topics addressed include: high-temperature composite materials; structural mechanics; fatigue life prediction for composite materials; internal computational fluid mechanics; instrumentation and controls; electronics; stirling engines; aeropropulsion and space propulsion programs, including a study of slush hydrogen; space power for use in the space station, in the Mars rover, and other applications; thermal management; plasma and radiation; cryogenic fluid management in space; microgravity physics; combustion in reduced gravity; test facilities and resources.
NASA Astrophysics Data System (ADS)
Chaluvadi, Hari; Nixon, Kate; Murray, Andrew; Ning, Chuangang; Colgan, James; Madison, Don
2014-10-01
Experimental and theoretical Triply Differential Cross Sections (TDCS) will be presented for electron-impact ionization of sulfur hexafluoride (SF6) for the molecular orbital 1t1g. M3DW (molecular 3-body distorted wave) results will be compared with experiment for coplanar geometry and for perpendicular plane geometry (a plane which is perpendicular to the incident beam direction). In both cases, the final state electron energies and observation angles are symmetric and the final state electron energies range from 5 eV to 40 eV. It will be shown that there is a large difference between using the OAMO (orientation averaged molecular orbital) approximation and the proper average over all orientations and also that the proper averaged results are in much better agreement with experiment. Work supported by NSF under Grant Number PHY-1068237. Computational work was performed with Institutional resources made available through Los Alamos National Laboratory.
NASA Technical Reports Server (NTRS)
Bishop, Ann P.; Pinelli, Thomas E.
1994-01-01
This paper presents selected results from an empirical investigation into the use of computer networks in aerospace engineering. Such networks allow aerospace engineers to communicate with people and access remote resources through electronic mail, file transfer, and remote log-in. The study drew its subjects from private sector, government and academic organizations in the U.S. aerospace industry. Data presented here were gathered in a mail survey, conducted in Spring 1993, that was distributed to aerospace engineers performing a wide variety of jobs. Results from the mail survey provide a snapshot of the current use of computer networks in the aerospace industry, suggest factors associated with the use of networks, and identify perceived impacts of networks on aerospace engineering work and communication.
Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds
NASA Astrophysics Data System (ADS)
Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni
2012-09-01
Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request requires, in particular, the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
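As a simple example of the kind of allocation strategy such tools compare, the sketch below places incoming session requests onto clusters with a first-fit policy and reports blocking and overall resource occupation. Cluster capacities, request sizes, and the policy itself are illustrative assumptions, not the authors' algorithms.

```python
# Toy first-fit allocation of SDR session requests to data-center clusters,
# the kind of strategy such analysis tools compare. Cluster capacities,
# request sizes (arbitrary compute units), and the first-fit policy are
# illustrative assumptions, not the authors' algorithms.
import random
from typing import Optional

random.seed(42)
clusters = [{"id": i, "capacity": 100.0, "used": 0.0} for i in range(8)]

def allocate(demand: float) -> Optional[int]:
    """Place a transceiver chain on the first cluster with enough headroom."""
    for c in clusters:
        if c["capacity"] - c["used"] >= demand:
            c["used"] += demand
            return c["id"]
    return None                      # request blocked: no cluster can host it

requests = [random.uniform(5.0, 30.0) for _ in range(40)]
blocked = sum(1 for d in requests if allocate(d) is None)

occupation = sum(c["used"] for c in clusters) / sum(c["capacity"] for c in clusters)
print(f"blocked sessions: {blocked}/{len(requests)}, "
      f"overall resource occupation: {occupation:.1%}")
```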
Electronic and Optical Properties of Novel Phases of Silicon and Silicon-Based Derivatives
NASA Astrophysics Data System (ADS)
Ong, Chin Shen; Choi, Sangkook; Louie, Steven
2014-03-01
The vast majority of solar cells in the market today are made from crystalline silicon in the diamond-cubic phase. Nonetheless, diamond-cubic Si has an intrinsic disadvantage: it has an indirect band gap with a large energy difference between the direct gap and the indirect gap. In this work, we perform a careful study of the electronic and optical properties of a newly discovered cubic-Si20 phase of Si that is found to sport a direct band gap. In addition, other silicon-based derivatives have also been discovered and found to be thermodynamically metastable. We carry out ab initio GW and GW-BSE calculations for the quasiparticle excitations and optical spectra, respectively, of these new phases of silicon and silicon-based derivatives. This work was supported by NSF grant No. DMR10-1006184 and U.S. DOE under Contract No. DE-AC02-05CH11231. Computational resources have been provided by DOE at Lawrence Berkeley National Laboratory's NERSC facility and the NSF through XSEDE resources at NICS.
Computing at H1 - Experience and Future
NASA Astrophysics Data System (ADS)
Eckerlin, G.; Gerhards, R.; Kleinwort, C.; Krüner-Marquis, U.; Egli, S.; Niebergall, F.
The H1 experiment has now been successfully operating at the electron-proton collider HERA at DESY for three years. During this time the computing environment has gradually shifted from a mainframe-oriented environment to the distributed server/client Unix world. This transition is now almost complete. Computing needs are largely determined by the present amount of 1.5 TB of reconstructed data per year (1994), corresponding to 1.2 × 10^7 accepted events. All data are centrally available at DESY. In addition to data analysis, which is done in all collaborating institutes, most of the centrally organized Monte Carlo production is performed outside of DESY. New software tools to cope with offline computing needs include CENTIPEDE, a tool for the use of distributed batch and interactive resources for Monte Carlo production, and H1 UNIX, a software package for automatic updates of H1 software on all UNIX platforms.
ERIC Educational Resources Information Center
Bhukuvhani, Crispen; Chiparausha, Blessing; Zuvalinyenga, Dorcas
2012-01-01
Lecturers use various electronic resources at different frequencies. The university library's information literacy skills workshops and seminars are the main sources of knowledge of accessing electronic resources. The use of electronic resources can be said to have positively affected lecturers' pedagogical practices and their work in general. The…
From Tedious to Timely: Screencasting to Troubleshoot Electronic Resource Issues
ERIC Educational Resources Information Center
Hartnett, Eric; Thompson, Carole
2010-01-01
The shift from traditional print materials to electronic resources, in conjunction with the rise in the number of distance education programs, has left many electronic resource librarians scrambling to keep up with the resulting inundation of electronic resource problems. When it comes to diagnosing these problems, words do not always convey all…
Electronic Resources and Mission Creep: Reorganizing the Library for the Twenty-First Century
ERIC Educational Resources Information Center
Stachokas, George
2009-01-01
The position of electronic resources librarian was created to serve as a specialist in the negotiation of license agreements for electronic resources, but mission creep has added more functions to the routine work of electronic resources such as cataloging, gathering information for collection development, and technical support. As electronic…
Final Report for ALCC Allocation: Predictive Simulation of Complex Flow in Wind Farms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barone, Matthew F.; Ananthan, Shreyas; Churchfield, Matt
This report documents work performed using ALCC computing resources granted under a proposal submitted in February 2016, with the resource allocation period spanning July 2016 through June 2017. The award allocation was 10.7 million processor-hours at the National Energy Research Scientific Computing Center. The simulations performed were in support of two projects: the Atmosphere to Electrons (A2e) project, supported by the DOE EERE office; and the Exascale Computing Project (ECP), supported by the DOE Office of Science. The project team for both efforts consists of staff scientists and postdocs from Sandia National Laboratories and the National Renewable Energy Laboratory. At the heart of these projects is the open-source computational-fluid-dynamics (CFD) code, Nalu. Nalu solves the low-Mach-number Navier-Stokes equations using an unstructured-grid discretization. Nalu leverages the open-source Trilinos solver library and the Sierra Toolkit (STK) for parallelization and I/O. This report documents baseline computational performance of the Nalu code on problems of direct relevance to the wind plant physics application - namely, Large Eddy Simulation (LES) of an atmospheric boundary layer (ABL) flow and wall-modeled LES of a flow past a static wind turbine rotor blade. Parallel performance of Nalu and its constituent solver routines residing in the Trilinos library has been assessed previously under various campaigns. However, both Nalu and Trilinos have been, and remain, in active development, and resources have not been available previously to rigorously track code performance over time. With the initiation of the ECP, it is important to establish and document baseline code performance on the problems of interest. This will allow the project team to identify and target any deficiencies in performance, as well as highlight any performance bottlenecks as we exercise the code on a greater variety of platforms and at larger scales. The current study is rather modest in scale, examining performance on problem sizes of O(100 million) elements and core counts up to 8k cores. This will be expanded as more computational resources become available to the projects.
A study of computer graphics technology in application of communication resource management
NASA Astrophysics Data System (ADS)
Li, Jing; Zhou, Liang; Yang, Fei
2017-08-01
With the development of computer technology, computer graphics has come into wide use. In particular, the success of object-oriented and multimedia technologies has promoted the development of graphics technology in computer software systems, making computer graphics theory and its applications an important topic in computing and extending graphics technology into many fields of application. In recent years, with economic growth and especially the rapid development of information technology, traditional approaches to communication resource management can no longer meet management needs. Communication resource management still relies on legacy tools and methods for managing and maintaining equipment, which causes many problems: it is difficult for non-specialists to understand the equipment and the overall state of the resources, resource utilization is relatively low, and managers cannot quickly and accurately assess resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. Doing so not only makes communication resource management more intuitive, but also reduces management costs and improves work efficiency.
ERIC Educational Resources Information Center
England, Lenore; Fu, Li
2011-01-01
A critical part of electronic resources management, the electronic resources evaluation process is multi-faceted and includes a seemingly endless range of resources and tools involving numerous library staff. A solution is to build a Web site to bring all of the components together that can be implemented quickly and result in an organizational…
Design and implementation of intelligent electronic warfare decision making algorithm
NASA Astrophysics Data System (ADS)
Peng, Hsin-Hsien; Chen, Chang-Kuo; Hsueh, Chi-Shun
2017-05-01
The density of electromagnetic signals and the requirement for timely response have grown rapidly in modern electronic warfare. Although jammers are a limited resource, it is possible to achieve the best electronic warfare efficiency through tactical decision making. This paper proposes an intelligent electronic warfare decision support system. In this work, we develop a novel hybrid algorithm, Digital Pheromone Particle Swarm Optimization, based on Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and the Shuffled Frog Leaping Algorithm (SFLA). We use PSO to solve the problem and incorporate the pheromone concept from ACO to accumulate useful information during the spatial search and speed up convergence to the optimal solution. The proposed algorithm finds the optimal solution in reasonable computation time by using the matrix-conversion method from SFLA. The results indicate that jammer allocation is more effective. The system based on the hybrid algorithm provides electronic warfare commanders with critical information to assist them in effectively managing the complex electromagnetic battlefield.
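The following sketch illustrates the general idea of biasing a PSO velocity update with an accumulated pheromone term, loosely in the spirit of the hybrid described above; the objective function, coefficients, and overall structure are placeholders, not the authors' algorithm.

```python
# Illustrative only: PSO with an extra "digital pheromone" attraction term.
import random

def objective(x):
    # Placeholder cost: pretend lower means a better jammer-to-threat assignment.
    return sum((xi - 3.0) ** 2 for xi in x)

def pso_with_pheromone(dim=4, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5, c3=0.5):
    pos = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=objective)
    pheromone = gbest[:]                 # accumulated estimate of the "good region"
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2, r3 = random.random(), random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d])
                             + c3 * r3 * (pheromone[d] - pos[i][d]))  # pheromone pull
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=objective)
        # Evaporate and deposit: the pheromone drifts toward the current global best.
        pheromone = [0.9 * ph + 0.1 * gb for ph, gb in zip(pheromone, gbest)]
    return gbest, objective(gbest)

if __name__ == "__main__":
    best, cost = pso_with_pheromone()
    print(best, cost)
```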
Browsing the Real World using Organic Electronics, Si-Chips, and a Human Touch.
Berggren, Magnus; Simon, Daniel T; Nilsson, David; Dyreklev, Peter; Norberg, Petronella; Nordlinder, Staffan; Ersman, Peter Andersson; Gustafsson, Göran; Wikner, J Jacob; Hederén, Jan; Hentzell, Hans
2016-03-09
Organic electronics have been developed according to an orthodox doctrine advocating "all-printed", "all-organic", and "ultra-low-cost", primarily targeting various e-paper applications. In order to harvest the great opportunities afforded by organic electronics potentially operating as communication and sensor outposts within existing and future complex communication infrastructures, high-quality computing and communication protocols must be integrated with the organic electronics. Here, we debate and scrutinize the twinning of the signal-processing capability of traditional integrated silicon chips with organic electronics and sensors, and the use of our body as a natural local network with our bare hand as the browser of the physical world. The resulting platform provides a body network, i.e., a personalized web, composed of e-label sensors, bioelectronics, and mobile devices that together make it possible to monitor and record both our ambience and health-status parameters, supported by the ubiquitous mobile network and the resources of the "cloud". © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi
2008-01-01
Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from their respective heritage cost model predictions. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments has recently increased by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument electronics data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. Furthermore, the electronics box for new complex instruments must be developed for single-digit-watt power consumption, small size, and light weight, while delivering super-computing capabilities. The conflict between the actual development cost of newer complex instruments and the heritage cost model predictions for their electronics components seems to be irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining the complexity parameters, a complexity index, and their use in an enhanced cost model.
A resource management architecture based on complex network theory in cloud computing federation
NASA Astrophysics Data System (ADS)
Zhang, Zehua; Zhang, Xuejie
2011-10-01
Cloud Computing Federation is a main trend of Cloud Computing. Resource management has a significant effect on the design, realization, and efficiency of a Cloud Computing Federation. A Cloud Computing Federation has the typical characteristics of a complex system; therefore, in this paper we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC), with a detailed design of the resource discovery and resource announcement mechanisms. Compared with existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers to evolve the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance, and adaptive ability. The result of the model experiment confirmed the advantage of RMABC in resource discovery performance.
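As a rough illustration of discovery over a network of Task Managers (not the RMABC implementation itself), the sketch below walks the network breadth-first and expands historically successful neighbors first; all class and attribute names are hypothetical.

```python
# Minimal sketch of resource discovery across a graph of Task Managers,
# preferring neighbors that answered successfully in the past.
from collections import deque

class TaskManager:
    def __init__(self, name, resources=None):
        self.name = name
        self.resources = set(resources or [])   # resource types this node offers
        self.neighbors = []                      # connected Task Managers
        self.success = {}                        # neighbor name -> past hit count

def discover(start, resource, max_hops=3):
    """Breadth-first search, expanding historically successful neighbors first."""
    visited = {start.name}
    queue = deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if resource in node.resources:
            return node.name
        if hops == max_hops:
            continue
        ranked = sorted(node.neighbors,
                        key=lambda n: node.success.get(n.name, 0), reverse=True)
        for nb in ranked:
            if nb.name not in visited:
                visited.add(nb.name)
                queue.append((nb, hops + 1))
    return None

if __name__ == "__main__":
    a, b, c = TaskManager("A"), TaskManager("B"), TaskManager("C", {"gpu"})
    a.neighbors, b.neighbors = [b], [c]
    print(discover(a, "gpu"))   # -> "C"
```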
Cuenca-Alba, Jesús; Del Cano, Laura; Gómez Blanco, Josué; de la Rosa Trevín, José Miguel; Conesa Mingo, Pablo; Marabini, Roberto; S Sorzano, Carlos Oscar; Carazo, Jose María
2017-10-01
New instrumentation for cryo electron microscopy (cryoEM) has significantly increased data collection rates as well as data quality, creating bottlenecks at the image-processing level. The current image-processing model of moving acquired images from the data source (the electron microscope) to desktops or local clusters for processing is encountering many practical limitations. However, computing may also take place in distributed and decentralized environments; in this context, the cloud is a new form of accessing computing and storage resources on demand. Here, we evaluate how this new computational paradigm can be used effectively by extending our current integrative framework for image processing, creating ScipionCloud. This new development has resulted in a full installation of Scipion in both public and private clouds, accessible as public "images" with all the required cryoEM software preinstalled and requiring only a Web browser to access all graphical user interfaces. We have profiled the performance of different configurations on Amazon Web Services and the European Federated Cloud, always on architectures incorporating GPUs, and compared them with a local facility. We have also analyzed the economic convenience of different scenarios, so that cryoEM scientists have a clearer picture of the setup best suited to their needs and budgets. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Mineral resource of the month: cultured quartz crystal
,
2008-01-01
The article presents information on cultured quartz crystals, a mineral used in mobile phones, computers, clocks and other devices controlled by digital circuits. Cultured quartz, which is synthetically produced in large pressurized vessels known as autoclaves, is useful in electronic circuits for precise filtration, frequency control and timing for consumer and military use. Several ingredients are used in producing cultured quartz, including seed crystals, lascas, a solution of sodium hydroxide or sodium carbonate, lithium salts and deionized water.
A resource-sharing model based on a repeated game in fog computing.
Sun, Yan; Zhang, Nan
2017-03-01
With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing is proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.
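A minimal sketch of the repeated-game intuition, with made-up reputation dynamics rather than the paper's actual incentive mechanism: owners who contribute resources and complete their assigned tasks accumulate reputation, and future rewards are split in proportion to it.

```python
# Hedged illustration of a repeated-game style incentive for resource sharing.
def update_reputation(rep, contributed, completed, decay=0.9):
    """One round of the repeated game for a single resource owner."""
    payoff = 1.0 if (contributed and completed) else 0.0
    return decay * rep + (1.0 - decay) * payoff

def reward_shares(reputations, pool=100.0):
    """Split the reward pool proportionally to reputation (cooperators gain more)."""
    total = sum(reputations.values()) or 1.0
    return {owner: pool * r / total for owner, r in reputations.items()}

if __name__ == "__main__":
    reps = {"owner_a": 0.5, "owner_b": 0.5}
    for _ in range(5):                      # owner_a cooperates, owner_b defects
        reps["owner_a"] = update_reputation(reps["owner_a"], True, True)
        reps["owner_b"] = update_reputation(reps["owner_b"], True, False)
    print(reward_shares(reps))
```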
Software for Building Models of 3D Objects via the Internet
NASA Technical Reports Server (NTRS)
Schramer, Tim; Jensen, Jeff
2003-01-01
The Virtual EDF Builder (where EDF signifies Electronic Development Fixture) is a computer program that facilitates the use of the Internet for building and displaying digital models of three-dimensional (3D) objects that ordinarily comprise assemblies of solid models created previously by use of computer-aided-design (CAD) programs. The Virtual EDF Builder resides on a Unix-based server computer. It is used in conjunction with a commercially available Web-based plug-in viewer program that runs on a client computer. The Virtual EDF Builder acts as a translator between the viewer program and a database stored on the server. The translation function includes the provision of uniform resource locator (URL) links to other Web-based computer systems and databases. The Virtual EDF builder can be used in two ways: (1) If the client computer is Unix-based, then it can assemble a model locally; the computational load is transferred from the server to the client computer. (2) Alternatively, the server can be made to build the model, in which case the server bears the computational load and the results are downloaded to the client computer or workstation upon completion.
NASA Astrophysics Data System (ADS)
2001-07-01
Good teaching isn't a hardware problem. Stuart Robertson, a physics teacher by training, now works to ensure that teachers are fully trained to use Information and Communication Technology (ICT) and that all Scottish students leave school competent with the basics of using computers. He addressed the Stirling meeting of physics teachers at the end of May. So, how do governments measure progress with ICT? They measure the numbers of schools with full internet access, the proportion of teachers with e-mail, the numbers of computers in classrooms and so on. One of England's most successful state schools (by exam results) boasts 26 interactive whiteboards, and in the UK there seems to be a feeling that lots of hardware = good school. Teaching isn't that simple. We don't need expensive research to know that just using a computer won't necessarily make teaching better. Robertson knows this and advises: don't be driven by technology—be driven by what you can do with it. Good teaching has always been about using the resources at hand, and it still is. Our aim at Physics Education is to support the teaching of physics by reviewing and discussing new teaching tools—hardware and software (see Reviews). That's not to say that we must all be using expensive electronic boxes of tricks to reinforce every concept. We don't need computers to teach physics. I really doubt that my teachers, back in the 1970s, would have taught me much more physics if we had had computers in our lab. In this issue of Physics Education we have examples of some very straightforward demonstrations and experiments—with no computer involvement whatsoever. But we also have some computer-interfaced activities and some computer-based investigations. We recognize that some institutions have an erratic electricity supply and few, if any, computers. Others are being driven to use as much electronic gadgetry as possible, following the mistaken assumption that this is, in itself, educationally better. Other schools and colleges are exploring electronic learning through the internet and virtual labs (see Steve Mellema's use of IT in his Lecture for the 21st Century). We aim to provide useful material for everybody at and in between the extremes. But some words of caution, sounded by Robertson in Stirling: today we might find that our classes are motivated and interested when we use computers, but how long will the excitement last? If every lesson faces children with computer screens, will they soon get bored and demotivated? Individual learning, through worksheets, was a great success when it was developed in the 70s, but when every lesson faced a child with yet another worksheet, students were turned off. It became known as 'death by a thousand worksheets'. Let's not abuse computers in the same way. Physics for the beach and the igloo: physics is about being cool, as we are always trying to tell our students! In this 'summer' issue we have two papers which allow us to demonstrate this practically. I should also like to remind readers that Physics Education is available online (www.iop.org/Journals/pe) in addition to the paper version. The electronic version has the advantages of hotlinks to websites, search facilities and the ability to download teaching materials. My guess is that we haven't begun to explore the possibilities of the electronic journal as a teaching resource for teachers. If it can be stored electronically, we can include it as a multimedia clip: pictures, worksheets, spreadsheets, videos, sounds...
But there are also many advantages of paper—convenience and permanence being just two. Physics Education is a worthwhile publication and it feels like that in your hand. Having a journal like this, to put in my bag, or stack on my bookshelf, still feels good to me. And judging by readers' comments, you agree. IOPP will, no doubt, support both formats for a long time to come. So, this summer, enjoy the format you are reading, read Physics Education on screen or on the beach, reflect on your teaching, your students' learning and remind yourself that physics really can be cool. Editor: Kerry Parker
NASA Astrophysics Data System (ADS)
Falkner, Katrina; Vivian, Rebecca
2015-10-01
To support teachers in implementing Computer Science curricula in classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources, and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Sarah Nicole
Polymer foam encapsulants provide mechanical, electrical, and thermal isolation in engineered systems. It can be advantageous to surround objects of interest, such as electronics, with foams in a hermetically sealed container to protect the electronics from hostile environments, such as a crash that produces a fire. However, in fire environments, gas pressure from thermal decomposition of foams can cause mechanical failure of the sealed system. In this work, a detailed study of thermally decomposing polymeric methylene diisocyanate (PMDI)-polyether-polyol based polyurethane foam in a sealed container is presented. Both experimental and computational work is discussed. Three models of increasing physics fidelity are presented: No Flow, Porous Media, and Porous Media with VLE. Each model is described in detail, compared to experiment, and uncertainty quantification is performed. While the Porous Media with VLE model has the best agreement with experiment, it also requires the most computational resources.
Zhang, Zhuhua; Liu, Xiaofei; Yu, Jin; Hang, Yang; Li, Yao; Guo, Yufeng; Xu, Ying; Sun, Xu; Zhou, Jianxin
2016-01-01
Low‐dimensional materials exhibit many exceptional properties and functionalities which can be efficiently tuned by externally applied forces or fields. Here we review the current status of research on tuning the electronic and magnetic properties of low‐dimensional carbon, boron nitride, metal dichalcogenide, and phosphorene nanomaterials by applied engineering strain, external electric field, interaction with substrates, etc., with particular focus on the progress of computational methods and studies. We highlight the similarities and differences of the property modulation among one‐ and two‐dimensional nanomaterials. Recent breakthroughs in experimental demonstration of the tunable functionalities in typical nanostructures are also presented. Finally, prospects and challenges for applying the tunable properties in functional devices are discussed. WIREs Comput Mol Sci 2016, 6:324–350. doi: 10.1002/wcms.1251 For further resources related to this article, please visit the WIREs website. Conflict of interest: The authors have declared no conflicts of interest for this article. PMID:27818710
TomoMiner and TomoMinerCloud: A software platform for large-scale subtomogram structural analysis
Frazier, Zachary; Xu, Min; Alber, Frank
2017-01-01
Cryo-electron tomography (cryoET) captures the 3D electron density distribution of macromolecular complexes in a close-to-native state. With the rapid advance of cryoET acquisition technologies, it is possible to generate large numbers (>100,000) of subtomograms, each containing a macromolecular complex. Often, these subtomograms represent a heterogeneous sample due to variations in the structure and composition of a complex in its in situ form, or because particles are a mixture of different complexes. In this case, subtomograms must be classified. However, classification of large numbers of subtomograms is a time-intensive task and often a limiting bottleneck. This paper introduces an open source software platform, TomoMiner, for large-scale subtomogram classification, template matching, subtomogram averaging, and alignment. Its scalable and robust parallel processing allows efficient classification of tens to hundreds of thousands of subtomograms. Additionally, TomoMiner provides a pre-configured TomoMinerCloud computing service permitting users without sufficient computing resources instant access to TomoMiner's high-performance features. PMID:28552576
Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W
2008-05-28
The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu.
Bhargava, Puneet; Dhand, Sabeen; Lackey, Amanda E; Pandey, Tarun; Moshiri, Mariam; Jambhekar, Kedar
2013-03-01
The increasing use of smartphones and handheld computers has been accompanied by rapid growth in related industries. Electronic books have revolutionized the centuries-old markets for conventional books and magazines and have simplified publishing by reducing the cost and processing time required to create and distribute any given book. We are now able to read, review, store, and share various types of documents via several electronic tools, many of which are available free of charge. Additionally, this electronic revolution has produced an explosion of readily available Internet-based educational resources for residents and has paved the way for educators to reach a larger and more diverse student population. Published by Elsevier Inc.
Strongly Correlated Electron Systems: An Operatorial Perspective
NASA Astrophysics Data System (ADS)
Di Ciolo, Andrea; Avella, Adolfo
2018-05-01
We discuss the operatorial approach to the study of strongly correlated electron systems and show how the exact solution of target models on small clusters chosen ad hoc (minimal models) can suggest very efficient bulk approximations. We use the Hubbard model as a case study (target model) and analyze and discuss the crucial role of spin fluctuations in its 2-site realization (minimal model). Accordingly, we devise a novel three-pole approximation for the 2D case, including in the basic field an operator describing the dressing of the electronic one by nearest-neighbor spin fluctuations. Such a solution is in very good agreement with the exact one in the minimal model (2-site case) and performs very well when compared to advanced (semi-)numerical methods in the 2D case, while being far less demanding of computational resources.
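For reference, the 2-site Hubbard model mentioned above has the textbook form below; this is the standard Hamiltonian and its exact singlet ground-state energy at half filling, not the paper's three-pole construction.

```latex
% Two-site Hubbard Hamiltonian (hopping t, on-site repulsion U)
H = -t\sum_{\sigma}\left(c^{\dagger}_{1\sigma}c_{2\sigma}
      + c^{\dagger}_{2\sigma}c_{1\sigma}\right)
    + U\sum_{i=1}^{2} n_{i\uparrow}n_{i\downarrow},
\qquad
% exact singlet ground-state energy for two electrons (half filling)
E_{0} = \frac{U-\sqrt{U^{2}+16t^{2}}}{2}
```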
Use of Electronic Resources for Psychiatry Clerkship Learning: A Medical Student Survey.
Snow, Caitlin E; Torous, John; Gordon-Elliott, Janna S; Penzner, Julie B; Meyer, Fermonta; Boland, Robert
2017-10-01
The primary aim of this study is to examine medical students' use patterns, preferences, and perceptions of electronic educational resources available for psychiatry clerkship learning. Eligible participants included medical students who had completed the psychiatry clerkship during a 24-month period. An internet-based questionnaire was used to collect information regarding the outcomes described above. A total of 68 medical students responded to the survey. Most respondents reported high utilization of electronic resources on an array of devices for psychiatry clerkship learning and indicated a preference for electronic over print resources. The most commonly endorsed barriers to the use of electronic resources were that the source contained irrelevant and non-specific content, access was associated with a financial cost, and faculty guidance on recommended resources was insufficient. Respondents indicated a wish for more psychiatry-specific electronic learning resources. The authors' results suggest that a demand exists for high-quality electronic and portable learning tools that are relevant to medical student education in psychiatry. Psychiatry educators are usefully positioned to be involved in the development of such resources.
NASA Technical Reports Server (NTRS)
Kemeny, Sabrina E.
1994-01-01
Electronic and optoelectronic hardware implementations of highly parallel computing architectures address several ill-defined and/or computation-intensive problems not easily solved by conventional computing techniques. The concurrent processing architectures developed are derived from a variety of advanced computing paradigms including neural network models, fuzzy logic, and cellular automata. Hardware implementation technologies range from state-of-the-art digital/analog custom-VLSI to advanced optoelectronic devices such as computer-generated holograms and e-beam fabricated Dammann gratings. JPL's concurrent processing devices group has developed a broad technology base in hardware-implementable parallel algorithms, low-power and high-speed VLSI designs and building-block VLSI chips, leading to application-specific high-performance embeddable processors. Application areas include high-throughput map-data classification using feedforward neural networks, a terrain-based tactical movement planner using cellular automata, resource optimization (weapon-target assignment) using a multidimensional feedback network with lateral inhibition, and classification of rocks using an inner-product scheme on thematic mapper data. In addition to addressing specific functional needs of DOD and NASA, the JPL-developed concurrent processing device technology is also being customized for a variety of commercial applications (in collaboration with industrial partners), and is being transferred to U.S. industries. This viewgraph presentation focuses on two application-specific processors which solve the computation-intensive tasks of resource allocation (weapon-target assignment) and terrain-based tactical movement planning using two extremely different topologies. Resource allocation is implemented as an asynchronous analog competitive assignment architecture inspired by the Hopfield network. Hardware realization leads to a two-to-four-order-of-magnitude speed-up over conventional techniques and enables multiple (many-to-many) assignments not achievable with standard statistical approaches. Tactical movement planning (finding the best path from A to B) is accomplished with a digital two-dimensional concurrent processor array. By exploiting the natural parallel decomposition of the problem in silicon, a four-order-of-magnitude speed-up over optimized software approaches has been demonstrated.
Benchmarking information needs and use in the Tennessee public health community*
Lee, Patricia; Giuse, Nunzia B.; Sathe, Nila A.
2003-01-01
Objective: The objective is to provide insight to understanding public health officials' needs and promote access to data repositories and communication tools. Methods: Survey questions were identified by a focus group with members drawn from the fields of librarianship, public health, and informatics. The resulting comprehensive information needs survey, organized in five distinct broad categories, was distributed to 775 Tennessee public health workers from ninety-five counties in 1999 as part of the National Library of Medicine–funded Partners in Information Access contract. Results: The assessment pooled responses from 571 public health workers (73% return rate) representing seventy-two of ninety-five counties (53.4% urban and 46.6% rural) about their information-seeking behaviors, frequency of resources used, computer skills, and level of Internet access. Sixty-four percent of urban and 43% of rural respondents had email access at work and more than 50% of both urban and rural respondents had email at home (N = 289). Approximately 70% of urban and 78% of rural public health officials never or seldom used or needed the Centers for Disease Control (CDC) Website. Frequency data pooled from eleven job categories representing a subgroup of 232 health care professionals showed 72% never or seldom used or needed MEDLINE. Electronic resources used daily or weekly were email, Internet search engines, internal databases and mailing lists, and the Tennessee Department of Health Website. Conclusions: While, due to the small sample size, data cannot be generalized to the larger population, a clear trend of significant barriers to computer and Internet access can be identified across the public health community. This contributes to an overall limited use of existing electronic resources that inhibits evidence-based practice. PMID:12883562
Feltham, R K
1995-01-01
Open tendering for medical informatics systems in the UK has traditionally been lengthy and, therefore, resource-intensive for vendor and purchaser alike. Events in the United Kingdom (UK) and European Community (EC) have led to new Government guidance being published on procuring information systems for the public sector: Procurement of Information Systems Effectively (POISE). This innovative procurement process, launched in 1993, has the support of the Computing Services Association (CSA) and the Federation of the Electronics Industry (FEI). This paper gives an overview of these new UK guidelines on healthcare information system purchasing in the context of a recent procurement project with an NHS Trust Hospital. The aim of the project was to replace three aging, separate, and different laboratory computer systems with a new, integrated turnkey system offering all department modules, an open, modern computing environment, and on-line electronic links to key departmental systems, both within and external to the Trust, by the end of 1994. The new system had to complement the Trust's strategy for providing a modern clinical laboratory service to the local population and meet a tight budget.
Networking Biology: The Origins of Sequence-Sharing Practices in Genomics.
Stevens, Hallam
2015-10-01
The wide sharing of biological data, especially nucleotide sequences, is now considered to be a key feature of genomics. Historians and sociologists have attempted to account for the rise of this sharing by pointing to precedents in model organism communities and in natural history. This article supplements these approaches by examining the role that electronic networking technologies played in generating the specific forms of sharing that emerged in genomics. The links between early computer users at the Stanford Artificial Intelligence Laboratory in the 1960s, biologists using local computer networks in the 1970s, and GenBank in the 1980s, show how networking technologies carried particular practices of communication, circulation, and data distribution from computing into biology. In particular, networking practices helped to transform sequences themselves into objects that had value as a community resource.
Dynamic Transportation Navigation
NASA Astrophysics Data System (ADS)
Meng, Xiaofeng; Chen, Jidong
Miniaturization of computing devices, and advances in wireless communication and sensor technology are some of the forces that are propagating computing from the stationary desktop to the mobile outdoors. Some important classes of new applications that will be enabled by this revolutionary development include intelligent traffic management, location-based services, tourist services, mobile electronic commerce, and digital battlefield. Some existing application classes that will benefit from the development include transportation and air traffic control, weather forecasting, emergency response, mobile resource management, and mobile workforce. Location management, i.e., the management of transient location information, is an enabling technology for all these applications. In this chapter, we present the applications of moving objects management and their functionalities, in particular, the application of dynamic traffic navigation, which is a challenge due to the highly variable traffic state and the requirement of fast, on-line computations.
Looking at Earth from space: Direct readout from environmental satellites
NASA Technical Reports Server (NTRS)
1994-01-01
Direct readout is the capability to acquire information directly from meteorological satellites. Data can be acquired from NASA-developed, National Oceanic and Atmospheric Administration (NOAA)-operated satellites, as well as from other nations' meteorological satellites. By setting up a personal computer-based ground (Earth) station to receive satellite signals, direct readout may be obtained. The electronic satellite signals are displayed as images on the computer screen. The images can display gradients of the Earth's topography and temperature, cloud formations, the flow and direction of winds and water currents, the formation of hurricanes, the occurrence of an eclipse, and a view of Earth's geography. Both visible and infrared images can be obtained. This booklet introduces the satellite systems, ground station configuration, and computer requirements involved in direct readout. Also included are lists of associated resources and vendors.
Provider-Independent Use of the Cloud
NASA Astrophysics Data System (ADS)
Harmer, Terence; Wright, Peter; Cunningham, Christina; Perrott, Ron
Utility computing offers researchers and businesses the potential of significant cost-savings, making it possible for them to match the cost of their computing and storage to their demand for such resources. A utility compute provider enables the purchase of compute infrastructures on demand; when a user requires computing resources, a provider will provision a resource for them and charge them only for their period of use of that resource. There has been a significant growth in the number of cloud computing resource providers, and each has a different resource usage model, application process and application programming interface (API); developing generic multi-resource-provider applications is thus difficult and time-consuming. We have developed an abstraction layer that provides a single resource usage model, user authentication model and API for compute providers that enables cloud-provider-neutral applications to be developed. In this paper we outline the issues in using external resource providers, give examples of using a number of the most popular cloud providers and provide examples of developing provider-neutral applications. In addition, we discuss the development of the API to create a generic provisioning model based on a common architecture for cloud computing providers.
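A hypothetical sketch of what such an abstraction layer can look like; the class and method names below are illustrative and are not the authors' actual API.

```python
# Sketch of a provider-neutral compute abstraction with one stand-in backend.
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """Single resource-usage model every concrete cloud provider must implement."""

    @abstractmethod
    def authenticate(self, credentials: dict) -> None: ...

    @abstractmethod
    def provision(self, cpu: int, memory_gb: int) -> str:
        """Return an opaque resource handle."""

    @abstractmethod
    def terminate(self, handle: str) -> None: ...


class InMemoryTestProvider(ComputeProvider):
    """Stand-in backend used to show that applications only see the common API."""
    def __init__(self):
        self._next_id, self._resources = 0, {}

    def authenticate(self, credentials):
        self._token = credentials.get("token", "anonymous")  # stored, not checked

    def provision(self, cpu, memory_gb):
        handle = f"res-{self._next_id}"
        self._resources[handle] = (cpu, memory_gb)
        self._next_id += 1
        return handle

    def terminate(self, handle):
        self._resources.pop(handle, None)


def run_job(provider: ComputeProvider):
    provider.authenticate({"token": "demo"})
    handle = provider.provision(cpu=4, memory_gb=8)
    # ... dispatch work to the provisioned resource, billed only while held ...
    provider.terminate(handle)

run_job(InMemoryTestProvider())
```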
Assessing Ongoing Electronic Resource Purchases: Linking Tools to Synchronize Staff Workflows
ERIC Educational Resources Information Center
Carroll, Jeffrey D.; Major, Colleen; O'Neal, Nada; Tofanelli, John
2012-01-01
Ongoing electronic resource purchases represent a substantial proportion of collections budgets. Recognizing the necessity of systematic ongoing assessment with full selector engagement, Columbia University Libraries appointed an Electronic Resources Assessment Working Group to promote the inclusion of such resources within our current culture of…
Navigating 3D electron microscopy maps with EM-SURFER.
Esquivel-Rodríguez, Juan; Xiong, Yi; Han, Xusi; Guang, Shuomeng; Christoffer, Charles; Kihara, Daisuke
2015-05-30
The Electron Microscopy DataBank (EMDB) is growing rapidly, accumulating biological structural data obtained mainly by electron microscopy and tomography, which are emerging techniques for determining large biomolecular complex and subcellular structures. Together with the Protein Data Bank (PDB), EMDB is becoming a fundamental resource of the tertiary structures of biological macromolecules. To take full advantage of this indispensable resource, the ability to search the database by structural similarity is essential. However, unlike high-resolution structures stored in PDB, methods for comparing low-resolution electron microscopy (EM) density maps in EMDB are not well established. We developed a computational method for efficiently searching low-resolution EM maps. The method uses a compact fingerprint representation of EM maps based on the 3D Zernike descriptor, which is derived from a mathematical series expansion for EM maps that are considered as 3D functions. The method is implemented in a web server named EM-SURFER, which allows users to search against the entire EMDB in real-time. EM-SURFER compares the global shapes of EM maps. Examples of search results from different types of query structures are discussed. We developed EM-SURFER, which retrieves structurally relevant matches for query EM maps from EMDB within seconds. The unique capability of EM-SURFER to detect 3D shape similarity of low-resolution EM maps should prove invaluable in structural biology.
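Once each map is reduced to a fixed-length fingerprint, database search reduces to nearest-neighbour ranking of descriptor vectors. The sketch below stands in any numeric vector for the 3D Zernike descriptor and assumes a Euclidean distance; the EMDB identifiers and vectors shown are placeholders, not EM-SURFER's internals.

```python
# Hedged sketch of fingerprint-based ranking of EM maps.
import math

def distance(desc_a, desc_b):
    """Euclidean distance between two fixed-length descriptor vectors (assumption)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(desc_a, desc_b)))

def rank_database(query_desc, database, top_k=5):
    """database: dict of EMDB id -> descriptor vector; returns the closest entries."""
    scored = sorted(database.items(), key=lambda kv: distance(query_desc, kv[1]))
    return scored[:top_k]

if __name__ == "__main__":
    db = {"EMD-0001": [0.1, 0.9, 0.3], "EMD-0002": [0.8, 0.2, 0.4]}  # placeholders
    print(rank_database([0.15, 0.85, 0.35], db, top_k=1))
```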
dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livny, Miron
This report introduces publications describing the results of a project that aimed to design a computational framework that enables computational experimentation at scale while supporting the model of "submit locally, compute globally". The project focuses on estimating application resource needs, finding the appropriate computing resources, acquiring those resources, deploying the applications and data on the resources, and managing applications and resources during the run.
TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling
NASA Astrophysics Data System (ADS)
Nelson, J.; Jones, N.; Ames, D. P.
2015-12-01
Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leverage these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, that have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage the open source, computing-resource management, and job management software, HTCondor, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
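CondorPy's actual interface is not reproduced here; as a generic illustration of handing a modeling run to HTCondor programmatically, the sketch below writes a standard submit description and invokes condor_submit. It assumes a working HTCondor pool, and the file names and arguments are placeholders.

```python
# Generic sketch (not CondorPy): queue a model run through HTCondor.
import subprocess
from pathlib import Path

def submit_model_run(executable, arguments, workdir="job0"):
    Path(workdir).mkdir(exist_ok=True)
    submit_file = Path(workdir) / "job.sub"
    # Standard HTCondor submit-description syntax.
    submit_file.write_text(
        f"executable = {executable}\n"
        f"arguments  = {arguments}\n"
        "output     = job.out\n"
        "error      = job.err\n"
        "log        = job.log\n"
        "queue\n"
    )
    # Requires an HTCondor pool; raises CalledProcessError if submission fails.
    subprocess.run(["condor_submit", submit_file.name], cwd=workdir, check=True)

# Example (hypothetical model script and flag):
# submit_model_run("./run_hydro_model.sh", "--scenario baseline")
```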
Dodd, Gerald D; Naeger, David M
2013-05-01
The "new online" (Web 2.0) world is evolving rapidly, and the digital information, education, and networking resources available to radiologists have exploded over the past 2 decades. The 2012 Intersociety Committee Summer Conference attendees explored the online resources that have been produced by societies, universities, and commercial entities. Specific attention was given to identifying the best products and packaging them in tablet computers for use by residents and practicing radiologists. The key functions of social networking websites and the possible roles they can play in radiology were explored as well. It was the consensus of the attendees that radiologic digital resources and portable electronic devices have matured to the point that they should become an integral part of our educational programs and clinical practice. Copyright © 2013 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Mineral resource of the month: copper
,
2011-01-01
The article provides information on copper and its various uses. It was the first metal used by humans and is considered as one of the materials that played an important role in the development of civilization. It is a major industrial metal because of its low cost, availability, electrical conductivity, high ductility and thermal conductivity. Copper has long been used in the circuitry of electronics and the distribution of electricity and is now being used in silicon-based computer chips, solar and wind power generation, and coinage.
NASA aeronautics R&T - A resource for aircraft design
NASA Technical Reports Server (NTRS)
Olstad, W. B.
1981-01-01
This paper discusses the NASA aeronautics research and technology program from the viewpoint of the aircraft designer. The program spans the range from fundamental research to the joint validation with industry of technology for application into product development. Examples of recent developments in structures, materials, aerodynamics, controls, propulsion systems, and safety technology are presented as new additions to the designer's handbook. Finally, the major thrusts of NASA's current and planned programs which are keyed to revolutionary advances in materials science, electronics, and computer technology are addressed.
The AMTEX Partnership{trademark}. First quarter report, Fiscal year 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-12-01
The AMTEX Partnership is a collaborative research and development program among the US Integrated Textile Industry, DOE, the National Laboratories, other federal agencies and laboratories, and universities. The goal of AMTEX is to strengthen the competitiveness of this vital industry, thereby preserving and creating US jobs. Topics in this quarter's report include: computer-aided fabric evaluation, cotton biotechnology, demand activated manufacturing architecture, electronic embedded fingerprints, on-line process control in flexible fiber manufacturing, rapid cutting, sensors for agile manufacturing, and textile resource conservation.
Supporting research sites in resource-limited settings: Challenges in implementing IT infrastructure
Whalen, Christopher; Donnell, Deborah; Tartakovsky, Michael
2014-01-01
As Information and Communication Technology infrastructure becomes more reliable, new methods of Electronic Data Capture (EDC), data marts/data warehouses, and mobile computing provide platforms for rapid coordination of international research projects and multisite studies. However, despite the increasing availability of internet connectivity and communication systems in remote regions of the world, there are still significant obstacles. Sites with poor infrastructure face serious challenges participating in modern clinical and basic research, particularly that relying on EDC and internet communication technologies. This report discusses our experiences in supporting research in resource-limited settings (RLS). We describe examples of the practical and ethical/regulatory challenges raised by use of these newer technologies for data collection in multisite clinical studies. PMID:24321986
NASA Technical Reports Server (NTRS)
Frazier, Donald O.
2000-01-01
Technically, the field of integrated optics using organic/polymer materials as a new means of information processing, has emerged as of vital importance to optical computers, optical switching, optical communications, the defense industry, etc. The goal is to replace conventional electronic integrated circuits and wires by equivalent miniaturized optical integrated circuits and fibers, offering larger bandwidths, more compactness and reliability, immunity to electromagnetic interference and less cost. From the Code E perspective, this research area represents an opportunity to marry "front-line" education in science and technology with national scientific and technological interests while maximizing human resources utilization. This can be achieved by the development of untapped resources for scientific research - such as minorities, women, and universities traditionally uninvolved in scientific research.
Design of on-board parallel computer on nano-satellite
NASA Astrophysics Data System (ADS)
You, Zheng; Tian, Hexiang; Yu, Shijie; Meng, Li
2007-11-01
This paper presents a scheme for an on-board parallel computer system designed for a nano-satellite. Based on the development requirement that a nano-satellite have small volume, low weight, low power consumption, and on-board intelligence, the scheme departs from the traditional single-computer and dual-computer systems in an effort to improve dependability, capability, and intelligence simultaneously. Following an integrated design approach, it employs a shared-memory parallel computer system as the main structure; connects the telemetry, attitude control, and payload systems over an intelligent bus; designs management functions that handle static tasks and dynamic task scheduling and that protect and recover on-site status in light of the parallel algorithms; and establishes mechanisms for fault diagnosis, recovery, and system reconfiguration. The result is an on-board parallel computer system with high dependability, capability, and intelligence, flexible management of hardware resources, a sound software system, and good extensibility, consistent with the concept and trend of integrated electronic design.
NASA Astrophysics Data System (ADS)
Kun, Luis G.
1994-12-01
On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computer and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called `Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.
Electronic Resource Management and Design
ERIC Educational Resources Information Center
Abrams, Kimberly R.
2015-01-01
We have now reached a tipping point at which electronic resources comprise more than half of academic library budgets. Because of the increasing work associated with the ever-increasing number of e-resources, there is a trend to distribute work throughout the library even in the presence of an electronic resources department. In 2013, the author…
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand" as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
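The conclusion above (own the baseline, rent the bursts) can be illustrated with a back-of-the-envelope comparison. The sketch below is a minimal example in Python with made-up prices, node counts, and lifetimes; none of the figures or names come from the CMS/EC2 study.

```python
# Toy cost comparison between dedicated worker nodes and on-demand cloud
# instances for a workload with a steady baseline plus occasional bursts.
# All prices and utilization figures below are illustrative assumptions.

HOURS_PER_YEAR = 8760

def dedicated_cost(nodes, capex_per_node, lifetime_years, opex_per_node_year):
    """Annualized cost of owning `nodes` machines (straight-line depreciation)."""
    return nodes * (capex_per_node / lifetime_years + opex_per_node_year)

def cloud_cost(node_hours, price_per_node_hour):
    """Pay-as-you-go cost for the node-hours actually consumed."""
    return node_hours * price_per_node_hour

if __name__ == "__main__":
    base_nodes = 100               # nodes kept busy essentially all year
    burst_node_hours = 50_000      # extra node-hours needed only during spikes

    owned = dedicated_cost(base_nodes, capex_per_node=3000,
                           lifetime_years=4, opex_per_node_year=300)
    rented_base = cloud_cost(base_nodes * HOURS_PER_YEAR, price_per_node_hour=0.10)
    rented_burst = cloud_cost(burst_node_hours, price_per_node_hour=0.10)

    print(f"baseline on dedicated hardware: ${owned:,.0f}/year")
    print(f"baseline on cloud instances:    ${rented_base:,.0f}/year")
    print(f"bursts on cloud instances:      ${rented_burst:,.0f}/year")
```

With numbers of this kind the dedicated option wins for the always-busy baseline, while renting only the burst hours stays cheap, which is the qualitative trade-off the abstract describes.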
NASA Technical Reports Server (NTRS)
1998-01-01
Table of Contents: Federal Cleanup Programs; Federal Site Remediation Technology Development Assistance Programs; Federal Site Remediation Technology Development Electronic Data Bases; Federal Electronic Resources for Site Remediation Technology Information; Other Electronic Resources for Site Remediation Technology Information; Selected Bibliography: Federal Publication on Alternative and Innovative Site Remediation; and Appendix: Technology Program Contacts.
Statistics Online Computational Resource for Education
ERIC Educational Resources Information Center
Dinov, Ivo D.; Christou, Nicolas
2009-01-01
The Statistics Online Computational Resource (http://www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials--instructional resources and computational libraries. (Contains 2 figures.)
An Architecture for Cross-Cloud System Management
NASA Astrophysics Data System (ADS)
Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad
The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
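A homogeneous management layer of the kind described can be pictured as a thin adapter interface over provider-specific APIs. The sketch below is a schematic Python illustration, not the authors' architecture; all class and method names are invented for the example, and real providers would be wrapped via their own SDKs.

```python
# Minimal sketch of a uniform management interface over heterogeneous
# cloud provider interfaces. Provider classes and method names are
# hypothetical placeholders.
from abc import ABC, abstractmethod


class ComputeProvider(ABC):
    """Uniform interface the management layer programs against."""

    @abstractmethod
    def start_instance(self, image: str, size: str) -> str: ...

    @abstractmethod
    def stop_instance(self, instance_id: str) -> None: ...


class ProviderAAdapter(ComputeProvider):
    """Adapter translating the uniform calls into one provider's native API."""

    def start_instance(self, image, size):
        # provider-specific request/response handling would go here
        return "i-provider-a-0001"

    def stop_instance(self, instance_id):
        pass


class ProviderBAdapter(ComputeProvider):
    """Adapter for a second provider with a different native interface."""

    def start_instance(self, image, size):
        return "vm-42"

    def stop_instance(self, instance_id):
        pass


def burst(providers: list, image: str, size: str, n: int):
    """Round-robin n instances across providers through the uniform interface."""
    return [providers[i % len(providers)].start_instance(image, size)
            for i in range(n)]


print(burst([ProviderAAdapter(), ProviderBAdapter()], "base-image", "small", 4))
```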
Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious
ERIC Educational Resources Information Center
Cirasella, Jill
2009-01-01
This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…
Performance optimization of Qbox and WEST on Intel Knights Landing
NASA Astrophysics Data System (ADS)
Zheng, Huihuo; Knight, Christopher; Galli, Giulia; Govoni, Marco; Gygi, Francois
We present the optimization of the electronic structure codes Qbox and WEST targeting the Intel® Xeon Phi™ processor, codenamed Knights Landing (KNL). Qbox is an ab-initio molecular dynamics code based on plane wave density functional theory (DFT) and WEST is a post-DFT code for excited state calculations within many-body perturbation theory. Both Qbox and WEST employ highly scalable algorithms which enable accurate large-scale electronic structure calculations on leadership class supercomputer platforms beyond 100,000 cores, such as Mira and Theta at the Argonne Leadership Computing Facility. In this work, features of the KNL architecture (e.g. hierarchical memory) are explored to achieve higher performance in key algorithms of the Qbox and WEST codes and to develop a road-map for further development targeting next-generation computing architectures. In particular, the optimizations of the Qbox and WEST codes on the KNL platform will target efficient large-scale electronic structure calculations of nanostructured materials exhibiting complex structures and prediction of their electronic and thermal properties for use in solar and thermal energy conversion devices. This work was supported by MICCoM, as part of the Comp. Mats. Sci. Program funded by the U.S. DOE, Office of Sci., BES, MSE Division. This research used resources of the ALCF, which is a DOE Office of Sci. User Facility under Contract DE-AC02-06CH11357.
Do GPs use electronic mental health resources? - a qualitative study.
Austin, David; Pier, Ciaran; Mitchell, Joanna; Schattner, Peter; Wade, Victoria; Pierce, David; Klein, Britt
2006-05-01
The Better Outcomes in Mental Health Care (BOMHC) initiative encourages general practitioners to use electronic mental health resources (EMHRs) during consultation with patients requiring psychological assistance. However, there is little data on GPs' acceptance and use of EMHRs. Semistructured interviews were conducted with 27 GPs to determine their attitude toward EMHRs, and their use during consultation with patients. Few GPs reported frequently using EMHRs in consultation. Identified barriers to use included lack of familiarity with information technology, and insufficient knowledge of available resources. Identified advantages of electronic resources included high patient acceptance, time efficiency, and improved quality of information. General practitioners recognise several advantages of utilising electronic resources for managing patients with mental illness. However, GPs are not sufficiently familiar with electronic resources to use them effectively. This could be overcome by education.
NASA Technical Reports Server (NTRS)
Blake, Jean A.
1987-01-01
Spacelink is an electronic information service to be operated by the Marshall Space Flight Center. It will provide NASA news and educational resources including software programs that can be accessed by anyone with a computer and modem. Spacelink is currently being installed and will soon begin service. It will provide daily updates of NASA programs, information about NASA educational services, manned space flight, unmanned space flight, aeronautics, NASA itself, lesson plans and activities, and space program spinoffs. Lesson plans and activities were extracted from existing NASA publications on aerospace activities for the elementary school. These materials were arranged into 206 documents which have been entered into the Spacelink program for use in grades K-6.
Making sense of the electronic resource marketplace: trends in health-related electronic resources.
Blansit, B D; Connor, E
1999-01-01
Changes in the practice of medicine and technological developments offer librarians unprecedented opportunities to select and organize electronic resources, use the Web to deliver content throughout the organization, and improve knowledge at the point of need. The confusing array of available products, access routes, and pricing plans makes it difficult to anticipate the needs of users, identify the top resources, budget effectively, make sound collection management decisions, and organize the resources effectively and seamlessly. The electronic resource marketplace requires much vigilance, considerable patience, and continuous evaluation. There are several strategies that librarians can employ to stay ahead of the electronic resource curve, including taking advantage of free trials from publishers; marketing free trials and involving users in evaluating new products; watching and testing products marketed to the clientele; agreeing to beta test new products and services; working with aggregators or republishers; joining vendor advisory boards; benchmarking institutional resources against five to eight competitors; and forming or joining a consortium for group negotiating and purchasing. This article provides a brief snapshot of leading biomedical resources; showcases several libraries that have excelled in identifying, acquiring, and organizing electronic resources; and discusses strategies and trends of potential interest to biomedical librarians, especially those working in hospital settings. PMID:10427421
Electronic tools for infectious diseases and microbiology
Burdette, Steven D
2007-01-01
Electronic tools for infectious diseases and medical microbiology have the ability to change the way the diagnosis and treatment of infectious diseases are approached. Medical information today has the ability to be dynamic, keeping up with the latest research or clinical issues, instead of being static and years behind, as many textbooks are. The ability to rapidly disseminate information around the world opens up the possibility of communicating with people thousands of miles away to quickly and efficiently learn about emerging infections. Electronic tools have expanded beyond the desktop computer and the Internet, and now include personal digital assistants and other portable devices such as cellular phones. These pocket-sized devices have the ability to provide access to clinical information at the point of care. New electronic tools include e-mail listservs, electronic drug databases and search engines that allow focused clinical questions. The goal of the present article is to provide an overview of how electronic tools can impact infectious diseases and microbiology, while providing links and resources to allow users to maximize their efficiency in accessing this information. Links to the mentioned Web sites and programs are provided along with other useful electronic tools. PMID:18978984
Towards an ab initio theory for metal L-edge soft X-ray spectroscopy of molecular aggregates.
Preuße, Marie; Bokarev, Sergey I; Aziz, Saadullah G; Kühn, Oliver
2016-11-01
The Frenkel exciton model was adapted to describe X-ray absorption and resonant inelastic scattering spectra of polynuclear transition metal complexes by means of the restricted active space self-consistent field method. The proposed approach substantially decreases the demands on computational resources compared to a full supermolecular quantum chemical treatment. This holds true, in particular, in cases where the dipole approximation to the electronic transition charge density can be applied. The computational protocol was applied to the calculation of X-ray spectra of the hemin complex, which forms dimers in aqueous solution. The aggregation effects were found to be comparable to the spectral alterations due to the replacement of the axial ligand by solvent molecules.
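As a minimal illustration of the Frenkel exciton idea (local excitation energies coupled by inter-monomer interactions), the sketch below diagonalizes a two-site exciton Hamiltonian with NumPy. The energies and coupling are arbitrary placeholders; the sketch is unrelated to the RASSCF protocol or the hemin system of the paper.

```python
# Toy Frenkel-exciton Hamiltonian for a dimer: two local excitation
# energies coupled by an excitonic coupling J. Values are arbitrary.
import numpy as np

e1, e2 = 2.10, 2.15   # site excitation energies (eV), placeholders
J = 0.05              # excitonic coupling (eV), placeholder

H = np.array([[e1, J],
              [J, e2]])

energies, states = np.linalg.eigh(H)
print("exciton energies (eV):", energies)
print("lowest exciton state coefficients:", states[:, 0])
```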
Analysis of Human Resources Management Strategy in China Electronic Commerce Enterprises
NASA Astrophysics Data System (ADS)
Shao, Fang
The paper discusses the influence of electronic commerce on enterprise human resources management, and proposes and substantiates the human resources management strategies that electronic commerce enterprises should adopt, from recruitment and training strategies to talent-retention strategies and other measures.
Physics and Robotic Sensing -- the good, the bad, and approaches to making it work
NASA Astrophysics Data System (ADS)
Huff, Brian
2011-03-01
All of the technological advances that have benefited consumer electronics have direct application to robotics. Technological advances have resulted in the dramatic reduction in size, cost, and weight of computing systems, while simultaneously doubling computational speed every eighteen months. The same manufacturing advancements that have enabled this rapid increase in computational power are now being leveraged to produce small, powerful and cost-effective sensing technologies applicable for use in mobile robotics applications. Despite the increase in computing and sensing resources available to today's robotic systems developers, there are sensing problems typically found in unstructured environments that continue to frustrate the widespread use of robotics and unmanned systems. This talk presents how physics has contributed to the creation of the technologies that are making modern robotics possible. The talk discusses theoretical approaches to robotic sensing that appear to suffer when they are deployed in the real world. Finally the author presents methods being used to make robotic sensing more robust.
Imaged Document Optical Correlation and Conversion System (IDOCCS)
NASA Astrophysics Data System (ADS)
Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.
1999-03-01
Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). In addition, many organizations are converting their paper archives to electronic images, which are stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources. The Imaged Document Optical Correlation and Conversion System (IDOCCS) provides a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval capability of document images. The IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and can even determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo, or documents with a particular individual's signature block, can be singled out. With this dual capability, IDOCCS outperforms systems that rely on optical character recognition as a basis for indexing and storing only the textual content of documents for later retrieval.
Distance education through the Internet: the GNA-VSNS biocomputing course.
de la Vega, F M; Giegerich, R; Fuellen, G
1996-01-01
A prototype course on biocomputing was delivered via international computer networks in early summer 1995. The course lasted 11 weeks, and was offered free of charge. It was organized by the BioComputing Division of the Virtual School of Natural Sciences, which is a member school of the Globewide Network Academy. It brought together 34 students and 7 instructors from all over the world, and covered the basics of sequence analysis. Five authors from Germany and USA prepared a hypertext book which was discussed in weekly study sessions that took place in a virtual classroom at the BioMOO electronic conferencing system. The course aimed at students with backgrounds in molecular biology, biomedicine or computer science, complementing and extending their skills with an interdisciplinary curriculum. Special emphasis was placed on the use of Internet resources, and the development of new teaching tools. The hypertext book includes direct links to sequence analysis and databank search services on the Internet. A tool for the interactive visualization of unit-cost pairwise sequence alignment was developed for the course. All course material will stay accessible at the World Wide Web address (Uniform Resource Locator) http://www.techfak.uni-bielefeld.de/bcd/welcome.html. This paper describes the aims and organization of the course, and gives a preliminary account of this novel experience in distance education.
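The "unit-cost pairwise sequence alignment" mentioned above is the classic edit-distance dynamic program (cost 1 for each insertion, deletion, or mismatch). The sketch below is a generic textbook implementation in Python, not the course's visualization tool.

```python
# Unit-cost pairwise sequence alignment (edit distance) by dynamic programming.
def edit_distance(a: str, b: str) -> int:
    m, n = len(a), len(b)
    # dp[i][j] = minimal unit cost to align a[:i] with b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i          # i deletions
    for j in range(n + 1):
        dp[0][j] = j          # j insertions
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + sub)  # match/mismatch
    return dp[m][n]

print(edit_distance("GATTACA", "GCATGCA"))  # -> 3
```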
Implementing CORAL: An Electronic Resource Management System
ERIC Educational Resources Information Center
Whitfield, Sharon
2011-01-01
A 2010 electronic resource management survey conducted by Maria Collins of North Carolina State University and Jill E. Grogg of University of Alabama Libraries found that the top six electronic resources management priorities included workflow management, communications management, license management, statistics management, administrative…
Barone, Vincenzo; Biczysko, Malgorzata; Borkowska-Panek, Monika; Bloino, Julien
2014-10-20
The subtle interplay of several different effects means that the interpretation and analysis of experimental spectra in terms of structural and dynamic characteristics is a challenging task. In this context, theoretical studies can be helpful, and as such, computational spectroscopy is rapidly evolving from a highly specialized research field toward a versatile and widespread tool. However, in the case of electronic spectra (e.g. UV/Vis, circular dichroism, photoelectron, and X-ray spectra), the most commonly used methods still rely on the computation of vertical excitation energies, which are further convolved to simulate line shapes. Such treatment completely neglects the influence of nuclear motions, despite the well-recognized notion that a proper account of vibronic effects is often mandatory to correctly interpret experimental findings. Development and validation of improved models rooted in density functional theory (DFT) and its time-dependent extension (TD-DFT) is of course instrumental for the optimal balance between reliability and favorable scaling with the number of electrons. However, the implementation of easy-to-use and effective procedures to simulate vibrationally resolved electronic spectra, and their availability to a wide community of users, is at least equally important for reliable simulations of spectral line shapes for compounds of biological and technological interest. Here, such an approach has been applied to the study of the UV/Vis spectra of chlorophyll a. The results show that properly tailored approaches are feasible for state-of-the-art computational spectroscopy studies, and allow, with affordable computational resources, vibrational and environmental effects on the spectral line shapes to be taken into account for large systems. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
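The conventional treatment contrasted above, vertical excitation energies broadened into a line shape, amounts to convolving a stick spectrum with (for example) Gaussians. The sketch below is a generic illustration with made-up excitation energies and oscillator strengths; it does not reproduce the vibronic procedure or the chlorophyll a results of the paper.

```python
# Broaden a stick spectrum (vertical excitations + oscillator strengths)
# with Gaussians, i.e. the "conventional" line-shape simulation the text
# contrasts with vibronically resolved spectra. Values are placeholders.
import numpy as np

excitations = [(1.9, 0.25), (2.3, 0.05), (3.1, 0.40)]  # (energy eV, osc. strength)
sigma = 0.08                                            # broadening width (eV)

grid = np.linspace(1.5, 3.5, 500)
spectrum = np.zeros_like(grid)
for e, f in excitations:
    spectrum += f * np.exp(-((grid - e) ** 2) / (2 * sigma ** 2))

print("absorption maximum near", grid[spectrum.argmax()].round(2), "eV")
```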
Flexible services for the support of research.
Turilli, Matteo; Wallom, David; Williams, Chris; Gough, Steve; Curran, Neal; Tarrant, Richard; Bretherton, Dan; Powell, Andy; Johnson, Matt; Harmer, Terry; Wright, Peter; Gordon, John
2013-01-28
Cloud computing has been increasingly adopted by users and providers to promote a flexible, scalable and tailored access to computing resources. Nonetheless, the consolidation of this paradigm has uncovered some of its limitations. Initially devised by corporations with direct control over large amounts of computational resources, cloud computing is now being endorsed by organizations with limited resources or with a more articulated, less direct control over these resources. The challenge for these organizations is to leverage the benefits of cloud computing while dealing with limited and often widely distributed computing resources. This study focuses on the adoption of cloud computing by higher education institutions and addresses two main issues: flexible and on-demand access to a large amount of storage resources, and scalability across a heterogeneous set of cloud infrastructures. The proposed solutions leverage a federated approach to cloud resources in which users access multiple and largely independent cloud infrastructures through a highly customizable broker layer. This approach allows for a uniform authentication and authorization infrastructure, a fine-grained policy specification and the aggregation of accounting and monitoring. Within a loosely coupled federation of cloud infrastructures, users can access vast amounts of data without copying them across cloud infrastructures and can scale their resource provisions when the local cloud resources become insufficient.
Electronic Library: A TERI Experiment.
ERIC Educational Resources Information Center
Kar, Debal C.; Deb, Subrata; Kumar, Satish
2003-01-01
Discusses the development of Electronic Library at TERI (The Energy and Resources Institute, New Delhi). Highlights include: hardware and software used; the digital library/Virtual Electronic Library; directory of Internet journals; virtual reference resources; electronic collection/Physical Electronic Library; downloaded online full-length…
Resource Aware Intelligent Network Services (RAINS) Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehman, Tom; Yang, Xi
The Resource Aware Intelligent Network Services (RAINS) project conducted research and developed technologies in the area of cyber infrastructure resource modeling and computation. The goal of this work was to provide a foundation to enable intelligent, software defined services which span the network and the resources which connect to the network. A Multi-Resource Service Plane (MRSP) was defined, which allows resource owners/managers to locate and place themselves from a topology and service availability perspective within the dynamic networked cyberinfrastructure ecosystem. The MRSP enables the presentation of integrated topology views and computation results which can include resources across the spectrum of compute, storage, and networks. The MRSP developed by the RAINS project includes the following key components: i) Multi-Resource Service (MRS) Ontology/Multi-Resource Markup Language (MRML), ii) Resource Computation Engine (RCE), iii) Modular Driver Framework (to allow integration of a variety of external resources). The MRS/MRML is a general and extensible modeling framework that allows resource owners to model, or describe, a wide variety of resource types. All resources are described using three categories of elements: Resources, Services, and Relationships between the elements. This modeling framework defines a common method for the transformation of cyber infrastructure resources into data in the form of MRML models. In order to realize this infrastructure datification, the RAINS project developed a model-based computation system, the RAINS Computation Engine (RCE). The RCE has the ability to ingest, process, integrate, and compute based on automatically generated MRML models. The RCE interacts with the resources through system drivers which are specific to the type of external network or resource controller. The RAINS project developed a modular and pluggable driver system which allows a variety of resource controllers to automatically generate, maintain, and distribute MRML-based resource descriptions. Once all of the resource topologies are absorbed by the RCE, a connected graph of the full distributed system topology is constructed, which forms the basis for computation and workflow processing. The RCE includes a Modular Computation Element (MCE) framework which allows tailoring of the computation process to the specific set of resources under control and the services desired. The input and output of an MCE are both model data based on the MRS/MRML ontology and schema. RAINS project accomplishments include: development of a general and extensible multi-resource modeling framework; design of the RCE with the ability to absorb a variety of multi-resource model types and build integrated models; a novel architecture which uses model-based communications across the full stack; flexible provision of abstract or intent-based user-facing interfaces; workflow processing based on model descriptions; release of the RCE as open source software; deployment of the RCE in the University of Maryland/Mid-Atlantic Crossroads ScienceDMZ in prototype mode, with a plan under way to transition to production; deployment at the Argonne National Laboratory DTN Facility in prototype mode; and selection of the RCE by the DOE SENSE (SDN for End-to-end Networked Science at the Exascale) project as the basis for their orchestration service.
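The MRS/MRML description above (Resources, Services, and Relationships assembled into a connected topology graph) can be pictured with a few lines of code. The sketch below is a schematic data model in Python, not the actual MRML ontology or RCE implementation; all class and field names are invented for illustration.

```python
# Schematic multi-resource model: Resources, Services, and Relationships
# combined into a connected graph, loosely in the spirit of MRS/MRML.
# Names and fields are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Resource:
    name: str
    kind: str                      # "compute", "storage", or "network"
    services: list = field(default_factory=list)


@dataclass
class Relationship:
    source: Resource
    target: Resource
    relation: str                  # e.g. "connectedTo", "hosts"


def build_graph(relationships):
    """Adjacency list over resource names, the input to later path/placement computation."""
    graph = {}
    for rel in relationships:
        graph.setdefault(rel.source.name, []).append((rel.target.name, rel.relation))
    return graph


cluster = Resource("hpc-cluster", "compute", services=["batch"])
dtn = Resource("dtn-node", "storage", services=["gridftp"])
net = Resource("sciencedmz-net", "network", services=["l2-circuit"])

graph = build_graph([
    Relationship(cluster, net, "connectedTo"),
    Relationship(dtn, net, "connectedTo"),
])
print(graph)
```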
Optimization of tomographic reconstruction workflows on geographically distributed resources
Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.
2016-01-01
New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, the focus is on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on experimented resources). Moreover, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks. PMID:27359149
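The three-stage performance model described above (transfer, queue, compute) can be written as a simple additive estimate used to pick the best remote resource. The sketch below is a minimal interpretation with invented parameter values, not the paper's calibrated models.

```python
# Additive estimate of end-to-end workflow time on a remote resource:
# data transfer time + expected queue wait + reconstruction compute time.
# Parameter values are illustrative, not measured.
def estimate_runtime(data_gb, bandwidth_gbps, queue_wait_s, work_tflop, rate_tflops):
    transfer_s = data_gb * 8 / bandwidth_gbps   # GB -> Gb, then divide by Gb/s
    compute_s = work_tflop / rate_tflops
    return transfer_s + queue_wait_s + compute_s

resources = {
    "cluster-A": dict(bandwidth_gbps=10, queue_wait_s=600, rate_tflops=50),
    "cluster-B": dict(bandwidth_gbps=40, queue_wait_s=1800, rate_tflops=200),
}

data_gb, work_tflop = 500, 90_000
best = min(resources,
           key=lambda r: estimate_runtime(data_gb, work_tflop=work_tflop,
                                          **resources[r]))
print("choose:", best)
```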
NASA Astrophysics Data System (ADS)
Sengupta, Abhronil; Roy, Kaushik
2017-12-01
Present day computers expend orders of magnitude more computational resources to perform various cognitive and perception related tasks that humans routinely perform every day. This has recently resulted in a seismic shift in the field of computation where research efforts are being directed to develop a neurocomputer that attempts to mimic the human brain by nanoelectronic components and thereby harness its efficiency in recognition problems. Bridging the gap between neuroscience and nanoelectronics, this paper attempts to provide a review of the recent developments in the field of spintronic device based neuromorphic computing. Description of various spin-transfer torque mechanisms that can be potentially utilized for realizing device structures mimicking neural and synaptic functionalities is provided. A cross-layer perspective extending from the device to the circuit and system level is presented to envision the design of an All-Spin neuromorphic processor enabled with on-chip learning functionalities. Device-circuit-algorithm co-simulation framework calibrated to experimental results suggest that such All-Spin neuromorphic systems can potentially achieve almost two orders of magnitude energy improvement in comparison to state-of-the-art CMOS implementations.
Cooperative business management strategies for the U.S. integrated textile complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Washington, K.E.
1995-12-31
The mission of the American Textile (AMTEX™) Partnership is to engage the unique technical resources of the Department of Energy National Laboratories to work with the US Integrated Textile Complex (US ITC) and research universities to develop and deploy technologies that will increase the competitiveness of the US ITC. The objectives of the Demand Activated Manufacturing Architecture (DAMA) project of AMTEX are: (1) to determine strategic business structure changes for the US ITC; (2) to establish a textile industry electronic marketplace; and (3) to provide methods for US ITC education and implementation of an electronic marketplace. The Enterprise Modeling and Simulation Task of DAMA is focusing on the first DAMA goal as described in another paper of this conference. The Cooperative Business Management (CBM) Task of DAMA is developing computer-based tools that will render system-wide information accessible for improved decision making. Three CBM strategies and the associated computer tools being developed to support their implementation are described in this paper. This effort is addressing the second DAMA goal to establish a textile industry electronic marketplace in concert with the Connectivity and Infrastructure Task of DAMA. As the CBM tools mature, they will be commercialized through the DAMA Education, Outreach and Commercialization Task of DAMA to achieve the third and final DAMA goal.
Electronic and Optical Properties of Borophene, a Two-dimensional Transparent Metal.
NASA Astrophysics Data System (ADS)
Adamska, Lyudmyla; Sadasivam, Sridhar; Darancet, Pierre; Sharifzadeh, Sahar
Borophene is a recently synthesized metallic sheet that displays many similarities to graphene and has been predicted to be complementary to graphene as a high-density-of-states, optically transparent 2D conductor. The atomic arrangement of boron in the monolayer strongly depends on the growth substrate and significantly alters the optoelectronic properties. Here, we report a first-principles density functional theory and many-body perturbation theory study aimed at understanding the optoelectronic properties of two likely allotropes of monolayer boron that are consistent with experimental scanning tunneling microscopy images. We predict that, although both allotropes are metallic, they have substantially different band structures and optical properties, with one structure being transparent up to 3 eV and the second weakly absorbing in the UV/Vis region. We demonstrate that this strong structure dependence of the optoelectronic properties persists under applied strain. Lastly, we discuss the strength of electron-phonon and electron-hole interactions within these materials. Overall, we determine that precise control of the growth conditions is necessary for controlled optical properties. This research used resources of the Argonne Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357, and the Texas Advanced Computing Center (TACC) at The University of Texas at Austin.
Enabling opportunistic resources for CMS Computing Operations
Hufnagel, Dirk
2015-12-23
With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.
Interoperability of GADU in using heterogeneous Grid resources for bioinformatics applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sulakhe, D.; Rodriguez, A.; Wilde, M.
2008-03-01
Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The genome analysis and database update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing the Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated scalable system such as GADU that can run jobs on different Grids. The paper describes the resource-independent configuration of GADU using the Pegasus-based virtual data system that makes high-throughput computational tools interoperable on heterogeneous Grid resources. The paper also highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper will not go into the details of problems involved or the lessons learned in using individual Grid resources as it has already been published in our paper on the genome analysis research environment (GNARE) and will focus primarily on the architecture that makes GADU resource independent and interoperable across heterogeneous Grid resources.
dos-Santos, M; Fujino, A
2012-01-01
Radiology teaching usually employs a systematic and comprehensive set of medical images and related information. Databases with representative radiological images and documents are highly desirable and widely used in Radiology teaching programs. Currently, computer-based teaching file systems are widely used in Medicine and Radiology teaching as an educational resource. This work addresses a user-centered radiology electronic teaching file system as an instance of a MIRC-compliant medical image database. As in a digital library, the clinical cases can be accessed using a web browser. The system has offered some Radiology residents great opportunities to interact with experts. This has been achieved by applying user-centered techniques and creating usage-context-based tools in order to make an interactive system available.
Development of Electronic Resources across Networks in Thailand.
ERIC Educational Resources Information Center
Ratchatavorn, Phandao
2002-01-01
Discusses the development of electronic resources across library networks in Thailand to meet user needs, particularly electronic journals. Topics include concerns about journal access; limited budgets for library acquisitions of journals; and sharing resources through a centralized database system that allows Web access to journals via Internet…
ERIC Educational Resources Information Center
Murray, Adam
2008-01-01
Designed to assist with the management of e-resources, electronic resource management (ERM) systems are time- and fund-consuming to purchase and maintain. Questions of system compatibility, data population, and workflow design/redesign can be difficult to answer; sometimes those answers are not what we'd prefer to hear. The two primary functions…
Using Mosix for Wide-Area Computational Resources
Maddox, Brian G.
2004-01-01
One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.
Lee, Tian-Fu
2014-12-01
Telecare medicine information systems provide a communicating platform for accessing remote medical resources through public networks, and help health care workers and medical personnel to rapidly make correct clinical decisions and deliver treatments. An authentication scheme for data exchange in telecare medicine information systems enables legal users in hospitals and medical institutes to establish a secure channel and exchange electronic medical records or electronic health records securely and efficiently. This investigation develops an efficient and secure verified-based three-party authentication scheme by using extended chaotic maps for data exchange in telecare medicine information systems. The proposed scheme does not require the server's public keys and avoids the time-consuming modular exponential computations and scalar multiplications on elliptic curves used in previous related approaches. Additionally, the proposed scheme is proven secure in the random oracle model, and realizes the lower bounds of messages and rounds in communications. Compared to related verified-based approaches, the proposed scheme not only possesses higher security, but also has lower computational cost and fewer transmissions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
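Extended chaotic maps of the kind mentioned above are typically built on Chebyshev polynomials, whose semigroup property T_r(T_s(x)) = T_rs(x) = T_s(T_r(x)) is what enables Diffie-Hellman-style key agreement. The toy below only demonstrates that property over floating-point numbers; it is not the paper's three-party scheme, and a real protocol would operate over a finite field with far larger parameters.

```python
# Toy demonstration of the Chebyshev-polynomial semigroup property that
# chaotic-map key agreement builds on: T_r(T_s(x)) == T_s(T_r(x)).
# Floating-point, small exponents: illustrative only, not secure.
import math

def chebyshev(n: int, x: float) -> float:
    """T_n(x) = cos(n * arccos(x)) for x in [-1, 1]."""
    return math.cos(n * math.acos(x))

x = 0.53          # public parameter
r, s = 7, 11      # the two parties' secret integers

public_r = chebyshev(r, x)          # value sent by party A
public_s = chebyshev(s, x)          # value sent by party B

key_a = chebyshev(r, public_s)      # A combines its secret with B's value
key_b = chebyshev(s, public_r)      # B combines its secret with A's value

assert math.isclose(key_a, key_b)
print("shared value:", round(key_a, 6))
```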
NASA Astrophysics Data System (ADS)
Lima, Filipe C. D. A.; Iost, Rodrigo M.; Crespilho, Frank N.; Caldas, Marília J.; Calzolari, Arrigo; Petrilli, Helena M.
2013-03-01
We report an investigation of the electron tunneling mechanism of peptide ferrocenyl-glycylcystamine self-assembled monolayers (SAMs) on Au (111) electrode surfaces. Recent experimental investigations showed that electron transfer in peptides can occur across long distances separating the donor from the acceptor. This mechanism can be further fostered by the presence of electron-donor Fc terminal units on the SAMs, but the charge transfer mechanism is still not clear. We study the interaction of the peptide ferrocenyl-glycylcystamine with Au (111) from first-principles calculations to evaluate the electron transfer mechanism. For this purpose, we used the Kohn-Sham (KS) scheme of Density Functional Theory (DFT) as implemented in the Quantum-ESPRESSO suite of codes, using Vanderbilt ultrasoft pseudopotentials and the GGA-PBE exchange-correlation functional to evaluate the ground-state atomic and electronic structure of the system. Analysis of the KS orbitals at the Fermi energy showed high electronic density localized on the Fc molecules, with only a minor contribution from the solvent and counter ion. Based on these results, we infer evidence of an electron tunneling mechanism from the molecule to the Au (111). We acknowledge FAPESP for grant support. Also, LCCA/USP, RICE and CENAPAD for computational resources.
Contextuality as a Resource for Models of Quantum Computation with Qubits
NASA Astrophysics Data System (ADS)
Bermejo-Vega, Juan; Delfosse, Nicolas; Browne, Dan E.; Okay, Cihan; Raussendorf, Robert
2017-09-01
A central question in quantum computation is to identify the resources that are responsible for quantum speed-up. Quantum contextuality has been recently shown to be a resource for quantum computation with magic states for odd-prime dimensional qudits and two-dimensional systems with real wave functions. The phenomenon of state-independent contextuality poses a priori an obstruction to characterizing the case of regular qubits, the fundamental building block of quantum computation. Here, we establish contextuality of magic states as a necessary resource for a large class of quantum computation schemes on qubits. We illustrate our result with a concrete scheme related to measurement-based quantum computation.
Shaping the Electronic Library--The UW-Madison Approach.
ERIC Educational Resources Information Center
Dean, Charles W., Ed.; Frazier, Ken; Pope, Nolan F.; Gorman, Peter C.; Dentinger, Sue; Boston, Jeanne; Phillips, Hugh; Daggett, Steven C.; Lundquist, Mitch; McClung, Mark; Riley, Curran; Allan, Craig; Waugh, David
1998-01-01
This special theme section describes the University of Wisconsin-Madison's experience building its Electronic Library. Highlights include integrating resources and services; the administrative framework; the public electronic library, including electronic publishing capability and access to World Wide Web-based and other electronic resources;…
NASA Astrophysics Data System (ADS)
Sahu, H. K.; Singh, S. N.
2015-04-01
This paper discusses and presents a comparative case study of two libraries in Pune, India, Inter-University Centre for Astronomy and Astrophysics and Information Centre and Library of National Institute of Virology (Indian Council of Medical Research). It compares how both libraries have managed their e-resource collections, including acquisitions, subscriptions, and consortia arrangements, while also developing a collection of their own resources, including pre-prints and publications, video lectures, and other materials in an institutional repository. This study illustrates how difficult it is to manage electronic resources in a developing country like India, even though electronic resources are used more than print resources. Electronic resource management can be daunting, but with a systematic approach, various problems can be solved, and use of the materials will be enhanced.
Computing arrival times of firefighting resources for initial attack
Romain M. Mees
1978-01-01
Dispatching of firefighting resources requires instantaneous or precalculated decisions. A FORTRAN computer program has been developed that can provide a list of resources in order of computed arrival time for initial attack on a fire. The program requires an accurate description of the existing road system and a list of all resources available on a planning unit....
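The original program is FORTRAN; the sketch below restates the core idea in Python: compute each resource's travel time from its road distance and speed, then list resources in order of estimated arrival. Resource names, distances, delays, and speeds are invented for illustration.

```python
# Order firefighting resources by estimated arrival time at a fire.
# arrival = dispatch delay + road distance / travel speed. Data are made up.
resources = [
    # (name, dispatch delay in min, road distance in km, avg speed in km/h)
    ("Engine 3",  2.0, 12.0, 60.0),
    ("Crew 7",    5.0,  8.0, 45.0),
    ("Dozer 1",  10.0, 20.0, 35.0),
]

def arrival_time_min(delay_min, distance_km, speed_kmh):
    return delay_min + 60.0 * distance_km / speed_kmh

dispatch_list = sorted(
    ((name, arrival_time_min(d, dist, v)) for name, d, dist, v in resources),
    key=lambda item: item[1],
)
for name, eta in dispatch_list:
    print(f"{name:10s} ETA {eta:5.1f} min")
```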
Integrating all medical records to an enterprise viewer.
Li, Haomin; Duan, Huilong; Lu, Xudong; Zhao, Chenhui; An, Jiye
2005-01-01
The idea behind hospital information systems is to make all of a patient's medical reports, lab results, and images electronically available to clinicians, instantaneously, wherever they are. But the higgledy-piggledy evolution of most hospital computer systems makes it hard to integrate all these clinical records. Although several integration standards have been proposed to meet this challenge, none of them is well suited to Chinese hospitals. In this paper, we introduce our work implementing a three-tiered-architecture enterprise viewer at Huzhou Central Hospital to integrate all existing medical information systems using limited resources.
Spaceborne Hybrid-FPGA System for Processing FTIR Data
NASA Technical Reports Server (NTRS)
Bekker, Dmitriy; Blavier, Jean-Francois L.; Pingree, Paula J.; Lukowiak, Marcin; Shaaban, Muhammad
2008-01-01
Progress has been made in a continuing effort to develop a spaceborne computer system for processing readout data from a Fourier-transform infrared (FTIR) spectrometer to reduce the volume of data transmitted to Earth. The approach followed in this effort, oriented toward reducing design time and reducing the size and weight of the spectrometer electronics, has been to exploit the versatility of recently developed hybrid field-programmable gate arrays (FPGAs) to run diverse software on embedded processors while also taking advantage of the reconfigurable hardware resources of the FPGAs.
Electronic neural network for dynamic resource allocation
NASA Technical Reports Server (NTRS)
Thakoor, A. P.; Eberhardt, S. P.; Daud, T.
1991-01-01
A VLSI-implementable neural network architecture for dynamic assignment is presented. Resource allocation problems involve assigning members of one set (e.g. resources) to those of another (e.g. consumers) such that the global 'cost' of the associations is minimized. The network consists of a matrix of sigmoidal processing elements (neurons), where the rows of the matrix represent resources and columns represent consumers. Unlike previous neural implementations, however, association costs are applied directly to the neurons, reducing connectivity of the network to a VLSI-compatible O(number of neurons). Each row (and column) has an additional neuron associated with it to independently oversee activations of all the neurons in each row (and each column), providing a programmable 'k-winner-take-all' function. This function simultaneously enforces blocking (excitatory/inhibitory) constraints during convergence to control the number of active elements in each row and column within desired boundary conditions. Simulations show that the network, when implemented in fully parallel VLSI hardware, offers optimal (or near-optimal) solutions within only a fraction of a millisecond, for problems with up to 128 resources and 128 consumers, orders of magnitude faster than conventional computing or heuristic search methods.
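For comparison with the neural hardware described above, the same underlying problem (assigning resources to consumers so the total association cost is minimal) can be solved on a conventional computer with the Hungarian algorithm. The sketch below uses SciPy's linear_sum_assignment on a random cost matrix; it is a baseline formulation of the problem, not a simulation of the neural network.

```python
# Conventional-computing baseline for the dynamic resource-allocation problem:
# minimize the total cost of assigning resources (rows) to consumers (columns).
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
costs = rng.uniform(0.0, 1.0, size=(8, 8))   # cost of pairing resource i with consumer j

rows, cols = linear_sum_assignment(costs)     # Hungarian algorithm
for r, c in zip(rows, cols):
    print(f"resource {r} -> consumer {c} (cost {costs[r, c]:.2f})")
print("total cost:", costs[rows, cols].sum().round(3))
```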
ERIC Educational Resources Information Center
Falkner, Katrina; Vivian, Rebecca
2015-01-01
To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…
Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation
NASA Technical Reports Server (NTRS)
Stocker, John C.; Golomb, Andrew M.
2011-01-01
Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
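The abstract's two-part framework (a stochastic demand model plus a resource-constraint model) can be illustrated with a toy discrete event simulation. The sketch below is not the authors' model; the service types, rates, and single shared server pool are assumptions chosen only to show the structure of such a simulation.

```python
import heapq, random
from itertools import count

def simulate(num_servers, arrival_rate, service_means, horizon, seed=1):
    """Toy discrete event simulation: Poisson arrivals, a shared pool of
    num_servers, and per-type exponential service times (all assumed values)."""
    random.seed(seed)
    types, tie = list(service_means), count()
    events = [(random.expovariate(arrival_rate), next(tie), "arrival")]
    free, queue, waits = num_servers, [], []

    def start(now, arrived, req_kind):
        dur = random.expovariate(1.0 / service_means[req_kind])
        heapq.heappush(events, (now + dur, next(tie), "departure"))
        waits.append(now - arrived)

    while events:
        now, _, kind = heapq.heappop(events)
        if now > horizon:
            break
        if kind == "arrival":
            req_kind = random.choice(types)            # stochastic demand model
            if free > 0:
                free -= 1
                start(now, now, req_kind)
            else:
                queue.append((now, req_kind))          # resource constraint
            heapq.heappush(events, (now + random.expovariate(arrival_rate),
                                    next(tie), "arrival"))
        elif queue:                                    # departure, work waiting
            arrived, req_kind = queue.pop(0)
            start(now, arrived, req_kind)
        else:                                          # departure, idle server
            free += 1
    return sum(waits) / len(waits) if waits else 0.0

# Compare a small "dedicated" pool with a larger "cloud-like" pool, same demand.
means = {"web": 0.5, "batch": 4.0}
print(simulate(4, 2.0, means, 1000.0), simulate(16, 2.0, means, 1000.0))
```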
Bussmann, Hermann; Wester, C William; Ndwapi, Ndwapi; Vanderwarker, Chris; Gaolathe, Tendani; Tirelo, Geoffrey; Avalos, Ava; Moffat, Howard; Marlink, Richard G
2006-02-01
Individual patient care and programme evaluation are pivotal for the success of antiretroviral treatment programmes in resource-limited countries. While computer-aided documentation and data storage are indispensable for any large programme, several important issues need to be addressed including which data are to be collected, who collects it and how it is entered into an electronic database. We describe a patient-monitoring approach, which uses patient encounter forms (in hybrid paper + electronic format) based on optical character recognition, piloted at Princess Marina Hospital in Gaborone, Botswana's first public highly active antiretroviral therapy (HAART) outpatient clinic. Our novel data capture approach collects "key" data for tracking patient and programme outcomes. It saves physician time and does not detract from clinical care.
Information specialist for a coming age (12)
NASA Astrophysics Data System (ADS)
Iinuma, Mitsuo
Since we entered the advanced information society, information activities infiltrated into every aspect of our life such as economy and daily life. In this circumstances, business management is now going to change in its way and policy. Especially, globalization of business activities and shifting to service business have brought a new aspect into the information activities in the business, which has now become a fundamental activity in business management. The new technology of computer and telecommunication network played a key role, and brought electronic information, which was a new type of management information. The electronic information with intellectual property has become valuable as a new resources to be marketable, as well as by its usefulness as management information. Thus, businesses will have to change their policies concerning information from "managing information" to "managing by information."
Operating Dedicated Data Centers - Is It Cost-Effective?
NASA Astrophysics Data System (ADS)
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
Computing the Envelope for Stepwise-Constant Resource Allocations
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Clancy, Daniel (Technical Monitor)
2002-01-01
Computing tight resource-level bounds is a fundamental problem in the construction of flexible plans with resource utilization. In this paper we describe an efficient algorithm that builds a resource envelope, the tightest possible such bound. The algorithm is based on transforming the temporal network of resource-consuming and resource-producing events into a flow network whose nodes correspond to the events and whose edges correspond to the necessary predecessor links between events. A staged maximum-flow problem on the network is then used to compute the time of occurrence and the height of each step of the resource envelope profile. Each stage has the same computational complexity as solving a maximum-flow problem on the entire flow network. This makes the method computationally feasible and promising for use in the inner loop of flexible-time scheduling algorithms.
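The staged maximum-flow construction itself is too involved for a short example, but the object it bounds, a stepwise-constant resource profile obtained once event times are fixed, is easy to sketch. In the Python sketch below, the events and their production/consumption amounts are invented for illustration; the paper's envelope algorithm bounds all profiles reachable under the network's flexible event times.

```python
def resource_profile(events):
    """Stepwise-constant resource level for a fixed schedule of events.
    events: iterable of (time, delta) where delta > 0 produces resource
    and delta < 0 consumes it.  Returns [(time, level_after_time), ...]."""
    level, profile = 0, []
    for time, delta in sorted(events):
        level += delta
        if profile and profile[-1][0] == time:
            profile[-1] = (time, level)      # merge simultaneous events
        else:
            profile.append((time, level))
    return profile

# Hypothetical plan: battery charges (+) and instrument activities (-).
fixed_schedule = [(0, +3), (2, -1), (2, -1), (5, +2), (7, -3)]
print(resource_profile(fixed_schedule))
# [(0, 3), (2, 1), (5, 3), (7, 0)]
```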
Tablet computers in assessing performance in a high stakes exam: opinion matters.
Currie, G P; Sinha, S; Thomson, F; Cleland, J; Denison, A R
2017-06-01
Background Tablet computers have emerged as a tool to capture, process and store data in examinations, yet evidence relating to their acceptability and usefulness in assessment is limited. Methods We performed an observational study to explore opinions and attitudes relating to tablet computer use in recording performance in a final year objective structured clinical examination at a single UK medical school. Examiners completed a short questionnaire encompassing background, forced-choice and open questions. Forced choice questions were analysed using descriptive statistics and open questions by framework analysis. Results Ninety-two (97% response rate) examiners completed the questionnaire of whom 85% had previous use of tablet computers. Ninety per cent felt checklist mark allocation was 'very/quite easy', while approximately half considered recording 'free-type' comments was 'easy/very easy'. Greater overall efficiency of marking and resource savings were considered the main advantages of tablet computers, while concerns relating to technological failure and ability to record free type comments were raised. Discussion In a context where examiners were familiar with tablet computers, they were preferred to paper checklists, although concerns were raised. This study adds to the limited literature underpinning the use of electronic devices as acceptable tools in objective structured clinical examinations.
ERIC Educational Resources Information Center
Kachaluba, Sarah Buck; Brady, Jessica Evans; Critten, Jessica
2014-01-01
This article is based on quantitative and qualitative research examining humanities scholars' understandings of the advantages and disadvantages of print versus electronic information resources. It explores how humanities' faculty members at Florida State University (FSU) use print and electronic resources, as well as how they perceive these…
Using a Decision Grid Process to Build Consensus in Electronic Resources Cancellation Decisions
ERIC Educational Resources Information Center
Foudy, Gerri; McManus, Alesia
2005-01-01
Many libraries are expending an increasing part of their collections budgets on electronic resources. At the same time many libraries, especially those which are state funded, face diminishing budgets and high rates of inflation for serials subscriptions in all formats, including electronic resources. Therefore, many libraries need to develop ways…
Distributed Accounting on the Grid
NASA Technical Reports Server (NTRS)
Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.
2001-01-01
By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (high-performance computing) systems, and efforts were initiated to link the national infrastructure of high-performance computational and data storage resources into a general computational utility 'grid', analogous to the national electrical power grid. The purpose of the computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.
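As a toy illustration of the distributed-accounting idea (not the paper's actual protocol), each site can report usage records that are charged against a user's grid-wide allocation. The record fields and the charging rule below are assumptions.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class UsageRecord:
    """One job's resource usage as reported by a participating site (assumed fields)."""
    site: str
    user: str
    cpu_hours: float

def charge(allocations, records):
    """Aggregate usage per user across sites and subtract it from allocations."""
    used = defaultdict(float)
    for rec in records:
        used[rec.user] += rec.cpu_hours
    return {user: allocations[user] - used[user] for user in allocations}

grants = {"alice": 10000.0, "bob": 2500.0}          # grid-wide CPU-hour grants
reports = [UsageRecord("site_a", "alice", 420.0),   # records arrive from many sites
           UsageRecord("site_b", "alice", 55.5),
           UsageRecord("site_b", "bob", 900.0)]
print(charge(grants, reports))                      # remaining balance per user
```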
Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource: Intelligent Executive Computer Communication
Lyman, John; Conaway, Carla J.
1991-06-01
Interim report, University of California at Los Angeles. Published in Proceedings of the National Conference on Artificial Intelligence, pages 181-184, American Association for Artificial Intelligence, Pittsburgh.
Experience in using commercial clouds in CMS
NASA Astrophysics Data System (ADS)
Bauerdick, L.; Bockelman, B.; Dykstra, D.; Fuess, S.; Garzoglio, G.; Girone, M.; Gutsche, O.; Holzman, B.; Hufnagel, D.; Kim, H.; Kennedy, R.; Mason, D.; Spentzouris, P.; Timm, S.; Tiradani, A.; Vaandering, E.; CMS Collaboration
2017-10-01
Historically high energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of the WLCG resources are used for LHC computing and the resources are scheduled to be continuously used throughout the year. In the last several years there has been an explosion in capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest amongst the cloud providers to demonstrate the capability to perform large scale scientific computing. In this presentation we will discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS through a matching grant to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We will discuss the planning and technical challenges involved in organizing the most IO intensive CMS workflows on a large-scale set of virtualized resource provisioned by the Fermilab HEPCloud. We will describe the data handling and data management challenges. Also, we will discuss the economic issues and cost and operational efficiency comparison to our dedicated resources. At the end we will consider the changes in the working model of HEP computing in a domain with the availability of large scale resources scheduled at peak times.
The utilization of poisons information resources in Australasia.
Fountain, J S; Reith, D M; Holt, A
2014-02-01
The aim was to identify the poisons information resources most commonly utilized by Australasian Emergency Department staff, and to examine attitudes regarding the benefits and user experience of the electronic products used. A survey tool was mailed to six Emergency Departments each in New Zealand and Australia, to be answered by medical and nursing staff. Eighty-six (71.7%) responses were received from the 120 survey forms sent; 70 (81%) responders were medical staff, the remainder nursing. Electronic resources were the most accessed poisons information resource in New Zealand, whereas Australians preferred discussion with a colleague; Poisons Information Centers were the least utilized resource in both countries. With regard to electronic resources, further differences between the countries were found in ease of access, ease of use, quality of information, and quantity of information, with New Zealand rating better on all four themes. New Zealand ED staff favored electronic poisons information resources while Australians preferred discussion with a colleague. That Poisons Information Centers were the least utilized resource was surprising.
ERIC Educational Resources Information Center
Guelph Univ. (Ontario).
This 21-paper collection examines various issues in electronic networking and conferencing with computers, including design issues, conferencing in education, electronic messaging, computer conferencing applications, social issues of computer conferencing, and distributed computer conferencing. In addition to a keynote address, "Computer…
Checklist Manifesto for Electronic Resources: Getting Ready for the Fiscal Year and Beyond
ERIC Educational Resources Information Center
England, Lenore; Fu, Li; Miller, Stephen
2011-01-01
Organization of electronic resources workflow is critical in the increasingly complicated and complex world of library management. A simple organizational tool that can be readily applied to electronic resources management (ERM) is the use of checklists. Based on the principles discussed in The Checklist Manifesto: How to Get Things Right, the…
Supporting the Emergence of Dental Informatics with an Online Community
Spallek, H.; Irwin, J. Y.; Schleyer, T.; Butler, B. S.; Weiss, P. M.
2008-01-01
Dental Informatics (DI) is the application of computer and information science to improve dental practice, research, education, and program administration. As an emerging field, dental informatics faces many challenges and barriers to establishing itself as a full-fledged discipline; these include the small number of geographically dispersed DI researchers as well as the lack of DI professional societies and DI-specific journals. E-communities have the potential to overcome these obstacles by bringing researchers together at a resources hub and giving them the ability to share information, discuss topics, and find collaborators. In this paper, we discuss our assessment of the information needs of individuals interested in DI and discuss their expectations for an e-community so that we can design an optimal electronic infrastructure for the Dental Informatics Online Community (DIOC). The 256 survey respondents indicated they prefer electronic resources over traditional print material to satisfy their information needs. The most frequently expected benefits from participation in the DIOC were general information (85% of respondents), peer networking (31.1%), and identification of potential collaborators and/or research opportunities (23.2%). We are currently building the DIOC electronic infrastructure: a searchable publication archive and the learning center have been created, and the people directory is underway. Readers are encouraged to access the DIOC Website at www.dentalinformatics.com and initiate a discussion with the authors of this paper. PMID:18271498
Study on the application of mobile internet cloud computing platform
NASA Astrophysics Data System (ADS)
Gong, Songchun; Fu, Songyin; Chen, Zheng
2012-04-01
Advances in computer technology have promoted the adoption of cloud computing platforms, which in essence offer a resource-as-a-service model that can be adjusted along multiple dimensions to meet users' needs for different resources. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the underlying systems but also makes it easy for users to search, acquire, and process resources. With this in mind, the author takes the management of digital libraries as the focus of this paper and analyzes the key technologies of mobile internet cloud computing platforms in operation. The spread of computer technology has driven the creation of digital library models, whose core idea is to optimize the management of library resource information through computers and to build a high-performance query and search platform that allows users to access the information resources they need at any time. Cloud computing, in turn, distributes computation across a large number of networked computers, connecting multiple machines into a single service. Digital libraries, as a typical cloud computing application, therefore provide a useful setting in which to analyze the key technologies of cloud computing.
E-waste management and resources recovery in France.
Vadoudi, Kiyan; Kim, Junbeum; Laratte, Bertrand; Lee, Seung-Jin; Troussier, Nadège
2015-10-01
There are various issues of concern regarding electronic waste management, such as the toxicity of hazardous materials and the collection, recycling and recovery of useful resources. To understand the fate of electronic waste after collection and recycling, a product and material flow analysis should be performed. This is a critical need, as material resources are becoming increasingly scarce and recycling may be able to provide secondary sources for new materials in the future. In this study, we investigate electronic waste systems, specifically the resource recovery and recycling aspects, and map electronic waste flows based on collection data in France. Approximately 1,588,453 t of new electrical and electronic equipment were sold in the French market in 2010. Of this amount, 430,000 t of electronic waste were collected, with 1,128,444 t remaining in stock. The total recycled amounts were 354,106 t and 11,396 t, respectively. The main electronic waste materials were ferrous metals (37%), plastic (22%), aluminium (12%), copper (11%) and glass (7%). This study will contribute to developing sustainable electronic waste and resource recycling systems in France.
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
Computer-Based Resource Accounting Model for Automobile Technology Impact Assessment
DOT National Transportation Integrated Search
1976-10-01
A computer-implemented resource accounting model has been developed for assessing resource impacts of future automobile technology options. The resources tracked are materials, energy, capital, and labor. The model has been used in support of the Int...
Imaged document information location and extraction using an optical correlator
NASA Astrophysics Data System (ADS)
Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.
1999-12-01
Today, the paper document is fast becoming a thing of the past. With the rapid development of fast, inexpensive computing and storage devices, many government and private organizations are archiving their documents in electronic form (e.g., personnel records, medical records, patents, etc.). Many of these organizations are converting their paper archives to electronic images, which are then stored in a computer database. Because of this, there is a need to efficiently organize this data into comprehensive and accessible information resources and provide for rapid access to the information contained within these imaged documents. To meet this need, Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provide a means for the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives and has the potential to determine the types of languages contained within a document. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited, e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.
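Optical correlators perform the matching in hardware, but the underlying operation, cross-correlating a query template against a page image, has a familiar digital analogue. The NumPy sketch below shows only that analogue, with tiny synthetic "images" standing in for scanned pages; it is not the IDOCCS processing chain.

```python
import numpy as np

def correlate2d_fft(page, template):
    """Digital stand-in for an optical correlator: circular cross-correlation
    of a page image with a template via the FFT (both arrays same shape)."""
    Fp = np.fft.fft2(page)
    Ft = np.fft.fft2(template)
    return np.real(np.fft.ifft2(Fp * np.conj(Ft)))

# Synthetic 64x64 "page" with a bright 5x5 patch standing in for a word or logo.
rng = np.random.default_rng(0)
page = rng.normal(0.0, 0.1, (64, 64))
page[40:45, 20:25] += 1.0
template = np.zeros((64, 64))
template[0:5, 0:5] = 1.0                       # same patch, zero-padded

scores = correlate2d_fft(page, template)
peak = np.unravel_index(np.argmax(scores), scores.shape)
print(peak)                                    # expected near (40, 20)
```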
Electronic imaging of the human body
NASA Astrophysics Data System (ADS)
Vannier, Michael W.; Yates, Randall E.; Whitestone, Jennifer J.
1992-09-01
The Human Engineering Division of the Armstrong Laboratory (USAF); the Mallinckrodt Institute of Radiology; the Washington University School of Medicine; and the Lister-Hill National Center for Biomedical Communication, National Library of Medicine are sponsoring a working group on electronic imaging of the human body. Electronic imaging of the surface of the human body has been pursued and developed by a number of disciplines including radiology, forensics, surgery, engineering, medical education, and anthropometry. The applications range from reconstructive surgery to computer-aided design (CAD) of protective equipment. Although these areas appear unrelated, they have a great deal of commonality. All the organizations working in this area are faced with the challenges of collecting, reducing, and formatting the data in an efficient and standard manner; storing this data in a computerized database to make it readily accessible; and developing software applications that can visualize, manipulate, and analyze the data. This working group is being established to encourage effective use of the resources of all the various groups and disciplines involved in electronic imaging of the human body surface by providing a forum for discussing progress and challenges with these types of data.
18 CFR 390.1 - Electronic registration.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Electronic registration. 390.1 Section 390.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY PROCEDURAL RULES ELECTRONIC REGISTRATION § 390.1 Electronic registration. Any person who...
18 CFR 390.1 - Electronic registration.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Electronic registration. 390.1 Section 390.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY PROCEDURAL RULES ELECTRONIC REGISTRATION § 390.1 Electronic registration. Any person who...
18 CFR 390.1 - Electronic registration.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Electronic registration. 390.1 Section 390.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY PROCEDURAL RULES ELECTRONIC REGISTRATION § 390.1 Electronic registration. Any person who...
18 CFR 390.1 - Electronic registration.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Electronic registration. 390.1 Section 390.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY PROCEDURAL RULES ELECTRONIC REGISTRATION § 390.1 Electronic registration. Any person who...
18 CFR 390.1 - Electronic registration.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Electronic registration. 390.1 Section 390.1 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY PROCEDURAL RULES ELECTRONIC REGISTRATION § 390.1 Electronic registration. Any person who...
System Resource Allocations | High-Performance Computing | NREL
To use NREL's high-performance computing (HPC) resources, users request allocations of compute hours on NREL HPC systems, including Peregrine and Eagle, and of storage space (in terabytes) on Peregrine, Eagle, and Gyrfalcon. Allocations are principally made in response to an annual allocation call.
Computers as learning resources in the health sciences: impact and issues.
Ellis, L B; Hannigan, G G
1986-01-01
Starting with two computer terminals in 1972, the Health Sciences Learning Resources Center of the University of Minnesota Bio-Medical Library expanded its instructional facilities to ten terminals and thirty-five microcomputers by 1985. Computer use accounted for 28% of total center circulation. The impact of these resources on health sciences curricula is described and issues related to use, support, and planning are raised and discussed. Judged by their acceptance and educational value, computers are successful health sciences learning resources at the University of Minnesota. PMID:3518843
An emulator for minimizing finite element analysis implementation resources
NASA Technical Reports Server (NTRS)
Melosh, R. J.; Utku, S.; Salama, M.; Islam, M.
1982-01-01
A finite element analysis emulator that provides a basis for efficiently establishing an optimum computer implementation strategy when many calculations are involved is described. The SCOPE emulator determines the computer resources required as a function of the structural model, the characteristics of the structural load-deflection equations, the storage allocation plan, and computer hardware capabilities. It thereby provides data for trading off analysis implementation options to arrive at a best strategy. The models contained in SCOPE yield micro-operation counts for each finite element operation as well as overall computer resource cost estimates. Application of SCOPE to the Memphis-Arkansas bridge analysis provides measures of the accuracy of its resource assessments. Data indicate that predictions are within 17.3 percent for calculation times and within 3.2 percent for peripheral storage resources for the ELAS code.
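SCOPE's actual micro-operation models are not given in the abstract, so the sketch below only illustrates the style of estimate such an emulator produces: per-element operation and storage counts scaled by assumed hardware rates. All counts and rates are made-up placeholders, not SCOPE's models.

```python
def estimate_resources(num_elements, ops_per_element, flops_per_second,
                       bytes_per_element, disk_bandwidth):
    """Rough analysis-cost estimate in the spirit of an emulator like SCOPE:
    turn per-element operation and storage counts into time and storage totals.
    Every input value here is an assumed placeholder, not a SCOPE model."""
    total_ops = num_elements * ops_per_element
    compute_seconds = total_ops / flops_per_second
    storage_bytes = num_elements * bytes_per_element
    io_seconds = storage_bytes / disk_bandwidth
    return {"compute_s": compute_seconds,
            "io_s": io_seconds,
            "peripheral_storage_MB": storage_bytes / 1e6}

# Compare two hypothetical implementation strategies for the same model.
in_core = estimate_resources(50_000, 2.0e5, 5.0e9, 0.0, 1.0)
out_of_core = estimate_resources(50_000, 2.0e5, 5.0e9, 8_000, 2.0e8)
print(in_core, out_of_core)
```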
NASA Astrophysics Data System (ADS)
Aneri, Parikh; Sumathy, S.
2017-11-01
Cloud computing provides services over the internet, delivering application resources and data to users on demand. The model is based on a consumer-provider relationship: the cloud provider offers resources that consumers access in order to build applications according to their needs. A cloud data center is a large pool of shared resources for cloud users to access. Virtualization is at the heart of the cloud computing model; it provides virtual machines matching application-specific configurations, and applications are free to choose their own configuration. On one hand there is a huge number of resources, and on the other hand a huge number of requests must be served effectively. Resource allocation and scheduling policies therefore play a very important role in allocating and managing resources in this model. This paper proposes a load-balancing policy based on the Hungarian algorithm, combined with a monitor component. The monitor component helps to increase cloud resource utilization by observing the state of the Hungarian-algorithm-based allocator and adjusting it using artificial intelligence techniques. The proposal is evaluated in CloudSim, an extensible toolkit that simulates cloud computing environments.
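The assignment step at the core of such a policy can be sketched with SciPy's implementation of the Hungarian method. The cost matrix below (estimated completion time of each request on each virtual machine) is an invented example, and the paper's monitor/AI component is not modeled.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j]: assumed completion time (s) of request i on virtual machine j.
cost = np.array([
    [14.0,  9.0, 21.0],
    [11.0, 16.0,  8.0],
    [20.0,  7.0, 12.0],
])

# Hungarian method: one request per VM, minimizing total completion time.
requests, vms = linear_sum_assignment(cost)
for r, v in zip(requests, vms):
    print(f"request {r} -> vm {v} (cost {cost[r, v]})")
print("total cost:", cost[requests, vms].sum())
```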
Papi, Ahmad; Ghazavi, Roghayeh; Moradi, Salimeh
2015-01-01
Understanding how the medical community uses different types of information resources for quick and easy access to information is an important task in medical research and in managing treatment. The present study aimed to determine physicians' level of awareness in using various electronic information resources and the factors affecting it. The study was a descriptive survey, with a researcher-made questionnaire as the data collection tool. The study population included all 350 physicians and specialists of the teaching hospitals affiliated with Isfahan University of Medical Sciences; the sample size, based on Morgan's formula, was set at 180. The content validity of the tool was confirmed by library and information professionals, and its reliability was 95%. Descriptive statistics were computed using SPSS version 19. Regarding physicians' reasons for seeking information, the need for information when conducting research was reported by the largest share of physicians (91.9%), and the use of information resources, especially electronic resources, was the most common way of meeting physicians' information needs (65.4%). Among electronic information databases, awareness was highest for Medline (86.5%). Among the various electronic information resources, awareness was highest for e-journals (43.3%), which were also the most used (36%). The surveyed physicians considered being too busy and lacking time to be the most significant deterrents to using electronic information resources. Despite the importance of electronic information resources for the physician community, there was no comprehensive knowledge of these resources, which can lead to lower usage. Careful planning is therefore necessary in hospital libraries to introduce the facilities and full capabilities of these resources and the associated methods of information retrieval.
Pluye, Pierre; Grad, Roland M; Johnson-Lafleur, Janique; Granikov, Vera; Shulha, Michael; Marlow, Bernard; Ricarte, Ivan Luiz Marques
2013-01-01
We wanted to describe family physicians' use of information from an electronic knowledge resource for answering clinical questions, and their perception of subsequent patient health outcomes; and to estimate the number needed to benefit from information (NNBI), defined as the number of patients for whom clinical information was retrieved for 1 to benefit. We undertook a mixed methods research study, combining quantitative longitudinal and qualitative research studies. Participants were 41 family physicians from primary care clinics across Canada. Physicians were given access to 1 electronic knowledge resource on handheld computer in 2008-2009. For the outcome assessment, participants rated their searches using a validated method. Rated searches were examined during interviews guided by log reports that included ratings. Cases were defined as clearly described searches where clinical information was used for a specific patient. For each case, interviewees described information-related patient health outcomes. For the mixed methods data analysis, quantitative and qualitative data were merged into clinical vignettes (each vignette describing a case). We then estimated the NNBI. In 715 of 1,193 searches for information conducted during an average of 86 days, the search objective was directly linked to a patient. Of those searches, 188 were considered to be cases. In 53 cases, participants associated the use of information with at least 1 patient health benefit. This finding suggested an NNBI of 14 (715/53). The NNBI may be used in further experimental research to compare electronic knowledge resources. A low NNBI can encourage clinicians to search for information more frequently. If all searches had benefits, the NNBI would be 1. In addition to patient benefits, learning and knowledge reinforcement outcomes are frequently reported.
Vidossich, Pietro; Lledós, Agustí; Ujaque, Gregori
2016-06-21
Computational chemistry is a valuable aid to complement experimental studies of organometallic systems and their reactivity. It allows probing mechanistic hypotheses and investigating molecular structures, shedding light on the behavior and properties of molecular assemblies at the atomic scale. When approaching a chemical problem, the computational chemist has to decide on the theoretical approach needed to describe electron/nuclear interactions and the composition of the model used to approximate the actual system. Both factors determine the reliability of the modeling study. The community dedicated much effort to developing and improving the performance and accuracy of theoretical approaches for electronic structure calculations, on which the description of (inter)atomic interactions rely. Here, the importance of the model system used in computational studies is highlighted through examples from our recent research focused on organometallic systems and homogeneous catalytic processes. We show how the inclusion of explicit solvent allows the characterization of molecular events that would otherwise not be accessible in reduced model systems (clusters). These include the stabilization of nascent charged fragments via microscopic solvation (notably, hydrogen bonding), transfer of charge (protons) between distant fragments mediated by solvent molecules, and solvent coordination to unsaturated metal centers. Furthermore, when weak interactions are involved, we show how conformational and solvation properties of organometallic complexes are also affected by the explicit inclusion of solvent molecules. Such extended model systems may be treated under periodic boundary conditions, thus removing the cluster/continuum (or vacuum) boundary, and require a statistical mechanics simulation technique to sample the accessible configurational space. First-principles molecular dynamics, in which atomic forces are computed from electronic structure calculations (namely, density functional theory), is certainly the technique of choice to investigate chemical events in solution. This methodology is well established and thanks to advances in both algorithms and computational resources simulation times required for the modeling of chemical events are nowadays accessible, though the computational requirements use to be high. Specific applications reviewed here include mechanistic studies of the Shilov and Wacker processes, speciation in Pd chemistry, hydrogen bonding to metal centers, and the dynamics of agostic interactions.
SCEAPI: A unified Restful Web API for High-Performance Computing
NASA Astrophysics Data System (ADS)
Rongqiang, Cao; Haili, Xiao; Shasha, Lu; Yining, Zhao; Xiaoning, Wang; Xuebin, Chi
2017-10-01
Scientific computing is increasingly moving toward collaborative web and mobile applications. All these applications need a high-quality programming interface for accessing heterogeneous computing resources consisting of clusters, grids or clouds. In this paper, we introduce our high-performance computing environment, which integrates computing resources from 16 HPC centers across China. We then present a bundle of web services called SCEAPI and describe how it can be used to access HPC resources over HTTP or HTTPS. We discuss SCEAPI from several aspects, including architecture, implementation and security, and address specific challenges in designing compatible interfaces and protecting sensitive data. We describe the functions of SCEAPI, including authentication, file transfer, and job management for creating, submitting and monitoring jobs, and show how SCEAPI can be used in practice. Finally, we discuss how to quickly exploit additional HPC resources for the ATLAS experiment by implementing a custom ARC compute element based on SCEAPI; our work shows that SCEAPI is an easy-to-use and effective solution for extending opportunistic HPC resources.
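SCEAPI's concrete endpoints and token format are not specified in the abstract, so the sketch below only shows the general shape of a RESTful job-submission client over HTTPS. The base URL, paths, and JSON fields are hypothetical placeholders, not the real SCEAPI interface.

```python
import requests

BASE = "https://sceapi.example.org/api"      # hypothetical base URL

def authenticate(session, username, password):
    """Hypothetical token-based login; real SCEAPI authentication may differ."""
    resp = session.post(f"{BASE}/auth/tokens",
                        json={"username": username, "password": password},
                        timeout=30)
    resp.raise_for_status()
    session.headers["Authorization"] = f"Bearer {resp.json()['token']}"

def submit_job(session, script, cores):
    """Create a job resource; the fields shown are illustrative only."""
    resp = session.post(f"{BASE}/jobs",
                        json={"script": script, "cores": cores},
                        timeout=30)
    resp.raise_for_status()
    return resp.json()["id"]

def job_status(session, job_id):
    resp = session.get(f"{BASE}/jobs/{job_id}", timeout=30)
    resp.raise_for_status()
    return resp.json()["state"]

with requests.Session() as s:
    authenticate(s, "alice", "secret")
    jid = submit_job(s, "#!/bin/bash\nmpirun ./simulation", cores=64)
    print(jid, job_status(s, jid))
```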
The Relevancy of Graduate Curriculum to Human Resource Professionals' Electronic Communication.
ERIC Educational Resources Information Center
Hoell, Robert C.; Henry, Gordon O.
2003-01-01
Electronic communications of human resource professionals and the content of 23 university human resource management courses were categorized using the Human Resource Certification Institute's body of knowledge. Differences between proportion of topics discussed and topics covered in curricula suggest some topics are over- or undertaught.…
NASA Technical Reports Server (NTRS)
Chow, Edward T.; Schatzel, Donald V.; Whitaker, William D.; Sterling, Thomas
2008-01-01
A Spaceborne Processor Array in Multifunctional Structure (SPAMS) can lower the total mass of the electronic and structural overhead of spacecraft, resulting in reduced launch costs, while increasing the science return through dynamic onboard computing. SPAMS integrates the multifunctional structure (MFS) and the Gilgamesh Memory, Intelligence, and Network Device (MIND) multi-core in-memory computer architecture into a single-system super-architecture. This transforms every inch of a spacecraft into a sharable, interconnected, smart computing element to increase computing performance while simultaneously reducing mass. The MIND in-memory architecture provides a foundation for high-performance, low-power, and fault-tolerant computing. The MIND chip has an internal structure that includes memory, processing, and communication functionality. The Gilgamesh is a scalable system comprising multiple MIND chips interconnected to operate as a single, tightly coupled, parallel computer. The array of MIND components shares a global, virtual name space for program variables and tasks that are allocated at run time to the distributed physical memory and processing resources. Individual processor- memory nodes can be activated or powered down at run time to provide active power management and to configure around faults. A SPAMS system is comprised of a distributed Gilgamesh array built into MFS, interfaces into instrument and communication subsystems, a mass storage interface, and a radiation-hardened flight computer.
Software and resources for computational medicinal chemistry
Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C
2011-01-01
Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404
Many-electron effects in the optical properties of single-walled carbon nanotubes
NASA Astrophysics Data System (ADS)
Spataru, Catalin D.; Ismail-Beigi, Sohrab; Capaz, Rodrigo B.; Louie, Steven G.
2005-03-01
Recent optical measurements on single-wall carbon nanotubes (SWCNT) showed anomalous behaviors that are indicative of strong many-electron effects. To understand these data, we performed ab initio calculation of self-energy and electron-hole interaction (excitonic) effects on the optical spectra of several SWCNTs. We employed a many-electron Green's function approach that determines both the quasiparticle and optical excitations from first principles. We found important many-electron effects that explain many of the puzzling experimental findings in the optical spectrum of these quasi-one dimensional systems, and are in excellent quantitative agreement with measurements. We have also calculated the radiative lifetime of the bright excitons in these tubes. Taking into account temperature effects and the existence of dark excitons, our results explain the radiative lifetime of excited nanotubes measured in time- resolved fluorescence experiments. This work was supported by the NSF under Grant No. DMR04-39768, and the U.S. DOE under Contract No. DE-AC03-76SF00098. Computational resources have been provided by NERSC and NPACI. RBC acknowledges financial support from the Guggenheim Foundation and Brazilian funding agencies CNPq, CAPES, FAPERJ, Instituto de Nanociências, FUJB-UFRJ and PRONEX-MCT.
NASA Astrophysics Data System (ADS)
Deslippe, Jack; Samsonidze, Georgy; Strubbe, David A.; Jain, Manish; Cohen, Marvin L.; Louie, Steven G.
2012-06-01
BerkeleyGW is a massively parallel computational package for electron excited-state properties that is based on the many-body perturbation theory employing the ab initio GW and GW plus Bethe-Salpeter equation methodology. It can be used in conjunction with many density-functional theory codes for ground-state properties, including PARATEC, PARSEC, Quantum ESPRESSO, SIESTA, and Octopus. The package can be used to compute the electronic and optical properties of a wide variety of material systems from bulk semiconductors and metals to nanostructured materials and molecules. The package scales to 10 000s of CPUs and can be used to study systems containing up to 100s of atoms.
Program summary:
Program title: BerkeleyGW
Catalogue identifier: AELG_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELG_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Open source BSD License. See code for licensing details.
No. of lines in distributed program, including test data, etc.: 576 540
No. of bytes in distributed program, including test data, etc.: 110 608 809
Distribution format: tar.gz
Programming language: Fortran 90, C, C++, Python, Perl, BASH
Computer: Linux/UNIX workstations or clusters
Operating system: Tested on a variety of Linux distributions in parallel and serial as well as AIX and Mac OSX
RAM: (50-2000) MB per CPU (Highly dependent on system size)
Classification: 7.2, 7.3, 16.2, 18
External routines: BLAS, LAPACK, FFTW, ScaLAPACK (optional), MPI (optional). All available under open-source licenses.
Nature of problem: The excited state properties of materials involve the addition or subtraction of electrons as well as the optical excitations of electron-hole pairs. The excited particles interact strongly with other electrons in a material system. This interaction affects the electronic energies, wavefunctions and lifetimes. It is well known that ground-state theories, such as standard methods based on density-functional theory, fail to correctly capture this physics.
Solution method: We construct and solve the Dyson's equation for the quasiparticle energies and wavefunctions within the GW approximation for the electron self-energy. We additionally construct and solve the Bethe-Salpeter equation for the correlated electron-hole (exciton) wavefunctions and excitation energies.
Restrictions: The material size is limited in practice by the computational resources available. Materials with up to 500 atoms per periodic cell can be studied on large HPCs.
Additional comments: The distribution file for this program is approximately 110 Mbytes and therefore is not delivered directly when download or E-mail is requested. Instead a html file giving details of how the program can be obtained is sent.
Running time: 1-1000 minutes (depending greatly on system size and processor number).
ERIC Educational Resources Information Center
Amusa, Oyintola Isiaka; Atinmo, Morayo
2016-01-01
(Purpose) This study surveyed the level of availability, use and constraints to use of electronic resources among law lecturers in Nigeria. (Methodology) Five hundred and fifty-two law lecturers were surveyed and four hundred and forty-two responded. (Results) Data analysis revealed that the level of availability of electronic resources for the…
ERIC Educational Resources Information Center
Bello, Stephen Adeyemi; Ojo, Funmilayo Roseline; Ocheje, Charles Bala
2015-01-01
Relevant electronic information resources in contemporary information age are necessity to buttress teaching and learning for effective knowledge development in educational institutions. The purpose of the study is to know the state of availability of electronic information resources in government owned secondary school libraries in Ijumu Local…
Exploiting opportunistic resources for ATLAS with ARC CE and the Event Service
NASA Astrophysics Data System (ADS)
Cameron, D.; Filipčič, A.; Guan, W.; Tsulaia, V.; Walker, R.; Wenaus, T.;
2017-10-01
With ever-greater computing needs and fixed budgets, big scientific experiments are turning to opportunistic resources as a means to add much-needed extra computing power. These resources can be very different in design from those that comprise the Grid computing of most experiments, therefore exploiting them requires a change in strategy for the experiment. They may be highly restrictive in what can be run or in connections to the outside world, or tolerate opportunistic usage only on condition that tasks may be terminated without warning. The Advanced Resource Connector Computing Element (ARC CE) with its nonintrusive architecture is designed to integrate resources such as High Performance Computing (HPC) systems into a computing Grid. The ATLAS experiment developed the ATLAS Event Service (AES) primarily to address the issue of jobs that can be terminated at any point when opportunistic computing capacity is needed by someone else. This paper describes the integration of these two systems in order to exploit opportunistic resources for ATLAS in a restrictive environment. In addition to the technical details, results from deployment of this solution in the SuperMUC HPC centre in Munich are shown.
Fresnel, A; Jarno, P; Burgun, A; Delamarre, D; Denier, P; Cleret, M; Courtin, C; Seka, L P; Pouliquen, B; Cléran, L; Riou, C; Leduff, F; Lesaux, H; Duvauferrier, R; Le Beux, P
1998-01-01
A pedagogical network has been developed at the University Hospital of Rennes since 1996. The challenge is to give medical information and informatics tools to all medical students in the clinical wards of the University Hospital. Initially, nine wards were connected to the medical school server, which is linked to the Internet. Client software consists of electronic mail and the Netscape WWW browser on Macintosh computers. Server software is set up on a Unix SUN machine, providing a local homepage with selected pedagogical resources. These documents are stored in an ORACLE DBMS database, and queries can be made by specialty, author or disease. Students can access a set of interactive teaching programs and electronic textbooks, and can explore the Internet through the library information system and search engines. Teachers can submit URLs and indexing for pedagogical documents and can produce clinical cases, so database updating is done by the users. This experience of using Web tools generated enthusiasm when we first introduced it to students. The evaluation shows that if students can use this training early on, they will adapt Internet resources to their own needs.
Integration of Cloud resources in the LHCb Distributed Computing
NASA Astrophysics Data System (ADS)
Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel
2014-06-01
This contribution describes how Cloud resources have been integrated into LHCb Distributed Computing. LHCb uses its specific Dirac extension (LHCbDirac) as the interware for its Distributed Computing, which so far has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures: it interacts with multiple types of commercial and institutional clouds through multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment already deployed IaaS in production during 2013. With this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement at several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs, based on the idea of a Cloud Site. We report on operational experience of using several institutional Cloud resources in production, which are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.
Pedagogical underpinnings of computer-based learning.
Adams, Audrey M
2004-04-01
E-learning is becoming increasingly incorporated into educational programmes. Digital materials usually require a lot of investment in terms of time, money and human resources. With advances in technology, delivery of content has much improved in terms of multimedia elements. However, often only low-level learning is achieved as a result of using these materials. The purpose of this article is to give a comprehensive overview of some of the most important issues to consider when incorporating e-learning into educational programmes. Computer-based learning has three components: hardware, software and 'underware', the pedagogy that underpins its development. The latter is the most important, as the approach adopted will influence the creation of computer-based learning materials and determine the way in which students engage with subject matter. Teachers are responsible for the quality of their courses and have a vital role in helping to develop the most appropriate electronic learning activities that will facilitate students to acquire the knowledge and skills necessary for clinical practice. Therefore, they need to have an awareness of what contributes to educationally effective, computer-based learning materials.
The road to business process improvement--can you get there from here?
Gilberto, P A
1995-11-01
Historically, "improvements" within the organization have been frequently attained through automation by building and installing computer systems. Material requirements planning (MRP), manufacturing resource planning II (MRP II), just-in-time (JIT), computer aided design (CAD), computer aided manufacturing (CAM), electronic data interchange (EDI), and various other TLAs (three-letter acronyms) have been used as the methods to attain business objectives. But most companies have found that installing computer software, cleaning up their data, and providing every employee with training on how to best use the systems have not resulted in the level of business improvements needed. The software systems have simply made management around the problems easier but did little to solve the basic problems. The missing element in the efforts to improve the performance of the organization has been a shift in focus from individual department improvements to cross-organizational business process improvements. This article describes how the Electric Boat Division of General Dynamics Corporation, in conjunction with the Data Systems Division, moved its focus from one of vertical organizational processes to horizontal business processes. In other words, how we got rid of the dinosaurs.
Use of traditional versus electronic medical-information resources by residents and interns.
Phua, Jason; Lim, T K
2007-05-01
Little is known about the information-seeking behaviour of junior doctors, with regard to their use of traditional versus electronic sources of information. To evaluate the amount of time junior doctors spent using various medical-information resources and how useful they perceived these resources to be. A questionnaire study of all residents and interns in a tertiary teaching hospital in July and August 2004. In total, 134 doctors returned the completed questionnaires (response rate 79.8%). They spent the most time using traditional resources like teaching sessions and print textbooks, rating them as most useful. However, electronic resources like MEDLINE, UpToDate, and online review articles also ranked highly. Original research articles were less popular. Residents and interns prefer traditional sources of medical information. Meanwhile, though some electronic resources are rated highly, more work is required to remove the barriers to evidence-based medicine.
Design & implementation of distributed spatial computing node based on WPS
NASA Astrophysics Data System (ADS)
Liu, Liping; Li, Guoqing; Xie, Jibo
2014-03-01
Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in a grid environment, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper systematically studies the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype of the Spatial Computing Node is implemented and verified in this environment.
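To make the WPS-based node concrete, the sketch below shows how a client might discover the spatial computing processes such a node publishes over the standard OGC WPS key-value interface. The endpoint URL and process identifier are hypothetical, and this is not code from the paper.

```python
import requests

# Hypothetical endpoint; the paper's prototype node URL is not given.
WPS_URL = "http://example.org/spatial-computing-node/wps"

def get_capabilities(url=WPS_URL):
    """Ask a WPS server which spatial computing processes it offers."""
    params = {"service": "WPS", "request": "GetCapabilities"}
    response = requests.get(url, params=params, timeout=30)
    response.raise_for_status()
    return response.text  # XML capabilities document listing available processes

def describe_process(identifier, url=WPS_URL):
    """Fetch the input/output description of one published process."""
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "DescribeProcess",
        "identifier": identifier,  # hypothetical process name, e.g. "BufferAnalysis"
    }
    response = requests.get(url, params=params, timeout=30)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    print(get_capabilities()[:500])
```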
Economic models for management of resources in peer-to-peer and grid computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David
2001-07-01
The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. They include commodity markets, posted prices, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
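As a rough illustration of deadline- and budget-constrained brokering over posted-price resources (in the spirit of Nimrod/G, but not the authors' actual scheduler; the resource names, prices and rates below are invented):

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    price_per_job: float   # posted price charged by the resource owner
    rate: float            # jobs completed per hour

def schedule(jobs, deadline_h, budget, resources):
    """Greedy cost optimization: use the cheapest resources first and add
    more expensive ones only while the deadline would otherwise be missed."""
    plan, cost, remaining = {}, 0.0, jobs
    for r in sorted(resources, key=lambda r: r.price_per_job):
        if remaining <= 0:
            break
        capacity = int(r.rate * deadline_h)          # jobs r can finish by the deadline
        assign = min(remaining, capacity)
        if cost + assign * r.price_per_job > budget:
            assign = int((budget - cost) // r.price_per_job)
        plan[r.name] = assign
        cost += assign * r.price_per_job
        remaining -= assign
    return plan, cost, remaining   # remaining > 0 means deadline/budget infeasible

# Hypothetical testbed nodes and posted prices, for illustration only.
resources = [Resource("au-cluster", 0.5, 40), Resource("eu-grid", 1.2, 100), Resource("us-farm", 2.0, 200)]
print(schedule(jobs=500, deadline_h=3, budget=600, resources=resources))
```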
Panchabhai, T S; Dangayach, N S; Mehta, V S; Patankar, C V; Rege, N N
2011-01-01
Computer usage capabilities of medical students for the introduction of computer-aided learning have not been adequately assessed. Cross-sectional study to evaluate computer literacy among medical students. Tertiary care teaching hospital in Mumbai, India. Participants were administered a 52-question questionnaire, designed to study their background, computer resources, computer usage, activities enhancing computer skills, and attitudes toward computer-aided learning (CAL). The data were classified on the basis of sex, native place, and year of medical school, and the computer resources were compared. The computer usage and attitudes toward computer-based learning were assessed on a five-point Likert scale, to calculate a Computer usage score (CUS - maximum 55, minimum 11) and an Attitude score (AS - maximum 60, minimum 12). The quartile distribution among the groups with respect to the CUS and AS was compared by chi-squared tests. The correlation between CUS and AS was then tested. Eight hundred and seventy-five students agreed to participate in the study and 832 completed the questionnaire. One hundred and twenty-eight questionnaires were excluded and 704 were analyzed. Outstation students had significantly fewer computer resources than local students (P<0.0001). The mean CUS for local students (27.0±9.2, Mean±SD) was significantly higher than for outstation students (23.2±9.05). No such difference was observed for the AS. The means of CUS and AS did not differ between males and females. The CUS and AS had positive but weak correlations for all subgroups. The weak correlation between AS and CUS for all students could be explained by the lack of computer resources or inadequate training to use computers for learning. Providing additional resources would benefit the subset of outstation students with fewer computer resources. This weak correlation between the attitudes and practices of all students needs to be investigated. We believe that this gap can be bridged with a structured computer learning program.
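A minimal sketch of how the CUS and AS scores and their association could be computed from Likert-scale responses. The response data and group labels below are randomly generated placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical Likert responses (1-5): 11 usage items and 12 attitude items per student.
rng = np.random.default_rng(0)
usage = rng.integers(1, 6, size=(704, 11))      # 704 analysed questionnaires
attitude = rng.integers(1, 6, size=(704, 12))

cus = usage.sum(axis=1)        # Computer usage score, range 11-55
as_ = attitude.sum(axis=1)     # Attitude score, range 12-60

r, p = stats.pearsonr(cus, as_)
print(f"CUS vs AS: r = {r:.2f}, p = {p:.3f}")

# Quartile distribution comparison between two groups (e.g. local vs outstation),
# analogous to the paper's chi-squared tests; group labels here are random.
groups = rng.integers(0, 2, size=704)
quartiles = np.digitize(cus, np.quantile(cus, [0.25, 0.5, 0.75]))
table = np.array([[np.sum((groups == g) & (quartiles == q)) for q in range(4)]
                  for g in range(2)])
chi2, p_chi, dof, _ = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_chi:.3f}")
```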
Computerized provider order entry in the clinical laboratory
Baron, Jason M.; Dighe, Anand S.
2011-01-01
Clinicians have traditionally ordered laboratory tests using paper-based orders and requisitions. However, paper orders are becoming increasingly incompatible with the complexities, challenges, and resource constraints of our modern healthcare systems and are being replaced by electronic order entry systems. Electronic systems that allow direct provider input of diagnostic testing or medication orders into a computer system are known as Computerized Provider Order Entry (CPOE) systems. Adoption of laboratory CPOE systems may offer institutions many benefits, including reduced test turnaround time, improved test utilization, and better adherence to practice guidelines. In this review, we outline the functionality of various CPOE implementations, review the reported benefits, and discuss strategies for using CPOE to improve the test ordering process. Further, we discuss barriers to the implementation of CPOE systems that have prevented their more widespread adoption. PMID:21886891
You Have "How Many" Spreadsheets? Rethinking Electronic Resource Management
ERIC Educational Resources Information Center
Rux, Erika; Borchert, Theresa
2010-01-01
As libraries face a veritable explosion of electronic resources and as the interconnectedness of print and online resources becomes increasingly complicated, many librarians are challenged to find efficient and cost-friendly ways to manage these resources. In this article, the authors describe how a team of people from various library departments…
An Integrated Approach to Engineering Education in a Minority Community
NASA Technical Reports Server (NTRS)
Taylor, Bill
1998-01-01
Northeastern New Mexico epitomizes regions which are economically depressed, rural, and predominantly Hispanic. New Mexico Highlands University (NMHU), with a small student population of approximately 2800, offers a familiar environment attracting students who might otherwise not attend college. An outreach computer network of minority schools was created in northeastern New Mexico with NASA funding. Rural and urban minority schools gained electronic access to each other, to computer resources, to technical help at New Mexico Highlands University and gained access to the world via the Internet. This outreach program was initiated in the fall of 1992 in an effort to attract and to involve minority students in Engineering and the Mathematical Sciences. We installed 56 kbps Internet connections to eight elementary schools, two middle schools, two high schools, a public library (servicing the home schooling community) and an International Baccalaureate school. For another fourteen rural schools, we provided computers and free dial-up service to servers on the New Mexico Highlands University campus.
cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.
Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E
2018-06-01
Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that will submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") have incorporated optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface, we determined a 2.2 Å structure of β-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement to AWS. These software tools dramatically reduce the barrier for entry of new users to cloud computing for cryo-EM and are freely available at cryoem-tools.cloud. Copyright © 2018. Published by Elsevier Inc.
The ReaxFF reactive force-field: Development, applications, and future directions
Senftle, Thomas; Hong, Sungwook; Islam, Md Mahbubul; ...
2016-03-04
The reactive force-field (ReaxFF) interatomic potential is a powerful computational tool for exploring, developing and optimizing material properties. Methods based on the principles of quantum mechanics (QM), while offering valuable theoretical guidance at the electronic level, are often too computationally intense for simulations that consider the full dynamic evolution of a system. Alternatively, empirical interatomic potentials that are based on classical principles require significantly fewer computational resources, which enables simulations to better describe dynamic processes over longer timeframes and on larger scales. Such methods, however, typically require a predefined connectivity between atoms, precluding simulations that involve reactive events. The ReaxFF method was developed to help bridge this gap. Approaching the gap from the classical side, ReaxFF casts the empirical interatomic potential within a bond-order formalism, thus implicitly describing chemical bonding without expensive QM calculations. This article provides an overview of the development, application, and future directions of the ReaxFF method.
NASA Astrophysics Data System (ADS)
Hanson-Heine, Magnus W. D.; George, Michael W.; Besley, Nicholas A.
2018-06-01
The restricted excitation subspace approximation is explored as a basis to reduce the memory storage required in linear response time-dependent density functional theory (TDDFT) calculations within the Tamm-Dancoff approximation. It is shown that excluding the core orbitals and up to 70% of the virtual orbitals in the construction of the excitation subspace does not result in significant changes in computed UV/vis spectra for large molecules. The reduced size of the excitation subspace greatly reduces the size of the subspace vectors that need to be stored when using the Davidson procedure to determine the eigenvalues of the TDDFT equations. Furthermore, additional screening of the two-electron integrals in combination with a reduction in the size of the numerical integration grid used in the TDDFT calculation leads to significant computational savings. The use of these approximations represents a simple approach to extend TDDFT to the study of large systems and make the calculations increasingly tractable using modest computing resources.
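For orientation, the Tamm-Dancoff TDDFT eigenvalue problem restricted to an excitation subspace S takes the standard textbook form below (written for a pure density functional; hybrid functionals add an exact-exchange term). This form is assumed rather than quoted from the paper.

```latex
\sum_{jb \in S} A_{ia,jb}\, X_{jb} = \omega\, X_{ia},
\qquad
A_{ia,jb} = \delta_{ij}\delta_{ab}\,(\varepsilon_a - \varepsilon_i)
          + (ia|jb) + (ia|f_{\mathrm{xc}}|jb),
```

Restricting i to non-core occupied orbitals and a to a truncated set of virtual orbitals shrinks the dimension of the subspace vectors that must be stored during the Davidson iterations.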
Toward a Fault Tolerant Architecture for Vital Medical-Based Wearable Computing.
Abdali-Mohammadi, Fardin; Bajalan, Vahid; Fathi, Abdolhossein
2015-12-01
Advancements in computers and electronic technologies have led to the emergence of a new generation of efficient small intelligent systems. The products of such technologies include Smartphones and wearable devices, which have attracted the attention of medical applications. These products are used less in critical medical applications because of their resource constraints and failure sensitivity. This is due to the fact that, without safety considerations, small integrated hardware will endanger patients' lives. Therefore, some principles are required for constructing wearable systems in healthcare so that the existing concerns are dealt with. Accordingly, this paper proposes an architecture for constructing wearable systems in critical medical applications. The proposed architecture is a three-tier one, supporting data flow from body sensors to the cloud. The tiers of this architecture include wearable computers, mobile computing, and mobile cloud computing. One of the features of this architecture is its high possible fault tolerance due to the nature of its components. Moreover, the required protocols are presented to coordinate the components of this architecture. Finally, the reliability of this architecture is assessed by simulating the architecture and its components, and other aspects of the proposed architecture are discussed.
Impact of remote sensing upon the planning, management, and development of water resources
NASA Technical Reports Server (NTRS)
Loats, H. L.; Fowler, T. R.; Frech, S. L.
1974-01-01
A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. The analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suitable to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research, (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.
NASA Technical Reports Server (NTRS)
Tolzman, Jean M.
1993-01-01
The potential for expanded communication among researchers, scholars, and students is supported by growth in the capabilities for electronic communication as well as expanding access to various forms of electronic interchange and computing capabilities. Increased possibilities for information exchange, collegial dialogue, collaboration, and access to remote resources exist as high-speed networks, increasingly powerful workstations, and large, multi-user computational facilities are more frequently linked and more commonly available. Numerous writers speak of the telecommunications revolution and its impact on the development and dissemination of knowledge and learning. One author offers the phrase 'Scholarly skywriting' to represent a new form of scientific communication that he envisions using electronic networks. In the United States (U.S.), researchers associated with the National Science Foundation (NSF) are exploring 'nationwide collaboratories' and 'digital collaboration.' Research supported by the U.S. National Aeronautics and Space Administration (NASA) points to a future where workstations with built-in audio, video monitors, and screen sharing protocols are used to support collaborations with colleagues located throughout the world. Instruments and sensors located worldwide will produce data streams that will be brought together, analyzed, and distributed as new findings. Researchers will have access to machines that can supply domain-specific information in addition to locator and directory assistance. New forms of electronic journals will emerge and provide opportunities for researchers and scientists to exchange information electronically and interactively in a range of structures and formats. Ultimately, the wide-scale use of these technologies in the dissemination of research results and the stimulation of collegial dialogue will change the way we represent and express our knowledge of the world. A new paradigm will evolve--perhaps a truly worldwide 'invisible college.'
Hopkins, Mark E; Summers-Ables, Joy E; Clifton, Shari C; Coffman, Michael A
2011-06-01
To make electronic resources available to library users while effectively harnessing intellectual capital within the library, ultimately fostering the library's use of technology to interact asynchronously with its patrons (users). The methods used in the project included: (1) developing a new library website to facilitate the creation, management, accessibility, maintenance and dissemination of library resources; and (2) establishing ownership by those who participated in the project, while creating effective work allocation strategies through the implementation of a content management system that allowed the library to manage cost, complexity and interoperability. Preliminary results indicate that contributors to the system benefit from an increased understanding of the library's resources and add content valuable to library patrons. These strategies have helped promote the manageable creation and maintenance of electronic content in accomplishing the library's goal of interacting with its patrons. Establishment of a contributive system for adding to the library's electronic resources and electronic content has been successful. Further work will look at improving asynchronous interaction, particularly highlighting accessibility of electronic content and resources. © 2010 The authors. Health Information and Libraries Journal © 2010 Health Libraries Group.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...
NASA Astrophysics Data System (ADS)
Artemyev, Anton V.; Neishtadt, Anatoly I.; Vasiliev, Alexei A.
2018-04-01
Accurate modelling and forecasting of the dynamics of the Earth's radiation belts with the available computer resources represents an important challenge that still requires significant advances in the theoretical plasma physics field of wave-particle resonant interaction. Energetic electron acceleration or scattering into the Earth's atmosphere is essentially controlled by their resonances with electromagnetic whistler mode waves. The quasi-linear diffusion equation describes this resonant interaction well for low-intensity waves. During the last decade, however, spacecraft observations in the radiation belts have revealed a large number of whistler mode waves with sufficiently high intensity to interact with electrons in the nonlinear regime. A kinetic equation including such nonlinear wave-particle interactions and describing the long-term evolution of the electron distribution is the focus of the present paper. Using the Hamiltonian theory of resonant phenomena, we describe individual electron resonance with an intense coherent whistler mode wave. The derived characteristics of such a resonance are incorporated into a generalized kinetic equation which includes non-local transport in energy space. This transport is produced by resonant electron trapping and nonlinear acceleration. We describe the methods allowing the construction of nonlinear resonant terms in the kinetic equation and discuss possible applications of this equation.
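Schematically, and with notation simplified rather than taken from the paper, the contrast between the quasi-linear description and a generalized kinetic equation with non-local transport in energy can be written as

```latex
\frac{\partial f}{\partial t}
  = \frac{\partial}{\partial E}\!\left( D_{EE}\,\frac{\partial f}{\partial E} \right)
  + \int \Big[ P(E' \!\to\! E)\, f(E') - P(E \!\to\! E')\, f(E) \Big]\, \mathrm{d}E' ,
```

where the first term is the quasi-linear diffusion appropriate for low-intensity waves and the integral term is a schematic non-local operator standing in for trapping and nonlinear acceleration by intense coherent whistler mode waves.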
Chen, Tzer-Long; Lin, Frank Y S
2011-08-01
Electronic medical records can be defined as a digital format of the traditionally paper-based anamneses, containing a patient's history, such as past illnesses, current health problems, and chronic treatments. An electronic anamnesis is meant to make the patient's health information more conveniently accessible and transferable between different medical institutions, and also easier to retain over long periods. Because of the transferability and accessibility of electronic anamneses, fewer resources are needed than before to store patients' medical information. This also means that medical care providers can save funds on record-keeping and can access a patient's medical background directly on screen more quickly and easily. Overall, service quality improves considerably. However, the use of electronic anamneses raises concerns such as the relevant legal requirements and the security of patients' confidential information. Because of these concerns, a secure medical networking scheme is taken into consideration. Nowadays, administrators at medical institutions face growing challenges in monitoring computers and network systems because of dramatic advances in this field. For instance, a trusted third party may be authorized to access some medical records for a certain period of time. For security, all electronic medical records are protected with both public-key infrastructure (PKI) cryptography and digital signatures to ensure the records are well protected. Since signatures become invalid upon revocation or expiration, records protected in this way can become vulnerable over time. Hence, we propose a re-signing scheme whose purpose is to re-sign an expiring digital signature in time, without conflicting with laws, ethics, and privacy, while maintaining the security of the electronic medical records.
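A minimal sketch of the timely re-signing idea using the Python cryptography package. The validity window, grace period, and key handling are invented for illustration; this is not the authors' protocol.

```python
from datetime import datetime, timedelta, timezone
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

def sign_record(record: bytes, key, valid_days=365):
    """Sign an electronic medical record and remember when the signature expires."""
    expires = datetime.now(timezone.utc) + timedelta(days=valid_days)
    signature = key.sign(record, PSS, hashes.SHA256())
    return {"record": record, "signature": signature, "expires": expires, "signer": key}

def resign_if_needed(entry, new_key, grace_days=30):
    """Re-sign a record whose signature is about to expire, before it becomes invalid."""
    if entry["expires"] - datetime.now(timezone.utc) > timedelta(days=grace_days):
        return entry  # still comfortably valid
    # Verify the old signature first, then issue a fresh one with the new key.
    entry["signer"].public_key().verify(entry["signature"], entry["record"], PSS, hashes.SHA256())
    return sign_record(entry["record"], new_key)

if __name__ == "__main__":
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    entry = sign_record(b"patient-123: chronic treatment notes", key, valid_days=20)
    entry = resign_if_needed(entry, rsa.generate_private_key(public_exponent=65537, key_size=2048))
```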
Resource Provisioning in SLA-Based Cluster Computing
NASA Astrophysics Data System (ADS)
Xiong, Kaiqi; Suh, Sang
Cluster computing is excellent for parallel computation. It has become increasingly popular. In cluster computing, a service level agreement (SLA) is a set of quality of services (QoS) and a fee agreed between a customer and an application service provider. It plays an important role in an e-business application. An application service provider uses a set of cluster computing resources to support e-business applications subject to an SLA. In this paper, the QoS includes percentile response time and cluster utilization. We present an approach for resource provisioning in such an environment that minimizes the total cost of cluster computing resources used by an application service provider for an e-business application that often requires parallel computation for high service performance, availability, and reliability while satisfying a QoS and a fee negotiated between a customer and the application service provider. Simulation experiments demonstrate the applicability of the approach.
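As a toy illustration of percentile-based provisioning (a crude M/M/1-per-server approximation, not the authors' model; the arrival rate, service rate, and SLA target below are hypothetical):

```python
import math

def min_servers(arrival_rate, service_rate, target_seconds, percentile=0.95,
                cost_per_server=1.0, max_servers=256):
    """Smallest (cheapest) number of identical servers whose p-th percentile
    response time meets the SLA, assuming traffic is split evenly and each
    server behaves as an independent M/M/1 queue."""
    for n in range(1, max_servers + 1):
        lam = arrival_rate / n
        if lam >= service_rate:
            continue  # overloaded, response time unbounded
        # For M/M/1 the response time is exponential with rate (mu - lambda),
        # so the p-th percentile is -ln(1 - p) / (mu - lambda).
        pct = -math.log(1.0 - percentile) / (service_rate - lam)
        if pct <= target_seconds:
            return n, n * cost_per_server, pct
    raise ValueError("SLA cannot be met within max_servers")

# Hypothetical workload: 120 requests/s, each server completes 10 requests/s,
# SLA: 95th percentile response time below 0.5 s.
print(min_servers(arrival_rate=120, service_rate=10, target_seconds=0.5))
```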
Mata, Ricardo A
2010-05-21
In this Perspective, several developments in the field of quantum mechanics/molecular mechanics (QM/MM) approaches are reviewed. Emphasis is placed on the use of correlated wavefunction theory and new state of the art methods for the treatment of large quantum systems. Until recently, computational chemistry approaches to large/complex chemical problems have seldom been considered as tools for quantitative predictions. However, due to the tremendous development of computational resources and new quantum chemical methods, it is nowadays possible to describe the electronic structure of biomolecules at levels of theory which a decade ago were only possible for system sizes of up to 20 atoms. These advances are here outlined in the context of QM/MM. The article concludes with a short outlook on upcoming developments and possible bottlenecks for future applications.
NASA Astrophysics Data System (ADS)
Miller, M.; Miller, E.; Liu, J.; Lund, R. M.; McKinley, J. P.
2012-12-01
X-ray computed tomography (CT), scanning electron microscopy (SEM), electron microprobe analysis (EMP), and computational image analysis are mature technologies used in many disciplines. Cross-discipline combination of these imaging and image-analysis technologies is the focus of this research, which uses laboratory and light-source resources in an iterative approach. The objective is to produce images across length scales, taking advantage of instrumentation that is optimized for each scale, and to unify them into a single compositional reconstruction. Initially, CT images will be collected using both x-ray absorption and differential phase contrast modes. The imaged sample will then be physically sectioned and the exposed surfaces imaged and characterized via SEM/EMP. The voxel slice corresponding to the physical sample surface will be isolated computationally, and the volumetric data will be combined with two-dimensional SEM images along CT image planes. This registration step will take advantage of the similarity between the X-ray absorption (CT) and backscattered electron (SEM) coefficients (both proportional to average atomic number in the interrogated volume) as well as the images' mutual information. Elemental and solid-phase distributions on the exposed surfaces, co-registered with SEM images, will be mapped using EMP. The solid-phase distribution will be propagated into three-dimensional space using computational methods relying on the estimation of compositional distributions derived from the CT data. If necessary, solid-phase and pore-space boundaries will be resolved using X-ray differential phase contrast tomography, x-ray fluorescence tomography, and absorption-edge microtomography at a light-source facility. Computational methods will be developed to register and model images collected over varying scales and data types. Image resolution, physically and dynamically, is qualitatively different for the electron microscopy and CT methodologies. Routine CT images are resolved at 10-20 μm, while SEM images are resolved at 10-20 nm; grayscale values vary according to collection time and instrument sensitivity; and compositional sensitivities via EMP vary in interrogation volume and scale. We have so far successfully registered SEM imagery within a multimode tomographic volume and have used standard methods to isolate pore space within the volume. We are developing a three-dimensional solid-phase identification and registration method that is constrained by bulk-sample X-ray diffraction Rietveld refinements. The results of this project will prove useful in fields that require the fine-scale definition of solid-phase distributions and relationships, and could replace more inefficient methods for making these estimations.
Electrical resistivity and thermal conductivity of liquid aluminum in the two-temperature state
NASA Astrophysics Data System (ADS)
Petrov, Yu V.; Inogamov, N. A.; Mokshin, A. V.; Galimzyanov, B. N.
2018-01-01
The electrical resistivity and thermal conductivity of liquid aluminum in the two-temperature state are calculated by using the relaxation-time approach and the structure factor of ions obtained by molecular dynamics simulation. The resistivity within the Ziman-Evans approach is also considered and turns out to be higher than in the approach in which the conductivity is calculated via the relaxation time. Calculations based on constructing the ion structure factor through classical molecular dynamics and the kinetic equation for electrons are more economical in terms of computing resources and give results close to Kubo-Greenwood calculations with quantum molecular dynamics.
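For orientation, a Ziman-type expression links the resistivity to the ion structure factor S(q) and the screened electron-ion form factor w(q); the schematic proportionality below is the standard form and is not quoted from the paper.

```latex
\rho \;\propto\; \int_0^{2k_F} S(q)\, \bigl| w(q) \bigr|^{2}\, q^{3}\, \mathrm{d}q ,
```

where k_F is the Fermi wave number and the integral runs over momentum transfers up to 2k_F.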
Rare earth element and rare metal inventory of central Asia
Mihalasky, Mark J.; Tucker, Robert D.; Renaud, Karine; Verstraeten, Ingrid M.
2018-03-06
Rare earth elements (REE), with their unique physical and chemical properties, are an essential part of modern living. REE have enabled development and manufacture of high-performance materials, processes, and electronic technologies commonly used today in computing and communications, clean energy and transportation, medical treatment and health care, glass and ceramics, aerospace and defense, and metallurgy and chemical refining. Central Asia is an emerging REE and rare metals (RM) producing region. A newly compiled inventory of REE-RM-bearing mineral occurrences and delineation of areas-of-interest indicate this region may have considerable undiscovered resources.
Acausal measurement-based quantum computing
NASA Astrophysics Data System (ADS)
Morimae, Tomoyuki
2014-07-01
In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.
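In the process matrix framework cited above, the joint probabilities of the two parties' local outcomes take the standard form

```latex
p(a, b) \;=\; \operatorname{Tr}\!\left[ W \left( M_a^{A_I A_O} \otimes M_b^{B_I B_O} \right) \right],
```

where W is the process matrix and the M operators are the Choi representations of the local operations (here restricted to projective measurements); no global causal order between the parties is assumed in W.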
Step-by-step magic state encoding for efficient fault-tolerant quantum computation
Goto, Hayato
2014-01-01
Quantum error correction allows one to make quantum computers fault-tolerant against unavoidable errors due to decoherence and imperfect physical gate operations. However, the fault-tolerant quantum computation requires impractically large computational resources for useful applications. This is a current major obstacle to the realization of a quantum computer. In particular, magic state distillation, which is a standard approach to universality, consumes the most resources in fault-tolerant quantum computation. For the resource problem, here we propose step-by-step magic state encoding for concatenated quantum codes, where magic states are encoded step by step from the physical level to the logical one. To manage errors during the encoding, we carefully use error detection. Since the sizes of intermediate codes are small, it is expected that the resource overheads will become lower than previous approaches based on the distillation at the logical level. Our simulation results suggest that the resource requirements for a logical magic state will become comparable to those for a single logical controlled-NOT gate. Thus, the present method opens a new possibility for efficient fault-tolerant quantum computation. PMID:25511387
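For context, a commonly used magic state (not necessarily the particular state encoded in this work) is

```latex
|A_{\pi/4}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl( |0\rangle + e^{i\pi/4}|1\rangle \bigr),
```

which, when consumed via gate teleportation, implements the non-Clifford T gate needed for universal fault-tolerant computation.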
Step-by-step magic state encoding for efficient fault-tolerant quantum computation.
Goto, Hayato
2014-12-16
Quantum error correction allows one to make quantum computers fault-tolerant against unavoidable errors due to decoherence and imperfect physical gate operations. However, the fault-tolerant quantum computation requires impractically large computational resources for useful applications. This is a current major obstacle to the realization of a quantum computer. In particular, magic state distillation, which is a standard approach to universality, consumes the most resources in fault-tolerant quantum computation. For the resource problem, here we propose step-by-step magic state encoding for concatenated quantum codes, where magic states are encoded step by step from the physical level to the logical one. To manage errors during the encoding, we carefully use error detection. Since the sizes of intermediate codes are small, it is expected that the resource overheads will become lower than previous approaches based on the distillation at the logical level. Our simulation results suggest that the resource requirements for a logical magic state will become comparable to those for a single logical controlled-NOT gate. Thus, the present method opens a new possibility for efficient fault-tolerant quantum computation.
A Review of Resources for Evaluating K-12 Computer Science Education Programs
ERIC Educational Resources Information Center
Randolph, Justus J.; Hartikainen, Elina
2004-01-01
Since computer science education is a key to preparing students for a technologically-oriented future, it makes sense to have high quality resources for conducting summative and formative evaluation of those programs. This paper describes the results of a critical analysis of the resources for evaluating K-12 computer science education projects.…
Computing the Envelope for Stepwise Constant Resource Allocations
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Clancy, Daniel (Technical Monitor)
2001-01-01
Estimating tight resource levels is a fundamental problem in the construction of flexible plans with resource utilization. In this paper we describe an efficient algorithm that builds a resource envelope, the tightest possible such bound. The algorithm is based on transforming the temporal network of resource-consuming and resource-producing events into a flow network with nodes equal to the events and edges equal to the necessary predecessor links between events. The incremental solution of a staged maximum flow problem on the network is then used to compute the time of occurrence and the height of each step of the resource envelope profile. The staged algorithm has the same computational complexity as solving a maximum flow problem on the entire flow network. This makes the method computationally feasible for use in the inner loop of search-based scheduling algorithms.
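The sketch below shows only the max-flow building block on an invented toy event network (the event names, amounts, and precedence links are hypothetical); it is not the full staged envelope algorithm.

```python
import networkx as nx

# Toy temporal network: producer events add resource units, consumer events
# remove them, and precedence edges encode "must occur before" links.
events = {"p1": +3, "p2": +2, "c1": -4}
precedence = [("p1", "c1"), ("p2", "c1")]

G = nx.DiGraph()
for e, delta in events.items():
    if delta > 0:
        G.add_edge("s", e, capacity=delta)    # production enters from the source
    else:
        G.add_edge(e, "t", capacity=-delta)   # consumption drains to the sink
for u, v in precedence:
    G.add_edge(u, v)                          # no capacity attribute => unbounded

flow_value, flow = nx.maximum_flow(G, "s", "t")
print("max flow:", flow_value)                # 4 for this toy network
print("edge flows:", flow)
```

Roughly speaking, the min cut associated with such staged flow problems identifies which pending events jointly maximize (or minimize) the resource increment at each step of the envelope.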
A lightweight distributed framework for computational offloading in mobile cloud computing.
Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul
2014-01-01
The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
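A generic offloading decision rule of the kind such frameworks rely on (illustrative only; the speeds, state size, and latency below are invented, and the paper's actual cost model is not reproduced):

```python
def should_offload(cycles, local_ips, cloud_ips, data_bytes, bandwidth_bps, rtt_s=0.05):
    """Offload a component when remote execution plus shipping its state is
    faster than running it locally on the Smart Mobile Device (SMD)."""
    t_local = cycles / local_ips
    t_remote = cycles / cloud_ips + (8 * data_bytes) / bandwidth_bps + rtt_s
    return t_remote < t_local

# Hypothetical component: 5e9 instructions, 2 MB of state, a 500 MIPS handset
# versus a 10,000 MIPS cloud VM over a 20 Mbit/s link.
print(should_offload(cycles=5e9, local_ips=500e6, cloud_ips=10_000e6,
                     data_bytes=2_000_000, bandwidth_bps=20e6))
```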
A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing
Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul
2014-01-01
The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
COMPUTATIONAL TOXICOLOGY-WHERE IS THE DATA? ...
This talk will briefly describe the state of the data world for computational toxicology and one approach to improve the situation, called ACToR (Aggregated Computational Toxicology Resource).
LaRC local area networks to support distributed computing
NASA Technical Reports Server (NTRS)
Riddle, E. P.
1984-01-01
The Langley Research Center's (LaRC) Local Area Network (LAN) effort is discussed. LaRC initiated the development of a LAN to support a growing distributed computing environment at the Center. The purpose of the network is to provide an improved capability (over interactive and RJE terminal access) for sharing multivendor computer resources. Specifically, the network will provide a data highway for the transfer of files between mainframe computers, minicomputers, work stations, and personal computers. An important influence on the overall network design was the vital need of LaRC researchers to efficiently utilize the large CDC mainframe computers in the central scientific computing facility. Although there was a steady migration from a centralized to a distributed computing environment at LaRC in recent years, the work load on the central resources increased. Major emphasis in the network design was on communication with the central resources within the distributed environment. The network to be implemented will allow researchers to utilize the central resources, distributed minicomputers, work stations, and personal computers to obtain the proper level of computing power to efficiently perform their jobs.
An approach for heterogeneous and loosely coupled geospatial data distributed computing
NASA Astrophysics Data System (ADS)
Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui
2010-07-01
Most GIS (Geographic Information System) applications tend to have heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic under a distributed computing environment. In order to make use of these local resources together to solve larger geospatial information processing problems that are related to an overall situation, in this paper, with the support of peer-to-peer computing technologies, we propose a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and a term named as Equivalent Distributed Program of global geospatial queries to solve geospatial distributed computing problems under heterogeneous GIS environments. First, a geospatial query process schema for distributed computing as well as a method for equivalent transformation from a global geospatial query to distributed local queries at SQL (Structured Query Language) level to solve the coordinating problem among heterogeneous resources are presented. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment that consists of autonomous geospatial information resources, thus to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries are presented to illustrate the procedure of global geospatial information processing.
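A much-simplified sketch of the "equivalent distributed program" idea: a global bounding-box query is rewritten into identical local queries executed by each autonomous peer, and the partial results are merged centrally. The schema, peers, and data are invented; real peers would expose heterogeneous GIS back ends rather than in-memory SQLite tables.

```python
import sqlite3

# Two in-memory databases stand in for autonomous peers holding local road data.
def make_peer(rows):
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE roads (id INTEGER, name TEXT, xmin REAL, ymin REAL, xmax REAL, ymax REAL)")
    db.executemany("INSERT INTO roads VALUES (?,?,?,?,?,?)", rows)
    return db

peers = [
    make_peer([(1, "A1", 0, 0, 5, 5), (2, "B7", 10, 10, 12, 12)]),
    make_peer([(3, "C3", 4, 4, 8, 8)]),
]

# Global bounding-box query, rewritten into an equivalent local SQL query per peer.
LOCAL_SQL = """SELECT id, name FROM roads
               WHERE NOT (xmax < ? OR xmin > ? OR ymax < ? OR ymin > ?)"""

def global_bbox_query(bbox):
    qxmin, qymin, qxmax, qymax = bbox
    merged = []
    for peer in peers:
        merged.extend(peer.execute(LOCAL_SQL, (qxmin, qxmax, qymin, qymax)).fetchall())
    return merged

print(global_bbox_query((3, 3, 6, 6)))   # roads whose extent intersects the query box
```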
NASA Astrophysics Data System (ADS)
Jain, Anubhav
2017-04-01
Density functional theory (DFT) simulations solve for the electronic structure of materials starting from the Schrödinger equation. Many case studies have now demonstrated that researchers can often use DFT to design new compounds in the computer (e.g., for batteries, catalysts, and hydrogen storage) before synthesis and characterization in the lab. In this talk, I will focus on how DFT calculations can be executed on large supercomputing resources in order to generate very large data sets on new materials for functional applications. First, I will briefly describe the Materials Project, an effort at LBNL that has virtually characterized over 60,000 materials using DFT and has shared the results with over 17,000 registered users. Next, I will talk about how such data can help discover new materials, describing how preliminary computational screening led to the identification and confirmation of a new family of bulk AMX2 thermoelectric compounds with measured zT reaching 0.8. I will outline future plans for how such data-driven methods can be used to better understand the factors that control thermoelectric behavior, e.g., for the rational design of electronic band structures, in ways that are different from conventional approaches.
NASA Center for Computational Sciences: History and Resources
NASA Technical Reports Server (NTRS)
2000-01-01
The Nasa Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.
The Michigan Electronic Library.
ERIC Educational Resources Information Center
Davidsen, Susanna L.
1997-01-01
Describes the Michigan Electronic Library (MEL), the largest evaluated and organized Web-based library of Internet resources, that was designed to provide a library of electronic information resources selected by librarians. MEL's partnership is explained, the collection is described, and future developments are considered. (LRW)
Monari, Antonio; Rivail, Jean-Louis; Assfeld, Xavier
2013-02-19
Molecular mechanics methods can efficiently compute the macroscopic properties of a large molecular system but cannot represent the electronic changes that occur during a chemical reaction or an electronic transition. Quantum mechanical methods can accurately simulate these processes, but they require considerably greater computational resources. Because electronic changes typically occur in a limited part of the system, such as the solute in a molecular solution or the substrate within the active site of enzymatic reactions, researchers can limit the quantum computation to this part of the system. Researchers take into account the influence of the surroundings by embedding this quantum computation into a calculation of the whole system described at the molecular mechanical level, a strategy known as the mixed quantum mechanics/molecular mechanics (QM/MM) approach. The accuracy of this embedding varies according to the types of interactions included, whether they are purely mechanical or classically electrostatic. This embedding can also introduce the induced polarization of the surroundings. The difficulty in QM/MM calculations comes from the splitting of the system into two parts, which requires severing the chemical bonds that link the quantum mechanical subsystem to the classical subsystem. Typically, researchers replace the quantoclassical atoms, those at the boundary between the subsystems, with a monovalent link atom. For example, researchers might add a hydrogen atom when a C-C bond is cut. This Account describes another approach, the Local Self Consistent Field (LSCF), which was developed in our laboratory. LSCF links the quantum mechanical portion of the molecule to the classical portion using a strictly localized bond orbital extracted from a small model molecule for each bond. In this scenario, the quantoclassical atom has an apparent nuclear charge of +1. To achieve correct bond lengths and force constants, we must take into account the inner shell of the atom: for an sp(3) carbon atom, we consider the two core 1s electrons and treat that carbon as an atom with three electrons. This results in an LSCF+3 model. Similarly, a nitrogen atom with a lone pair of electrons available for conjugation is treated as an atom with five electrons (LSCF+5). This approach is particularly well suited to splitting peptide bonds and other bonds that include carbon or nitrogen atoms. To embed the induced polarization within the calculation, researchers must use a polarizable force field. However, because the parameters of the usual force fields include an average of the induction effects, researchers typically can obtain satisfactory results without explicitly introducing the polarization. When considering electronic transitions, researchers must take into account the changes in the electronic polarization. One approach is to simulate the electronic cloud of the surroundings by a continuum whose dielectric constant is equal to the square of the refractive index. This Electronic Response of the Surroundings (ERS) methodology allows researchers to model the changes in induced polarization easily. We illustrate this approach by modeling the electronic absorption of tryptophan in human serum albumin (HSA).
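In the additive QM/MM picture discussed above, the total energy is commonly partitioned as (standard form, not specific to the LSCF scheme):

```latex
E_{\mathrm{QM/MM}} \;=\; E_{\mathrm{QM}} \;+\; E_{\mathrm{MM}} \;+\; E_{\mathrm{QM\text{-}MM}} ,
```

with the coupling term collecting the mechanical, electrostatic and, optionally, polarization embedding contributions between the quantum subsystem and its classical surroundings.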
Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.
Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav
2012-01-01
Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.
30 CFR 1206.154 - Determination of quantities and qualities for computing royalties.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 30 Mineral Resources 3 2014-07-01 2014-07-01 false Determination of quantities and qualities for computing royalties. 1206.154 Section 1206.154 Mineral Resources OFFICE OF NATURAL RESOURCES REVENUE, DEPARTMENT OF THE INTERIOR NATURAL RESOURCES REVENUE PRODUCT VALUATION Federal Gas § 1206.154 Determination...
30 CFR 1206.154 - Determination of quantities and qualities for computing royalties.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 30 Mineral Resources 3 2012-07-01 2012-07-01 false Determination of quantities and qualities for computing royalties. 1206.154 Section 1206.154 Mineral Resources OFFICE OF NATURAL RESOURCES REVENUE, DEPARTMENT OF THE INTERIOR NATURAL RESOURCES REVENUE PRODUCT VALUATION Federal Gas § 1206.154 Determination...
30 CFR 1206.154 - Determination of quantities and qualities for computing royalties.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 30 Mineral Resources 3 2013-07-01 2013-07-01 false Determination of quantities and qualities for computing royalties. 1206.154 Section 1206.154 Mineral Resources OFFICE OF NATURAL RESOURCES REVENUE, DEPARTMENT OF THE INTERIOR NATURAL RESOURCES REVENUE PRODUCT VALUATION Federal Gas § 1206.154 Determination...
Integrated Circuit Chip Improves Network Efficiency
NASA Technical Reports Server (NTRS)
2008-01-01
Prior to 1999 and the development of SpaceWire, a standard for high-speed links for computer networks managed by the European Space Agency (ESA), there was no high-speed communications protocol for flight electronics. Onboard computers, processing units, and other electronics had to be designed for individual projects and then redesigned for subsequent projects, which increased development periods, costs, and risks. After adopting the SpaceWire protocol in 2000, NASA implemented the standard on the Swift mission, a gamma ray burst-alert telescope launched in November 2004. Scientists and developers on the James Webb Space Telescope further developed the network version of SpaceWire. In essence, SpaceWire enables more science missions at a lower cost, because it provides a standard interface between flight electronics components; new systems need not be custom built to accommodate individual missions, so electronics can be reused. New protocols are helping to standardize higher layers of computer communication. Goddard Space Flight Center improved on the ESA-developed SpaceWire by enabling standard protocols, which included defining quality of service and supporting plug-and-play capabilities. Goddard upgraded SpaceWire to make the routers more efficient and reliable, with features including redundant cables, simultaneous discrete broadcast pulses, prevention of network blockage, and improved verification. Redundant cables simplify management because the user does not need to worry about which connection is available, and simultaneous broadcast signals allow multiple users to broadcast low-latency side-band signal pulses across the network using the same resources for data communication. Additional features have been added to the SpaceWire switch to prevent network blockage so that more robust networks can be designed. Goddard's verification environment for the link-and-switch implementation continuously randomizes and tests different parts, constantly anticipating situations, which helps improve communications reliability. It has been tested in many different implementations for compatibility.
NASA Technical Reports Server (NTRS)
Brown, Robert L.; Doyle, Dee; Haines, Richard F.; Slocum, Michael
1989-01-01
As part of the Telescience Testbed Pilot Program, the Universities Space Research Association/Research Institute for Advanced Computer Science (USRA/RIACS) proposed to support remote communication by providing a network of human/machine interfaces, computer resources, and experimental equipment which allows: remote science, collaboration, technical exchange, and multimedia communication. The telescience workstation is intended to provide a local computing environment for telescience. The purposes of the program are as follows: (1) to provide a suitable environment to integrate existing and new software for a telescience workstation; (2) to provide a suitable environment to develop new software in support of telescience activities; (3) to provide an interoperable environment so that a wide variety of workstations may be used in the telescience program; (4) to provide a supportive infrastructure and a common software base; and (5) to advance, apply, and evaluate the telescience technology base. A prototype telescience computing environment designed to bring practicing scientists in domains other than computer science into a modern style of doing their computing was created and deployed. This environment, the Telescience Windowing Environment, Phase 1 (TeleWEn-1), met some, but not all, of the goals stated above. The TeleWEn-1 provided a window-based workstation environment and a set of tools for text editing, document preparation, electronic mail, multimedia mail, raster manipulation, and system management.
Issues in undergraduate education in computational science and high performance computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marchioro, T.L. II; Martin, D.
1994-12-31
The ever increasing need for mathematical and computational literacy within their society and among members of the work force has generated enormous pressure to revise and improve the teaching of related subjects throughout the curriculum, particularly at the undergraduate level. The Calculus Reform movement is perhaps the best known example of an organized initiative in this regard. The UCES (Undergraduate Computational Engineering and Science) project, an effort funded by the Department of Energy and administered through the Ames Laboratory, is sponsoring an informal and open discussion of the salient issues confronting efforts to improve and expand the teaching of computational science as a problem oriented, interdisciplinary approach to scientific investigation. Although the format is open, the authors hope to consider pertinent questions such as: (1) How can faculty and research scientists obtain the recognition necessary to further excellence in teaching the mathematical and computational sciences? (2) What sort of educational resources--both hardware and software--are needed to teach computational science at the undergraduate level? Are traditional procedural languages sufficient? Are PCs enough? Are massively parallel platforms needed? (3) How can electronic educational materials be distributed in an efficient way? Can they be made interactive in nature? How should such materials be tied to the World Wide Web and the growing "Information Superhighway"?
Tools and Techniques for Measuring and Improving Grid Performance
NASA Technical Reports Server (NTRS)
Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)
2001-01-01
This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.
SaaS enabled admission control for MCMC simulation in cloud computing infrastructures
NASA Astrophysics Data System (ADS)
Vázquez-Poletti, J. L.; Moreno-Vozmediano, R.; Han, R.; Wang, W.; Llorente, I. M.
2017-02-01
Markov Chain Monte Carlo (MCMC) methods are widely used in the field of simulation and modelling of materials, producing applications that require a great amount of computational resources. Cloud computing represents a seamless source for these resources in the form of HPC. However, resource over-consumption can be an important drawback, especially if the cloud provision process is not appropriately optimized. In the present contribution we propose a two-level solution that, on the one hand, takes advantage of approximate computing to reduce the resource demand and, on the other, uses admission control policies to guarantee an optimal provision to running applications.
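To make the two-level idea concrete, here is a minimal sketch under assumed conditions: the cost model, class names, and thresholds are invented for illustration and are not the authors' implementation. The first level thins the requested MCMC samples (approximate computing); the second admits a request only if its estimated core-hour cost fits the available cloud budget.

```python
# Hypothetical two-level policy: approximate computing plus admission control.
# All names, the cost model, and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class SimulationRequest:
    chains: int           # number of MCMC chains requested
    samples: int          # samples per chain
    cores_per_chain: int  # cores needed to run one chain

def approximate(req: SimulationRequest, reduction: float = 0.5) -> SimulationRequest:
    """First level: trade accuracy for resources by thinning the requested samples."""
    return SimulationRequest(req.chains, max(1, int(req.samples * reduction)), req.cores_per_chain)

def admit(req: SimulationRequest, core_hour_budget: float, seconds_per_sample: float = 0.01) -> bool:
    """Second level: admit the job only if its estimated cost fits the remaining cloud budget."""
    core_hours = req.chains * req.cores_per_chain * req.samples * seconds_per_sample / 3600.0
    return core_hours <= core_hour_budget

req = SimulationRequest(chains=8, samples=100_000, cores_per_chain=4)
if not admit(req, core_hour_budget=5.0):
    req = approximate(req)                  # degrade the request before rejecting it outright
print(admit(req, core_hour_budget=5.0), req)
```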
Setting Up a Grid-CERT: Experiences of an Academic CSIRT
ERIC Educational Resources Information Center
Moller, Klaus
2007-01-01
Purpose: Grid computing has often been heralded as the next logical step after the worldwide web. Users of grids can access dynamic resources such as computer storage and use the computing resources of computers under the umbrella of a virtual organisation. Although grid computing is often compared to the worldwide web, it is vastly more complex…
Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center
NASA Astrophysics Data System (ADS)
Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.
2012-12-01
Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies, and the Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements for the architecture of the computing farms involved in its research field, several computing facilities are currently hosted by NSC institutes, each optimized for a particular set of tasks; the largest are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM&MG), and the Grid Computing Facility of BINP. A dedicated optical network with an initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on the XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.
Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua
2014-01-01
This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since the resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agents for mobile transparent computing (MTC) to devise a method of shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experimental results show that the strategy is effective and stable.
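The three-layer structure described above can be illustrated with a small skeleton; the class and method names below are hypothetical stand-ins for the user, management, and resource layers, not the paper's actual MTC implementation.

```python
# Illustrative skeleton of a three-layer user/management/resource split.
# Names, image files, and return values are invented for the example.

class ResourceLayer:
    """Remote server side: operating system images and user data."""
    def __init__(self):
        self.images = {"android": "android.img", "linux": "linux.img"}

    def fetch(self, name):
        return self.images[name]

class ManagementLayer:
    """Virtual resource management: maps terminal requests to stored resources."""
    def __init__(self, resources: ResourceLayer):
        self.resources = resources

    def allocate(self, os_name):
        return self.resources.fetch(os_name)

class UserLayer:
    """Mobile virtual terminal: what the user interacts with."""
    def __init__(self, manager: ManagementLayer):
        self.manager = manager

    def boot(self, os_name):
        image = self.manager.allocate(os_name)
        return f"terminal booted from {image}"

print(UserLayer(ManagementLayer(ResourceLayer())).boot("android"))
```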
Networking Micro-Processors for Effective Computer Utilization in Nursing
Mangaroo, Jewellean; Smith, Bob; Glasser, Jay; Littell, Arthur; Saba, Virginia
1982-01-01
Networking as a social entity has important implications for maximizing computer resources for improved utilization in nursing. This paper describes one process of networking complementary resources at three institutions: Prairie View A&M University, Texas A&M University, and the University of Texas School of Public Health, which has effected greater utilization of computers at the college. The results achieved in this project should have implications for nurses, users, and consumers in the development of computer resources.
Rouillard, Andrew D.; Wang, Zichen; Ma’ayan, Avi
2015-01-01
With advances in genomics, transcriptomics, metabolomics and proteomics, and more expansive electronic clinical record monitoring, as well as advances in computation, we have entered the Big Data era in biomedical research. Data gathering is growing rapidly while only a small fraction of this data is converted to useful knowledge or reused in future studies. To improve this, an important concept that is often overlooked is data abstraction. To fuse and reuse biomedical datasets from diverse resources, data abstraction is frequently required. Here we summarize some of the major Big Data biomedical research resources for genomics, proteomics and phenotype data, collected from mammalian cells, tissues and organisms. We then suggest simple data abstraction methods for fusing this diverse but related data. Finally, we demonstrate examples of the potential utility of such data integration efforts, while warning about the inherent biases that exist within such data. PMID:26101093
Extraction of rubidium from natural resources
NASA Astrophysics Data System (ADS)
Ertan, Bengü
2017-04-01
Rubidium is a rare alkali metal in the first group of the periodic table. It has distinctive properties such as softness, ductility, malleability, strong chemical and photo-emissive activity, a low melting point, and easy ionization. It is therefore used in many applications, such as optical and laser technology, electronics, telecommunications, biomedicine, space technology, and academic research, especially quantum-mechanics-based computing devices. Attention to rubidium in relation to its uses will increase in the near future. Rubidium has no mineral in which it is the main component. It is produced in minor quantities from lithium- or cesium-rich minerals and natural brines. However, there is little research on the extraction of rubidium from mine tailings. Extraction or concentration of rubidium from these resources is difficult because it requires a series of physical and chemical treatments and is expensive. Efficient, cheap, and environmentally friendly methods for the recovery of this metal are being investigated.
Using the Virtual Reality World of Second Life to Promote Patient Engagement
Weiner, Elizabeth; Trangenstein, Patricia; McNew, Ryan; Gordon, Jeffry
2017-01-01
Patients have typically been passive participants in their own healthcare. However, with a change in philosophy towards outcomes-driven care, it has become necessary to make sure that patients mutually set their healthcare goals with their providers. Both eHealth and mobile health applications have required patient participation in ways never before valued. The virtual reality world of Second Life offers one eHealth solution that requires computer-literate patients to participate via avatars in synchronous healthcare visits and support groups, as well as explore online resources asynchronously. This paper describes the development of a Second Life environment that served as a platform for nurse practitioner-driven care supplemented by a patient portal as well as the institutional electronic health record. In addition, the use of Second Life is described as an active exercise to expose students in a Consumer Health course to support groups and resources available to actively engage patients. PMID:27332190
NASA Astrophysics Data System (ADS)
Anderson, Delia Marie Castro
Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in the experimental group who responded to the Use of Internet Resources survey were positive (mean of 3.4 on the 4-point scale) toward their use of Internet resources, which included the online courseware developed by the researcher. Findings from this study suggest that (1) the digital divide with respect to gender and ethnicity may be narrowing, and (2) students who are exposed to a course that augments computer-driven courseware with traditional teaching methods appear to have less anxiety, have a clearer perception of computer usefulness, and feel that online resources enhance their learning.
Strategic Planning for Electronic Resources Management: A Case Study at Gustavus Adolphus College
ERIC Educational Resources Information Center
Hulseberg, Anna; Monson, Sarah
2009-01-01
Electronic resources, the tools we use to manage them, and the needs and expectations of our users are constantly evolving; at the same time, the roles, responsibilities, and workflow of the library staff who manage e-resources are also in flux. Recognizing a need to be more intentional and proactive about how we manage e-resources, the…
Cimino, James J.; Bakken, Suzanne
2012-01-01
Objectives (1) To develop a prototype Continuity of Care Record (CCR) with context-specific links to electronic HIV information resources; and (2) to assess case managers’ perceptions regarding the usability of the prototype. Methods We integrated context-specific links to HIV case management information resources into a prototype CCR using the Infobutton Manager and Librarian Infobutton Tailoring Environment (LITE). Case managers (N=9) completed a think-aloud protocol and the Computer System Usability Questionnaire (CSUQ) to evaluate the usability of the prototype. Verbalizations from the think-aloud protocol were summarized using thematic analysis. CSUQ data were analyzed with descriptive statistics. Results Although participants expressed positive comments regarding the usability of the prototype, the think-aloud protocol also identified the need for improvement in resource labels and for additional resources. On a scale ranging from 1 (strongly agree) to 7 (strongly disagree), the average CSUQ overall satisfaction was 2.25 indicating that users (n=9) were generally satisfied with the system. Mean CSUQ factor scores were: System Usefulness (M=2.13), Information Quality (M=2.46), and Interface Quality (M=2.26). Conclusion Our novel application of the Infobutton Manager and LITE in the context of case management for persons living with HIV in community-based settings resulted in a prototype CCR with infobuttons that met the majority of case managers’ information needs and received relatively positive usability ratings. Findings from this study inform future integration of context-specific links into CCRs and electronic health records and support their use for meeting end-users' information needs. PMID:22632821
Managing the Security of Nursing Data in the Electronic Health Record
Samadbeik, Mahnaz; Gorzin, Zahra; Khoshkam, Masomeh; Roudbari, Masoud
2015-01-01
Background: The Electronic Health Record (EHR) is a patient care information resource for clinicians, and nursing documentation is an essential part of comprehensive patient care. Ensuring privacy and the security of health information is a key component to building the trust required to realize the potential benefits of electronic health information exchange. This study aimed to examine the management of nursing data security in the EHR and also to discover the viewpoints of hospital information system vendors (computer companies) and hospital information technology specialists about nursing data security. Methods: This research is a cross-sectional analytic-descriptive study. The study populations were IT experts at the academic hospitals and computer companies of Tehran city in Iran. Data were collected by a self-developed questionnaire whose validity and reliability were confirmed using the experts’ opinions and Cronbach’s alpha coefficient, respectively. Data were analyzed using SPSS version 18 with descriptive and analytic statistics. Results: The findings of the study revealed that user name and password were the most important methods to authenticate nurses, with mean percentages of 95% and 80%, respectively, and that the most significant level of information security protection was assigned to administrative and logical controls. There was no significant difference between the opinions of the two groups studied about the levels of information security protection and security requirements (p>0.05). Moreover, access to servers by authorized people, periodic security updates, and the application of authentication and authorization were defined as the most basic security requirements from the viewpoint of more than 88 percent of the aforementioned participants. Conclusions: Computer companies as system designers and hospital information technology specialists as system users and stakeholders present many important views about security requirements for EHR systems and nursing electronic documentation systems. Prioritizing these requirements helps policy makers decide what to do when planning for EHR implementation. Therefore, to make appropriate security decisions and to achieve the expected level of protection of electronic nursing information, it is suggested to consider the priorities of both groups of experts about security principles and also to discuss the issues that seem to differ between the two groups of participants in the research. PMID:25870490
ERM Ideas and Innovations: Digital Repository Management as ERM
ERIC Educational Resources Information Center
Pinkas, María M.; Lin, Na
2014-01-01
This article describes the application of electronic resources management (ERM) to digital repository management at the Health Sciences and Human Services Library at the University of Maryland, Baltimore. The authors discuss electronic resources management techniques, through the application of "Techniques for Electronic Management,"…
Emergency medicine educational resource use in Cape Town: modern or traditional?
Kleynhans, A C; Oosthuizen, A H; van Hoving, D J
2017-05-01
The integration of online resources and social media into higher education and continuing professional development is an increasingly common phenomenon. To describe the usage of various traditional and modern educational resources by members of the divisions of emergency medicine at Stellenbosch University and the University of Cape Town. Members affiliated with the divisions during 2014 were invited to participate in an online survey. Participants were given 8 weeks to complete the questionnaire, with weekly reminders until they responded or the deadline expired. Summary statistics were used to describe the variables. Eighty-seven divisional members completed the survey (69.6% response rate). The resources most preferred were textbooks (n=78, 89.7%), open access educational resources (n=77, 88.5%) and journals (n=76, 87.4%). Emergency medicine trainees (n=31, 92.1%) and respondents ≤30 years (n=17, 94.4%) were more inclined to use social media. International emergency medicine and critical care blogs are frequently used by 71% of respondents. YouTube (35%) and podcasts (21%) were the most commonly used multimedia resources. Computers (desktop and laptop) were most frequently used to access educational resources, except for social media, where smartphones were preferred. The use of modern and electronic resources is relatively common, but traditional educational resources are still preferred. This study illustrates an opportunity for greater integration of online resources and social media in educational activities to enhance multimodal and self-directed learning. Specific training in the use of these resources and how to appraise them may further improve their utility. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Desktop Computing Integration Project
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1992-01-01
The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, to link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.
Managing Tradeoffs in the Electronic Age.
ERIC Educational Resources Information Center
Wagner, A. Ben
2003-01-01
Provides an overview of the development of electronic resources over the past three decades, discussing key features, disadvantages, and benefits of traditional online databases and CD-ROM and Web-based resources. Considers the decision to shift collections and resources toward purely digital formats, ownership of content, licensing, and user…
76 FR 49753 - Privacy Act of 1974; System of Records
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-11
... Defense. DHA 14 System name: Computer/Electronics Accommodations Program for People with Disabilities... with ``Computer/Electronic Accommodations Program.'' System location: Delete entry and replace with ``Computer/Electronic Accommodations Program, Skyline 5, Suite 302, 5111 Leesburg Pike, Falls Church, VA...
HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation
Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian; ...
2017-09-29
Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we will discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.
The Relative Effectiveness of Computer-Based and Traditional Resources for Education in Anatomy
ERIC Educational Resources Information Center
Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R.; Wainman, Bruce
2013-01-01
There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic…
Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing
Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav
2012-01-01
Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640
Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue
NASA Astrophysics Data System (ADS)
Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; Bagliesi, Giuseppe; Belforte, Stephano; Campana, Simone; Dimou, Maria; Flix, Jose; Forti, Alessandra; di Girolamo, A.; Karavakis, Edward; Lammel, Stephan; Litmaath, Maarten; Sciaba, Andrea; Valassi, Andrea
2017-10-01
The Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in the various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments increasingly rely on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources and new information sources, and should allow new data structures to be implemented easily, following the evolution of the computing models and operations of the experiments.
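A minimal sketch of the collect-validate-merge flow that such a catalogue implies is shown below; the provider names, record fields, and validation rule are placeholders, not the real CRIC schema or API.

```python
# Hypothetical collect-validate-merge flow for a topology catalogue.
# Provider names, fields, and the validation rule are invented.

REQUIRED_FIELDS = {"site", "service", "endpoint"}

def validate(record: dict) -> bool:
    """A record is usable only if it carries the minimum topology description."""
    return REQUIRED_FIELDS.issubset(record)

def consolidate(sources: dict) -> dict:
    """Merge per-provider records into one catalogue keyed by (site, service)."""
    catalogue = {}
    for provider, records in sources.items():
        for rec in records:
            if not validate(rec):
                continue  # incomplete or inconsistent entries are skipped
            key = (rec["site"], rec["service"])
            entry = catalogue.setdefault(key, {"providers": []})
            entry.update(rec)
            entry["providers"].append(provider)
    return catalogue

sources = {
    "provider_a": [{"site": "CERN-PROD", "service": "CE", "endpoint": "ce.example.cern.ch"}],
    "provider_b": [{"site": "CERN-PROD", "service": "CE"}],  # missing endpoint: dropped
}
print(consolidate(sources))
```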
Classical multiparty computation using quantum resources
NASA Astrophysics Data System (ADS)
Clementi, Marco; Pappa, Anna; Eckstein, Andreas; Walmsley, Ian A.; Kashefi, Elham; Barz, Stefanie
2017-12-01
In this work, we demonstrate a way to perform classical multiparty computing among parties with limited computational resources. Our method harnesses quantum resources to increase the computational power of the individual parties. We show how a set of clients restricted to linear classical processing are able to jointly compute a nonlinear multivariable function that lies beyond their individual capabilities. The clients are only allowed to perform classical XOR gates and single-qubit gates on quantum states. We also examine the type of security that can be achieved in this limited setting. Finally, we provide a proof-of-concept implementation using photonic qubits that allows four clients to compute a specific example of a multiparty function, the pairwise AND.
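As a purely classical illustration (the quantum protocol itself is not reproduced here), one common reading of a "pairwise AND" of n input bits is the XOR over the AND of every pair, a degree-two function that XOR-only clients cannot compute on their own. This reading is an assumption for illustration, not a definition taken from the paper.

```python
# Classical illustration only: assumed "pairwise AND" = XOR of x_i AND x_j over all pairs.
from itertools import combinations

def pairwise_and(bits):
    """XOR of the AND of every pair i < j; nonlinear (degree two) in the inputs."""
    result = 0
    for a, b in combinations(bits, 2):
        result ^= a & b
    return result

# Four clients with one input bit each, as in the proof-of-concept experiment.
print(pairwise_and([1, 0, 1, 1]))  # -> 1
```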
Egle, Jonathan P; Smeenge, David M; Kassem, Kamal M; Mittal, Vijay K
2015-01-01
Electronic sources of medical information are plentiful, and numerous studies have demonstrated the use of the Internet by patients and the variable reliability of these sources. Studies have investigated neither the use of web-based resources by residents, nor the reliability of the information available on these websites. A web-based survey was distributed to surgical residents in Michigan and third- and fourth-year medical students at an American allopathic and osteopathic medical school and a Caribbean allopathic school regarding their preferred sources of medical information in various situations. A set of 254 queries simulating those faced by medical trainees on rounds, on a written examination, or during patient care was developed. The top 5 electronic resources cited by the trainees were evaluated for their ability to answer these questions accurately, using standard textbooks as the point of reference. The respondents reported a wide variety of overall preferred resources. Most of the 73 responding medical trainees favored textbooks or board review books for prolonged studying, but electronic resources are frequently used for quick studying, clinical decision-making questions, and medication queries. The most commonly used electronic resources were UpToDate, Google, Medscape, Wikipedia, and Epocrates. UpToDate and Epocrates had the highest percentage of correct answers (47%) and Wikipedia had the lowest (26%). Epocrates also had the highest percentage of wrong answers (30%), whereas Google had the lowest percentage (18%). All resources had a significant number of questions that they were unable to answer. Though hardcopy books have not been completely replaced by electronic resources, more than half of medical students and nearly half of residents prefer web-based sources of information. For quick questions and studying, both groups prefer Internet sources. However, the most commonly used electronic resources fail to answer clinical queries more than half of the time and have an alarmingly high rate of inaccurate information. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Computer Network Resources for Physical Geography Instruction.
ERIC Educational Resources Information Center
Bishop, Michael P.; And Others
1993-01-01
Asserts that the use of computer networks provides an important and effective resource for geography instruction. Describes the use of the Internet network in physical geography instruction. Provides an example of the use of Internet resources in a climatology/meteorology course. (CFR)
Self managing experiment resources
NASA Astrophysics Data System (ADS)
Stagni, F.; Ubeda, M.; Tsaregorodtsev, A.; Romanovskiy, V.; Roiser, S.; Charpentier, P.; Graciani, R.
2014-06-01
Within this paper we present an autonomic computing resources management system, used by LHCb for assessing the status of its Grid resources. Virtual Organizations' Grids include heterogeneous resources. For example, LHC experiments very often use resources not provided by WLCG, and Cloud Computing resources will soon provide a non-negligible fraction of their computing power. The lack of standards and procedures across experiments and sites has led to the appearance of multiple information systems, monitoring tools, ticket portals, etc., which nowadays coexist and represent a very precious source of information for running the computing systems of HEP experiments as well as the sites. These two facts lead to many particular solutions for a general problem: managing the experiment resources. In this paper we present how LHCb, via the DIRAC interware, addressed such issues. With a renewed Central Information Schema hosting all resources metadata and a Status System (Resource Status System) delivering real-time information, the system controls the resources topology, independently of the resource types. The Resource Status System applies data mining techniques to all available information sources and assesses status changes, which are then propagated to the topology description. Obviously, giving full control to such an automated system is not risk-free. Therefore, in order to minimise the probability of misbehaviour, a battery of tests has been developed to certify the correctness of its assessments. We demonstrate the performance and efficiency of such a system in terms of cost reduction and reliability.
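The policy-based assessment such a Resource Status System performs can be sketched as follows; the policy names, status values, and monitoring fields are invented for illustration and are not DIRAC's actual RSS code.

```python
# Schematic policy-based status assessment; all names and thresholds are illustrative.
from enum import Enum

class Status(Enum):
    ACTIVE = 3
    DEGRADED = 2
    BANNED = 1

def downtime_policy(resource):
    return Status.BANNED if resource.get("in_downtime") else Status.ACTIVE

def efficiency_policy(resource):
    return Status.DEGRADED if resource.get("job_efficiency", 1.0) < 0.5 else Status.ACTIVE

POLICIES = [downtime_policy, efficiency_policy]

def assess(resource: dict) -> Status:
    """Combine policy verdicts conservatively: the worst verdict wins."""
    return min((policy(resource) for policy in POLICIES), key=lambda s: s.value)

site = {"name": "LCG.Example.org", "in_downtime": False, "job_efficiency": 0.42}
print(assess(site))  # Status.DEGRADED -> would then be propagated to the topology
```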
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sunderam, Vaidy S.
2007-01-09
The Harness project has developed novel software frameworks for the execution of high-end simulations in a fault-tolerant manner on distributed resources. The H2O subsystem comprises the kernel of the Harness framework, and controls the key functions of resource management across multiple administrative domains, especially issues of access and allocation. It is based on a “pluggable” architecture that enables the aggregated use of distributed heterogeneous resources for high performance computing. The major contributions of the Harness II project significantly enhance the overall computational productivity of high-end scientific applications by enabling robust, failure-resilient computations on cooperatively pooled resource collections.
Construction and application of Red5 cluster based on OpenStack
NASA Astrophysics Data System (ADS)
Wang, Jiaqing; Song, Jianxin
2017-08-01
With the application and development of cloud computing technology in various fields, the resource utilization rate of the data center has improved markedly, and systems based on cloud computing platforms have also improved in scalability and stability. In the traditional deployment approach, Red5 cluster resource utilization is low and system stability is poor. This paper uses the efficient resource-allocation capability of cloud computing to build a Red5 server cluster based on OpenStack. Multimedia applications can be published to the Red5 cloud server cluster. The system achieves flexible provisioning of computing resources and also greatly improves cluster stability and service efficiency.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-04
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-769] Certain Handheld Electronic Computing Devices, Related Software, and Components Thereof; Termination of the Investigation Based on... electronic computing devices, related software, and components thereof by reason of infringement of certain...
30 CFR 1210.54 - Must I submit this royalty report electronically?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 30 Mineral Resources 3 2011-07-01 2011-07-01 false Must I submit this royalty report electronically? 1210.54 Section 1210.54 Mineral Resources OFFICE OF SURFACE MINING RECLAMATION AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR Natural Resources Revenue FORMS AND REPORTS Royalty Reports-Oil, Gas, and...
Model for Presenting Resources in Scholar's Portal
ERIC Educational Resources Information Center
Feeney, Mary; Newby, Jill
2005-01-01
Presenting electronic resources to users through a federated search engine introduces unique opportunities and challenges to libraries. This article reports on the decision-making tools and processes used for selecting collections of electronic resources by a project team at the University of Arizona (UA) Libraries for the Association of Research…
Embedded Thermal Control for Spacecraft Subsystems Miniaturization
NASA Technical Reports Server (NTRS)
Didion, Jeffrey R.
2014-01-01
Optimization of spacecraft size, weight and power (SWaP) resources is an explicit technical priority at Goddard Space Flight Center. Embedded Thermal Control Subsystems are a promising technology with many cross-cutting NASA, DoD and commercial applications: 1.) CubeSat/SmallSat spacecraft architecture, 2.) high performance computing, 3.) on-board spacecraft electronics, 4.) power electronics and RF arrays. The Embedded Thermal Control Subsystem technology development efforts focus on component, board and enclosure level devices that will ultimately include intelligent capabilities. The presentation will discuss electric, capillary and hybrid based hardware research and development efforts at Goddard Space Flight Center. The Embedded Thermal Control Subsystem development program consists of interrelated sub-initiatives, e.g., chip component level thermal control devices, self-sensing thermal management, advanced manufactured structures. This presentation includes technical status and progress on each of these investigations. Future sub-initiatives, technical milestones and program goals will be presented.
NASA Astrophysics Data System (ADS)
Xiong, Ting; He, Zhiwen
2017-06-01
Cloud computing was first proposed by Google in the United States as an Internet-centred model that provides a standard and open approach to shared network services. With the rapid development of higher education in China, there is a large gap between the educational resources that colleges and universities provide and the actual needs for teaching resources. Cloud computing, which uses Internet technology to provide sharing methods, has therefore arrived like timely rain and become an important means of sharing digital educational resources in current higher education. Against the background of a cloud computing environment, this paper analyses the existing problems in the sharing of digital educational resources at independent colleges in Jiangxi Province. Drawing on the mass storage, efficient operation, and low cost that characterize cloud computing, the author explores the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the design of the sharing model is put into practical application.
Kohler, Steven W; Chen, Richard; Kagan, Alex; Helvey, Dustin W; Buccigrossi, David
2013-06-01
In order to determine the effects of implementation of an electronic medical record on rates of repeat computed tomography (CT) scanning in the emergency department (ED) setting, we analyzed the utilization of CT of the kidneys, ureters, and bladder (CT KUB) for the detection of urinary tract calculi for periods before and after the implementation of a hospital-wide electronic medical record system. Rates of repeat CT scanning within 6 months of a previous scan were determined pre- and post-implementation and compared. Prior to implementation, the 6-month repeat rate was 6.2%, compared with a 6-month repeat rate of 4.1% in the post-implementation period. Statistical analysis using a two-sample, one-tailed t test for difference of means yielded a p value of 0.00007. This indicates that the implementation of the electronic medical record system was associated with a 34% decrease in 6-month repeat CT KUB scans. We conclude that the use of an electronic medical record can be associated with a decrease in utilization of unnecessary repeat CT imaging, leading to decreased cumulative lifetime risk for cancer in these patients and more efficient utilization of ED and radiologic resources.
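The reported 34% figure can be checked directly from the two repeat rates; the snippet below is only a back-of-the-envelope verification of that arithmetic.

```python
# Relative decrease in the 6-month repeat rate, from the values quoted above.
pre, post = 6.2, 4.1                 # repeat CT KUB rates, in percent
relative_decrease = (pre - post) / pre
print(f"{relative_decrease:.0%}")    # -> 34%
```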
Hashimoto, Teruo; Thompson, George E; Zhou, Xiaorong; Withers, Philip J
2016-04-01
Mechanical serial block face scanning electron microscopy (SBFSEM) has emerged as a means of obtaining three dimensional (3D) electron images over volumes much larger than possible by focused ion beam (FIB) serial sectioning and at higher spatial resolution than achievable with conventional X-ray computed tomography (CT). Such high resolution 3D electron images can be employed for precisely determining the shape, volume fraction, distribution and connectivity of important microstructural features. While soft (fixed or frozen) biological samples are particularly well suited for nanoscale sectioning using an ultramicrotome, the technique can also produce excellent 3D images at electron microscope resolution in a time and resource-efficient manner for engineering materials. Currently, a lack of appreciation of the capabilities of ultramicrotomy and the operational challenges associated with minimising artefacts for different materials is limiting its wider application to engineering materials. Consequently, this paper outlines the current state of the art for SBFSEM examining in detail how damage is introduced during slicing and highlighting strategies for minimising such damage. A particular focus of the study is the acquisition of 3D images for a variety of metallic and coated systems. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Electronic Reference Works and Library Budgeting Dilemma
ERIC Educational Resources Information Center
Lawal, Ibironke O.
2007-01-01
The number of electronic resources has climbed up steadily in recent times. Some of these e-resources are reference sources, mostly in Science, Technology and Medicine (STM), which publishers convert to electronic for obvious reasons. The library budgets for materials usually have two main lines, budget for one time purchase (monographs) and…
ERIC Educational Resources Information Center
McDowell, Liz
2002-01-01
This qualitative interview-based study examines lecturer perspectives on the roles of electronic information resources in undergraduate education. Highlights include electronic academic libraries; changes toward more constructivist approaches to learning; information quality on the Web; plagiarism; information use; information literacy; and…
Computer-assisted propofol administration.
O'Connor, J P A; O'Moráin, C A; Vargo, J J
2010-01-01
The use of propofol for sedation in endoscopy may allow for better quality of sedation, quicker recovery and greater throughput in endoscopy units. The cost-effectiveness and utility of propofol sedation for endoscopic procedures is contingent on the personnel and resources required to carry out the procedure. Computer-based platforms are based on the patient's response to stimulation and physiologic parameters. They offer an appealing means of delivering safe and effective doses of propofol. One such means is the bispectral index, in which continuous EEG recordings are used to assess the degree of sedation. Another is the closed-loop target-controlled system, in which a set of physical parameters, such as muscle relaxation and auditory-evoked potential, determines a level of medication appropriate to achieve sedation. Patient-controlled platforms may also be used. These electronic adjuncts may help endoscopists who wish to adopt propofol sedation to change current practices with greater confidence. Copyright 2010 S. Karger AG, Basel.
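Purely as an illustration of the closed-loop idea (not a clinical algorithm, and not the control law of any of the cited systems), a proportional controller that nudges an infusion rate toward a target sedation index might look like the sketch below; the target, gain, and rate limits are invented.

```python
# Illustrative proportional control toward a sedation-index target.
# NOT a clinical algorithm; all numbers are made up for the example.

def adjust_infusion(current_rate, bis_measured, bis_target=60.0,
                    gain=0.05, min_rate=0.0, max_rate=10.0):
    """Raise the rate when the index is above target (too light), lower it when below."""
    error = bis_measured - bis_target
    new_rate = current_rate + gain * error
    return max(min_rate, min(max_rate, new_rate))

rate = 3.0                         # arbitrary units
for bis in (75, 68, 61, 55):       # simulated index readings over time
    rate = adjust_infusion(rate, bis)
    print(round(rate, 2))
```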
NASA Astrophysics Data System (ADS)
Guo, Qi; Cheng, Liu-Yong; Chen, Li; Wang, Hong-Fu; Zhang, Shou
2014-10-01
Existing distributed quantum gates require physical particles to be transmitted between two distant nodes in a quantum network. We here demonstrate the possibility of implementing distributed quantum computation without transmitting any particles. We propose a scheme for a distributed controlled-phase gate between two distant quantum-dot electron-spin qubits in optical microcavities. The two quantum-dot-microcavity systems are linked by a nested Michelson-type interferometer. A single photon acting as an ancillary resource is sent into the interferometer to complete the distributed controlled-phase gate, but it never enters the transmission channel between the two nodes. Moreover, we numerically analyze the effect of experimental imperfections and show that the present scheme can be implemented with high fidelity in the ideal asymptotic limit. The scheme provides further evidence of quantum counterfactuality and opens promising possibilities for distributed quantum computation.
Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems
NASA Astrophysics Data System (ADS)
Dogan, Firat; Atilgan, Yasemin
Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing capabilities in information technology. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization and all the other strong aspects of information technology bring the necessary instruments to almost every desk. The special software and sophisticated supercomputer environments of the last decade are now serving individual needs inside “tiny smart boxes” at reasonable prices. However, resistance to learning new computerized environments, insufficient training and other old habits prevent effective utilization of IT resources by specialists in the health sector. In this paper, all aspects of former and current developments in surgery planning and simulation-related tools are presented, and future directions and expectations for better electronic health care systems are investigated.
Calculation of wakefields in 2D rectangular structures
Zagorodnov, I.; Bane, K. L. F.; Stupakov, G.
2015-10-19
We consider the calculation of electromagnetic fields generated by an electron bunch passing through a vacuum chamber structure that, in general, consists of an entry pipe, followed by some kind of transition or cavity, and ending in an exit pipe. We limit our study to structures having rectangular cross section, where the height can vary as function of longitudinal coordinate but the width and side walls remain fixed. For such structures, we derive a Fourier representation of the wake potentials through one-dimensional functions. A new numerical approach for calculating the wakes in such structures is proposed and implemented in the computer code echo(2d). The computation resource requirements for this approach are moderate and comparable to those for finding the wakes in 2D rotationally symmetric structures. Finally, we present numerical examples obtained with the new numerical code.
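For reference, the standard wake-potential definitions are given below in one common sign convention; the paper's own Fourier representation through one-dimensional functions is not reproduced here.

```latex
% Standard wake-potential definitions (one common sign convention).
W_\parallel(x,y,s) \;=\; -\frac{1}{q}\int_{-\infty}^{\infty}
   E_z\!\bigl(x,y,z,\,t=(z+s)/c\bigr)\,\mathrm{d}z ,
\qquad
\frac{\partial \vec{W}_\perp}{\partial s} \;=\; \nabla_\perp W_\parallel
\quad\text{(Panofsky--Wenzel theorem).}
```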
NASA Astrophysics Data System (ADS)
Skouteris, D.; Barone, V.
2014-06-01
We report the main features of a new general implementation of the Gaussian Multi-Configuration Time-Dependent Hartree model. The code allows effective computations of time-dependent phenomena, including calculation of vibronic spectra (in one or more electronic states), relative state populations, etc. Moreover, by expressing the Dirac-Frenkel variational principle in terms of an effective Hamiltonian, we are able to provide a new reliable estimate of the representation error. After validating the code on simple one-dimensional systems, we analyze the harmonic and anharmonic vibrational spectra of water and glycine showing that reliable and converged energy levels can be obtained with reasonable computing resources. The data obtained on water and glycine are compared with results of previous calculations using the vibrational second-order perturbation theory method. Additional features and perspectives are also shortly discussed.
BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way
NASA Astrophysics Data System (ADS)
Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip
2017-10-01
The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is contradictory to the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, which is commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge to this paradox, as its pilot is more closely coupled with operations requiring X.509 authentication than the pilot implementations of its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach that detaches the payload from the Belle II DIRAC pilot (a customized pilot that pulls and processes jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service running on a trusted server which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which demonstrates the feasibility of this approach. This approach can also be applied to HPC systems whose worker nodes do not have outbound connectivity to interact with the DIRAC system in general.
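The volunteer-side flow under such a gateway approach might look like the following sketch: the volunteer host speaks plain HTTPS to a trusted gateway and never touches X.509 credentials. The endpoint URLs and JSON fields are hypothetical, not the actual BelleII@home interfaces.

```python
# Hypothetical volunteer-side flow behind a trusted gateway.
# Endpoint URLs and JSON fields are invented for illustration.
import requests

GATEWAY = "https://gateway.example.org"   # trusted server holding the grid credentials

def fetch_payload():
    """Ask the gateway for the next payload; the gateway talks to DIRAC on our behalf."""
    reply = requests.get(f"{GATEWAY}/job/next", timeout=60)
    reply.raise_for_status()
    return reply.json()                   # e.g. {"job_id": ..., "payload_url": ...}

def upload_result(job_id, output_path):
    """Return the produced output through the gateway, again without any grid proxy."""
    with open(output_path, "rb") as f:
        requests.post(f"{GATEWAY}/job/{job_id}/output", files={"output": f}, timeout=300)

job = fetch_payload()
# ... run the detached payload locally, then:
upload_result(job["job_id"], "result.tar.gz")
```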
Dynamic VM Provisioning for TORQUE in a Cloud Environment
NASA Astrophysics Data System (ADS)
Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.
2014-06-01
Cloud computing, also known as an Infrastructure-as-a-Service (IaaS), is attracting more interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high throughput computing clusters.
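As a rough illustration of the provisioning step, the sketch below boots an OpenStack VM with the openstacksdk client and registers it as a TORQUE node via qmgr. The cloud name, image, flavor and network names are placeholders, and the Maui scheduling integration described in the paper is not shown.

```python
# Sketch of dynamically booting an OpenStack VM and registering it as a
# TORQUE compute node (assumed cloud/image/flavor/network names; not the
# paper's actual tooling).
import subprocess
import openstack

def provision_worker(name="dyn-worker-01"):
    conn = openstack.connect(cloud="mycloud")           # credentials from clouds.yaml
    image = conn.compute.find_image("worker-image")     # assumed image name
    flavor = conn.compute.find_flavor("m1.medium")      # assumed flavor name
    network = conn.network.find_network("private")      # assumed network name
    server = conn.compute.create_server(
        name=name, image_id=image.id, flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    server = conn.compute.wait_for_server(server)

    # Tell the TORQUE server about the new node so the scheduler can use it.
    subprocess.run(["qmgr", "-c", f"create node {name}"], check=True)
    return server

if __name__ == "__main__":
    provision_worker()
```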
Pilots 2.0: DIRAC pilots for all the skies
NASA Astrophysics Data System (ADS)
Stagni, F.; Tsaregorodtsev, A.; McNab, A.; Luzzi, C.
2015-12-01
In the last few years, new types of computing infrastructures, such as IAAS (Infrastructure as a Service) and IAAC (Infrastructure as a Client), gained popularity. New resources may come as part of pledged resources, while others are opportunistic. Most of these new infrastructures are based on virtualization techniques. Meanwhile, some concepts, such as distributed queues, lost appeal, while still supporting a vast amount of resources. Virtual Organizations are therefore facing heterogeneity of the available resources, and the use of interware software like DIRAC to hide the diversity of underlying resources has become essential. The DIRAC WMS is based on the concept of pilot jobs, introduced back in 2004. A pilot is what creates the possibility to run jobs on a worker node. Within DIRAC, we developed a new generation of pilot jobs, which we dubbed Pilots 2.0. Pilots 2.0 are not tied to a specific infrastructure; rather, they are generic, fully configurable and extensible pilots. A Pilot 2.0 can be sent as a script to be run, or it can be fetched from a remote location. A Pilot 2.0 can run on every computing resource, e.g. on CREAM Computing Elements, on DIRAC Computing Elements, on Virtual Machines as part of the contextualization script, or on IAAC resources, provided that these machines are properly configured, hiding all the details of the Worker Node (WN) infrastructure. Pilots 2.0 can be generated server side and client side. Pilots 2.0 are the “pilots to fly in all the skies”, aiming at easy use of computing power in whatever form it is presented. Another aim is the unification and simplification of the monitoring infrastructure for all kinds of computing resources, by using pilots as a network of distributed sensors coordinated by a central resource monitoring system. Pilots 2.0 have been developed using the command pattern. VOs using DIRAC can tune Pilots 2.0 as they need, and extend or replace each and every pilot command in an easy way. In this paper we describe how Pilots 2.0 work with distributed and heterogeneous resources, providing the necessary abstraction to deal with different kinds of computing resources.
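The command pattern mentioned above can be illustrated with a generic skeleton in which each pilot step is a small command object that a VO can reorder, extend or replace. The command names below are invented for the example and are not the actual DIRAC pilot commands.

```python
# Illustrative command-pattern skeleton for a configurable pilot: each step is
# a command object sharing one pilot context; tuning the pilot means choosing
# (or subclassing) the command list. Command names are made up.
from abc import ABC, abstractmethod

class PilotCommand(ABC):
    @abstractmethod
    def execute(self, context: dict) -> None: ...

class CheckEnvironment(PilotCommand):
    def execute(self, context):
        context["cpu_count"] = 8            # e.g. probe the worker node

class InstallSoftware(PilotCommand):
    def execute(self, context):
        context["software_ready"] = True    # e.g. set up the VO software stack

class MatchAndRunJob(PilotCommand):
    def execute(self, context):
        context["job_status"] = "done"      # e.g. pull a payload and run it

def run_pilot(commands, context=None):
    context = context or {}
    for command in commands:
        command.execute(context)            # commands run in the chosen order
    return context

# A VO tunes the pilot simply by choosing the command list:
print(run_pilot([CheckEnvironment(), InstallSoftware(), MatchAndRunJob()]))
```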
An Overview of Cloud Computing in Distributed Systems
NASA Astrophysics Data System (ADS)
Divakarla, Usha; Kumari, Geetha
2010-11-01
Cloud computing is an emerging trend in the field of distributed computing, having evolved from grid and distributed computing. The cloud plays an important role in large organizations by maintaining huge volumes of data with limited resources, and it also enables resource sharing through virtual machines provided by the cloud service provider. This paper gives an overview of cloud organization and some of the basic security issues pertaining to the cloud.
AGIS: Integration of new technologies used in ATLAS Distributed Computing
NASA Astrophysics Data System (ADS)
Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria
2017-10-01
The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific used resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS, and it is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services are continuously evolving and tend to fit newer requirements from the ADC community. In this note, we describe the evolution and the recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, like flexible computing utilization of opportunistic Cloud and HPC resources, ObjectStore services integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, unified storage protocol declarations required for PanDA Pilot site movers, and others. The improvements of the information model and general updates are also shown; in particular we explain how other collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.
2014-01-01
Background As Family Medicine programs across Canada are transitioning into a competency-based curriculum, medical students and clinical teachers are increasingly incorporating tablet computers in their work and educational activities. The purpose of this pilot study was to identify how preceptors and residents use tablet computers to implement and adopt a new family medicine curriculum and to evaluate how they access applications (apps) through their tablet in an effort to support and enhance effective teaching and learning. Methods Residents and preceptors (n = 25) from the Family Medicine program working at the Pembroke Regional Hospital in Ontario, Canada, were given iPads and training on how to use the device in clinical teaching and learning activities and how to access the online curriculum. Data regarding the use and perceived contribution of the iPads were collected through surveys and focus groups. This mixed methods research used analysis of survey responses to support the selection of questions for focus groups. Results Reported results were categorized into: curriculum and assessment; ease of use; portability; apps and resources; and perceptions about the use of the iPad in teaching/learning setting. Most participants agreed on the importance of accessing curriculum resources through the iPad but recognized that these required enhancements to facilitate use. The iPad was considered to be more useful for activities involving output of information than for input. Participants’ responses regarding the ease of use of mobile technology were heterogeneous due to the diversity of computer proficiency across users. Residents had a slightly more favorable opinion regarding the iPad’s contribution to teaching/learning compared to preceptors. Conclusions iPad’s interface should be fully enhanced to allow easy access to online curriculum and its built-in resources. The differences in computer proficiency level among users should be reduced by sharing knowledge through workshops led by more skillful iPad users. To facilitate collection of information through the iPad, the design of electronic data-input forms should consider the participants’ reported negative perceptions towards typing data through mobile devices. Technology deployment projects should gather sufficient evidence from pilot studies in order to guide efforts to adapt resources and infrastructure to relevant needs of Family Medicine teachers and learners. PMID:25138307
ERIC Educational Resources Information Center
Lancaster, F. W.
1989-01-01
Describes various stages involved in the applications of electronic media to the publishing industry. Highlights include computer typesetting, or photocomposition; machine-readable databases; the distribution of publications in electronic form; computer conferencing and electronic mail; collaborative authorship; hypertext; hypermedia publications;…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jon Eisenberg, Director, CSTB
The Computer Science and Telecommunications Board of the National Research Council considers technical and policy issues pertaining to computer science (CS), telecommunications, and information technology (IT). The functions of the board include: (1) monitoring and promoting the health of the CS, IT, and telecommunications fields, including attention as appropriate to issues of human resources and funding levels and program structures for research; (2) initiating studies involving CS, IT, and telecommunications as critical resources and sources of national economic strength; (3) responding to requests from the government, non-profit organizations, and private industry for expert advice on CS, IT, and telecommunications issues; and to requests from the government for expert advice on computer and telecommunications systems planning, utilization, and modernization; (4) fostering interaction among CS, IT, and telecommunications researchers and practitioners, and with other disciplines; and providing a base of expertise in the National Research Council in the areas of CS, IT, and telecommunications. This award has supported the overall operation of CSTB. Reports resulting from the Board's efforts have been widely disseminated in both electronic and print form, and all CSTB reports are available at its World Wide Web home page at cstb.org. The following reports, resulting from projects that were separately funded by a wide array of sponsors, were completed and released during the award period: 2007: * Summary of a Workshop on Software-Intensive Systems and Uncertainty at Scale * Social Security Administration Electronic Service Provision: A Strategic Assessment * Toward a Safer and More Secure Cyberspace * Software for Dependable Systems: Sufficient Evidence? * Engaging Privacy and Information Technology in a Digital Age * Improving Disaster Management: The Role of IT in Mitigation, Preparedness, Response, and Recovery 2006: * Renewing U.S. Telecommunications Research * Letter Report on Electronic Voting * Summary of a Workshop on the Technology, Policy, and Cultural Dimensions of Biometric System 2005: * Catalyzing Inquiry at the Interface of Computing and Biology * Summary of a Workshop on Using IT to Enhance Disaster Management * Asking the Right Questions About Electronic Voting * Building an Electronic Records Archive at NARA: Recommendations for a Long-Term Strategy * Signposts in Cyberspace: The Domain Name System and Internet Navigation 2004: * ITCP: Information Technology and Creative Practices (brochure) * Radio Frequency Identification (RFID) Technologies: A Workshop Summary * Getting up to Speed: The Future of Supercomputing * Summary of a Workshop on Software Certification and Dependability * Computer Science: Reflections on the Field, Reflections from the Field CSTB conducted numerous briefings of these reports and transmitted copies of these reports to researchers and key decision makers in the public and private sectors. It developed articles for journals based on several of these reports. As requested, and in fulfillment of its congressional charter to act as an independent advisor to the federal government, it arranged for congressional testimony on several of these reports. CSTB also convenes a number of workshops and other events, either as part of studies or in conjunction with meetings of the CSTB members.
These events have included the following: two 2007 workshops explored issues and challenges related to state voter registration databases, record matching, and database interoperability. A Sept. 2007 workshop, Trends in Computing Performance, explored fundamental trends in areas such as power, storage, programming, and applications. An Oct. 2007 workshop presented highlights of CSTB's May 2007 report, Software for Dependable Systems: Sufficient Evidence?, along with several panels discussing the report's conclusions and their implications. A Jan. 2007 workshop, Uncertainty at Scale, explored engineering uncertainty, system complexity, and scale issues in developing large software systems. A Feb. 2007 workshop explored China's and India's roles in the IT R&D ecosystem; observations about the ecosystem over the long term; perspectives from serial entrepreneurs about the evolution of the ecosystem; and a cross-industry, global view of the R&D ecosystem. A Nov. 2006 event brought together participants from government, industry, and academia to share their perspectives on the health of the ecosystem, patterns of funding and investment, and the Potomac-area IT startup environment. A symposium entitled 2016, held in Oct. 2006, featured a number of distinguished speakers who shared their views on how computer science and telecommunications will look in 10 years. This well-attended event was also the subject of an Oct. 31, 2006, feature essay in the New York Times, "Computing, 2016: What Won't Be Possible?"
ERIC Educational Resources Information Center
Hartnett, Eric; Price, Apryl; Smith, Jane; Barrett, Michael
2010-01-01
Over the past few years, Texas A&M University (TAMU) has searched for a way to administer its electronic subscriptions as well as the electronic subscriptions shared among the TAMU System. In this article, we address our attempts to implement an effective electronic resource management system (ERMS), both for subscriptions on the main campus…
Job Scheduling with Efficient Resource Monitoring in Cloud Datacenter
Loganathan, Shyamala; Mukherjee, Saswati
2015-01-01
Cloud computing is an on-demand computing model which uses virtualization technology to provide cloud resources to users in the form of virtual machines through the internet. Being an adaptable technology, cloud computing is an excellent alternative for organizations forming their own private clouds. Since resources are limited in these private clouds, maximizing the utilization of resources and giving guaranteed service to the user are the ultimate goals. For that, efficient scheduling is needed. This research reports on an efficient data structure for resource management and a resource scheduling technique in a private cloud environment and discusses a cloud model. The proposed scheduling algorithm considers the types of jobs and the resource availability in its scheduling decision. Finally, we conducted simulations using CloudSim and compared our algorithm with other existing methods, like the V-MCT and priority scheduling algorithms. PMID:26473166
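As a toy illustration of a scheduler that, like the approach described above, weighs both job type and current resource availability, consider the sketch below; the job categories and placement rule are invented and are not the paper's algorithm.

```python
# A toy placement rule that considers both the job type and the current
# availability of each virtual machine. Categories and scoring are invented.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    kind: str          # "cpu_bound" or "io_bound"
    cores: int

@dataclass
class VirtualMachine:
    name: str
    free_cores: int
    io_load: float     # 0.0 (idle) .. 1.0 (saturated)

def place(job, vms):
    candidates = [vm for vm in vms if vm.free_cores >= job.cores]
    if not candidates:
        return None                                   # defer until resources free up
    if job.kind == "io_bound":
        best = min(candidates, key=lambda vm: vm.io_load)
    else:
        best = max(candidates, key=lambda vm: vm.free_cores)
    best.free_cores -= job.cores
    return best.name

vms = [VirtualMachine("vm1", 4, 0.7), VirtualMachine("vm2", 2, 0.1)]
print(place(Job("render", "cpu_bound", 2), vms))      # -> vm1 (most free cores)
print(place(Job("ingest", "io_bound", 2), vms))       # -> vm2 (lowest I/O load)
```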
43 CFR 11.40 - What are type A procedures?
Code of Federal Regulations, 2010 CFR
2010-10-01
... 11.40 Public Lands: Interior Office of the Secretary of the Interior NATURAL RESOURCE DAMAGE... marine environments incorporates a computer model called the Natural Resource Damage Assessment Model for... environments incorporates a computer model called the Natural Resource Damage Assessment Model for Great Lakes...
43 CFR 11.40 - What are type A procedures?
Code of Federal Regulations, 2011 CFR
2011-10-01
... 11.40 Public Lands: Interior Office of the Secretary of the Interior NATURAL RESOURCE DAMAGE... marine environments incorporates a computer model called the Natural Resource Damage Assessment Model for... environments incorporates a computer model called the Natural Resource Damage Assessment Model for Great Lakes...
Mendoza-Gallegos, Roberto A; Rios, Amelia; Garcia-Cordero, Jose L
2018-05-01
The polymerase chain reaction (PCR) is a sought-after nucleic acid amplification technique used in the detection of several diseases. However, one of the main limitations of this and other nucleic acid amplification assays is the complexity, size, maintenance, and cost of their operational instrumentation. This limits the use of PCR applications in settings that cannot afford the instruments but that may have access to basic electrical, electronic, and optical components and the expertise to build them. To provide a more accessible platform, we developed a low-cost, palm-size, and portable instrument to perform real-time PCR (qPCR). The thermocycler leverages a copper-sheathed power resistor and a computer fan, in tandem with basic electronic components controlled from a single-board computer. The instrument incorporates a 3D-printed chassis and a custom-made fluorescence optical setup based on a CMOS camera and a blue LED. Results are displayed in real-time on a tablet. We also fabricated simple acrylic microdevices consisting of four wells (2 μL in volume each) where PCR reactions take place. To test our instrument, we performed qPCR on a series of cDNA dilutions spanning 4 orders of magnitude, achieving similar limits of detection as those achieved by a benchtop thermocycler. We envision our instrument being utilized to enable routine monitoring and diagnosis of certain diseases in low-resource areas.
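The control idea (hold each PCR stage at its temperature by switching a heating resistor and a fan from a single-board computer) can be sketched as below; the hardware helpers are simulation stubs, not the instrument's actual firmware.

```python
# Schematic thermal-cycling loop: a power resistor heats, a fan cools, and a
# simple bang-bang controller holds each PCR stage at its setpoint. The
# "hardware" helpers are stubs standing in for real GPIO/ADC access.
import random
import time

_state = {"temp": 25.0, "heater": False, "fan": False}

def read_temperature():                     # stub: thermistor via an ADC in reality
    drift = 2.0 if _state["heater"] else (-2.0 if _state["fan"] else -0.05)
    _state["temp"] += drift + random.uniform(-0.1, 0.1)
    return _state["temp"]

def heater(on):                             # stub: drives the power resistor
    _state["heater"] = on

def fan(on):                                # stub: drives the cooling fan
    _state["fan"] = on

STAGES = [("denaturation", 95.0), ("annealing", 60.0), ("extension", 72.0)]

def hold(setpoint_c, seconds, tolerance=0.5):
    end = time.time() + seconds
    while time.time() < end:
        temp = read_temperature()
        heater(temp < setpoint_c - tolerance)
        fan(temp > setpoint_c + tolerance)
        time.sleep(0.01)                    # control-loop period, shortened for the demo

def run_qpcr(cycles=3, stage_seconds=1.0):  # a real run would use ~40 cycles
    for cycle in range(cycles):
        for name, setpoint in STAGES:
            hold(setpoint, stage_seconds)
        print(f"cycle {cycle}: capture fluorescence frame (LED + CMOS camera)")

if __name__ == "__main__":
    run_qpcr()
```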
ERIC Educational Resources Information Center
Page, Tom; Thorsteinsson, Gisli
2006-01-01
The work outlined here provides a comprehensive report and formative observations of the development and implementation of hypermedia resources for learning and teaching used in conjunction with a managed learning environment (MLE). These resources are used to enhance teaching and learning of an electronics module in product design at final year…
ERIC Educational Resources Information Center
Goodman, Kenneth; Grad, Roland; Pluye, Pierre; Nowacki, Amy; Hickner, John
2012-01-01
Introduction: Electronic knowledge resources have the potential to rapidly provide answers to clinicians' questions. We sought to determine clinicians' reasons for searching these resources, the rate of finding relevant information, and the perceived clinical impact of the information they retrieved. Methods: We asked general internists, family…
ERIC Educational Resources Information Center
White, Marilyn; Sanders, Susan
2009-01-01
The Information Services Division (ISD) of the National Institute of Standards and Technology (NIST) positioned itself to successfully implement an electronic resources management system. This article highlights the ISD's unique ability to "team" across the organization to realize a common goal, develop leadership qualities in support of…
An emulator for minimizing computer resources for finite element analysis
NASA Technical Reports Server (NTRS)
Melosh, R.; Utku, S.; Islam, M.; Salama, M.
1984-01-01
A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
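The emulator idea can be illustrated in miniature: fit simple cost models from a handful of calibration runs for a given machine and analysis code, then predict CPU and I/O for a new problem size. The linear model and numbers below are illustrative, not SCOPE's actual cost model.

```python
# Calibration-based resource prediction in miniature: fit cpu ~ a*n + b and
# io ~ c*n + d from a few measured runs, then estimate a new problem.
import numpy as np

# Calibration runs: (number of elements, measured CPU seconds, measured I/O ops)
calibration = np.array([
    [1_000,   12.0,  4.0e4],
    [5_000,   61.0,  2.1e5],
    [20_000, 250.0,  8.3e5],
])

n = calibration[:, 0]
A = np.vstack([n, np.ones_like(n)]).T
cpu_coeff, *_ = np.linalg.lstsq(A, calibration[:, 1], rcond=None)
io_coeff,  *_ = np.linalg.lstsq(A, calibration[:, 2], rcond=None)

def predict(num_elements):
    cpu = cpu_coeff @ [num_elements, 1.0]
    io = io_coeff @ [num_elements, 1.0]
    return cpu, io

print(predict(10_000))   # estimated (CPU seconds, I/O operations)
```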
ERIC Educational Resources Information Center
Kononets, Natalia
2015-01-01
The article focuses on the introduction of resource-based learning in the computer-cycle disciplines of an agrarian college and on the issues involved in implementing resource-based learning in such courses. It tests an approach to creating e-learning resources through free hosting and their further use in the classroom. It notes that the…
The AMTEX Partnership. Third quarterly report, FY 1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lemon, D.K.; Quisenberry, R.K.
1995-06-01
Key activities for the quarter were the initiation of tactical work on the OPCon Project, development of a draft of the AMTEX Policies and Procedures document, and a meeting of the Industry Technical Advisory Committee. A significant milestone was reached when a memorandum of understanding was signed between the DOE and The Department of Commerce. The agreement signified the official participation of the National Institute of Standards and Technology on the Demand Activated Manufacturing Architecture (DAMA) project in AMTEX. Project accomplishments are given for: computer-aided manufacturing, cotton biotechnology, DAMA, electronic embedded fingerprints, rapid cutting, sensors for agile manufacturing, and textile resource conservation.
ACToR - Aggregated Computational Toxicology Resource ...
We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.
ACToR - Aggregated Computational Toxicology Resource (S) ...
We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.
Microdot - A Four-Bit Microcontroller Designed for Distributed Low-End Computing in Satellites
NASA Astrophysics Data System (ADS)
2002-03-01
Many satellites are an integrated collection of sensors and actuators that require dedicated real-time control. For single processor systems, additional sensors require an increase in computing power and speed to provide the multi-tasking capability needed to service each sensor. Faster processors cost more and consume more power, which taxes a satellite's power resources and may lead to shorter satellite lifetimes. An alternative design approach is a distributed network of small and low power microcontrollers designed for space that handle the computing requirements of each individual sensor and actuator. The design of microdot, a four-bit microcontroller for distributed low-end computing, is presented. The design is based on previous research completed at the Space Electronics Branch, Air Force Research Laboratory (AFRL/VSSE) at Kirtland AFB, NM, and the Air Force Institute of Technology at Wright-Patterson AFB, OH. The Microdot has 29 instructions and a 1K x 4 instruction memory. The distributed computing architecture is based on the Philips Semiconductor I2C Serial Bus Protocol. A prototype was implemented and tested using an Altera Field Programmable Gate Array (FPGA). The prototype was operable to 9.1 MHz. The design was targeted for fabrication in a radiation-hardened-by-design gate-array cell library for the TSMC 0.35 micrometer CMOS process.
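As an illustration of the distributed-sensor idea, the sketch below shows a bus master polling small per-sensor controllers over I2C using the smbus2 library (it assumes an I2C-capable host); the device addresses and register layout are invented, and the original design is of course a 4-bit hardware core rather than Python.

```python
# Sketch of the distributed low-end computing idea: a master polls small
# per-sensor controllers over the I2C bus. Addresses and registers are
# invented for the example; requires a host with an I2C bus.
from smbus2 import SMBus

SENSOR_NODES = {0x10: "sun sensor", 0x11: "magnetometer", 0x12: "reaction wheel"}
STATUS_REGISTER = 0x00
DATA_REGISTER = 0x01

def poll_nodes(bus_id=1):
    readings = {}
    with SMBus(bus_id) as bus:
        for address, name in SENSOR_NODES.items():
            status = bus.read_byte_data(address, STATUS_REGISTER)
            if status & 0x01:                      # bit 0: "data ready" (assumed)
                readings[name] = bus.read_byte_data(address, DATA_REGISTER)
    return readings

if __name__ == "__main__":
    print(poll_nodes())
```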
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-08
... Phones and Tablet Computers, and Components Thereof; Notice of Receipt of Complaint; Solicitation of... entitled Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... the United States after importation of certain electronic devices, including mobile phones and tablet...
Information and Communicative Technology--Computers as Research Tools
ERIC Educational Resources Information Center
Sarsani, Mahender Reddy
2007-01-01
The emergence of "the electronic age/electronic cottages/the electronic world" has affected the whole world; particularly the emergence of computers has penetrated everyone's life to a remarkable degree. They are being used in various fields including education. Recent advances, especially in the area of computer technology have…
NASA Astrophysics Data System (ADS)
Bartlett, Philip L.; Stelbovics, Andris T.; Bray, Igor
2004-02-01
A newly-derived iterative coupling procedure for the propagating exterior complex scaling (PECS) method is used to efficiently calculate the electron-impact wavefunctions for atomic hydrogen. An overview of this method is given along with methods for extracting scattering cross sections. Differential scattering cross sections at 30 eV are presented for the electron-impact excitation to the n = 1, 2, 3 and 4 final states, for both PECS and convergent close coupling (CCC), which are in excellent agreement with each other and with experiment. PECS results are presented at 27.2 eV and 30 eV for symmetric and asymmetric energy-sharing triple differential cross sections, which are in excellent agreement with CCC and exterior complex scaling calculations, and with experimental data. At these intermediate energies, the efficiency of the PECS method with iterative coupling has allowed highly accurate partial-wave solutions of the full Schrödinger equation, for L ≤ 50 and a large number of coupled angular momentum states, to be obtained with minimal computing resources.
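For context, assembling observable cross sections from per-L partial-wave solutions follows the standard schematic relations below; these are generic textbook forms, not the PECS-specific working equations.

```latex
% Schematic partial-wave assembly of cross sections from per-L amplitudes
% (generic textbook form; not the PECS working equations).
f(\theta) = \sum_{L=0}^{L_{\max}} (2L+1)\, f_L\, P_L(\cos\theta),
\qquad
\frac{d\sigma}{d\Omega} = \frac{k_f}{k_i}\,\lvert f(\theta)\rvert^2,
\qquad
\sigma = \int \frac{d\sigma}{d\Omega}\, d\Omega .
```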
Assistive technology for memory support in dementia.
Van der Roest, Henriëtte G; Wenborn, Jennifer; Pastink, Channah; Dröes, Rose-Marie; Orrell, Martin
2017-06-11
The sustained interest in electronic assistive technology in dementia care has been fuelled by the urgent need to develop useful approaches to help support people with dementia at home. Also the low costs and wide availability of electronic devices make it more feasible to use electronic devices for the benefit of disabled persons. Information Communication Technology (ICT) devices designed to support people with dementia are usually referred to as Assistive Technology (AT) or Electronic Assistive Technology (EAT). By using AT in this review we refer to electronic assistive devices. A range of AT devices has been developed to support people with dementia and their carers to manage their daily activities and to enhance safety, for example electronic pill boxes, picture phones, or mobile tracking devices. Many are commercially available. However, the usefulness and user-friendliness of these devices are often poorly evaluated. Although reviews of (electronic) memory aids do exist, a systematic review of studies focusing on the efficacy of AT for memory support in people with dementia is lacking. Such a review would guide people with dementia and their informal and professional carers in selecting appropriate AT devices. Primary objective: To assess the efficacy of AT for memory support in people with dementia in terms of daily performance of personal and instrumental activities of daily living (ADL), level of dependency, and admission to long-term care. Secondary objective: To assess the impact of AT on: users (autonomy, usefulness and user-friendliness, adoption of AT); cognitive function and neuropsychiatric symptoms; need for informal and formal care; perceived quality of life; informal carer burden, self-esteem and feelings of competence; formal carer work satisfaction, workload and feelings of competence; and adverse events. We searched ALOIS, the Specialised Register of the Cochrane Dementia and Cognitive Improvement Group (CDCIG), on 10 November 2016. ALOIS is maintained by the Information Specialists of the CDCIG and contains studies in the areas of dementia prevention, dementia treatment and cognitive enhancement in healthy people. We also searched the following list of databases, adapting the search strategy as necessary: Centre for Reviews and Dissemination (CRD) Databases, up to May 2016; The Collection of Computer Science Bibliographies; DBLP Computer Science Bibliography; HCI Bibliography: Human-Computer Interaction Resources; and AgeInfo, all to June 2016; PiCarta; Inspec; Springer Link Lecture Notes; Social Care Online; and IEEE Computer Society Digital Library, all to October 2016; J-STAGE: Japan Science and Technology Information Aggregator, Electronic; and Networked Computer Science Technical Reference Library (NCSTRL), both to November 2016; Computing Research Repository (CoRR) up to December 2016; and OT seeker; and ADEAR, both to February 2017. In addition, we searched Google Scholar and OpenSIGLE for grey literature. We intended to review randomised controlled trials (RCTs) and clustered randomised trials with blinded assessment of outcomes that evaluated an electronic assistive device used with the single aim of supporting memory function in people diagnosed with dementia. The control interventions could either be 'care (or treatment) as usual' or non-technological psychosocial interventions (including interventions that use non-electronic assistive devices) also specifically aimed at supporting memory.
Outcome measures included activities of daily living, level of dependency, clinical and care-related outcomes (for example admission to long-term care), perceived quality of life and well-being, and adverse events resulting from the use of AT; as well as the effects of AT on carers. Two review authors independently screened all titles and abstracts identified by the search. We identified no studies which met the inclusion criteria. This review highlights the current lack of high-quality evidence to determine whether AT is effective in supporting people with dementia to manage their memory problems.
NASA Parts Selection List (NPSL) WWW Site http://nepp.nasa.gov/npsl
NASA Technical Reports Server (NTRS)
Brusse, Jay
2000-01-01
The NASA Parts Selection List (NPSL) is an on-line resource for electronic parts selection tailored for use by spaceflight projects. The NPSL provides a list of commonly used electronic parts that have a history of satisfactory use in spaceflight applications. The objective of this www site is to provide NASA projects, contractors, university experimenters, et al with an easy to use resource that provides a baseline of electronic parts from which designers are encouraged to select. The NPSL is an ongoing resource produced by Code 562 in support of the NASA HQ funded NASA Electronic Parts and Packaging (NEPP) Program. The NPSL is produced as an electronic format deliverable made available via the referenced www site administered by Code 562. The NPSL does not provide information pertaining to patented or proprietary information. All of the information contained in the NPSL is available through various other public domain resources such as US Military procurement specifications for electronic parts, NASA GSFC's Preferred Parts List (PPL-21), and NASA's Standard Parts List (MIL-STD975).
NASA Tech Briefs, June 2000. Volume 24, No. 6
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Computers and Peripherals;
Aviation & Space Education: A Teacher's Resource Guide.
ERIC Educational Resources Information Center
Texas State Dept. of Aviation, Austin.
This resource guide contains information on curriculum guides, resources for teachers, computer software and computer related programs, audio/visual presentations, model aircraft and demonstration aids, training seminars and career education, and an aerospace bibliography for primary grades. Each entry includes all or some of the following items:…
Campus Computing Environment: University of Kentucky.
ERIC Educational Resources Information Center
CAUSE/EFFECT, 1989
1989-01-01
A dramatic growth in computing and communications was precipitated largely by the leadership of President David Roselle at the University of Kentucky. A new operational structure of information resource management includes not only computing (academic and administrative) and communications, instructional resources, and printing/mailing services,…
Teaching Computer Literacy with Freeware and Shareware.
ERIC Educational Resources Information Center
Hobart, R. Dale; And Others
1988-01-01
Describes workshops given at Ferris State University for faculty and staff who want to acquire computer skills. Considered are a computer literacy and a software toolkit distributed to participants made from public domain/shareware resources. Stresses the benefits of shareware as an educational resource. (CW)
Challenges in Securing the Interface Between the Cloud and Pervasive Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lagesse, Brent J
2011-01-01
Cloud computing presents an opportunity for pervasive systems to leverage computational and storage resources to accomplish tasks that would not normally be possible on such resource-constrained devices. Cloud computing can enable hardware designers to build lighter systems that last longer and are more mobile. Despite the advantages cloud computing offers to the designers of pervasive systems, there are some limitations of leveraging cloud computing that must be addressed. We take the position that cloud-based pervasive systems must be secured holistically and discuss ways this might be accomplished. In this paper, we discuss a pervasive system utilizing cloud computing resources and issues that must be addressed in such a system. In this system, the user's mobile device cannot always have network access to leverage resources from the cloud, so it must make intelligent decisions about what data should be stored locally and what processes should be run locally. As a result of these decisions, the user becomes vulnerable to attacks while interfacing with the pervasive system.
Methods and systems for providing reconfigurable and recoverable computing resources
NASA Technical Reports Server (NTRS)
Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)
2010-01-01
A method for optimizing the use of digital computing resources to achieve reliability and availability of the computing resources is disclosed. The method comprises providing one or more processors with a recovery mechanism, the one or more processors executing one or more applications. A determination is made whether the one or more processors needs to be reconfigured. A rapid recovery is employed to reconfigure the one or more processors when needed. A computing system that provides reconfigurable and recoverable computing resources is also disclosed. The system comprises one or more processors with a recovery mechanism, with the one or more processors configured to execute a first application, and an additional processor configured to execute a second application different than the first application. The additional processor is reconfigurable with rapid recovery such that the additional processor can execute the first application when one of the one or more processors fails.
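A toy model of the reconfiguration idea is sketched below: a spare processor normally runs a secondary application but is rapidly switched to a failed primary's application. This is a schematic illustration, not the patented mechanism.

```python
# Toy illustration of the spare-processor reconfiguration idea: on failure of
# a primary, the spare is "recovered" into running the primary's application.
class Processor:
    def __init__(self, name, application):
        self.name = name
        self.application = application
        self.healthy = True

    def reconfigure(self, application):
        self.application = application          # rapid recovery / role switch

def monitor(primaries, spare):
    for proc in primaries:
        if not proc.healthy:
            print(f"{proc.name} failed; moving '{proc.application}' to {spare.name}")
            spare.reconfigure(proc.application)
            return spare
    return None

primaries = [Processor("P1", "flight control"), Processor("P2", "navigation")]
spare = Processor("S1", "maintenance logging")
primaries[0].healthy = False
monitor(primaries, spare)
print(spare.application)    # -> flight control
```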
Polyphony: A Workflow Orchestration Framework for Cloud Computing
NASA Technical Reports Server (NTRS)
Shams, Khawaja S.; Powell, Mark W.; Crockett, Tom M.; Norris, Jeffrey S.; Rossi, Ryan; Soderstrom, Tom
2010-01-01
Cloud Computing has delivered unprecedented compute capacity to NASA missions at affordable rates. Missions like the Mars Exploration Rovers (MER) and Mars Science Lab (MSL) are enjoying the elasticity that enables them to leverage hundreds, if not thousands, of machines for short durations without making any hardware procurements. In this paper, we describe Polyphony, a resilient, scalable, and modular framework that efficiently leverages a large set of computing resources to perform parallel computations. Polyphony can employ resources on the cloud, excess capacity on local machines, as well as spare resources on the supercomputing center, and it enables these resources to work in concert to accomplish a common goal. Polyphony is resilient to node failures, even if they occur in the middle of a transaction. We will conclude with an evaluation of a production-ready application built on top of Polyphony to perform image-processing operations of images from around the solar system, including Mars, Saturn, and Titan.
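The resilience pattern described above can be sketched as a shared task queue from which heterogeneous workers pull work, with tasks from failed workers simply re-queued; this illustrates the general pull model, not Polyphony's transactional implementation.

```python
# Minimal sketch of the resilient pull model: heterogeneous workers (cloud
# nodes, local machines, spare supercomputer slots) pull image tiles from a
# shared queue, and work lost to a failed node is re-queued.
import queue
import random
import threading

tasks = queue.Queue()
for i in range(20):
    tasks.put(f"tile-{i:03d}")

def worker(name):
    while True:
        try:
            tile = tasks.get(timeout=0.5)       # exit once the queue stays empty
        except queue.Empty:
            return
        if random.random() < 0.1:               # simulate losing a node mid-task
            tasks.put(tile)                     # failed work goes back on the queue
            print(f"{name} lost {tile}; re-queued")
        else:
            print(f"{name} processed {tile}")

threads = [threading.Thread(target=worker, args=(f"worker-{n}",)) for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```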
Infrastructure Systems for Advanced Computing in E-science applications
NASA Astrophysics Data System (ADS)
Terzo, Olivier
2013-04-01
In the e-science field there are growing needs for computing infrastructure that is more dynamic and customizable, with an "on demand" model of use that follows the exact request in terms of resources and storage capacities. The integration of grid and cloud infrastructure solutions allows us to offer services that can adapt their availability by scaling resources up and down. The main challenge for e-science domains is to implement infrastructure solutions for scientific computing that can adapt dynamically to the demand for computing resources, with a strong emphasis on optimizing the use of computing resources to reduce investment costs. Instrumentation, data volumes, algorithms and analysis all contribute to increasing the complexity of applications that require high processing power and storage for a limited time, often exceeding the computational resources that equip the majority of laboratories or research units in an organization. Very often it is necessary to adapt, rework or even rethink tools and algorithms, and to consolidate existing applications through a phase of reverse engineering, in order to adapt them to deployment on a cloud infrastructure. For example, in areas such as rainfall monitoring, meteorological analysis, hydrometeorology, climatology, bioinformatics, next-generation sequencing, computational electromagnetics and radio occultation, the complexity of the analysis raises several issues such as processing time, the scheduling of processing tasks, storage of results and multi-user environments. For these reasons, it is necessary to rethink the way e-science applications are written so that they are already adapted to exploit the potential of cloud computing services through the use of the IaaS, PaaS and SaaS layers. Another important focus is on creating and using hybrid infrastructures, typically a federation between private and public clouds; in this way, when all the resources owned by the organization are in use, a federated cloud infrastructure makes it easy to add additional resources from the public cloud to follow the needs in terms of computational and storage resources, and to release them when the processes are finished. Following the hybrid model, the scheduling approach is important for managing both cloud models. Thanks to this infrastructure model, resources are always available for additional requests for IT capacity that can be used "on demand" for a limited time without having to purchase additional servers.
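The hybrid (private plus public) scheduling idea can be sketched as a simple bursting rule: place work on the private cloud while capacity remains, overflow to a public provider, and release public resources when the work drains. The capacities and pool objects below are placeholders.

```python
# Sketch of the hybrid federation idea: private capacity first, burst to the
# public cloud on overflow, release resources when jobs finish.
class Pool:
    def __init__(self, name, capacity):
        self.name, self.capacity, self.in_use = name, capacity, 0
    def has_room(self, cores):
        return self.in_use + cores <= self.capacity
    def allocate(self, cores):
        self.in_use += cores
        return self.name
    def release(self, cores):
        self.in_use = max(0, self.in_use - cores)

private = Pool("private-cloud", capacity=64)
public = Pool("public-cloud", capacity=10_000)    # effectively elastic

def submit(job_cores):
    if private.has_room(job_cores):
        return private.allocate(job_cores)
    return public.allocate(job_cores)             # burst only on overflow

placements = [submit(c) for c in (32, 16, 16, 8, 8)]
print(placements)          # the last jobs overflow to the public side
# ...when a job finishes: private.release(cores) or public.release(cores)
```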
Diversity in computing technologies and strategies for dynamic resource allocation
Garzoglio, G.; Gutsche, O.
2015-12-23
Here, High Energy Physics (HEP) is a very data intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer details requiring ever increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.
NASA Technical Reports Server (NTRS)
Srivastava, Deepak; Menon, Madhu; Cho, Kyeongjae; Biegel, Bryan (Technical Monitor)
2001-01-01
The role of computational nanotechnology in developing next generation of multifunctional materials, molecular scale electronic and computing devices, sensors, actuators, and machines is described through a brief review of enabling computational techniques and few recent examples derived from computer simulations of carbon nanotube based molecular nanotechnology.
A distributed computing approach to mission operations support. [for spacecraft
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1975-01-01
Computing mission operation support includes orbit determination, attitude processing, maneuver computation, resource scheduling, etc. The large-scale third-generation distributed computer network discussed is capable of fulfilling these dynamic requirements. It is shown that distribution of resources and control leads to increased reliability, and exhibits potential for incremental growth. Through functional specialization, a distributed system may be tuned to very specific operational requirements. Fundamental to the approach is the notion of process-to-process communication, which is effected through a high-bandwidth communications network. Both resource-sharing and load-sharing may be realized in the system.
An Analog Computer for Electronic Engineering Education
ERIC Educational Resources Information Center
Fitch, A. L.; Iu, H. H. C.; Lu, D. D. C.
2011-01-01
This paper describes a compact analog computer and proposes its use in electronic engineering teaching laboratories to develop student understanding of applications in analog electronics, electronic components, engineering mathematics, control engineering, safe laboratory and workshop practices, circuit construction, testing, and maintenance. The…
The EPOS Vision for the Open Science Cloud
NASA Astrophysics Data System (ADS)
Jeffery, Keith; Harrison, Matt; Cocco, Massimo
2016-04-01
Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, and so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications to allow cloud middleware to optimize their deployment and execution. From CERN, the Helix-Nebula group has proposed the architecture for the European Open Science Cloud. They are discussing with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation) and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs including EPOS. Many of these RIs are either e-RIs (electronic RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed), since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science); effectively an ICS-d specializing in high-performance computation, analytics, simulation or visualization offered by a TCS for others to use. Discussions are already underway between EPOS and EGI, EUDAT, AARC and Helix-Nebula for those offerings to be considered as ICS-ds by EPOS. Provision of access to ICS-ds from the ICS-C concerns several aspects: (a) technical: it may be more or less difficult to connect and pass from the ICS-C to the ICS-d/CES the 'package' (probably a virtual machine) of data and software; (b) security/privacy: including passing personal information, e.g. related to AAAI (Authentication, Authorization, Accounting Infrastructure); (c) financial and legal: such as payment and licence conditions. Appropriate interfaces from the ICS-C to ICS-ds are being designed to accommodate these aspects. The Open Science Cloud is timely because it provides a framework to discuss governance and sustainability for computational resource provision as well as an effective interpretation of a federated approach to HPC (High Performance Computing) and HTC (High Throughput Computing). It will be a unique opportunity to share and adopt procurement policies to provide access to computational resources for RIs. The current state of discussions and the expected roadmap for the EPOS-Open Science Cloud relationship are presented.
NASA Technical Reports Server (NTRS)
Mahalingam, Sudhakar; Menart, James A.
2005-01-01
Computational modeling of the plasma located in the discharge chamber of an ion engine is an important activity so that the development and design of the next generation of ion engines may be enhanced. In this work a computational tool called XOOPIC is used to model the primary electrons, secondary electrons, and ions inside the discharge chamber. The details of this computational tool are discussed in this paper. Preliminary results from XOOPIC are presented. The results presented include particle number density distributions for the primary electrons, the secondary electrons, and the ions. In addition the total number of a particular particle in the discharge chamber as a function of time, electric potential maps and magnetic field maps are presented. A primary electron number density plot from PRIMA is given in this paper so that the results of XOOPIC can be compared to it. PRIMA is a computer code that the present investigators have used in much of their previous work that provides results that compare well to experimental results. PRIMA only models the primary electrons in the discharge chamber. Modeling ions and secondary electrons, as well as the primary electrons, will greatly increase our ability to predict different characteristics of the plasma discharge used in an ion engine.
Kramer, Tobias; Noack, Matthias; Reinefeld, Alexander; Rodríguez, Mirta; Zelinskyy, Yaroslav
2018-06-11
Time- and frequency-resolved optical signals provide insights into the properties of light-harvesting molecular complexes, including excitation energies, dipole strengths and orientations, as well as in the exciton energy flow through the complex. The hierarchical equations of motion (HEOM) provide a unifying theory, which allows one to study the combined effects of system-environment dissipation and non-Markovian memory without making restrictive assumptions about weak or strong couplings or separability of vibrational and electronic degrees of freedom. With increasing system size the exact solution of the open quantum system dynamics requires memory and compute resources beyond a single compute node. To overcome this barrier, we developed a scalable variant of HEOM. Our distributed memory HEOM, DM-HEOM, is a universal tool for open quantum system dynamics. It is used to accurately compute all experimentally accessible time- and frequency-resolved processes in light-harvesting molecular complexes with arbitrary system-environment couplings for a wide range of temperatures and complex sizes. © 2018 Wiley Periodicals, Inc.
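As a rough guide to what is being distributed, the hierarchy has the schematic structure below, in which each auxiliary density operator couples only to its neighbours in the index lattice; the superoperators stand in for the bath-specific couplings and low-temperature correction terms are omitted. This is a generic sketch of the HEOM structure, not the precise equations implemented in DM-HEOM.

```latex
% Schematic HEOM structure: each auxiliary density operator rho_n couples
% only to rho_{n +/- e_j}; A_j and B_j are bath-specific superoperators,
% and Matsubara/low-temperature corrections are omitted here.
\frac{\partial}{\partial t}\rho_{\mathbf n}
  = \Big(-\frac{i}{\hbar}\,[H_S,\;\cdot\;] - \sum_j n_j\gamma_j\Big)\rho_{\mathbf n}
  + \sum_j \mathcal A_j\,\rho_{\mathbf n + \mathbf e_j}
  + \sum_j n_j\,\mathcal B_j\,\rho_{\mathbf n - \mathbf e_j}
```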
Daubechies wavelets for linear scaling density functional theory.
Mohr, Stephan; Ratcliff, Laura E; Boulanger, Paul; Genovese, Luigi; Caliste, Damien; Deutsch, Thierry; Goedecker, Stefan
2014-05-28
We demonstrate that Daubechies wavelets can be used to construct a minimal set of optimized localized adaptively contracted basis functions in which the Kohn-Sham orbitals can be represented with an arbitrarily high, controllable precision. Ground state energies and the forces acting on the ions can be calculated in this basis with the same accuracy as if they were calculated directly in a Daubechies wavelets basis, provided that the amplitude of these adaptively contracted basis functions is sufficiently small on the surface of the localization region, which is guaranteed by the optimization procedure described in this work. This approach reduces the computational costs of density functional theory calculations, and can be combined with sparse matrix algebra to obtain linear scaling with respect to the number of electrons in the system. Calculations on systems of 10,000 atoms or more thus become feasible in a systematic basis set with moderate computational resources. Further computational savings can be achieved by exploiting the similarity of the adaptively contracted basis functions for closely related environments, e.g., in geometry optimizations or combined calculations of neutral and charged systems.
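The underlying construction can be summarized schematically: Kohn-Sham orbitals are expanded in a small set of strictly localized, adaptively contracted support functions, which makes the Hamiltonian and overlap matrices sparse. The notation below is generic, not the paper's exact formulation.

```latex
% Generic localized-basis expansion behind such linear-scaling schemes:
% orbitals are built from contracted, strictly localized functions phi_alpha,
% giving sparse Hamiltonian and overlap matrices.
\psi_i(\mathbf r) = \sum_{\alpha} c_{i\alpha}\, \phi_\alpha(\mathbf r),
\qquad
H_{\alpha\beta} = \langle \phi_\alpha \vert \hat H_{\mathrm{KS}} \vert \phi_\beta \rangle,
\quad
S_{\alpha\beta} = \langle \phi_\alpha \vert \phi_\beta \rangle .
```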
Computer Technology Resources for Literacy Projects.
ERIC Educational Resources Information Center
Florida State Council on Aging, Tallahassee.
This resource booklet was prepared to assist literacy projects and community adult education programs in determining the technology they need to serve more older persons. Section 1 contains the following reprinted articles: "The Human Touch in the Computer Age: Seniors Learn Computer Skills from Schoolkids" (Suzanne Kashuba);…
The Computer Explosion: Implications for Educational Equity. Resource Notebook.
ERIC Educational Resources Information Center
Denbo, Sheryl, Comp.
This notebook was prepared to provide resources for educators interested in using computers to increase opportunities for all students. The notebook contains specially prepared materials and selected newspaper and journal articles. The first section reviews the issues related to computer equity (equal access, tracking through different…
Development of Computer-Based Resources for Textile Education.
ERIC Educational Resources Information Center
Hopkins, Teresa; Thomas, Andrew; Bailey, Mike
1998-01-01
Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…
Video Killed the Radio Star: Language Students' Use of Electronic Resources-Reading or Viewing?
ERIC Educational Resources Information Center
Kiliçkaya, Ferit
2016-01-01
The current study aimed to investigate language students' use of print and electronic resources for their research papers required in research techniques class, focusing on which reading strategies they used while reading these resources. The participants of the study were 90 sophomore students enrolled in the research techniques class offered at…
ERIC Educational Resources Information Center
Downey, Kay
2012-01-01
Kent State University has developed a centralized system that manages the communication and work related to the review and selection of commercially available electronic resources. It is an automated system that tracks the review process, provides selectors with price and trial information, and compiles reviewers' feedback about the resource. It…
Parallel computing method for simulating hydrological processes of large rivers under climate change
NASA Astrophysics Data System (ADS)
Wang, H.; Chen, Y.
2016-12-01
Climate change is one of the most widely recognized global environmental problems. It has altered the temporal and spatial distribution of watershed hydrological processes, especially in the world's large rivers. Watershed hydrological process simulation based on a physically based distributed hydrological model can give better results than lumped models. However, such simulation involves a large amount of calculation, especially in large rivers, and thus needs huge computing resources that may not be steadily available to researchers or may come at high expense, which has seriously restricted research and application. To address this problem, current parallel methods mostly parallelize the computation in the space and time dimensions: they calculate the natural features in order, based on the distributed hydrological model, by grid, unit or sub-basin from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and parallel efficiency. It combines the temporal and spatial runoff characteristics of the distributed hydrological model with methods adopting distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method has strong adaptability and extensibility, which means it can make full use of the available computing and storage resources even when computing resources are limited, and the computing efficiency improves linearly as computing resources are added. The method can satisfy the parallel computing requirements of hydrological process simulation in small, medium and large rivers.
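The upstream-to-downstream parallelism described above can be sketched as wave-by-wave execution of sub-basins whose upstream neighbours have finished; the basin graph and the simulate() body below are placeholders for the real hydrological model.

```python
# Sketch of upstream-to-downstream parallelism: sub-basins whose upstream
# neighbours are all finished can be simulated concurrently, wave by wave.
from concurrent.futures import ProcessPoolExecutor

# downstream sub-basin -> list of upstream sub-basins that must finish first
UPSTREAM = {"A": [], "B": [], "C": ["A", "B"], "D": ["C"], "E": []}

def simulate(basin, inflows):
    return sum(inflows) + 1.0        # stand-in for the real hydrological model

def run_parallel():
    done = {}
    with ProcessPoolExecutor() as pool:
        while len(done) < len(UPSTREAM):
            ready = [b for b, ups in UPSTREAM.items()
                     if b not in done and all(u in done for u in ups)]
            futures = {b: pool.submit(simulate, b, [done[u] for u in UPSTREAM[b]])
                       for b in ready}
            for basin, fut in futures.items():
                done[basin] = fut.result()   # wave-by-wave topological execution
    return done

if __name__ == "__main__":
    print(run_parallel())
```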
Schooley, Benjamin; Walczak, Steven; Hikmet, Neset; Patel, Nitin
2016-04-01
Health information technology investments continue to increase while the value derived from their implementation and use is mixed. Mobile device adoption into practice is a recent trend that has increased dramatically and formal studies are needed to investigate consequent benefits and challenges. The objective of this study is to evaluate practitioner perceptions of improvements in productivity, provider-patient communications, care provision, technology usability and other outcomes following the adoption and use of a tablet computer connected to electronic health information resources. A pilot program was initiated in June 2013 to evaluate the effect of mobile tablet computers at one health provider organization in the southeast United States. Providers were asked to volunteer for the evaluation and were each given a mobile tablet computer. A total of 42 inpatient and outpatient providers were interviewed in 2015 using a survey style questionnaire that utilized yes/no, Likert-style, and open ended questions. Each had previously used an electronic health record (EHR) system a minimum of one year outside of residency, and were regular users of personal mobile devices. Each used a mobile tablet computer in the context of their practice connected to the health system EHR. The survey results indicate that more than half of providers perceive the use of the tablet device as having a positive effect on patient communications, patient education, patient's perception of the provider, time spent interacting with patients, provider productivity, process of care, satisfaction with EHR when used together with the device, and care provision. Providers also reported feeling comfortable using the device (82.9%), would recommend the device to colleagues (69.2%), did not experience increased information security and privacy concerns (95%), and noted significant reductions in EHR login times (64.1%). Less than 25% of participants reported negative impacts on any of these areas as well as on time spent on order submission, note completion time, overall workload, patient satisfaction with care experience and patient outcomes. Gender, number of years in practice, practice type (general practitioner vs. specialist), and service type (inpatient/outpatient) were found to have a significant effect on perceptions of patient satisfaction, care process, and provider productivity. Providers found positive gains from utilizing mobile devices in overall productivity, improved communications with their patients, the process of care, and technology efficiencies when used in combination with EHR and other health information resources. Demographic and health care work environment play a role in how mobile technologies are integrated into practice by providers. Copyright © 2016. Published by Elsevier Ireland Ltd.
Workflow Management Systems for Molecular Dynamics on Leadership Computers
NASA Astrophysics Data System (ADS)
Wells, Jack; Panitkin, Sergey; Oleynik, Danila; Jha, Shantenu
Molecular Dynamics (MD) simulations play an important role in a range of disciplines from Material Science to Biophysical systems and account for a large fraction of cycles consumed on computing resources. Increasingly, science problems require the successful execution of "many" MD simulations as opposed to a single MD simulation. There is a need to provide scalable and flexible approaches to the execution of the workload. We present preliminary results on the Titan computer at the Oak Ridge Leadership Computing Facility that demonstrate a general capability to manage workload execution agnostic of a specific MD simulation kernel or execution pattern, and in a manner that integrates disparate grid-based and supercomputing resources. Our results build upon our extensive experience of distributed workload management in the high-energy physics ATLAS project using PanDA (Production and Distributed Analysis System), coupled with recent conceptual advances in our understanding of workload management on heterogeneous resources. We will discuss how we will generalize these initial capabilities towards a more production-level service on DOE leadership resources. This research is sponsored by US DOE/ASCR and used resources of the OLCF computing facility.
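A workload manager that is agnostic of the simulation kernel essentially reduces to workers pulling task descriptions from a shared queue. The toy Python sketch below illustrates only that pull pattern; the task list, worker count and run_task() placeholder are illustrative assumptions and bear no relation to PanDA's actual components.

```python
# Toy illustration of kernel-agnostic workload execution: workers repeatedly
# pull the next task description from a shared queue and run it, without
# caring which MD kernel or execution pattern the task names.
import queue
import threading
import time

tasks = queue.Queue()
for i in range(8):
    tasks.put({"id": i, "kernel": "any-md-kernel", "steps": 1000})

def run_task(task):
    # placeholder: a real manager would launch the named kernel on allocated nodes
    time.sleep(0.1)
    print(f"task {task['id']} finished ({task['kernel']}, {task['steps']} steps)")

def worker():
    while True:
        try:
            task = tasks.get_nowait()
        except queue.Empty:
            return
        run_task(task)
        tasks.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```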
A new taxonomy for distributed computer systems based upon operating system structure
NASA Technical Reports Server (NTRS)
Foudriat, E. C.
1985-01-01
Characteristics of the resource structure found in the operating system are considered as a mechanism for classifying distributed computer systems. Since the operating system resources, themselves, are too diversified to provide a consistent classification, the structure upon which resources are built and shared are examined. The location and control character of this indivisibility provides the taxonomy for separating uniprocessors, computer networks, network computers (fully distributed processing systems or decentralized computers) and algorithm and/or data control multiprocessors. The taxonomy is important because it divides machines into a classification that is relevant or important to the client and not the hardware architect. It also defines the character of the kernel O/S structure needed for future computer systems. What constitutes an operating system for a fully distributed processor is discussed in detail.
Bray, Lucy; Sanders, Caroline; McKenna, Jacqueline
2013-12-01
To investigate health professionals' evaluation of a computer-based resource designed to improve discussions about sexual and relationship health with young people. Evidence suggests that some health professionals can experience discomfort discussing sexual health and relationship issues with young people. Professionals within hospital settings should have the knowledge, competencies and skills to be able to ask young people sexual health questions and provide accurate sexual health education. Despite some educational material being available for community and adult services, there are no resources available, which are directly relevant to holding opportunistic discussions with young people within an acute children's hospital. A descriptive survey design. One hundred and fourteen health professionals from a children's hospital in the UK were involved in evaluating a computer-based resource. All completed an online questionnaire survey comprising of closed and open questions. The health professionals reported that the computer-based resource had a positive influence on their knowledge and clinical practice. The videos as well as the concise nature of the resource were evaluated highly. Learning was facilitated by professionals being able to control their learning through rerunning and accessing the resource on numerous occasions. An engaging, accessible computer-based resource has the capability to positively impact on health professionals' knowledge of, and skills in, starting and holding sexual health conversations with young people accessing a children's hospital. Health professionals working with children and young people value accessible, relevant and short computer-based training. This can facilitate knowledge and skill acquisition despite variation in working patterns. Improving the knowledge and skills of professionals working with young people to facilitate appropriate yet opportunistic sexual health discussions is important within the public health agenda. © 2013 John Wiley & Sons Ltd.
Connecting Print and Electronic Titles: An Integrated Approach at the University of Nebraska-Lincoln
ERIC Educational Resources Information Center
Wolfe, Judith; Konecky, Joan Latta; Boden, Dana W. R.
2011-01-01
Libraries make heavy investments in electronic resources, with many of these resources reflecting title changes, bundled subsets, or content changes of formerly print material. These changes can distance the electronic format from its print origins, creating discovery and access issues. A task force was formed to explore the enhancement of catalog…
ERIC Educational Resources Information Center
Glogoff, Stuart
1995-01-01
Discusses two Electronic Library Education Centers (ELECs) created at the University of Arizona to improve library instruction in the use of online resources. Examines costs of developing ELECs; technical changes experienced; and benefits to users and librarians. A sidebar by Abbie J. Basile identifies Internet resources for planning and/or…
Complex wet-environments in electronic-structure calculations
NASA Astrophysics Data System (ADS)
Fisicaro, Giuseppe; Genovese, Luigi; Andreussi, Oliviero; Marzari, Nicola; Goedecker, Stefan
The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of an applied electrochemical potential, including the complex electrostatic screening coming from the solvent. In the present work we present a solver that handles both the Generalized Poisson and the Poisson-Boltzmann equation. A preconditioned conjugate gradient (PCG) method has been implemented for the Generalized Poisson equation and for the linear regime of the Poisson-Boltzmann equation, allowing the minimization problem to be solved iteratively in some ten iterations. A self-consistent procedure enables us to solve the full Poisson-Boltzmann problem. The algorithms take advantage of a preconditioning procedure based on the BigDFT Poisson solver for the standard Poisson equation. They exhibit very high accuracy and parallel efficiency, and allow different boundary conditions, including surfaces. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program suitable for integration into other codes. We present test calculations on large proteins to demonstrate efficiency and performance. This work was done within the PASC and NCCR MARVEL projects. Computer resources were provided by the Swiss National Supercomputing Centre (CSCS) under Project ID s499. LG also acknowledges support from the EXTMOS EU project.
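As a reminder of the building block involved (not the BigDFT solver itself), the short sketch below runs a generic preconditioned conjugate gradient with a simple Jacobi preconditioner on a one-dimensional discrete Poisson matrix; the matrix size, tolerance and preconditioner choice are illustrative assumptions only.

```python
# Generic preconditioned conjugate gradient (Jacobi preconditioner) applied to
# a 1D discrete Poisson problem; illustrates the PCG building block only.
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv_diag * r          # apply the (diagonal) preconditioner
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1D Poisson matrix
b = np.ones(n)
x = pcg(A, b, M_inv_diag=1.0 / np.diag(A))
print(np.allclose(A @ x, b, atol=1e-8))
```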
NASA Astrophysics Data System (ADS)
Croitoru, Bogdan; Tulbure, Adrian; Abrudean, Mihail; Secara, Mihai
2015-02-01
The present paper describes a software method for creating and managing one type of Transducer Electronic Datasheet (TEDS) according to the IEEE 1451.4 standard, in order to develop a prototype smart multi-sensor platform (with up to ten different analog sensors connected simultaneously) with plug-and-play capabilities over Ethernet and Wi-Fi. The experiments used one analog temperature sensor, one analog light sensor, one PIC32-based microcontroller development board with analog and digital I/O ports and other computing resources, and one 24LC256 I2C (Inter-Integrated Circuit standard) serial Electrically Erasable Programmable Read Only Memory (EEPROM) with 32 KB of available space and a 3-byte internal buffer for page writes (1 byte for data and 2 bytes for address). A prototype algorithm was developed in standard C for writing and reading TEDS information to and from I2C EEPROM memories, with up to ten different TEDS blocks coexisting in the same EEPROM device at once. The algorithm is able to write and read one type of TEDS: transducer information with standard TEDS content. A second software application, written in VB.NET, was developed to access the EEPROM sensor information from a computer through a serial interface (USB).
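The write/read algorithm can be pictured, in simplified form, as packing a fixed-size record per sensor at a slot-dependent offset and unpacking it on the way back. The Python sketch below simulates the EEPROM as a byte buffer; the record layout, block size and field names are invented for illustration and do not reproduce the IEEE 1451.4 TEDS bit structure or the paper's C implementation.

```python
# Simplified sketch of storing several sensor-description blocks at fixed
# offsets in an EEPROM-like byte buffer. The field layout is a made-up
# stand-in, not the IEEE 1451.4 TEDS bit structure.
import struct

BLOCK_SIZE = 32                      # bytes reserved per sensor block
FMT = "<8sBff"                       # name, sensor type id, min range, max range
eeprom = bytearray(10 * BLOCK_SIZE)  # room for ten blocks, as on the platform

def write_block(slot, name, type_id, lo, hi):
    record = struct.pack(FMT, name.encode()[:8].ljust(8, b"\0"), type_id, lo, hi)
    start = slot * BLOCK_SIZE
    eeprom[start:start + len(record)] = record

def read_block(slot):
    name, type_id, lo, hi = struct.unpack_from(FMT, eeprom, slot * BLOCK_SIZE)
    return name.rstrip(b"\0").decode(), type_id, lo, hi

write_block(0, "temp", 1, -40.0, 125.0)
write_block(1, "light", 2, 0.0, 1000.0)
print(read_block(0), read_block(1))
```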
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
Learning with Computers. AECA Resource Book Series, Volume 3, Number 2.
ERIC Educational Resources Information Center
Elliott, Alison
1996-01-01
Research has supported the idea that the use of computers in the education of young children promotes social interaction and academic achievement. This resource booklet provides an introduction to computers in early childhood settings to enrich learning opportunities and provides guidance to teachers to find developmentally appropriate software…
Computer conferencing: the "nurse" in the "Electronic School District".
Billings, D M; Phillips, A
1991-01-01
As computer-based instructional technologies become increasingly available, they offer new mechanisms for health educators to provide health instruction. This article describes a pilot project in which nurses established a computer conference to provide health instruction to high school students participating in an electronic link of high schools. The article discusses computer conferencing, the "Electronic School District," the design of the nursing conference, and the role of the nurse in distributed health education.
On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers
NASA Astrophysics Data System (ADS)
Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.
2017-10-01
This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.
Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan
2016-01-01
Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we may allocate the revenues based on their attributions according to the concept of the "Shapley value" to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing
Zhang, Min; Sun, Yan
2016-01-01
Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, we may allocate the revenues based on their attributions according to the concept of the "Shapley value" to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network. PMID:28030553
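For a small coalition, the Shapley-value revenue split mentioned in the abstract can be computed by brute force over player orderings. In the sketch below, the provider resources, the demand threshold and the characteristic function v() are invented placeholders, not the paper's model; only the averaging of marginal contributions is the standard Shapley computation.

```python
# Brute-force Shapley values for a tiny coalition of resource providers.
# The characteristic function v() (value of a coalition's pooled resources)
# is an illustrative stand-in, not the paper's model.
from itertools import permutations

providers = {"A": 4, "B": 2, "C": 1}   # spare resource units per device
REQUIRED = 5                            # resources an application needs

def v(coalition):
    # a coalition earns revenue 10 only if its pooled resources meet the demand
    return 10.0 if sum(providers[p] for p in coalition) >= REQUIRED else 0.0

def shapley(players):
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        seen = []
        for p in order:
            phi[p] += v(seen + [p]) - v(seen)   # marginal contribution
            seen.append(p)
    return {p: phi[p] / len(orders) for p in players}

print(shapley(list(providers)))   # larger contributors receive a larger share
```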
Pelzer, N L; Wiese, W H; Leysen, J M
1998-07-01
Veterinary medical students at Iowa State University were surveyed in January of 1997 to determine their general use of the Veterinary Medical Library and how they sought information in an electronic environment. Comparisons were made between this study and one conducted a decade ago to determine the effect of the growth in electronic resources on student library use and information-seeking behavior. The basic patterns of student activities in the library, resources used to find current information, and resources anticipated for future education needs remained unchanged. The 1997 students used the library most frequently for photocopying, office supplies, and studying coursework; they preferred textbooks and handouts as sources of current information. However, when these students went beyond textbooks and handouts to seek current information, a major shift was seen from the use of print indexes and abstracts in 1987 towards the use of computerized indexes and other electronic resources in 1997. Almost 60% of the students reported using the Internet for locating current information. Overall use of electronic materials was highest among a group of students receiving the problem-based learning method of instruction. Most of the students surveyed in 1997 indicated that electronic resources would have some degree of importance to them for future education needs. The electronic environment has provided new opportunities for information professionals to help prepare future veterinarians, some of whom will be practicing in remote geographical locations, to access the wealth of information and services available on the Internet and Web.
21 CFR 803.14 - How do I submit a report electronically?
Code of Federal Regulations, 2014 CFR
2014-04-01
... submissions include alternative reporting media (magnetic tape, disc, etc.) and computer-to-computer communication. (b) If your electronic report meets electronic reporting standards, guidance documents, or other...
Research on Key Technologies of Cloud Computing
NASA Astrophysics Data System (ADS)
Zhang, Shufen; Yan, Hongcan; Chen, Xuebin
With the development of multi-core processors, virtualization, distributed storage, broadband Internet and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks across a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space and software services according to demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details and lets them focus on their business, which favours innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services and applications as a public facility, so that people can use computer resources just as they use water, electricity, gas and telephone services. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition yet. This paper describes the three main service forms of cloud computing (SaaS, PaaS and IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization and the programming model.
Hard-real-time resource management for autonomous spacecraft
NASA Technical Reports Server (NTRS)
Gat, E.
2000-01-01
This paper describes tickets, a computational mechanism for hard-real-time autonomous resource management. Autonomous spacecraft control can be considered abstractly as a computational process whose outputs are spacecraft commands.
Synchronization of Finite State Shared Resources
1976-03-01
Edward A. Schneider, Department of Computer… The problem of synchronizing a set of operations defined on a shared resource…
Radar Control Optimal Resource Allocation
2015-07-13
…other tunable parameters of radars [17, 18]. Such radar resource scheduling usually demands massive computation. Even myopic… reduced validity of the optimal choice of radar resource. In the non-myopic context, the computational problem becomes exponentially more difficult… computed as t* in Eq. (19). We are only interested in t* > 1, and solving the inequality we obtain the…
Electron Impact Ionization: A New Parameterization for 100 eV to 1 MeV Electrons
NASA Technical Reports Server (NTRS)
Fang, Xiaohua; Randall, Cora E.; Lummerzheim, Dirk; Solomon, Stanley C.; Mills, Michael J.; Marsh, Daniel; Jackman, Charles H.; Wang, Wenbin; Lu, Gang
2008-01-01
Low, medium and high energy electrons can penetrate to the thermosphere (90-400 km; 55-240 miles) and mesosphere (50-90 km; 30-55 miles). These precipitating electrons ionize that region of the atmosphere, creating positively charged atoms and molecules and knocking off other negatively charged electrons. The precipitating electrons also create nitrogen-containing compounds along with other constituents. Since the electron precipitation amounts change within minutes, it is necessary to have a rapid method of computing the ionization and production of nitrogen-containing compounds for inclusion in computationally-demanding global models. A new methodology has been developed, which has parameterized a more detailed model computation of the ionizing impact of precipitating electrons over the very large range of 100 eV up to 1,000,000 eV. This new parameterization method is more accurate than a previous parameterization scheme, when compared with the more detailed model computation. Global models at the National Center for Atmospheric Research will use this new parameterization method in the near future.
Performance Evaluation of Resource Management in Cloud Computing Environments.
Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci
2015-01-01
Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.
Performance Evaluation of Resource Management in Cloud Computing Environments
Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci
2015-01-01
Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price. PMID:26555730
ERIC Educational Resources Information Center
Pezzoli, Jean A.
In June 1992, Maui Community College (MCC), in Hawaii, conducted a survey of the communities of Maui, Molokai, Lanai, and Hana to determine perceived needs for an associate degree and certificate program in electronics and computer engineering. Questionnaires were mailed to 500 firms utilizing electronic or computer services, seeking information…
Eurogrid: a new glideinWMS based portal for CDF data analysis
NASA Astrophysics Data System (ADS)
Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.
2012-12-01
The CDF experiment at Fermilab ended its Run-II phase on September 2011 after 11 years of operations and 10 fb-1 of collected data. CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, has been developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is to develop a portal easy to integrate in the existing CDF computing model, completely transparent to the user and requiring a minimum amount of maintenance support by the CDF collaboration. In this paper we will review the implementation of this new portal, and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein based Workload Management System (WMS) that works on top of Condor. As CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. Official since June 2011, Eurogrid effectively complements and supports CDF computing resources offering an optimal solution for the future in terms of required manpower for administration, support and development.
Daylighting simulation: methods, algorithms, and resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carroll, William L.
This document presents work conducted as part of Subtask C, "Daylighting Design Tools", Subgroup C2, "New Daylight Algorithms", of the IEA SHC Task 21 and the ECBCS Program Annex 29 "Daylight in Buildings". The search for and collection of daylighting analysis methods and algorithms led to two important observations. First, there is a wide range of needs for different types of methods to produce a complete analysis tool. These include: geometry; light modeling; characterization of the natural illumination resource; materials and components properties and representations; and usability issues (interfaces, interoperability, representation of analysis results, etc.). Second, very advantageously, there have been rapid advances in many basic methods in these areas, due to other forces. They are in part driven by: the commercial computer graphics community (commerce, entertainment); the lighting industry; architectural rendering and visualization for projects; and academia (course materials, research). This has led to a very rich set of information resources that have direct applicability to the small daylighting analysis community. Furthermore, much of this information is in fact available online. Because much of the information about methods and algorithms is now online, an innovative reporting strategy was used: the core formats are electronic, and are used to produce a printed form only secondarily. The electronic forms include both online WWW pages and a downloadable .PDF file with the same appearance and content. Both electronic forms include live primary and indirect links to actual information sources on the WWW. In most cases, little additional commentary is provided regarding the information links or citations that are provided. This in turn allows the report to be very concise. The links are expected to speak for themselves. The report consists of only about 10+ pages, with about 100+ primary links, but with potentially thousands of indirect links. For purposes of the printed version, a list of the links is explicitly provided. This document exists in HTML form at the URL address: http://eande.lbl.gov/Task21/dlalgorithms.html. An equivalent downloadable PDF version, also with live links, is available at the URL address: http://eande.lbl.gov/Task21/dlalgorithms.pdf. A printed report can be derived directly from either of the electronic versions by simply printing either of them. In addition to the live links in the electronic forms, all report forms, electronic and paper, also have explicitly listed link addresses so that they can be followed up or referenced manually.
Application of microarray analysis on computer cluster and cloud platforms.
Bernau, C; Boulesteix, A-L; Knaus, J
2013-01-01
Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
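The reason such analyses move so easily between clusters and the cloud is that every resampling or permutation iteration is independent. The small sketch below demonstrates that independence with a local process pool standing in for cluster or cloud workers; the two toy samples and the number of permutations are invented for illustration.

```python
# Minimal illustration of why permutation tests parallelize cleanly: each
# permutation is an independent task, so cluster and cloud workers only differ
# in where the processes run. A local process pool stands in for them here.
from concurrent.futures import ProcessPoolExecutor
import random

group_a = [2.9, 3.1, 3.4, 3.0, 3.6]
group_b = [2.4, 2.7, 2.5, 2.8, 2.6]
observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

def one_permutation(seed):
    rng = random.Random(seed)
    pooled = group_a + group_b
    rng.shuffle(pooled)
    a, b = pooled[:len(group_a)], pooled[len(group_a):]
    return sum(a) / len(a) - sum(b) / len(b)

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        null = list(pool.map(one_permutation, range(10_000)))
    p_value = sum(d >= observed for d in null) / len(null)
    print(f"permutation p-value: {p_value:.4f}")
```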
Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case.
NASA Astrophysics Data System (ADS)
Ciaschini, Vincenzo; Dal Pra, Stefano; dell'Agnello, Luca
2015-12-01
The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which proved successful and still meets its goals. However, Grid technology has not spread much beyond these communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. WLCG experiments aim to integrate their existing computing model with cloud deployments and to take advantage of so-called opportunistic resources (including HPC facilities), which are usually not Grid compliant. One feature missing from the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fair-share based access to the resources for the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility of accessing the Tier-1 computing resources as an OpenStack-based cloud service. The system, exploiting the dynamic partitioning mechanism already being used to enable multicore computing, allowed us to avoid a static split of the computing resources in the Tier-1 farm while still permitting a share-friendly approach. The hosts in a dynamically partitioned farm may be moved to or from the partition according to suitable policies for the request and release of computing resources. Nodes requested into the partition switch their role and become available to play a different one. In the cloud use case, hosts may switch from acting as Worker Nodes in the batch system farm to cloud compute nodes made available to tenants. In this paper we describe the dynamic partitioning concept, its implementation and its integration with our current batch system, LSF.
O'Donnell, Michael
2015-01-01
State-and-transition simulation modeling relies on knowledge of vegetation composition and structure (states) that describe community conditions, mechanistic feedbacks such as fire that can affect vegetation establishment, and the ecological processes that drive community conditions as well as the transitions between these states. However, as the need for modeling larger and more complex landscapes increases, a more advanced awareness of computing resources becomes essential. The objectives of this study include identifying challenges of executing state-and-transition simulation models, identifying common bottlenecks in computing resources, developing a workflow and software that enable parallel processing of Monte Carlo simulations, and identifying the advantages and disadvantages of different computing resources. To address these objectives, this study used the ApexRMS® SyncroSim software and embarrassingly parallel tasks of Monte Carlo simulations on a single multicore computer and on distributed computing systems. The results demonstrated that state-and-transition simulation models scale best in distributed computing environments, such as high-throughput and high-performance computing, because these environments disseminate the workloads across many compute nodes, thereby supporting analysis of larger landscapes, higher spatial resolution vegetation products, and more complex models. Using a case study and five different computing environments, the top result (high-throughput computing versus serial computations) indicated an approximately 96.6% decrease in computing time. With a single multicore compute node (bottom result), the computing time showed an 81.8% decrease relative to serial computations. These results provide insight into the tradeoffs of using different computing resources when research necessitates advanced integration of ecoinformatics incorporating large and complicated data inputs and models.
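Because each Monte Carlo replicate is independent, the simulations are embarrassingly parallel, which is what lets them spread across cores or compute nodes. The toy sketch below (not SyncroSim) walks a made-up three-state vegetation model with invented transition probabilities and distributes the replicates over a local process pool.

```python
# Toy state-and-transition Monte Carlo: each replicate walks a small state
# machine with fixed transition probabilities; replicates are independent and
# so map directly onto separate cores or cluster jobs. States and
# probabilities are invented for illustration only.
from concurrent.futures import ProcessPoolExecutor
from collections import Counter
import random

TRANSITIONS = {          # state -> [(next_state, probability), ...]
    "grass":  [("grass", 0.7), ("shrub", 0.3)],
    "shrub":  [("shrub", 0.6), ("forest", 0.3), ("grass", 0.1)],  # 0.1 = fire
    "forest": [("forest", 0.9), ("grass", 0.1)],                  # 0.1 = fire
}

def one_replicate(seed, years=50, start="grass"):
    rng = random.Random(seed)
    state = start
    for _ in range(years):
        nxt, probs = zip(*TRANSITIONS[state])
        state = rng.choices(nxt, weights=probs)[0]
    return state

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        finals = list(pool.map(one_replicate, range(1000)))
    print(Counter(finals))          # distribution of end states over replicates
```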
Dollar, Daniel M; Gallagher, John; Glover, Janis; Marone, Regina Kenny; Crooker, Cynthia
2007-04-01
To support migration from print to electronic resources, the Cushing/Whitney Medical Library at Yale University reorganized its Technical Services Department to focus on managing electronic resources. The library hired consultants to help plan the changes and to present recommendations for integrating electronic resource management into every position. The library task force decided to focus initial efforts on the periodical collection. To free staff time to devote to electronic journals, most of the print subscriptions were switched to online only and new workflows were developed for e-journals. Staff learned new responsibilities such as activating e-journals, maintaining accurate holdings information in the online public access catalog and e-journals database ("electronic shelf reading"), updating the link resolver knowledgebase, and troubleshooting. All of the serials team members now spend significant amounts of time managing e-journals. The serials staff now spends its time managing the materials most important to the library's clientele (e-journals and databases). The team's proactive approach to maintenance work and rapid response to reported problems should improve patrons' experiences using e-journals. The library is taking advantage of new technologies such as an electronic resource management system, and library workflows and procedures will continue to evolve as technology changes.
Dollar, Daniel M.; Gallagher, John; Glover, Janis; Marone, Regina Kenny; Crooker, Cynthia
2007-01-01
Objective: To support migration from print to electronic resources, the Cushing/Whitney Medical Library at Yale University reorganized its Technical Services Department to focus on managing electronic resources. Methods: The library hired consultants to help plan the changes and to present recommendations for integrating electronic resource management into every position. The library task force decided to focus initial efforts on the periodical collection. To free staff time to devote to electronic journals, most of the print subscriptions were switched to online only and new workflows were developed for e-journals. Results: Staff learned new responsibilities such as activating e-journals, maintaining accurate holdings information in the online public access catalog and e-journals database (“electronic shelf reading”), updating the link resolver knowledgebase, and troubleshooting. All of the serials team members now spend significant amounts of time managing e-journals. Conclusions: The serials staff now spends its time managing the materials most important to the library's clientele (e-journals and databases). The team's proactive approach to maintenance work and rapid response to reported problems should improve patrons' experiences using e-journals. The library is taking advantage of new technologies such as an electronic resource management system, and library workflows and procedures will continue to evolve as technology changes. PMID:17443247
NASA Tech Briefs, February 2000. Volume 24, No. 2
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Bio-Medical; Mathematics and Information Sciences; Computers and Peripherals.
Computing Bounds on Resource Levels for Flexible Plans
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Rijsman, David
2009-01-01
A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations, one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan, because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute the looser bounds that are currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose measure of complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N · O(maxflow(N))), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x, and O(maxflow(N)) is the measure of complexity (and thus of cost) of a maximum-flow algorithm applied to an auxiliary flow network of 2N nodes. The algorithm is believed to be efficient in practice; experimental analysis shows the practical cost of maxflow to be as low as O(N^1.5). The algorithm could be enhanced following at least two approaches. In the first approach, incremental subalgorithms for the computation of the envelope could be developed. By use of temporal scanning of the events in the temporal network, it may be possible to significantly reduce the size of the networks on which it is necessary to run the maximum-flow subalgorithm, thereby significantly reducing the time required for envelope calculation. In the second approach, the practical effectiveness of resource envelopes in the inner loops of search algorithms could be tested for multi-capacity resource scheduling. This testing would include inner-loop backtracking and termination tests and variable- and value-ordering heuristics that exploit the properties of resource envelopes more directly.
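The maximum-flow computation is the primitive the envelope algorithm leans on. The snippet below only demonstrates that primitive on a tiny made-up capacity network using the third-party networkx package; it is not the envelope algorithm itself, and the node names and capacities are illustrative assumptions.

```python
# Demonstrates the maximum-flow primitive on a tiny, made-up capacity network
# (third-party networkx package); in the envelope algorithm the capacities
# would be derived from activity resource consumptions.
import networkx as nx

G = nx.DiGraph()
G.add_edge("s", "a", capacity=3.0)
G.add_edge("s", "b", capacity=2.0)
G.add_edge("a", "t", capacity=2.0)
G.add_edge("a", "b", capacity=1.0)
G.add_edge("b", "t", capacity=3.0)

flow_value, flow_dict = nx.maximum_flow(G, "s", "t")
print(flow_value)        # 5.0 for this network
print(flow_dict["a"])    # how the flow through node "a" splits
```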
Selection of Electronic Resources.
ERIC Educational Resources Information Center
Weathers, Barbara
1998-01-01
Discusses the impact of electronic resources on collection development; selection of CD-ROMs, (platform, speed, video and sound, networking capability, installation and maintenance); selection of laser disks; and Internet evaluation (accuracy of content, authority, objectivity, currency, technical characteristics). Lists Web sites for evaluating…
Cloudbus Toolkit for Market-Oriented Cloud Computing
NASA Astrophysics Data System (ADS)
Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian
This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.
Overhauling, updating and augmenting NASA spacelink electronic information system
NASA Technical Reports Server (NTRS)
Blake, Jean A.
1991-01-01
NASA/Spacelink is a collection of NASA information and educational materials stored on a computer at the MSFC. It is provided by the NASA Educational Affairs Division and is operated by the Education Branch of the Marshall Center Public Affairs Office. It is designed to communicate with a wide variety of computers and modems, especially those most commonly found in classrooms and homes. It was made available to the public in February, 1988. The system may be accessed by educators and the public over regular telephone lines. NASA/Spacelink is free except for the cost of long distance calls. Overhauling and updating Spacelink was done to refurbish NASA/Spacelink, a very valuable resource medium. Several new classroom activities and miscellaneous topics were edited and entered into Spacelink. One of the areas that received a major overhaul (under the guidance of Amos Crisp) was the SPINOFFS BENEFITS, the great benefits resulting from America's space explorations. The Spinoff Benefits include information on a variety of topics including agriculture, communication, the computer, consumer, energy, equipment and materials, food, health, home, industry, medicine, natural resources, public services, recreation, safety, sports, and transportation. In addition to the Space Program Spinoff Benefits, the following is a partial list of some of the material updated and introduced: Astronaut Biographies, Miscellaneous Aeronautics Classroom Activities, Miscellaneous Astronomy Classroom Activities, Miscellaneous Rocketry Classroom Activities, Miscellaneous Classroom Activities, NASA and Its Center, NASA Areas of Research, NASA Patents, Licensing, NASA Technology Transfer, Pictures from Space Classroom Activities, Status of Current NASA Projects, Using Art to Teach Science, and Word Puzzles for Use in the Classroom.
Focus issue: series on computational and systems biology.
Gough, Nancy R
2011-09-06
The application of computational biology and systems biology is yielding quantitative insight into cellular regulatory phenomena. For the month of September, Science Signaling highlights research featuring computational approaches to understanding cell signaling and investigation of signaling networks, a series of Teaching Resources from a course in systems biology, and various other articles and resources relevant to the application of computational biology and systems biology to the study of signal transduction.
Electronic Computer Aided Design. Its Application in FE.
ERIC Educational Resources Information Center
Further Education Unit, London (England).
A study was conducted at the Electronics Industrial Unit at the Dorset Institute of Higher Education to investigate the feasibility of incorporating computer-aided design (CAD) in electrical and electronic courses. The aim was to investigate the application of CAD to electrical and electronic systems; the extent to which industrial developments…
Multi-Functional UV-Visible-IR Nanosensors Devices and Structures
2015-04-29
Dual-Gate MOSFET System, Proceedings of the International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics… International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics, 216-217 (2013); ISBN 978-3-901578-26-7. M. S… Raman Spectroscopy, Proceedings of the International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics, 198…
Luke, Stephen; Fountain, John S; Reith, David M; Braitberg, George; Cruickshank, Jaycen
2014-10-01
ED staff use a range of poisons information resources of varying type and quality. The present study aims to identify those resources utilised in the state of Victoria, Australia, and assess opinion of the most used electronic products. A previously validated self-administered survey was conducted in 15 EDs, with 10 questionnaires sent to each. The survey was then repeated following the provision of a 4-month period of access to Toxinz™, an Internet poisons information product novel to the region. The study was conducted from December 2010 to August 2011. There were 117 (78%) and 48 (32%) responses received from the first and second surveys, respectively, a 55% overall response rate. No statistically significant differences in professional group, numbers of poisoned patients seen or resource type accessed were identified between studies. The electronic resource most used in the first survey was Poisindex® (48.68%) and Toxinz™ (64.1%) in the second. There were statistically significant (P < 0.01) improvements in satisfaction in 26 of 42 questions between surveys, and no decrements. Although the majority of responders possessed mobile devices, less than half used them for poisons information but would do so if a reputable product was available. The order of poisons information sources most utilised was: consultation with a colleague, in-house protocols and electronic resources. There was a significant difference in satisfaction with electronic poisons information resources and a movement away from existing sources when choice was provided. Interest in increased use of mobile solutions was identified. © 2014 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
Computer simulation of electron flow in linear-beam microwave tubes
NASA Astrophysics Data System (ADS)
Kumar, Lalit
1990-12-01
The computer simulation of electron flow in linear-beam microwave tubes, such as a travelling-wave tube (TWT) and klystron, is used for designing and optimising the electron gun and collector and for analysing the large-signal beam-wave interaction phenomenon. Major aspects of simulation of electron flow in static and rf fields present in such tubes are discussed. Some advancements made in this respect and results obtained from computer programs developed by the research group at CEERI for a gridded electron gun, depressed collector, and large-signal analysis of TWT and klystron are presented.
Applications of computer-aided text analysis in natural resources.
David N. Bengston
2000-01-01
Ten contributed papers describe the use of a variety of approaches to computer-aided text analysis and their application to a wide range of research questions related to natural resources and the environment. Taken together, these papers paint a picture of a growing and vital area of research on the human dimensions of natural resource management.
SCANIT: centralized digitizing of forest resource maps or photographs
Elliot L. Amidon; E. Joyce Dye
1981-01-01
Spatial data on wildland resource maps and aerial photographs can be analyzed by computer after digitizing. SCANIT is a computerized system for encoding such data in digital form. The system, consisting of a collection of computer programs and subroutines, provides a powerful and versatile tool for a variety of resource analyses. SCANIT also may be converted easily to...
NASA Astrophysics Data System (ADS)
Burnett, W.
2016-12-01
The Department of Defense's (DoD) High Performance Computing Modernization Program (HPCMP) provides high performance computing to address the most significant challenges in computational resources, software application support and nationwide research and engineering networks. Today, the HPCMP has a critical role in ensuring the National Earth System Prediction Capability (N-ESPC) achieves initial operational status in 2019. A 2015 study commissioned by the HPCMP found that N-ESPC computational requirements will exceed interconnect bandwidth capacity due to the additional load from data assimilation and passing connecting data between ensemble codes. Memory bandwidth and I/O bandwidth will continue to be significant bottlenecks for the Navy's Hybrid Coordinate Ocean Model (HYCOM) scalability - by far the major driver of computing resource requirements in the N-ESPC. The study also found that few of the N-ESPC model developers have detailed plans to ensure their respective codes scale through 2024. Three HPCMP initiatives are designed to directly address and support these issues: Productivity Enhancement, Technology, Transfer and Training (PETTT), the HPCMP Applications Software Initiative (HASI), and Frontier Projects. PETTT supports code conversion by providing assistance, expertise and training in scalable and high-end computing architectures. HASI addresses the continuing need for modern application software that executes effectively and efficiently on next-generation high-performance computers. Frontier Projects enable research and development that could not be achieved using typical HPCMP resources by providing multi-disciplinary teams access to exceptional amounts of high performance computing resources. Finally, the Navy's DoD Supercomputing Resource Center (DSRC) currently operates a 6 Petabyte system, of which Naval Oceanography receives 15% of operational computational system use, or approximately 1 Petabyte of the processing capability. The DSRC will provide the DoD with future computing assets to initially operate the N-ESPC in 2019. This talk will further describe how DoD's HPCMP will ensure N-ESPC becomes operational, efficiently and effectively, using next-generation high performance computing.
ERIC Educational Resources Information Center
Noh, Younghee
2010-01-01
This study aimed to improve the current state of electronic resource evaluation in libraries. While the use of Web DB, e-book, e-journal, and other e-resources such as CD-ROM, DVD, and micro materials is increasing in libraries, their use is not comprehensively factored into the general evaluation of libraries and may diminish the reliability of…
Electronic Commerce Resource Centers. An Industry--University Partnership.
ERIC Educational Resources Information Center
Gulledge, Thomas R.; Sommer, Rainer; Tarimcilar, M. Murat
1999-01-01
Electronic Commerce Resource Centers focus on transferring emerging technologies to small businesses through university/industry partnerships. Successful implementation hinges on a strategic operating plan, creation of measurable value for customers, investment in customer-targeted training, and measurement of performance outputs. (SK)
National electronic medical records integration on cloud computing system.
Mirza, Hebah; El-Masri, Samir
2013-01-01
Few Healthcare providers have an advanced level of Electronic Medical Record (EMR) adoption. Others have a low level and most have no EMR at all. Cloud computing technology is a new emerging technology that has been used in other industry and showed a great success. Despite the great features of Cloud computing, they haven't been utilized fairly yet in healthcare industry. This study presents an innovative Healthcare Cloud Computing system for Integrating Electronic Health Record (EHR). The proposed Cloud system applies the Cloud Computing technology on EHR system, to present a comprehensive EHR integrated environment.
Rouillard, Andrew D; Wang, Zichen; Ma'ayan, Avi
2015-12-01
With advances in genomics, transcriptomics, metabolomics and proteomics, and more expansive electronic clinical record monitoring, as well as advances in computation, we have entered the Big Data era in biomedical research. Data gathering is growing rapidly while only a small fraction of this data is converted to useful knowledge or reused in future studies. To improve this, an important concept that is often overlooked is data abstraction. To fuse and reuse biomedical datasets from diverse resources, data abstraction is frequently required. Here we summarize some of the major Big Data biomedical research resources for genomics, proteomics and phenotype data, collected from mammalian cells, tissues and organisms. We then suggest simple data abstraction methods for fusing this diverse but related data. Finally, we demonstrate examples of the potential utility of such data integration efforts, while warning about the inherent biases that exist within such data. Copyright © 2015 Elsevier Ltd. All rights reserved.
2011-01-01
Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged and pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lower the barrier to entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105
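The execution model described above (run locally when the job fits, burst to cloud instances when it does not) can be sketched as a simple dispatcher; the function names, thresholds, and workload sizes below are hypothetical stand-ins and are not CloVR's actual interface.

```python
# Hypothetical sketch of a local-versus-cloud dispatch decision for a packaged
# sequence-analysis pipeline. Names and thresholds are illustrative only.
import multiprocessing


def run_locally(samples: list) -> None:
    print(f"Running {len(samples)} sample(s) on {multiprocessing.cpu_count()} local core(s)")


def run_on_cloud(samples: list, instances: int) -> None:
    print(f"Provisioning {instances} cloud instance(s) for {len(samples)} sample(s)")


def dispatch(samples: list, local_limit: int = 4, samples_per_instance: int = 8) -> None:
    """Choose local or cloud execution based on workload size."""
    if len(samples) <= local_limit:
        run_locally(samples)
    else:
        needed = -(-len(samples) // samples_per_instance)  # ceiling division
        run_on_cloud(samples, instances=needed)


dispatch(["16S_run_A"])                           # small job: stays on the local VM
dispatch([f"metagenome_{i}" for i in range(30)])  # large job: bursts to the cloud
```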
Real-Time Electronic Dashboard Technology and Its Use to Improve Pediatric Radiology Workflow.
Shailam, Randheer; Botwin, Ariel; Stout, Markus; Gee, Michael S
The purpose of our study was to create a real-time electronic dashboard in the pediatric radiology reading room providing a visual display of updated information regarding scheduled and in-progress radiology examinations that could help radiologists improve clinical workflow and efficiency. To accomplish this, a script was set up to automatically send real-time HL7 messages from the radiology information system (Epic Systems, Verona, WI) to an Iguana interface engine, with relevant data regarding examinations stored in an SQL Server database for visual display on the dashboard. Implementation of an electronic dashboard in the reading room of a pediatric radiology academic practice has led to several improvements in clinical workflow, including a shorter interval for radiologist protocol entry for computed tomography and magnetic resonance imaging examinations as well as fewer telephone calls related to unprotocoled examinations. Other advantages include the enhanced ability of radiologists to anticipate and attend to examinations requiring radiologist monitoring or scanning, as well as to work with technologists and operations managers to optimize scheduling of radiology resources. We foresee increased utilization of electronic dashboard technology in the future as a method to improve radiology workflow and quality of patient care. Copyright © 2017 Elsevier Inc. All rights reserved.
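A simplified stand-in for the feed behind such a dashboard is sketched below: an HL7-style message is parsed into the handful of fields the display needs and written to a local SQLite table (used here in place of the SQL Server database); the message content, field positions, and table layout are illustrative, not the site's actual interface configuration.

```python
# Simplified dashboard feed: parse an HL7-style order message and store the
# fields the worklist display needs. Content and layout are invented.
import sqlite3

hl7_message = (
    "MSH|^~\\&|RIS|HOSP|DASH|RAD|202301011200||ORM^O01|123|P|2.3\r"
    "PID|1||MRN0001||DOE^JANE\r"
    "OBR|1|ACC123||MRI BRAIN W/O CONTRAST|||202301011215"
)

def parse(message: str) -> dict:
    """Pull the handful of fields the dashboard needs from the raw message."""
    segments = {line.split("|")[0]: line.split("|") for line in message.split("\r")}
    return {
        "accession": segments["OBR"][2],
        "patient":   segments["PID"][5],
        "exam":      segments["OBR"][4],
        "scheduled": segments["OBR"][7],
    }

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE worklist (accession TEXT, patient TEXT, exam TEXT, scheduled TEXT)")
conn.execute("INSERT INTO worklist VALUES (:accession, :patient, :exam, :scheduled)", parse(hl7_message))

# The dashboard polls this table and renders exams awaiting a protocol.
for rec in conn.execute("SELECT accession, patient, exam, scheduled FROM worklist"):
    print(rec)
```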
NASA Astrophysics Data System (ADS)
Momot, M. V.; Politsinskaia, E. V.; Sushko, A. V.; Semerenko, I. A.
2016-08-01
The paper considers the problem of selecting a mathematical filter for balancing a wheeled robot under conditions of limited computational resources. A solution based on a complementary filter is proposed.
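For reference, a complementary filter blends the integrated gyroscope rate (accurate over short intervals but prone to drift) with the accelerometer tilt angle (noisy but drift-free) using a single weighted sum per sample, which is what makes it attractive on a resource-limited controller. The sketch below uses invented sensor values and a typical blend coefficient; it illustrates the general technique, not the authors' implementation.

```python
# Complementary filter for tilt estimation: one multiply-add per sample,
# suitable for low-cost microcontrollers. Sensor values are invented.
import math


def complementary_filter(angle: float, gyro_rate: float,
                         accel_x: float, accel_z: float,
                         dt: float, alpha: float = 0.98) -> float:
    """Return the new tilt estimate (radians) from one gyro/accel sample pair."""
    angle_gyro = angle + gyro_rate * dt           # integrate angular velocity
    angle_accel = math.atan2(accel_x, accel_z)    # tilt implied by gravity direction
    return alpha * angle_gyro + (1.0 - alpha) * angle_accel


angle = 0.0
for gyro_rate, ax, az in [(0.05, 0.02, 0.99), (0.04, 0.03, 0.98), (0.02, 0.05, 0.99)]:
    angle = complementary_filter(angle, gyro_rate, ax, az, dt=0.01)
    print(f"tilt estimate: {math.degrees(angle):+.3f} deg")
```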
Development of expert systems for analyzing electronic documents
NASA Astrophysics Data System (ADS)
Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.
2018-05-01
The paper analyses a database management system (DBMS). Expert systems, databases, and database technology have become essential components of everyday life in modern society. Because databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment; it consists of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans, and Alfresco, together with system components previously developed at Tomsk State University in the Faculty of Innovative Technology.
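The combination described above pairs a conventional DBMS with expert-system rules; the sketch below illustrates the idea with a SQLite table and one rule written in Python rather than CLIPS, using invented table and record contents.

```python
# Illustrative pairing of a DBMS (shared records) with an expert-system style
# rule applied to them. Table and record contents are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE teachers (name TEXT, school TEXT, certificate_on_file INTEGER)")
conn.executemany(
    "INSERT INTO teachers VALUES (?, ?, ?)",
    [("A. Hassan", "School 1", 1), ("B. Karim", "School 2", 0)],
)

# Rule: IF a teacher record has no certificate on file
#       THEN flag it for follow-up by the directorate.
def missing_certificate_rule(db: sqlite3.Connection) -> list:
    return db.execute(
        "SELECT name, school FROM teachers WHERE certificate_on_file = 0"
    ).fetchall()

for name, school in missing_certificate_rule(conn):
    print(f"Follow-up required: {name} ({school})")
```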
Odekunle, Florence Femi; Odekunle, Raphael Oluseun; Shankar, Srinivasan
2017-01-01
Poor health information systems have been identified as a major challenge in the health-care systems of many developing countries, including sub-Saharan African countries. The electronic health record (EHR) has been shown to be an important tool for improving access to patient information, with attendant improvements in quality of care. However, the EHR has not been widely implemented or adopted in sub-Saharan Africa. This study sought to identify factors that affect the adoption of an EHR in sub-Saharan Africa and strategies to improve its adoption in this region. A comprehensive literature search was conducted on three electronic databases: PubMed, Medline, and Google Scholar. Articles of interest were those published in English that contained information on factors that limit the adoption of an EHR as well as strategies that improve its adoption in sub-Saharan African countries. The available evidence indicated that many factors hinder the widespread adoption of an EHR in sub-Saharan Africa: high costs of procurement and maintenance of the EHR system, lack of financial incentives and priorities, poor electricity supply and internet connectivity, and primary users' limited computer skills. However, strategies such as implementation planning, financial support, appropriate EHR system selection, training of primary users, and adoption of a phased implementation process have been identified to facilitate EHR use. Wide adoption of an EHR in the sub-Saharan Africa region requires considerably more effort than is often assumed because of the current poor level of technological development, lack of required computer skills, and limited resources. PMID:29085270