Sample records for computer technology resources

  1. A study of computer graphics technology in application of communication resource management

    NASA Astrophysics Data System (ADS)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics has come into wide use, and the success of object-oriented and multimedia technology has further advanced graphics within computer software systems. Computer graphics theory and its applications have therefore become an important topic in computing, and the technology is applied in an ever broader range of fields. In recent years, however, with rapid social and economic development, and of information technology in particular, traditional approaches to communication resource management can no longer meet resource-management needs. Communication resource management still relies on the original tools and methods for managing and maintaining equipment, which causes many problems: it is difficult for non-professionals to understand the equipment and its status, resource utilization is relatively low, and managers cannot obtain a quick, accurate picture of resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. Doing so not only makes communication resource management more vivid, but also reduces its cost and improves work efficiency.

  2. Computer-Based Resource Accounting Model for Automobile Technology Impact Assessment

    DOT National Transportation Integrated Search

    1976-10-01

    A computer-implemented resource accounting model has been developed for assessing resource impacts of future automobile technology options. The resources tracked are materials, energy, capital, and labor. The model has been used in support of the Int...

  3. Study on the application of mobile internet cloud computing platform

    NASA Astrophysics Data System (ADS)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    Innovation in computer technology has driven the adoption of cloud computing platforms, which in essence substitute and exchange resource service models, meeting users' needs for different resources after adjustments in multiple respects. Cloud computing offers advantages in many areas: it reduces the difficulty of operating the system and makes it easy for users to search for, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The spread of computer technology has driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to build a high-performance query and search platform that lets users access the information resources they need at any time. Cloud computing, in turn, can distribute computations across a large number of distributed computers and thereby connect multiple machines into a single service. Digital libraries, as a typical representative of cloud computing applications, are used here to analyze the key technologies of cloud computing.

  4. Computer Technology Resources for Literacy Projects.

    ERIC Educational Resources Information Center

    Florida State Council on Aging, Tallahassee.

    This resource booklet was prepared to assist literacy projects and community adult education programs in determining the technology they need to serve more older persons. Section 1 contains the following reprinted articles: "The Human Touch in the Computer Age: Seniors Learn Computer Skills from Schoolkids" (Suzanne Kashuba);…

  5. An integrated system for land resources supervision based on the IoT and cloud computing

    NASA Astrophysics Data System (ADS)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.
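
The abstract above names MapReduce-based high-performance parallel computing as one of the integrated key technologies. As a loose illustration of the map/shuffle/reduce pattern only (plain Python, no Hadoop; the record shape and violation labels are invented, not taken from the system described):

```python
from collections import defaultdict

# Minimal MapReduce-style aggregation sketch (no framework; illustrative only).
# Input: (parcel_id, violation_type) pairs, as a supervision system might extract.
records = [
    ("P001", "illegal_construction"),
    ("P002", "unapproved_mining"),
    ("P003", "illegal_construction"),
    ("P004", "illegal_construction"),
]

def map_phase(record):
    parcel_id, violation = record
    return (violation, 1)                     # emit a (key, value) pair

def reduce_phase(grouped):
    return {key: sum(values) for key, values in grouped.items()}

# Shuffle: group the mapped pairs by key.
grouped = defaultdict(list)
for key, value in map(map_phase, records):
    grouped[key].append(value)

counts = reduce_phase(grouped)
print(counts)  # {'illegal_construction': 3, 'unapproved_mining': 1}
```

In a real MapReduce deployment the map and reduce phases run on many machines and the shuffle moves data between them; the per-key logic is the same.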

  6. Design & implementation of distributed spatial computing node based on WPS

    NASA Astrophysics Data System (ADS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-03-01

    Current research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in a grid environment, while the importance of spatial computing resources is ignored. To enable the sharing of, and cooperation among, spatial computing resources in a grid environment, this paper systematically studies the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and verified in that environment.
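
The WPS specification referenced above defines a small set of HTTP operations (GetCapabilities, DescribeProcess, Execute). A minimal sketch of building WPS 1.0.0 key-value-pair request URLs, with a hypothetical endpoint standing in for a real Spatial Computing Node:

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint; a real Spatial Computing Node publishes its own URL.
WPS_ENDPOINT = "http://example.org/wps"

def wps_url(request, **extra):
    """Build an OGC WPS 1.0.0 key-value-pair request URL."""
    params = {"service": "WPS", "version": "1.0.0", "request": request}
    params.update(extra)
    return WPS_ENDPOINT + "?" + urlencode(params)

# Discover the processes a node offers...
print(wps_url("GetCapabilities"))

# ...then ask for the signature of one process by its identifier
# ("Buffer" here is an illustrative process name).
print(wps_url("DescribeProcess", identifier="Buffer"))
```

An Execute request follows the same shape with a `DataInputs` parameter; its encoding is more involved, so real clients usually POST an XML request body instead.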

  7. Diversity in computing technologies and strategies for dynamic resource allocation

    DOE PAGES

    Garzoglio, G.; Gutsche, O.

    2015-12-23

    High Energy Physics (HEP) is a very data-intensive and trivially parallelizable science discipline. HEP probes nature at increasingly fine detail, requiring ever-increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP has provisioned resources so far using Grid technologies, how it is starting to include new resource providers such as commercial clouds and HPC installations, and how it provisions resources transparently across these diverse providers.

  8. 48 CFR 352.239-72 - Security requirements for Federal information technology resources.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...' mission. The term “information technology (IT)”, as used in this clause, includes computers, ancillary... Federal information technology resources. 352.239-72 Section 352.239-72 Federal Acquisition Regulations... Provisions and Clauses 352.239-72 Security requirements for Federal information technology resources. As...

  9. 48 CFR 352.239-72 - Security requirements for Federal information technology resources.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...' mission. The term “information technology (IT)”, as used in this clause, includes computers, ancillary... Federal information technology resources. 352.239-72 Section 352.239-72 Federal Acquisition Regulations... Provisions and Clauses 352.239-72 Security requirements for Federal information technology resources. As...

  10. 48 CFR 352.239-72 - Security requirements for Federal information technology resources.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...' mission. The term “information technology (IT)”, as used in this clause, includes computers, ancillary... Federal information technology resources. 352.239-72 Section 352.239-72 Federal Acquisition Regulations... Provisions and Clauses 352.239-72 Security requirements for Federal information technology resources. As...

  11. 48 CFR 352.239-72 - Security requirements for Federal information technology resources.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...' mission. The term “information technology (IT)”, as used in this clause, includes computers, ancillary... Federal information technology resources. 352.239-72 Section 352.239-72 Federal Acquisition Regulations... Provisions and Clauses 352.239-72 Security requirements for Federal information technology resources. As...

  12. Computer-Based Resource Accounting Model for Generating Aggregate Resource Impacts of Alternative Automobile Technologies : Volume 1. Fleet Attributes Model

    DOT National Transportation Integrated Search

    1977-01-01

    Auto production and operation consume energy, material, capital and labor resources. Numerous substitution possibilities exist within and between resource sectors, corresponding to the broad spectrum of potential design technologies. Alternative auto...

  13. System design and implementation of digital-image processing using computational grids

    NASA Astrophysics Data System (ADS)

    Shen, Zhanfeng; Luo, Jiancheng; Zhou, Chenghu; Huang, Guangyu; Ma, Weifeng; Ming, Dongping

    2005-06-01

    As a special type of digital image, remotely sensed images play increasingly important roles in our daily lives. Because of the enormous amounts of data involved and the difficulties of data processing and transfer, an important issue for computer and geoscience experts is developing internet technology for rapid remotely sensed image processing. Computational grids can solve this problem effectively: these networks of computer workstations enable the sharing of data and resources and are used to correct imbalances in network resource usage. In China, computational grids combined with spatial-information-processing technology have formed a new technology, namely spatial-information grids. For remotely sensed images, spatial-information grids work more effectively for network computing, data processing, resource sharing, task cooperation, and so on. This paper focuses mainly on the application of computational grids to digital-image processing. First, we describe the architecture of digital-image processing based on computational grids; its implementation is then discussed in detail with respect to middleware technology. The whole network-based intelligent image-processing system is evaluated through experimental analysis of remotely sensed image-processing tasks; the results confirm the feasibility of applying computational grids to digital-image processing.
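
The tile-wise parallelism that grid image processing relies on can be sketched in miniature: split a scene into row bands, process the bands concurrently, and combine the partial results. This is a toy stand-in (threads on one machine, a trivial per-tile statistic), not the middleware architecture the paper describes:

```python
from concurrent.futures import ThreadPoolExecutor

# A "scene" here is just a 2-D list of pixel values; the per-tile task is a
# trivial threshold count. Both are placeholders for real raster processing.

def count_bright_pixels(tile, threshold=128):
    return sum(1 for row in tile for px in row if px >= threshold)

def split_rows(scene, n_tiles):
    """Partition the scene row-wise into roughly n_tiles chunks."""
    step = max(1, len(scene) // n_tiles)
    return [scene[i:i + step] for i in range(0, len(scene), step)]

scene = [[(r * 17 + c * 31) % 256 for c in range(64)] for r in range(64)]
tiles = split_rows(scene, n_tiles=4)

# Farm the tiles out to workers, as grid middleware would farm them to nodes.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(count_bright_pixels, tiles))

total = sum(partials)   # combining partial results gives the serial answer
print(total)
```

The key property, as in a real grid, is that the combined partial results equal the result of processing the whole scene serially.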

  14. Instructional Technology Comes of Age. Research Center Update.

    ERIC Educational Resources Information Center

    Clery, Suzanne; Lee, John

    This report reviews the perceptions of the chief academic computing officer on campus of how well prepared faculty members in various academic departments were to use technology as a resource, which were the most important academic and instructional computing policies, procedures, and resources on campus, and what institutions saw as the most…

  15. Resource Manual on the Use of Computers in Schooling.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Technology Applications.

    This resource manual is designed to provide educators with timely information on the use of computers and related technology in schools. Section one includes a review of the new Bureau of Technology Applications' goal, functions, and major programs and activities; a description of the Model Schools Program, which has been conceptually derived from…

  16. Realizing the Potential of Information Resources: Information, Technology, and Services. Track 8: Academic Computing and Libraries.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    Eight papers are presented from the 1995 CAUSE conference track on academic computing and library issues faced by managers of information technology at colleges and universities. The papers include: (1) "Where's the Beef?: Implementation of Discipline-Specific Training on Internet Resources" (Priscilla Hancock and others); (2)…

  17. Job Scheduling with Efficient Resource Monitoring in Cloud Datacenter

    PubMed Central

    Loganathan, Shyamala; Mukherjee, Saswati

    2015-01-01

    Cloud computing is an on-demand computing model that uses virtualization technology to provide cloud resources to users in the form of virtual machines over the internet. Being an adaptable technology, cloud computing is an excellent option for organizations forming their own private clouds. Since resources in these private clouds are limited, the ultimate goals are maximizing resource utilization and guaranteeing service to users, which requires efficient scheduling. This research reports an efficient data structure for resource management and a resource scheduling technique in a private cloud environment, and discusses a cloud model. The proposed scheduling algorithm considers job types and resource availability in its scheduling decisions. Finally, we conducted simulations using CloudSim and compared our algorithm with existing methods such as V-MCT and priority scheduling algorithms. PMID:26473166

  18. Job Scheduling with Efficient Resource Monitoring in Cloud Datacenter.

    PubMed

    Loganathan, Shyamala; Mukherjee, Saswati

    2015-01-01

    Cloud computing is an on-demand computing model that uses virtualization technology to provide cloud resources to users in the form of virtual machines over the internet. Being an adaptable technology, cloud computing is an excellent option for organizations forming their own private clouds. Since resources in these private clouds are limited, the ultimate goals are maximizing resource utilization and guaranteeing service to users, which requires efficient scheduling. This research reports an efficient data structure for resource management and a resource scheduling technique in a private cloud environment, and discusses a cloud model. The proposed scheduling algorithm considers job types and resource availability in its scheduling decisions. Finally, we conducted simulations using CloudSim and compared our algorithm with existing methods such as V-MCT and priority scheduling algorithms.
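
As a rough illustration of scheduling that considers resource availability, the sketch below greedily places each job on the virtual machine with the most free CPUs. The VM names, capacities, and job demands are invented; the paper's actual data structure and its V-MCT comparison are not reproduced here:

```python
import heapq

def schedule(jobs, vms):
    """Greedy placement: give each job to the VM with the most free CPUs.

    jobs: list of (job_name, cpus_needed); vms: list of (vm_name, free_cpus).
    Returns {job_name: vm_name or None}; None means the job must wait.
    """
    heap = [(-free, name) for name, free in vms]   # max-heap via negation
    heapq.heapify(heap)
    placement = {}
    for job, need in jobs:
        neg_free, name = heapq.heappop(heap)       # VM with most free CPUs
        free = -neg_free
        if free < need:                            # even the largest VM is full
            heapq.heappush(heap, (neg_free, name))
            placement[job] = None
        else:
            placement[job] = name
            heapq.heappush(heap, (-(free - need), name))
    return placement

jobs = [("analytics", 4), ("web", 1), ("batch", 8)]
vms = [("vm-a", 8), ("vm-b", 4)]
print(schedule(jobs, vms))  # {'analytics': 'vm-a', 'web': 'vm-a', 'batch': None}
```

A production scheduler would also track memory and I/O, re-queue waiting jobs as capacity frees up, and weigh job type, as the abstract describes.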

  19. Lockheed Martin Idaho Technologies Company information management technology architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, M.J.; Lau, P.K.S.

    1996-05-01

    The Information Management Technology Architecture (TA) is being driven by the business objectives of reducing costs and improving effectiveness. The strategy is to reduce the cost of computing through standardization. The Lockheed Martin Idaho Technologies Company (LMITCO) TA is a set of standards and products for use at the Idaho National Engineering Laboratory (INEL). The TA will provide direction for information management resource acquisitions, development of information systems, formulation of plans, and resolution of issues involving LMITCO computing resources. Exceptions to the preferred products may be granted by the Information Management Executive Council (IMEC). Certain implementation and deployment strategies are inherent in the design and structure of the LMITCO TA. These include: migration from centralized toward distributed computing; deployment of the networks, servers, and other information technology infrastructure components necessary for a more integrated information technology support environment; increased emphasis on standards to make it easier to link systems and to share information; and improved use of the company's investment in desktop computing resources. The intent is for the LMITCO TA to be a living document, constantly reviewed to take advantage of industry directions to reduce costs while balancing technological diversity with business flexibility.

  20. Meeting the computer technology needs of community faculty: building new models for faculty development.

    PubMed

    Baldwin, Constance D; Niebuhr, Virginia N; Sullivan, Brian

    2004-01-01

    We aimed to identify the evolving computer technology needs and interests of community faculty in order to design an effective faculty development program focused on computer skills: the Teaching and Learning Through Educational Technology (TeLeTET) program. Repeated surveys were conducted between 1994 and 2002 to assess computer resources and needs in a pool of over 800 primary care physician-educators in community practice in East Texas. Based on the results, we developed and evaluated several models to teach community preceptors about computer technologies useful for education. Before 1998, only half of our community faculty identified a strong interest in developing their technology skills. As the revolution in telecommunications advanced, however, preceptors' needs and interests changed, and using this technology to support community-based teaching became feasible. In 1998 and 1999, resource surveys showed that many of our community teaching sites had computers and Internet access. By 2001, the desire for teletechnology skills development was strong in a nucleus of community faculty, although lack of infrastructure, time, and skills were identified as barriers. The TeLeTET project developed several innovative models for technology workshops and conferences, supplemented by online resources, that were well attended and positively evaluated by 181 community faculty over a 3-year period. We identified the evolving needs of community faculty through iterative needs assessments, developed a flexible faculty development curriculum, and used open-ended formative evaluation techniques to keep the TeLeTET program responsive to a rapidly changing environment for community-based education in computer technology.

  1. A Review of Resources for Evaluating K-12 Computer Science Education Programs

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Hartikainen, Elina

    2004-01-01

    Since computer science education is a key to preparing students for a technologically-oriented future, it makes sense to have high quality resources for conducting summative and formative evaluation of those programs. This paper describes the results of a critical analysis of the resources for evaluating K-12 computer science education projects.…

  2. Computers, Technology, and Disability. [Update].

    ERIC Educational Resources Information Center

    American Council on Education, Washington, DC. HEATH Resource Center.

    This paper describes programs and resources that focus on access of postsecondary students with disabilities to computers and other forms of technology. Increased access to technological devices and services is provided to students with disabilities under the Technology-Related Assistance for Individuals with Disabilities Act (Tech Act). Section…

  3. Current Issues for Higher Education Information Resources Management.

    ERIC Educational Resources Information Center

    CAUSE/EFFECT, 1996

    1996-01-01

    Issues identified as important to the future of information resources management and use in higher education include information policy in a networked environment, distributed computing, integrating information resources and college planning, benchmarking information technology, integrated digital libraries, technology integration in teaching,…

  4. University Students and Ethics of Computer Technology Usage: Human Resource Development

    ERIC Educational Resources Information Center

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  5. Community-Authored Resources for Education

    ERIC Educational Resources Information Center

    Tinker, Robert; Linn, Marcia; Gerard, Libby; Staudt, Carolyn

    2010-01-01

    Textbooks are resources for learning that provide the structure, content, assessments, and teacher guidance for an entire course. Technology can provide a far better resource that provides the same functions, but takes full advantage of computers. This resource would be much more than text on a screen. It would use the best technology; it would be…

  6. Databases, data integration, and expert systems: new directions in mineral resource assessment and mineral exploration

    USGS Publications Warehouse

    McCammon, Richard B.; Ramani, Raja V.; Mozumdar, Bijoy K.; Samaddar, Arun B.

    1994-01-01

    Overcoming future difficulties in searching for ore deposits deeper in the earth's crust will require closer attention to the collection and analysis of more diverse types of data and to more efficient use of current computer technologies. Computer technologies of greatest interest include methods of storage and retrieval of resource information, methods for integrating geologic, geochemical, and geophysical data, and the introduction of advanced computer technologies such as expert systems, multivariate techniques, and neural networks. Much experience has been gained in the past few years in applying these technologies. More experience is needed if they are to be implemented for everyday use in future assessments and exploration.

  7. An approach for heterogeneous and loosely coupled geospatial data distributed computing

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Huang, Fengru; Fang, Yu; Huang, Zhou; Lin, Hui

    2010-07-01

    Most GIS (Geographic Information System) applications involve heterogeneous and autonomous geospatial information resources, and the availability of these local resources is unpredictable and dynamic in a distributed computing environment. To use these local resources together to solve larger geospatial information processing problems, this paper, with the support of peer-to-peer computing technologies, proposes a geospatial data distributed computing mechanism that involves loosely coupled geospatial resource directories and what we term the Equivalent Distributed Program of global geospatial queries, to solve geospatial distributed computing problems under heterogeneous GIS environments. First, a geospatial query process schema for distributed computing is presented, along with a method for equivalently transforming a global geospatial query into distributed local queries at the SQL (Structured Query Language) level to solve the coordination problem among heterogeneous resources. Second, peer-to-peer technologies are used to maintain a loosely coupled network environment of autonomous geospatial information resources, to achieve decentralized and consistent synchronization among global geospatial resource directories, and to carry out distributed transaction management of local queries. Finally, based on the developed prototype system, example applications of simple and complex geospatial data distributed queries illustrate the procedure of global geospatial information processing.
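
The SQL-level decomposition described above can be sketched with two in-memory SQLite databases standing in for autonomous peers: the same equivalent local query is fanned out to every peer and the partial results are merged. The schema and data are invented for illustration:

```python
import sqlite3

# The global query "parcels larger than X" rewrites to an identical local
# query run on each peer; merging the local result sets answers it globally.
LOCAL_SQL = "SELECT name, area FROM parcels WHERE area > ?"

def make_peer(rows):
    """Create one in-memory 'peer' database with a tiny parcels table."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE parcels (name TEXT, area REAL)")
    db.executemany("INSERT INTO parcels VALUES (?, ?)", rows)
    return db

peers = [
    make_peer([("north-1", 12.5), ("north-2", 3.0)]),
    make_peer([("south-1", 8.75)]),
]

def global_query(min_area):
    # Fan the equivalent local query out to every peer, then merge.
    merged = []
    for peer in peers:
        merged.extend(peer.execute(LOCAL_SQL, (min_area,)).fetchall())
    return sorted(merged)

print(global_query(5.0))  # [('north-1', 12.5), ('south-1', 8.75)]
```

Real peers would be heterogeneous GIS databases reached over a peer-to-peer network, with resource directories deciding which peers receive the query; the fan-out-and-merge shape is the same.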

  8. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office, which manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously. The new approaches and expanded use of computers will require substantial increases in the quantity and sophistication of the Division's computer resources. The requirements presented in this report will be used to develop technical specifications that describe the computer resources needed during the 1990's. (USGS)

  9. Development of Computer-Based Resources for Textile Education.

    ERIC Educational Resources Information Center

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  10. Technology Resource Teachers: Is This a New Role for Instructional Technologists?

    ERIC Educational Resources Information Center

    Moallem, Mahnaz; And Others

    Public schools have created the position of the Technology Resource Teacher (TRT) in an attempt to establish a technical and instructional support system at the school level to assure the proper usage of technology (particularly computers) by both teachers and students. This study explores the roles and responsibilities of the Technology Resource…

  11. Mediagraphy: Print and Nonprint Resources.

    ERIC Educational Resources Information Center

    Educational Media and Technology Yearbook, 1999

    1999-01-01

    Provides annotated listings for current journals, books, ERIC documents, articles, and nonprint resources in the following categories: artificial intelligence/robotics/electronic performance support systems; computer-assisted instruction; distance education; educational research; educational technology; information science and technology;…

  12. Use of Emerging Grid Computing Technologies for the Analysis of LIGO Data

    NASA Astrophysics Data System (ADS)

    Koranda, Scott

    2004-03-01

    The LIGO Scientific Collaboration (LSC) today faces the challenge of enabling analysis of terabytes of LIGO data by hundreds of scientists from institutions all around the world. To meet this challenge, the LSC is developing tools, infrastructure, applications, and expertise leveraging Grid Computing technologies available today, and making compute resources at sites across the United States and Europe available to LSC scientists. We use digital credentials for strong, secure authentication and authorization to compute resources and data. Building on products from the Globus project for high-speed data transfer and information discovery, we have created the Lightweight Data Replicator (LDR) to securely and robustly replicate data to resource sites. We have deployed at our computing sites the Virtual Data Toolkit (VDT) Server and Client packages, developed in collaboration with our partners in the GriPhyN and iVDGL projects, providing uniform access to distributed resources for users and their applications. Taken together, these Grid Computing technologies and infrastructure have formed the LSC DataGrid: a coherent and uniform environment across two continents for the analysis of gravitational-wave detector data. Much work remains, however, to scale current analyses, and recent lessons learned need to be integrated into the next generation of Grid middleware.

  13. The Electronic School Library Resource Center: Facilities Planning for the New Information Technologies.

    ERIC Educational Resources Information Center

    Blodgett, Teresa; Repman, Judi

    1995-01-01

    Addresses the necessity of incorporating new computer technologies into school library resource centers and notes some administrative challenges. An extensive checklist is provided for assessing equipment and furniture needs, physical facilities, and rewiring needs. A glossary of 20 terms and 11 additional resources is included. (AEF)

  14. Migrating Educational Data and Services to Cloud Computing: Exploring Benefits and Challenges

    ERIC Educational Resources Information Center

    Lahiri, Minakshi; Moseley, James L.

    2013-01-01

    "Cloud computing" is currently the "buzzword" in the Information Technology field. Cloud computing facilitates convenient access to information and software resources as well as easy storage and sharing of files and data, without the end users being aware of the details of the computing technology behind the process. This…

  15. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential to revolutionize the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing at less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids give researchers, scientists, and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  16. WebGIS based on semantic grid model and web services

    NASA Astrophysics Data System (ADS)

    Zhang, WangFei; Yue, CaiRong; Gao, JianGuo

    2009-10-01

As the meeting point of network technology and GIS technology, WebGIS has developed rapidly in recent years. Constrained by the Web and by the characteristics of GIS, traditional WebGIS has some prominent problems: for example, it cannot achieve interoperability among heterogeneous spatial databases, and it cannot provide cross-platform data access. With the appearance of Web Services and Grid technology, great change has come to the WebGIS field. Web Services provide interfaces that give different sites the ability to share data and communicate with one another. The goal of Grid technology is to turn the Internet into a large supercomputer through which computing resources, storage resources, data resources, information resources, knowledge resources, and expert resources can be shared efficiently. For WebGIS, however, this achieves only the physical connection of data and information, which is far from enough. Because experts in different fields understand the world differently and follow different professional conventions, policies, and habits, they reach different conclusions when observing the same geographic phenomenon, and semantic heterogeneity arises: the same concept can differ greatly between fields. A WebGIS that ignores this semantic heterogeneity will answer users' questions wrongly, or not at all. To solve this problem, this paper proposes and validates an effective method of combining the semantic grid and Web Services technology to develop WebGIS.
In this paper, we study how to construct an ontology and how to combine Grid technology with Web Services, and, through a detailed analysis of the computing characteristics and application model of distributed data, we design an ontology-driven WebGIS query system based on Grid technology and Web Services.

  17. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

Technology-based instruction represents a recent pedagogical paradigm rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for integrating modern networking, informational, and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking, and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces designed to bridge the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. A learning-styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques.
Our findings include marginal effects of the SOCR treatment within individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology like SOCR in a sound pedagogical and scientific manner enhances students' overall understanding and suggests better long-term knowledge retention.

  18. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

Technology-based instruction represents a recent pedagogical paradigm rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for integrating modern networking, informational, and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking, and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces designed to bridge the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. A learning-styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques.
Our findings include marginal effects of the SOCR treatment within individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology like SOCR in a sound pedagogical and scientific manner enhances students' overall understanding and suggests better long-term knowledge retention. PMID:19750185

  19. Mediagraphy: Print and Nonprint Resources.

    ERIC Educational Resources Information Center

    Educational Media and Technology Yearbook, 1996

    1996-01-01

    This annotated list includes media-related resources classified under the following headings: artificial intelligence and robotics, CD-ROM, computer-assisted instruction, databases and online searching, distance education, educational research, educational technology, electronic publishing, information science and technology, instructional design…

  20. Mediagraphy: Print and Nonprint Resources.

    ERIC Educational Resources Information Center

    Educational Media and Technology Yearbook, 1997

    1997-01-01

    This annotated list includes media-related resources classified under the following headings: artificial intelligence and robotics, CD-ROM, computer-assisted instruction, databases and online searching, distance education, educational research, educational technology, electronic publishing, information science and technology, instructional design…

  1. ATLAS Cloud R&D

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Barreiro Megino, Fernando; Caballero Bejar, Jose; Benjamin, Doug; Di Girolamo, Alessandro; Gable, Ian; Hendrix, Val; Hover, John; Kucharczyk, Katarzyna; Medrano Llamas, Ramon; Love, Peter; Ohman, Henrik; Paterson, Michael; Sobie, Randall; Taylor, Ryan; Walker, Rodney; Zaytsev, Alexander; Atlas Collaboration

    2014-06-01

The computing model of the ATLAS experiment was designed around the concept of grid computing and, since the start of data taking, this model has proven very successful. However, new cloud computing technologies bring attractive features to improve the operations and elasticity of scientific distributed computing. ATLAS sees grid and cloud computing as complementary technologies that will coexist at different levels of resource abstraction, and two years ago created an R&D working group to investigate the different integration scenarios. The ATLAS Cloud Computing R&D has been able to demonstrate the feasibility of offloading work from grid to cloud sites and, as of today, is able to integrate various cloud resources transparently into the PanDA workload management system. The ATLAS Cloud Computing R&D is operating various PanDA queues on private and public resources and has provided several hundred thousand CPU days to the experiment. As a result, the ATLAS Cloud Computing R&D group has gained significant insight into the cloud computing landscape and has identified points that still need to be addressed in order to fully utilize this technology. This contribution explains the cloud integration models being evaluated and discusses the lessons ATLAS has learned while collaborating with leading commercial and academic cloud providers.

  2. Mediagraphy: Print and Nonprint Resources.

    ERIC Educational Resources Information Center

    Price, Brooke, Ed.

    2001-01-01

    Lists media-related journals, books, ERIC documents, journal articles, and nonprint resources published in 1999-2000. The annotated entries are classified under the following headings: artificial intelligence; computer assisted instruction; distance education; educational research; educational technology; information science and technology;…

  3. Mediagraphy: Print and Nonprint Resources.

    ERIC Educational Resources Information Center

    Burdett, Anna E.

    2003-01-01

    Lists media-related journals, books, ERIC documents, journal articles, and nonprint resources published in 2001-2002. The annotated entries are classified under the following headings: artificial intelligence; computer assisted instruction; distance education; educational research; educational technology; information science and technology;…

  4. Coordinating Technological Resources in a Non-Technical Profession: The Administrative Computer User Group.

    ERIC Educational Resources Information Center

    Rollo, J. Michael; Marmarchev, Helen L.

    1999-01-01

The explosion of computer applications in the modern workplace has required student affairs professionals to keep pace with technological advances for office productivity. This article recommends establishing an administrative computer user group, utilizing coordinated web site development, and enhancing working relationships as ways of dealing…

  5. 77 FR 66729 - National Oil and Hazardous Substances Pollution Contingency Plan; Revision To Increase Public...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-07

    ... technology, to include computer telecommunications or other electronic means, that the lead agency is... assess the capacity and resources of the public to utilize and maintain an electronic- or computer... the technology, to include computer telecommunications or other electronic means, that the lead agency...

  6. Research on elastic resource management for multi-queue under cloud computing environment

    NASA Astrophysics Data System (ADS)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method adapts poorly to volatile computing resource requirements. To solve this problem, an elastic computing resource management system for a cloud computing environment has been designed. This system performs unified management of virtual computing nodes on the basis of the HTCondor job queue, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. In practical runs, virtual computing resources dynamically expanded or shrank as computing requirements changed. Additionally, the CPU utilization of computing resources increased significantly compared with traditional resource management. The system also performs well when there are multiple Condor schedulers and multiple job queues.
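The dual-threshold idea described in this abstract can be sketched in a few lines. This is an illustrative assumption of how such a decision function might look, not the actual IHEP implementation; all names and threshold values are hypothetical.

```python
# Hypothetical sketch of a dual-threshold elastic scaling decision, in the
# spirit of the system described above. Not the actual IHEP/HTCondor code;
# queue_high, idle_low, and the quota handling are illustrative assumptions.

def scaling_decision(idle_jobs: int, idle_vms: int, running_vms: int,
                     quota: int, queue_high: int = 10, idle_low: int = 2) -> int:
    """Return the number of VMs to add (positive) or remove (negative)."""
    if idle_jobs > queue_high and idle_vms == 0:
        # Upper threshold crossed: the queue is backing up with no idle
        # workers, so expand, but never beyond the experiment's quota.
        return min(idle_jobs // 2, quota - running_vms)
    if idle_vms > idle_low:
        # Lower threshold crossed: too many idle workers, shrink back.
        return -(idle_vms - idle_low)
    return 0
```

A periodic monitor would call this per job queue and translate the result into OpenStack instance launches or deletions; the two-stage pool mentioned in the abstract would then let shrunk nodes be reused before being destroyed.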

  7. Opportunities and challenges of cloud computing to improve health care services.

    PubMed

    Kuo, Alex Mu-Hsing

    2011-09-21

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed.

  8. Changing from computing grid to knowledge grid in life-science grid.

    PubMed

    Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy

    2009-09-01

Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large-scale data handling that exceed the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to large amounts of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in libraries and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations, for example grid computing. This survey reviews the latest grid technologies from the viewpoints of the computing grid, the data grid, and the knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life-science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities that share tacit knowledge. By extending the concept of the grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  9. Research on Key Technologies of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space, and software services according to their demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details so they can concentrate on their business, which favors innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services, and applications as a public facility, so that people can use computer resources just as they use water, electricity, gas, and the telephone. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS, and IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization, and the programming model.

  10. Technology in College Unions and Student Activities: A Collection of Technology Resources from the ACUI Community

    ERIC Educational Resources Information Center

    Association of College Unions International (NJ1), 2012

    2012-01-01

    This publication presents a collection of technology resources from the Association of College Unions International (ACUI) community. Contents include: (1) Podcasting (Jeff Lail); (2) Video Podcasting (Ed Cabellon); (3) Building a Multimedia Production Center (Nathan Byrer); (4) Cloud Computing in the Student Union and Student Activities (TJ…

  11. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high-performance computing research community, so that we could predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.

  12. What's New in Software? Current Sources of Information Boost Effectiveness of Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Ellsworth, Nancy J.

    1990-01-01

    This article reviews current resources on computer-assisted instruction. Included are sources of software and hardware evaluations, advances in current technology, research, an information hotline, and inventories of available technological assistance. (DB)

  13. Debunking the Computer Science Digital Library: Lessons Learned in Collection Development at Seneca College of Applied Arts & Technology

    ERIC Educational Resources Information Center

    Buczynski, James Andrew

    2005-01-01

    Developing a library collection to support the curriculum of Canada's largest computer studies school has debunked many myths about collecting computer science and technology information resources. Computer science students are among the heaviest print book and e-book users in the library. Circulation statistics indicate that the demand for print…

  14. Modeling the Cloud to Enhance Capabilities for Crises and Catastrophe Management

    DTIC Science & Technology

    2016-11-16

    order for cloud computing infrastructures to be successfully deployed in real world scenarios as tools for crisis and catastrophe management, where...Statement of the Problem Studied As cloud computing becomes the dominant computational infrastructure[1] and cloud technologies make a transition to hosting...1. Formulate rigorous mathematical models representing technological capabilities and resources in cloud computing for performance modeling and

  15. Examining Effects of Virtual Machine Settings on Voice over Internet Protocol in a Private Cloud Environment

    ERIC Educational Resources Information Center

    Liao, Yuan

    2011-01-01

    The virtualization of computing resources, as represented by the sustained growth of cloud computing, continues to thrive. Information Technology departments are building their private clouds due to the perception of significant cost savings by managing all physical computing resources from a single point and assigning them to applications or…

  16. Digital Image Access & Retrieval.

    ERIC Educational Resources Information Center

    Heidorn, P. Bryan, Ed.; Sandore, Beth, Ed.

    Recent technological advances in computing and digital imaging technology have had immediate and permanent consequences for visual resource collections. Libraries are involved in organizing and managing large visual resource collections. The central challenges in working with digital image collections mirror those that libraries have sought to…

  17. Desktop supercomputer: what can it do?

    NASA Astrophysics Data System (ADS)

    Bogdanov, A.; Degtyarev, A.; Korkhov, V.

    2017-12-01

The paper addresses the issues of solving complex problems that require supercomputers or multiprocessor clusters, which are available to most researchers nowadays. Efficient distribution of high-performance computing resources according to actual application needs has been a major research topic since high-performance computing (HPC) technologies became widely introduced. At the same time, comfortable and transparent access to these resources has been a key user requirement. In this paper we discuss approaches to building a virtual private supercomputer available at the user's desktop: a virtual computing environment tailored specifically for a target user with a particular target application. We describe and evaluate possibilities for creating the virtual supercomputer based on lightweight virtualization technologies, and analyze the efficiency of our approach compared to traditional methods of HPC resource management.
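Assembling a per-user "virtual supercomputer" from a shared pool of lightweight virtualized workers can be sketched as a simple allocation step. This is a minimal illustration under stated assumptions; the `Worker` model, the greedy strategy, and all names are hypothetical and not taken from the paper.

```python
# Minimal sketch of carving a per-user virtual cluster out of a shared pool of
# container/VM workers (light-weight virtualization). All names are
# illustrative assumptions, not the authors' implementation.
from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    cpus: int

def build_virtual_cluster(pool: list[Worker], cpus_needed: int) -> list[Worker]:
    """Greedily pick workers from the pool until the CPU request is satisfied."""
    chosen, total = [], 0
    for w in sorted(pool, key=lambda w: -w.cpus):  # prefer larger workers
        if total >= cpus_needed:
            break
        chosen.append(w)
        total += w.cpus
    if total < cpus_needed:
        raise RuntimeError("pool cannot satisfy the request")
    return chosen
```

A real system would additionally attach the chosen workers to the user's private network and batch scheduler, and return them to the pool when the target application finishes.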

  18. Seven Affordances of Computer-Supported Collaborative Learning: How to Support Collaborative Learning? How Can Technologies Help?

    ERIC Educational Resources Information Center

    Jeong, Heisawn; Hmelo-Silver, Cindy E.

    2016-01-01

    This article proposes 7 core affordances of technology for collaborative learning based on theories of collaborative learning and CSCL (Computer-Supported Collaborative Learning) practices. Technology affords learner opportunities to (1) engage in a joint task, (2) communicate, (3) share resources, (4) engage in productive collaborative learning…

  19. Opportunities and Challenges of Cloud Computing to Improve Health Care Services

    PubMed Central

    2011-01-01

    Cloud computing is a new way of delivering computing resources and services. Many managers and experts believe that it can improve health care services, benefit health care research, and change the face of health information technology. However, as with any innovation, cloud computing should be rigorously evaluated before its widespread adoption. This paper discusses the concept and its current place in health care, and uses 4 aspects (management, technology, security, and legal) to evaluate the opportunities and challenges of this computing model. Strategic planning that could be used by a health organization to determine its direction, strategy, and resource allocation when it has decided to migrate from traditional to cloud-based health services is also discussed. PMID:21937354

  20. Collaborative Visualization Project: shared-technology learning environments for science learning

    NASA Astrophysics Data System (ADS)

    Pea, Roy D.; Gomez, Louis M.

    1993-01-01

Project-enhanced science learning (PESL) provides students with opportunities for `cognitive apprenticeships' in authentic scientific inquiry using computers for data collection and analysis. Student teams work on projects with teacher guidance to develop and apply their understanding of science concepts and skills. We are applying advanced computing and communications technologies to augment and transform PESL at-a-distance (beyond the boundaries of the individual school), which today is limited to asynchronous, text-only networking and is unsuitable for collaborative science learning involving shared access to multimedia resources such as data, graphs, tables, pictures, and audio-video communication. Our work creates user technology (a Collaborative Science Workbench providing PESL design support and shared synchronous document views, program, and data access; a Science Learning Resource Directory for easy access to resources including two-way video links to collaborators, mentors, museum exhibits, and media-rich resources such as scientific visualization graphics) and refines enabling technologies (audiovisual and shared-data telephony, networking) for this PESL niche. We characterize participation scenarios for using these resources and discuss national networked access to science education expertise.

  1. Are Technology Interruptions Impacting Your Bottom Line? An Innovative Proposal for Change.

    PubMed

    Ledbetter, Tamera; Shultz, Sarah; Beckham, Roxanne

    2017-10-01

Nursing interruptions are a costly and dangerous variable in acute care hospitals. Malfunctioning technology equipment interrupts nursing care and prevents full utilization of computer safety systems meant to prevent patient care errors. This paper identifies an innovative approach to nursing interruptions related to computer and computer cart malfunctions. The impact on human resources is defined, and outcome measures are proposed. A multifaceted proposal aimed at reducing nursing interruptions, based on a literature review, is presented. This proposal is expected to increase patient safety as well as patient and nurse satisfaction. Setting: acute care hospitals utilizing electronic medical records and bar-coded medication administration technology. Participants: nurses, information technology staff, nursing informatics staff, and all leadership teams affected by technology problems and their proposed solutions. Literature from multiple fields was reviewed to evaluate research related to computer/computer cart failures and the approaches used to resolve these issues. Outcome measures address strategic goals related to patient safety and to nurse and patient satisfaction. Specific help desk metrics will demonstrate the effect of the interventions. This paper addresses a gap in the literature and proposes practical and innovative solutions. A comprehensive computer and computer cart repair program is essential for patient safety, financial stewardship, and utilization of resources. © 2015 Wiley Periodicals, Inc.

  2. Learning Resources and Technology. A Guide to Program Development.

    ERIC Educational Resources Information Center

    Connecticut State Dept. of Education, Hartford.

    This guide provides a framework to assist all Connecticut school districts in planning effective learning resources centers and educational technology programs capable of providing: a well developed library media component; shared instructional design responsibilities; reading for enrichment; integration of computers into instruction; distance…

  3. Tools and Techniques for Measuring and Improving Grid Performance

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.

  4. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Pernice

    2010-09-01

INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  5. An Architecture for Cross-Cloud System Management

    NASA Astrophysics Data System (ADS)

    Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad

The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources, utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, while enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
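The homogeneous-management architecture described here is essentially an adapter layer over heterogeneous provider interfaces. The sketch below shows that pattern under stated assumptions: the `CloudAdapter` interface, the in-memory `FakeEC2Adapter`, and all method names are hypothetical, not the authors' design or any real provider SDK.

```python
# Illustrative adapter sketch for managing heterogeneous cloud providers
# through one common interface, in the spirit of the architecture above.
# Class and method names are hypothetical; a real adapter would wrap a
# provider SDK instead of an in-memory dict.
from abc import ABC, abstractmethod

class CloudAdapter(ABC):
    """Common management interface hiding provider-specific APIs."""
    @abstractmethod
    def launch(self, image: str, count: int) -> list[str]: ...
    @abstractmethod
    def terminate(self, instance_id: str) -> None: ...

class FakeEC2Adapter(CloudAdapter):
    """Stand-in for an EC2-backed adapter; tracks instances in memory."""
    def __init__(self):
        self.instances: dict[str, str] = {}
        self._next = 0

    def launch(self, image, count):
        ids = []
        for _ in range(count):
            iid = f"i-{self._next:04d}"
            self._next += 1
            self.instances[iid] = image
            ids.append(iid)
        return ids

    def terminate(self, instance_id):
        self.instances.pop(instance_id, None)

def provision(adapters: dict[str, CloudAdapter],
              plan: dict[str, int], image: str) -> dict[str, list[str]]:
    """Spread a provisioning plan across providers via the common interface."""
    return {name: adapters[name].launch(image, n) for name, n in plan.items()}
```

With one adapter per provider, the management layer (and any scheduling or cost logic above it) never touches provider-specific code, which is the cross-cloud homogeneity the paper argues for.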

  6. Curriculums in Industrial Technology. Plastics Technology. Industrial Maintenance. Computer Numerical Control. Teacher's Manuals and Student Learning Guides.

    ERIC Educational Resources Information Center

    El Paso Community Coll., TX.

    Curriculum guides are provided for plastics technology, industrial maintenance, and computer numerical control. Each curriculum is divided into a number of courses. For each course these instructor materials are presented in the official course outline: course description, course objectives, unit titles, texts and materials, instructor resources,…

  7. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    NASA Astrophysics Data System (ADS)

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-10-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Being in production during LHC Run1, AGIS became the central information system for Distributed Computing in ATLAS and it is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services are continuously evolving and tend to fit newer requirements from the ADC community. In this note, we describe the evolution and the recent developments of AGIS functionalities, related to the integration of new technologies that have recently become widely used in ATLAS Computing, like the flexible utilization of opportunistic Cloud and HPC resources, ObjectStore services integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, unified storage protocols declaration required for PanDA Pilot site movers, and others. The improvements of the information model and general updates are also shown; in particular we explain how other collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.

  8. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 2: Switches and Environmental Controls. Rehab/Education Technology ResourceBook Series.

    ERIC Educational Resources Information Center

    Brandenburg, Sara A., Ed.; Vanderheiden, Gregg C., Ed.

    One of a series of three resource guides concerned with communication, control, and computer access for disabled and elderly individuals, the directory focuses on switches and environmental controls. The book's three chapters each cover products with the same primary function. Cross reference indexes allow access to listings of products by…

  9. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 1: Communication Aids. Rehab/Education Technology ResourceBook Series.

    ERIC Educational Resources Information Center

    Brandenburg, Sara A., Ed.; Vanderheiden, Gregg C., Ed.

    One of a series of three resource guides concerned with communication, control, and computer access for disabled and elderly individuals, the directory focuses on communication aids. The book's six chapters each cover products with the same primary function. Cross reference indexes allow access to listings of products by function, input/output…

  10. Interfacing HTCondor-CE with OpenStack

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; Hover, J.

    2017-10-01

    Over the past few years, Grid Computing technologies have reached a high level of maturity. One key aspect of this success has been the development and adoption of newer Compute Elements to interface the external Grid users with local batch systems. These new Compute Elements allow for better handling of job requirements and a more precise management of diverse local resources. However, despite this level of maturity, the Grid Computing world is lacking diversity in local execution platforms. As Grid Computing technologies have historically been driven by the needs of the High Energy Physics community, most resource providers run the platform (operating system version and architecture) that best suits the needs of their particular users. In parallel, the development of virtualization and cloud technologies has accelerated recently, making available a variety of solutions, both commercial and academic, proprietary and open source. Virtualization facilitates performing computational tasks on platforms not available at most computing sites. This work attempts to join the technologies, allowing users to interact with computing sites through one of the standard Computing Elements, HTCondor-CE, but running their jobs within VMs on a local cloud platform, OpenStack, when needed. The system will re-route, in a transparent way, end user jobs into dynamically-launched VM worker nodes when they have requirements that cannot be satisfied by the static local batch system nodes. Also, once the automated mechanisms are in place, it becomes straightforward to allow an end user to invoke a custom Virtual Machine at the site. This will allow cloud resources to be used without requiring the user to establish a separate account. Both scenarios are described in this work.
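
    The transparent re-routing decision described above can be sketched as follows. The field names and the static-node inventory are assumptions made for this illustration; the real system inspects HTCondor job ClassAds rather than Python dictionaries.

    ```python
    # Hypothetical inventory of static batch worker nodes at a site.
    STATIC_NODES = [
        {"os": "centos7", "arch": "x86_64", "memory_gb": 8},
        {"os": "centos7", "arch": "x86_64", "memory_gb": 16},
    ]

    def satisfied_by_static(job):
        """True if at least one static batch node meets the job's requirements.
        Requirements the job does not state default to whatever the node has."""
        return any(
            node["os"] == job.get("os", node["os"])
            and node["arch"] == job.get("arch", node["arch"])
            and node["memory_gb"] >= job.get("memory_gb", 0)
            for node in STATIC_NODES
        )

    def route(job):
        """Pick an execution target transparently to the end user."""
        if satisfied_by_static(job):
            return "static-batch"
        # Requirements cannot be met locally: hand the job to a
        # dynamically-launched VM worker node on the cloud platform.
        return "cloud-vm"
    ```

    A job asking for an operating system the static nodes do not run, or for more memory than any node offers, is routed to a cloud VM; everything else stays in the batch system.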

  11. Uses of Technology in Community Colleges: A Resource Book for Community College Teachers and Administrators.

    ERIC Educational Resources Information Center

    Gooler, Dennis D., Ed.

    This resource guide for community college teachers and administrators focuses on hardware and software. The following are discussed: (1) individual technologies--computer-assisted instruction, audio tape, films, filmstrips/slides, dial access, programmed instruction, learning activity packages, video cassettes, cable TV, independent learning labs,…

  12. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  13. Budgeting and Funding School Technology: Essential Considerations

    ERIC Educational Resources Information Center

    Ireh, Maduakolam

    2010-01-01

    School districts need adequate financial resources to purchase hardware and software, wire their buildings to network computers and other information and communication devices, and connect to the Internet to provide students, teachers, and other school personnel with adequate access to technology. Computers and other peripherals, particularly,…

  14. Technologies as Rural Special Education Problem Solvers--A Status Report and Successful Strategies.

    ERIC Educational Resources Information Center

    Helge, Doris

    Rural schools can help solve their special education problems by using advanced technology to provide instructional support (computer managed instruction, satellite television, library searches, resource networks, on-line testing), instructional applications (computer assisted instruction, reading machines, mobile vans, instructional television),…

  15. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large- scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources. such as databases, data catalogues, and archives, and; collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios. both large and small scale.

  16. Benchmarking high performance computing architectures with CMS’ skeleton framework

    NASA Astrophysics Data System (ADS)

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-10-01

    In 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel’s Thread Building Block library, based on the measured overheads in both memory and CPU on the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many core architectures; machines such as Cori Phase 1&2, Theta, Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  17. Construction and application of Red5 cluster based on OpenStack

    NASA Astrophysics Data System (ADS)

    Wang, Jiaqing; Song, Jianxin

    2017-08-01

    With the application and development of cloud computing technology in various fields, the resource utilization rate of the data center has improved markedly, and systems based on cloud computing platforms have gained expansibility and stability. Deployed in the traditional way, a Red5 cluster suffers from low resource utilization and poor system stability. This paper uses cloud computing's efficient resource-allocation capabilities to build a Red5 server cluster based on OpenStack, to which multimedia applications can be published. The system not only achieves flexible provisioning of computing resources, but also greatly improves the stability and service efficiency of the cluster.

  18. Engineering and Computing Portal to Solve Environmental Problems

    NASA Astrophysics Data System (ADS)

    Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.

    2018-01-01

    This paper describes the architecture and services of the Engineering and Computing Portal, a complex solution that provides access to high-performance computing resources, enables users to carry out computational experiments, supports the teaching of parallel technologies, and solves computing tasks, including technogenic safety ones.

  19. The Merit Computer Network

    ERIC Educational Resources Information Center

    Aupperle, Eric M.; Davis, Donna L.

    1978-01-01

    The successful Merit Computer Network is examined in terms of both technology and operational management. The network is fully operational and has a significant and rapidly increasing usage, with three major institutions currently sharing computer resources. (Author/CMV)

  20. Pupil Science Learning in Resource-Based e-Learning Environments

    ERIC Educational Resources Information Center

    So, Wing-mui Winnie; Ching, Ngai-ying Fiona

    2011-01-01

    With the rapid expansion of broadband Internet connection and availability of high performance yet low priced computers, many countries around the world are advocating the adoption of e-learning, the use of computer technology to improve learning and teaching. The trend of e-learning has urged many teachers to incorporate online resources in their…

  1. Smoke and Air Resource Management-Peering Through the Haze

    Treesearch

    A. R. Fox Riebau

    1987-01-01

    This paper presents a vision of the future rooted in consideration of the past 20 years in the smoke and air resource management field. This future is characterized by rapid technological development of computers for computation, communications, and remote sensing capabilities and of the possible societal responses to these advances. We discuss intellectual...

  2. Working Smarter: The Skill Bias of Computer Technologies. The Evolving Workplace Series

    ERIC Educational Resources Information Center

    Wannell, Ted; Ali, Jennifer

    2002-01-01

    This document provides data from the new Workplace and Employee Survey (WES) conducted by Statistics Canada with the support of Human Resources Development Canada. The survey consists of two components: (1) a workplace survey on the adoption of technologies, organizational change, training and other human resource practices, business strategies,…

  3. An Investigation of Tool Mediation in the Research Activity of Eighth-Grade Students

    ERIC Educational Resources Information Center

    Henry, Nancy L.

    2016-01-01

    Technology and a variety of resources play an important role in students' educational lives. Vygotsky's (1987) theory of tool mediation suggests that cultural tools, such as computer software influence individuals' thinking and action. However, it is not completely understood how technology and other resources influence student action. Middle…

  4. Mississippi Curriculum Framework for Computer Discovery (8th Grade). CIP: 00.0252.

    ERIC Educational Resources Information Center

    Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.

    This document, which is intended for technology educators in Mississippi, outlines a modular instruction approach that allows eighth graders to experience various workplace technologies within four career cluster areas: agriculture/natural resources technology, business/marketing technology, health/human services technology, and…

  5. School Technology Grows Up.

    ERIC Educational Resources Information Center

    Vail, Kathleen

    2003-01-01

    Practitioners and researchers in the education technology field asked to give their vision of the future list laptop computers, personal digital assistants, electronic testing, wireless networking, and multimedia technology among the technology advances headed soon for schools. A sidebar lists 12 online resources. (MLF)

  6. Learning technologies and the cyber-science classroom

    NASA Astrophysics Data System (ADS)

    Houlihan, Gerard

    Access to computer and communication technology has long been regarded `part-and-parcel' of a good education. No educator can afford to ignore the profound impact of learning technologies on the way we teach science, nor fail to acknowledge that information literacy and computing skills will be fundamental to the practice of science in the next millennium. Nevertheless, there is still confusion concerning what technologies educators should employ in teaching science. Furthermore, a lack of knowledge combined with the pressure to be `seen' utilizing technology has led some schools to waste scarce resources in a `grab-bag' attitude towards computers and technology. Such popularized `wish lists' can only drive schools to accumulate expensive equipment for no real learning purpose. In the future educators will have to reconsider their curriculum and pedagogy with a focus on the learning environment before determining what appropriate computing resources to acquire. This will be fundamental to the capabilities of science classrooms to engage with cutting-edge issues in science. This session will demonstrate the power of a broad range of learning technologies to enhance science education. The aim is to explore classroom possibilities as well as to provide a basic introduction to technical aspects of various software and hardware applications, including robotics and dataloggers and simulation software.

  7. Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Insolia, Gerard

    This document contains course outlines in computer-aided manufacturing developed for a business-industry technology resource center for firms in eastern Pennsylvania by Northampton Community College. The four units of the course cover the following: (1) introduction to computer-assisted design (CAD)/computer-assisted manufacturing (CAM); (2) CAM…

  8. Computational fluid dynamics for propulsion technology: Geometric grid visualization in CFD-based propulsion technology research

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.; Meyer, Doug

    1992-01-01

    The coordination is examined of necessary resources, facilities, and special personnel to provide technical integration activities in the area of computational fluid dynamics applied to propulsion technology. Involved is the coordination of CFD activities between government, industry, and universities. Current geometry modeling, grid generation, and graphical methods are established to use in the analysis of CFD design methodologies.

  9. Translational bioinformatics in the cloud: an affordable alternative

    PubMed Central

    2010-01-01

    With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073

  10. Transfer and utilization of government technology assets to the private sector in the fields of health care and information technologies

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1995-10-01

    At the first Health Care Technology Policy conference last year, held during health care reform, four major issues were raised regarding the efforts underway to develop a computer based patient record (CBPR), the National Information Infrastructure (NII) as part of the high performance computers and communications (HPCC) program, and the so-called 'patient card.' More specifically, it was explained how a national information system will greatly affect the way health care delivery is provided to the United States public and reduce its costs. These four issues were: (1) Constructing a national information infrastructure (NII); (2) Building a computer based patient record system; (3) Bringing the collective resources of our national laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; (4) Utilizing government (e.g., DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs, and accelerate technology transfer to address health care issues. This year a section of this conference entitled 'Health Care Technology Assets of the Federal Government' addresses benefits of the technology transfer which should occur to maximize already developed resources. This section, entitled 'Transfer and Utilization of Government Technology Assets to the Private Sector,' will look at both health care and non-health care related technologies, since many areas such as information technologies (i.e., imaging, communications, archival/retrieval, systems integration, information display, multimedia, heterogeneous databases, etc.) already exist as part of our national labs and/or other federal agencies, e.g., ARPA. Although these technologies are not labeled under health care programs, they could provide enormous value in addressing technical needs.
An additional issue deals with both the technical (hardware, software) and human expertise that resides within these labs and their possible role in creating cost-effective solutions.

  11. Benchmarking high performance computing architectures with CMS’ skeleton framework

    DOE PAGES

    Sexton-Kennedy, E.; Gartung, P.; Jones, C. D.

    2017-11-23

    In 2012 CMS evaluated which underlying concurrency technology would be the best to use for its multi-threaded framework. The available technologies were evaluated on the high throughput computing systems dominating the resources in use at that time. A skeleton framework benchmarking suite that emulates the tasks performed within a CMSSW application was used to select Intel’s Thread Building Block library, based on the measured overheads in both memory and CPU on the different technologies benchmarked. In 2016 CMS will get access to high performance computing resources that use new many core architectures; machines such as Cori Phase 1&2, Theta, Mira. Because of this we have revived the 2012 benchmark to test its performance and conclusions on these new architectures. This talk will discuss the results of this exercise.

  13. Cloud Computing. Technology Briefing. Number 1

    ERIC Educational Resources Information Center

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  14. United States Air Force Computer-Aided Acquisition and Logistics Support (CALS) Evolution of Computer Integrated Manufacturing (CIM) Technologies. Version 2.0 Draft

    DOT National Transportation Integrated Search

    1988-11-01

    During the past decade a great deal of effort has been focused on the advantages computerization can bring to engineering design and production activities. This is seen in such developments as Group Technology (GT), Manufacturing Resource Planning (M...

  15. A Review on Making Things See: Augmented Reality for Futuristic Virtual Educator

    ERIC Educational Resources Information Center

    Iqbal, Javid; Sidhu, Manjit Singh

    2017-01-01

    In the past few years many choreographers have focused upon implementation of computer technology to enhance their artistic skills. Computer vision technology presents new methods for learning, instructing, developing, and assessing physical movements as well as provides scope to expand dance resources and rediscover the learning process. This…

  16. Y2K Resources for Public Libraries.

    ERIC Educational Resources Information Center

    Foster, Janet

    1999-01-01

    Presents information for public libraries on computer-related vulnerabilities as the century turns from 1999 to 2000. Highlights include: general Y2K information; the Y2K Bug and PCs; Y2K sites for librarians; Online Computer Library Center (OCLC) and USMARC; technological developments in cyberspace; and a list of Web sites and Y2K resources. (AEF)

  17. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    PubMed Central

    Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since the resources contain different kinds of operating systems and user data that are stored on a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of managing shared resources and services management (SRSM). It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experimental results show that the strategy is effective and stable. PMID:24883353
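
    The three-layer structure named in the abstract (user / manage / resource) can be sketched minimally as below. All class names and the allocation policy are assumptions made for illustration, not the paper's implementation.

    ```python
    class ResourceLayer:
        """Resource layer: holds the shared resources (e.g. OS images
        and user data) kept on the remote server."""
        def __init__(self, images):
            self.images = set(images)

    class ManageLayer:
        """Manage layer: virtual resource management that maps terminal
        requests onto the resources actually available."""
        def __init__(self, resource_layer):
            self.resources = resource_layer
            self.sessions = {}  # terminal_id -> allocated image
        def allocate(self, terminal_id, image):
            if image not in self.resources.images:
                raise LookupError(f"image {image!r} not available")
            self.sessions[terminal_id] = image
            return image

    class MobileVirtualTerminal:
        """User layer: a thin mobile terminal that obtains its operating
        system from the server through the manage layer."""
        def __init__(self, terminal_id, manager):
            self.terminal_id = terminal_id
            self.manager = manager
        def boot(self, image):
            return self.manager.allocate(self.terminal_id, image)
    ```

    The user layer never touches the resource layer directly; every request flows through the manage layer, which is the separation of concerns the SRSM design describes.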

  18. A novel resource management method of providing operating system as a service for mobile transparent computing.

    PubMed

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since the resources contain different kinds of operating systems and user data that are stored on a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method of managing shared resources and services management (SRSM). It has three layers: a user layer, a manage layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the manage layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experimental results show that the strategy is effective and stable.

  19. Cornell University Center for Advanced Computing

    Science.gov Websites


  20. Facilities | Computational Science | NREL

    Science.gov Websites

    NREL advances technology innovation by providing scientists and engineers the ability to tackle energy challenges, enabling them to take full advantage of advanced computing hardware and software resources.

  1. Electronic Job Search Revolution. Win with the New Technology that's Reshaping Today's Job Market.

    ERIC Educational Resources Information Center

    Kennedy, Joyce Lain; Morrow, Thomas J.

    This book contains information about the resources available to merge new technology and the search for employment. It offers suggestions from human resource specialists, software authors, and database experts. Chapter 1 is an overview of how the computer has become indispensable in a job search. Chapter 2 focuses on external, third-party resume…

  2. An Overview of the Evolution of the AAVSO's Information Technology Infrastructure Between 1965-1997

    NASA Astrophysics Data System (ADS)

    Kinne, Richard C. S.; Saladyga, M.; Waagen, E. O.

    2011-05-01

    We trace the history and usage of computers and data processing equipment at AAVSO HQ from its beginnings in the 1960s to 1997. We focus on equipment, people, and the purposes to which such computational power was put. We examine how the AAVSO evolved its use of computing and data processing resources as the technology evolved in order to further its mission.

  3. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    NASA Astrophysics Data System (ADS)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent T-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted.
Overall, students in the experimental group who responded to the Use of Internet Resources survey were positive (mean of 3.4 on the 4-point scale) toward their use of Internet resources, which included the online courseware developed by the researcher. Findings from this study suggest that (1) the digital divide with respect to gender and ethnicity may be narrowing, and (2) students who are exposed to a course that augments traditional teaching methods with computer-driven courseware appear to have less anxiety, have a clearer perception of computer usefulness, and feel that online resources enhance their learning.

  4. International Society for Technology in Education.

    ERIC Educational Resources Information Center

    Knox-Quinn, Carolyn

    1992-01-01

    Provides information about the International Society for Technology in Education (ISTE), an organization dedicated to improving education throughout the world by facilitating communication among instructors, media specialists, computer coordinators, information resource managers (IRMs), and administrative users of technology. Publications and the…

  5. TREND 2000

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Educator Resource Center has created the Technology, Research, Education and Discovery (TREND) 2000 computer lab at NASA's John C. Stennis Space Center to facilitate the integration of technology into schools' curriculums by providing innovative and creative classroom strategies using state-of-the-art technology.

  6. IMAGE: A Design Integration Framework Applied to the High Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.

    1993-01-01

    Effective design of the High Speed Civil Transport requires the systematic application of design resources throughout a product's life-cycle. Information obtained from the use of these resources is used for the decision-making processes of Concurrent Engineering. Integrated computing environments facilitate the acquisition, organization, and use of required information. State-of-the-art computing technologies provide the basis for the Intelligent Multi-disciplinary Aircraft Generation Environment (IMAGE) described in this paper. IMAGE builds upon existing agent technologies by adding a new component called a model. With the addition of a model, the agent can provide accountable resource utilization in the presence of increasing design fidelity. The development of a zeroth-order agent is used to illustrate agent fundamentals. Using a CATIA(TM)-based agent from previous work, a High Speed Civil Transport visualization system linking CATIA, FLOPS, and ASTROS will be shown. These examples illustrate the important role of the agent technologies used to implement IMAGE, and together they demonstrate that IMAGE can provide an integrated computing environment for the design of the High Speed Civil Transport.

  7. Connecting congregations: technology resources influence parish nurse practice.

    PubMed

    Zerull, Lisa M; Near, Kelly K; Ragon, Bart; Farrell, Sarah P

    2009-01-01

    This descriptive pilot study evaluated the influence of health resource information education and the use of Web-based communication technology on the professional practice of the parish nurse in the congregational setting. Five parish nurse participants from varied denominations in rural and nonrural Virginia received a laptop computer, printer, video projector, and webcam along with high-speed Internet access in each congregational setting. The nurses attended two group education sessions that incorporated computer applications and training in accessing and using quality health information resources and communication applications such as a group "chat" software and webcam to communicate with others through high-speed Internet access. Qualitative analysis from semistructured interviews of nurses confirmed that participants found the project to be beneficial in terms of awareness, education, and applicability of technology use in parish nurse practice. Quantitative data from preproject and postproject surveys found significant differences in nurses' abilities and confidence with technology use and application. Findings showed that the knowledge and experience gained from this study enhanced parish nurse practice and confidence in using technology for communication, health education, and counseling.

  8. Mediagraphy: Print and Nonprint Resources.

    ERIC Educational Resources Information Center

    Educational Media and Technology Yearbook, 1998

    1998-01-01

    Lists educational media-related journals, books, ERIC documents, journal articles, and nonprint resources classified by Artificial Intelligence, Robotics, Electronic Performance Support Systems; Computer-Assisted Instruction; Distance Education; Educational Research; Educational Technology; Electronic Publishing; Information Science and…

  9. Advanced Technology System Scheduling Governance Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ang, Jim; Carnes, Brian; Hoang, Thuc

    In the fall of 2005, the Advanced Simulation and Computing (ASC) Program appointed a team to formulate a governance model for allocating resources and scheduling the stockpile stewardship workload on ASC capability systems. This update to the original document takes into account the new technical challenges and roles for advanced technology (AT) systems and the new ASC Program workload categories that must be supported. The goal of this updated model is to effectively allocate and schedule AT computing resources among all three National Nuclear Security Administration (NNSA) laboratories for weapons deliverables that merit priority on this class of resource. The process outlined below describes how proposed work can be evaluated and approved for resource allocations while preserving high effective utilization of the systems. This approach will provide the broadest possible benefit to the Stockpile Stewardship Program (SSP).

  10. Factors Influencing the Adoption of Cloud Computing by Decision Making Managers

    ERIC Educational Resources Information Center

    Ross, Virginia Watson

    2010-01-01

    Cloud computing is a growing field, addressing the market need for access to computing resources to meet organizational computing requirements. The purpose of this research is to evaluate the factors that influence an organization in their decision whether to adopt cloud computing as a part of their strategic information technology planning.…

  11. Grid computing in large pharmaceutical molecular modeling.

    PubMed

    Claus, Brian L; Johnson, Stephen R

    2008-07-01

    Most major pharmaceutical companies have employed grid computing to expand their compute resources with the intention of minimizing additional financial expenditure. Historically, one of the issues restricting widespread utilization of the grid resources in molecular modeling is the limited set of suitable applications amenable to coarse-grained parallelization. Recent advances in grid infrastructure technology coupled with advances in application research and redesign will enable fine-grained parallel problems, such as quantum mechanics and molecular dynamics, which were previously inaccessible to the grid environment. This will enable new science as well as increase resource flexibility to load balance and schedule existing workloads.
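    A minimal sketch of the coarse-grained parallelization the abstract refers to (not code from the paper): the workload is a set of fully independent per-molecule tasks with no inter-task communication, which is exactly what grid and pool schedulers handle well. The `score` function and the toy SMILES strings are hypothetical stand-ins for any per-molecule computation such as docking or property prediction.

    ```python
    # Illustrative sketch, assuming independent per-molecule tasks.
    # Coarse-grained parallelism: each task is self-contained, so a process
    # pool (or a grid scheduler) can farm the tasks out with no coordination.
    from multiprocessing import Pool

    def score(molecule: str) -> int:
        # Hypothetical placeholder scoring function: here, just the string length.
        return len(molecule)

    if __name__ == "__main__":
        library = ["CCO", "c1ccccc1", "CC(=O)O"]  # toy SMILES strings
        with Pool(processes=2) as pool:
            scores = pool.map(score, library)     # each task runs independently
        print(scores)  # [3, 8, 7]
    ```

    Fine-grained problems such as molecular dynamics differ in that the parallel workers must exchange data every step, which is why they need the tighter grid infrastructure the abstract describes.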

  12. CICT Computing, Information, and Communications Technology Program

    NASA Technical Reports Server (NTRS)

    Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)

    2002-01-01

    The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are to: provide seamless access to NASA resources, including ground-, air-, and space-based distributed information technology resources, so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design next-generation space vehicles; provide high-rate data delivery from these assets directly to users and missions; develop goal-oriented, human-centered systems; and research, develop, and evaluate revolutionary technology.

  13. Health care information infrastructure: what will it be and how will we get there?

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1996-02-01

    During the first Health Care Technology Policy (HCTP) conference last year, during Health Care Reform, four major issues were raised regarding the efforts underway to develop a Computer-Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the High Performance Computing and Communications (HPCC) program, and the so-called "Patient Card". More specifically, it was explained how a national information system will greatly affect the way health care delivery is provided to the United States public and reduce its costs. These four issues were: constructing a National Information Infrastructure (NII); building a Computer-Based Patient Record system; bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and utilizing Government (e.g., DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs, and accelerate technology transfer to address health care issues. During the second HCTP conference, in mid-1995, a section of the meeting entitled "Health Care Technology Assets of the Federal Government" addressed the benefits of the technology transfer which should occur to maximize already-developed resources. Another section, entitled "Transfer and Utilization of Government Technology Assets to the Private Sector", looked at both health-care-related and non-health-care-related technologies, since many areas such as information technologies (i.e., imaging, communications, archival/retrieval, systems integration, information display, multimedia, heterogeneous databases, etc.) already exist within our National Labs and/or other federal agencies, e.g., ARPA. Although these technologies are not labeled under "Health Care" programs, they could provide enormous value in addressing technical needs.
An additional issue concerns both the technical (hardware, software) and human expertise that resides within these labs and their possible role in creating cost-effective solutions.

  14. Introducing Computational Thinking to Young Learners: Practicing Computational Perspectives through Embodiment in Mathematics Education

    ERIC Educational Resources Information Center

    Sung, Woonhee; Ahn, Junghyun; Black, John B.

    2017-01-01

    A science, technology, engineering, and mathematics-influenced classroom requires learning activities that provide hands-on experiences with technological tools to encourage problem-solving skills (Brophy et al. in "J Eng Educ" 97(3):369-387, 2008; Mataric et al. in "AAAI spring symposium on robots and robot venues: resources for AI…

  15. [Profile, competencies and digital fluency of nurses in the Professional Improvement Program].

    PubMed

    Tanabe, Lyvia Pini; Kobayashi, Rika Miyahara

    2013-08-01

    A descriptive exploratory study, conducted in the city of São Paulo, aimed to identify the profile, competencies, and digital fluency of nurses in the Professional Improvement Program in handling technology at work. The population, composed of 60 nurses in the program, answered a questionnaire covering profile, digital fluency, and professional competencies. The participants were found to be: 95.0% female; 61.7% between 23 and 25 years old; 75.0% from public schools; 58.3% enrolled in cardiovascular nursing; 98.3% had contact with computing resources during their undergraduate studies; 100.0% had a computer at home; 86.7% accessed the internet daily; 96.7% used Messenger; and 58.3% had an intermediate level of knowledge and skill in computing. The professional competencies required for technology management referred to knowing how to be innovative, creative, and up to date in order to identify and manage software and to use technological resources.

  16. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    NASA Technical Reports Server (NTRS)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  17. [Earth and Space Sciences Project Services for NASA HPCC

    NASA Technical Reports Server (NTRS)

    Merkey, Phillip

    2002-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high-performance computing research community, so that we could predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain the Project's anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.

  18. Immersive Education, an Annotated Webliography

    ERIC Educational Resources Information Center

    Pricer, Wayne F.

    2011-01-01

    In this second installment of a two-part feature on immersive education a webliography will provide resources discussing the use of various types of computer simulations including: (a) augmented reality, (b) virtual reality programs, (c) gaming resources for teaching with technology, (d) virtual reality lab resources, (e) virtual reality standards…

  19. Proceedings of the Annual National Conference on ADA Technology (9th) Held in Washington, DC on 4-7 March 1991

    DTIC Science & Technology

    1991-03-07

    resolve the attack; delay while the weapon has to wait; signal readiness to...RESOURCE ALLOCATION, PRIORITY OF CONTROL, TARGETS, AND BIAS OF THE SYSTEM...Communications Systems, focal point for Computer Resource Management (CRM), Advanced Software Technology (AST), Ada Technology, Joint/Army Interoperability Testing...He served as project manager for the development of the Joint Interface Test Systems (JITS) - the world's largest distributed command and

  20. A Novel College Network Resource Management Method using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Chen

    At present, the information construction of colleges mainly consists of building college networks and management information systems, and many problems arise during this informatization process. Cloud computing is a development of distributed processing, parallel processing, and grid computing: data are stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of devices. This article introduces cloud computing and its functions, then analyzes the existing problems of college network resource management; finally, cloud computing technologies and methods are applied in the construction of a college information sharing platform.

  1. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    NASA Astrophysics Data System (ADS)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States; based on Internet data centers, it provides a standard and open approach to network sharing services. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of the actual needs of teaching. Cloud computing, which uses Internet technology to provide shared resources, has therefore become an important means of sharing digital education applications in current higher education. Based on the cloud computing environment, the paper analyzed the existing problems in the sharing of digital educational resources among independent colleges in Jiangxi Province. According to the characteristics of cloud computing, namely mass storage, efficient operation, and low cost, the authors explored and studied the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the shared model was put into practical application.

  2. Integrating Information Technologies Into Large Organizations

    NASA Technical Reports Server (NTRS)

    Gottlich, Gretchen; Meyer, John M.; Nelson, Michael L.; Bianco, David J.

    1997-01-01

    NASA Langley Research Center's product is aerospace research information. To this end, Langley uses information technology tools in three distinct ways. First, information technology tools are used in the production of information via computation, analysis, data collection and reduction. Second, information technology tools assist in streamlining business processes, particularly those that are primarily communication based. By applying these information tools to administrative activities, Langley spends fewer resources on managing itself and can allocate more resources for research. Third, Langley uses information technology tools to disseminate its aerospace research information, resulting in faster turn around time from the laboratory to the end-customer.

  3. 75 FR 11917 - Chrysler LLC, Technology Center, Including On-Site Leased Workers from Aerotek, Ajilon, Altair...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ..., Cer-Cad Engineering Resources, Computer Consultants of America, Computer Engrg Services, Compuware..., Automated Analysis Corp/Belcan, Bartech Group, CAE Tech, CDI Information Services, CER-CAD Engineering...

  4. Computing Experience and Good Practices in Undergraduate Education: Does the Degree of Campus Wiredness Matter?

    ERIC Educational Resources Information Center

    Hu, Shouping; Kuh, George D.

    Responses to the College Student Experience Questionnaire Fourth Edition (C. Pace and G. Kuh, 1998) from 18,844 students at 71 colleges and universities were analyzed to determine if the presence of computing and information technology influenced the frequency of use of various forms of technology and other educational resources and the exposure…

  5. A cloud-based production system for information and service integration: an internet of things case study on waste electronics

    NASA Astrophysics Data System (ADS)

    Wang, Xi Vincent; Wang, Lihui

    2017-08-01

    Cloud computing is the new enabling technology that offers centralised computing, flexible data storage and scalable services. In the manufacturing context, it is possible to utilise the Cloud technology to integrate and provide industrial resources and capabilities in terms of Cloud services. In this paper, a function block-based integration mechanism is developed to connect various types of production resources. A Cloud-based architecture is also deployed to offer a service pool which maintains these resources as production services. The proposed system provides a flexible and integrated information environment for the Cloud-based production system. As a specific type of manufacturing, Waste Electrical and Electronic Equipment (WEEE) remanufacturing experiences difficulties in system integration, information exchange and resource management. In this research, WEEE is selected as the example of Internet of Things to demonstrate how the obstacles and bottlenecks are overcome with the help of Cloud-based informatics approach. In the case studies, the WEEE recycle/recovery capabilities are also integrated and deployed as flexible Cloud services. Supporting mechanisms and technologies are presented and evaluated towards the end of the paper.

  6. Adaptive Technologies for Accommodating Persons with Disabilities.

    ERIC Educational Resources Information Center

    Berliss, Jane; And Others

    1993-01-01

    Eight articles review the progress achieved in making library computing technologies and library services accessible to people with disabilities. Adaptive technologies, automated conversion into Braille, and successful programs that demonstrate compliance with the American with Disabilities Act are described. A resource list is included. (EA)

  7. Law of Large Numbers: The Theory, Applications and Technology-Based Education

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicolas; Gould, Robert

    2009-01-01

    Modern approaches for technology-based blended education utilize a variety of recently developed novel pedagogical, computational and network resources. Such attempts employ technology to deliver integrated, dynamically-linked, interactive-content and heterogeneous learning environments, which may improve student comprehension and information…

  8. Spinoff 2008: 50 Years of NASA-Derived Technologies (1958-2008)

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA Technology Benefiting Society subject headings include: Health and Medicine, Transportation, Public Safety, Consumer, Home and Recreation, Environmental and Agricultural Resources, Computer Technology, and Industrial Productivity. Other topics covered include: Aeronautics and Space Activities, Education News, Partnership News, and the Innovative Partnership Program.

  9. Teaching with technology: free Web resources for teaching and learning.

    PubMed

    Wink, Diane M; Smith-Stoner, Marilyn

    2011-01-01

    In this bimonthly series, the department editor examines how nurse educators can use Internet and Web-based computer technologies such as search and communication tools; collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. In this article, the department editor and her coauthor describe free Web-based resources that can be used to support teaching and learning.

  10. Realizing the Potential of Information Resources: Information, Technology, and Services. Proceedings of the CAUSE Annual Conference (New Orleans, Louisiana, November 28-December 3, 1995).

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    This document presents the proceedings of a conference on managing and using information technology in higher education in regard to client/server computing, network delivery, process reengineering, leveraging of resources, and professional development. Eight tracks, with eight papers in each track, addressed the themes of: (1) strategic planning;…

  11. The Role of the Occupational and Physical Therapist in Assistive Technology. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Reed, Penny; Bowser, Gayl

    This guide defines assistive technology as specialized hardware and software equipment used by students with disabilities to increase their ability to participate in tasks of learning and daily living and function as independently as possible. Types of assistive technology are listed, and information resources about assistive technology are noted.…

  12. Science and Technology Resources on the Internet: Computer Security.

    ERIC Educational Resources Information Center

    Kinkus, Jane F.

    2002-01-01

    Discusses issues related to computer security, including confidentiality, integrity, and authentication or availability; and presents a selected list of Web sites that cover the basic issues of computer security under subject headings that include ethics, privacy, kids, antivirus, policies, cryptography, operating system security, and biometrics.…

  13. Drug information resources used by nurse practitioners and collaborating physicians at the point of care in Nova Scotia, Canada: a survey and review of the literature

    PubMed Central

    Murphy, Andrea L; Fleming, Mark; Martin-Misener, Ruth; Sketris, Ingrid S; MacCara, Mary; Gass, David

    2006-01-01

    Background Keeping current with drug therapy information is challenging for health care practitioners. Technologies are often implemented to facilitate access to current and credible drug information sources. In the Canadian province of Nova Scotia, legislation was passed in 2002 to allow nurse practitioners (NPs) to practice collaboratively with physician partners. The purpose of this study was to determine the current utilization patterns of information technologies by these groups of practitioners. Methods Nurse practitioners and their collaborating physician partners in Nova Scotia were sent a survey in February 2005 to determine the frequency of use, usefulness, accessibility, credibility, and current/timeliness of personal digital assistant (PDA), computer, and print drug information resources. Two surveys were developed (one for PDA users and one for computer users) and revised based on a literature search, stakeholder consultation, and pilot-testing results. A second distribution to nonresponders occurred two weeks following the first. Data were entered and analysed with SPSS. Results Twenty-seven (14 NPs and 13 physicians) of 36 (75%) recipients responded. 22% (6) returned personal digital assistant (PDA) surveys. Respondents reported print, health professionals, and online/electronic resources as the most to least preferred means to access drug information, respectively. 37% and 35% of respondents reported using "both print and electronic but print more than electronic" and "print only", respectively, to search monograph-related drug information queries whereas 4% reported using "PDA only". Analysis of respondent ratings for all resources in the categories print, health professionals and other, and online/electronic resources, indicated that the Compendium of Pharmaceuticals and Specialties and pharmacists ranked highly for frequency of use, usefulness, accessibility, credibility, and current/timeliness by both groups of practitioners. 
Respondents' preferences and resource ratings were consistent with self-reported methods for conducting drug information queries. Few differences existed between NP and physician rankings of resources. Conclusion The use of computers and PDAs remains limited, which is also consistent with preferred and frequent use of print resources. Education for these practitioners regarding available electronic drug information resources may facilitate future computer and PDA use. Further research is needed to determine methods to increase computer and PDA use and whether these technologies affect prescribing and patient outcomes. PMID:16822323

  14. Promoting the safe and strategic use of technology for victims of intimate partner violence: evaluation of the technology safety project.

    PubMed

    Finn, Jerry; Atkinson, Teresa

    2009-11-01

    The Technology Safety Project of the Washington State Coalition Against Domestic Violence was designed to increase awareness and knowledge of technology safety issues for domestic violence victims, survivors, and advocacy staff. The project used a "train-the-trainer" model and provided computer and Internet resources to domestic violence service providers to (a) increase safe computer and Internet access for domestic violence survivors in Washington, (b) reduce the risk posed by abusers by educating survivors about technology safety and privacy, and (c) increase the ability of survivors to help themselves and their children through information technology. Evaluation of the project suggests that the program is needed, useful, and effective. Consumer satisfaction was high, and there was perceived improvement in computer confidence and knowledge of computer safety. Areas for future program development and further research are discussed.

  15. Evaluating interactive computer-based scenarios designed for learning medical technology.

    PubMed

    Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Wallergård, Mattias; Johansson, Gerd

    2014-11-01

    The use of medical equipment is growing in healthcare, resulting in an increased need for resources to educate users in how to manage the various devices. Learning the practical operation of a device is one thing, but learning how to work with the device in the actual clinical context is more challenging. This paper presents a computer-based simulation prototype for learning medical technology in the context of critical care. Properties from simulation and computer games have been adopted to create a visualization-based, interactive and contextually bound tool for learning. A participatory design process, including three researchers and three practitioners from a clinic for infectious diseases, was adopted to adjust the form and content of the prototype to the needs of clinical practice and to create a situated learning experience. An evaluation with 18 practitioners showed that practitioners were positive about this type of tool for learning and that it served as a good platform for eliciting and sharing knowledge. Our conclusion is that this type of tool can be a complement to traditional learning resources, situating the learning in a context without requiring advanced technology or being resource-demanding. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Trends in life science grid: from computing grid to knowledge grid.

    PubMed

    Konagaya, Akihiko

    2006-12-18

    Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large-scale data handling that exceed the computing capacity of a single institution. This survey reviews the latest grid technologies from the viewpoints of the computing grid, data grid, and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities that share tacit knowledge. By extending the concept of the grid from computing grid to knowledge grid, a grid can serve not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  17. Trends in life science grid: from computing grid to knowledge grid

    PubMed Central

    Konagaya, Akihiko

    2006-01-01

Background Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and data handling that exceed the capacity of a single institution. Results This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities that share tacit knowledge. Conclusion By extending the concept of the grid from computing grid to knowledge grid, a grid can serve not only as sharable computing resources but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:17254294

  18. Bringing education to your virtual doorstep

    NASA Astrophysics Data System (ADS)

    Kaurov, Vitaliy

    2013-03-01

We are currently witnessing a significant migration of academic resources towards online CMS, social networking, and high-end computerized education. This is happening in traditional academic programs as well as in outreach initiatives. The talk will cover a set of innovative integrated technologies, many of which are free, developed by Wolfram Research to facilitate and enhance the learning process in the mathematical and physical sciences. Topics include: cloud computing with Mathematica Online; natural language programming; interactive educational resources and web publishing at the Wolfram Demonstrations Project; the computational knowledge engine Wolfram Alpha; the Computable Document Format (CDF) and self-publishing with interactive e-books; and course assistant apps for mobile platforms. We will also discuss outreach programs where such technologies are extensively used, such as the Wolfram Science Summer School and the Mathematica Summer Camp.

  19. A Case Study of Technology-Enhanced Historical Inquiry

    ERIC Educational Resources Information Center

    Yang, Shu Ching

    2009-01-01

    The paper describes the integration of web resources and technology as instructional and learning tools in oral history projects. The computer-mediated oral history project centred around interviews with community elders combined with new technologies to engage students in authentic historical inquiry. The study examined learners' affective…

  20. Pedagogical Approaches for Technology-Integrated Science Teaching

    ERIC Educational Resources Information Center

    Hennessy, Sara; Wishart, Jocelyn; Whitelock, Denise; Deaney, Rosemary; Brawn, Richard; la Velle, Linda; McFarlane, Angela; Ruthven, Kenneth; Winterbottom, Mark

    2007-01-01

    The two separate projects described have examined how teachers exploit computer-based technologies in supporting learning of science at secondary level. This paper examines how pedagogical approaches associated with these technological tools are adapted to both the cognitive and structuring resources available in the classroom setting. Four…

  1. Plenary.

    ERIC Educational Resources Information Center

    Oettinger, Anthony G.

    2000-01-01

    Describes the Harvard Program on Information Resources Policy (PIRP) that studies how public policy and strategic corporate decisions affect information systems, including computer technologies; postal and mechanical transportation systems; information use by civilian and military organizations; effect of new technologies; international politics;…

  2. Quantitative Investigation of the Technologies That Support Cloud Computing

    ERIC Educational Resources Information Center

    Hu, Wenjin

    2014-01-01

    Cloud computing is dramatically shaping modern IT infrastructure. It virtualizes computing resources, provides elastic scalability, serves as a pay-as-you-use utility, simplifies the IT administrators' daily tasks, enhances the mobility and collaboration of data, and increases user productivity. We focus on providing generalized black-box…

  3. Customized On-site Resource Training Services (CORTS): A Partnership Program.

    ERIC Educational Resources Information Center

    Macquarie Univ., North Ryde (Australia). Special Education Centre.

    In 1983, New Brunswick Community College-Moncton (NBCCM) was awarded funding to establish a Computer Aided Drafting/Manufacturing (CAD/CAM) resource center to train students and assist industry in researching and adopting CAD/CAM technology. However, inherent constraints in industry and the absorption of college resources by in-house training…

  4. The Frustrated Nerds Project--Resources for Systems Administrators in Higher Education: A Resource Webliography

    ERIC Educational Resources Information Center

    Henninger, Jessamyn; Aber, Susan Ward

    2010-01-01

    Systems Architects and Information Technology administrators working in higher education help faculty, staff, and student computer users. Yet, who helps them? What resources do these professionals value? A case study was conducted using purposeful sampling and data collection through electronic interview to gather the preferred information-seeking…

  5. Using Computers in Early Years Education: What Are the Effects on Children's Development? Some Suggestions Concerning Beneficial Computer Practice

    ERIC Educational Resources Information Center

    Theodotou, Evgenia

    2010-01-01

    Technology in education is considered in empirical and theoretical literature as both beneficial and harmful to children's development. In the field of the early years settings there is a dilemma whether or not early childhood teachers should use technology as a teaching and learning resource. This paper has a pedagogical focus, discussing the…

  6. Processing Shotgun Proteomics Data on the Amazon Cloud with the Trans-Proteomic Pipeline*

    PubMed Central

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.

    2015-01-01

Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large-scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at very low cost. PMID:25418363
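The abstract mentions tools for estimating the computing resources and cost of a job given its scale. As a minimal, hypothetical sketch of that kind of back-of-envelope estimate (the function, rates, and per-instance-hour billing model below are illustrative assumptions, not the Trans-Proteomic Pipeline's actual tooling):

```python
import math

# Hypothetical estimator for an embarrassingly parallel cloud search job.
# Assumes files are processed independently and split evenly across
# identical instances; real queues and stragglers would add time.
def estimate_job(n_files, minutes_per_file, n_instances, hourly_rate):
    """Return (wall-clock hours, billed cost) for the job."""
    wall_hours = n_files * minutes_per_file / 60.0 / n_instances
    # Assume billing in whole instance-hours, rounded up per instance.
    cost = math.ceil(wall_hours) * n_instances * hourly_rate
    return wall_hours, cost

# e.g. 1100 files at 2 min/file on 40 instances billed at $0.10/instance-hour
hours, cost = estimate_job(1100, 2.0, 40, 0.10)
```

Even a crude model like this makes the trade-off visible: adding instances shortens wall-clock time but, under coarse hourly billing, can leave much of the final billed hour idle.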

  7. Processing shotgun proteomics data on the Amazon cloud with the trans-proteomic pipeline.

    PubMed

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W; Moritz, Robert L

    2015-02-01

Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large-scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at very low cost. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  8. Staff Development Resources, 1989-90. ITV Connection.

    ERIC Educational Resources Information Center

    South Carolina State Dept. of Education, Columbia. Office of Instructional Technology.

    This staff development resource guide includes listings of television and radio broadcasts categorized by topical emphasis. Television program topics include: administration; adult education; arts; career education; certificate-renewal credit courses; college credit courses; computer education and new technology; custodial training; early…

  9. EPA's ToxCast Program for Predicting Hazard and Prioritizing Toxicity Testing of Environmental Chemicals.

    EPA Science Inventory

    EPA's National Center for Computational Toxicology is developing methods that apply computational chemistry, high-throughput screening (HTS) and genomic technologies to predict potential toxicity and prioritize the use of limited testing resources.

  10. Technology for the Organic Chemist: Three Exploratory Modules

    ERIC Educational Resources Information Center

    Esteb, John J.; McNulty, LuAnne M.; Magers, John; Morgan, Paul; Wilson, Anne M.

    2010-01-01

    The ability to use computer-based technology is an essential skill set for students majoring in chemistry. This exercise details the introduction of appropriate uses for this technology in the organic chemistry series. The incorporation of chemically appropriate online resources (module 1), scientific databases (module 2), and the use of a…

  11. Course Syllabus: Science, Technology and Society.

    ERIC Educational Resources Information Center

    Garner, Douglas

    1985-01-01

    Describes the aims, methods, project, and topics of a course designed so that students may explore the impact of science and technology on society. Units include: technology (pro and con); nuclear deterrence; politics and technical decisions; and computers. Includes a list of audiovisual resources (with title, source, and current cost). (DH)

  12. Instructional Technology and Higher Education: Rewards, Rights, and Responsibilities.

    ERIC Educational Resources Information Center

    Albright, Michael J.

    This keynote address seeks to establish a definition for "instructional technology" that does not emphasize computer hardware and software but instead focuses on human skills, resource management, problem solving, and educational settings. Also discussed are ways in which technology like electronic mail and the world wide web has…

  13. Technologies and Reformed-Based Science Instruction: The Examination of a Professional Development Model Focused on Supporting Science Teaching and Learning with Technologies

    ERIC Educational Resources Information Center

    Campbell, Todd; Longhurst, Max L.; Wang, Shiang-Kwei; Hsu, Hui-Yin; Coster, Dan C.

    2015-01-01

    While access to computers, other technologies, and cyber-enabled resources that could be leveraged for enhancing student learning in science is increasing, generally it has been found that teachers use technology more for administrative purposes or to support traditional instruction. This use of technology, especially to support traditional…

  14. Evaluation of the Texas Technology Immersion Pilot: Third-Year (2006-07) Traits of Higher Technology Immersion Schools and Teachers

    ERIC Educational Resources Information Center

    Shapley, Kelly; Maloney, Catherine; Caranikas-Walker, Fanny; Sheehan, Daniel

    2008-01-01

    The Technology Immersion Pilot (TIP), created by the Texas Legislature in 2003, called for the Texas Education Agency (TEA) to establish a pilot project to "immerse" schools in technology by providing a wireless mobile computing device for each teacher and student, technology-based learning resources, training for teachers to integrate…

  15. Making Connections: Power at Your Fingertips. Resources in Technology.

    ERIC Educational Resources Information Center

    Deal, Walter F., III

    1997-01-01

    Discusses inventions and innovations in battery technology. Includes information about batteries that have produced products such as cellular telephones, portable computers, and camcorders. Also describes lithium and solid state batteries and offers tips on battery safety. (JOW)

  16. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisition at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, we focus on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.
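The three-stage model in the abstract (transfer, queue wait, compute) can be sketched as a simple additive estimate that drives resource selection. This is a minimal, hypothetical illustration; the stage formulas and candidate parameters below are assumptions, not the paper's actual models:

```python
# Hypothetical sketch of the abstract's three modeled stages:
# estimated workflow time = data transfer + queue wait + compute.
def estimate_workflow_seconds(data_gb, bandwidth_gbps, queue_wait_s, compute_s):
    transfer_s = data_gb * 8.0 / bandwidth_gbps  # GB -> gigabits at link speed
    return transfer_s + queue_wait_s + compute_s

def pick_resource(candidates):
    """Choose the site with the smallest estimated execution time,
    mirroring how such models can guide selection across distributed sites."""
    return min(candidates, key=lambda c: estimate_workflow_seconds(**c[1]))[0]

# Illustrative candidates: a slow local cluster vs. a faster remote HPC site
# that costs extra transfer and queue time.
sites = [
    ("local_cluster", dict(data_gb=500, bandwidth_gbps=10,
                           queue_wait_s=0, compute_s=7200)),
    ("remote_hpc",    dict(data_gb=500, bandwidth_gbps=1,
                           queue_wait_s=600, compute_s=900)),
]
best = pick_resource(sites)
```

The point of such a model is exactly this comparison: a remote site can win despite slower transfer and a queue wait whenever its compute-time savings dominate, which matches the abstract's observation that model accuracy matters most for computationally demanding reconstructions.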

  17. Optimization of tomographic reconstruction workflows on geographically distributed resources

    PubMed Central

    Bicer, Tekin; Gürsoy, Doğa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisition at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, we focus on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Moreover, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks. PMID:27359149

  18. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar

    New technological advancements in synchrotron light sources enable data acquisition at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, we focus on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can provide up to 3.13× speedup (on the experimented resources). Furthermore, the error rates of the models range between 2.1 and 23.3% (considering workflow execution times), where the accuracy of the model estimations increases with higher computational demands in reconstruction tasks.

  19. Satellite on-board processing for earth resources data

    NASA Technical Reports Server (NTRS)

    Bodenheimer, R. E.; Gonzalez, R. C.; Gupta, J. N.; Hwang, K.; Rochelle, R. W.; Wilson, J. B.; Wintz, P. A.

    1975-01-01

    Results of a survey of earth resources user applications and their data requirements, earth resources multispectral scanner sensor technology, and preprocessing algorithms for correcting the sensor outputs and for data bulk reduction are presented, along with a candidate data format. The computational requirements for implementing the data analysis algorithms are included, along with a review of computer architectures and organizations. Computer architectures capable of handling the algorithms' computational requirements are suggested, and the environmental effects of an on-board processor are discussed. By relating performance parameters to the system requirements of each user, the feasibility of on-board processing is determined for each user. A tradeoff analysis is performed to determine the sensitivity of results to each of the system parameters. Significant results and conclusions are discussed, and recommendations are presented.

  20. Realizing the Promise of Visualization in the Theory of Computing

    ERIC Educational Resources Information Center

    Cogliati, Joshua J.; Goosey, Frances W.; Grinder, Michael T.; Pascoe, Bradley A.; Ross, Rockford J.; Williams, Cheston J.

    2005-01-01

    Progress on a hypertextbook on the theory of computing is presented. The hypertextbook is a novel teaching and learning resource built around web technologies that incorporates text, sound, pictures, illustrations, slide shows, video clips, and--most importantly--active learning models of the key concepts of the theory of computing into an…

  1. Extended outlook: description, utilization, and daily applications of cloud technology in radiology.

    PubMed

    Gerard, Perry; Kapadia, Neil; Chang, Patricia T; Acharya, Jay; Seiler, Michael; Lefkovitz, Zvi

    2013-12-01

    The purpose of this article is to discuss the concept of cloud technology, its role in medical applications and radiology, the role of the radiologist in using and accessing these vast resources of information, and privacy concerns and HIPAA compliance strategies. Cloud computing is the delivery of shared resources, software, and information to computers and other devices as a metered service. This technology has a promising role in the sharing of patient medical information and appears to be particularly suited for application in radiology, given the field's inherent need for storage and access to large amounts of data. The radiology cloud has significant strengths, such as providing centralized storage and access, reducing unnecessary repeat radiologic studies, and potentially allowing radiologic second opinions more easily. There are significant cost advantages to cloud computing because of a decreased need for infrastructure and equipment by the institution. Private clouds may be used to ensure secure storage of data and compliance with HIPAA. In choosing a cloud service, there are important aspects, such as disaster recovery plans, uptime, and security audits, that must be considered. Given that the field of radiology has become almost exclusively digital in recent years, the future of secure storage and easy access to imaging studies lies within cloud computing technology.

  2. Responding to Information Needs in the 1980s.

    ERIC Educational Resources Information Center

    McGraw, Harold W., Jr.

    1979-01-01

    Argues that technological developments in cable television, computers, and telecommunications could decentralize power and put the resources of the new technology more broadly at the command of individuals and small groups, but that this potential requires action to be realized. (Author)

  3. Technology.

    ERIC Educational Resources Information Center

    Online-Offline, 1998

    1998-01-01

    Focuses on technology, on advances in such areas as aeronautics, electronics, physics, the space sciences, as well as computers and the attendant progress in medicine, robotics, and artificial intelligence. Describes educational resources for elementary and middle school students, including Web sites, CD-ROMs and software, videotapes, books,…

  4. Federal Barriers to Innovation

    ERIC Educational Resources Information Center

    Miller, Raegen; Lake, Robin

    2012-01-01

    With educational outcomes inadequate, resources tight, and students' academic needs growing more complex, America's education system is certainly ready for technological innovation. And technology itself is ripe to be exploited. Devices harnessing cheap computing power have become smart and connected. Voice recognition, artificial intelligence,…

  5. Survey of Collaboration Technologies in Multi-level Security Environments

    DTIC Science & Technology

    2014-04-28

    …infrastructure or resources. In this research program, the security implications of the US Air Force GeoBase (the US… The problem is that in many cases…design structure. ORA uses a Java interface for ease of use, and a C++ computational backend. The current version ORA1.2 software is available on the…information: culture, policy, governance, economics and resources, and technology and infrastructure. This plan, the DoD Information Sharing…

  6. Use of computers and the Internet by residents in US family medicine programmes.

    PubMed

    King, Richard V; Murphy-Cullen, Cassie L; Mayo, Helen G; Marcee, Alice K; Schneider, Gregory W

    2007-06-01

    Computers, personal digital assistants (PDA), and the Internet are widely used as resources in medical education and clinical care. Educators who intend to incorporate these resources effectively into residency education programmes can benefit from understanding how residents currently use these tools, their skills, and their preferences. The researchers sent questionnaires to 306 US family medicine residency programmes for all of their residents to complete. Respondents were 1177 residents from 125 (41%) programmes. Access to a computer was reported by 95% of respondents. Of these, 97% of desktop and 89% of laptop computers could access the Internet. Residents accessed various educational and clinical resources. Half felt they had 'intermediate' skills at Web searches, 23% had 'some skills,' and 27% were 'quite skilled.' Those under 30 years of age reported higher skill levels. Those who experienced a Web-based curriculum in medical school reported higher search skills and greater success in finding clinical information. Respondents preferred to use technology to supplement the didactic sessions offered in resident teaching conferences. Favourable conditions exist in family medicine residency programmes to implement a blend of traditional and technology-based learning experiences. These conditions include residents' experience, skills, and preferences.

  7. Grid Technology as a Cyberinfrastructure for Delivering High-End Services to the Earth and Space Science Community

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    Grid technology consists of middleware that permits distributed computations, data and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and can then utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means for implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Service Description Language, which provides a description of the interface and how other applications can access it. The emerging semantic grid work seeks to associate sufficient semantic information with each grid service such that applications will be able to automatically select, compose and, if necessary, substitute available equivalent services in order to assemble the collections of services most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform applications that have community-wide applicability into persistent grid services that are made widely available to their respective communities.
In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid services discovered using semantic grid technology. As required, high-end computational resources could be drawn from available grid resource pools. Using grid technology, this confluence of data, services and computational resources could easily be harnessed to transform data from many different sources into a desired product that is delivered to a user's workstation or to a web portal through which it could be accessed by its intended audience.
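The workflow idea above (extract from an archive, then pass the result through a chain of services) can be sketched as a linear pipeline. The service functions and names below are hypothetical stand-ins; real grid services describe their interfaces in WSDL and rely on middleware for security and data movement:

```python
# Hypothetical stand-in for pulling a data granule from a grid-enabled archive.
def extract(archive, query):
    return {"archive": archive, "query": query, "data": [1, 2, 3]}

# Hypothetical stand-in for a reusable processing service applied to the data.
def scale_service(record, factor):
    return dict(record, data=[x * factor for x in record["data"]])

def run_workflow(record, steps):
    """Apply each (service, kwargs) step in order: a simple linear workflow."""
    for service, kwargs in steps:
        record = service(record, **kwargs)
    return record

# Extract once, then chain one processing service onto the result.
product = run_workflow(extract("APS-archive", "granule-42"),
                       [(scale_service, {"factor": 2})])
```

Because each step takes and returns the same record shape, services are interchangeable building blocks, which is the property the semantic grid work exploits when automatically selecting or substituting equivalent services.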

  8. Existing and Emerging Technologies in Education: A Descriptive Overview. CREATE Monograph Series.

    ERIC Educational Resources Information Center

    Bakke, Thomas W.

    Second in a series of six monographs on the use of new technologies in the instruction of learning disabled students, the paper offers a descriptive overview of new technologies. Topics addressed include the following: (1) techniques for sharing computer resources (including aspects of networking, sharing information through databases, and the use…

  9. Integrating Technology into the K-12 Music Curriculum.

    ERIC Educational Resources Information Center

    Washington Office of the State Superintendent of Public Instruction, Olympia.

    This guide is intended to provide resources for integrating technology into the K-12 music curriculum. The focus of the guide is on computer software and the use of MIDI (Musical Instrument Digital Interface) in the music classroom. The guide gives two examples of commercially available curricula that integrate technology as well as lesson plans…

  10. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    NASA Astrophysics Data System (ADS)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

    With the acceleration of China's informatization process, the government has taken substantive strides in advancing the development and application of digital technology, which promotes the evolution of e-government and its informatization. Meanwhile, as a service mode built on pooled resources, cloud computing can connect huge resource pools to provide a variety of IT services, and has become a relatively mature technical pattern through further study and massive practical application. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified but physically dispersed fundamental database, and developed a national integrated information database system supporting major e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.

  11. Central Limit Theorem: New SOCR Applet and Demonstration Activity

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicholas; Sanchez, Juana

    2008-01-01

    Modern approaches for information technology based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multi-faceted learning environments, which may facilitate student comprehension and information…

  12. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
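    The trade-off that motivates computational offloading can be illustrated with a minimal cost model; the component names, device and network parameters below are illustrative assumptions, not figures from the paper:

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    cycles: float      # CPU cycles the component needs
    data_bytes: float  # state to transfer if offloaded

# Hypothetical device/cloud parameters.
LOCAL_HZ = 1.5e9       # SMD CPU speed
CLOUD_HZ = 20e9        # effective cloud speed
UPLINK_BPS = 2e6       # wireless uplink throughput

def local_time(c: Component) -> float:
    """Execution time if the component stays on the SMD."""
    return c.cycles / LOCAL_HZ

def offload_time(c: Component) -> float:
    """Transfer time plus remote execution time if offloaded."""
    return c.data_bytes * 8 / UPLINK_BPS + c.cycles / CLOUD_HZ

def should_offload(c: Component) -> bool:
    """Offload only when remote execution plus transfer beats local execution."""
    return offload_time(c) < local_time(c)

heavy = Component("face_detect", cycles=9e9, data_bytes=200_000)
light = Component("ui_update", cycles=3e7, data_bytes=50_000)
print(should_offload(heavy), should_offload(light))  # → True False
```

Under these numbers the compute-heavy component wins from offloading (1.25 s remote vs 6 s local) while the small component does not, since its transfer time dominates; a lightweight framework makes exactly this kind of decision without expensive runtime partitioning.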

  13. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  14. A Short Bibliography on Library/Media Leadership.

    ERIC Educational Resources Information Center

    Stanford Univ., CA. ERIC Clearinghouse on Information Resources.

    Prepared for distribution at the 1975 Annual Convention of the Association for Educational Communications and Technology, this bibliography was assembled from the Current Index to Journals in Education (CIJE) and Resources in Education (RIE) computer files of the Educational Resources Information Center (ERIC). Annotated CIJE and RIE entries…

  15. [Location information acquisition and sharing application design in national census of Chinese medicine resources].

    PubMed

    Zhang, Xiao-Bo; Li, Meng; Wang, Hui; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    In the literature, there is abundant information on the distribution of Chinese herbal medicine. Limited by the technical methods available, the origins and distributions of Chinese herbal medicines were described only roughly in the ancient literature. Establishing the background information on the types and distribution of Chinese medicine resources in each region is one of the main objectives of the national census of Chinese medicine resources. Following the national census technical specifications and pilot work experience, census teams can effectively collect the location information of traditional Chinese medicine resources using "3S" technology, computer network technology, digital camera technology and other modern technical methods. Detailed and specific location information, covering regional differences in resource endowment and similarity, biological characteristics and spatial distribution, supports the evaluation of the accuracy and objectivity of the census data and provides technical and data support. With the support of spatial information technology and based on location information, the statistical summary and sharing of multi-source census data can be realized. The spatial integration, aggregation and management of traditional Chinese medicine resource data and related basic data can support mining the scientific rules of traditional Chinese medicine resources at the overall level and fully reveal their scientific connotation. Copyright© by the Chinese Pharmaceutical Association.

  16. Methods for Prediction of High-Speed Reacting Flows in Aerospace Propulsion

    NASA Technical Reports Server (NTRS)

    Drummond, J. Philip

    2014-01-01

    Research to develop high-speed airbreathing aerospace propulsion systems was underway in the late 1950s. A major part of the effort involved the supersonic combustion ramjet, or scramjet, engine. Work had also begun to develop computational techniques for solving the equations governing the flow through a scramjet engine. However, scramjet technology and the computational methods to assist in its evolution would remain apart for another decade. The principal barrier was that the computational methods needed for engine evolution lacked the computer technology required for solving the discrete equations resulting from the numerical methods. Even today, computer resources remain a major pacing item in overcoming this barrier. Significant advances have been made over the past 35 years, however, in modeling the supersonic chemically reacting flow in a scramjet combustor. To see how scramjet development and the required computational tools finally merged, we briefly trace the evolution of the technology in both areas.

  17. Enhancing Collaborative Peer-to-Peer Systems Using Resource Aggregation and Caching: A Multi-Attribute Resource and Query Aware Approach

    ERIC Educational Resources Information Center

    Bandara, H. M. N. Dilum

    2012-01-01

    Resource-rich computing devices, decreasing communication costs, and Web 2.0 technologies are fundamentally changing the way distributed applications communicate and collaborate. With these changes, we envision Peer-to-Peer (P2P) systems that will allow for the integration and collaboration of peers with diverse capabilities to a virtual community…

  18. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    ERIC Educational Resources Information Center

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  19. Inclusion of Mobility-Impaired Children in the One-to-One Computing Era: A Case Study

    ERIC Educational Resources Information Center

    Mangiatordi, Andrea

    2012-01-01

    In recent times many developing countries have adopted a one-to-one model for distributing computers in classrooms. Among the various effects that such an approach could imply, it surely increases the availability of computer-related Assistive Technology at school and provides higher resources for empowering disabled children in their learning and…

  20. Free Software and Multivariable Calculus

    ERIC Educational Resources Information Center

    Nord, Gail M.

    2011-01-01

    Calculators and computers make new modes of instruction possible; yet, at the same time they pose hardships for school districts and mathematics educators trying to incorporate technology with limited monetary resources. In the "Standards," a recommended classroom is one in which calculators, computers, courseware, and manipulative materials are…

  1. The Montage architecture for grid-enabled science processing of large, distributed datasets

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui

    2004-01-01

    Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available Computational resources. Therefore, state of the art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.

  2. Earth System Grid II, Turning Climate Datasets into Community Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Don

    2006-08-01

    The Earth System Grid (ESG) II project, funded by the Department of Energy’s Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy’s supercomputing resources and the Internet. Our project’s success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have impacted data-simulation integration in other disciplines, such as astrophysics, molecular biology and materials science.

  3. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    DOE PAGES

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; ...

    2017-10-01

    Here, the Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in the various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments increasingly rely on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources and new information sources, and allow new data structures to be implemented easily following the evolution of the computing models and operations of the experiments.
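    The collect-and-validate idea behind a catalogue like CRIC can be sketched as follows; the site records and attribute names below are invented for illustration and are not real GOCDB or OIM data:

```python
# Hypothetical records from two information sources describing the same site.
gocdb = {"CERN-PROD": {"country": "CH", "cpu_cores": 250000}}
experiment_db = {"CERN-PROD": {"country": "CH", "cpu_cores": 240000}}

def consolidate(*sources):
    """Merge per-site records from several providers, keeping the first
    value seen for each attribute and flagging any disagreements."""
    merged, conflicts = {}, []
    for src in sources:
        for site, attrs in src.items():
            entry = merged.setdefault(site, {})
            for key, val in attrs.items():
                if key in entry and entry[key] != val:
                    # Sources disagree: record the conflict for validation.
                    conflicts.append((site, key, entry[key], val))
                else:
                    entry.setdefault(key, val)
    return merged, conflicts

merged, conflicts = consolidate(gocdb, experiment_db)
print(merged)
print(conflicts)  # → [('CERN-PROD', 'cpu_cores', 250000, 240000)]
```

A real catalogue would apply provider-specific trust rules rather than first-wins, but the core task is the same: one consistent merged view plus an explicit list of inconsistencies to resolve.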

  4. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey

    Here, the Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in the various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments increasingly rely on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources and new information sources, and allow new data structures to be implemented easily following the evolution of the computing models and operations of the experiments.

  5. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    NASA Astrophysics Data System (ADS)

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; Bagliesi, Giuseppe; Belforte, Stephano; Campana, Simone; Dimou, Maria; Flix, Jose; Forti, Alessandra; di Girolamo, A.; Karavakis, Edward; Lammel, Stephan; Litmaath, Maarten; Sciaba, Andrea; Valassi, Andrea

    2017-10-01

    The Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in the various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments increasingly rely on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources and new information sources, and allow new data structures to be implemented easily following the evolution of the computing models and operations of the experiments.

  6. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  7. Elucidating reaction mechanisms on quantum computers.

    PubMed

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M; Wecker, Dave; Troyer, Matthias

    2017-07-18

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  8. Elucidating reaction mechanisms on quantum computers

    PubMed Central

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-01-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources. PMID:28674011

  9. Elucidating reaction mechanisms on quantum computers

    NASA Astrophysics Data System (ADS)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-07-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  10. Literacy Toolkit

    ERIC Educational Resources Information Center

    Center for Best Practices in Early Childhood Education, 2005

    2005-01-01

    The toolkit contains print and electronic resources, including (1) "eMERGing Literacy and Technology: Working Together", a 492-page curriculum guide; (2) "LitTECH Interactive Presents: The Beginning of Literacy", a DVD that provides an overview linking technology to the concepts of emerging literacy; (3) "Your Preschool Classroom Computer Center:…

  11. Program on application of communications satellites to educational development

    NASA Technical Reports Server (NTRS)

    Morgan, R. P.; Singh, J. P.

    1971-01-01

    Interdisciplinary research in needs analysis, communications technology studies, and systems synthesis is reported. Existing and planned educational telecommunications services are studied and library utilization of telecommunications is described. Preliminary estimates are presented of ranges of utilization of educational telecommunications services for 1975 and 1985; instructional and public television, computer-aided instruction, computing resources, and information resource sharing for various educational levels and purposes. Communications technology studies include transmission schemes for still-picture television, use of Gunn effect devices, and TV receiver front ends for direct satellite reception at 12 GHz. Two major studies in the systems synthesis project concern (1) organizational and administrative aspects of a large-scale instructional satellite system to be used with schools and (2) an analysis of future development of instructional television, with emphasis on the use of video tape recorders and cable television. A communications satellite system synthesis program developed for NASA is now operational on the university IBM 360-50 computer.

  12. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    PubMed

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.

  13. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and…

  14. The Technology Information Environment with Industry{trademark} system description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detry, R.; Machin, G.

    The Technology Information Environment with Industry (TIE-In{trademark}) provides users with controlled access to distributed laboratory resources that are packaged in intelligent user interfaces. These interfaces help users access resources without requiring the user to have technical or computer expertise. TIE-In utilizes existing, proven technologies such as the Kerberos authentication system, X-Windows, and UNIX sockets. A Front End System (FES) authenticates users and allows them to register for resources and subsequently access them. The FES also stores status and accounting information, and provides an automated method for the resource owners to recover costs from users. The resources available through TIE-In are typically laboratory-developed applications that are used to help design, analyze, and test components in the nation's nuclear stockpile. Many of these applications can also be used by US companies for non-weapons-related work. TIE-In allows these industry partners to obtain laboratory-developed technical solutions without requiring them to duplicate the technical resources (people, hardware, and software) at Sandia.

  15. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    PubMed

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabling technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, revenues are allocated among the cooperators according to their contributions, using the concept of the "Shapley value" to enable a more impartial revenue share. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
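    A minimal sketch of the "Shapley value" idea the abstract mentions: each cooperator's share is its marginal contribution averaged over all orders in which providers could join the coalition. The characteristic function `revenue` and the capacity numbers below are hypothetical illustrations, not taken from the paper, and the exact enumeration is only feasible for small coalitions.

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: average each player's marginal
    contribution over every possible join order."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            phi[p] += value(frozenset(coalition)) - before
    return {p: total / len(orders) for p, total in phi.items()}

# Hypothetical characteristic function: a coalition of devices earns
# revenue proportional to pooled capacity, but only if the pool is
# large enough to host the application at all.
capacity = {"A": 4, "B": 2, "C": 1}

def revenue(coalition):
    pooled = sum(capacity[p] for p in coalition)
    return 10 * pooled if pooled >= 3 else 0

print(shapley_values(["A", "B", "C"], revenue))
# → {'A': 45.0, 'B': 15.0, 'C': 10.0}
```

    The shares sum to the grand coalition's revenue (70 here); this efficiency property is what makes a Shapley split attractive for impartial revenue sharing.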

  16. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing

    PubMed Central

    Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabling technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, revenues are allocated among the cooperators according to their contributions, using the concept of the "Shapley value" to enable a more impartial revenue share. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network. PMID:28030553

  17. Role of information systems in controlling costs: the electronic medical record (EMR) and the high-performance computing and communications (HPCC) efforts

    NASA Astrophysics Data System (ADS)

    Kun, Luis G.

    1994-12-01

    On October 18, 1991, the IEEE-USA produced an entity statement which endorsed the vital importance of the High Performance Computing and Communications Act of 1991 (HPCC) and called for the rapid implementation of all its elements. Efforts are now underway to develop a Computer Based Patient Record (CBPR), the National Information Infrastructure (NII) as part of the HPCC, and the so-called 'Patient Card'. Multiple legislative initiatives which address these and related information technology issues are pending in Congress. Clearly, a national information system will greatly affect the way health care delivery is provided to the United States public. Timely and reliable information represents a critical element in any initiative to reform the health care system as well as to protect and improve the health of every person. Appropriately used, information technologies offer a vital means of improving the quality of patient care, increasing access to universal care and lowering overall costs within a national health care program. Health care reform legislation should reflect increased budgetary support and a legal mandate for the creation of a national health care information system by: (1) constructing a National Information Infrastructure; (2) building a Computer Based Patient Record System; (3) bringing the collective resources of our National Laboratories to bear in developing and implementing the NII and CBPR, as well as a security system with which to safeguard the privacy rights of patients and the physician-patient privilege; and (4) utilizing Government (e.g. DOD, DOE) capabilities (technology and human resources) to maximize resource utilization, create new jobs and accelerate technology transfer to address health care issues.

  18. A European Federated Cloud: Innovative distributed computing solutions by EGI

    NASA Astrophysics Data System (ADS)

    Sipos, Gergely; Turilli, Matteo; Newhouse, Steven; Kacsuk, Peter

    2013-04-01

    The European Grid Infrastructure (EGI) is the result of pioneering work that has, over the last decade, built a collaborative production infrastructure of uniform services through the federation of national resource providers that supports multi-disciplinary science across Europe and around the world. This presentation will provide an overview of the recently established 'federated cloud computing services' that the National Grid Initiatives (NGIs), operators of EGI, offer to scientific communities. The presentation will explain the technical capabilities of the 'EGI Federated Cloud' and the processes whereby earth and space science researchers can engage with it. EGI's resource centres have been providing services for collaborative, compute- and data-intensive applications for over a decade. Besides the well-established 'grid services', several NGIs already offer privately run cloud services to their national researchers. Many of these researchers recently expressed the need to share these cloud capabilities within their international research collaborations - a model similar to the way the grid emerged through the federation of institutional batch computing and file storage servers. To facilitate the setup of a pan-European cloud service from the NGIs' resources, the EGI-InSPIRE project established a Federated Cloud Task Force in September 2011. The Task Force has a mandate to identify and test technologies for a multinational federated cloud that could be provisioned within EGI by the NGIs. A guiding principle for the EGI Federated Cloud is to remain technology neutral and flexible for both resource providers and users:
    • Resource providers are allowed to use any cloud hypervisor and management technology to join virtualised resources into the EGI Federated Cloud, as long as the site is subscribed to the user-facing interfaces selected by the EGI community.
    • Users can integrate high-level services - such as brokers, portals and customised Virtual Research Environments - with the EGI Federated Cloud, as long as these services access cloud resources through the user-facing interfaces selected by the EGI community.
    The Task Force will be closed in May 2013. It has already:
    • Identified key enabling technologies by which a multinational, federated 'Infrastructure as a Service' (IaaS) type cloud can be built from the NGIs' resources;
    • Deployed a test bed to evaluate the integration of virtualised resources within EGI and to engage with early adopter use cases from different scientific domains;
    • Integrated cloud resources into the EGI production infrastructure through cloud-specific bindings of the EGI information system, monitoring system, authentication system, etc.;
    • Collected and catalogued requirements concerning the federated cloud services from the feedback of early adopter use cases;
    • Provided feedback and requirements to relevant technology providers on their implementations and worked with these providers to address those requirements;
    • Identified issues that need to be addressed by other areas of EGI (such as portal solutions, resource allocation policies, marketing and user support) to reach a production system.
    The Task Force will publish a blueprint in April 2013. The blueprint will drive the establishment of a production-level EGI Federated Cloud service after May 2013.

  19. COMPUTER MODEL TECHNOLOGY TRANSFER IN THE UNITED STATES

    EPA Science Inventory

    Computer-based mathematical models for urban water resources planning, management and design are widely used by engineers and planners in both the public and private sectors. In the United States, the majority of the users are in the private (consulting) sector, yet most of the m...

  20. 75 FR 38595 - Guidance to States Regarding Driver History Record Information Security, Continuity of Operation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-02

    ... Standards and Technology's (NIST) Computer Security Division maintains a Computer Security Resource Center... Regarding Driver History Record Information Security, Continuity of Operation Planning, and Disaster... (SDLAs) to support their efforts at maintaining the security of information contained in the driver...

  1. You Want Me to What?

    ERIC Educational Resources Information Center

    McGarvey, Robert J.

    2010-01-01

    It's a riddle faced by virtually every IT director: how to fulfill users' desire for more muscular computing resources while still obliging administrators' commands to keep education spending down. Against long odds, many district technology directors have been fulfilling both counts, optimizing their computing systems with improvements that pay…

  2. Key Technology Research on Open Architecture for The Sharing of Heterogeneous Geographic Analysis Models

    NASA Astrophysics Data System (ADS)

    Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.

    2013-10-01

    In recent years, the increasing development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture is apt to provide centralized solutions to end users, while all the required resources are often offered by large enterprises or special agencies. Thus, it is a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can package and deploy their models into the cloud conveniently, while model users can search, access and utilize those models with cloud facilities. Based on this concept, the open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies - a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services - are discussed in detail, and related experiments are conducted for further verification.

  3. One-way quantum computing in superconducting circuits

    NASA Astrophysics Data System (ADS)

    Albarrán-Arriagada, F.; Alvarado Barrios, G.; Sanz, M.; Romero, G.; Lamata, L.; Retamal, J. C.; Solano, E.

    2018-03-01

    We propose a method for the implementation of one-way quantum computing in superconducting circuits. Measurement-based quantum computing is a universal quantum computation paradigm in which an initial cluster state provides the quantum resource, while the iteration of sequential measurements and local rotations encodes the quantum algorithm. Up to now, technical constraints have limited a scalable approach to this quantum computing alternative. The initial cluster state can be generated with available controlled-phase gates, while the quantum algorithm makes use of high-fidelity readout and coherent feedforward. With current technology, we estimate that quantum algorithms with above 20 qubits may be implemented in the path toward quantum supremacy. Moreover, we propose an alternative initial state with properties of maximal persistence and maximal connectedness, reducing the required resources of one-way quantum computing protocols.

  4. USMC Installations Command Information Environment: Opportunities and Analysis for Integration of First Responder Communications

    DTIC Science & Technology

    2014-09-01

    becoming a more and more prevalent technology in the business world today. According to Syal and Goswami (2012), cloud technology is seen as a...use of computing resources, applications, and personal files without reliance on a single computer or system (Syal & Goswami, 2012). By operating in...cloud services largely being web-based, which can be retrieved through most systems with access to the Internet (Syal & Goswami, 2012). The end user can

  5. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provides seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.

  6. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  7. WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, K; Kagadis, G; Xing, L

    As defined by the National Institute of Standards and Technology, cloud computing is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such “on-demand” access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.

  8. Information Systems Education: The Case for the Academic Cloud

    ERIC Educational Resources Information Center

    Mew, Lionel

    2016-01-01

    This paper discusses how cloud computing can be leveraged to add value to academic programs in information systems and other fields by improving financial sustainment models for institutional technology and academic departments, relieving the strain on overworked technology support resources, while adding richness and improving pedagogical…

  9. Crocodile Technology. [CD-ROM].

    ERIC Educational Resources Information Center

    2000

    This high school physics computer software resource is a systems and control simulator that covers the topics of electricity, electronics, mechanics, and programming. Circuits can easily be simulated on the screen and electronic and mechanical components can be combined. In addition to those provided in Crocodile Technology, a student can create…

  10. Moving beyond the White Cane: Building an Online Learning Environment for the Visually Impaired Professional.

    ERIC Educational Resources Information Center

    Mitchell, Donald P.; Scigliano, John A.

    2000-01-01

    Describes the development of an online learning environment for a visually impaired professional. Topics include physical barriers, intellectual barriers, psychological barriers, and technological barriers; selecting appropriate hardware and software; and combining technologies that include personal computers, Web-based resources, network…

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, A.; Sengupta, M.; Wilcox, S.

    This report was part of a multiyear collaboration with the University of Wisconsin and the National Oceanic and Atmospheric Administration (NOAA) to produce high-quality, satellite-based, solar resource datasets for the United States. High-quality solar resource assessment accelerates technology deployment by making a positive impact on decision making and reducing uncertainty in investment decisions. Satellite-based solar resource datasets are used as a primary source in solar resource assessment. This is mainly because satellites provide larger areal coverage and longer periods of record than ground-based measurements. With the advent of newer satellites with increased information content and faster computers that can process increasingly higher data volumes, methods that were considered too computationally intensive are now feasible. One class of sophisticated methods for retrieving solar resource information from satellites is a two-step, physics-based method that computes cloud properties and uses the information in a radiative transfer model to compute solar radiation. This method has the advantage of adding additional information as satellites with newer channels come on board. This report evaluates the two-step method developed at NOAA and adapted for solar resource assessment for renewable energy, with the goal of identifying areas that can be improved in the future.

  12. OCCAM: a flexible, multi-purpose and extendable HPC cluster

    NASA Astrophysics Data System (ADS)

    Aldinucci, M.; Bagnasco, S.; Lusso, S.; Pasteris, P.; Rabellino, S.; Vallero, S.

    2017-10-01

    The Open Computing Cluster for Advanced data Manipulation (OCCAM) is a multipurpose flexible HPC cluster designed and operated by a collaboration between the University of Torino and the Sezione di Torino of the Istituto Nazionale di Fisica Nucleare. It is aimed at providing a flexible, reconfigurable and extendable infrastructure to cater to a wide range of different scientific computing use cases, including ones from solid-state chemistry, high-energy physics, computer science, big data analytics, computational biology, genomics and many others. Furthermore, it will serve as a platform for R&D activities on computational technologies themselves, with topics ranging from GPU acceleration to Cloud Computing technologies. A heterogeneous and reconfigurable system like this poses a number of challenges related to the frequency at which heterogeneous hardware resources might change their availability and shareability status, which in turn affect methods and means to allocate, manage, optimize, bill, monitor VMs, containers, virtual farms, jobs, interactive bare-metal sessions, etc. This work describes some of the use cases that prompted the design and construction of the HPC cluster, its architecture and resource provisioning model, along with a first characterization of its performance by some synthetic benchmark tools and a few realistic use-case tests.

  13. VM Capacity-Aware Scheduling within Budget Constraints in IaaS Clouds

    PubMed Central

    Thanasias, Vasileios; Lee, Choonhwa; Hanif, Muhammad; Kim, Eunsam; Helal, Sumi

    2016-01-01

    Recently, cloud computing has drawn significant attention from both industry and academia, bringing unprecedented changes to computing and information technology. The Infrastructure-as-a-Service (IaaS) model offers new abilities such as the elastic provisioning and relinquishing of computing resources in response to workload fluctuations. However, because the demand for resources dynamically changes over time, provisioning resources so that a given budget is efficiently utilized while maintaining sufficient performance remains a key challenge. This paper addresses the problem of task scheduling and resource provisioning for a set of tasks running on IaaS clouds; it presents novel provisioning and scheduling algorithms capable of executing tasks within a given budget, while minimizing the slowdown due to the budget constraint. Our simulation study demonstrates a substantial reduction of up to 70% in the overall task slowdown rate by the proposed algorithms. PMID:27501046
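    The paper's own algorithms are not reproduced here; as a rough illustration of the problem setting, the Python sketch below greedily rents VM instances within a fixed budget (best speed-per-cost first) and then assigns tasks longest-first to the earliest-free VM. The VM types, prices, and the single-billing-period assumption are all hypothetical.

```python
def provision(tasks, vm_types, budget):
    """Greedy sketch of budget-constrained provisioning.
    tasks: list of task runtimes on a baseline (speed-1) VM.
    vm_types: list of (name, speedup, rental_cost) tuples.
    budget: total money available for one billing period (simplification).
    Returns the makespan of the resulting schedule."""
    rented = []
    remaining = budget
    # Rent cheapest-per-unit-speed VMs first, never more VMs than tasks.
    for name, speed, cost in sorted(vm_types, key=lambda v: v[2] / v[1]):
        while remaining >= cost:
            rented.append({"name": name, "speed": speed, "free_at": 0.0})
            remaining -= cost
            if len(rented) >= len(tasks):
                break
        if len(rented) >= len(tasks):
            break
    if not rented:
        raise ValueError("budget too small for any VM")
    # Longest-processing-time-first assignment to the earliest-free VM.
    finish = 0.0
    for t in sorted(tasks, reverse=True):
        vm = min(rented, key=lambda v: v["free_at"])
        vm["free_at"] += t / vm["speed"]
        finish = max(finish, vm["free_at"])
    return finish

print(provision([4.0, 2.0, 2.0], [("fast", 2, 3), ("slow", 1, 1)], budget=5))
# → 4.0
```

    A real scheduler would also weigh renting fewer, faster VMs against many slow ones (here budget 5 could instead buy one "fast" plus two "slow" VMs and finish sooner), which is exactly the kind of trade-off the paper's algorithms optimize.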

  14. VM Capacity-Aware Scheduling within Budget Constraints in IaaS Clouds.

    PubMed

    Thanasias, Vasileios; Lee, Choonhwa; Hanif, Muhammad; Kim, Eunsam; Helal, Sumi

    2016-01-01

    Recently, cloud computing has drawn significant attention from both industry and academia, bringing unprecedented changes to computing and information technology. The Infrastructure-as-a-Service (IaaS) model offers new abilities such as the elastic provisioning and relinquishing of computing resources in response to workload fluctuations. However, because the demand for resources dynamically changes over time, provisioning resources so that a given budget is efficiently utilized while maintaining sufficient performance remains a key challenge. This paper addresses the problem of task scheduling and resource provisioning for a set of tasks running on IaaS clouds; it presents novel provisioning and scheduling algorithms capable of executing tasks within a given budget, while minimizing the slowdown due to the budget constraint. Our simulation study demonstrates a substantial reduction of up to 70% in the overall task slowdown rate by the proposed algorithms.

  15. Application of Cloud Computing at KTU: MS Live@Edu Case

    ERIC Educational Resources Information Center

    Miseviciene, Regina; Budnikas, Germanas; Ambraziene, Danute

    2011-01-01

    Cloud computing is a significant alternative in today's educational perspective. The technology gives students and teachers the opportunity to quickly access various application platforms and resources through web pages on demand. Unfortunately, not all educational institutions are able to take full advantage of the newest…

  16. Toward an Understanding of Incidental Input Enhancement in Computerized L2 Environments

    ERIC Educational Resources Information Center

    Gascoigne, Carolyn

    2006-01-01

    Computers, computer programs, and other novel and vivid technological applications to language learning can unintentionally redirect attentional resources and therefore increase the salience of unplanned as well as targeted features. Incidental activities such as keyboarding (Henry, 1992), manipulation of a mouse (Meunier, 1996), and other…

  17. "TIS": An Intelligent Gateway Computer for Information and Modeling Networks. Overview.

    ERIC Educational Resources Information Center

    Hampel, Viktor E.; And Others

    TIS (Technology Information System) is being used at the Lawrence Livermore National Laboratory (LLNL) to develop software for Intelligent Gateway Computers (IGC) suitable for the prototyping of advanced, integrated information networks. Dedicated to information management, TIS leads the user to available information resources, on TIS or…

  18. An Integrated Evaluation Method for E-Learning: A Case Study

    ERIC Educational Resources Information Center

    Rentroia-Bonito, M. A.; Figueiredo, F.; Martins, A.; Jorge, J. A.; Ghaoui, C.

    2006-01-01

    Technological improvements in broadband and distributed computing are making it possible to distribute live media content cost-effectively. Because of this, organizations are looking into cost-effective approaches to implement e-Learning initiatives. Indeed, computing resources are not enough by themselves to promote better e-Learning experiences.…

  19. Education and information for practicing school nurses: which technology-supported resources meet their needs?

    PubMed

    Anderson, Lori S; Enge, Karmin J

    2012-10-01

    School nurses care for children with a variety of health-related conditions, and they need information about managing these conditions that is accessible, current, and useful. The goal of this literature review was to gather and synthesize information on technology-supported resources and to determine which met the educational needs of school nurses. Successful online educational programs were interactive and self-directed. The most common barriers were lack of time to find educational information; lack of knowledge about computers, technology, the Internet and specific programs; and lack of administrative support from school officials to use technology to access information and evidence for practice. Recommendations for the successful use of technology to meet practicing school nurses' educational needs are offered.

  20. Generic Divide and Conquer Internet-Based Computing

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J. (Technical Monitor); Radenski, Atanas

    2003-01-01

    The growth of Internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of Peer-to-Peer (P2P) software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this project is to achieve a better understanding of the transition to Internet-based high-performance computing and to develop solutions for some of the technical challenges of this transition. In particular, we are interested in creating long-term motivation for end users to provide their idle processor time to support computationally intensive tasks. We believe that a practical P2P architecture should provide useful service to both clients with high-performance computing needs and contributors of lower-end computing resources. To achieve this, we are designing a dual-service architecture for P2P high-performance divide-and-conquer computing; we are also experimenting with a prototype implementation. Our proposed architecture incorporates a master server, utilizes dual satellite servers, and operates on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. A dual satellite server comprises a high-performance computing engine and a lower-end contributor service engine. The computing engine provides generic support for divide-and-conquer computations. The service engine is intended to provide free useful HTTP-based services to contributors of lower-end computing resources. Our proposed architecture is complementary to and accessible from computational grids, such as Globus, Legion, and Condor.
Grids provide remote access to existing higher-end computing resources; in contrast, our goal is to utilize idle processor time of lower-end Internet nodes. Our project is focused on a generic divide and conquer paradigm and on mobile applications of this paradigm that can operate on a loose and ever changing pool of lower-end Internet nodes.
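    The generic divide-and-conquer paradigm described above can be expressed as a higher-order driver function; in the proposed P2P architecture each recursive call could in principle be shipped to a volunteer node instead of running locally. This is an illustrative Python sketch, not the project's actual implementation.

```python
def divide_and_conquer(problem, trivial, solve, split, merge):
    """Generic divide-and-conquer driver: if the problem is trivial,
    solve it directly; otherwise split it, recurse on each part, and
    merge the partial results."""
    if trivial(problem):
        return solve(problem)
    parts = split(problem)
    return merge([divide_and_conquer(p, trivial, solve, split, merge)
                  for p in parts])

# Example instance: merge sort expressed in the generic frame.
result = divide_and_conquer(
    [5, 3, 8, 1, 2],
    trivial=lambda xs: len(xs) <= 1,
    solve=lambda xs: xs,
    split=lambda xs: (xs[:len(xs) // 2], xs[len(xs) // 2:]),
    merge=lambda parts: sorted(parts[0] + parts[1]),  # simple merge for brevity
)
print(result)  # → [1, 2, 3, 5, 8]
```

    Because the four callbacks fully describe a computation, a mobile divide-and-conquer application reduces to shipping these functions plus a subproblem to a remote node, which is the essence of the generic paradigm the project targets.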

  1. Towards optimizing server performance in an educational MMORPG for teaching computer programming

    NASA Astrophysics Data System (ADS)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2013-10-01

    Web-based games have become significantly popular during the last few years. This is due to the gradual increase of Internet speed, which has led to ongoing multiplayer game development and, more importantly, the emergence of the Massive Multiplayer Online Role Playing Games (MMORPG) field. In parallel, similar technologies called educational games have started to be developed for use in various educational contexts, resulting in the field of Game-Based Learning. However, these technologies require significant amounts of resources, such as bandwidth, RAM and CPU capacity. These amounts may be even larger in an educational MMORPG that supports computer programming education, due to the usual inclusion of a compiler and the constant client/server data transmissions that occur during program coding, possibly leading to technical issues that could cause malfunctions during learning. Thus, determining the elements that affect the overall load on a game's resources is essential, so that server administrators can configure them and ensure an educational game's proper operation during computer programming education. In this paper, we propose a new methodology for monitoring and optimizing load balancing, so that the resources essential for the creation and proper execution of an educational MMORPG for computer programming can be foreseen and provisioned without overloading the system.
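    As a toy illustration of how such load elements might be combined into an estimate, the Python sketch below derives bandwidth and compile-related CPU demand from a handful of parameters. All names and coefficients are hypothetical and are not drawn from the paper's methodology.

```python
def server_load(players, updates_per_sec, bytes_per_update,
                compile_jobs_per_min, cpu_ms_per_compile):
    """Toy load model for an educational MMORPG server that also
    compiles student code. Returns estimated network bandwidth (Mbps)
    and the average CPU cores consumed by compilation."""
    # State-update traffic: players x update rate x payload, in bits.
    bandwidth_bps = players * updates_per_sec * bytes_per_update * 8
    # Compiler demand: jobs per second x CPU-seconds per job.
    cpu_cores = (compile_jobs_per_min / 60.0) * (cpu_ms_per_compile / 1000.0)
    return {"bandwidth_mbps": bandwidth_bps / 1e6,
            "compile_cpu_cores": cpu_cores}

# 200 players, 10 updates/s of 256 bytes, 120 compiles/min at 500 ms each:
print(server_load(200, 10, 256, 120, 500))
# → {'bandwidth_mbps': 4.096, 'compile_cpu_cores': 1.0}
```

    Even this crude model shows why a compiler changes the sizing picture: the compile workload scales with coding activity rather than with the update rate, so the two must be capacity-planned separately.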

  2. Requirements for fault-tolerant factoring on an atom-optics quantum computer.

    PubMed

    Devitt, Simon J; Stephens, Ashley M; Munro, William J; Nemoto, Kae

    2013-01-01

    Quantum information processing and its associated technologies have reached a pivotal stage in their development, with many experiments having established the basic building blocks. Moving forward, the challenge is to scale up to larger machines capable of performing computational tasks not possible today. This raises questions that need to be urgently addressed, such as what resources these machines will consume and how large they will be. Here we estimate the resources required to execute Shor's factoring algorithm on an atom-optics quantum computer architecture. We determine the runtime and size of the computer as a function of the problem size and physical error rate. Our results suggest that once the physical error rate is low enough to allow quantum error correction, optimization to reduce resources and increase performance will come mostly from integrating algorithms and circuits within the error correction environment, rather than from improving the physical hardware.

  3. Information technology challenges of biodiversity and ecosystems informatics

    USGS Publications Warehouse

    Schnase, J.L.; Cushing, J.; Frame, M.; Frondorf, A.; Landis, E.; Maier, D.; Silberschatz, A.

    2003-01-01

    Computer scientists, biologists, and natural resource managers recently met to examine the prospects for advancing computer science and information technology research by focusing on the complex and often-unique challenges found in the biodiversity and ecosystem domain. The workshop and its final report reveal that the biodiversity and ecosystem sciences are fundamentally information sciences and often address problems having distinctive attributes of scale and socio-technical complexity. The paper provides an overview of the emerging field of biodiversity and ecosystem informatics and demonstrates how the demands of biodiversity and ecosystem research can advance our understanding and use of information technologies.

  4. Challenges and opportunities of cloud computing for atmospheric sciences

    NASA Astrophysics Data System (ADS)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from the need to access a large cyberinfrastructure to fund or perform a research project. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are the computational cost of, and the uncertainty in, meteorological forecasting and climate projections. Both problems are closely related: uncertainty can usually be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results from applying cloud computing resources to climate modeling, using the cloud infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, from the point of view of both operational use and research.

  5. Virtual Learning: Examination of ICT as Beneficial Learning Tool for Children's Social Development

    ERIC Educational Resources Information Center

    Theodotou, Evgenia; Kaitsa-Kulovana, Helena

    2012-01-01

    Nowadays, technology is advancing on a daily basis and more resources are available for educational purposes. However, there are concerns regarding the negative effects it can have on children's development. This research investigates the impact of technology, particularly computers, on children's social behaviour. There is a considerable amount of…

  6. Designing the Very Small: Micro and Nanotechnology. Resources in Technology.

    ERIC Educational Resources Information Center

    Jacobs, James A.

    1996-01-01

    This learning activity is designed to increase knowledge of materials science; engineering; and technology design and the manufacture of the very small devices used in watches, computers, and calculators. It looks at possible innovations to come from micro- and nanotechnology. Includes a student quiz. (Author/JOW)

  7. Overcoming the Grammar Deficit: The Role of Information Technology in Teaching German Grammar to Undergraduates.

    ERIC Educational Resources Information Center

    Hall, Christopher

    1998-01-01

    Examines how application of computer-assisted language learning (CALL) and information technology can be used to overcome "grammar deficit" seen in many British undergraduate German students. A combination of explicit, implicit, and exploratory grammar teaching approaches uses diverse resources, including word processing packages,…

  8. Bringing Tomorrow's Technology to You Today: School Board of Tomorrow Resource Guide.

    ERIC Educational Resources Information Center

    National School Boards Association, Alexandria, VA.

    The National School Boards Association (NSBA), the National School Boards Foundation, NSBA's Institute for the Transfer of Technology to Education, and Apple Computer, Inc., launched "The School Board of Tomorrow Exhibit" at NSBA's 1996 annual conference and exposition in Orlando, Florida. This handbook summarizes the communication technologies…

  9. Lifelong Learning: Skills and Online Resources

    ERIC Educational Resources Information Center

    Lim, Russell F.; Hsiung, Bob C.; Hales, Deborah J.

    2006-01-01

    Objective: Advances in information technology enable the practicing psychiatrist's quest to keep up-to-date with new discoveries in psychiatry, as well as to meet recertification requirements. However, physicians' computer skills do not always keep up with technology, nor do they take advantage of online search and continuing education services.…

  10. Communication and collaboration technologies.

    PubMed

    Cheeseman, Susan E

    2012-01-01

    This is the third in a series of columns exploring health information technology (HIT) in the neonatal intensive care unit (NICU). The first column provided background information on the implementation of information technology throughout the health care delivery system, as well as the requisite informatics competencies needed for nurses to fully engage in the digital era of health care. The second column focused on information and resources to master basic computer competencies described by the TIGER initiative (Technology Informatics Guiding Education Reform) as learning about computers, computer networks, and the transfer of data.1 This column will provide additional information related to basic computer competencies, focusing on communication and collaboration technologies. Computers and the Internet have transformed the way we communicate and collaborate. Electronic communication is the ability to exchange information through the use of computer equipment and software.2 Broadly defined, any technology that facilitates linking one or more individuals together is a collaborative tool. Collaboration using technology encompasses an extensive range of applications that enable groups of individuals to work together including e-mail, instant messaging (IM), and several web applications collectively referred to as Web 2.0 technologies. The term Web 2.0 refers to web applications where users interact and collaborate with each other in a collective exchange of ideas generating content in a virtual community. Examples of Web 2.0 technologies include social networking sites, blogs, wikis, video sharing sites, and mashups. Many organizations are developing collaborative strategies and tools for employees to connect and interact using web-based social media technologies.3

  11. Scheduling quality of precise form sets which consist of tasks of circular type in GRID systems

    NASA Astrophysics Data System (ADS)

    Saak, A. E.; Kureichik, V. V.; Kravchenko, Y. A.

    2018-05-01

    Users’ demand for computing power and the rise of technology favour the arrival of Grid systems. The quality of a Grid system’s performance depends on the scheduling of computing and time resources. In Grid systems with a centralized scheduling structure, the system and the users’ tasks are modeled by a resource quadrant and resource rectangles, respectively. A non-Euclidean heuristic measure, which takes into consideration both the area and the form of an occupied resource region, is used to estimate the scheduling quality of heuristic algorithms. The authors use sets induced by the elements of a squared square as an example for studying the adaptability of a level polynomial algorithm with an excess and of one with minimal deviation.
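    To make the idea of a measure that weighs both the area and the form of the occupied resource region concrete, here is a simple sketch. It is an illustration only — not the paper's actual non-Euclidean measure — and all names and the penalty term are invented for the example:

```python
def occupied_region_measure(rects):
    """Illustrative scheduling-quality measure (not the paper's exact one):
    the area of the bounding resource region of all scheduled tasks, plus a
    penalty for how far that region deviates from a square.

    rects: list of (x, y, width, height) resource rectangles, where x is the
    start time, y the first processor, width the duration, and height the
    number of processors a task occupies.
    """
    max_x = max(x + w for x, y, w, h in rects)   # latest finish time
    max_y = max(y + h for x, y, w, h in rects)   # highest processor used
    area = max_x * max_y                         # occupied-region area
    form_penalty = abs(max_x - max_y)            # deviation from square form
    return area + form_penalty

# Two schedules of the same four unit tasks: a compact 2x2 packing scores
# better than a flat 4x1 strip, even though both cover the same total area.
square = [(0, 0, 1, 1), (1, 0, 1, 1), (0, 1, 1, 1), (1, 1, 1, 1)]
strip = [(0, 0, 1, 1), (1, 0, 1, 1), (2, 0, 1, 1), (3, 0, 1, 1)]
print(occupied_region_measure(square))  # 4 + 0 = 4
print(occupied_region_measure(strip))   # 4 + 3 = 7
```

    The example shows why form matters: both schedules use four resource units, but the strip stretches the occupied region, which a pure area measure would not detect.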

  12. Curriculum and Resources: Computer Provision in a CTC.

    ERIC Educational Resources Information Center

    Denholm, Lawrence

    The program for City Technical Colleges (CTCs) draws on ideas and resources from government, private industry, and education to focus on the educational needs of inner city and urban children. Mathematics, science, and technology are at the center of the CTCs' mission, in a context which includes economic awareness and a commitment to enterprise…

  13. A Resource Center for the Stimulation of Post Secondary Education Innovation via Computer Network.

    ERIC Educational Resources Information Center

    Savin, William

    The goal of the project described here was to improve the quality of postsecondary education by offering institutions of higher learning information on currently funded educational projects through an interactive database, the Educational Resources Directory (ERD), which contains information on new methods, curricula, and educational technology.…

  14. Redefining the Digital Divide: Beyond Access to Computers and the Internet

    ERIC Educational Resources Information Center

    Valadez, James R.; Duran, Richard

    2007-01-01

    This study critiqued the notion that a binary "digital divide" between high and low resource schools describes accurately the technology disparity in U.S society. In this study, we surveyed teachers from six southern California schools. Five of the schools were low resource schools and one school, chosen for comparative purposes, was…

  15. Explorationists and dinosaurs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    French, W.S.

    1993-02-01

    The exploration industry is changing, exploration technology is changing and the explorationist's job is changing. Resource companies are diversifying internationally and their central organizations are providing advisors rather than services. As a result, the relationship between the resource company and the contractor is changing. Resource companies are promoting standards so that all contract services in all parts of the world will look the same to their advisors. Contractors, for competitive reasons, want to look "different" from other contractors. The resource companies must encourage competition between contractors to insure the availability of new technology but must also resist the current trend of burdening the contractor with more and more of the risk involved in exploration. It is becoming more and more obvious that geophysical expenditures represent the best "value added" expenditures in exploration and development budgets. As a result, seismic-related contractors represent the growth component of our industry. The predominant growth is in 3-D seismic technology, and this growth is being further propelled by the computational power of the new generation of massively parallel computers and by recent advances in computer graphic techniques. Interpretation of seismic data involves the analysis of wavelet shapes and amplitudes prior to stacking the data. Thus, modern interpretation involves understanding compressional waves, shear waves, and propagating modes which create noise and interference. Modern interpretation and processing are carried out simultaneously, iteratively, and interactively and involve many physics-related concepts. These concepts are not merely tools for the interpretation, they are the interpretation. Explorationists who do not recognize this fact are going the way of the dinosaurs.

  16. Computing through Scientific Abstractions in SysBioPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Stephan, Eric G.; Gracio, Deborah K.

    2004-10-13

    Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist's ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.

  17. Energy Consumption Management of Virtual Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Li, Lin

    2017-11-01

    For research on energy consumption management of virtual cloud computing platforms, the energy consumption management of virtual machines and of the cloud computing platform itself must be understood more deeply; only then can the problems facing energy consumption management be solved. The key to such solutions lies in data centers with high energy consumption, so new scientific techniques are greatly needed. Virtualization technology and cloud computing have become powerful tools in daily life, work and production because of their strength and many advantages, and both are now developing rapidly, achieving very high resource utilization rates. The presence of virtualization and cloud computing technologies is therefore necessary in the constantly developing information age. This paper summarizes, explains and further analyzes the energy consumption management questions of the virtual cloud computing platform. It ultimately gives people a clearer understanding of energy consumption management on such platforms and brings more help to various aspects of people's lives, work and so on.

  18. Grid computing technology for hydrological applications

    NASA Astrophysics Data System (ADS)

    Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.

    2011-06-01

    Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, either oriented to tackle scientific challenges or complex problem solving in engineering, are expected to converge together into the so-called knowledge infrastructures, leading to a more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm highlights several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer to strong requirements from the Civil Society at large, relatively to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological survey, by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access right, and standardization.

  19. Volunteer Clouds and Citizen Cyberscience for LHC Physics

    NASA Astrophysics Data System (ADS)

    Aguado Sanchez, Carlos; Blomer, Jakob; Buncic, Predrag; Chen, Gang; Ellis, John; Garcia Quintas, David; Harutyunyan, Artem; Grey, Francois; Lombrana Gonzalez, Daniel; Marquina, Miguel; Mato, Pere; Rantala, Jarno; Schulz, Holger; Segal, Ben; Sharma, Archana; Skands, Peter; Weir, David; Wu, Jie; Wu, Wenjing; Yadav, Rohit

    2011-12-01

    Computing for the LHC, and for HEP more generally, is traditionally viewed as requiring specialized infrastructure and software environments, and therefore not compatible with the recent trend in "volunteer computing", where volunteers supply free processing time on ordinary PCs and laptops via standard Internet connections. In this paper, we demonstrate that with the use of virtual machine technology, at least some standard LHC computing tasks can be tackled with volunteer computing resources. Specifically, by presenting volunteer computing resources to HEP scientists as a "volunteer cloud", essentially identical to a Grid or dedicated cluster from a job submission perspective, LHC simulations can be processed effectively. This article outlines both the technical steps required for such a solution and the implications for LHC computing as well as for LHC public outreach and for participation by scientists from developing regions in LHC research.

  20. Law of Large Numbers: the Theory, Applications and Technology-based Education

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas; Gould, Robert

    2011-01-01

    Modern approaches for technology-based blended education utilize a variety of recently developed novel pedagogical, computational and network resources. Such attempts employ technology to deliver integrated, dynamically-linked, interactive-content and heterogeneous learning environments, which may improve student comprehension and information retention. In this paper, we describe one such innovative effort of using technological tools to expose students in probability and statistics courses to the theory, practice and usability of the Law of Large Numbers (LLN). We base our approach on integrating pedagogical instruments with the computational libraries developed by the Statistics Online Computational Resource (www.SOCR.ucla.edu). To achieve this merger we designed a new interactive Java applet and a corresponding demonstration activity that illustrate the concept and the applications of the LLN. The LLN applet and activity have common goals – to provide graphical representation of the LLN principle, build lasting student intuition and present the common misconceptions about the law of large numbers. Both the SOCR LLN applet and activity are freely available online to the community to test, validate and extend (Applet: http://socr.ucla.edu/htmls/exp/Coin_Toss_LLN_Experiment.html, and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_LLN). PMID:21603584
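    The LLN behaviour that the SOCR coin-toss applet demonstrates can be reproduced in a few lines. This standalone simulation (function and variable names are ours, not SOCR's) shows the running sample mean of fair coin flips settling near the true probability 0.5 as the number of trials grows:

```python
import random


def running_means(n_flips, p=0.5, seed=42):
    """Simulate n_flips Bernoulli(p) coin tosses and return the list of
    running sample means after 1, 2, ..., n_flips trials."""
    rng = random.Random(seed)   # fixed seed so the run is reproducible
    total = 0
    means = []
    for i in range(1, n_flips + 1):
        total += 1 if rng.random() < p else 0
        means.append(total / i)
    return means


means = running_means(100_000)
# By the LLN the sample mean converges to p = 0.5; early means fluctuate
# widely, while the final mean sits close to the true probability.
print(means[9], means[-1])
```

    Plotting `means` against the trial index reproduces the familiar LLN picture: large swings for small samples that damp out as the sample grows — and also illustrates the common misconception the activity targets, since nothing forces heads and tails to "balance out" in any short stretch.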

  1. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    PubMed Central

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing. PMID:21878105

  2. Results of an Experimental Program to Provide Low Cost Computer Searches of the NASA Information File to University Graduate Students in the Southeast. Final Report.

    ERIC Educational Resources Information Center

    Smetana, Frederick O.; Phillips, Dennis M.

    In an effort to increase dissemination of scientific and technological information, a program was undertaken whereby graduate students in science and engineering could request a computer-produced bibliography and/or abstracts of documents identified by the computer. The principal resource was the National Aeronautics and Space Administration…

  3. Effectiveness of Kanban Approaches in Systems Engineering within Rapid Response Environments

    DTIC Science & Technology

    2012-01-01

    Fragments (OCR) of a Procedia Computer Science (2012) record: "New Challenges in Systems…"; "…inefficient use of resources. The move from 'one step to glory' system initiatives to…"; "Effectiveness of kanban approaches in systems engineering within rapid response environments", Richard Turner.

  4. Training and Personnel Systems Technology R&D Program Description FY 1988/1989. Revision

    DTIC Science & Technology

    1988-05-20

    scenario software/database, and computer generated imagery (CIG) subsystem resources; (d) investigation of feasibility of, and preparation of plans… computer language to Army flight simulator for demonstration and evaluation. The objective is to have flight simulators which use the same software as… the Automated Performance and Readiness Training System (APARTS), which is a computer software system which facilitates training management through

  5. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

    NASA Technical Reports Server (NTRS)

    Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

    2013-01-01

    Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.

  6. Utility Computing: Reality and Beyond

    NASA Astrophysics Data System (ADS)

    Ivanov, Ivan I.

    Utility Computing is not a new concept. It involves organizing and providing a wide range of computing-related services as public utilities. The concept of computing as a public utility, much like water, gas, electricity and telecommunications, was announced in 1955, and Utility Computing remained a concept for nearly 50 years. Now some models and forms of Utility Computing are emerging, such as storage and server virtualization, grid computing, and automated provisioning. Recent trends in Utility Computing as a complex technology involve business procedures that could profoundly transform the nature of companies' IT services, organizational IT strategies and technology infrastructure, and business models. In the ultimate Utility Computing models, organizations will be able to acquire as many IT services as they need, whenever and wherever they need them. Based on networked businesses and new secure online applications, Utility Computing would facilitate "agility-integration" of IT resources and services within and between virtual companies. With the application of Utility Computing there could be concealment of the complexity of IT, reduction of operational expenses, and conversion of IT costs to variable 'on-demand' services. How far should technology, business and society go in adopting Utility Computing forms, modes and models?

  7. Cloud computing in medical imaging.

    PubMed

    Kagadis, George C; Kloukinas, Christos; Moore, Kevin; Philbin, Jim; Papadimitroulas, Panagiotis; Alexakos, Christos; Nagy, Paul G; Visvikis, Dimitris; Hendee, William R

    2013-07-01

    Over the past century technology has played a decisive role in defining, driving, and reinventing procedures, devices, and pharmaceuticals in healthcare. Cloud computing has been introduced only recently but is already one of the major topics of discussion in research and clinical settings. The provision of extensive, easily accessible, and reconfigurable resources such as virtual systems, platforms, and applications with low service cost has caught the attention of many researchers and clinicians. Healthcare researchers are moving their efforts to the cloud, because they need adequate resources to process, store, exchange, and use large quantities of medical data. This Vision 20/20 paper addresses major questions related to the applicability of advanced cloud computing in medical imaging. The paper also considers security and ethical issues that accompany cloud computing.

  8. Computer Applications in Class and Transportation Scheduling. Educational Management Review Series Number 1.

    ERIC Educational Resources Information Center

    Piele, Philip K.

    This document shows how computer technology can aid educators in meeting demands for improved class scheduling and more efficient use of transportation resources. The first section surveys literature on operational systems that provide individualized scheduling for students, varied class structures, and maximum use of space and staff skills.…

  9. Technological Imperatives: Using Computers in Academic Debate.

    ERIC Educational Resources Information Center

    Ticku, Ravinder; Phelps, Greg

    Intended for forensic educators and debate teams, this document details how one university debate team, at the University of Iowa, makes use of computer resources on campus to facilitate storage and retrieval of information useful to debaters. The introduction notes the problem of storing and retrieving the amount of information required by debate…

  10. A Survey of Knowledge Management Skills Acquisition in an Online Team-Based Distributed Computing Course

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.

    2007-01-01

    This paper investigates students' perceptions of their acquisition of knowledge management skills, namely thinking and team-building skills, resulting from the integration of various resources and technologies into an entirely team-based, online upper level distributed computing (DC) information systems (IS) course. Results seem to indicate that…

  11. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US through the NSF-funded cyber infrastructure and the European Union through the ICT Calls aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge".1 The Grid (Foster, 2002; Foster; Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  12. Incorporating computational resources in a cancer research program

    PubMed Central

    Woods, Nicholas T.; Jhuraney, Ankita; Monteiro, Alvaro N.A.

    2015-01-01

    Recent technological advances have transformed cancer genetics research. These advances have served as the basis for the generation of a number of richly annotated datasets relevant to the cancer geneticist. In addition, many of these technologies are now within reach of smaller laboratories to answer specific biological questions. Thus, one of the most pressing issues facing an experimental cancer biology research program in genetics is incorporating data from multiple sources to annotate, visualize, and analyze the system under study. Fortunately, there are several computational resources to aid in this process. However, a significant effort is required to adapt a molecular biology-based research program to take advantage of these datasets. Here, we discuss the lessons learned in our laboratory and share several recommendations to make this transition effectively. This article is not meant to be a comprehensive evaluation of all the available resources, but rather highlight those that we have incorporated into our laboratory and how to choose the most appropriate ones for your research program. PMID:25324189

  13. "Computer as Data Gatherer" for a New Generation: Martorella's Predictions, the Past, the Present, and the Future of Technology in Social Studies

    ERIC Educational Resources Information Center

    Friedman, Adam

    2014-01-01

    In his 1997 article "Technology and the Social Studies--or: Which Way to the Sleeping Giant?" Peter Martorella made several predictions regarding technology resources in the social studies. Through a 2014 lens, Martorella's Internet seems archaic, yet two of his predictions were particularly poignant and have had a significant impact on…

  14. Pennsylvania In-Service Technology Education: Past, Present and Future. A Status Report of the Regional Computer Resource Center at Temple University.

    ERIC Educational Resources Information Center

    Robertson, Elton

    In 1984, the Commonwealth of Pennsylvania passed the Information Technology Education Act, which created 14 Information Technology Education for the Commonwealth (ITEC) centers. The purpose of the ITEC centers was to assist teachers in using and improving their use of microcomputers in their own classrooms by training them in instructional uses of…

  15. JPRS Report, Science & Technology, USSR: Science & Technology Policy

    DTIC Science & Technology

    1988-09-23

    number of library personnel for preparing survey-analytical references, but by equipping them with modern computer hardware for acquiring information...of manpower, material, technical, and financial resources and limits of capital investments and planning, surveying, and contractual work, which...USSR State Prize for the development and introduction of a technology of the production of shampoo from fish protein. During the period under review

  16. Computer network access to scientific information systems for minority universities

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has led to the establishment of a massive networking infrastructure which interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we will briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty, students, and NASA scientists.

  17. The application of the large particles method of numerical modeling of the process of carbonic nanostructures synthesis in plasma

    NASA Astrophysics Data System (ADS)

    Abramov, G. V.; Gavrilov, A. N.

    2018-03-01

    The article deals with the numerical solution of a mathematical model of particle motion and interaction in a multicomponent plasma, taking the electric arc synthesis of carbon nanostructures as an example. The large number of particles and of their interactions requires significant machine resources and computation time. Applying the large particles (macro-particle) method reduces the amount of computation and the hardware requirements without affecting the accuracy of the numerical calculations. GPGPU parallel computing with Nvidia CUDA technology allows the general-purpose computation to be organized on the graphics card's processor. A comparative analysis of different approaches to parallelizing the computation was carried out to speed up the calculations, leading to the choice of an algorithm that uses shared memory while preserving the accuracy of the solution. A numerical study of the influence of the particle density within a macro-particle on the motion parameters and on the total number of particle collisions in the plasma has been carried out for different synthesis modes. A rational range of the coherence coefficient (the number of particles per macro-particle) is computed.
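    The payoff of the macro-particle coarsening described above can be sketched numerically. This is a toy illustration, not the authors' model: grouping k real particles into one macro-particle, with charge and mass scaled by k so the fields they produce are unchanged, cuts the pairwise interaction count roughly by a factor of k².

    ```python
    def pair_count(n):
        """Number of pairwise interactions for n simulated particles."""
        return n * (n - 1) // 2

    def coarsen(n_real, k):
        """Group k real particles into one macro-particle (charge and mass
        are scaled by k so the macroscopic fields stay the same)."""
        return n_real // k

    n_real = 10_000          # real plasma particles
    k = 100                  # particles per macro-particle (coherence coefficient)
    n_macro = coarsen(n_real, k)

    print(pair_count(n_real))   # 49995000 interactions per step
    print(pair_count(n_macro))  # 4950 -- a ~10^4 reduction
    ```

    The choice of k is the accuracy/cost trade-off the study quantifies: too large a macro-particle distorts the collision statistics, too small a one forfeits the speed-up.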

  18. Grid computing enhances standards-compatible geospatial catalogue service

    NASA Astrophysics Data System (ADS)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkits are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. 
This service eases the sharing and interoperation of geospatial resources by using Grid technology, and it extends Grid technology into the geoscience communities.

  19. Transformation of OODT CAS to Perform Larger Tasks

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris; Freeborn, Dana; Crichton, Daniel; Hughes, John; Ramirez, Paul; Hardman, Sean; Woollard, David; Kelly, Sean

    2008-01-01

    A computer program denoted OODT CAS has been transformed to enable performance of larger tasks that involve greatly increased data volumes and increasingly intensive processing of data on heterogeneous, geographically dispersed computers. Prior to the transformation, OODT CAS (also alternatively denoted, simply, 'CAS') [wherein 'OODT' signifies 'Object-Oriented Data Technology' and 'CAS' signifies 'Catalog and Archive Service'] was a proven software component used to manage scientific data from spaceflight missions. In the transformation, CAS was split into two separate components representing its canonical capabilities: file management and workflow management. In addition, CAS was augmented by addition of a resource-management component. This third component enables CAS to manage heterogeneous computing by use of diverse resources, including high-performance clusters of computers, commodity computing hardware, and grid computing infrastructures. CAS is now more easily maintainable, evolvable, and reusable. These components can be used separately or, taking advantage of synergies, can be used together. Other elements of the transformation included addition of a separate Web presentation layer that supports distribution of data products via Really Simple Syndication (RSS) feeds, and provision for full Resource Description Framework (RDF) exports of metadata.
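    The three-way split can be illustrated with a deliberately simplified sketch. The class and method names below are hypothetical and do not reflect the actual OODT CAS APIs; they only illustrate the separation of file, workflow, and resource management concerns and how the components cooperate when used together.

    ```python
    # Hypothetical sketch of the three decoupled CAS-style components.
    class FileManager:
        """Catalogs and archives data products."""
        def __init__(self):
            self.catalog = {}
        def ingest(self, product_id, metadata):
            self.catalog[product_id] = metadata

    class ResourceManager:
        """Assigns jobs to heterogeneous compute resources."""
        def __init__(self, nodes):
            self.nodes = nodes
        def select_node(self):
            # trivial placement policy: pick the least-loaded node
            return min(self.nodes, key=lambda n: n["load"])

    class WorkflowManager:
        """Runs processing steps, delegating placement to the ResourceManager."""
        def __init__(self, files, resources):
            self.files, self.resources = files, resources
        def run(self, product_id, step):
            node = self.resources.select_node()
            node["load"] += 1
            self.files.ingest(product_id, {"step": step, "node": node["name"]})
            return node["name"]

    fm = FileManager()
    rm = ResourceManager([{"name": "cluster", "load": 2}, {"name": "grid", "load": 0}])
    wf = WorkflowManager(fm, rm)
    print(wf.run("L2-granule-001", "calibrate"))  # -> grid
    ```

    Because each component owns one concern, any of the three can also be used on its own, which is the maintainability and reuse benefit the transformation aimed for.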

  20. Practising Arithmetic Using Educational Video Games with an Interpersonal Computer

    ERIC Educational Resources Information Center

    Beserra, Vagner; Nussbaum, Miguel; Zeni, Ricardo; Rodriguez, Werner; Wurman, Gabriel

    2014-01-01

    Studies show the positive effects that video games can have on student performance and attitude towards learning. In the past few years, strategies have been generated to optimize the use of technological resources with the aim of facilitating widespread adoption of technology in the classroom. Given its low acquisition and maintenance costs, the…

  1. Beliefs about Using Technology in the Mathematics Classroom: Interviews with Pre-Service Elementary Teachers

    ERIC Educational Resources Information Center

    Lin, Cheng-Yao

    2008-01-01

    This study explored the efficacy of web-based workshops on topics in elementary school mathematics in fostering teachers' confidence and competence in using instructional technology, and thereby promoting more positive attitudes toward using computers and Internet resources in the mathematics classroom. It consisted of in-depth interviews of…

  2. Multi-Media and Technology Tools: Curriculum and Activities for Idaho Business Teachers.

    ERIC Educational Resources Information Center

    Yopp, Marty; Kitchel, K. Allen; Allen, Tacey

    This guide contains information, curriculum, and activities that provide business teachers with a tool for using the World Wide Web, multimedia, and technology to enhance their programs. The opening sections contain the following: computer use policy, multimedia fact sheet, tips on using Netscape Navigator, directory of educational resources on…

  3. The New Technologies in Mathematics: A Personal History of 30 Years

    ERIC Educational Resources Information Center

    de la Villa, Agustín; García, Alfonsa; García, Francisco; Rodríguez, Gerardo

    2017-01-01

    A personal overview about the use of new technologies for teaching and learning mathematics is given in this paper. We analyse the introduction of Computer Algebra Systems for learning purposes, reviewing different frameworks and didactical resources, some of them generated according to the philosophy of the European Area of Higher Education.…

  4. Academic Honesty through Technology

    ERIC Educational Resources Information Center

    Lecher, Mark

    2005-01-01

    Over the past two decades, technology use has increased in the classroom. What started out as a single computer in a classroom has evolved into a laptop or handheld for every student, with a wireless connection to the Internet and other network resources. Cell phones, PDAs, and other electronic tools have opened up new horizons for utilizing…

  5. Developing and Deploying Multihop Wireless Networks for Low-Income Communities

    ERIC Educational Resources Information Center

    Camp, Joseph D.; Knightly, Edward W.; Reed, William S.

    2006-01-01

    In most middle- and upper-income homes across the United States, children, youth, and their families have access to the world's information-technology resources at their fingertips, while in low-income communities, access to technology and the opportunities it provides are often limited to brief periods of computer use and Internet access at…

  6. Technological Barriers to Success in Distance Education: The Revolving Door of Online Education

    ERIC Educational Resources Information Center

    Roe, Richard Thomas

    2011-01-01

    Taking online courses has become a delivery mode of choice for many students. This collaborative study focuses on the impact of college readiness, technological resources, and course design on student success in an online introduction to computers distance education course within the Kentucky Community and Technical College System (KCTCS). The…

  7. E-Learning Application of Tarsier with Virtual Reality using Android Platform

    NASA Astrophysics Data System (ADS)

    Oroh, H. N.; Munir, R.; Paseru, D.

    2017-01-01

    The spectral tarsier is a primitive primate found only in the province of North Sulawesi. An existing e-learning application for studying this primate uses Augmented Reality technology, in which a marker held up to a computer's camera lets the user interact with a three-dimensional tarsier object. That application, however, shows the tarsier object alone, without its habitat, and requires considerable resources because it runs on a personal computer. Virtual Reality can likewise display three-dimensional objects, with the added advantage of immersing the user in a virtual world, and on the Android platform it requires fewer resources. Virtual Reality technology on the Android platform was therefore adopted so that users can view and interact not only with the tarsier but also with its habitat. The results of this research indicate that users can learn about the tarsier and its habitat well. Thus, the use of Virtual Reality technology in the e-learning application helps people to see, know, and learn about the spectral tarsier.

  8. VLSI neuroprocessors

    NASA Technical Reports Server (NTRS)

    Kemeny, Sabrina E.

    1994-01-01

    Electronic and optoelectronic hardware implementations of highly parallel computing architectures address several ill-defined and/or computation-intensive problems not easily solved by conventional computing techniques. The concurrent processing architectures developed are derived from a variety of advanced computing paradigms including neural network models, fuzzy logic, and cellular automata. Hardware implementation technologies range from state-of-the-art digital/analog custom-VLSI to advanced optoelectronic devices such as computer-generated holograms and e-beam fabricated Dammann gratings. JPL's concurrent processing devices group has developed a broad technology base in hardware implementable parallel algorithms, low-power and high-speed VLSI designs and building block VLSI chips, leading to application-specific high-performance embeddable processors. Application areas include high throughput map-data classification using feedforward neural networks, terrain based tactical movement planner using cellular automata, resource optimization (weapon-target assignment) using a multidimensional feedback network with lateral inhibition, and classification of rocks using an inner-product scheme on thematic mapper data. In addition to addressing specific functional needs of DOD and NASA, the JPL-developed concurrent processing device technology is also being customized for a variety of commercial applications (in collaboration with industrial partners), and is being transferred to U.S. industries. This viewgraph presentation focuses on two application-specific processors which solve the computation-intensive tasks of resource allocation (weapon-target assignment) and terrain based tactical movement planning using two extremely different topologies. Resource allocation is implemented as an asynchronous analog competitive assignment architecture inspired by the Hopfield network.
Hardware realization leads to a two to four order of magnitude speed-up over conventional techniques and enables multiple assignments (many-to-many) not achievable with standard statistical approaches. Tactical movement planning (finding the best path from A to B) is accomplished with a digital two-dimensional concurrent processor array. By exploiting the natural parallel decomposition of the problem in silicon, a four order of magnitude speed-up over optimized software approaches has been demonstrated.
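    The planner's core idea, expanding a cost wavefront across the terrain grid, can be sketched serially. This is an illustrative reconstruction, not JPL's hardware algorithm: each cell derives its cost from its neighbours, the operation a 2-D processor array performs for all cells in parallel (here emulated with a breadth-first search).

    ```python
    from collections import deque

    def wavefront(grid, start, goal):
        """Grassfire/wavefront expansion over a grid with obstacles.
        Returns the minimum number of steps from start to goal, or None."""
        rows, cols = len(grid), len(grid[0])
        cost = {start: 0}
        q = deque([start])
        while q:
            r, c = q.popleft()
            if (r, c) == goal:
                return cost[(r, c)]
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols \
                        and grid[nr][nc] == 0 and (nr, nc) not in cost:
                    cost[(nr, nc)] = cost[(r, c)] + 1
                    q.append((nr, nc))
        return None  # goal unreachable

    terrain = [[0, 0, 0],
               [1, 1, 0],   # 1 = impassable terrain
               [0, 0, 0]]
    print(wavefront(terrain, (0, 0), (2, 0)))  # -> 6
    ```

    In silicon, every cell updates simultaneously, so the run time scales with the path length rather than with the grid area, which is the source of the reported speed-up.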

  9. 50 CFR 263.53 - Other funds.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... distribute these funds after he or she has made a thorough evaluation of the scientific information submitted... only by existing methods and technology. Any fishery resource used in computing the states' amount...

  10. 50 CFR 263.53 - Other funds.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... distribute these funds after he or she has made a thorough evaluation of the scientific information submitted... only by existing methods and technology. Any fishery resource used in computing the states' amount...

  11. 50 CFR 263.53 - Other funds.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... distribute these funds after he or she has made a thorough evaluation of the scientific information submitted... only by existing methods and technology. Any fishery resource used in computing the states' amount...

  12. 50 CFR 263.53 - Other funds.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... distribute these funds after he or she has made a thorough evaluation of the scientific information submitted... only by existing methods and technology. Any fishery resource used in computing the states' amount...

  13. Physics and Robotic Sensing -- the good, the bad, and approaches to making it work

    NASA Astrophysics Data System (ADS)

    Huff, Brian

    2011-03-01

    All of the technological advances that have benefited consumer electronics have direct application to robotics. Technological advances have resulted in the dramatic reduction in size, cost, and weight of computing systems, while simultaneously doubling computational speed every eighteen months. The same manufacturing advancements that have enabled this rapid increase in computational power are now being leveraged to produce small, powerful and cost-effective sensing technologies applicable for use in mobile robotics applications. Despite the increase in computing and sensing resources available to today's robotic systems developers, there are sensing problems typically found in unstructured environments that continue to frustrate the widespread use of robotics and unmanned systems. This talk presents how physics has contributed to the creation of the technologies that are making modern robotics possible. The talk discusses theoretical approaches to robotic sensing that appear to suffer when they are deployed in the real world. Finally the author presents methods being used to make robotic sensing more robust.

  14. Supporting research sites in resource-limited settings: challenges in implementing information technology infrastructure.

    PubMed

    Whalen, Christopher J; Donnell, Deborah; Tartakovsky, Michael

    2014-01-01

    As information and communication technology infrastructure becomes more reliable, new methods of electronic data capture, data marts/data warehouses, and mobile computing provide platforms for rapid coordination of international research projects and multisite studies. However, despite the increasing availability of Internet connectivity and communication systems in remote regions of the world, there are still significant obstacles. Sites with poor infrastructure face serious challenges participating in modern clinical and basic research, particularly that relying on electronic data capture and Internet communication technologies. This report discusses our experiences in supporting research in resource-limited settings. We describe examples of the practical and ethical/regulatory challenges raised by the use of these newer technologies for data collection in multisite clinical studies.

  15. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  16. Experience and Attitudes towards Information Technology among First-Year Medical Students in Denmark: Longitudinal Questionnaire Survey

    PubMed Central

    2004-01-01

    Background As more and more information technology (IT) resources become available both for support of campus-based medical education and for Web-based learning, it becomes increasingly interesting to map the information technology resources available to medical students and the attitudes students have towards their use. Objective To determine how extensively and effectively information handling skills are being taught in the medical curriculum, the study investigated Internet and computer availability and usage, and attitudes towards information technology among first-year medical students in Aarhus, Denmark, during a five-year period. Methods In the period from 1998 to 2002, students beginning the first semester of medical school were given courses on effective use of IT in their studies. As a part of the tutorials, the students were asked to complete a web-based questionnaire which included questions related to IT readiness and attitudes towards using IT in studies. Results A total of 1159 students (78%) responded. Overall, 71.7% of the respondents indicated that they had access to a computer at home, a number that did not change significantly during the study period. Over time, the power of students' computers and the use of e-mail and Internet did increase significantly. By fall 2002, approximately 90% of students used e-mail regularly, 80% used the Internet regularly, and 60% had access to the Internet from home. Significantly more males than females had access to a computer at home, and males had a more positive attitude towards the use of computers in their medical studies. A fairly constant number of students (3-7%) stated that they would prefer not to have to use computers in their studies. Conclusions Taken together with our experience from classroom teaching, these results indicate that optional teaching of basic information technology still needs to be integrated into medical studies, and that this need does not seem likely to disappear in the near future. 
PMID:15111276

  17. Cloud based intelligent system for delivering health care as a service.

    PubMed

    Kaur, Pankaj Deep; Chana, Inderveer

    2014-01-01

    The promising potential of cloud computing and its convergence with technologies such as mobile computing, wireless networks, and sensor technologies allows for the creation and delivery of newer types of cloud services. In this paper, we advocate the use of cloud computing for the creation and management of cloud-based health care services. As a representative case study, we design a Cloud Based Intelligent Health Care Service (CBIHCS) that performs real-time monitoring of user health data for diagnosis of chronic illness such as diabetes. Advanced body sensor components are utilized to gather user-specific health data and store it in cloud-based storage repositories for subsequent analysis and classification. In addition, infrastructure-level mechanisms are proposed to provide dynamic resource elasticity for CBIHCS. Experimental results demonstrate that classification accuracy of 92.59% is achieved with our prototype system and that the predicted patterns of CPU usage offer better opportunities for adaptive resource elasticity. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
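    As a toy illustration of the kind of screening rule that might sit at the bottom of such a monitoring pipeline (the paper's CBIHCS uses a trained classifier, not a fixed rule; the function name and threshold below are merely illustrative):

    ```python
    def flag_readings(readings, threshold=126):
        """Flag sensor samples at or above a threshold (here modelled on
        fasting glucose in mg/dL); a real system would feed flagged samples
        into a trained classifier rather than stop at this rule."""
        return [r for r in readings if r >= threshold]

    stream = [98, 104, 131, 115, 140]   # simulated body-sensor samples
    print(flag_readings(stream))        # -> [131, 140]
    ```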

  18. U.S. Geological Survey National Computer Technology Meeting; Program and abstracts, May 7-11, 1990

    USGS Publications Warehouse

    Balthrop, B. H.; Baker, E.G.

    1990-01-01

    Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are system administration; distributed information systems and data bases, both current (1990) and proposed; hydrologic applications; national water information systems; and geographic information systems applications and techniques. The report contains some of the abstracts that were presented at the National Computer Technology Meeting that was held in May 1990. The meeting was sponsored by the Water Resources Division and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. (USGS)

  19. Virtual microscopy and digital pathology in training and education.

    PubMed

    Hamilton, Peter W; Wang, Yinhai; McCullough, Stephen J

    2012-04-01

    Traditionally, education and training in pathology has been delivered using textbooks, glass slides and conventional microscopy. Over the last two decades, the number of web-based pathology resources has expanded dramatically with centralized pathological resources being delivered to many students simultaneously. Recently, whole slide imaging technology allows glass slides to be scanned and viewed on a computer screen via dedicated software. This technology is referred to as virtual microscopy and has created enormous opportunities in pathological training and education. Students are able to learn key histopathological skills, e.g. to identify areas of diagnostic relevance from an entire slide, via a web-based computer environment. Students no longer need to be in the same room as the slides. New human-computer interfaces are also being developed using more natural touch technology to enhance the manipulation of digitized slides. Several major initiatives are also underway introducing online competency and diagnostic decision analysis using virtual microscopy and have important future roles in accreditation and recertification. Finally, researchers are investigating how pathological decision-making is achieved using virtual microscopy and modern eye-tracking devices. Virtual microscopy and digital pathology will continue to improve how pathology training and education is delivered. © 2012 The Authors APMIS © 2012 APMIS.

  20. Seismic waveform modeling over cloud

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved huge successes, and obtaining synthetic waveforms through numerical simulation receives an increasing amount of attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve: users are expected to master a considerable amount of computer knowledge and data-processing skill. Training users to use the numerical packages and to correctly access and utilize the computational resources is a difficult task, and access to HPC is a further common obstacle for many users. To address these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while HPC and a dedicated pipeline for it form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering our computational resources over the cloud, the platform lets users customize a simulation at expert level and submit and run the job through it.

  1. On Study of Application of Big Data and Cloud Computing Technology in Smart Campus

    NASA Astrophysics Data System (ADS)

    Tang, Zijiao

    2017-12-01

    We live in an era of networks and information, which means we produce and face a great deal of data every day; however, it is not easy for a database in the traditional sense to store, process, and analyze such mass data well, and so big data was born at the right moment. Meanwhile, the development and operation of big data rest on cloud computing, which provides sufficient space and resources to process and analyze data with big data technology. Nowadays, the proposal of smart campus construction aims at improving the informatization of colleges and universities. It is therefore necessary to consider combining big data technology and cloud computing technology in the construction of the smart campus, so that the campus database system and the campus management system are mutually combined rather than isolated, and so that mass data are integrated, stored, processed, and analyzed in the service of smart campus construction.

  2. The application of cloud computing to scientific workflows: a study of cost and performance.

    PubMed

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.
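    The trade-off such evaluations weigh can be caricatured with a back-of-the-envelope cost model. The rates below are invented placeholders, not the paper's measured prices: on-demand cost is roughly compute time plus data-transfer-out charges, which is why data-heavy, compute-light workflows tend to fare worse on commercial clouds.

    ```python
    def cloud_cost(cpu_hours, gb_out, cpu_rate=0.10, egress_rate=0.09):
        """Toy on-demand cost model (illustrative USD rates): compute
        charges plus data-transfer-out (egress) charges."""
        return cpu_hours * cpu_rate + gb_out * egress_rate

    # Reprocessing a large dataset: 1000 CPU-hours, 500 GB of products out.
    print(round(cloud_cost(1000, 500), 2))   # -> 145.0
    ```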

  3. Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case.

    NASA Astrophysics Data System (ADS)

    Ciaschini, Vincenzo; Dal Pra, Stefano; dell'Agnello, Luca

    2015-12-01

    The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which proved successful and still ensures its goals. However, Grid technology has not spread much over other communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. WLCG experiments aim to integrate their current computing model with cloud deployments and to take advantage of so-called opportunistic resources (including HPC facilities), which are usually not Grid compliant. One feature missing from the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fairshare-based access to the resources for the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility of accessing the Tier-1 computing resources as an OpenStack-based cloud service. The system, exploiting the dynamic partitioning mechanism already used to enable multicore computing, allowed us to avoid a static split of the computing resources in the Tier-1 farm while permitting a share-friendly approach. The hosts in a dynamically partitioned farm may be moved into or out of the partition according to suitable policies for the request and release of computing resources; nodes requested into the partition switch their role and become available to play a different one. In the cloud use case, hosts may switch from acting as worker nodes in the batch-system farm to cloud compute nodes made available to tenants. In this paper we describe the dynamic partitioning concept and its implementation and integration with our current batch system, LSF.
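    A minimal sketch of the demand-driven role switching described above (a toy policy, not CNAF's LSF/OpenStack integration): idle hosts migrate between the batch and cloud partitions until the cloud-side demand is met, while busy hosts stay where they are.

    ```python
    # Toy dynamic-partitioning policy; node fields and names are hypothetical.
    def rebalance(nodes, cloud_demand):
        """Move idle batch nodes to the cloud partition until demand is met,
        and return surplus idle cloud nodes to the batch farm."""
        cloud = [n for n in nodes if n["role"] == "cloud"]
        for node in nodes:
            if len(cloud) < cloud_demand and node["role"] == "batch" and node["idle"]:
                node["role"] = "cloud"
                cloud.append(node)
            elif len(cloud) > cloud_demand and node["role"] == "cloud" and node["idle"]:
                node["role"] = "batch"
                cloud.remove(node)
        return sorted(n["name"] for n in cloud)

    farm = [{"name": "wn01", "role": "batch", "idle": True},
            {"name": "wn02", "role": "batch", "idle": False},   # busy: stays put
            {"name": "wn03", "role": "cloud", "idle": True}]
    print(rebalance(farm, 2))   # -> ['wn01', 'wn03']
    ```

    A real implementation would drain a node's batch jobs before switching its role; the sketch keeps only the share-friendly idea of moving capacity with demand instead of statically splitting the farm.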

  4. Integrated system dynamics toolbox for water resources planning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reno, Marissa Devan; Passell, Howard David; Malczynski, Leonard A.

    2006-12-01

    Public mediated resource planning is quickly becoming the norm rather than the exception. Unfortunately, supporting tools are lacking that interactively engage the public in the decision-making process and integrate over the myriad values that influence water policy. In the pages of this report we document the first steps toward developing a specialized decision framework to meet this need; specifically, a modular and generic resource-planning "toolbox". The technical challenge lies in the integration of the disparate systems of hydrology, ecology, climate, demographics, economics, policy and law, each of which influence the supply and demand for water. Specifically, these systems, their associated processes, and most importantly the constitutive relations that link them must be identified, abstracted, and quantified. For this reason, the toolbox forms a collection of process modules and constitutive relations that the analyst can "swap" in and out to model the physical and social systems unique to their problem. This toolbox with all of its modules is developed within the common computational platform of system dynamics linked to a Geographical Information System (GIS). Development of this resource-planning toolbox represents an important foundational element of the proposed interagency center for Computer Aided Dispute Resolution (CADRe). The Center's mission is to manage water conflict through the application of computer-aided collaborative decision-making methods. The Center will promote the use of decision-support technologies within collaborative stakeholder processes to help stakeholders find common ground and create mutually beneficial water management solutions. The Center will also serve to develop new methods and technologies to help federal, state and local water managers find innovative and balanced solutions to the nation's most vexing water problems. The toolbox is an important step toward achieving the technology development goals of this center.

  5. Resource Aware Intelligent Network Services (RAINS) Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehman, Tom; Yang, Xi

    The Resource Aware Intelligent Network Services (RAINS) project conducted research and developed technologies in the area of cyber infrastructure resource modeling and computation. The goal of this work was to provide a foundation to enable intelligent, software defined services which span the network AND the resources which connect to the network. A Multi-Resource Service Plane (MRSP) was defined, which allows resource owners/managers to locate and place themselves, from a topology and service availability perspective, within the dynamic networked cyberinfrastructure ecosystem. The MRSP enables the presentation of integrated topology views and computation results which can include resources across the spectrum of compute, storage, and networks. The RAINS-developed MRSP includes the following key components: i) the Multi-Resource Service (MRS) Ontology/Multi-Resource Markup Language (MRML), ii) the Resource Computation Engine (RCE), and iii) a Modular Driver Framework (to allow integration of a variety of external resources). The MRS/MRML is a general and extensible modeling framework that allows resource owners to model, or describe, a wide variety of resource types. All resources are described using three categories of elements: Resources, Services, and Relationships between the elements. This modeling framework defines a common method for the transformation of cyber infrastructure resources into data in the form of MRML models. In order to realize this infrastructure datification, the RAINS project developed a model based computation system, the RAINS Computation Engine (RCE). The RCE has the ability to ingest, process, integrate, and compute based on automatically generated MRML models. The RCE interacts with the resources through system drivers which are specific to the type of external network or resource controller. The RAINS project developed a modular and pluggable driver system which allows a variety of resource controllers to automatically generate, maintain, and distribute MRML based resource descriptions. Once all of the resource topologies are absorbed by the RCE, a connected graph of the full distributed system topology is constructed, which forms the basis for computation and workflow processing. The RCE includes a Modular Computation Element (MCE) framework which allows for tailoring of the computation process to the specific set of resources under control and the services desired. The input and output of an MCE are both model data based on the MRS/MRML ontology and schema. Some of the RAINS project accomplishments include: development of a general and extensible multi-resource modeling framework; design of the Resource Computation Engine (RCE), which can absorb a variety of multi-resource model types and build integrated models; a novel architecture which uses model based communications across the full stack; flexible provision of abstract or intent based user facing interfaces; workflow processing based on model descriptions; release of the RCE as open source software; deployment of the RCE in the University of Maryland/Mid-Atlantic Crossroad ScienceDMZ in prototype mode, with a plan under way to transition to production; deployment at the Argonne National Laboratory DTN Facility in prototype mode; and selection of the RCE by the DOE SENSE (SDN for End-to-end Networked Science at the Exascale) project as the basis for their orchestration service.
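    The three-category modeling idea (Resources, Services, and Relationships feeding a graph computation) can be illustrated with a toy model. This is not the actual MRS/MRML schema; the class, the resource names, and the "connectedTo" relationship are invented for illustration of how an integrated topology graph supports computation.

    ```python
    from collections import defaultdict

    class Model:
        """Toy multi-resource model: typed resources plus relationships."""
        def __init__(self):
            self.resources = {}            # name -> type ("compute", "storage", ...)
            self.edges = defaultdict(set)  # undirected "connectedTo" relationships

        def add_resource(self, name, rtype):
            self.resources[name] = rtype

        def relate(self, a, b):
            self.edges[a].add(b)
            self.edges[b].add(a)

        def reachable(self, start):
            """A simple computation over the integrated topology graph."""
            seen, stack = set(), [start]
            while stack:
                node = stack.pop()
                if node not in seen:
                    seen.add(node)
                    stack.extend(self.edges[node])
            return seen

    m = Model()
    m.add_resource("dtn1", "compute")
    m.add_resource("store1", "storage")
    m.add_resource("net-a", "network")
    m.relate("dtn1", "net-a")
    m.relate("net-a", "store1")
    print(m.reachable("dtn1"))
    ```

    Once every resource description is absorbed into one connected graph like this, path finding and service placement become ordinary graph computations, which is the role the RCE plays over MRML models.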

  6. DEVELOPMENT OF EPA'S TOXCAST PROGRAM FOR PRIORITIZING THE TOXICITY TESTING OF ENVIRONMENTAL CHEMICALS.

    EPA Science Inventory

    EPA is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and genomic technologies to predict potential toxicity and prioritize the use of limited testing resources.

  7. Accessing and visualizing scientific spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, G. Bruce; Block, Gary L.; Collier, Jim; Curkendall, David W.; Good, John; Husman, Laura; Jacob, Joseph C.; Laity, Anastasia; hide

    2004-01-01

    This paper discusses work done by JPL's Parallel Applications Technologies Group in helping scientists access and visualize very large data sets through the use of multiple computing resources, such as parallel supercomputers, clusters, and grids.

  8. Computer Technology in California K-12 Schools: Uses, Best Practices, and Policy Implications.

    ERIC Educational Resources Information Center

    Umbach, Kenneth W.

    Computers and Internet access are becoming increasingly frequent tools and resources in California's K-12 schools. Discussions with teachers and other education personnel and a review of published documents and other sources show the range of uses found in California classrooms, suggest what are the best practices with respect to computer…

  9. Enterprise networks. Strategies for integrated delivery systems.

    PubMed

    Siwicki, B

    1997-02-01

    More integrated delivery systems are making progress toward building computer networks that link all their care delivery sites so they can efficiently and economically coordinate care. A growing number of these systems are turning to intranets--private computer networks that use Internet-derived protocols and technologies--to move information that's essential to managing scarce health care resources.

  10. Evaluating the Comparability of Paper- and Computer-Based Science Tests across Sex and SES Subgroups

    ERIC Educational Resources Information Center

    Randall, Jennifer; Sireci, Stephen; Li, Xueming; Kaira, Leah

    2012-01-01

    As access and reliance on technology continue to increase, so does the use of computerized testing for admissions, licensure/certification, and accountability exams. Nonetheless, full computer-based test (CBT) implementation can be difficult due to limited resources. As a result, some testing programs offer both CBT and paper-based test (PBT)…

  11. The games psychologists play (and the data they provide).

    PubMed

    Washburn, David A

    2003-05-01

    Computer games and the technologies marketed to support them provide unique resources for psychological research. In contrast to the sterility, simplicity, and artificiality that characterize many cognitive tests, game-like tasks can be complex, ecologically valid, and even fun. In the present paper, the history of psychological research with video games is reviewed, and several thematic benefits of this paradigm are identified. These benefits, as well as the possible pitfalls of research with computer game technology and game-like tasks, are illustrated with data from comparative and cognitive investigations.

  12. Scalable Entity-Based Modeling of Population-Based Systems, Final LDRD Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cleary, A J; Smith, S G; Vassilevska, T K

    2005-01-27

    The goal of this project has been to develop tools, capabilities and expertise in the modeling of complex population-based systems via scalable entity-based modeling (EBM). Our initial focal application domain has been the dynamics of large populations exposed to disease-causing agents, a topic of interest to the Department of Homeland Security in the context of bioterrorism. In the academic community, discrete simulation technology based on individual entities has shown initial success, but the technology has not been scaled to the problem sizes or computational resources of LLNL. Our developmental emphasis has been on the extension of this technology to parallel computers and maturation of the technology from an academic to a lab setting.

  13. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004: wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefront in research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise, cooperating on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  14. Design for Run-Time Monitor on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications, as well as infrastructure, as services over the Internet. A cloud is a type of parallel and distributed system consisting of a collection of interconnected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive, service-based software with the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design a Run-Time Monitor (RTM), system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, and optimizes the computing configuration based on the analyzed data.
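    The monitor/analyze/adapt cycle described above can be sketched as a minimal control loop. The metric (CPU utilization), the averaging window, and the scaling thresholds below are all invented for illustration and are not taken from the RTM paper.

    ```python
    def analyze(samples, window=3):
        """Analyze step: average the most recent monitored samples."""
        recent = samples[-window:]
        return sum(recent) / len(recent)

    def adapt(avg_util, vcpus, low=0.2, high=0.8):
        """Adapt step: adjust the resource configuration from analyzed utilization."""
        if avg_util > high:
            return vcpus + 1            # sustained load: scale up
        if avg_util < low and vcpus > 1:
            return vcpus - 1            # sustained idleness: scale down
        return vcpus

    samples, vcpus = [], 2
    for util in [0.9, 0.95, 0.85, 0.1, 0.1, 0.1]:   # monitored at run time
        samples.append(util)                        # monitor step
        vcpus = adapt(analyze(samples), vcpus)
    print(vcpus)
    ```

    The averaging window is what keeps the loop from reacting to single noisy samples, a tradeoff any run-time monitor has to make between responsiveness and stability.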

  15. Realizing the Potential of Information Resources: Information, Technology, and Services. Track 4: Rethinking User Services.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    Six papers and two abstracts of papers are presented from the 1995 CAUSE conference track on user services issues faced by managers of information technology at colleges and universities. The papers include: (1) "Academic Computing Services: MORE than a Utility" (Scott Bierman and Cathy Smith), which focuses on Carleton College's efforts…

  16. Using Cloud Computing Services in e-Learning Process: Benefits and Challenges

    ERIC Educational Resources Information Center

    El Mhouti, Abderrahim; Erradi, Mohamed; Nasseh, Azeddine

    2018-01-01

    During the recent years, Information and Communication Technologies (ICT) play a significant role in the field of education and e-learning has become a very popular trend of the education technology. However, with the huge growth of the number of users, data and educational resources generated, e-learning systems have become more and more…

  17. Utilizing Technology to Examine the Impacts of Academic Program Plans on Faculty Staffing Levels. AIR 1984 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Spiro, Louis M.; Campbell, Jill F.

    The development and use of a campus-based computerized faculty staffing model is described. In addition to considering market demands for current and proposed programs, decisionmakers need to consider how program development, modification, and elimination affect the total college faculty resource base. The application of computer technology,…

  18. Technological Supports for Onsite and Distance Education and Students' Perceptions of Acquisition of Thinking and Team-Building Skills

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.; Morin, Danielle

    2010-01-01

    This paper compares students' perceptions of support provided in the acquisition of various thinking and team-building skills, resulting from the various activities, resources and technologies (ART) integrated into an upper level Distributed Computing (DC) course. The findings indicate that students perceived strong support for their acquisition…

  19. Preparing for High Technology: CAD/CAM Programs. Research & Development Series No. 234.

    ERIC Educational Resources Information Center

    Abram, Robert; And Others

    This guide is one of three developed to provide information and resources to assist in planning and developing postsecondary technican training programs in high technology areas. It is specifically intended for vocational-technical educators and planners in the initial stages of planning a specialized training option in computer-aided design (CAD)…

  20. Connexions: An Open Educational Resource for the 21st Century

    ERIC Educational Resources Information Center

    Burrus, C. Sidney

    2007-01-01

    The technology for information organization, communication, storage, and use today is the book. It has evolved over 3000 years (in its modern form over 500 years) to the mature object we currently enjoy. The book is now the primary technology used in education. But with the development of the computer and the Web, a new electronic information…

  1. The Role of Technology in Advancing Performance Standards in Science and Mathematics Learning.

    ERIC Educational Resources Information Center

    Quellmalz, Edys

    Technology permeates the lives of most Americans: voice mail, personal computers, and the ever-blinking VCR clock have become commonplace. In schools, it is creating educational opportunities at a dizzying pace and, within and beyond the classroom, it is providing unprecedented access to a universe of ideas and resources. As a next step, the…

  2. Can a tablet device alter undergraduate science students' study behavior and use of technology?

    PubMed

    Morris, Neil P; Ramsay, Luke; Chauhan, Vikesh

    2012-06-01

    This article reports findings from a study investigating undergraduate biological sciences students' use of technology and computer devices for learning and the effect of providing students with a tablet device. A controlled study was conducted to collect quantitative and qualitative data on the impact of a tablet device on students' use of devices and technology for learning. Overall, we found that students made extensive use of the tablet device for learning, using it in preference to laptop computers to retrieve information, record lectures, and access learning resources. In line with other studies, we found that undergraduate students only use familiar Web 2.0 technologies and that the tablet device did not alter this behavior for the majority of tools. We conclude that undergraduate science students can make extensive use of a tablet device to enhance their learning opportunities without institutions changing their teaching methods or computer systems, but that institutional intervention may be needed to drive changes in student behavior toward the use of novel Web 2.0 technologies.

  3. Research on the architecture and key technologies of SIG

    NASA Astrophysics Data System (ADS)

    Fu, Zhongliang; Meng, Qingxiang; Huang, Yan; Liu, Shufan

    2007-06-01

    Along with the development of computer networks, the Grid has become one of the hottest issues in research on the sharing and cooperation of Internet resources throughout the world. This paper illustrates a new architecture for SIG: a five-layer architecture (comprising a Data Collecting Layer, Grid Layer, Service Layer, Application Layer and Client Layer), extending the traditional three-layer architecture (resource layer, service layer and client layer). In the paper, the author proposes a new mixed network mode for the Spatial Information Grid which integrates CAG (Certificate Authority of Grid) and P2P (Peer to Peer) in the Grid Layer; in addition, the author discusses some key technologies of SIG and analyzes their functions.

  4. Handheld computers in nursing education: PDA pilot project.

    PubMed

    Koeniger-Donohue, Rebecca

    2008-02-01

    Interest in the use and application of handheld technology at undergraduate and graduate nursing programs across the country is growing rapidly. Personal digital assistants (PDAs) are often referred to as a "peripheral brain" because they can save time, decrease errors, and simplify information retrieval at the point of care. In addition, research results support the notion that PDAs enhance nursing clinical education and are an effective student learning resource. However, most nursing programs lack the full range of technological resources to implement and provide ongoing support for handheld technology use by faculty and students. This article describes a 9-month pilot project for the initial use of PDAs by novice faculty and students at Simmons College.

  5. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  6. Open Educational Resources for Blended Learning in High Schools: Overcoming Impediments in Developing Countries

    ERIC Educational Resources Information Center

    Larson, Richard C.; Murray, M. Elizabeth

    2008-01-01

    With today's computer and telecommunications technologies, every young person can have a quality education regardless of his or her place of birth. This is the dream that Open Educational Resources (OERs), when viewed as a right rather than a privilege, are directed to realize. For developing countries, we propose a type of OER initiative that…

  7. Inservice Workshops on New and Emerging Agriculture/Natural Resources Occupation Instructional Materials. Final Report, January 1, 1980-June 30, 1981.

    ERIC Educational Resources Information Center

    Leising, J.; Wilkins, Russell

    This document contains the final report and appendixes from a project to develop resources for use by community college agricultural education instructors in better utilizing computer technology in instruction and to provide inservice workshops to make the instructors aware of available hard- and software. The four-page narrative lists objectives,…

  8. Advanced Computer Aids in the Planning and Execution of Air Warfare and Ground Strike Operations: Conference Proceedings, Meeting of the Avionics Panels of AGARD (51st) Held in Kongsberg, Norway on 12-16 May 1986

    DTIC Science & Technology

    1986-02-01

    the area of Artificial Intelligence (AI). DARPA's Strategic Computing Program is developing an AI technology base upon which several applications...technologies with the Strategic Computing Program. In late 1983 the Strategic Computing Program (SCP) was announced. The program was organized to develop...solving a resource allocation problem. The remainder of this paper will discuss the TEMPLAR program as it relates to the Strategic Computing Program

  9. NASA Astrophysics Data System (ADS)

    Knosp, B.; Neely, S.; Zimdars, P.; Mills, B.; Vance, N.

    2007-12-01

    The Microwave Limb Sounder (MLS) Science Computing Facility (SCF) stores over 50 terabytes of data, has over 240 computer processing hosts, and 64 users from around the world. These resources are spread over three primary geographical locations - the Jet Propulsion Laboratory (JPL), Raytheon RIS, and New Mexico Institute of Mining and Technology (NMT). A need for a grid network system was identified and defined to solve the problem of users competing for finite, and increasingly scarce, MLS SCF computing resources. Using Sun's Grid Engine software, a grid network was successfully created in a development environment that connected the JPL and Raytheon sites, established master and slave hosts, and demonstrated that transfer queues for jobs can work among multiple clusters in the same grid network. This poster will first describe MLS SCF resources and the lessons that were learned in the design and development phase of this project. It will then go on to discuss the test environment and plans for deployment by highlighting benchmarks and user experiences.

  10. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-06-01

    Scientific communities have been at the forefront of adopting new computing technologies and methodologies. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
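    The cloud-bursting decision itself can be sketched as a simple sizing policy: when queued jobs outgrow the available grid slots, request extra pilot VMs from a cloud provider, up to a budget cap. This is a hypothetical policy written for illustration, not the actual GlideinWMS frontend logic; all numbers and the jobs-per-VM ratio are invented.

    ```python
    def plan_glideins(idle_jobs, grid_slots, max_cloud_vms, jobs_per_vm=8):
        """Return how many cloud VMs to request to absorb the job backlog."""
        backlog = max(0, idle_jobs - grid_slots)   # jobs the grid cannot serve
        needed = -(-backlog // jobs_per_vm)        # ceiling division
        return min(needed, max_cloud_vms)          # respect the budget cap

    # 500 idle jobs, 200 grid slots -> burst the overflow onto cloud VMs
    print(plan_glideins(idle_jobs=500, grid_slots=200, max_cloud_vms=50))
    ```

    The appeal of the pilot model is that the jobs themselves never see this decision: the requested VMs simply join the overlay HTCondor pool and start draining the queue.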

  11. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt

    Scientific communities have been at the forefront of adopting new computing technologies and methodologies. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by 'Big Data' will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  12. Dynamic provisioning of local and remote compute resources with OpenStack

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT participates in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the use of virtualization technologies. The OpenStack project has become a widely adopted solution for virtualizing hardware and offering additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences are discussed.
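    The "single point-of-entry" idea above can be sketched as a scheduler that treats local desktop VMs and remote HPC slots as one uniform queue. The pool names, capacities, and first-fit placement rule are invented for illustration; the actual setup delegates this to a real batch system.

    ```python
    # free slots per pool; both pools are presented to users as one virtual cluster
    pools = {"desktop-cloud": 16, "remote-hpc": 128}

    def submit(jobs, free):
        """Place each job on whichever pool still has a free slot (first fit)."""
        placement = {}
        for job in jobs:
            for pool, slots in free.items():
                if slots > 0:
                    free[pool] = slots - 1
                    placement[job] = pool
                    break
            else:
                placement[job] = None   # no capacity anywhere
        return placement

    # 20 Monte-Carlo jobs submitted through the single entry point
    result = submit([f"mc-sim-{i}" for i in range(20)], dict(pools))
    ```

    From the user's perspective only the single `submit` interface exists; whether a job ran on an idle desktop VM or a remote HPC node is an implementation detail of the merged cluster.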

  13. Using Web-Based Tools for Teaching Embryology

    EPA Science Inventory

    Computers, imaging technologies, and the worldwide web have assumed an important role in augmenting traditional learning. Resources to disseminate multimedia information across platforms, and the emergence of communal knowledge environments, facilitate the visualization of diffi...

  14. A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System

    NASA Astrophysics Data System (ADS)

    Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.

    2010-05-01

    The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.

  15. NASA aeronautics R&T - A resource for aircraft design

    NASA Technical Reports Server (NTRS)

    Olstad, W. B.

    1981-01-01

    This paper discusses the NASA aeronautics research and technology program from the viewpoint of the aircraft designer. The program spans the range from fundamental research to the joint validation with industry of technology for application into product development. Examples of recent developments in structures, materials, aerodynamics, controls, propulsion systems, and safety technology are presented as new additions to the designer's handbook. Finally, the major thrusts of NASA's current and planned programs which are keyed to revolutionary advances in materials science, electronics, and computer technology are addressed.

  16. Training Technology Transfer Act of 1984. Hearing before the Subcommittee on Education, Arts and Humanities of the Committee on Labor and Human Resources, United States Senate, Ninety-Eighth Congress, Second Session on S. 2561. Entitled the "Training Technology Transfer Act of 1984."

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on Labor and Human Resources.

    This is a congressional hearing on the Training Technology Transfer Act of 1984, which would establish a mechanism for transferring the Federal Government's investment in computer programming for training systems to those organizations and groups that can use such technology in training the civilian work force. Focus is on refining this bill,…

  17. Concept of JINR Corporate Information System

    NASA Astrophysics Data System (ADS)

    Filozova, I. A.; Bashashin, M. V.; Korenkov, V. V.; Kuniaev, S. V.; Musulmanbekov, G.; Semenov, R. N.; Shestakova, G. V.; Strizh, T. A.; Ustenko, P. V.; Zaikina, T. N.

    2016-09-01

    The article presents the concept of the JINR Corporate Information System (JINR CIS). Special attention is given to the information support of scientific research through a Current Research Information System operating as part of the corporate information system. The objectives of such a system are focused on ensuring effective support for research through modern information technology, computing, and automation, and on the creation, development, and integration of digital resources within a common conceptual framework. The project assumes continuous system development and the introduction of new information technologies to keep the system technologically relevant.

  18. Virtual pools for interactive analysis and software development through an integrated Cloud environment

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Italiano, A.; Salomoni, D.; Calabrese Melcarne, A. K.

    2011-12-01

    WNoDeS, an acronym for Worker Nodes on Demand Service, is software developed at CNAF-Tier1, the National Computing Centre of the Italian Institute for Nuclear Physics (INFN) located in Bologna. WNoDeS provides on-demand, integrated access to both Grid and Cloud resources through virtualization technologies. Besides the traditional use of computing resources in batch mode, users need interactive and local access to a number of systems. WNoDeS can dynamically provision these computers by instantiating virtual machines according to user requirements (computing, storage, and network resources), through either the Open Cloud Computing Interface (OCCI) API or a web console. Interactive use is usually limited to activities in user space, i.e. where the machine configuration is not modified. In other instances the activity concerns development and testing of services and thus implies modification of the system configuration (and, therefore, root access to the resource). The former use case is a simple extension of the WNoDeS approach, where the resource is provided in interactive mode. The latter implies saving the virtual image at the end of each user session so that it can be presented to the user at subsequent requests. This work describes how the LHC experiments at INFN-Bologna are testing and making use of these dynamically created ad-hoc machines via WNoDeS to support flexible, interactive analysis and software development at the INFN Tier-1 Computing Centre.
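    The abstract notes that VMs can be requested through the OCCI API. As a rough illustration, the sketch below builds the headers of an OCCI-style "create compute" request; the endpoint, attribute values, and hostname are hypothetical, and the header layout follows the generic OCCI HTTP rendering rather than WNoDeS's actual configuration.

```python
# Sketch of an OCCI-style "create compute resource" request such as a
# WNoDeS-like service might accept. Endpoint and values are hypothetical.

def build_occi_compute_request(cores: int, memory_gb: float, hostname: str) -> dict:
    """Build HTTP headers for an OCCI compute-creation request."""
    return {
        "Category": ('compute; '
                     'scheme="http://schemas.ogf.org/occi/infrastructure#"; '
                     'class="kind"'),
        "X-OCCI-Attribute": (f"occi.compute.cores={cores}, "
                             f"occi.compute.memory={memory_gb}, "
                             f'occi.compute.hostname="{hostname}"'),
        "Content-Type": "text/occi",
    }

headers = build_occi_compute_request(cores=4, memory_gb=8.0, hostname="analysis-vm01")
# The actual call would then be something like (hypothetical endpoint):
#   requests.post("https://cloud.example.org/compute/", headers=headers)
```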

  19. Computer literacy in nursing education. An overview.

    PubMed

    Newbern, V B

    1985-09-01

    Nursing educators are beginning to realize that computer literacy has become a survival skill for the profession. They understand that literacy must be at a level that assures the ability to manage and control the flood of available information and provides an openness and awareness of future technologic possibilities. The computer has been on college campuses for a number of years, used primarily for record storage and retrieval. However, early on a few nurse educators saw the potential for its use as a practice tool. Out of this foresight came both formal and nonformal educational offerings. The evolution of formal coursework in computer literacy has moved from learning about the computer to learning with the computer. Today the use of the computer is expanding geometrically as microcomputers become common. Graduate students and faculty use them for literature searches and data analysis. Undergraduates are routinely using computer-assisted instruction. Coursework in computer technology is fast becoming a given for nursing students and computer competency a requisite for faculty. However, inculcating computer competency in faculty and student repertoires is not an easy task. There are problems related to motivation, resources, and control. Territorial disputes between schools and colleges must be arbitrated. The interface with practice must be addressed. The paucity of adequate software is a real concern. But the potential is enormous, probably restricted only by human creativity. The possibilities for teaching and learning are profound, especially if geographical constraints can be effaced and scarce resources can be shared at minimal cost. Extremely sophisticated research designs and evaluation methodologies can be used routinely.(ABSTRACT TRUNCATED AT 250 WORDS)

  20. Use of several Cloud Computing approaches for climate modelling: performance, costs and opportunities

    NASA Astrophysics Data System (ADS)

    Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.

    2017-04-01

    Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google, and Microsoft. We have also evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies, as well as issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN) platform, which uses BOINC, over the infrastructure of two cloud providers, Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of 13-month climate simulations for CPDN on Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our experiments consisted of running WACCM on different VMs on the Google Compute Engine (GCE) and comparing the results with the supercomputer (SC) Finisterrae1 at the Centro de Supercomputacion de Galicia. GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput clearly shows that the SC performs better beyond approximately 100 cores (a difference related to network speed and latency). From a cost point of view, Cloud Computing moves researchers from a traditional approach, in which experiments were limited by the available hardware, to one limited by monetary resources (how much computing can be afforded). As there is an increasing movement toward, and recommendation for, budgeting HPC projects on this technology (budgets can be calculated in a more realistic way), we could see the trend over the next years consolidate the Cloud as the preferred solution.
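    The cost point above (experiments limited by money rather than hardware) can be made concrete with a toy budget calculation; all prices, runtimes, and the budget figure below are invented placeholders, not numbers from the study.

```python
# Toy illustration of "limited by money, not hardware": with pay-per-use,
# the affordable ensemble size is a budget calculation. All figures are
# invented placeholders, not measurements from the paper.

def cloud_cost(vm_hourly_rate: float, runtime_hours: float, n_members: int) -> float:
    """Total cost of an ensemble of n_members independent simulations."""
    return vm_hourly_rate * runtime_hours * n_members

# e.g. a run taking 4 days (96 h) on a $0.20/h single-processor VM:
per_run = cloud_cost(0.20, 96, 1)            # 19.2 dollars per member
affordable_members = int(1000 // per_run)    # members a $1000 budget buys
```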

  1. G2LC: Resources Autoscaling for Real Time Bioinformatics Applications in IaaS.

    PubMed

    Hu, Rongdong; Liu, Guangming; Jiang, Jingfei; Wang, Lixin

    2015-01-01

    Cloud computing has started to change the way bioinformatics research is carried out. Researchers who have taken advantage of this technology can process larger amounts of data and speed up scientific discovery. The variability in data volume results in variable computing requirements; bioinformatics researchers are therefore pursuing more reliable and efficient methods for conducting sequencing analyses. This paper proposes an automated resource provisioning method, G2LC, for bioinformatics applications in IaaS. It enables applications to deliver results in real time. Its main purpose is to guarantee application performance while improving resource utilization. Real sequence-searching data from BLAST are used to evaluate the effectiveness of G2LC. Experimental results show that G2LC guarantees application performance while saving up to 20.14% of resources.
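    The abstract does not describe G2LC's algorithm in detail; the following is a generic, deadline-driven autoscaling sketch of the same idea (acquire enough VMs to finish pending work in real time, release them when idle), with invented task rates and limits.

```python
import math

# Generic deadline-driven autoscaler: not G2LC's actual algorithm (the
# abstract does not give it), just the idea of provisioning enough VMs to
# keep a real-time deadline while releasing idle capacity.

def scale_decision(pending_tasks: int, tasks_per_vm_hour: float,
                   deadline_hours: float, max_vms: int = 64) -> int:
    """Return the target VM count to finish pending work by the deadline."""
    if pending_tasks == 0:
        return 0  # release everything when idle to improve utilization
    needed = math.ceil(pending_tasks / (tasks_per_vm_hour * deadline_hours))
    return min(max(needed, 1), max_vms)

# e.g. 100 queued BLAST searches, 10 searches per VM-hour, 1-hour deadline:
target = scale_decision(100, 10.0, 1.0)
```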

  3. Bridging the digital divide by increasing computer and cancer literacy: community technology centers for head-start parents and families.

    PubMed

    Salovey, Peter; Williams-Piehota, Pamela; Mowad, Linda; Moret, Marta Elisa; Edlund, Denielle; Andersen, Judith

    2009-01-01

    This article describes the establishment of two community technology centers affiliated with Head Start early childhood education programs focused especially on Latino and African American parents of children enrolled in Head Start. A 6-hour course concerned with computer and cancer literacy was presented to 120 parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70 nonparticipants at baseline, immediately after the course ended, and 3 months later suggested that the program increased knowledge about computers and their use, knowledge about cancer and its prevention, and computer use including health information-seeking via the Internet. The creation of community computer technology centers requires the availability of secure space, capacity of a community partner to oversee project implementation, and resources of this partner to ensure sustainability beyond core funding.

  4. Design and implementation of a Windows NT network to support CNC activities

    NASA Technical Reports Server (NTRS)

    Shearrow, C. A.

    1996-01-01

    The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer-Aided Design and Computer-Aided Manufacturing (CAD/CAM) capabilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the Division into the 1980s with respect to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study of a CIM application capable of completing the changes necessary to bring the manufacturing practices into the 1990s. The documentation provided in this qualitative research effort includes an assessment of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, the computers and software to be used, machine-to-computer connections, an estimated implementation timeline, and a cost estimate. Recommendations for the Division's improvement include actions to be taken, software to utilize, and computer configurations.

  5. Toward a Dynamically Reconfigurable Computing and Communication System for Small Spacecraft

    NASA Technical Reports Server (NTRS)

    Kifle, Muli; Andro, Monty; Tran, Quang K.; Fujikawa, Gene; Chu, Pong P.

    2003-01-01

    Future science missions will require the use of multiple spacecraft with multiple sensor nodes autonomously responding and adapting to a dynamically changing space environment. The acquisition of random scientific events will require rapidly changing network topologies, distributed processing power, and a dynamic resource management strategy. Optimum utilization and configuration of spacecraft communications and navigation resources will be critical in meeting the demand of these stringent mission requirements. There are two important trends to follow with respect to NASA's (National Aeronautics and Space Administration) future scientific missions: the use of multiple satellite systems and the development of an integrated space communications network. Reconfigurable computing and communication systems may enable versatile adaptation of a spacecraft system's resources by dynamic allocation of the processor hardware to perform new operations or to maintain functionality due to malfunctions or hardware faults. Advancements in FPGA (Field Programmable Gate Array) technology make it possible to incorporate major communication and network functionalities in FPGA chips and provide the basis for a dynamically reconfigurable communication system. Advantages of higher computation speeds and accuracy are envisioned with tremendous hardware flexibility to ensure maximum survivability of future science mission spacecraft. This paper discusses the requirements, enabling technologies, and challenges associated with dynamically reconfigurable space communications systems.

  6. U.S. Geological Survey national computer technology meeting; program and abstracts, New Orleans, Louisiana, April 10-15, 1994

    USGS Publications Warehouse

    Balthrop, B. H.; Baker, E.G.

    1994-01-01

    This report contains some of the abstracts of papers that were presented at the National Computer Technology Meeting that was held in April 1994. This meeting was sponsored by the Water Resources Division of the U.S. Geological Survey, and was attended by more than 200 technical and managerial personnel representing all the Divisions of the U.S. Geological Survey. Computer-related information from all Divisions of the U.S. Geological Survey is discussed in this compilation of abstracts. Some of the topics addressed are data transfer, data-base management, hydrologic applications, national water information systems, and geographic information systems applications and techniques.

  7. Cloud Computing - A Unified Approach for Surveillance Issues

    NASA Astrophysics Data System (ADS)

    Rachana, C. R.; Banu, Reshma, Dr.; Ahammed, G. F. Ali, Dr.; Parameshachari, B. D., Dr.

    2017-08-01

    Cloud computing describes highly scalable resources provided as an external service via the Internet on a pay-per-use basis. From the economic point of view, the main attractiveness of cloud computing is that users only use what they need, and only pay for what they actually use. Resources are available for access from the cloud at any time, and from any location, through networks. Cloud computing is gradually replacing the traditional information technology infrastructure. Securing data is one of the leading concerns and biggest issues for cloud computing. Privacy of information is always a crucial point, especially when an individual's personal or sensitive information is being stored by an organization. It is indeed true that today's cloud authorization systems are not robust enough. This paper presents a unified approach for analyzing the various security issues and techniques to overcome the challenges in the cloud environment.

  8. A Multi-Year Study of Teaching an Online Computer Literacy Course in a Medical University: A Lesson Learnt

    ERIC Educational Resources Information Center

    Wan, Hsu-Tien; Hsu, Kuang-Yang; Sheu, Shiow-Yunn

    2016-01-01

    In this research, we aim to understand the effectiveness of adopting educational technologies in a computer literacy course to students in a medical university. The course was organized with three core components: Open Education Resources (OER) reading, a book club, and online game competition. These components were delivered by a learning…

  9. An In-House Prototype for the Implementation of Computer-Based Extensive Reading in a Limited-Resource School

    ERIC Educational Resources Information Center

    Mayora, Carlos A.; Nieves, Idami; Ojeda, Victor

    2014-01-01

    A variety of computer-based models of Extensive Reading have emerged in the last decade. These models are usually supported by various online information and communication technologies. However, such innovations are not feasible in contexts where the digital divide limits access to the Internet. The purpose of this paper is to report a project in which…

  10. The Diffusion of Computer Skills in Communication Curricula: Is There a Gap between the Educational Experience and Employers' Needs?

    ERIC Educational Resources Information Center

    Chen, Joyce; Bankston, Ronnie

    Computers are now perceived as a required resource by business, education, and government, as well as in personal life. The rates of adoption of information technologies among these groups (business, education, government, family/individual) have varied, which may have created knowledge gaps. Based on the data collected from a telephone survey in…

  11. CFD in design - A government perspective

    NASA Technical Reports Server (NTRS)

    Kutler, Paul; Gross, Anthony R.

    1989-01-01

    Some of the research programs involving the use of CFD in the aerodynamic design process at government laboratories around the United States are presented. Technology transfer issues and future directions in the discipline of CFD are addressed. The major challenges in the aerosciences, as well as in other disciplines that will require high-performance computing resources such as massively parallel computers, are examined.

  12. Metacognitive factors that impact student nurse use of point of care technology in clinical settings.

    PubMed

    Kuiper, RuthAnne

    2010-01-01

    The utility of personal digital assistants (PDAs) as a point-of-care resource in health care practice and education presents new challenges for nursing faculty. While there is a plethora of PDA resources available, little is known about the variables that affect student learning and technology adoption. In this study, nursing students used PDA software programs, including a drug guide, a medical dictionary, a laboratory manual, and a nursing diagnosis manual, during acute care clinical experiences. An analysis of students' comparative reflective journal statements about the PDA as an adjunct to other available resources in clinical practice is presented. The benefits of having a PDA included readily available data, validation of thinking processes, and facilitation of care plan re-evaluation. Students reported increased frequency of use and independence. Significant correlations between user perceptions and computer self-efficacy suggested that greater confidence in one's abilities with technology results in increased self-awareness and achievement of learning outcomes.

  13. Using Technology to Facilitate Collaboration in Community-Based Participatory Research (CBPR)

    PubMed Central

    Jessell, Lauren; Smith, Vivian; Jemal, Alexis; Windsor, Liliane

    2017-01-01

    This study explores the use of Computer-Supported Collaborative Work (CSCW) technologies by way of a computer-based system called iCohere. This system was used to facilitate collaboration in conducting Community-Based Participatory Research (CBPR). Data were gathered from 13 members of a Community Collaborative Board (CCB). Analysis revealed that iCohere served the following functions: facilitating communication, providing a repository for information and resource sharing, and allowing for remote meeting attendance. Results indicated that while iCohere was useful in performing these functions, less expensive technologies had the potential to achieve similar goals if properly implemented. Implications for future research on CSCW systems and CBPR are discussed. PMID:29056871

  14. Successful use of tablet personal computers and wireless technologies for the 2011 Nepal Demographic and Health Survey.

    PubMed

    Paudel, Deepak; Ahmed, Marie; Pradhan, Anjushree; Lal Dangol, Rajendra

    2013-08-01

    Computer-Assisted Personal Interviewing (CAPI), coupled with the use of mobile and wireless technology, is growing as a data collection methodology. Nepal, a geographically diverse and resource-scarce country, implemented the 2011 Nepal Demographic and Health Survey, a nationwide survey of major health indicators, using tablet personal computers (tablet PCs) and wireless technology for the first time in the country. This paper synthesizes responses on the benefits and challenges of using new technology in such a challenging environment from the 89 interviewers who administered the survey. Overall, feedback from the interviewers indicates that the use of tablet PCs and wireless technology to administer the survey demonstrated potential to improve data quality and reduce data collection time. These benefits outweigh manageable challenges, such as storage and transport of the tablet PCs during fieldwork, limited options for confidential interview space due to screen readability issues under direct sunlight, and an inconsistent electricity supply at times. The introduction of this technology holds great promise for improving data availability and quality, even in a context with limited infrastructure and extremely difficult terrain.

  15. Intelligent instrumentation applied in environment management

    NASA Astrophysics Data System (ADS)

    Magheti, Mihnea I.; Walsh, Patrick; Delassus, Patrick

    2005-06-01

    The use of information and communications technology in environment management and research has witnessed a renaissance in recent years. From optical sensor technology, expert systems, GIS, and communications technologies to computer-aided harvesting and yield prediction, these systems are increasingly used for applications in the management of natural resources and biodiversity. This paper presents an environmental decision support system used to monitor biodiversity and present a risk rating for the invasion of pests into the particular systems being examined. This system will utilise expert mobile technology coupled with artificial intelligence and predictive modelling, and will emphasize the potential for expansion into many areas of intelligent remote sensing and computer-aided decision-making for environment management or certification. Monitoring and prediction in natural systems, harnessing the potential of computing and communication technologies, is an emerging area within environmental management. This research will lead to the initiation of a multi-tier hardware and software decision support system for environment management, allowing an evaluation of areas for biodiversity or areas at risk from invasive species based upon environmental factors and systems.

  16. [Application of image recognition technology in census of national traditional Chinese medicine resources].

    PubMed

    Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Shi, Ting-Ting; Wang, Hui; Li, Meng; Jing, Zhi-Xian; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    With the development of computer and image processing technology, image recognition has been applied at all stages of the national census of traditional Chinese medicine (TCM) resources. ① In the preparatory work, to establish a unified library of TCM resources, text recognition applied to paper materials assisted the digitization of the various categories related to Chinese medicine resources; to determine the representative survey areas and plots for each census team, remote sensing image classification and related methods, based on satellite remote sensing images, vegetation maps, and other basic data, assisted in identifying the key investigation areas. ② During the field investigation, to estimate the planting area of Chinese herbal medicines accurately, decision tree models, spectral features, and object-oriented methods were used to assist the regional identification and area estimation of medicinal materials. ③ During in-office data processing, to determine the types of Chinese medicine resources in a region relatively accurately, image recognition applied to individual plant photos, specimens, and names assisted the statistical summary of the types of TCM resources. ④ In the application of the census results, based on the medicinal resources and individual samples of medicinal herbs, a TCM resource identification app and a 3D display system for authentic herbs were developed, assisting the identification of Chinese medicine resources and the identifying characteristics of herbs. The introduction of image recognition technology into the census of Chinese medicine resources, assisting census personnel in their work, not only reduces manual workload and improves efficiency, but also improves the informatization and shareability of the census results. As the census of Chinese medicine resources deepens, image recognition technology will continue to play its unique role in the related work. Copyright© by the Chinese Pharmaceutical Association.
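    The decision-tree classification mentioned in step ② can be illustrated with a minimal sketch; the spectral bands, thresholds, pixel values, and the 30 m pixel size below are synthetic placeholders, not the census parameters.

```python
# Minimal decision-tree-style pixel classifier over synthetic spectral
# features (NDVI and a short-wave infrared band); thresholds and band
# values are placeholders, not census parameters.

def ndvi(red: float, nir: float) -> float:
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def classify_pixel(red: float, nir: float, swir: float) -> str:
    """A two-level decision tree: vegetation split, then a moisture split."""
    if ndvi(red, nir) < 0.3:                 # little green vegetation
        return "other"
    return "plantation" if swir < 0.25 else "other"

# Area estimation: count classified pixels times per-pixel ground area.
pixels = [(0.10, 0.60, 0.20), (0.30, 0.35, 0.30), (0.12, 0.55, 0.22)]
area_m2 = sum(900 for r, n, s in pixels      # 30 m x 30 m pixels
              if classify_pixel(r, n, s) == "plantation")
```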

  17. Computer Security for Commercial Nuclear Power Plants - Literature Review for Korea Hydro Nuclear Power Central Research Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duran, Felicia Angelica; Waymire, Russell L.

    2013-10-01

    Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance, and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations [U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency] related to protection of information technology resources, primarily digital controls and computer resources and their data networks. Copies of the key documents have also been provided to KHNP-CRI.

  18. FNAS computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.

    1990-01-01

    This work involves the coordination of necessary resources, facilities, and specialized personnel to provide a workshop promoting the exchange of CFD technology among industry, universities, and government. Critical flow problems have been isolated, and their simulation is underway.

  19. THE TOXCAST PROGRAM FOR PRIORITIZING TOXICITY TESTING OF ENVIRONMENTAL CHEMICALS

    EPA Science Inventory

    The United States Environmental Protection Agency (EPA) is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources towards chemicals...

  20. Benefit-Cost Analysis of TAT Phase I Worker Training. Training and Technology Project. Special Report.

    ERIC Educational Resources Information Center

    Kirby, Frederick C.; Castagna, Paul A.

    The purpose of this study is to estimate costs and benefits and to compute alternative benefit-cost ratios for both the individuals and the Federal Government as a result of investing time and resources in the Training and Technology (TAT) Project. TAT is a continuing experimental program in training skilled workers for private industry. The five…

  1. The Effect of a One to One Laptop Initiative on High School Math Achievement in a Suburban High School Environment

    ERIC Educational Resources Information Center

    Heap, Bryan

    2018-01-01

    Technology continues to advance the pace of American education. Each year school districts across the country invest resources into computers, software, technology specialists, and staff development. The stated goal given to stakeholders is usually to increase student achievement, increase motivation, or to better prepare students for the future.…

  2. Formal to Informal Learning with IT: Research Challenges and Issues for E-Learning

    ERIC Educational Resources Information Center

    Cox, M.J.

    2013-01-01

    For the purpose of clarity and consistency, the term e-learning is used throughout the paper to refer to technology-enhanced learning and information technology (IT) in teaching and learning. IT depicts computing and other IT resources. Research into e-learning has changed in focus and breadth over the last four decades as a consequence of…

  3. The Need for Integration of Technology in K-12 School Settings in Kenya, Africa

    ERIC Educational Resources Information Center

    Momanyi, Lilian; Norby, RenaFaye; Strand, Sharon

    2006-01-01

    Many computer users around the world have access to the latest advances in technology and use of the World Wide Web (WWW or Web). However, for a variety of political, economic, and social reasons, some peoples of the world do not have access to these resources. The educational systems of developing countries have not completely missed the…

  4. Integrating On-Line Technology into Teaching Activities to Enhance Student and Teacher Learning in a New Zealand Primary School

    ERIC Educational Resources Information Center

    Baskerville, Delia

    2012-01-01

    Continuing emphasis given to computer technology resourcing in schools presents potential for web-based initiatives which focus on quality arts teaching and learning, as ways to improve arts outcomes for all students. An arts e-learning collaborative research project between specialist on-line teacher/researchers and generalist primary teachers…

  5. Earth Resources Technology Satellite: US standard catalog No. U-12

    NASA Technical Reports Server (NTRS)

    1973-01-01

To provide dissemination of information regarding the availability of Earth Resources Technology Satellite (ERTS) imagery, a U.S. Standard Catalog is published on a monthly schedule. The catalogs identify imagery which has been processed and input to the data files during the preceding month. The U.S. Standard Catalog includes imagery covering the Continental United States, Alaska, and Hawaii. As a supplement to these catalogs, an inventory of ERTS imagery on 16 millimeter microfilm is available. The catalogs consist of four parts: (1) annotated maps which graphically depict the geographic areas covered by the imagery listed in the current catalog, (2) a computer-generated listing organized by observation identification number (ID) with pertinent information on each image, (3) a computer listing of observations organized by longitude and latitude, and (4) observations which have had changes made in their catalog information since the original entry in the data base.

  6. The DoD's High Performance Computing Modernization Program - Ensuring the National Earth Systems Prediction Capability Becomes Operational

    NASA Astrophysics Data System (ADS)

    Burnett, W.

    2016-12-01

The Department of Defense's (DoD) High Performance Computing Modernization Program (HPCMP) provides high performance computing to address the most significant challenges in computational resources, software application support and nationwide research and engineering networks. Today, the HPCMP has a critical role in ensuring the National Earth System Prediction Capability (N-ESPC) achieves initial operational status in 2019. A 2015 study commissioned by the HPCMP found that N-ESPC computational requirements will exceed interconnect bandwidth capacity due to the additional load from data assimilation and from data passing between coupled ensemble codes. Memory bandwidth and I/O bandwidth will continue to be significant bottlenecks for the Navy's Hybrid Coordinate Ocean Model (HYCOM) scalability - by far the major driver of computing resource requirements in the N-ESPC. The study also found that few of the N-ESPC model developers have detailed plans to ensure their respective codes scale through 2024. Three HPCMP initiatives are designed to directly address and support these issues: Productivity Enhancement, Technology Transfer and Training (PETTT), the HPCMP Applications Software Initiative (HASI), and Frontier Projects. PETTT supports code conversion by providing assistance, expertise and training in scalable and high-end computing architectures. HASI addresses the continuing need for modern application software that executes effectively and efficiently on next-generation high-performance computers. Frontier Projects enable research and development that could not be achieved using typical HPCMP resources by providing multi-disciplinary teams access to exceptional amounts of high performance computing resources. Finally, the Navy's DoD Supercomputing Resource Center (DSRC) currently operates a 6 Petabyte system, of which Naval Oceanography receives 15% of operational computational system use, or approximately 1 Petabyte of the processing capability.
The DSRC will provide the DoD with future computing assets to initially operate the N-ESPC in 2019. This talk will further describe how DoD's HPCMP will ensure N-ESPC becomes operational, efficiently and effectively, using next-generation high performance computing.

  7. USSR Report, Cybernetics, Computers and Automation Technology

    DTIC Science & Technology

    1987-04-02

    Communication Channel (NTR: PROBLEMY I RESHENIYA, No 14, 22 Jul-4 Aug 86) 52 EDUCATION Informatics and the National Information Resource (I. Chebotaru...the method of actions, which were successful in the past. The experience of previous developments is implemented in the prototype programs. Many data...of the converter lining, due to reduction of ferroalloy consumption, oxygen consumption and energy resource consumption and due to a decrease of

  8. A cross-sectional study of the effects of load carriage on running characteristics and tibial mechanical stress: implications for stress fracture injuries in women

    DTIC Science & Technology

    2017-03-23

    performance computing resources made available by the US Department of Defense High Performance Computing Modernization Program at the Air Force...1Department of Defense Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced Technology Research Center, United...States Army Medical Research and Materiel Command, Fort Detrick, Maryland, USA Full list of author information is available at the end of the article

  9. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers and Automation Technology, Number 29.

    DTIC Science & Technology

    1978-01-17

    approach to designing computers: Formal mathematical methods were applied and computers themselves began to be widely used in designing other...capital, labor resources and the funds of consumers. Analysis of the model indicates that at the present time the average complexity of production of...ALGORITHMIC COMPLETENESS AND COMPLEXITY OF MICROPROGRAMS Kiev KIBERNETIKA in Russian No 3, May/Jun 77 pp 1-15 manuscript received 22 Dec 76 G0LUNK0V

  10. Research on the application of wisdom technology in smart city

    NASA Astrophysics Data System (ADS)

    Li, Juntao; Ma, Shuai; Gu, Weihua; Chen, Weiyi

    2015-12-01

This paper first analyzes the concept of smart technology and the relationship between wisdom technology and the smart city, and discusses the practical application of the IOT (Internet of Things) in the smart city to explore a better way to realize it; it then introduces the basic concepts of cloud computing and the smart city and explains the relationship between the two. It discusses five advantages of cloud computing as applied to smart city construction: unified and highly efficient large-scale management of infrastructure software and hardware; service scheduling and resource management; security control and management; energy conservation; and a management platform layer that promotes the development of practical services and accelerates regional social and economic development. Finally, a brief description of wisdom technology and smart city management is presented.

  11. Evolutionary Technologies: Fundamentals and Applications to Information/Communication Systems and Manufacturing/Logistics Systems

    NASA Astrophysics Data System (ADS)

    Gen, Mitsuo; Kawakami, Hiroshi; Tsujimura, Yasuhiro; Handa, Hisashi; Lin, Lin; Okamoto, Azuma

As the efficient utilization of computational resources becomes increasingly important, evolutionary technology based on the Genetic Algorithm (GA), Genetic Programming (GP), Evolution Strategy (ES) and other Evolutionary Computations (ECs) is making rapid progress, and both its social recognition and its demand as an applied technology are growing. This is explained by the facts that EC offers higher robustness than approaches based on conventional theories for knowledge information processing systems, intelligent production and logistics systems, advanced production scheduling and various other real-world problems, and that EC ensures flexible applicability and usefulness even in unknown system environments where accurate mathematical modeling fails. In this paper, we provide a comprehensive survey of the current state of the art in the fundamentals and applications of evolutionary technologies.
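The selection-crossover-mutation loop common to the GA family the survey covers can be sketched in a few lines; the test problem ("OneMax"), parameters and operators below are illustrative choices, not taken from the paper:

```python
import random

# Illustrative sketch only: a minimal genetic algorithm evolving 8-bit
# strings to maximise the number of ones ("OneMax").

def fitness(bits):
    return sum(bits)

def evolve(pop_size=20, length=8, generations=40, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def pick():                      # binary tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        for _ in range(pop_size):
            p1, p2 = pick(), pick()
            cut = random.randrange(1, length)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < p_mut else g
                     for g in child]                 # bit-flip mutation
            nxt.append(child)
        pop = nxt
        best = max([best] + pop, key=fitness)        # keep best-ever
    return best
```

The same skeleton generalises to the scheduling and logistics applications the paper surveys by swapping in a problem-specific encoding and fitness function.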

  12. US Geological Survey National Computer Technology Meeting; Proceedings, Phoenix, Arizona, November 14-18, 1988

    USGS Publications Warehouse

    Balthrop, Barbara H.; Terry, J.E.

    1991-01-01

The U.S. Geological Survey National Computer Technology Meetings (NCTM) are sponsored by the Water Resources Division and provide a forum for the presentation of technical papers and the sharing of ideas or experiences related to computer technology. This report serves as a proceedings of the meeting held in November 1988 at the Crescent Hotel in Phoenix, Arizona. The meeting was attended by more than 200 technical and managerial people representing all Divisions of the U.S. Geological Survey. Scientists in every Division of the U.S. Geological Survey rely heavily upon state-of-the-art computer technology (both hardware and software). Today the goals of each Division are pursued in an environment where high speed computers, distributed communications, distributed data bases, high technology input/output devices, and very sophisticated simulation tools are used regularly. Therefore, information transfer and the sharing of advances in technology are very important issues that must be addressed regularly. This report contains complete papers and abstracts of papers that were presented at the 1988 NCTM. The report is divided into topical sections that reflect common areas of interest and application. In each section, papers are presented first, followed by abstracts. For these proceedings, the publication of a complete paper or only an abstract was at the discretion of the author, although complete papers were encouraged. Some papers presented at the 1988 NCTM are not published in these proceedings.

  13. Supporting research sites in resource-limited settings: Challenges in implementing IT infrastructure

    PubMed Central

    Whalen, Christopher; Donnell, Deborah; Tartakovsky, Michael

    2014-01-01

As Information and Communication Technology infrastructure becomes more reliable, new methods of Electronic Data Capture (EDC), data marts/data warehouses, and mobile computing provide platforms for rapid coordination of international research projects and multisite studies. However, despite the increasing availability of internet connectivity and communication systems in remote regions of the world, there are still significant obstacles. Sites with poor infrastructure face serious challenges participating in modern clinical and basic research, particularly research relying on EDC and internet communication technologies. This report discusses our experiences in supporting research in resource-limited settings (RLS). We describe examples of the practical and ethical/regulatory challenges raised by the use of these newer technologies for data collection in multisite clinical studies. PMID:24321986

  14. Bits and bytes: the future of radiology lies in informatics and information technology.

    PubMed

    Brink, James A; Arenson, Ronald L; Grist, Thomas M; Lewin, Jonathan S; Enzmann, Dieter

    2017-09-01

    Advances in informatics and information technology are sure to alter the practice of medical imaging and image-guided therapies substantially over the next decade. Each element of the imaging continuum will be affected by substantial increases in computing capacity coincident with the seamless integration of digital technology into our society at large. This article focuses primarily on areas where this IT transformation is likely to have a profound effect on the practice of radiology. • Clinical decision support ensures consistent and appropriate resource utilization. • Big data enables correlation of health information across multiple domains. • Data mining advances the quality of medical decision-making. • Business analytics allow radiologists to maximize the benefits of imaging resources.

  15. Current Grid operation and future role of the Grid

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    2012-12-01

Grid-like technologies and approaches have become an integral part of HEP experiments. Some other scientific communities also use similar technologies for data-intensive computations. The distinct feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the greater the burden on operations, and the other way around. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. The reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from a situation where Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world where the Grid will ultimately reach the status of a generic public computing and storage service provider, and permanent national and international Grid infrastructures will be established. How far we will be able to advance along this path depends on us. If no standardisation and convergence efforts take place, the Grid will remain limited to HEP; if, however, the current multitude of Grid-like systems converges to a generic, modular and extensible solution, the Grid will become true to its name.

  16. Knowledge base for v-Embryo: Information Infrastructure for in silico modeling

    EPA Science Inventory

    Computers, imaging technologies, and the worldwide web have assumed an important role in augmenting traditional learning. Resources to disseminate multimedia information across platforms, and the emergence of communal knowledge environments, facilitate the visualization of diffi...

  17. The Network Classroom.

    ERIC Educational Resources Information Center

    Maule, R. William

    1993-01-01

    Discussion of the role of new computer communications technologies in education focuses on modern networking systems, including fiber distributed data interface and Integrated Services Digital Network; strategies for implementing networked-based communication; and public online information resources for the classroom, including Bitnet, Internet,…

  18. COMPUTATIONAL TOXICOLOGY: AN APPROACH FOR PRIORITIZING CHEMICAL RISK ASSESSMENTS

    EPA Science Inventory

    Characterizing toxic effects for industrial chemicals carries the challenge of focusing resources on the greatest potential risks for human health and the environment. The union of molecular modeling, bioinformatics and simulation of complex systems with emerging technologies suc...

  19. NASA's Participation in the National Computational Grid

    NASA Technical Reports Server (NTRS)

    Feiereisen, William J.; Zornetzer, Steve F. (Technical Monitor)

    1998-01-01

Over the last several years it has become evident that the character of NASA's supercomputing needs has changed. One of the major missions of the agency is to support the design and manufacture of aero- and space-vehicles with technologies that will significantly reduce their cost. It is becoming clear that improvements in the process of aerospace design and manufacturing will require a high performance information infrastructure that allows geographically dispersed teams to draw upon resources that are broader than traditional supercomputing. A computational grid draws together our information resources into one system. We can foresee the time when a Grid will allow engineers and scientists to use the tools of supercomputers, databases and on line experimental devices in a virtual environment to collaborate with distant colleagues. The concept of a computational grid has been spoken of for many years, but several events in recent times are conspiring to allow us to actually build one. In late 1997 the National Science Foundation initiated the Partnerships for Advanced Computational Infrastructure (PACI), which is built around the idea of distributed high performance computing. The Alliance, led by the National Computational Science Alliance (NCSA), and the National Partnership for Advanced Computational Infrastructure (NPACI), led by the San Diego Supercomputing Center, have been instrumental in drawing together the "Grid Community" to identify the technology bottlenecks and propose a research agenda to address them. During the same period NASA has begun to reformulate parts of two major high performance computing research programs to concentrate on distributed high performance computing and has banded together with the PACI centers to address the research agenda in common.

  20. Science-Driven Computing: NERSC's Plan for 2006-2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Horst D.; Kramer, William T.C.; Bailey, David H.

NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.

  1. From transistor to trapped-ion computers for quantum chemistry.

    PubMed

    Yung, M-H; Casanova, J; Mezzacapo, A; McClean, J; Lamata, L; Aspuru-Guzik, A; Solano, E

    2014-01-07

    Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially-growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers, but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor to a near-future trapped-ion-based technology.

  2. From transistor to trapped-ion computers for quantum chemistry

    PubMed Central

    Yung, M.-H.; Casanova, J.; Mezzacapo, A.; McClean, J.; Lamata, L.; Aspuru-Guzik, A.; Solano, E.

    2014-01-01

    Over the last few decades, quantum chemistry has progressed through the development of computational methods based on modern digital computers. However, these methods can hardly fulfill the exponentially-growing resource requirements when applied to large quantum systems. As pointed out by Feynman, this restriction is intrinsic to all computational models based on classical physics. Recently, the rapid advancement of trapped-ion technologies has opened new possibilities for quantum control and quantum simulations. Here, we present an efficient toolkit that exploits both the internal and motional degrees of freedom of trapped ions for solving problems in quantum chemistry, including molecular electronic structure, molecular dynamics, and vibronic coupling. We focus on applications that go beyond the capacity of classical computers, but may be realizable on state-of-the-art trapped-ion systems. These results allow us to envision a new paradigm of quantum chemistry that shifts from the current transistor to a near-future trapped-ion-based technology. PMID:24395054

  3. The application of LANDSAT remote sensing technology to natural resources management. Section 1: Introduction to VICAR - Image classification module. Section 2: Forest resource assessment of Humboldt County.

    NASA Technical Reports Server (NTRS)

    Fox, L., III (Principal Investigator); Mayer, K. E.

    1980-01-01

    A teaching module on image classification procedures using the VICAR computer software package was developed to optimize the training benefits for users of the VICAR programs. The field test of the module is discussed. An intensive forest land inventory strategy was developed for Humboldt County. The results indicate that LANDSAT data can be computer classified to yield site specific forest resource information with high accuracy (82%). The "Douglas-fir 80%" category was found to cover approximately 21% of the county and "Mixed Conifer 80%" covering about 13%. The "Redwood 80%" resource category, which represented dense old growth trees as well as large second growth, comprised 4.0% of the total vegetation mosaic. Furthermore, the "Brush" and "Brush-Regeneration" categories were found to be a significant part of the vegetative community, with area estimates of 9.4 and 10.0%.

  4. What Can Technology Offer to Linguistically Diverse Classrooms? Using Multilingual Content in a Computer-Based Learning Environment for Primary Education

    ERIC Educational Resources Information Center

    Van Laere, Evelien; Rosiers, Kirsten; Van Avermaet, Piet; Slembrouck, Stef; van Braak, Johan

    2017-01-01

    Computer-based learning environments (CBLEs) have the potential to integrate the linguistic diversity present in classrooms as a resourceful tool in pupils' learning process. Particularly for pupils who speak a language at home other than the language which is used at school, more understanding is needed on how CBLEs offering multilingual content…

  5. Internet Access as a Structural Factor in Career Choice: A Comparison between Computing and Non-Computing Major Students

    ERIC Educational Resources Information Center

    Lotriet, Hugo; Matthee, Machdel; Alexander, Patricia

    2011-01-01

    The career choice model of Adya and Kaiser posits the availability of technology resources as a structural element impacting on career choice. The model distinguishes between accessibility at school and at home. Based on this theoretical point of departure and by arguing a link between choice of major and choice of field of career, this paper…

  6. An approximate dynamic programming approach to resource management in multi-cloud scenarios

    NASA Astrophysics Data System (ADS)

    Pietrabissa, Antonio; Priscoli, Francesco Delli; Di Giorgio, Alessandro; Giuseppi, Alessandro; Panfili, Martina; Suraci, Vincenzo

    2017-03-01

    The programmability and the virtualisation of network resources are crucial to deploy scalable Information and Communications Technology (ICT) services. The increasing demand of cloud services, mainly devoted to the storage and computing, requires a new functional element, the Cloud Management Broker (CMB), aimed at managing multiple cloud resources to meet the customers' requirements and, simultaneously, to optimise their usage. This paper proposes a multi-cloud resource allocation algorithm that manages the resource requests with the aim of maximising the CMB revenue over time. The algorithm is based on Markov decision process modelling and relies on reinforcement learning techniques to find online an approximate solution.
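The kind of MDP-plus-reinforcement-learning formulation this abstract describes can be sketched as tabular Q-learning over a toy broker; the two-cloud model, capacities, arrival/departure dynamics and revenue below are hypothetical stand-ins, not the authors' algorithm:

```python
import random

# Illustrative only: a toy Cloud Management Broker modelled as a Markov
# decision process and solved with tabular Q-learning. State = free slots
# per cloud; action = where to place an arriving request, or reject it.

CAPS = (3, 3)          # free slots on cloud 0 and cloud 1 when idle
ACTIONS = (-1, 0, 1)   # -1 = reject the request, otherwise cloud index

def step(state, action):
    """One request arrives; optionally place it, then maybe one job departs."""
    free = list(state)
    reward = 0.0
    if action >= 0 and free[action] > 0:
        free[action] -= 1          # admit the request
        reward = 1.0               # broker revenue per admitted request
    if random.random() < 0.5:      # a running job finishes somewhere
        busy = [i for i in (0, 1) if free[i] < CAPS[i]]
        if busy:
            free[random.choice(busy)] += 1
    return tuple(free), reward

def q_learn(episodes=2000, horizon=30, alpha=0.1, gamma=0.9, eps=0.1):
    """Learn state-action values maximising discounted broker revenue."""
    Q = {}
    for _ in range(episodes):
        s = CAPS
        for _ in range(horizon):
            if random.random() < eps:                  # explore
                a = random.choice(ACTIONS)
            else:                                      # exploit
                a = max(ACTIONS, key=lambda x: Q.get((s, x), 0.0))
            s2, r = step(s, a)
            best_next = max(Q.get((s2, b), 0.0) for b in ACTIONS)
            old = Q.get((s, a), 0.0)
            Q[(s, a)] = old + alpha * (r + gamma * best_next - old)
            s = s2
    return Q
```

After training, the greedy policy at a state with free capacity places the request rather than rejecting it, since rejection forfeits revenue; approximate dynamic programming as in the paper replaces the table with a learned value-function approximation.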

  7. NPSS on NASA's IPG: Using CORBA and Globus to Coordinate Multidisciplinary Aeroscience Applications

    NASA Technical Reports Server (NTRS)

    Lopez, Isaac; Follen, Gregory J.; Gutierrez, Richard; Naiman, Cynthia G.; Foster, Ian; Ginsburg, Brian; Larsson, Olle; Martin, Stuart; Tuecke, Steven; Woodford, David

    2000-01-01

Within NASA's High Performance Computing and Communication (HPCC) program, the NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. To this end, NPSS integrates multiple disciplines such as aerodynamics, structures, and heat transfer and supports "numerical zooming" from 0-dimensional to 1-, 2-, and 3-dimensional component engine codes. In order to facilitate the timely and cost-effective capture of complex physical processes, NPSS uses object-oriented technologies such as C++ objects to encapsulate individual engine components and CORBA ORBs for object communication and deployment across heterogeneous computing platforms. Recently, the HPCC program has initiated a concept called the Information Power Grid (IPG), a virtual computing environment that integrates computers and other resources at different sites. IPG implements a range of Grid services such as resource discovery, scheduling, security, instrumentation, and data access, many of which are provided by the Globus toolkit. IPG facilities have the potential to benefit NPSS considerably. For example, NPSS should in principle be able to use Grid services to discover dynamically and then co-schedule the resources required for a particular engine simulation, rather than relying on manual placement of ORBs as at present. Grid services can also be used to initiate simulation components on parallel computers (MPPs) and to address inter-site security issues that currently hinder the coupling of components across multiple sites. These considerations led NASA Glenn and Globus project personnel to formulate a collaborative project designed to evaluate whether and how benefits such as those just listed can be achieved in practice. 
This project involves firstly development of the basic techniques required to achieve co-existence of commodity object technologies and Grid technologies; and secondly the evaluation of these techniques in the context of NPSS-oriented challenge problems. The work on basic techniques seeks to understand how "commodity" technologies (CORBA, DCOM, Excel, etc.) can be used in concert with specialized "Grid" technologies (for security, MPP scheduling, etc.). In principle, this coordinated use should be straightforward because of the Globus and IPG philosophy of providing low-level Grid mechanisms that can be used to implement a wide variety of application-level programming models. (Globus technologies have previously been used to implement Grid-enabled message-passing libraries, collaborative environments, and parameter study tools, among others.) Results obtained to date are encouraging: we have successfully demonstrated a CORBA to Globus resource manager gateway that allows the use of CORBA RPCs to control submission and execution of programs on workstations and MPPs; a gateway from the CORBA Trader service to the Grid information service; and a preliminary integration of CORBA and Grid security mechanisms. The two challenge problems that we consider are the following: 1) Desktop-controlled parameter study. Here, an Excel spreadsheet is used to define and control a CFD parameter study, via a CORBA interface to a high throughput broker that runs individual cases on different IPG resources. 2) Aviation safety. Here, about 100 near real time jobs running NPSS need to be submitted, run and data returned in near real time. Evaluation will address such issues as time to port, execution time, potential scalability of simulation, and reliability of resources. The full paper will present the following information: 1. A detailed analysis of the requirements that NPSS applications place on IPG. 2. 
A description of the techniques used to meet these requirements via the coordinated use of CORBA and Globus. 3. A description of results obtained to date in the first two challenge problems.

  8. ICT in the Education of Students with SEN: Perceptions of Stakeholders

    NASA Astrophysics Data System (ADS)

    Ribeiro, Jaime; Moreira, António; Almeida, Ana Margarida

Portugal is experiencing a technological reform in education. Technological refurbishing of schools and training of students and teachers is a reality on the rise, enhanced by the implementation of the Education Technological Plan, which also aims at computer skills certification, by 2010, of 90% of teachers. In a school that must be adjusted to all pupils, Special Educational Needs cannot be neglected, and the nature and constitution of its computer resources should provide for the support of these students. ICT training is essential if all students are to benefit from its use. In the case of SEN, this need for training is of paramount importance if ICT is to establish itself as a facilitator for these students. ICT Coordinators are the visible face of ICT implementation in schools; their functions include managing the school's computer facilities and ensuring the ICT training of fellow teachers.

  9. Fast data reconstructed method of Fourier transform imaging spectrometer based on multi-core CPU

    NASA Astrophysics Data System (ADS)

    Yu, Chunchao; Du, Debiao; Xia, Zongze; Song, Li; Zheng, Weijian; Yan, Min; Lei, Zhenggang

    2017-10-01

An imaging spectrometer can acquire a two-dimensional spatial image and a one-dimensional spectrum at the same time, which makes it highly useful in color and spectral measurements, true-color image synthesis, military reconnaissance and so on. In order to realize fast reconstruction of Fourier transform imaging spectrometer data, the paper designs an optimized reconstruction algorithm based on OpenMP parallel computing technology, which was further applied to data from the HyperSpectral Imager of the Chinese `HJ-1' satellite. The results show that the method based on multi-core parallel computing can exploit multi-core CPU hardware resources effectively and significantly improve the efficiency of spectrum reconstruction processing. If the technology is applied to workstations with more cores, it will be possible to complete real-time processing of Fourier transform imaging spectrometer data with a single computer.
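The row-parallel reconstruction pattern this abstract describes (the paper uses OpenMP in a compiled language) can be sketched with a shared-memory thread pool; the naive DFT and the per-pixel cube layout below are illustrative assumptions, not the paper's implementation:

```python
import cmath
from concurrent.futures import ThreadPoolExecutor

def spectrum_from_interferogram(row):
    """Naive discrete Fourier transform magnitude of one pixel's
    interferogram (O(n^2) for clarity; a real system would use an FFT)."""
    n = len(row)
    return [abs(sum(row[k] * cmath.exp(-2j * cmath.pi * m * k / n)
                    for k in range(n)))
            for m in range(n)]

def reconstruct(cube, workers=4):
    """Reconstruct the spectrum of every spatial pixel, one pixel per task,
    spread across a shared-memory worker pool (the OpenMP analogue here)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(spectrum_from_interferogram, cube))
```

Because each pixel's transform is independent, the work divides cleanly across cores, which is exactly the property the paper exploits with OpenMP.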

  10. Earth resources-regional transfer activity contracts review

    NASA Technical Reports Server (NTRS)

    Bensko, J., Jr.; Daniels, J. L.; Downs, S. W., Jr.; Jones, N. L.; Morton, R. R.; Paludan, C. T.

    1977-01-01

    A regional transfer activity contracts review held by the Earth Resources Office was summarized. Contracts in the earth resources field primarily directed toward applications of satellite data and technology in solution of state and regional problems were reviewed. A summary of the progress of each contract was given in order to share experiences of researchers across a seven state region. The region included Missouri, Kentucky, Tennessee, Mississippi, Alabama, Georgia, and North Carolina. Research in several earth science disciplines included forestry, limnology, water resources, land use, geology, and mathematical modeling. The use of computers for establishment of information retrieval systems was also emphasized.

  11. Climate simulations and services on HPC, Cloud and Grid infrastructures

    NASA Astrophysics Data System (ADS)

    Cofino, Antonio S.; Blanco, Carlos; Minondo Tshuma, Antonio

    2017-04-01

    Cloud, Grid and High Performance Computing have changed the accessibility and availability of computing resources for Earth Science research communities, especially for the climate community. These paradigms are modifying the way climate applications are executed. By using these technologies, the number, variety and complexity of experiments and resources are increasing substantially. But although computational capacity is increasing, the traditional applications and tools used by the community are not adequate to manage this large volume and variety of experiments and computing resources. In this contribution, we evaluate the challenges of running climate simulations and services on Grid, Cloud and HPC infrastructures and how to tackle them. The Grid and Cloud infrastructures provided by EGI's VOs (esr, earth.vo.ibergrid and fedcloud.egi.eu) will be evaluated, as well as HPC resources from the PRACE infrastructure and institutional clusters. To address those challenges, solutions using the DRM4G framework will be shown. DRM4G provides a good framework to manage a large volume and variety of computing resources for climate experiments. This work has been supported by the Spanish National R&D Plan under projects WRF4G (CGL2011-28864), INSIGNIA (CGL2016-79210-R) and MULTI-SDM (CGL2015-66583-R); the IS-ENES2 project from the 7FP of the European Commission (grant agreement no. 312979); the European Regional Development Fund (ERDF); and the Programa de Personal Investigador en Formación Predoctoral from Universidad de Cantabria and the Government of Cantabria.

  12. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    PubMed Central

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P.; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications, as well as infrastructure, as services over the Internet. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation and the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data. PMID:22163811

  13. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    PubMed

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications, as well as infrastructure, as services over the Internet. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), which is system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation and the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
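
    Neither record shows the RTM internals; the monitor-analyze-adapt loop they describe can be caricatured in Python, with `resource.getrusage` (POSIX-only) standing in for the paper's library instrumentation and hardware performance counters, and a deliberately crude adaptation rule (both stand-ins are assumptions for illustration):

    ```python
    import os
    import resource  # POSIX-only stand-in for performance counters

    class RunTimeMonitor:
        """Toy monitor: sample process resource usage, then suggest a
        worker count for a multi-core configuration."""

        def __init__(self):
            self.samples = []

        def sample(self):
            # Monitor step: collect CPU time and peak memory for this process.
            ru = resource.getrusage(resource.RUSAGE_SELF)
            self.samples.append({"user_s": ru.ru_utime,
                                 "sys_s": ru.ru_stime,
                                 "max_rss_kb": ru.ru_maxrss})
            return self.samples[-1]

        def suggest_workers(self, cpu_bound=True):
            # Adapt step (crude rule): CPU-bound work gets one worker per
            # core; I/O-bound work gets a small oversubscription factor.
            cores = os.cpu_count() or 1
            return cores if cpu_bound else 2 * cores
    ```

    A real RTM would sample periodically and weigh several QoS features at once; the sketch only shows the shape of the loop.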

  14. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  15. Real-time interactive 3D computer stereography for recreational applications

    NASA Astrophysics Data System (ADS)

    Miyazawa, Atsushi; Ishii, Motonaga; Okuzawa, Kazunori; Sakamoto, Ryuuichi

    2008-02-01

    With the increasing calculation costs of 3D computer stereography, low-cost, high-speed implementation of the latter requires effective distribution of computing resources. In this paper, we attempt to re-classify 3D display technologies on the basis of humans' 3D perception, in order to determine what level of presence or reality is required in recreational video game systems. We then discuss the design and implementation of stereography systems in two categories of the new classification.

  16. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

    NASA Technical Reports Server (NTRS)

    Doyle, Richard; Bergman, Larry; Some, Raphael; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

    2013-01-01

    Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, the end-to-end system, and the mission. It can be aptly viewed as a "technology multiplier": advances in onboard computing provide dramatic improvements in flight functions and capabilities across the NASA mission classes, and will enable new flight capabilities and mission scenarios, increasing science and exploration return per mission dollar.

  17. The Information Science Experiment System - The computer for science experiments in space

    NASA Technical Reports Server (NTRS)

    Foudriat, Edwin C.; Husson, Charles

    1989-01-01

    The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.

  18. A Survey of IT Policy and of Current Practice in a Sample of Colleges. Information Technology in FE. An Occasional Paper.

    ERIC Educational Resources Information Center

    Further Education Unit, London (England).

    A 1983 site-visit survey of 20 colleges actively engaged in developing their information technology (IT) expertise and resources indicated that many teachers do not appreciate the relevance of the computer to their work; few colleges have an effective and structured approach to IT in terms of staffing, equipment, or application; and lack of…

  19. The Making of a History Standards Wiki: "Covering", "Uncovering", and "Discovering" Curriculum Frameworks Using a Highly Interactive Technology

    ERIC Educational Resources Information Center

    Maloy, Robert W.; Poirier, Michelle; Smith, Hilary K.; Edwards, Sharon A.

    2010-01-01

    This article explores using a wiki, one of the newest forms of interactive computer-based technology, as a resource for teaching the Massachusetts K-12 History and Social Science Curriculum Framework, a set of state-mandated learning standards. Wikis are web pages that can be easily edited by multiple authors. They invite active involvement by…

  20. New camera-based microswitch technology to monitor small head and mouth responses of children with multiple disabilities.

    PubMed

    Lancioni, Giulio E; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N; O'Reilly, Mark F; Green, Vanessa A; Furniss, Fred

    2014-06-01

    Assessing a new camera-based microswitch technology, which did not require the use of color marks on the participants' face. Two children with extensive multiple disabilities participated. The responses selected for them consisted of small, lateral head movements and mouth closing or opening. The intervention was carried out according to a multiple probe design across responses. The technology involved a computer with a CPU using a 2-GHz clock, a USB video camera with a 16-mm lens, a USB cable connecting the camera and the computer, and a special software program written in ISO C++ language. The new technology was satisfactorily used with both children. Large increases in their responding were observed during the intervention periods (i.e. when the responses were followed by preferred stimulation). The new technology may be an important resource for persons with multiple disabilities and minimal motor behavior.
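
    The abstract gives no details of the ISO C++ detection software; purely as an illustration, a camera-based microswitch can be reduced to frame differencing over a monitored region, as in this toy Python detector (the region coordinates and threshold are invented for the example):

    ```python
    def movement_detected(prev_frame, curr_frame, region, threshold=10.0):
        """Return True if the mean absolute pixel change inside `region`
        (x0, y0, x1, y1) exceeds `threshold`. Frames are 2-D lists of
        grayscale values, as a stand-in for camera frames."""
        x0, y0, x1, y1 = region
        diffs = [abs(curr_frame[y][x] - prev_frame[y][x])
                 for y in range(y0, y1) for x in range(x0, x1)]
        return sum(diffs) / len(diffs) > threshold
    ```

    In the study's setting, a True result for the region covering the mouth or the side of the head would be what triggers the preferred stimulation.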

  1. Resource Analysis of Cognitive Process Flow Used to Achieve Autonomy

    DTIC Science & Technology

    2016-03-01

    to be used as a decision-making aid to guide system designers and program managers not necessarily familiar with cognitive processing, or resource...implementing end-to-end cognitive processing flows multiplies and the impact of these design decisions on efficiency and effectiveness increases [1]. The...end-to-end cognitive systems and alternative computing technologies, then system design and acquisition personnel could make systematic analyses and

  2. Diffusion of innovations: smartphones and wireless anatomy learning resources.

    PubMed

    Trelease, Robert B

    2008-01-01

    The author has previously reported on principles of diffusion of innovations, the processes by which new technologies become popularly adopted, specifically in relation to anatomy and education. In presentations on adopting handheld computers [personal digital assistants (PDAs)] and personal media players for health sciences education, particular attention has been directed to the anticipated integration of PDA functions into popular cellular telephones. However, limited distribution of early "smartphones" (e.g., Palm Treo and Blackberry) has provided few potential users for anatomical learning resources. In contrast, iPod media players have been self-adopted by millions of students, and "podcasting" has become a popular medium for distributing educational media content. The recently introduced Apple iPhone has combined smartphone and higher resolution media player capabilities. The author successfully tested the iPhone and the "work alike" iPod touch wireless media player with text-based "flashcard" resources, existing PDF educational documents, 3D clinical imaging data, lecture "podcasts," and clinical procedure video. These touch-interfaced, mobile computing devices represent just the first of a new generation providing practical, scalable wireless Web access with enhanced multimedia capabilities. With widespread student self-adoption of such new personal technology, educators can look forward to increasing portability of well-designed, multiplatform "learn anywhere" resources. Copyright 2008 American Association of Anatomists

  3. Integrated Sustainable Planning for Industrial Region Using Geospatial Technology

    NASA Astrophysics Data System (ADS)

    Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek

    2012-07-01

    Geospatial techniques and their scope of application have undergone an order-of-magnitude change since their advent, and they are now universally accepted as a modern and essential tool for mapping and monitoring natural resources as well as amenities and infrastructure. The huge, voluminous spatial databases generated by various remote sensing platforms need proper management (storage, retrieval, manipulation, and analysis) to extract the desired information, which is beyond the capability of the human brain; this is where computer-aided GIS technology came into existence. A GIS with major input from remote sensing satellites for natural resource management applications must be able to handle spatiotemporal data, supporting spatiotemporal queries and other spatial operations. Software and computer-based tools are designed to make things easier for the user and to improve the efficiency and quality of information processing tasks. The natural resources are a common heritage, which we have shared with past generations, and our future generations will inherit these resources from us. Our greed for resources and our tremendous technological capacity to exploit them at a much larger scale have created a situation where we have started withdrawing from future stocks. The Bhopal capital region has attracted the attention of planners since the beginning of the five-year-plan strategy for industrial development. A number of projects carried out in the individual districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore) gave fruitful results, but no serious effort has been made to involve the entire region, and no use has been made of the latest geospatial techniques (remote sensing, GIS, GPS) to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze, and compare the data for monitoring as well as for planning development activities in the future.

  4. Integrating Cloud-Computing-Specific Model into Aircraft Design

    NASA Astrophysics Data System (ADS)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services it introduces will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  5. EPA'S TOXCAST PROGRAM FOR PREDICTING HAZARD AND PRIORITIZING TOXICITY TESTING OF ENVIRONMENTAL CHEMICALS

    EPA Science Inventory

    EPA is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources towards chemicals that likely represent the greatest hazard to human ...

  6. Equasions for Curriculum Improvement.

    ERIC Educational Resources Information Center

    Eckenrod, James S.

    1986-01-01

    Describes the Technology in Curriculum (TIC) program resource guides which will be distributed to California schools in the fall of 1986. These guides match available instructional television programs and computer software to existing California curriculum guides in order to facilitate teachers' classroom use. (JDH)

  7. OCLC: Yesterday, Today and Tomorrow.

    ERIC Educational Resources Information Center

    Smith, K. Wayne

    1998-01-01

    Discusses the Online Computer Library Center's (OCLC) evolution as an organization, highlighting its nonprofit status, financial philosophy, membership role in governance, collections and technical services, resource sharing, and reference services. Presents a chronology of OCLC products, services, and technological innovation 1967-1997. (PEN)

  8. Intelligent transportation systems, shared resource projects : an action guide : telecommunications infrastructure in transportation right-of-way

    DOT National Transportation Integrated Search

    1997-01-01

    Intelligent transportation systems (ITS) use advances in communications, computer and information systems to create technologies that can improve traffic, transit and commercial vehicle operations. Essentially, ITS provides the right people in the tr...

  9. Design and deployment of an elastic network test-bed in IHEP data center based on SDN

    NASA Astrophysics Data System (ADS)

    Zeng, Shan; Qi, Fazhi; Chen, Gang

    2017-10-01

    High energy physics experiments produce huge amounts of raw data, but because of the shared nature of network resources, there is no guarantee of available bandwidth for each experiment, which may cause link congestion problems. On the other side, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack which ensures the flexibility of computing and storage resources, and more and more computing applications have been deployed on virtual machines created by OpenStack. However, under the traditional network architecture, network capacity cannot be acquired elastically, which becomes the bottleneck restricting the flexible application of cloud computing. In order to solve the above problems, we propose an elastic cloud data center network architecture based on SDN, and we also design a high performance controller cluster based on OpenDaylight. In the end, we present our current test results.
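
    The abstract does not expose the controller's northbound API; as a purely hypothetical illustration, an application asking for elastic, guaranteed bandwidth might build an intent-style request like the helper below (all field names are invented and do not reflect OpenDaylight's actual schema):

    ```python
    def bandwidth_intent(src_host, dst_host, mbps, priority=1):
        """Build a hypothetical bandwidth-reservation intent that an
        application could POST to an SDN controller's northbound REST API."""
        if mbps <= 0:
            raise ValueError("bandwidth must be positive")
        return {
            "type": "bandwidth-reservation",  # invented field names
            "src": src_host,
            "dst": dst_host,
            "guaranteed_mbps": mbps,
            "priority": priority,
        }
    ```

    The point of the elastic architecture is precisely that such a reservation can be made and released per experiment, instead of all transfers contending for shared links.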

  10. The Czech National Grid Infrastructure

    NASA Astrophysics Data System (ADS)

    Chudoba, J.; Křenková, I.; Mulač, M.; Ruda, M.; Sitera, J.

    2017-10-01

    The Czech National Grid Infrastructure is operated by MetaCentrum, a CESNET department responsible for coordinating and managing activities related to distributed computing. CESNET, as the Czech National Research and Education Network (NREN), provides many e-infrastructure services, which are used by 94% of the scientific and research community in the Czech Republic. Computing and storage resources owned by different organizations are connected by a network fast enough to provide transparent access to all resources. We describe in more detail the computing infrastructure, which is based on several different technologies and covers grid, cloud and map-reduce environments. While the largest share of CPUs is still accessible via distributed Torque servers, providing an environment for long batch jobs, part of the infrastructure is available via standard EGI tools, a subset of NGI resources is provided into the EGI FedCloud environment with a cloud interface, and there is also a Hadoop cluster provided by the same e-infrastructure. A broad spectrum of computing servers is offered; users can choose from standard 2-CPU servers to large SMP machines with up to 6 TB of RAM or servers with GPU cards. Different groups have different priorities on various resources, and resource owners can even have exclusive access. The software is distributed via AFS. Storage servers offering up to tens of terabytes of disk space to individual users are connected via NFS4 on top of GPFS, and access to long-term HSM storage with petabyte capacity is also provided. An overview of available resources and recent usage statistics will be given.

  11. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    PubMed

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. 
Our findings from the Department of Health and Human Services indicated that the security infrastructure in cloud services may be more compliant with the Health Insurance Portability and Accountability Act of 1996 regulations than traditional methods. To gauge the DoD's adoption of cloud technologies, proposed metrics included cost factors, ease of use, automation, availability, accessibility, security, and policy compliance. Since 2009, plans and policies have been developed for the use of cloud technology to help consolidate and reduce the number of data centers, which was expected to reduce costs, improve environmental factors, enhance information technology security, and maintain mission support for service members. Cloud technologies were also expected to improve employee efficiency and productivity. Federal cloud computing policies within the last decade also offered increased opportunities to advance military healthcare. It was assumed that these opportunities would benefit consumers of healthcare and health science data by allowing more access to centralized cloud computer facilities to store, analyze, search and share relevant data, to enhance standardization, and to reduce potential duplications of effort. We recommend that cloud computing be considered by DoD biomedical researchers for increasing connectivity, presumably by facilitating communications and data sharing, among the various intra- and extramural laboratories. We also recommend that policies and other guidances be updated to include developing additional metrics that will help stakeholders evaluate the above-mentioned assumptions and expectations. Published by Oxford University Press on behalf of the Association of Military Surgeons of the United States 2018. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  12. Modeling and computational simulation and the potential of virtual and augmented reality associated to the teaching of nanoscience and nanotechnology

    NASA Astrophysics Data System (ADS)

    Ribeiro, Allan; Santos, Helen

    With the advent of new information and communication technologies (ICTs), communicative interaction is changing the way people live and act, at the same time that it changes the way work activities related to education are carried out. Among the possibilities provided by the advancement of computational resources are virtual reality (VR) and augmented reality (AR), highlighted as new forms of information visualization in computer applications. While VR allows user interaction with a virtual environment that is totally computer generated, in AR virtual images are inserted into the real environment; both create new opportunities to support teaching and learning in formal and informal contexts. Such technologies are able to express representations of reality or of the imagination, such as systems at the nanoscale and of low dimensionality, making it imperative to explore, in the most diverse areas of knowledge, the potential offered by ICTs and emerging technologies. In this sense, this work presents virtual and augmented reality computer applications developed through modeling and computational simulation of topics related to nanoscience and nanotechnology, articulated with innovative pedagogical practices.

  13. CUDA Optimization Strategies for Compute- and Memory-Bound Neuroimaging Algorithms

    PubMed Central

    Lee, Daren; Dinov, Ivo; Dong, Bin; Gutman, Boris; Yanovsky, Igor; Toga, Arthur W.

    2011-01-01

    As neuroimaging algorithms and technology continue to grow faster than CPU performance in complexity and image resolution, data-parallel computing methods will be increasingly important. The high performance, data-parallel architecture of modern graphical processing units (GPUs) can reduce computational times by orders of magnitude. However, its massively threaded architecture introduces challenges when GPU resources are exceeded. This paper presents optimization strategies for compute- and memory-bound algorithms for the CUDA architecture. For compute-bound algorithms, the registers are reduced through variable reuse via shared memory and the data throughput is increased through heavier thread workloads and maximizing the thread configuration for a single thread block per multiprocessor. For memory-bound algorithms, fitting the data into the fast but limited GPU resources is achieved through reorganizing the data into self-contained structures and employing a multi-pass approach. Memory latencies are reduced by selecting memory resources whose cache performance are optimized for the algorithm's access patterns. We demonstrate the strategies on two computationally expensive algorithms and achieve optimized GPU implementations that perform up to 6× faster than unoptimized ones. Compared to CPU implementations, we achieve peak GPU speedups of 129× for the 3D unbiased nonlinear image registration technique and 93× for the non-local means surface denoising algorithm. PMID:21159404

  14. CUDA optimization strategies for compute- and memory-bound neuroimaging algorithms.

    PubMed

    Lee, Daren; Dinov, Ivo; Dong, Bin; Gutman, Boris; Yanovsky, Igor; Toga, Arthur W

    2012-06-01

    As neuroimaging algorithms and technology continue to grow faster than CPU performance in complexity and image resolution, data-parallel computing methods will be increasingly important. The high performance, data-parallel architecture of modern graphical processing units (GPUs) can reduce computational times by orders of magnitude. However, its massively threaded architecture introduces challenges when GPU resources are exceeded. This paper presents optimization strategies for compute- and memory-bound algorithms for the CUDA architecture. For compute-bound algorithms, the registers are reduced through variable reuse via shared memory and the data throughput is increased through heavier thread workloads and maximizing the thread configuration for a single thread block per multiprocessor. For memory-bound algorithms, fitting the data into the fast but limited GPU resources is achieved through reorganizing the data into self-contained structures and employing a multi-pass approach. Memory latencies are reduced by selecting memory resources whose cache performance are optimized for the algorithm's access patterns. We demonstrate the strategies on two computationally expensive algorithms and achieve optimized GPU implementations that perform up to 6× faster than unoptimized ones. Compared to CPU implementations, we achieve peak GPU speedups of 129× for the 3D unbiased nonlinear image registration technique and 93× for the non-local means surface denoising algorithm. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
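
    The memory-bound strategy in both records (reorganizing data into self-contained structures and processing them in passes so each piece fits in fast memory) is not tied to CUDA; a minimal Python sketch of the same tiling idea, with an illustrative tile size and reduction:

    ```python
    def tiled_sum_of_squares(data, tile_size=4):
        """Process `data` one tile at a time, as if each tile had to fit
        into a small fast memory (shared memory on a GPU), accumulating a
        running result across passes."""
        total = 0.0
        for start in range(0, len(data), tile_size):
            tile = data[start:start + tile_size]   # self-contained chunk
            total += sum(x * x for x in tile)      # one per-tile pass
        return total
    ```

    On a GPU the same structure lets each thread block stage its tile in shared memory and combine partial results afterward, which is the essence of the multi-pass approach the papers describe.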

  15. Report of the Panel on Computer and Information Technology

    NASA Technical Reports Server (NTRS)

    Lundstrom, Stephen F.; Larsen, Ronald L.

    1984-01-01

    Aircraft have become more and more dependent on computers (information processing) for improved performance and safety. It is clear that this activity will grow, since information processing technology has advanced by a factor of 10 every 5 years for the past 35 years and will continue to do so. Breakthroughs in device technology, from vacuum tubes through transistors to integrated circuits, contribute to this rapid pace. This progress is nearly matched by similar, though not as dramatic, advances in numerical software and algorithms. Progress has not been easy. Many technical and nontechnical challenges were surmounted. The outlook is for continued growth in capability but will require surmounting new challenges. The technology forecast presented in this report has been developed by extrapolating current trends and assessing the possibilities of several high-risk research topics. In the process, critical problem areas that require research and development emphasis have been identified. The outlook assumes a positive perspective; the projected capabilities are possible by the year 2000, and adequate resources will be made available to achieve them. Computer and information technology forecasts and the potential impacts of this technology on aeronautics are identified. Critical issues and technical challenges underlying the achievement of forecasted performance and benefits are addressed.
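
    The quoted rate, a factor of 10 every 5 years, compounds to roughly 58.5% per year, which a two-line check confirms:

    ```python
    # "Factor of 10 every 5 years" implies a per-year growth factor of 10**(1/5).
    annual_factor = 10 ** (1 / 5)
    percent_per_year = round((annual_factor - 1) * 100, 1)  # 58.5
    ```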

  16. Reaching for the cloud: on the lessons learned from grid computing technology transfer process to the biomedical community.

    PubMed

    Mohammed, Yassene; Dickmann, Frank; Sax, Ulrich; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which led to the creation of the Grid. The inter-domain transfer process of this technology has hitherto been an intuitive process without in-depth analysis. Some of the difficulties facing the life science community in this transfer can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies which have achieved a certain stability; Grid and Cloud solutions are technologies which are still in flux. We show how Grid computing creates new difficulties in the transfer process that are not considered in Bozeman's model. We show why the success of healthgrids should be measured by the qualified scientific human capital and the opportunities created, and not primarily by the market impact. We conclude with recommendations that can help improve the adoption of Grid and Cloud solutions in the biomedical community. These results give a more concise explanation of the difficulties many life science IT projects are facing in their late funding periods, and show leveraging steps that can help in overcoming the "vale of tears".

  17. The Challenges and Benefits of Using Computer Technology for Communication and Teaching in the Geosciences

    NASA Astrophysics Data System (ADS)

    Fairley, J. P.; Hinds, J. J.

    2003-12-01

    The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, a field that has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault-controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.
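    The first example above, water table response to well pumping, is classically animated with the Theis solution for transient drawdown around a well. The sketch below is not the authors' animation code; it is a minimal, illustrative computation of that drawdown using the convergent series for the exponential integral, with hypothetical parameter values.

    ```python
    import math

    def well_function(u, terms=30):
        """Theis well function W(u) = E1(u), via its convergent series
        (accurate for the small u typical of pumping-test animations)."""
        gamma = 0.5772156649015329  # Euler-Mascheroni constant
        s = -gamma - math.log(u)
        sign, fact = 1.0, 1.0
        for n in range(1, terms + 1):
            fact *= n
            s += sign * u**n / (n * fact)  # (-1)^(n+1) u^n / (n * n!)
            sign = -sign
        return s

    def drawdown(r, t, Q=0.01, T=1e-3, S=1e-4):
        """Drawdown s(r, t) [m] at radius r [m] and time t [s] after pumping starts.
        Q: pumping rate [m^3/s], T: transmissivity [m^2/s], S: storativity [-].
        Parameter values are illustrative, not taken from the cited work."""
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * math.pi * T) * well_function(u)
    ```

    An animation of the kind described simply evaluates `drawdown` on a radial grid for increasing t and replots the deepening cone of depression.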

  18. PanDA: Exascale Federation of Resources for the ATLAS Experiment at the LHC

    NASA Astrophysics Data System (ADS)

    Barreiro Megino, Fernando; Caballero Bejar, Jose; De, Kaushik; Hover, John; Klimentov, Alexei; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Padolski, Siarhei; Panitkin, Sergey; Petrosyan, Artem; Wenaus, Torre

    2016-02-01

    After a scheduled maintenance and upgrade period, the world's largest and most powerful machine, the Large Hadron Collider (LHC), is about to enter its second run at unprecedented energies. In order to exploit the scientific potential of the machine, the experiments at the LHC face computational challenges with enormous data volumes that need to be analysed by thousands of physics users and compared to simulated data. Given diverse funding constraints, the computational resources for the LHC have been deployed in a worldwide mesh of data centres, connected to each other through Grid technologies. The PanDA (Production and Distributed Analysis) system was developed in 2005 for the ATLAS experiment on top of this heterogeneous infrastructure to seamlessly integrate the computational resources and give the users the feeling of a unique system. Since its origins, PanDA has evolved together with upcoming computing paradigms in and outside HEP, such as changes in the networking model, Cloud Computing and HPC. It currently runs steadily on up to 200 thousand simultaneous cores (limited by the available resources for ATLAS), handles up to two million aggregated jobs per day, and processes over an exabyte of data per year. The success of PanDA in ATLAS is triggering widespread adoption and testing by other experiments. In this contribution we give an overview of the PanDA components and focus on the new features and upcoming challenges that are relevant to the next decade of distributed computing workload management using PanDA.

  19. R and D Productivity: New Challenges for the US Space Program

    NASA Technical Reports Server (NTRS)

    Baskin, O. W. (Editor); Sullivan, L. J. (Editor)

    1985-01-01

    Various topics related to research and development activities applicable to the U.S. space program are discussed. Project management, automatic control technology, human resources, management information systems, computer aided design, systems engineering, and personnel management were among the topics covered.

  20. Architecture for distributed design and fabrication

    NASA Astrophysics Data System (ADS)

    McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.

    1997-01-01

    We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semiconductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.

  1. Automating NEURON Simulation Deployment in Cloud Resources.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
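    The paper's central idea, a single common interface behind which local servers, HPC schedulers, and cloud instances are recruited interchangeably, can be sketched in a few lines. The class and method names below are hypothetical illustrations, not NeuroManager's actual API.

    ```python
    from abc import ABC, abstractmethod
    from itertools import cycle

    class ComputeResource(ABC):
        """A place a NEURON simulation can run: local box, HPC queue, or cloud VM."""
        def __init__(self, name):
            self.name = name

        @abstractmethod
        def submit(self, sim_id):
            """Run one simulation and return a result record."""

    class LocalServer(ComputeResource):
        def submit(self, sim_id):
            return {"sim": sim_id, "backend": f"local:{self.name}"}

    class HpcCluster(ComputeResource):
        def submit(self, sim_id):
            return {"sim": sim_id, "backend": f"hpc:{self.name}"}

    class CloudInstance(ComputeResource):
        def submit(self, sim_id):
            return {"sim": sim_id, "backend": f"cloud:{self.name}"}

    class SimulationManager:
        """Recruits heterogeneous resources and round-robins simulations over them,
        so the user sees one interface regardless of where each job lands."""
        def __init__(self, resources):
            self.resources = list(resources)

        def run_set(self, sim_ids):
            pool = cycle(self.resources)
            return [next(pool).submit(s) for s in sim_ids]
    ```

    The design point the paper makes is exactly this separation: each resource type hides its own submission mechanics (SSH, batch scheduler, cloud API) behind one `submit` call, so a simulation set can mix backends freely.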

  2. Automating NEURON Simulation Deployment in Cloud Resources

    PubMed Central

    Santamaria, Fidel

    2016-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon’s proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model. PMID:27655341

  3. Information Technology and the Autonomous Control of a Mars In-Situ Propellant Production System

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Sridhar, K. R.; Larson, William E.; Clancy, Daniel J.; Peschur, Charles; Briggs, Geoffrey A.; Zornetzer, Steven F. (Technical Monitor)

    1999-01-01

    With the rapidly increasing performance of information technology, i.e., computer hardware and software systems, as well as networks and communication systems, a new capability is being developed that holds the clear promise of greatly increased exploration capability, along with dramatically reduced design, development, and operating costs. These new intelligent systems technologies, utilizing knowledge-based software and very high performance computer systems, will provide new design and development tools, scheduling mechanisms, and vehicle and system health monitoring capabilities. In addition, specific technologies such as neural nets will provide a degree of machine intelligence and associated autonomy which has previously been unavailable to the mission and spacecraft designer and to the system operator. One of the most promising applications of these new information technologies is to the area of in situ resource utilization. Useful resources such as oxygen, compressed carbon dioxide, water, methane, and buffer gases can be extracted and/or generated from planetary atmospheres, such as the Martian atmosphere. These products, when used for propulsion and life-support needs, can provide significant savings in the launch mass and costs for both robotic and crewed missions. In the longer term the utilization of indigenous resources is an enabling technology that is vital to sustaining long duration human presence on Mars. This paper will present the concepts that are currently under investigation and development for mining the Martian atmosphere, such as temperature-swing adsorption, zirconia electrolysis, etc., to create propellants and life-support materials. This description will be followed by an analysis of the information technology and control needs for the reliable and autonomous operation of such processing plants in a fault tolerant manner, as well as the approach being taken for the development of the controlling software. Finally, there will be a brief discussion of the verification and validation process so crucial to the implementation of mission-critical software.

  4. AgRISTARS: Renewable resources inventory. Land information support system implementation plan and schedule. [San Juan National Forest pilot test

    NASA Technical Reports Server (NTRS)

    Yao, S. S. (Principal Investigator)

    1981-01-01

    The planning and scheduling of the use of remote sensing and computer technology to support the land management planning effort at the national forests level are outlined. The task planning and system capability development were reviewed. A user evaluation is presented along with technological transfer methodology. A land management planning pilot test of the San Juan National Forest is discussed.

  5. Research on numerical control system based on S3C2410 and MCX314AL

    NASA Astrophysics Data System (ADS)

    Ren, Qiang; Jiang, Tingbiao

    2008-10-01

    With the rapid development of microcomputer technology, embedded systems, CNC technology, and integrated circuits, a numerical control system with powerful functions can be realized with a few high-speed CPU chips and RISC (Reduced Instruction Set Computing) chips of small size and strong stability. In addition, real-time operating systems also make embedded implementations possible. Developing an NC system based on embedded technology can overcome some shortcomings of common PC-based CNC systems, such as wasted resources, low control precision, low frequency, and low integration. This paper discusses a hardware platform for an ENC (Embedded Numerical Control) system based on the embedded processor chip ARM (Advanced RISC Machines) S3C2410 and the DSP (Digital Signal Processor) MCX314AL, and introduces the process of developing the ENC system software. Finally, the MCX314AL driver is written for the embedded Linux operating system. Embedded Linux handles multitasking well and satisfies the real-time and reliability requirements of motion control. With embedded technology, the NC system makes the best use of resources and remains compact, providing a wealth of functions and superior performance at a lower cost. ENC is thus clearly the direction of future development.

  6. Model documentation renewable fuels module of the National Energy Modeling System

    NASA Astrophysics Data System (ADS)

    1995-06-01

    This report documents the objectives, analytical approach, and design of the National Energy Modeling System (NEMS) Renewable Fuels Module (RFM) as it relates to the production of the 1995 Annual Energy Outlook (AEO95) forecasts. The report catalogs and describes modeling assumptions, computational methodologies, data inputs, and parameter estimation techniques. A number of offline analyses used in lieu of RFM modeling components are also described. The RFM consists of six analytical submodules that represent each of the major renewable energy resources -- wood, municipal solid waste (MSW), solar energy, wind energy, geothermal energy, and alcohol fuels. The RFM also reads in hydroelectric facility capacities and capacity factors from a data file for use by the NEMS Electricity Market Module (EMM). The purpose of the RFM is to define the technological, cost, and resource size characteristics of renewable energy technologies. These characteristics are used to compute a levelized cost to be competed against other similarly derived costs from other energy sources and technologies. The competition of these energy sources over the NEMS time horizon determines the market penetration of these renewable energy technologies. The characteristics include available energy capacity, capital costs, fixed operating costs, variable operating costs, capacity factor, heat rate, construction lead time, and fuel product price.
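    The levelized-cost competition the RFM feeds into NEMS can be illustrated with the standard levelized cost of energy (LCOE) calculation over exactly the characteristics the abstract lists (capital cost, fixed and variable O&M, capacity factor, heat rate). This is the textbook formula, not necessarily the exact NEMS methodology, and all parameter values are illustrative.

    ```python
    def capital_recovery_factor(rate, years):
        """Annualizes an up-front capital cost over the plant lifetime."""
        q = (1.0 + rate) ** years
        return rate * q / (q - 1.0)

    def levelized_cost(capex, fixed_om, var_om, capacity_factor,
                       heat_rate=0.0, fuel_price=0.0, rate=0.07, life=30):
        """Levelized cost in $/MWh.
        capex: $/kW; fixed_om: $/kW-yr; var_om: $/MWh;
        heat_rate: MMBtu/MWh; fuel_price: $/MMBtu.
        Renewables such as wind enter the competition with heat_rate = 0."""
        mwh_per_kw_yr = 8.76 * capacity_factor  # 8760 h/yr => 8.76 MWh per kW at CF = 1
        annual_cost = capital_recovery_factor(rate, life) * capex + fixed_om
        return annual_cost / mwh_per_kw_yr + var_om + heat_rate * fuel_price
    ```

    For example, `levelized_cost(capex=1500, fixed_om=40, var_om=0, capacity_factor=0.35)` prices a hypothetical wind plant; competing technologies are then ranked by these levelized costs, which is the market-penetration mechanism the abstract describes.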

  7. Central Limit Theorem: New SOCR Applet and Demonstration Activity

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas; Sanchez, Juana

    2011-01-01

    Modern approaches for information technology based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multifaceted learning environments, which may facilitate student comprehension and information retention. In this manuscript, we describe one such innovative effort of using technological tools for improving student motivation and learning of the theory, practice and usability of the Central Limit Theorem (CLT) in probability and statistics courses. Our approach is based on harnessing the computational libraries developed by the Statistics Online Computational Resource (SOCR) to design a new interactive Java applet and a corresponding demonstration activity that illustrate the meaning and the power of the CLT. The CLT applet and activity have clear common goals: to provide graphical representation of the CLT, to improve student intuition, and to empirically validate and establish the limits of the CLT. The SOCR CLT activity consists of four experiments that demonstrate the assumptions, meaning and implications of the CLT and ties these to specific hands-on simulations. We include a number of examples illustrating the theory and applications of the CLT. Both the SOCR CLT applet and activity are freely available online to the community to test, validate and extend (Applet: http://www.socr.ucla.edu/htmls/SOCR_Experiments.html and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_GeneralCentralLimitTheorem). PMID:21833159

  8. Central Limit Theorem: New SOCR Applet and Demonstration Activity.

    PubMed

    Dinov, Ivo D; Christou, Nicolas; Sanchez, Juana

    2008-07-01

    Modern approaches for information technology based blended education utilize a variety of novel instructional, computational and network resources. Such attempts employ technology to deliver integrated, dynamically linked, interactive content and multifaceted learning environments, which may facilitate student comprehension and information retention. In this manuscript, we describe one such innovative effort of using technological tools for improving student motivation and learning of the theory, practice and usability of the Central Limit Theorem (CLT) in probability and statistics courses. Our approach is based on harnessing the computational libraries developed by the Statistics Online Computational Resource (SOCR) to design a new interactive Java applet and a corresponding demonstration activity that illustrate the meaning and the power of the CLT. The CLT applet and activity have clear common goals: to provide graphical representation of the CLT, to improve student intuition, and to empirically validate and establish the limits of the CLT. The SOCR CLT activity consists of four experiments that demonstrate the assumptions, meaning and implications of the CLT and ties these to specific hands-on simulations. We include a number of examples illustrating the theory and applications of the CLT. Both the SOCR CLT applet and activity are freely available online to the community to test, validate and extend (Applet: http://www.socr.ucla.edu/htmls/SOCR_Experiments.html and Activity: http://wiki.stat.ucla.edu/socr/index.php/SOCR_EduMaterials_Activities_GeneralCentralLimitTheorem).
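    The empirical validation this activity describes can be reproduced outside the applet with a few lines of simulation. The sketch below is a standard CLT demonstration, not SOCR code: it draws repeated samples from a strongly skewed Exponential(1) population and checks that the sample means behave as the CLT predicts.

    ```python
    import random
    import statistics

    def sample_means(n, trials=2000, seed=1):
        """Means of `trials` samples of size n drawn from an Exponential(1)
        population (population mean 1, population standard deviation 1)."""
        rng = random.Random(seed)
        return [statistics.fmean(rng.expovariate(1.0) for _ in range(n))
                for _ in range(trials)]

    # CLT prediction: the sample means cluster near the population mean 1
    # with spread ~ 1/sqrt(n), even though the population itself is skewed.
    ```

    Plotting a histogram of `sample_means(30)` against a normal curve with mean 1 and standard deviation 1/sqrt(30) shows exactly the convergence the activity's four experiments illustrate, and increasing n shrinks the spread by the predicted factor.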

  9. The Internet: friend or foe when providing patient education?

    PubMed

    Anderson, Amy Shelton; Klemm, Paula

    2008-02-01

    The Internet has changed how patients with cancer learn about and cope with their disease. Newly diagnosed patients with cancer often have complex educational and informational needs related to diagnosis and treatment. Nurses frequently encounter time and work-related constraints that can interfere with the provision of patient education. They are challenged to educate patients in an environment of rapidly expanding and innovative computer technology. Barriers that hinder nurses in integrating educational Internet resources into patient care include lack of training, time constraints, and inadequate administrative support. Advantages of Internet use for patient education and support include wide-ranging and current information, a variety of teaching formats, patient empowerment, new communication options, and support 24 hours a day, seven days a week. Pitfalls associated with Internet use for patients with cancer include inaccurate information, lack of access, poor quality of online resources, and security and privacy issues. Nurses routinely use computer technology in the workplace and follow rigorous security and privacy standards to protect patient information. Those skills can provide the foundation for the use of online sources for patient teaching. Nurses play an important role in helping patients evaluate the veracity of online information and introducing them to reliable Internet resources.

  10. Ways of thinking about and teaching ethical problem solving: microethics and macroethics in engineering.

    PubMed

    Herkert, Joseph R

    2005-07-01

    Engineering ethics entails three frames of reference: individual, professional, and social. "Microethics" considers individuals and internal relations of the engineering profession; "macroethics" applies to the collective social responsibility of the profession and to societal decisions about technology. Most research and teaching in engineering ethics, including online resources, has had a "micro" focus. Mechanisms for incorporating macroethical perspectives include: integrating engineering ethics and science, technology and society (STS); closer integration of engineering ethics and computer ethics; and consideration of the influence of professional engineering societies and corporate social responsibility programs on ethical engineering practice. Integrating macroethical issues and concerns in engineering ethics involves broadening the context of ethical problem solving. This in turn implies: developing courses emphasizing both micro and macro perspectives, providing faculty development that includes training in both STS and practical ethics; and revision of curriculum materials, including online resources. Multidisciplinary collaboration is recommended 1) to create online case studies emphasizing ethical decision making in individual, professional, and societal contexts; 2) to leverage existing online computer ethics resources with relevance to engineering education and practice; and 3) to create transparent linkages between public policy positions advocated by professional societies and codes of ethics.

  11. Design and Implement of Astronomical Cloud Computing Environment In China-VO

    NASA Astrophysics Data System (ADS)

    Li, Changhua; Cui, Chenzhou; Mi, Linying; He, Boliang; Fan, Dongwei; Li, Shanshan; Yang, Sisi; Xu, Yunfei; Han, Jun; Chen, Junyi; Zhang, Hailong; Yu, Ce; Xiao, Jian; Wang, Chuanjun; Cao, Zihuang; Fan, Yufeng; Liu, Liang; Chen, Xiao; Song, Wenming; Du, Kangyu

    2017-06-01

    The astronomy cloud computing environment is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on virtualization technology, the astronomy cloud computing environment was designed and implemented by the China-VO team. It consists of five distributed nodes across the mainland of China. Astronomers can obtain computing and storage resources in this cloud computing environment. Through this environment, astronomers can easily search and analyze astronomical data collected by different telescopes and data centers, and avoid large-scale dataset transportation.

  12. Enabling Large-Scale Biomedical Analysis in the Cloud

    PubMed Central

    Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. This planet-size data brings serious challenges to storage and computing technologies. Cloud computing is an alternative way to crack the nut because it gives concurrent consideration to storage and high-performance computing on large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should help biomedical research make the vast amount of diverse data meaningful and usable. PMID:24288665

  13. On the development of an interactive resource information management system for analysis and display of spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Schell, J. A.

    1974-01-01

    The recent availability of timely synoptic earth imagery from the Earth Resources Technology Satellites (ERTS) provides a wealth of information for the monitoring and management of vital natural resources. Formal language definitions and syntax interpretation algorithms were adapted to provide a flexible, computer information system for the maintenance of resource interpretation of imagery. These techniques are incorporated, together with image analysis functions, into an Interactive Resource Information Management and Analysis System, IRIMAS, which is implemented on a Texas Instruments 980A minicomputer system augmented with a dynamic color display for image presentation. A demonstration of system usage and recommendations for further system development are also included.

  14. Cyber-workstation for computational neuroscience.

    PubMed

    Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C

    2010-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.
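    One of the "existing models" named above, the recursive least-squares (RLS) regressor, updates its weights one sample at a time, which is what makes it suitable for the real-time, closed-loop use the CW's block diagrams support. Below is a generic pure-Python RLS update, a textbook implementation rather than the CW's own code.

    ```python
    class RLS:
        """Recursive least-squares regressor: w is updated per sample and
        P tracks the inverse input-correlation matrix. lam is the forgetting
        factor (lam = 1 means no forgetting)."""
        def __init__(self, dim, lam=1.0, p0=1e6):
            self.lam = lam
            self.w = [0.0] * dim
            # P starts as a large multiple of the identity (uninformative prior)
            self.P = [[p0 if i == j else 0.0 for j in range(dim)]
                      for i in range(dim)]

        def update(self, x, y):
            d = len(x)
            Px = [sum(self.P[i][j] * x[j] for j in range(d)) for i in range(d)]
            denom = self.lam + sum(x[i] * Px[i] for i in range(d))
            k = [v / denom for v in Px]                      # gain vector
            err = y - sum(wi * xi for wi, xi in zip(self.w, x))
            self.w = [wi + ki * err for wi, ki in zip(self.w, k)]
            # P <- (P - k x^T P) / lam
            xP = [sum(x[i] * self.P[i][j] for i in range(d)) for j in range(d)]
            self.P = [[(self.P[i][j] - k[i] * xP[j]) / self.lam
                       for j in range(d)] for i in range(d)]
            return err
    ```

    Feeding noiseless samples of a linear map such as y = 2*x1 - x2 drives `w` to the true coefficients within a handful of updates, with no batch re-fit, which is the property that lets such a block run inside an experiment loop.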

  15. Cyber-Workstation for Computational Neuroscience

    PubMed Central

    DiGiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C.; Fortes, Jose; Sanchez, Justin C.

    2009-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface. PMID:20126436

  16. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic service with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from cost-saving tools to revenue generators, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  17. Distributed intrusion detection system based on grid security model

    NASA Astrophysics Data System (ADS)

    Su, Jie; Liu, Yahui

    2008-03-01

    Grid computing has developed rapidly with the development of network technology, and it can solve the problem of large-scale complex computing by sharing large-scale computing resources. In a grid environment, we can realize a distributed and load-balanced intrusion detection system. This paper first discusses the security mechanisms in grid computing and the function of PKI/CA in the grid security system, then gives the application of grid computing characteristics to a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it gives a distributed intrusion detection system based on the grid security system that can reduce processing delay and maintain detection rates.

  18. The Computer-based Lecture

    PubMed Central

    Wofford, Marcia M; Spickard, Anderson W; Wofford, James L

    2001-01-01

    Advancing computer technology, cost-containment pressures, and desire to make innovative improvements in medical education argue for moving learning resources to the computer. A reasonable target for such a strategy is the traditional clinical lecture. The purpose of the lecture, the advantages and disadvantages of “live” versus computer-based lectures, and the technical options in computerizing the lecture deserve attention in developing a cost-effective, complementary learning strategy that preserves the teacher-learner relationship. Based on a literature review of the traditional clinical lecture, we build on the strengths of the lecture format and discuss strategies for converting the lecture to a computer-based learning presentation. PMID:11520384

  19. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily allow visitors to volunteer their computing resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
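    The distribution scheme described, splitting a simulation into small spatial work units, farming them out to volunteers, and merging the partial results, can be sketched independently of the browser machinery. The Python below stands in for the JavaScript clients and assumes a trivially parallel cell-by-cell runoff computation; both the model and the function names are hypothetical, not the authors' framework.

    ```python
    def make_work_units(cells, unit_size):
        """Split the model domain into small chunks a volunteer can finish quickly."""
        return [cells[i:i + unit_size] for i in range(0, len(cells), unit_size)]

    def volunteer_compute(unit):
        """What one volunteer node would do in its browser:
        runoff = rainfall * runoff coefficient, summed over the unit's cells."""
        return sum(rain * coeff for rain, coeff in unit)

    def run_distributed(cells, unit_size=4):
        """Server side: queue the units, collect partial results, merge them.
        (Here the 'volunteers' are just local calls.)"""
        partials = [volunteer_compute(u) for u in make_work_units(cells, unit_size)]
        return sum(partials)
    ```

    In the real framework each `volunteer_compute` call happens in a visitor's browser, and the relational database plays the queue-manager role: tracking which units are queued, in flight, or done, and reissuing units whose volunteer disappears mid-computation.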

  20. Megatux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-25

The Megatux platform enables the emulation of large-scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes, or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware but run actual software, enabling large scale without sacrificing fidelity.

  1. Pulse!!: a model for research and development of virtual-reality learning in military medical education and training.

    PubMed

    Dunne, James R; McDonald, Claudia L

    2010-07-01

    Pulse!! The Virtual Clinical Learning Lab at Texas A&M University-Corpus Christi, in collaboration with the United States Navy, has developed a model for research and technological development that they believe is an essential element in the future of military and civilian medical education. The Pulse!! project models a strategy for providing cross-disciplinary expertise and resources to educational, governmental, and business entities challenged with meeting looming health care crises. It includes a three-dimensional virtual learning platform that provides unlimited, repeatable, immersive clinical experiences without risk to patients, and is available anywhere there is a computer. Pulse!! utilizes expertise in the fields of medicine, medical education, computer science, software engineering, physics, computer animation, art, and architecture. Lab scientists collaborate with the commercial virtual-reality simulation industry to produce research-based learning platforms based on cutting-edge computer technology.

  2. Tell Me More: Issues and Challenges

    ERIC Educational Resources Information Center

    Hashim, Harwati; Yunus, Melor MD.

    2012-01-01

Integration of technology into language education has become an everyday occurrence. Educational multimedia courseware has been produced extensively as resource material to enhance the teaching and learning of the English language. Regardless of cost, computers and courseware are becoming important tools for learning in institutions. Therefore, a…

  3. Avoiding Road-Kill on the Information Highway.

    ERIC Educational Resources Information Center

    Marcus, Stephen

    1994-01-01

    Describes the available computer network resources and some of the relevant terms, practices, and notions that are quickly evolving with the growth of the so-called information superhighway. Covers the historical trends and considers future challenges to be faced in view of changing technologies. (HB)

  4. 40 CFR 1.33 - Office of Administration and Resources Management.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... provides national program policy and technical guidance for: The acquisition of all information technology... acquired by grantees and contractors using Agency funds; the operation of all Agency computers and... Management Interns, OHRM establishes policies; assesses and projects Agency executive needs and workforce...

  5. 40 CFR 1.33 - Office of Administration and Resources Management.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... provides national program policy and technical guidance for: The acquisition of all information technology... acquired by grantees and contractors using Agency funds; the operation of all Agency computers and... Management Interns, OHRM establishes policies; assesses and projects Agency executive needs and workforce...

  6. 40 CFR 1.33 - Office of Administration and Resources Management.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... provides national program policy and technical guidance for: The acquisition of all information technology... acquired by grantees and contractors using Agency funds; the operation of all Agency computers and... Management Interns, OHRM establishes policies; assesses and projects Agency executive needs and workforce...

  7. 40 CFR 1.33 - Office of Administration and Resources Management.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... provides national program policy and technical guidance for: The acquisition of all information technology... acquired by grantees and contractors using Agency funds; the operation of all Agency computers and... Management Interns, OHRM establishes policies; assesses and projects Agency executive needs and workforce...

  8. JINR cloud infrastructure evolution

    NASA Astrophysics Data System (ADS)

    Baranov, A. V.; Balashov, N. A.; Kutovskiy, N. A.; Semenov, R. N.

    2016-09-01

To fulfil JINR commitments in various national and international projects involving modern information technologies such as cloud and grid computing, and to provide a modern tool for JINR users' scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. OpenNebula was chosen as the cloud platform. Initially it was set up in a simple configuration with a single front-end host and a few cloud nodes. Some custom development was done to tune the JINR cloud installation to local needs: a web form in the cloud web interface for resource requests, a menu item with cloud utilization statistics, user authentication via Kerberos, and a custom driver for OpenVZ containers. Because of high demand for the cloud service and over-utilization of its resources, it was redesigned to meet users' increasing needs for capacity, availability, and reliability. Recently a new cloud instance has been deployed in a high-availability configuration with a distributed network file system and additional computing power.

  9. Feasibility study of scanning celestial Attitude System (SCADS) for Earth Resources Technology Satellite (ERTS)

    NASA Technical Reports Server (NTRS)

    1971-01-01

The feasibility of using the Scanning Celestial Attitude Determination System (SCADS) during Earth Resources Technology Satellite (ERTS) missions to compute an accurate spacecraft attitude from stellar measurements is considered. The spacecraft is local-vertical-stabilized. A heuristic discussion of the SCADS concept is first given. Two concepts are introduced: a passive system that contains no moving parts, and an active system in which the reticle rotates about the sensor's axis. A fairly complete development of the equations of attitude motion is then given. These equations are used to generate the true attitude, which in turn is used to compute the transit times of detectable stars and to determine the errors associated with the SCADS attitude. A fuller discussion of the analytical foundation of the SCADS concept and its use for the geometries particular to this study, as well as the salient design parameters for the passive and active systems, is included.

  10. On transferring the grid technology to the biomedical community.

    PubMed

    Mohammed, Yassene; Sax, Ulrich; Dickmann, Frank; Lippert, Joerg; Solodenko, Juri; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

Natural scientists such as physicists pioneered the sharing of computing resources, which resulted in the Grid. The inter-domain transfer of this technology has been an intuitive process. Some of the difficulties facing the life-science community can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology-transfer approaches deal with technologies that have achieved a certain stability; Grid and Cloud solutions are technologies still in flux. We illustrate how Grid computing creates new difficulties for the technology transfer process that are not considered in Bozeman's model. We show why the success of health Grids should be measured by the qualified scientific human capital and opportunities created, and not primarily by market impact. With two examples we show how Grid technology-transfer theory corresponds to reality. We conclude with recommendations that can help improve the adoption of Grid solutions in the biomedical community. These results give a more concise explanation of the difficulties most life-science IT projects face in their late funding periods, and show some leveraging steps that can help overcome the "vale of tears".

  11. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.

  12. Sustainable mobile information infrastructures in low resource settings.

    PubMed

    Braa, Kristin; Purkayastha, Saptarshi

    2010-01-01

Developing countries represent the fastest-growing mobile markets in the world. For people with no computing access, a mobile phone will be their first computing device. Mobile technologies offer significant potential to strengthen health systems in developing countries with respect to community-based monitoring, reporting, feedback to service providers, and strengthening communication and coordination between different health functionaries, medical officers, and the community. However, there are various challenges in realizing this potential, including technological issues such as lack of power, as well as social, institutional, and usage issues. In this paper, a case study from India on mobile health implementation and use is reported. An underlying principle guiding this paper is to see mobile technology not as a "stand-alone device" but as a potentially integral component of an integrated, mobile-supported health information infrastructure.

  13. Technology development of the Space Transportation System mission and terrestrial applications of satellite technology

    NASA Technical Reports Server (NTRS)

    1981-01-01

The Space Transportation System (STS) is discussed, including the launch processing system, the thermal protection subsystem, meteorological research, the sound suppression water system, the rotating service structure, improved hypergol removal systems, fiber optics research, precision positioning, remote-controlled solid rocket booster nozzle plugs, ground operations for the Centaur orbital transfer vehicle, parachute drying, STS hazardous waste disposal and recycling, toxic waste technology and control concepts, a fast analytical densitometry study, the shuttle inventory management system, operational intercommunications system improvement, and the protective garment ensemble. Terrestrial applications are also covered, including LANDSAT applications to water resources, a satellite freeze-forecast system, application of ground-penetrating radar to soil survey, turtle tracking, evaluating computer-drawn ground-cover maps, a sparkless load pulsar, and coupling a microcomputer and computing integrator with a gas chromatograph.

  14. A Look at the Impact of High-End Computing Technologies on NASA Missions

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Dunbar, Jill; Hardman, John; Bailey, F. Ron; Wheeler, Lorien; Rogers, Stuart

    2012-01-01

From its bold start nearly 30 years ago and continuing today, the NASA Advanced Supercomputing (NAS) facility at Ames Research Center has enabled remarkable breakthroughs in the space agency's science and engineering missions. Throughout this time, NAS experts have influenced the state of the art in high-performance computing (HPC) and related technologies such as scientific visualization, system benchmarking, batch scheduling, and grid environments. We highlight the pioneering achievements and innovations originating from and made possible by NAS resources and know-how: from early supercomputing environment design and software development, to long-term simulation and analyses critical to designing safe Space Shuttle operations and associated spinoff technologies, to the highly successful Kepler Mission's discovery of new planets now capturing the world's imagination.

  15. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    PubMed

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.

  16. Linear equations and rap battles: how students in a wired classroom utilized the computer as a resource to coordinate personal and mathematical positional identities in hybrid spaces

    NASA Astrophysics Data System (ADS)

    Langer-Osuna, Jennifer

    2015-03-01

    This paper draws on the constructs of hybridity, figured worlds, and cultural capital to examine how a group of African-American students in a technology-driven, project-based algebra classroom utilized the computer as a resource to coordinate personal and mathematical positional identities during group work. Analyses of several vignettes of small group dynamics highlight how hybridity was established as the students engaged in multiple on-task and off-task computer-based activities, each of which drew on different lived experiences and forms of cultural capital. The paper ends with a discussion on how classrooms that make use of student-led collaborative work, and where students are afforded autonomy, have the potential to support the academic engagement of students from historically marginalized communities.

  17. Scientific American Inventions From Outer Space: Everyday Uses For NASA Technology

    NASA Technical Reports Server (NTRS)

    Baker, David

    2000-01-01

The purpose of this book is to present some of the inventions highlighted in the National Aeronautics and Space Administration's (NASA's) yearly publication Spinoff. These inventions span a wide range, including improvements in health, medicine, public safety, energy, environment, resource management, computer technology, automation, construction, transportation, and manufacturing technology. NASA technology has brought forth thousands of commercial products, including athletic shoes, portable x-ray machines, scratch-resistant sunglasses, guidance systems, lasers, solar power, robotics, and prosthetic devices. These products are examples of NASA research innovations that have positively impacted the community.

  18. Economic Migrants in a Global Labour Market: A Report on the Recruitment and Retention of Asian Computer Professionals by Canadian High Tech Firms. CPRN Discussion Paper.

    ERIC Educational Resources Information Center

    Rao, Badrinath

    The recruitment and retention of Asian computer professionals by Canadian high-tech companies was examined by interviewing 8 Canadian-born information technology (IT) workers, 47 Asian-born IT workers, and 8 human resource (HR) professionals employed by high-tech companies in Ottawa. Of the 47 Asians, 33 stated that they did not know much about…

  19. The transforming effect of handheld computers on nursing practice.

    PubMed

    Thompson, Brent W

    2005-01-01

    Handheld computers have the power to transform nursing care. The roots of this power are the shift to decentralization of communication, electronic health records, and nurses' greater need for information at the point of care. This article discusses the effects of handheld resources, calculators, databases, electronic health records, and communication devices on nursing practice. The US government has articulated the necessity of implementing the use of handheld computers in healthcare. Nurse administrators need to encourage and promote the diffusion of this technology, which can reduce costs and improve care.

  20. Application of SLURM, BOINC, and GlusterFS as Software System for Sustainable Modeling and Data Analytics

    NASA Astrophysics Data System (ADS)

    Kashansky, Vladislav V.; Kaftannikov, Igor L.

    2018-02-01

Modern numerical modeling experiments and data analytics problems in various fields of science and technology impose a wide variety of demanding requirements on distributed computing systems. Many scientific computing projects exceed the limits of the available resource pool, requiring extra scalability and sustainability. In this paper we share our own experience and findings on combining the power of SLURM, BOINC, and GlusterFS as a software system for scientific computing. In particular, we propose a complete architecture and highlight important aspects of systems integration.
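One way to read the SLURM-plus-BOINC combination is as an overflow policy: keep the dedicated cluster saturated and spill the remaining work units to the volunteer pool. The sketch below illustrates only that routing idea; the slot count, job names, and `dispatch` function are hypothetical, not the paper's architecture, and a real system would query SLURM (e.g. via `sinfo`) for free slots.

```python
# Hypothetical overflow router: fill the SLURM cluster first, then send the
# remaining jobs to the BOINC volunteer pool for extra scalability.
SLURM_SLOTS = 4  # free cluster slots; a real system would query the scheduler

def dispatch(jobs, slurm_free_slots):
    """Split a job list into (cluster jobs, volunteer-pool jobs)."""
    slurm_jobs = jobs[:slurm_free_slots]   # dedicated, low-latency resources
    boinc_jobs = jobs[slurm_free_slots:]   # elastic, best-effort resources
    return slurm_jobs, boinc_jobs

jobs = [f"model-run-{i}" for i in range(7)]
to_slurm, to_boinc = dispatch(jobs, SLURM_SLOTS)
print(len(to_slurm), len(to_boinc))  # 4 3
```

A shared file system such as GlusterFS then plays the role of the common staging area that both back ends read inputs from and write results to.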

  1. Snapshot of the supports of communication used by patients at a French psychiatric hospital: a digital or social division?

    PubMed

    Girard, Murielle; Nubukpo, Philippe; Malauzat, Dominique

    2017-02-01

The role of information and communications technology is becoming increasingly prevalent in daily life and in the organization of medical care: are some people being left out? The aim was to evaluate access to and use of communication resources by psychiatric patients, focusing on the means of communication (e.g. mobile phones and computers) and on access to and frequency of internet use. A questionnaire was distributed, over a period of 1 week, to inpatients and day-hospitalised patients aged over 12 years in all care units. Access to and use of modern communication resources were lower than in the general population. Among places and means of internet consultation, the personal computer was cited most often, but only by 34%, and the use of mobile phones is still not widespread. Finally, day-hospitalised subjects, the elderly, and subjects treated in the psychosis care sector use the internet and technology the least. Differences exist between this population with mental illness and the general population in the use of new communication technologies. Integrating these techniques into individualized psychiatric care requires prior equipment and/or updates.

  2. Developing a Science Commons for Geosciences

    NASA Astrophysics Data System (ADS)

    Lenhardt, W. C.; Lander, H.

    2016-12-01

    Many scientific communities, recognizing the research possibilities inherent in data sets, have created domain specific archives such as the Incorporated Research Institutions for Seismology (iris.edu) and ClinicalTrials.gov. Though this is an important step forward, most scientists, including geoscientists, also use a variety of software tools and at least some amount of computation to conduct their research. While the archives make it simpler for scientists to locate the required data, provisioning disk space, compute resources, and network bandwidth can still require significant efforts. This challenge exists despite the wealth of resources available to researchers, namely lab IT resources, institutional IT resources, national compute resources (XSEDE, OSG), private clouds, public clouds, and the development of cyberinfrastructure technologies meant to facilitate use of those resources. Further tasks include obtaining and installing required tools for analysis and visualization. If the research effort is a collaboration or involves certain types of data, then the partners may well have additional non-scientific tasks such as securing the data and developing secure sharing methods for the data. These requirements motivate our investigations into the "Science Commons". This paper will present a working definition of a science commons, compare and contrast examples of existing science commons, and describe a project based at RENCI to implement a science commons for risk analytics. We will then explore what a similar tool might look like for the geosciences.

  3. A framework for analyzing the cognitive complexity of computer-assisted clinical ordering.

    PubMed

    Horsky, Jan; Kaufman, David R; Oppenheim, Michael I; Patel, Vimla L

    2003-01-01

    Computer-assisted provider order entry is a technology that is designed to expedite medical ordering and to reduce the frequency of preventable errors. This paper presents a multifaceted cognitive methodology for the characterization of cognitive demands of a medical information system. Our investigation was informed by the distributed resources (DR) model, a novel approach designed to describe the dimensions of user interfaces that introduce unnecessary cognitive complexity. This method evaluates the relative distribution of external (system) and internal (user) representations embodied in system interaction. We conducted an expert walkthrough evaluation of a commercial order entry system, followed by a simulated clinical ordering task performed by seven clinicians. The DR model was employed to explain variation in user performance and to characterize the relationship of resource distribution and ordering errors. The analysis revealed that the configuration of resources in this ordering application placed unnecessarily heavy cognitive demands on the user, especially on those who lacked a robust conceptual model of the system. The resources model also provided some insight into clinicians' interactive strategies and patterns of associated errors. Implications for user training and interface design based on the principles of human-computer interaction in the medical domain are discussed.

  4. A distributed system for fast alignment of next-generation sequencing data.

    PubMed

    Srimani, Jaydeep K; Wu, Po-Yen; Phan, John H; Wang, May D

    2010-12-01

    We developed a scalable distributed computing system using the Berkeley Open Interface for Network Computing (BOINC) to align next-generation sequencing (NGS) data quickly and accurately. NGS technology is emerging as a promising platform for gene expression analysis due to its high sensitivity compared to traditional genomic microarray technology. However, despite the benefits, NGS datasets can be prohibitively large, requiring significant computing resources to obtain sequence alignment results. Moreover, as the data and alignment algorithms become more prevalent, it will become necessary to examine the effect of the multitude of alignment parameters on various NGS systems. We validate the distributed software system by (1) computing simple timing results to show the speed-up gained by using multiple computers, (2) optimizing alignment parameters using simulated NGS data, and (3) computing NGS expression levels for a single biological sample using optimal parameters and comparing these expression levels to that of a microarray sample. Results indicate that the distributed alignment system achieves approximately a linear speed-up and correctly distributes sequence data to and gathers alignment results from multiple compute clients.
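The "approximately linear speed-up" reported above is conventionally computed from wall-clock times as speed-up S = T(1)/T(n) and parallel efficiency E = S/n. The numbers below are illustrative, not the paper's measurements:

```python
# Standard speed-up and efficiency metrics for a distributed run.
def speedup(t_serial, t_parallel):
    """Ratio of single-client wall time to n-client wall time."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_clients):
    """Speed-up normalized by client count; 1.0 is perfectly linear."""
    return speedup(t_serial, t_parallel) / n_clients

t1 = 1200.0  # one compute client, seconds (illustrative)
t8 = 160.0   # eight BOINC clients, seconds (illustrative)
print(round(speedup(t1, t8), 2))        # 7.5
print(round(efficiency(t1, t8, 8), 3))  # 0.938
```

Efficiency close to 1.0 is expected here because sequence alignment is embarrassingly parallel: reads can be split into independent chunks with only the final gather step serialized.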

  5. Offshore Wind Resource, Cost, and Economic Potential in the State of Maine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musial, Walter D.

This report provides information for decision-makers about floating offshore wind technologies in the state of Maine. It summarizes research efforts performed at the National Renewable Energy Laboratory between 2015 and 2017 to analyze the resource potential, cost of offshore wind, and economic potential of offshore wind from four primary reports: Musial et al. (2016); Beiter et al. (2016, 2017); and Mone et al. (unpublished). From Musial et al. (2016), Maine's technical offshore wind resource potential ranked seventh in the nation overall, with more than 411 terawatt-hours/year of offshore resource generating potential. Although 90% of this wind resource has an average velocity greater than 9.0 meters per second, most of the resource is over deep water, where floating wind technology is needed. Levelized cost of energy and levelized avoided cost of energy were computed to estimate the unsubsidized "economic potential" for Maine in the year 2027 (Beiter et al. 2016, 2017). The studies found that Maine may have 65 gigawatts of economic potential by 2027, the highest of any U.S. state. Bottom-line costs for the Aqua Ventus project, which is part of the U.S. Department of Energy's Advanced Technology Demonstration project, were released from a proprietary report written by NREL in 2016 for the University of Maine (Mone et al. unpublished). The report findings were that economies of scale and new technology advancements lowered the cost from $300/megawatt-hour (MWh) for the two-turbine, 12-megawatt (MW) Aqua Ventus 1 project to $126/MWh for the commercial-scale, 498-MW Aqua Ventus 2 project. Further cost reductions to $77/MWh were found when new technology advancements were applied for the 1,000-MW Aqua Ventus 3 project in 2030. No new analysis was conducted for this report.
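Levelized cost of energy, the metric behind the $/MWh figures above, is the ratio of discounted lifetime costs to discounted lifetime energy production. The sketch below shows the standard calculation with purely illustrative inputs; none of the numbers are the NREL figures for Aqua Ventus.

```python
# Standard LCOE: discounted lifetime costs / discounted lifetime energy.
def lcoe(capex, annual_opex, annual_mwh, rate, years):
    """Return levelized cost in $/MWh (capex at year 0, level opex/output)."""
    costs = capex + sum(annual_opex / (1 + rate) ** t
                        for t in range(1, years + 1))
    energy = sum(annual_mwh / (1 + rate) ** t
                 for t in range(1, years + 1))
    return costs / energy

# Illustrative: a hypothetical 12-MW floating pilot at a 45% capacity factor
# (12 MW * 8760 h * 0.45 = 47,304 MWh/yr), 7% discount rate, 25-year life.
cost = lcoe(capex=60e6, annual_opex=2.4e6, annual_mwh=47_304,
            rate=0.07, years=25)
print(round(cost, 1))  # roughly $160/MWh for these made-up inputs
```

The economies of scale noted in the report act through this formula: spreading fixed costs over more megawatt-hours drives the ratio down from pilot-scale to commercial-scale values.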

  6. Rich client data exploration and research prototyping for NOAA

    NASA Astrophysics Data System (ADS)

    Grossberg, Michael; Gladkova, Irina; Guch, Ingrid; Alabi, Paul; Shahriar, Fazlul; Bonev, George; Aizenman, Hannah

    2009-08-01

Data from satellites and model simulations are increasing exponentially as observations and model computing power improve rapidly. Not only is technology producing more data, but the data often come from sources all over the world, and the researchers and scientists who must collaborate are also distributed globally. This work presents a software design and technologies that make it possible for groups of researchers to explore large data sets visually together without needing to download those data sets locally. The design also makes it possible to exploit high-performance computing remotely and transparently to analyze and explore large data sets. Computer power, high-quality sensing, and data storage capacity have improved at a rate that outstrips our ability to develop software applications that exploit these resources. It is impractical for NOAA scientists to download all of the satellite and model data that may be relevant to a given problem, and the computing environments available to a given researcher range from supercomputers to only a web browser. The size and volume of satellite and model data are increasing exponentially. There are at least 50 multisensor satellite platforms collecting Earth science data. On the ground and in the sea there are sensor networks, as well as networks of ground-based radar stations, producing a rich real-time stream of data. This new wealth of data would have limited use were it not for the arrival of large-scale high-performance computation provided by parallel computers, clusters, grids, and clouds. With these computational resources and vast archives available, it is now possible to analyze subtle relationships that are global, multi-modal, and cut across many data sources. Researchers, educators, and even the general public need tools to access, discover, and use vast data center archives and high-performance computing through a simple yet flexible interface.

  7. The Radiology Resident iPad Toolbox: an educational and clinical tool for radiology residents.

    PubMed

    Sharpe, Emerson E; Kendrick, Michael; Strickland, Colin; Dodd, Gerald D

    2013-07-01

    Tablet computing and mobile resources are the hot topics in technology today, with that interest spilling into the medical field. To improve resident education, a fully configured iPad, referred to as the "Radiology Resident iPad Toolbox," was created and implemented at the University of Colorado. The goal was to create a portable device with comprehensive educational, clinical, and communication tools that would contain all necessary resources for an entire 4-year radiology residency. The device was distributed to a total of 34 radiology residents (8 first-year residents, 8 second-year residents, 9 third-year residents, and 9 fourth-year residents). This article describes the process used to develop and deploy the device, provides a distillation of useful applications and resources decided upon after extensive evaluation, and assesses the impact this device had on resident education. The Radiology Resident iPad Toolbox is a cost-effective, portable, educational instrument that has increased studying efficiency; improved access to study materials such as books, radiology cases, lectures, and web-based resources; and increased interactivity in educational conferences and lectures through the use of audience-response software, with questions geared toward the new ABR board format. This preconfigured tablet fully embraces the technology shift into mobile computing and represents a paradigm shift in educational strategy. Copyright © 2013 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  8. A New Approach to Integrate Internet-of-Things and Software-as-a-Service Model for Logistic Systems: A Case Study

    PubMed Central

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-01-01

    Cloud computing is changing the way software is developed and managed in enterprises: dynamically scalable and virtualized resources are delivered as services over the Internet, changing the way business is done. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case, and researchers have called for effective collaboration between different systems, platforms, programming languages, and interfaces. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provisioning, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of Things (IoT) technology with Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study with experimental results, grounded in literature reviews, is presented to demonstrate the system's performance; (3) challenges encountered and feedback collected from T Company in the case study are discussed for the purpose of enterprise deployment. PMID:24686728

  9. A new approach to integrate Internet-of-things and software-as-a-service model for logistic systems: a case study.

    PubMed

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-03-28

    Cloud computing is changing the way software is developed and managed in enterprises: dynamically scalable and virtualized resources are delivered as services over the Internet, changing the way business is done. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case, and researchers have called for effective collaboration between different systems, platforms, programming languages, and interfaces. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provisioning, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of Things (IoT) technology with Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study with experimental results, grounded in literature reviews, is presented to demonstrate the system's performance; (3) challenges encountered and feedback collected from T Company in the case study are discussed for the purpose of enterprise deployment.

  10. The Effects of Educational Technology upon the Critical Thinking and Analytical Skills of Below Grade-Level and/or Non-College Bound High School Students.

    ERIC Educational Resources Information Center

    Santavenere, Alex

    An action research study was undertaken to examine the effects of educational technology resources on critical thinking and analytical skills. The researcher observed 3 different 11th grade classes, a total of 75 students, over a week as they worked in the school's computer lab. Each class was composed of 25 to 30 students, all of whom were…

  11. Technologies and Reform-Based Science Instruction: The Examination of a Professional Development Model Focused on Supporting Science Teaching and Learning with Technologies

    NASA Astrophysics Data System (ADS)

    Campbell, Todd; Longhurst, Max L.; Wang, Shiang-Kwei; Hsu, Hui-Yin; Coster, Dan C.

    2015-10-01

    While access to computers, other technologies, and cyber-enabled resources that could be leveraged for enhancing student learning in science is increasing, it has generally been found that teachers use technology more for administrative purposes or to support traditional instruction. This use of technology, especially to support traditional instruction, sits in opposition to most recent standards documents in science education, which call for student involvement in evidence-based sense-making activities. Many see technology as a potentially powerful resource that is reshaping society and has the potential to do the same in science classrooms. To consider the promise of technology in science classrooms, this research investigated the impact of a professional development project focused on enhancing teacher and student learning by using information and communication technologies (ICTs) for engaging students in reform-based instruction. The findings revealed positive teacher outcomes with respect to reform-based and technology-supported instruction and increased ICT and new literacies skills. For students, the findings revealed positive outcomes with respect to ICT and new literacies skills and student achievement in science.

  12. Applications of computational modeling in ballistics

    NASA Technical Reports Server (NTRS)

    Sturek, Walter B.

    1987-01-01

    The development of the technology of ballistics as applied to gun-launched Army weapon systems is the main objective of research at the U.S. Army Ballistic Research Laboratory (BRL). The primary research programs at the BRL consist of three major ballistic disciplines: exterior, interior, and terminal. The work done at the BRL in these areas was traditionally highly dependent on experimental testing. Considerable emphasis was placed on the development of computational modeling to augment experimental testing in the development cycle; however, the impact of computational modeling to date has been modest. With the supercomputer resources recently installed at the BRL, a new emphasis on the application of computational modeling to ballistics technology is taking place. The major application areas currently receiving considerable attention at the BRL are outlined, along with the modeling approaches involved. An attempt is made to convey the degree of success achieved and to indicate the areas of greatest need.

  13. Cloud Computing Value Chains: Understanding Businesses and Value Creation in the Cloud

    NASA Astrophysics Data System (ADS)

    Mohammed, Ashraf Bany; Altmann, Jörn; Hwang, Junseok

    Based on the promising developments in Cloud Computing technologies in recent years, commercial computing resource services (e.g. Amazon EC2) and software-as-a-service offerings (e.g. Salesforce.com) came into existence. However, the relatively weak business exploitation, participation, and adoption of other Cloud Computing services remain the main challenges. The vague value structures seem to be hindering business adoption and the creation of sustainable business models around the technology. Using an extensive analysis of existing Cloud business models, Cloud services, stakeholder relations, market configurations, and value structures, this chapter develops a reference model for value chains in the Cloud. Although this model is theoretically based on Porter's value chain theory, the proposed Cloud value chain model is adapted to fit the diversity of business service scenarios in Cloud computing markets. Using this model, different service scenarios are explained. Our findings suggest new services, business opportunities, and policy practices for realizing more adoption and value creation paths in the Cloud.

  14. Triple-server blind quantum computation using entanglement swapping

    NASA Astrophysics Data System (ADS)

    Li, Qin; Chan, Wai Hong; Wu, Chunhui; Wen, Zhonghua

    2014-04-01

    Blind quantum computation allows a client who does not have enough quantum resources or technologies to achieve quantum computation on a remote quantum server such that the client's input, output, and algorithm remain unknown to the server. Up to now, single- and double-server blind quantum computation have been considered. In this work, we propose a triple-server blind quantum computation protocol in which the client can delegate quantum computation to three quantum servers by the use of entanglement swapping. Furthermore, the three quantum servers can communicate with each other, and the client is almost classical, since it requires no quantum computational power or quantum memory and no ability to prepare quantum states, needing only access to quantum channels.
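    The protocol's key primitive, entanglement swapping, can be illustrated with a toy statevector calculation: starting from two Bell pairs, a Bell measurement on the two middle qubits leaves the two outer qubits, which never interacted, entangled. A minimal pure-Python sketch (not the paper's protocol itself), considering only the |Φ+⟩ measurement outcome:

```python
import math

# Two Bell pairs |Phi+>_{01} (x) |Phi+>_{23}; amplitudes keyed by (a, b, c, d).
amp = {}
for a in (0, 1):
    for c in (0, 1):
        amp[(a, a, c, c)] = 0.5

# Project qubits 1 and 2 onto |Phi+> = (|00> + |11>)/sqrt(2), one of the
# four possible Bell-measurement outcomes.
bell = {(0, 0): 1 / math.sqrt(2), (1, 1): 1 / math.sqrt(2)}
out = {}  # unnormalized post-measurement state of qubits 0 and 3
for (a, b, c, d), v in amp.items():
    if (b, c) in bell:
        out[(a, d)] = out.get((a, d), 0.0) + v * bell[(b, c)]

prob = sum(v * v for v in out.values())            # outcome probability = 1/4
state = {k: v / math.sqrt(prob) for k, v in out.items()}
# Qubits 0 and 3 now hold |Phi+>: amplitude 1/sqrt(2) on |00> and |11>.
```

    The other three Bell outcomes occur with probability 1/4 each and leave qubits 0 and 3 in one of the other Bell states, correctable by a local Pauli operation.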

  15. ANL site response for the DOE FY1994 information resources management long-range plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boxberger, L.M.

    1992-03-01

    Argonne National Laboratory's ANL Site Response for the DOE FY1994 Information Resources Management (IRM) Long-Range Plan (ANL/TM 500) is one of many contributions to the DOE information resources management long-range planning process and, as such, is an integral part of the DOE policy and program planning system. The Laboratory has constructed this response according to instructions in a Call issued in September 1991 by the DOE Office of IRM Policy, Plans and Oversight. As one of a continuing series, this Site Response is an update and extension of the Laboratory's previous submissions. The response contains both narrative and tabular material. It covers an eight-year period consisting of the base year (FY1991), the current year (FY1992), the budget year (FY1993), the plan year (FY1994), and the out years (FY1995-FY1998). This Site Response was compiled by Argonne National Laboratory's Computing and Telecommunications Division (CTD), which has the responsibility to provide leadership in optimizing computing and information services and disseminating computer-related technologies throughout the Laboratory. The Site Response consists of five parts: (1) a site overview, describing the ANL mission, overall organizational structure, the strategic approach to meeting information resource needs, the planning process, major issues, and points of contact; (2) a software plan for DOE contractors and an FMS plan for DOE organizations; (3) computing resources; (4) telecommunications; and (5) printing and publishing.

  16. ANL site response for the DOE FY1994 information resources management long-range plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boxberger, L.M.

    1992-03-01

    Argonne National Laboratory's ANL Site Response for the DOE FY1994 Information Resources Management (IRM) Long-Range Plan (ANL/TM 500) is one of many contributions to the DOE information resources management long-range planning process and, as such, is an integral part of the DOE policy and program planning system. The Laboratory has constructed this response according to instructions in a Call issued in September 1991 by the DOE Office of IRM Policy, Plans and Oversight. As one of a continuing series, this Site Response is an update and extension of the Laboratory's previous submissions. The response contains both narrative and tabular material. It covers an eight-year period consisting of the base year (FY1991), the current year (FY1992), the budget year (FY1993), the plan year (FY1994), and the out years (FY1995-FY1998). This Site Response was compiled by Argonne National Laboratory's Computing and Telecommunications Division (CTD), which has the responsibility to provide leadership in optimizing computing and information services and disseminating computer-related technologies throughout the Laboratory. The Site Response consists of five parts: (1) a site overview, describing the ANL mission, overall organizational structure, the strategic approach to meeting information resource needs, the planning process, major issues, and points of contact; (2) a software plan for DOE contractors and an FMS plan for DOE organizations; (3) computing resources; (4) telecommunications; and (5) printing and publishing.

  17. Diffusion of Innovations: Smartphones and Wireless Anatomy Learning Resources

    ERIC Educational Resources Information Center

    Trelease, Robert B.

    2008-01-01

    The author has previously reported on principles of diffusion of innovations, the processes by which new technologies become popularly adopted, specifically in relation to anatomy and education. In presentations on adopting handheld computers [personal digital assistants (PDAs)] and personal media players for health sciences education, particular…

  18. Cross-Country Adventures. Teaching with Technology.

    ERIC Educational Resources Information Center

    Allen, Denise

    1995-01-01

    Features reviews of four computer games for use with intermediate and upper grade students, three on geography (Travelrama USA, Crosscountry USA, My America) and one on history (Vital Links). Comments include strengths of each activity, related multimedia activities and resources, and links to literature. Also reviews "Educator's Internet…

  19. A Virtual Embedded Microcontroller Laboratory for Undergraduate Education: Development and Evaluation

    ERIC Educational Resources Information Center

    Richardson, Jeffrey J.; Adamo-Villani, Nicoletta

    2010-01-01

    Laboratory instruction is a major component of the engineering and technology undergraduate curricula. Traditional laboratory instruction is hampered by several factors including limited access to resources by students and high laboratory maintenance cost. A photorealistic 3D computer-simulated laboratory for undergraduate instruction in…

  20. OCLC in Asia Pacific.

    ERIC Educational Resources Information Center

    Chang, Min-min

    1998-01-01

    Discusses the Online Computer Library Center (OCLC) and the changing Asia Pacific library scene under the broad headings of the three phases of technology innovation. Highlights include WorldCat and the OCLC shared cataloging system; resource sharing and interlibrary loan; enriching OCLC online catalog with Asian collections; and future outlooks.…

  1. Human Resources and the Internet.

    ERIC Educational Resources Information Center

    Cohen, Suzanne; Joseph, Deborah

    Concerned about falling behind the technology curve, organizations are using the Internet or intranets to provide and communicate information to their employees and create more efficient workplaces. The Internet is not just a "network of computer networks," but a medium conveying a vast, diverse amount of information. This publication is…

  2. Evaluating the Psychometric Characteristics of Generated Multiple-Choice Test Items

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Pugh, Debra; Touchie, Claire; Boulais, André-Philippe; De Champlain, André

    2016-01-01

    Item development is a time- and resource-intensive process. Automatic item generation integrates cognitive modeling with computer technology to systematically generate test items. To date, however, items generated using cognitive modeling procedures have received limited use in operational testing situations. As a result, the psychometric…

  3. Manufacturing Processes: New Methods for the "Materials Age." Resources in Technology.

    ERIC Educational Resources Information Center

    Technology Teacher, 1990

    1990-01-01

    To make the best use of new materials developed for everything from computers to artificial hearts to more fuel-efficient cars, improved materials syntheses and manufacturing processes are needed. This instructional module includes teacher materials, a student quiz, and possible student outcomes. (JOW)

  4. UCMP and the Internet help hospital libraries share resources.

    PubMed

    Dempsey, R; Weinstein, L

    1999-07-01

    The Medical Library Center of New York (MLCNY), a medical library consortium founded in 1959, has specialized in supporting resource sharing and fostering technological advances. In 1961, MLCNY developed and continues to maintain the Union Catalog of Medical Periodicals (UCMP), a resource tool including detailed data about the collections of more than 720 medical library participants. UCMP was one of the first library tools to capitalize on the benefits of computer technology and, from the beginning, invited hospital libraries to play a substantial role in its development. UCMP, beginning with products in print and later in microfiche, helped to create a new resource sharing environment. Today, UCMP continues to capitalize on new technology by providing access via the Internet and an Oracle-based search system providing subscribers with the benefits of: a database that contains serial holdings information on an issue specific level, a database that can be updated in real time, a system that provides multi-type searching and allows users to define how the results will be sorted, and an ordering function that can more precisely target libraries that have a specific issue of a medical journal. Current development of a Web-based system will ensure that UCMP continues to provide cost effective and efficient resource sharing in future years.

  5. UCMP and the Internet help hospital libraries share resources.

    PubMed Central

    Dempsey, R; Weinstein, L

    1999-01-01

    The Medical Library Center of New York (MLCNY), a medical library consortium founded in 1959, has specialized in supporting resource sharing and fostering technological advances. In 1961, MLCNY developed and continues to maintain the Union Catalog of Medical Periodicals (UCMP), a resource tool including detailed data about the collections of more than 720 medical library participants. UCMP was one of the first library tools to capitalize on the benefits of computer technology and, from the beginning, invited hospital libraries to play a substantial role in its development. UCMP, beginning with products in print and later in microfiche, helped to create a new resource sharing environment. Today, UCMP continues to capitalize on new technology by providing access via the Internet and an Oracle-based search system providing subscribers with the benefits of: a database that contains serial holdings information on an issue specific level, a database that can be updated in real time, a system that provides multi-type searching and allows users to define how the results will be sorted, and an ordering function that can more precisely target libraries that have a specific issue of a medical journal. Current development of a Web-based system will ensure that UCMP continues to provide cost effective and efficient resource sharing in future years. PMID:10427426

  6. Batching System for Superior Service

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Veridian's Portable Batch System (PBS) was the recipient of the 1997 NASA Space Act Award for outstanding software. A batch system is a set of processes for managing queues and jobs. Without a batch system, it is difficult to manage the workload of a computer system. By bundling the enterprise's computing resources, the PBS technology offers users a single coherent interface, resulting in efficient management of the batch services. Users choose which information to package into "containers" for system-wide use. PBS also provides detailed system usage data, a procedure not easily executed without this software. PBS operates on networked, multi-platform UNIX environments. Veridian's new version, PBS Pro™, has additional features and enhancements, including support for additional operating systems. Veridian distributes the original version of PBS as Open Source software via the PBS website. Customers can register and download the software at no cost. PBS Pro is also available via the web and offers additional features such as increased stability, reliability, and fault tolerance. A company using PBS can expect a significant increase in the effective management of its computing resources. Tangible benefits include increased utilization of costly resources and enhanced understanding of computational requirements and user needs.
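    The record does not include a job script, but a typical PBS submission looks like the following sketch (the resource values and program name are illustrative only):

```shell
#!/bin/sh
#PBS -N demo_job              # job name shown in queue listings
#PBS -l nodes=4:ppn=8         # request 4 nodes, 8 processors per node
#PBS -l walltime=01:00:00     # wall-clock limit; the job is ended past this
#PBS -j oe                    # merge stdout and stderr into one output file

cd "$PBS_O_WORKDIR"           # PBS starts jobs in $HOME by default
./my_simulation               # hypothetical user program
```

    The script is handed to the batch system with `qsub script.sh`; PBS holds the job in a queue until the requested resources become free, which is how it balances the workload the abstract describes.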

  7. Aerospace technology can be applied to exploration 'back on earth'. [offshore petroleum resources

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1977-01-01

    Applications of aerospace technology to petroleum exploration are described. Attention is given to seismic reflection techniques, sea-floor mapping, remote geochemical sensing, improved drilling methods, and down-hole acoustic concepts, such as down-hole seismic tomography. The seismic reflection techniques include monitoring of swept-frequency explosive or solid-propellant seismic sources, as well as aerial seismic surveys. Telemetry and processing of seismic data may also be performed through use of aerospace technology. Sea-floor sonar imaging and a computer-aided system of geologic analogies for petroleum exploration are also considered.

  8. Earth System Grid II (ESG): Turning Climate Model Datasets Into Community Resources

    NASA Astrophysics Data System (ADS)

    Williams, D.; Middleton, D.; Foster, I.; Nevedova, V.; Kesselman, C.; Chervenak, A.; Bharathi, S.; Drach, B.; Cinquni, L.; Brown, D.; Strand, G.; Fox, P.; Garcia, J.; Bernholdte, D.; Chanchio, K.; Pouchard, L.; Chen, M.; Shoshani, A.; Sim, A.

    2003-12-01

    High-resolution, long-duration simulations performed with advanced DOE SciDAC/NCAR climate models will produce tens of petabytes of output. To be useful, this output must be made available to global change impacts researchers nationwide, both at national laboratories and at universities, other research laboratories, and other institutions. To this end, we propose to create a new Earth System Grid, ESG-II - a virtual collaborative environment that links distributed centers, users, models, and data. ESG-II will provide scientists with virtual proximity to the distributed data and resources that they require to perform their research. The creation of this environment will significantly increase the scientific productivity of U.S. climate researchers by turning climate datasets into community resources. In creating ESG-II, we will integrate and extend a range of Grid and collaboratory technologies, including the DODS remote access protocols for environmental data, Globus Toolkit technologies for authentication, resource discovery, and resource access, and Data Grid technologies developed in other projects. We will develop new technologies for (1) creating and operating "filtering servers" capable of performing sophisticated analyses, and (2) delivering results to users. In so doing, we will simultaneously contribute to climate science and advance the state of the art in collaboratory technology. We expect our results to be useful to numerous other DOE projects. The three-year R&D program will be undertaken by a talented and experienced team of computer scientists at five laboratories (ANL, LBNL, LLNL, NCAR, ORNL) and one university (ISI), working in close collaboration with climate scientists at several sites.

  9. Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop

    NASA Astrophysics Data System (ADS)

    Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.

    2018-04-01

    The data center is a new concept of data processing and application proposed in recent years. It is a new method of processing technologies based on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes cluster resource computing nodes and improves the efficiency of data parallel application. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it calls many computing nodes to process image storage blocks and pyramids in the background, improving the efficiency of image reading and application and meeting the need for concurrent multi-user high-speed access to remotely sensed data. The rationality, reliability, and superiority of the system design were verified by building an actual Hadoop service system, testing the storage efficiency of different image data under multi-user access, and analyzing how the distributed storage architecture improves the application efficiency of remote sensing images.
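    The record gives no implementation detail for the pyramids it mentions, but the underlying idea is simple: each pyramid level halves the previous level's resolution so coarse zoom levels can be read cheaply. A hedged sketch in pure Python (square power-of-two images, 2x2 averaging; the function name is ours, and a real system would run this per tile inside MapReduce tasks):

```python
def build_pyramid(image):
    """Return successively downsampled levels of a square 2^k x 2^k image,
    averaging each 2x2 block, until a single pixel remains."""
    levels = [image]
    while len(image) > 1:
        half = len(image) // 2
        image = [[(image[2*r][2*c] + image[2*r][2*c+1] +
                   image[2*r+1][2*c] + image[2*r+1][2*c+1]) / 4.0
                  for c in range(half)]
                 for r in range(half)]
        levels.append(image)
    return levels

pyramid = build_pyramid([[1.0] * 4 for _ in range(4)])
print(len(pyramid))  # 3 levels: 4x4, 2x2, 1x1
```

    Because each level depends only on local 2x2 blocks of the level below, the work partitions naturally across image storage blocks, which is what makes it a good fit for background MapReduce processing.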

  10. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, currently Earth scientists, educators, and students face two major barriers that prevent them from effectively using computational approaches in their learning, research, and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling the online geospatial process modeling and execution, and 7) building a user-friendly extensible web portal for users to access the cyber-laboratory resources.
Users can interactively discover the needed data and perform on-demand data analysis and modeling through the web portal. The GeoBrain cyber-laboratory provides solutions to meet common needs of ES research and education, such as, distributed data access and analysis services, easy access to and use of ES data, and enhanced geoprocessing and geospatial modeling capability. It greatly facilitates ES research, education, and applications. The development of the cyber-laboratory provides insights, lessons-learned, and technology readiness to build more capable computing infrastructure for ES studies, which can meet wide-range needs of current and future generations of scientists, researchers, educators, and students for their formal or informal educational training, research projects, career development, and lifelong learning.
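    The "dynamically chainable" services the abstract describes can be pictured as simple function composition: each service consumes the previous service's output, and the chain is assembled on demand. A toy sketch (the `chain` helper and the example services are invented for illustration):

```python
def chain(*services):
    """Compose processing services left-to-right into one callable."""
    def run(data):
        for service in services:
            data = service(data)
        return data
    return run

# Hypothetical analytic steps chained at user request.
drop_negatives = lambda vals: [v for v in vals if v >= 0]   # quality filter
double = lambda vals: [v * 2 for v in vals]                 # unit conversion

pipeline = chain(drop_negatives, double)
print(pipeline([3, -1, 5]))  # [6, 10]
```

    In a real Web-service setting each step would be a remote call rather than a local function, but the composition idea is the same: new pipelines are built from existing services without writing new analysis code.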

  11. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. 
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3) Coupling large-scale computing and data systems to scientific and engineering instruments (e.g., real-time interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment, instead of just with instrument control); (4) Highly interactive, augmented reality and virtual reality remote collaborations (e.g., an Ames / Boeing Remote Help Desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert, who uses this data to index into detailed design databases and returns 3D internal aircraft geometry to the field); (5) Single computational problems too large for any single system (e.g., the rotorcraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid-response situations (such as disaster response) because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g.
determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provides a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer term R&D (three to six years). Additional information is contained in the original.

  12. RESTful M2M Gateway for Remote Wireless Monitoring for District Central Heating Networks

    PubMed Central

    Cheng, Bo; Wei, Zesan

    2014-01-01

    In recent years, increased interest in energy conservation and environmental protection, combined with the development of modern communication and computer technology, has resulted in the replacement of distributed heating by central heating in urban areas. This paper proposes a Representational State Transfer (REST) Machine-to-Machine (M2M) gateway for wireless remote monitoring of a district central heating network. In particular, we focus on the resource-oriented RESTful M2M gateway architecture; present a uniform device abstraction approach based on Open Service Gateway Initiative (OSGi) technology; implement the resource address mapping mechanism between RESTful resources and the physical sensor devices; present a buffer queue combined with a polling method to implement data scheduling and Quality of Service (QoS) guarantees; and give the RESTful M2M gateway's open service Application Programming Interface (API) set. The performance has been measured and analyzed. Finally, conclusions and future work are presented. PMID:25436650
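    The data-scheduling idea in this abstract, per-device buffer queues drained by a polling pass, can be sketched in Python; the class and device names below are hypothetical, not the gateway's actual API:

```python
from collections import deque

class GatewayBuffer:
    """Per-device buffer queues drained by a round-robin polling cycle."""
    def __init__(self):
        self.queues = {}

    def enqueue(self, device_id, reading):
        """Buffer a sensor reading as it arrives from the wireless network."""
        self.queues.setdefault(device_id, deque()).append(reading)

    def poll_cycle(self):
        """One polling pass: take at most one reading from each device queue,
        so no single chatty device can starve the others (a basic QoS measure)."""
        batch = []
        for device_id, q in self.queues.items():
            if q:
                batch.append((device_id, q.popleft()))
        return batch

buf = GatewayBuffer()
buf.enqueue("temp-01", 21.5)
buf.enqueue("temp-01", 21.7)
buf.enqueue("flow-02", 3.2)
print(buf.poll_cycle())  # [('temp-01', 21.5), ('flow-02', 3.2)]
```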

  13. RESTful M2M gateway for remote wireless monitoring for district central heating networks.

    PubMed

    Cheng, Bo; Wei, Zesan

    2014-11-27

    In recent years, increased interest in energy conservation and environmental protection, combined with the development of modern communication and computer technology, has resulted in the replacement of distributed heating by central heating in urban areas. This paper proposes a Representational State Transfer (REST) Machine-to-Machine (M2M) gateway for wireless remote monitoring of a district central heating network. In particular, we focus on the resource-oriented RESTful M2M gateway architecture; present a uniform device abstraction approach based on Open Service Gateway Initiative (OSGi) technology; implement the resource address mapping mechanism between RESTful resources and the physical sensor devices; present a buffer queue combined with a polling method to implement data scheduling and Quality of Service (QoS) guarantees; and give the RESTful M2M gateway's open service Application Programming Interface (API) set. The performance has been measured and analyzed. Finally, conclusions and future work are presented.

  14. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

    Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data from web clients has become an urgent need. In this paper, we propose a new scalable, interactive, web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform that provides end-users with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is built on an open-source distributed file system; massive remote sensing data are stored as public data, while intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users write scripts in an IPython Notebook web page through the browser, and the scripts are submitted to the IPython kernel for execution. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines, we conclude that the cloud computing environment built with Docker makes the greatest use of the host system's resources and can handle more concurrent spatiotemporal computing tasks. Docker provides resource isolation for I/O, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook. Users can write complex data processing code on the web directly, so they can design their own data processing algorithms.
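    The claim that the container environment handles more concurrent spatiotemporal tasks rests on dispatching many independent per-tile computations at once. A rough Python sketch of such dispatch (purely illustrative; in the real system each task would run inside its own container, and the per-tile function would use GDAL rather than plain lists):

```python
from concurrent.futures import ThreadPoolExecutor

def run_analysis(tile):
    """Stand-in for a per-tile remote sensing computation (here, a mean)."""
    return tile["id"], sum(tile["pixels"]) / len(tile["pixels"])

# Hypothetical tiles; tile i holds pixel values [i, i+1, i+2].
tiles = [{"id": i, "pixels": [i, i + 1, i + 2]} for i in range(4)]

# Each submitted task could map to one container in the real system.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(run_analysis, tiles))

print(results)  # {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
```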

  15. Distributed Hydrologic Modeling Apps for Decision Support in the Cloud

    NASA Astrophysics Data System (ADS)

    Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.

    2013-12-01

    Advances in computational resources and the greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models can use these data resources and provide better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools, or 'apps', for decision support, and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while more intuitive, are data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent to this class of models in a web environment. Supporting geospatial data in a website is beyond the capabilities of standard web frameworks and requires additional software. In particular, at least three elements are needed: a geospatially enabled database, a map server, and a geoprocessing toolbox. We recommend a software stack for geospatial web application development comprising MapServer, PostGIS, and 52 North, with Python as the scripting language to tie them together. Another hurdle that must be cleared is managing the cloud-computing load; we are using HTCondor as a solution to this end. Finally, we are creating a scripting environment wherein developers will be able to create apps that use existing hydrologic models in our system with minimal effort. This capability will be accomplished by creating a plugin for CKAN, a Python content management system. We are currently developing cyberinfrastructure that utilizes this stack and greatly lowers the investment required to deploy cloud-based modeling apps. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.
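    Of the three elements named in this abstract, the geoprocessing toolbox is easiest to illustrate: a primitive such as point-in-polygon (e.g., testing whether a gauge lies inside a watershed boundary) fits in a few lines of Python. This is an illustrative sketch of the kind of operation such a toolbox provides, not code from the toolkit described:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: cast a ray from (x, y) toward +x and count
    edge crossings; an odd count means the point is inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical rectangular watershed boundary (vertex coordinates).
watershed = [(0, 0), (4, 0), (4, 3), (0, 3)]
print(point_in_polygon(2, 1, watershed))   # True
print(point_in_polygon(5, 1, watershed))   # False
```

    In the recommended stack, such operations would normally be pushed down to PostGIS (e.g., `ST_Contains`) rather than run in application code.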

  16. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. 
The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.
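    One classical inter-distribution relation of the kind the Distributome catalogs can be checked empirically: the sum of k independent Exponential(λ) variables is Gamma(k, λ)-distributed, so its mean should be k/λ and its variance k/λ². A quick Monte Carlo check using only the Python standard library (illustrative only, not Distributome code):

```python
import random
import statistics

random.seed(42)
k, lam = 3, 2.0

# Each sample is the sum of k iid Exponential(lam) draws,
# which should follow a Gamma(k, lam) distribution.
samples = [sum(random.expovariate(lam) for _ in range(k))
           for _ in range(20000)]

# Gamma(k, lam) has mean k/lam = 1.5 and variance k/lam^2 = 0.75.
print(round(statistics.mean(samples), 2))      # ~ 1.5
print(round(statistics.variance(samples), 2))  # ~ 0.75
```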

  17. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. 
The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  18. Perspectives on the Future of CFD

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2000-01-01

    This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), a field that has pioneered flow simulation. CFD has progressed along with computing power: numerical methods have advanced as CPU and memory capacity have increased. Complex configurations are now routinely computed, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of heuristic models, and the development of CFD and information technology (IT) tools.

  19. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  20. United States Air Force Computer-Aided Acquisition and Logistics Support (CALS) Evolution of Computer Integrated Manufacturing (CIM) Technologies

    DTIC Science & Technology

    1988-11-01

    Manufacturing System 22 4. Similar Parts Based Shape or Manufacturing Process 24 5. Projected Annual Unit Robot Sales and Installed Base Through 1992 30 6. U.S...effort needed to perform personnel, product design, marketing, advertising, and finance tasks of the firm. Level III controls the resource...planning and accounting functions of the firm. Systems at this level support purchasing, accounts payable, accounts receivable, master scheduling and sales

  1. Science Information System in Japan. NIER Occasional Paper 02/83.

    ERIC Educational Resources Information Center

    Matsumura, Tamiko

    This paper describes the development of a proposed Japanese Science Information System (SIS), a nationwide network of research and academic libraries, large-scale computer centers, national research institutes, and other organizations, to be formed for the purpose of sharing information and resources in the natural sciences, technology, the…

  2. User-Centered Authentication: LDAP, WRAP, X.509, XML (SIG LAN: Library Automation and Networks).

    ERIC Educational Resources Information Center

    Coble, Jim

    2000-01-01

    Presents an abstract for a planned panel session on technologies for user-centered authentication and authorization currently deployed in pilot or production implementations in academic computing. Presentations included: "Implementing LDAP for Single-Password Access to Campus Resources" (Layne Nordgren); "Implementing a Scalable…

  3. Sun Valley Elementary School Reading and Writing Assessment Project: Final Report.

    ERIC Educational Resources Information Center

    Zakaluk, Beverley L.

    A study investigated the effectiveness of integrating computer technology (multimedia learning resources in a "virtual" classroom) with content area and reading and writing curriculum. All students in grades 2 through 5 at Sun Valley Elementary School, Canada, had their reading and writing assessed. In addition, the writing performance…

  4. The Essen Learning Model--A Step towards a Representation of Learning Objectives.

    ERIC Educational Resources Information Center

    Bick, Markus; Pawlowski, Jan M.; Veith, Patrick

    The importance of the Extensible Markup Language (XML) technology family in the field of Computer Assisted Learning (CAL) can not be denied. The Instructional Management Systems Project (IMS), for example, provides a learning resource XML binding specification. Considering this specification and other implementations using XML to represent…

  5. What Students Really Want in Science Class

    ERIC Educational Resources Information Center

    Goldenberg, Lauren B.

    2011-01-01

    Nowadays, there are lots of digital resources available to teachers. Tools such as Teachers' Domain, an online digital library (see "On the web"); interactive whiteboards; computer projection devices; laptop carts; and robust wireless internet services make it easy for teachers to use technology in the classroom. In fact, in one…

  6. Intellectual Honesty in the Era of Computing.

    ERIC Educational Resources Information Center

    Connolly, Frank W.

    1995-01-01

    Discusses the need for intellectual honesty in using technology. Topics include intellectual property laws; ethics; indirect results of copying software and images; the need for institutional policy; and the provision of facilities and resources that encourage respect for policy. A sidebar provides "A Bill of Rights and Responsibilities for…

  7. Electronic Resources and the Education of History Professionals

    ERIC Educational Resources Information Center

    Mulligan, William H., Jr.

    2001-01-01

    The transforming effects of the tremendous advances in technology that have reshaped the economy and many other elements of American society have had an equally profound impact on historical agencies. The personal computer, the Internet, and associated electronic communications developments have already transformed the museum and historical agency…

  8. ORE's GENeric Evaluation SYStem: GENESYS 1988-89.

    ERIC Educational Resources Information Center

    Baenen, Nancy; And Others

    GENESYS--GENeric Evaluation SYStem--is a method of streamlining data collection and evaluation through the use of computer technology. GENESYS has allowed the Office of Research and Evaluation (ORE) of the Austin (Texas) Independent School District to evaluate a multitude of contrasting programs with limited resources. By standardizing methods and…

  9. Assessing Teaching Skills with a Mobile Simulation

    ERIC Educational Resources Information Center

    Gibson, David

    2013-01-01

    Because mobile technologies are overtaking personal computers as the primary tools of Internet access, and cloud-based resources are fundamentally transforming the world's knowledge, new forms of teaching and assessment are required to foster 21st century literacies, including those needed by K-12 teachers. A key feature of mobile technology…

  10. Chief Information Officers: New and Continuing Issues.

    ERIC Educational Resources Information Center

    Edutech Report, 1988

    1988-01-01

    Examines the functions of chief information officers on college campuses, and describes three major categories that the functions fall into, depending on the nature of computing within the institution; i.e., information technology as (1) a strategic resource, (2) an aid to operations, and (3) a source of confusion. (CLB)

  11. An Interactive Diagnosis Approach for Supporting Clinical Nursing Courses

    ERIC Educational Resources Information Center

    Wei, Chun-Wang; Lin, Yi-Chun; Lin, Yen-Ting

    2016-01-01

    Clinical resources in nursing schools are always insufficient for satisfying the practice requirements of each student at the same time during a formal course session. Although several studies have applied information and communication technology to develop computer-based learning tools for addressing this problem, most of these developments lack…

  12. A Suggested Model for a Working Cyberschool.

    ERIC Educational Resources Information Center

    Javid, Mahnaz A.

    2000-01-01

    Suggests a model for a working cyberschool based on a case study of Kamiak Cyberschool (Washington), a technology-driven public high school. Topics include flexible hours; one-to-one interaction with teachers; a supportive school environment; use of computers, interactive media, and online resources; and self-paced, project-based learning.…

  13. Networking Biology: The Origins of Sequence-Sharing Practices in Genomics.

    PubMed

    Stevens, Hallam

    2015-10-01

    The wide sharing of biological data, especially nucleotide sequences, is now considered to be a key feature of genomics. Historians and sociologists have attempted to account for the rise of this sharing by pointing to precedents in model organism communities and in natural history. This article supplements these approaches by examining the role that electronic networking technologies played in generating the specific forms of sharing that emerged in genomics. The links between early computer users at the Stanford Artificial Intelligence Laboratory in the 1960s, biologists using local computer networks in the 1970s, and GenBank in the 1980s, show how networking technologies carried particular practices of communication, circulation, and data distribution from computing into biology. In particular, networking practices helped to transform sequences themselves into objects that had value as a community resource.

  14. Dynamic Transportation Navigation

    NASA Astrophysics Data System (ADS)

    Meng, Xiaofeng; Chen, Jidong

    Miniaturization of computing devices, and advances in wireless communication and sensor technology are some of the forces that are propagating computing from the stationary desktop to the mobile outdoors. Some important classes of new applications that will be enabled by this revolutionary development include intelligent traffic management, location-based services, tourist services, mobile electronic commerce, and digital battlefield. Some existing application classes that will benefit from the development include transportation and air traffic control, weather forecasting, emergency response, mobile resource management, and mobile workforce. Location management, i.e., the management of transient location information, is an enabling technology for all these applications. In this chapter, we present the applications of moving objects management and their functionalities, in particular, the application of dynamic traffic navigation, which is a challenge due to the highly variable traffic state and the requirement of fast, on-line computations.
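    Location management, as described in this abstract, amounts to maintaining transient positions of moving objects and answering queries against the current state. A minimal in-memory sketch in Python (all names hypothetical; a real system would use spatial indexing rather than a linear scan):

```python
import math

class LocationManager:
    """Tracks current positions of moving objects and answers nearest queries."""
    def __init__(self):
        self.positions = {}

    def update(self, obj_id, x, y):
        """Record a new position report; the previous one becomes stale."""
        self.positions[obj_id] = (x, y)

    def nearest(self, x, y):
        """Return the id of the object currently closest to (x, y)."""
        return min(self.positions,
                   key=lambda o: math.dist((x, y), self.positions[o]))

lm = LocationManager()
lm.update("taxi-1", 0.0, 0.0)
lm.update("taxi-2", 5.0, 5.0)
lm.update("taxi-1", 4.0, 4.0)   # taxi-1 has moved; old position is overwritten
print(lm.nearest(4.2, 4.1))     # taxi-1
```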

  15. Enabling campus grids with open science grid technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weitzel, Derek; Bockelman, Brian; Swanson, David

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.

  16. Community-driven computational biology with Debian Linux.

    PubMed

    Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles

    2010-12-21

    The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.

  17. Streaming support for data intensive cloud-based sequence analysis.

    PubMed

    Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
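    The streaming scheme in this abstract relies on the reads being processable independently, so each chunk can be analyzed as it arrives rather than after the full transfer completes. A toy Python sketch of that overlap using a generator pipeline (illustrative; not the elastream API):

```python
def incoming_chunks(stream, size=4):
    """Simulate network transfer: yield the sequence in fixed-size chunks."""
    for i in range(0, len(stream), size):
        yield stream[i:i + size]

def gc_count(chunk):
    """Per-chunk work that is independent of all other chunks."""
    return sum(base in "GC" for base in chunk)

# Processing overlaps with transfer: each chunk is consumed on arrival,
# so analysis need not wait for the full upload to finish.
total = sum(gc_count(chunk) for chunk in incoming_chunks("ATGCGGCCATTA"))
print(total)  # 6
```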

  18. Models and Simulations as a Service: Exploring the Use of Galaxy for Delivering Computational Models

    PubMed Central

    Walker, Mark A.; Madduri, Ravi; Rodriguez, Alex; Greenstein, Joseph L.; Winslow, Raimond L.

    2016-01-01

    We describe the ways in which Galaxy, a web-based reproducible research platform, can be used for web-based sharing of complex computational models. Galaxy allows users to seamlessly customize and run simulations on cloud computing resources, a concept we refer to as Models and Simulations as a Service (MaSS). To illustrate this application of Galaxy, we have developed a tool suite for simulating a high spatial-resolution model of the cardiac Ca2+ spark that requires supercomputing resources for execution. We also present tools for simulating models encoded in the SBML and CellML model description languages, thus demonstrating how Galaxy’s reproducible research features can be leveraged by existing technologies. Finally, we demonstrate how the Galaxy workflow editor can be used to compose integrative models from constituent submodules. This work represents an important novel approach, to our knowledge, to making computational simulations more accessible to the broader scientific community. PMID:26958881

  19. Incorporating electronic media into medical student education: a survey of AMSER members on computer and web use in radiology courses. Alliance of Medical Student Educators in Radiology.

    PubMed

    Durfee, Sara M; Jain, Sidney; Shaffer, Kitt

    2003-02-01

    The purpose of this study was to define the current use of information technology in radiology tutorials for medical students. The authors conducted a Web-based survey of directors of medical school courses in radiology. The survey dealt with the details of the courses and the use of computers and the Web during the courses. There were 48 responses. Most radiology courses were elective (73%) and were offered monthly. Most institutions (79%) had picture archiving and communication systems (PACS) available or were completely filmless. The teaching case presentations, however, often included film images displayed on a view box or by an overhead projector. Computers dedicated to student use were uncommon (28%). The Web was used infrequently as a teaching resource, and a Web site was not available in most courses. Computer technical support was variable and usually provided by the course director. Course directors at institutions with PACS were more likely to use digital technology for case presentations and more likely to use the Web for teaching purposes. Despite the widespread use of digital technology and PACS in the field of radiology, digital technology is underused in radiology courses. However, departments with PACS tend to use digital technology more frequently in education than do departments without PACS.

  20. Efficient operating system level virtualization techniques for cloud resources

    NASA Astrophysics Data System (ADS)

    Ansu, R.; Samiksha; Anju, S.; Singh, K. John

    2017-11-01

    Cloud computing is an advancing technology which provides infrastructure, platform, and software services. Virtualization and utility computing are the keys to cloud computing. The number of cloud users is increasing day by day, so resources must be made available on demand to satisfy user requirements. Virtualization is the technique by which resources, namely storage, processing power, memory, and network or I/O, are abstracted. Various virtualization techniques are available for executing operating systems: full system virtualization and paravirtualization. In full virtualization, the whole hardware architecture is duplicated virtually; no modifications to the guest OS are required, as the OS deals with the VM hypervisor directly. In paravirtualization, the guest OS must be modified to run in parallel with other operating systems, and for the guest OS to access the hardware, the host OS must provide a virtual machine interface. OS virtualization has many advantages, such as transparent application migration, server consolidation, online OS maintenance, and security. This paper briefs both virtualization techniques and discusses the issues in OS-level virtualization.
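    The control-flow difference between the two techniques, trap-and-emulate for an unmodified guest versus explicit hypercalls from a modified guest, can be caricatured in a few lines of Python (purely conceptual; all class and method names are hypothetical):

```python
class Hypervisor:
    def emulate_privileged(self, op):
        # Full virtualization: a privileged instruction faults (traps),
        # and the hypervisor emulates it behind the guest's back.
        return f"trap-and-emulate:{op}"

    def hypercall(self, op):
        # Paravirtualization: the guest calls the hypervisor interface directly.
        return f"hypercall:{op}"

class FullyVirtualizedGuest:
    """Unmodified guest: privileged ops are emulated transparently."""
    def __init__(self, hv):
        self.hv = hv
    def write_page_table(self):
        return self.hv.emulate_privileged("write_page_table")

class ParavirtualizedGuest:
    """Modified guest: privileged ops are replaced with explicit hypercalls."""
    def __init__(self, hv):
        self.hv = hv
    def write_page_table(self):
        return self.hv.hypercall("write_page_table")

hv = Hypervisor()
print(FullyVirtualizedGuest(hv).write_page_table())  # trap-and-emulate:write_page_table
print(ParavirtualizedGuest(hv).write_page_table())   # hypercall:write_page_table
```

    The paravirtualized path avoids the cost of the trap at the price of modifying the guest, which is exactly the trade-off the abstract describes.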
