Sample records for laboratory computing resource

  1. Laboratory Computing Resource Center

    Science.gov Websites

  2. Shared-resource computing for small research labs.

    PubMed

    Ackerman, M J

    1982-04-01

    A real-time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off-the-shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11M multi-user real-time operating system. The cost effectiveness of the shared-resource network and of multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.
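
    The priority-scheduling idea described above can be illustrated with a small simulation. This is a toy model only, not the 1982 DECnet/RSX-11M system: the division names, priorities, and run times below are invented.

      import heapq

      # Toy model of the shared central minicomputer: jobs from four division
      # labs are queued and run in priority order (lower number = higher priority).
      jobs = [
          (2, 0, "hematology", 60),   # (priority, arrival order, division, cpu seconds)
          (1, 1, "cardiology", 30),
          (3, 2, "biochemistry", 120),
          (1, 3, "physiology", 45),
      ]

      queue = []
      for job in jobs:
          heapq.heappush(queue, job)

      clock = 0
      while queue:
          priority, order, division, cpu = heapq.heappop(queue)
          clock += cpu
          print(f"p{priority} {division:12s} finished at t = {clock:3d} s")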

  3. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    NASA Astrophysics Data System (ADS)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. Currently, however, Earth scientists, educators, and students face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities. The two barriers are: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove the two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to and easily usable by the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources. Users can interactively discover the needed data and perform on-demand data analysis and
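
    A minimal sketch of the "dynamically chainable services" idea: stub functions stand in for the remote catalogue, data-access, and analysis services. All names and data here are invented; in the real cyber-laboratory each step would be a networked geospatial Web service call.

      # Stub services; the point is the chaining: each service consumes the
      # previous service's output, so pipelines can be composed dynamically.

      def discover(query):                      # catalogue search -> granule ids
          return [f"{query}-granule-{i}" for i in range(2)]

      def retrieve(granule):                    # data access -> raster values
          return [0.2, 0.4, 0.6]

      def mean_service(raster):                 # analytic function -> statistic
          return sum(raster) / len(raster)

      def run_chain(query, chain):
          results = {}
          for granule in discover(query):
              value = granule
              for service in chain:             # dynamically composed pipeline
                  value = service(value)
              results[granule] = value
          return results

      print(run_chain("ndvi", [retrieve, mean_service]))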

  4. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the national HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high-performance computing platforms on which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  5. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  6. Strengthening laboratory systems in resource-limited settings.

    PubMed

    Olmsted, Stuart S; Moore, Melinda; Meili, Robin C; Duber, Herbert C; Wasserman, Jeffrey; Sama, Preethi; Mundell, Ben; Hilborne, Lee H

    2010-09-01

    Considerable resources have been invested in recent years to improve laboratory systems in resource-limited settings. We reviewed published reports, interviewed major donor organizations, and conducted case studies of laboratory systems in 3 countries to assess how countries and donors have worked together to improve laboratory services. While infrastructure and the provision of services have seen improvement, important opportunities remain for further advancement. Implementation of national laboratory plans is inconsistent, human resources are limited, and quality laboratory services rarely extend to lower tier laboratories (e.g., health clinics, district hospitals). Coordination within, between, and among governments and donor organizations is also frequently problematic. Laboratory standardization and quality control are improving but remain challenging, making accreditation a difficult goal. Host country governments and their external funding partners should coordinate their efforts effectively around a host country's own national laboratory plan to advance sustainable capacity development throughout a country's laboratory system.

  7. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratory facilities into a distributed high-performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  8. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the abilities to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In other classes, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  9. Idaho National Laboratory Cultural Resource Management Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowrey, Diana Lee

    2009-02-01

    As a federal agency, the U.S. Department of Energy has been directed by Congress, the U.S. president, and the American public to provide leadership in the preservation of prehistoric, historic, and other cultural resources on the lands it administers. This mandate to preserve cultural resources in a spirit of stewardship for the future is outlined in various federal preservation laws, regulations, and guidelines such as the National Historic Preservation Act, the Archaeological Resources Protection Act, and the National Environmental Policy Act. The purpose of this Cultural Resource Management Plan is to describe how the Department of Energy, Idaho Operations Office will meet these responsibilities at the Idaho National Laboratory. This Laboratory, which is located in southeastern Idaho, is home to a wide variety of important cultural resources representing at least 13,500 years of human occupation in the southeastern Idaho area. These resources are nonrenewable; bear valuable physical and intangible legacies; and yield important information about the past, present, and perhaps the future. There are special challenges associated with balancing the preservation of these sites with the management and ongoing operation of an active scientific laboratory. The Department of Energy, Idaho Operations Office is committed to a cultural resource management program that accepts these challenges in a manner reflecting both the spirit and intent of the legislative mandates. This document is designed for multiple uses and is intended to be flexible and responsive to future changes in law or mission. Document flexibility and responsiveness will be assured through annual reviews and as-needed updates. Document content includes summaries of Laboratory cultural resource philosophy and overall Department of Energy policy; brief contextual overviews of Laboratory missions, environment, and cultural history; and an overview of cultural resource management practices. A series of

  10. Idaho National Laboratory Cultural Resource Management Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Julie Braun Williams

    As a federal agency, the U.S. Department of Energy has been directed by Congress, the U.S. president, and the American public to provide leadership in the preservation of prehistoric, historic, and other cultural resources on the lands it administers. This mandate to preserve cultural resources in a spirit of stewardship for the future is outlined in various federal preservation laws, regulations, and guidelines such as the National Historic Preservation Act, the Archaeological Resources Protection Act, and the National Environmental Policy Act. The purpose of this Cultural Resource Management Plan is to describe how the Department of Energy, Idaho Operations Office will meet these responsibilities at Idaho National Laboratory in southeastern Idaho. The Idaho National Laboratory is home to a wide variety of important cultural resources representing at least 13,500 years of human occupation in the southeastern Idaho area. These resources are nonrenewable, bear valuable physical and intangible legacies, and yield important information about the past, present, and perhaps the future. There are special challenges associated with balancing the preservation of these sites with the management and ongoing operation of an active scientific laboratory. The Department of Energy, Idaho Operations Office is committed to a cultural resource management program that accepts these challenges in a manner reflecting both the spirit and intent of the legislative mandates. This document is designed for multiple uses and is intended to be flexible and responsive to future changes in law or mission. Document flexibility and responsiveness will be assured through regular reviews and as-needed updates. Document content includes summaries of Laboratory cultural resource philosophy and overall Department of Energy policy; brief contextual overviews of Laboratory missions, environment, and cultural history; and an overview of cultural resource management practices. A series of

  11. Incorporating computational resources in a cancer research program

    PubMed Central

    Woods, Nicholas T.; Jhuraney, Ankita; Monteiro, Alvaro N.A.

    2015-01-01

    Recent technological advances have transformed cancer genetics research. These advances have served as the basis for the generation of a number of richly annotated datasets relevant to the cancer geneticist. In addition, many of these technologies are now within reach of smaller laboratories to answer specific biological questions. Thus, one of the most pressing issues facing an experimental cancer biology research program in genetics is incorporating data from multiple sources to annotate, visualize, and analyze the system under study. Fortunately, there are several computational resources to aid in this process. However, a significant effort is required to adapt a molecular biology-based research program to take advantage of these datasets. Here, we discuss the lessons learned in our laboratory and share several recommendations to make this transition effectively. This article is not meant to be a comprehensive evaluation of all the available resources, but rather highlight those that we have incorporated into our laboratory and how to choose the most appropriate ones for your research program. PMID:25324189

  12. National Laboratory Planning: Developing Sustainable Biocontainment Laboratories in Limited Resource Areas.

    PubMed

    Yeh, Kenneth B; Adams, Martin; Stamper, Paul D; Dasgupta, Debanjana; Hewson, Roger; Buck, Charles D; Richards, Allen L; Hay, John

    2016-01-01

    Strategic laboratory planning in limited resource areas is essential for addressing global health security issues. Establishing a national reference laboratory, especially one with BSL-3 or -4 biocontainment facilities, requires a heavy investment of resources, a multisectoral approach, and commitments from multiple stakeholders. We make the case for donor organizations and recipient partners to develop a comprehensive laboratory operations roadmap that addresses factors such as mission and roles, engaging national and political support, securing financial support, defining stakeholder involvement, fostering partnerships, and building trust. Successful development occurred with projects in African countries and in Azerbaijan, where strong leadership and a clear management framework have been key to success. A clearly identified and agreed management framework facilitates identifying the responsibility for developing laboratory capabilities and support services, including biosafety and biosecurity, quality assurance, equipment maintenance, supply chain establishment, staff certification and training, retention of human resources, and sustainable operating revenue. These capabilities and support services pose rate-limiting yet necessary challenges. Laboratory capabilities depend on mission and role, as determined by all stakeholders, and demonstrate the need for relevant metrics to monitor the success of the laboratory, including support for internal and external audits. Our analysis concludes that alternative frameworks for success exist for developing and implementing capabilities at regional and national levels in limited resource areas. Thus, achieving a balance for standardizing practices between local procedures and accepted international standards is a prerequisite for integrating new facilities into a country's existing public health infrastructure and into the overall international scientific community.

  13. National Laboratory Planning: Developing Sustainable Biocontainment Laboratories in Limited Resource Areas

    PubMed Central

    Adams, Martin; Stamper, Paul D.; Dasgupta, Debanjana; Hewson, Roger; Buck, Charles D.; Richards, Allen L.; Hay, John

    2016-01-01

    Strategic laboratory planning in limited resource areas is essential for addressing global health security issues. Establishing a national reference laboratory, especially one with BSL-3 or -4 biocontainment facilities, requires a heavy investment of resources, a multisectoral approach, and commitments from multiple stakeholders. We make the case for donor organizations and recipient partners to develop a comprehensive laboratory operations roadmap that addresses factors such as mission and roles, engaging national and political support, securing financial support, defining stakeholder involvement, fostering partnerships, and building trust. Successful development occurred with projects in African countries and in Azerbaijan, where strong leadership and a clear management framework have been key to success. A clearly identified and agreed management framework facilitates identifying the responsibility for developing laboratory capabilities and support services, including biosafety and biosecurity, quality assurance, equipment maintenance, supply chain establishment, staff certification and training, retention of human resources, and sustainable operating revenue. These capabilities and support services pose rate-limiting yet necessary challenges. Laboratory capabilities depend on mission and role, as determined by all stakeholders, and demonstrate the need for relevant metrics to monitor the success of the laboratory, including support for internal and external audits. Our analysis concludes that alternative frameworks for success exist for developing and implementing capabilities at regional and national levels in limited resource areas. Thus, achieving a balance for standardizing practices between local procedures and accepted international standards is a prerequisite for integrating new facilities into a country's existing public health infrastructure and into the overall international scientific community. PMID:27559843

  14. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Pernice

    2010-09-01

    INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  15. Using the Computer as a Laboratory Instrument.

    ERIC Educational Resources Information Center

    Collings, Peter J.; Greenslade, Thomas B., Jr.

    1989-01-01

    Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)

  16. Statistics Online Computational Resource for Education

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicolas

    2009-01-01

    The Statistics Online Computational Resource (http://www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials--instructional resources and computational libraries. (Contains 2 figures.)

  17. An imputed genotype resource for the laboratory mouse

    PubMed Central

    Szatkiewicz, Jin P.; Beane, Glen L.; Ding, Yueming; Hutchins, Lucie; de Villena, Fernando Pardo-Manuel; Churchill, Gary A.

    2009-01-01

    We have created a high-density SNP resource encompassing 7.87 million polymorphic loci across 49 inbred strains of the laboratory mouse by combining data available from public databases and training a hidden Markov model to impute missing genotypes in the combined data. The strong linkage disequilibrium found in dense sets of SNP markers in the laboratory mouse provides the basis for accurate imputation. Using genotypes from eight independent SNP resources, we empirically validated the quality of the imputed genotypes and demonstrate that they are highly reliable for most inbred strains. The imputed SNP resource will be useful for studies of natural variation and complex traits. It will facilitate association study designs by providing high-density SNP genotypes for large numbers of mouse strains. We anticipate that this resource will continue to evolve as new genotype data become available for laboratory mouse strains. The data are available for bulk download or query at http://cgd.jax.org/. PMID:18301946
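
    A toy illustration of HMM-based imputation in the spirit of the approach above, not the authors' actual model: the hidden state is which of two reference strains the target strain locally matches, missing genotypes are treated as uninformative emissions, and a Viterbi decode fills the gaps. All genotypes and parameters are invented.

      # Hidden state = reference strain ("A" or "B"); None marks a missing genotype.
      ref = {"A": [0, 0, 1, 1, 1], "B": [1, 0, 0, 0, 1]}
      obs = [0, None, 1, None, 1]        # target strain with missing sites
      states, switch, err = ("A", "B"), 0.05, 0.01

      def emit(state, site, genotype):
          if genotype is None:           # missing data is uninformative
              return 1.0
          return 1 - err if ref[state][site] == genotype else err

      # Viterbi decode of the most likely state path
      v = {s: 0.5 * emit(s, 0, obs[0]) for s in states}
      back = []
      for site in range(1, len(obs)):
          nv, ptr = {}, {}
          for s in states:
              prev = max(states, key=lambda p: v[p] * (1 - switch if p == s else switch))
              nv[s] = v[prev] * (1 - switch if prev == s else switch) * emit(s, site, obs[site])
              ptr[s] = prev
          v = nv
          back.append(ptr)

      path = [max(states, key=v.get)]
      for ptr in reversed(back):
          path.append(ptr[path[-1]])
      path.reverse()

      # Fill each missing genotype from the reference strain the path selects
      imputed = [g if g is not None else ref[st][i] for i, (st, g) in enumerate(zip(path, obs))]
      print(path, imputed)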

  18. Computer laboratory in medical education for medical students.

    PubMed

    Hercigonja-Szekeres, Mira; Marinović, Darko; Kern, Josipa

    2009-01-01

    Five generations of second year students at the Zagreb University School of Medicine were interviewed through an anonymous questionnaire on their use of personal computers, Internet, computer laboratories and computer-assisted education in general. Results show an advance in students' usage of information and communication technology during the period from 1998/99 to 2002/03. However, their positive opinion about computer laboratory depends on installed capacities: the better the computer laboratory technology, the better the students' acceptance and use of it.

  19. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    NASA Astrophysics Data System (ADS)

    Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.

    2011-12-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand" as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
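
    The paper's bottom line, dedicated capacity for the baseline and cloud for bursts, comes down to simple arithmetic. A sketch with invented prices and workload numbers:

      # Back-of-envelope comparison of dedicated vs. cloud "burst" capacity.
      # All prices and workload figures are invented for illustration.
      DEDICATED_COST_PER_CORE_HOUR = 0.03   # amortized hardware + operations
      CLOUD_COST_PER_CORE_HOUR = 0.10       # on-demand instance pricing

      def annual_cost(baseline_cores, peak_cores, peak_hours):
          hours = 365 * 24
          dedicated_only = peak_cores * hours * DEDICATED_COST_PER_CORE_HOUR
          hybrid = (baseline_cores * hours * DEDICATED_COST_PER_CORE_HOUR
                    + (peak_cores - baseline_cores) * peak_hours * CLOUD_COST_PER_CORE_HOUR)
          return dedicated_only, hybrid

      dedicated, hybrid = annual_cost(baseline_cores=1000, peak_cores=3000, peak_hours=500)
      print(f"size for peak, all dedicated: ${dedicated:,.0f}/yr")
      print(f"dedicated base + cloud burst: ${hybrid:,.0f}/yr")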

  1. Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A.; Alfultis, Michael

    1998-01-01

    Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.

  2. The Workstation Approach to Laboratory Computing

    PubMed Central

    Crosby, P.A.; Malachowski, G.C.; Hall, B.R.; Stevens, V.; Gunn, B.J.; Hudson, S.; Schlosser, D.

    1985-01-01

    There is a need for a Laboratory Workstation which specifically addresses the problems associated with computing in the scientific laboratory. A workstation based on the IBM PC architecture and including a front-end data acquisition system which communicates with a host computer via a high-speed communications link; a new graphics display controller with hardware window management and window scrolling; and an integrated software package is described.

  3. NATURAL RESOURCE MANAGEMENT PLAN FOR BROOKHAVEN NATIONAL LABORATORY.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, T., et al.

    2003-12-31

    Brookhaven National Laboratory (BNL) is located near the geographic center of Long Island, New York. The Laboratory is situated on 5,265 acres of land composed of Pine Barrens habitat with a central area developed for Laboratory work. In the mid-1990s BNL began developing a wildlife management program. This program was guided by the Wildlife Management Plan (WMP), which was reviewed and approved by various state and federal agencies in September 1999. The WMP primarily addressed concerns with the protection of species listed by New York State as threatened, endangered, or of special concern, as well as deer populations, invasive species management, and the revegetation of the area surrounding the Relativistic Heavy Ion Collider (RHIC). The WMP provided a strong and sound basis for wildlife management and established a basis for forward motion and the development of this document, the Natural Resource Management Plan (NRMP), which will guide the natural resource management program for BNL. The body of this plan establishes the management goals and actions necessary for managing the natural resources at BNL. The appendices provide specific management requirements for threatened and endangered amphibians and fish (Appendices A and B respectively), lists of actions in tabular format (Appendix C), and regulatory drivers for the Natural Resource Program (Appendix D). The purpose of the Natural Resource Management Plan is to provide management guidance, promote stewardship of the natural resources found at BNL, and to integrate their protection with pursuit of the Laboratory's mission. The philosophy or guiding principles of the NRMP are stewardship, adaptive ecosystem management, compliance, integration with other plans and requirements, and incorporation of community involvement, where applicable.

  4. Enabling opportunistic resources for CMS Computing Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufnagel, Dirk

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  5. Enabling opportunistic resources for CMS Computing Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hufnagel, Dirk

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  6. Enabling opportunistic resources for CMS Computing Operations

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  7. Earth Resources Laboratory research and technology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The accomplishments of the Earth Resources Laboratory's research and technology program are reported. Sensors and data systems, the AGRISTARS project, applied research and data analysis, joint research projects, test and evaluation studies, and space station support activities are addressed.

  8. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  9. Oklahoma's Mobile Computer Graphics Laboratory.

    ERIC Educational Resources Information Center

    McClain, Gerald R.

    This Computer Graphics Laboratory houses an IBM 1130 computer, U.C.C. plotter, printer, card reader, two key punch machines, and seminar-type classroom furniture. A "General Drafting Graphics System" (GDGS) is used, based on repetitive use of basic coordinate and plot generating commands. The system is used by 12 institutions of higher education…

  10. Classical multiparty computation using quantum resources

    NASA Astrophysics Data System (ADS)

    Clementi, Marco; Pappa, Anna; Eckstein, Andreas; Walmsley, Ian A.; Kashefi, Elham; Barz, Stefanie

    2017-12-01

    In this work, we demonstrate a way to perform classical multiparty computing among parties with limited computational resources. Our method harnesses quantum resources to increase the computational power of the individual parties. We show how a set of clients restricted to linear classical processing are able to jointly compute a nonlinear multivariable function that lies beyond their individual capabilities. The clients are only allowed to perform classical XOR gates and single-qubit gates on quantum states. We also examine the type of security that can be achieved in this limited setting. Finally, we provide a proof-of-concept implementation using photonic qubits that allows four clients to compute a specific example of a multiparty function, the pairwise AND.

  11. Computer-Assisted Laboratory Stations.

    ERIC Educational Resources Information Center

    Snyder, William J.; Hanyak, Michael E.

    1985-01-01

    Describes the advantages and features of computer-assisted laboratory stations for use in a chemical engineering program. Also describes a typical experiment at such a station: determining the response times of a solid state humidity sensor at various humidity conditions and developing an empirical model for the sensor. (JN)

  12. Lipid and lipoprotein testing in resource-limited laboratories.

    PubMed

    Myers, Gary L

    2003-01-01

    The role of total cholesterol (TC) and lipoproteins in the assessment of coronary heart disease (CHD) is firmly established from population and intervention studies. Total and low-density lipoprotein cholesterol (LDLC) levels are positively associated with CHD, and high-density lipoprotein cholesterol (HDLC) levels are negatively associated with CHD. Efforts to identify and treat people at increased risk based on cholesterol and lipoprotein levels have led to more lipid testing and the need for very reliable test results. Thus, quality laboratory services are an essential component of healthcare delivery and play a vital role in any strategy to reduce morbidity and mortality from CHD. In laboratories with limited resources, establishing laboratory capability to measure CHD risk markers may be a considerable challenge. Laboratories face problems in selecting proper techniques, difficulties in equipment availability and maintenance, and shortage of supplies, staffing, and supervision. The Centers for Disease Control and Prevention (CDC) has been providing technical assistance for more than 30 years to laboratories that measure lipids and lipoproteins and is willing to provide technical assistance as needed for other laboratories to develop this capability. CDC can provide technical assistance to establish lipid and lipoprotein testing capability to support a CHD public health program in areas with limited laboratory resources. This assistance includes: selecting a suitable testing instrument; providing training for laboratory technicians; establishing a simple quality control plan; and instructing staff on how to prepare frozen serum control materials suitable for assessing accuracy of lipid and lipoprotein testing.

  13. Teaching Cardiovascular Integrations with Computer Laboratories.

    ERIC Educational Resources Information Center

    Peterson, Nils S.; Campbell, Kenneth B.

    1985-01-01

    Describes a computer-based instructional unit in cardiovascular physiology. The program (which employs simulated laboratory experimental techniques with a problem-solving format) is designed to supplement an animal laboratory and to offer students an integrative approach to physiology through use of microcomputers. Also presents an overview of the…

  14. Resource Provisioning in SLA-Based Cluster Computing

    NASA Astrophysics Data System (ADS)

    Xiong, Kaiqi; Suh, Sang

    Cluster computing is excellent for parallel computation. It has become increasingly popular. In cluster computing, a service level agreement (SLA) is a set of quality of services (QoS) and a fee agreed between a customer and an application service provider. It plays an important role in an e-business application. An application service provider uses a set of cluster computing resources to support e-business applications subject to an SLA. In this paper, the QoS includes percentile response time and cluster utilization. We present an approach for resource provisioning in such an environment that minimizes the total cost of cluster computing resources used by an application service provider for an e-business application that often requires parallel computation for high service performance, availability, and reliability while satisfying a QoS and a fee negotiated between a customer and the application service provider. Simulation experiments demonstrate the applicability of the approach.
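
    A minimal sketch of SLA-driven provisioning in this spirit: find the smallest cluster size that meets both a utilization cap and a percentile bound on queueing delay. The M/M/c queueing model and all numbers are assumptions for illustration, not the paper's formulation.

      import math

      def erlang_c(c, a):
          # Probability an arriving job waits in an M/M/c queue (offered load a = lam/mu).
          rho = a / c
          top = a**c / (math.factorial(c) * (1 - rho))
          bottom = sum(a**k / math.factorial(k) for k in range(c)) + top
          return top / bottom

      def min_servers(lam, mu, t_target, pct, rho_max):
          # Smallest c with utilization <= rho_max and P(wait <= t_target) >= pct.
          c = max(1, math.ceil(lam / mu))
          while True:
              rho = lam / (c * mu)
              if rho < 1 and rho <= rho_max:
                  p_late = erlang_c(c, lam / mu) * math.exp(-(c * mu - lam) * t_target)
                  if p_late <= 1 - pct:
                      return c
              c += 1

      # e.g. 40 jobs/s, 5 jobs/s per server, 95% of jobs queued < 0.5 s, utilization <= 80%
      print(min_servers(lam=40, mu=5, t_target=0.5, pct=0.95, rho_max=0.80))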

  15. Computing the Envelope for Stepwise-Constant Resource Allocations

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Computing tight resource-level bounds is a fundamental problem in the construction of flexible plans with resource utilization. In this paper we describe an efficient algorithm that builds a resource envelope, the tightest possible such bound. The algorithm is based on transforming the temporal network of resource consuming and producing events into a flow network with nodes equal to the events and edges equal to the necessary predecessor links between events. A staged maximum flow problem on the network is then used to compute the time of occurrence and the height of each step of the resource envelope profile. Each stage has the same computational complexity as solving a maximum flow problem on the entire flow network. This makes this method computationally feasible and promising for use in the inner loop of flexible-time scheduling algorithms.
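
    The envelope itself is easy to state even though computing it efficiently is the hard part. Below is a brute-force reference implementation of the definition on a tiny flexible plan, enumerating all consistent integer-time schedules; this is not the paper's staged maximum-flow algorithm, and the events and constraints are invented.

      from itertools import product

      # Events: (name, resource delta, earliest time, latest time).
      # Constraints: event j must occur at least `gap` time units after event i.
      events = [("produce", +2, 0, 3), ("consume", -1, 1, 4)]
      constraints = [(0, 1, 1)]

      horizon = range(0, 6)
      upper = {t: float("-inf") for t in horizon}
      lower = {t: float("inf") for t in horizon}

      for times in product(*[range(lo, hi + 1) for _, _, lo, hi in events]):
          if any(times[j] - times[i] < gap for i, j, gap in constraints):
              continue                  # not a consistent fixed-time schedule
          for t in horizon:             # resource level at t for this schedule
              level = sum(d for (_, d, _, _), et in zip(events, times) if et <= t)
              upper[t] = max(upper[t], level)
              lower[t] = min(lower[t], level)

      for t in horizon:                 # the envelope: min and max level over time
          print(t, lower[t], upper[t])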

  16. Software and resources for computational medicinal chemistry

    PubMed Central

    Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C

    2011-01-01

    Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404

  17. System Resource Allocations | High-Performance Computing | NREL

    Science.gov Websites

    To use NREL's high-performance computing (HPC) resources, users request allocations of compute hours on NREL HPC systems, including Peregrine and Eagle, and of storage space (in terabytes) on Peregrine, Eagle, and Gyrfalcon. Allocations are principally made in response to an annual call for allocations.

  18. Mouse Genome Informatics (MGI) Resource: Genetic, Genomic, and Biological Knowledgebase for the Laboratory Mouse.

    PubMed

    Eppig, Janan T

    2017-07-01

    The Mouse Genome Informatics (MGI) Resource supports basic, translational, and computational research by providing high-quality, integrated data on the genetics, genomics, and biology of the laboratory mouse. MGI serves a strategic role for the scientific community in facilitating biomedical, experimental, and computational studies investigating the genetics and processes of diseases and enabling the development and testing of new disease models and therapeutic interventions. This review describes the nexus of the body of growing genetic and biological data and the advances in computer technology in the late 1980s, including the World Wide Web, that together launched the beginnings of MGI. MGI develops and maintains a gold-standard resource that reflects the current state of knowledge, provides semantic and contextual data integration that fosters hypothesis testing, continually develops new and improved tools for searching and analysis, and partners with the scientific community to assure research data needs are met. Here we describe one slice of MGI relating to the development of community-wide large-scale mutagenesis and phenotyping projects and introduce ways to access and use these MGI data. References and links to additional MGI aspects are provided. © The Author 2017. Published by Oxford University Press.

  19. Mouse Genome Informatics (MGI) Resource: Genetic, Genomic, and Biological Knowledgebase for the Laboratory Mouse

    PubMed Central

    Eppig, Janan T.

    2017-01-01

    The Mouse Genome Informatics (MGI) Resource supports basic, translational, and computational research by providing high-quality, integrated data on the genetics, genomics, and biology of the laboratory mouse. MGI serves a strategic role for the scientific community in facilitating biomedical, experimental, and computational studies investigating the genetics and processes of diseases and enabling the development and testing of new disease models and therapeutic interventions. This review describes the nexus of the body of growing genetic and biological data and the advances in computer technology in the late 1980s, including the World Wide Web, that together launched the beginnings of MGI. MGI develops and maintains a gold-standard resource that reflects the current state of knowledge, provides semantic and contextual data integration that fosters hypothesis testing, continually develops new and improved tools for searching and analysis, and partners with the scientific community to assure research data needs are met. Here we describe one slice of MGI relating to the development of community-wide large-scale mutagenesis and phenotyping projects and introduce ways to access and use these MGI data. References and links to additional MGI aspects are provided. PMID:28838066

  20. Implementing Computer Based Laboratories

    NASA Astrophysics Data System (ADS)

    Peterson, David

    2001-11-01

    Physics students at Francis Marion University will complete several required laboratory exercises utilizing computer-based Vernier probes. The simple pendulum, the acceleration due to gravity, simple harmonic motion, radioactive half-lives, and radiation inverse-square law experiments will be incorporated into calculus-based and algebra-based physics courses. Assessment of student learning and faculty satisfaction will be carried out by surveys and test results. Cost effectiveness and time effectiveness assessments will be presented. Majors in Computational Physics, Health Physics, Engineering, Chemistry, Mathematics and Biology take these courses, and assessments will be categorized by major. To enhance the computer skills of students enrolled in the courses, MAPLE will be used for further analysis of the data acquired during the experiments. Assessment of these enhancement exercises will also be presented.

  1. A Software Laboratory Environment for Computer-Based Problem Solving.

    ERIC Educational Resources Information Center

    Kurtz, Barry L.; O'Neal, Micheal B.

    This paper describes a National Science Foundation-sponsored project at Louisiana Technological University to develop computer-based laboratories for "hands-on" introductions to major topics of computer science. The underlying strategy is to develop structured laboratory environments that present abstract concepts through the use of…

  2. Annotated Bibliography of the Air Force Human Resources Laboratory Technical Reports - 1979.

    DTIC Science & Technology

    1981-05-01

    Air Force Human Resources Laboratory, March 1980. (Covers all AFHRL projects.) NTIS. This document provides the academic and industrial R&D community with an annotated bibliography of Air Force Human Resources Laboratory technical reports published in 1979. Compiled by Esther M. Barlow, Technical Services Division, Brooks Air Force Base, Texas.

  3. Determination of Absolute Zero Using a Computer-Based Laboratory

    ERIC Educational Resources Information Center

    Amrani, D.

    2007-01-01

    We present a simple computer-based laboratory experiment for evaluating absolute zero in degrees Celsius, which can be performed in college and undergraduate physical sciences laboratory courses. With a computer, absolute zero apparatus can help demonstrators or students to observe the relationship between temperature and pressure and use…
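
    A minimal version of the underlying analysis: fit pressure against Celsius temperature for a fixed-volume gas sample and extrapolate to zero pressure; the intercept on the temperature axis estimates absolute zero. The data points below are invented but physically plausible.

      # Least-squares fit of pressure vs. Celsius temperature for a fixed-volume gas;
      # extrapolating to P = 0 estimates absolute zero. Data points are invented.
      temps = [0.0, 20.0, 40.0, 60.0, 80.0]             # degrees Celsius
      pressures = [99.6, 106.9, 114.2, 121.5, 128.8]    # kPa

      n = len(temps)
      mean_t = sum(temps) / n
      mean_p = sum(pressures) / n
      slope = sum((t - mean_t) * (p - mean_p) for t, p in zip(temps, pressures)) \
              / sum((t - mean_t) ** 2 for t in temps)
      intercept = mean_p - slope * mean_t

      # Temperature at which the fitted line crosses P = 0
      print(f"absolute zero estimate: {-intercept / slope:.1f} deg C")   # about -273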

  4. Implementing a resource management program for accreditation process at the medical laboratory.

    PubMed

    Yenice, Sedef

    2009-03-01

    To plan for and provide adequate resources to meet the mission and goals of a medical laboratory in compliance with the requirements for laboratory accreditation by Joint Commission International. The related policies and procedures were developed based on standard requirements for resource management. Competency assessment provided continuing education and performance feedback to laboratory employees. Laboratory areas were designed for the efficient and safe performance of laboratory work. A physical environment was built up where hazards were controlled and personnel activities were managed to reduce the risk of injuries. An Employees Occupational Safety and Health Program (EOSHP) was developed to address all types of hazardous materials and wastes. Guidelines were defined to verify that the methods would produce accurate and reliable results. An active resource management program will be an effective way of assuring that systems are in control and continuous improvement is in progress.

  5. Computing the Envelope for Stepwise Constant Resource Allocations

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Estimating tight resource level is a fundamental problem in the construction of flexible plans with resource utilization. In this paper we describe an efficient algorithm that builds a resource envelope, the tightest possible such bound. The algorithm is based on transforming the temporal network of resource consuming and producing events into a flow network with noises equal to the events and edges equal to the necessary predecessor links between events. The incremental solution of a staged maximum flow problem on the network is then used to compute the time of occurrence and the height of each step of the resource envelope profile. The staged algorithm has the same computational complexity of solving a maximum flow problem on the entire flow network. This makes this method computationally feasible for use in the inner loop of search-based scheduling algorithms.

  6. NACA Computer at the Lewis Flight Propulsion Laboratory

    NASA Image and Video Library

    1951-02-21

    A female computer at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory uses a slide rule and a Friden adding machine to make computations. The computer staff was introduced during World War II to relieve short-handed research engineers of some of the tedious computational work. The Computing Section was staffed by “computers,” young female employees, who often worked overnight when most of the tests were run. The computers obtained test data from the manometers and other instruments, made the initial computations, and plotted the data graphically. Researchers then analyzed the data and summarized the findings in a report or made modifications and ran the test again. There were over 400 female employees at the laboratory in 1944, including 100 computers. The use of computers was originally planned only for the duration of the war. The system was so successful that it was extended into the 1960s. The computers and analysts were located in the Altitude Wind Tunnel Shop and Office Building office wing during the 1940s and transferred to the new 8- by 6-Foot Supersonic Wind Tunnel in 1948.

  7. Real-Time, Sensor-Based Computing in the Laboratory.

    ERIC Educational Resources Information Center

    Badmus, O. O.; And Others

    1996-01-01

    Demonstrates the importance of Real-Time, Sensor-Based (RTSB) computing and how it can be easily and effectively integrated into university student laboratories. Describes the experimental processes, the process instrumentation and process-computer interface, the computer and communications systems, and typical software. Provides much technical…

  8. Integration of Cloud resources in the LHCb Distributed Computing

    NASA Astrophysics Data System (ADS)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it was seamlessly integrating Grid resources and Computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we describe as well the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.
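
    A sketch of the multi-backend driver pattern such an extension implies: one scheduler talks to several clouds through a common interface. Class and method names here are hypothetical, not the actual VMDIRAC API, and the back-ends are stubs rather than real cloud calls.

      # Provider-agnostic VM management via a common driver interface.
      class CloudDriver:
          def start_vm(self, image): raise NotImplementedError
          def list_vms(self): raise NotImplementedError

      class OpenStackDriver(CloudDriver):
          def __init__(self): self.vms = []
          def start_vm(self, image):
              self.vms.append(f"openstack-{image}-{len(self.vms)}")   # stub, no real API call
              return self.vms[-1]
          def list_vms(self): return list(self.vms)

      class EC2Driver(CloudDriver):
          def __init__(self): self.vms = []
          def start_vm(self, image):
              self.vms.append(f"ec2-{image}-{len(self.vms)}")         # stub, no real API call
              return self.vms[-1]
          def list_vms(self): return list(self.vms)

      class VMScheduler:
          # Dispatches VM requests across heterogeneous clouds via one interface.
          def __init__(self, drivers): self.drivers = drivers
          def scale_out(self, image, n):
              return [self.drivers[i % len(self.drivers)].start_vm(image)
                      for i in range(n)]

      pool = VMScheduler([OpenStackDriver(), EC2Driver()])
      print(pool.scale_out("worker-node", 4))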

  9. Framework Resources Multiply Computing Power

    NASA Technical Reports Server (NTRS)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California (now FusionGeo Inc., of The Woodlands, Texas), to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  10. Computation in Physics: Resources and Support

    NASA Astrophysics Data System (ADS)

    Engelhardt, Larry; Caballero, Marcos; Chonacky, Norman; Hilborn, Robert; Lopez Del Puerto, Marie; Roos, Kelly

    We will describe exciting new resources and support opportunities that have been developed by "PICUP" to help faculty to integrate computation into their physics courses. ("PICUP" is the "Partnership for Integration of Computation into Undergraduate Physics".) These resources include editable curricular materials that can be downloaded from the PICUP Collection of the ComPADRE Digital Library: www.compadre.org/PICUP. Support opportunities include week-long workshops during the summer and single-day workshops at national AAPT and APS meetings. This project is funded by the National Science Foundation under DUE IUSE Grants 1524128, 1524493, 1524963, 1525062, and 1525525.

  11. ACToR: Aggregated Computational Toxicology Resource ...

    EPA Pesticide Factsheets

    We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.

  12. Computer Technology Resources for Literacy Projects.

    ERIC Educational Resources Information Center

    Florida State Council on Aging, Tallahassee.

    This resource booklet was prepared to assist literacy projects and community adult education programs in determining the technology they need to serve more older persons. Section 1 contains the following reprinted articles: "The Human Touch in the Computer Age: Seniors Learn Computer Skills from Schoolkids" (Suzanne Kashuba);…

  13. Computing arrival times of firefighting resources for initial attack

    Treesearch

    Romain M. Mees

    1978-01-01

    Dispatching of firefighting resources requires instantaneous or precalculated decisions. A FORTRAN computer program has been developed that can provide a list of resources in order of computed arrival time for initial attack on a fire. The program requires an accurate description of the existing road system and a list of all resources available on a planning unit....
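
    The core computation the abstract describes amounts to a shortest-path search over the road network followed by a sort on arrival time. Below is a minimal Python sketch of that idea (not the original FORTRAN program); the road graph, stations, and travel times are invented.

```python
import heapq

def shortest_times(graph, source):
    """Dijkstra's algorithm: minutes from `source` to every reachable node."""
    dist = {source: 0.0}
    queue = [(0.0, source)]
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, minutes in graph.get(node, []):
            nd = d + minutes
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(queue, (nd, neighbor))
    return dist

# Road segments as (neighbor, travel minutes); undirected, so listed twice.
roads = {
    "fire":     [("jct1", 12.0), ("jct2", 20.0)],
    "jct1":     [("fire", 12.0), ("stationA", 8.0)],
    "jct2":     [("fire", 20.0), ("stationB", 5.0)],
    "stationA": [("jct1", 8.0)],
    "stationB": [("jct2", 5.0)],
}
resources = {"engine-1": "stationA", "crew-7": "stationB"}

# Travel times are symmetric here, so distances from the fire equal
# distances to the fire; sort resources by computed arrival time.
times = shortest_times(roads, "fire")
for eta, name in sorted((times[loc], name) for name, loc in resources.items()):
    print(f"{name}: arrives in {eta:.0f} min")
```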

  14. Laboratory challenges conducting international clinical research in resource-limited settings.

    PubMed

    Fitzgibbon, Joseph E; Wallis, Carole L

    2014-01-01

    There are many challenges to performing clinical research in resource-limited settings. Here, we discuss several of the most common laboratory issues that must be addressed. These include issues relating to organization and personnel, laboratory facilities and equipment, standard operating procedures, external quality assurance, shipping, laboratory capacity, and data management. Although much progress has been made, innovative ways of addressing some of these issues are still very much needed.

  15. Computing Bounds on Resource Levels for Flexible Plans

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan (see figure). Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan, because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, it could replace the looser bounds currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose measure of complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The measure of asymptotic complexity of the algorithm is O(N · maxflow(N)), where O(x) denotes an amount of computing time or a number of arithmetic operations proportional to a number of the order of x, and maxflow(N) is the measure of complexity (and thus of cost) of a maximum-flow computation.
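
    The first ingredient of the algorithm, shortest paths in the temporal-constraint network, can be illustrated compactly. The sketch below (a toy example, not the envelope algorithm itself) closes a simple temporal network with Floyd-Warshall to obtain each event's feasible time window; the constraints are invented.

```python
# Events 0..3, with event 0 the time origin. An edge u -> v of weight w
# encodes the simple-temporal-network constraint t[v] - t[u] <= w.
INF = float("inf")
n = 4
w = [[0 if i == j else INF for j in range(n)] for i in range(n)]

def constrain(u, v, lo, hi):
    """Impose lo <= t[v] - t[u] <= hi."""
    w[u][v] = min(w[u][v], hi)
    w[v][u] = min(w[v][u], -lo)

constrain(0, 1, 0, 10)   # an activity may start anywhere in [0, 10]
constrain(1, 2, 3, 5)    # it lasts between 3 and 5 time units
constrain(0, 3, 0, 20)   # plan horizon for a final event

# Floyd-Warshall computes all-pairs shortest paths, i.e. the tightest
# implied constraints between every pair of events.
for k in range(n):
    for i in range(n):
        for j in range(n):
            if w[i][k] + w[k][j] < w[i][j]:
                w[i][j] = w[i][k] + w[k][j]

for e in range(1, n):
    # Shortest paths to/from the origin give each event's time window.
    print(f"event {e}: feasible window [{-w[e][0]}, {w[0][e]}]")
```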

  16. A computer-based physics laboratory apparatus: Signal generator software

    NASA Astrophysics Data System (ADS)

    Thanakittiviroon, Tharest; Liangrocapart, Sompong

    2005-09-01

    This paper describes a computer-based physics laboratory apparatus that replaces expensive instruments such as high-precision signal generators. The apparatus uses the sound card in a common personal computer to produce sinusoidal signals of accurately programmable frequency, so that signals of different frequencies can be generated repeatedly. An experiment on standing waves on an oscillating string uses this apparatus. In conjunction with interactive lab manuals, which have been developed on personal computers at our university, we achieve a complete set of low-cost, accurate, and easy-to-use equipment for teaching a physics laboratory.
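
    The principle is easy to demonstrate in software. The sketch below, using only the Python standard library, writes a sine tone of a programmed frequency to a WAV file that a sound card can play; the frequency, duration, and sample rate are illustrative values, not the authors' parameters.

```python
import math
import struct
import wave

def write_sine(filename, freq_hz=440.0, seconds=2.0, rate=44100, amp=0.8):
    """Write a 16-bit mono sine wave of the requested frequency."""
    with wave.open(filename, "wb") as wav:
        wav.setnchannels(1)    # mono
        wav.setsampwidth(2)    # 16-bit samples
        wav.setframerate(rate)
        frames = bytearray()
        for i in range(int(seconds * rate)):
            sample = amp * math.sin(2.0 * math.pi * freq_hz * i / rate)
            frames += struct.pack("<h", int(sample * 32767))
        wav.writeframes(bytes(frames))

# e.g. a driving tone for a standing-waves-on-a-string experiment
write_sine("tone_440Hz.wav", freq_hz=440.0)
```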

  17. Research on elastic resource management for multi-queue under cloud computing environment

    NASA Astrophysics Data System (ADS)

    CHENG, Zhenjing; LI, Haibo; HUANG, Qiulan; Cheng, Yaodong; CHEN, Gang

    2017-10-01

    As a new approach to managing computing resources, virtualization technology is more and more widely applied in the high-energy physics field. A virtual computing cluster based on OpenStack was built at IHEP, using HTCondor as the job queue management system. In a traditional static cluster, a fixed number of virtual machines are pre-allocated to the job queues of different experiments. However, this method cannot adapt well to the volatility of computing resource requirements. To solve this problem, an elastic computing resource management system for a cloud computing environment has been designed. The system performs unified management of virtual computing nodes on the basis of the HTCondor job queues, using dual resource thresholds as well as a quota service. A two-stage pool is designed to improve the efficiency of resource pool expansion. This paper presents several use cases of the elastic resource management system in IHEPCloud. In practical runs, virtual computing resources dynamically expanded or shrank as computing requirements changed, and the CPU utilization ratio of the computing resources increased significantly compared with traditional resource management. The system also performs well when there are multiple HTCondor schedulers and multiple job queues.
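
    A dual-threshold policy of this kind reduces to a simple decision rule per evaluation cycle. The sketch below is a hedged illustration of the idea, not the IHEP implementation; the thresholds, quota, and queue names are invented.

```python
def scaling_decision(idle_jobs: int, idle_nodes: int, active_nodes: int,
                     quota: int, expand_threshold: int = 20,
                     shrink_threshold: int = 5) -> str:
    """Return 'expand', 'shrink', or 'hold' for one evaluation cycle."""
    if idle_jobs > expand_threshold and active_nodes < quota:
        return "expand"   # queue pressure and quota headroom: add VMs
    if idle_nodes > shrink_threshold:
        return "shrink"   # idle capacity costs resources: delete VMs
    return "hold"

# One monitoring cycle over two hypothetical per-experiment job queues,
# given as (idle jobs, idle nodes, active nodes).
queues = {"queue-a": (35, 1, 40), "queue-b": (0, 12, 30)}
for name, (idle_jobs, idle_nodes, active_nodes) in queues.items():
    print(name, scaling_decision(idle_jobs, idle_nodes, active_nodes, quota=50))
```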

  18. Environmental Resource Management Issues in Agronomy: A Lecture/Laboratory Course

    ERIC Educational Resources Information Center

    Munn, D. A.

    2004-01-01

    Environmental Sciences Technology T272 is a course with a laboratory addressing problems in soil and water quality and organic wastes utilization to serve students from associate degree programs in laboratory science and environmental resources management at a 2-year technical college. Goals are to build basic lab skills and understand the role…

  19. Report on Computing and Networking in the Space Science Laboratory by the SSL Computer Committee

    NASA Technical Reports Server (NTRS)

    Gallagher, D. L. (Editor)

    1993-01-01

    The Space Science Laboratory (SSL) at Marshall Space Flight Center is a multiprogram facility. Scientific research is conducted in four discipline areas: earth science and applications, solar-terrestrial physics, astrophysics, and microgravity science and applications. Representatives from each of these discipline areas participate in a Laboratory computer requirements committee, which developed this document. Its purpose is to establish and discuss Laboratory objectives for computing and networking in support of science, and to lay the foundation for a collective, multiprogram approach to providing these services. Special recognition is given to the importance of the national and international efforts of our research communities toward the development of interoperable, network-based computer applications.

  20. Computers in the General Physics Laboratory.

    ERIC Educational Resources Information Center

    Preston, Daryl W.; Good, R. H.

    1996-01-01

    Provides ideas and outcomes for nine computer laboratory experiments using a commercial eight-bit analog to digital (ADC) interface. Experiments cover statistics; rotation; harmonic motion; voltage, current, and resistance; ADC conversions; temperature measurement; single slit diffraction; and radioactive decay. Includes necessary schematics. (MVL)

  1. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    NASA Astrophysics Data System (ADS)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request, in particular, requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
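
    As a flavor of what such an allocation algorithm does, the sketch below implements a simple first-fit strategy: each new session's transceiver chain is placed on the first cluster with enough spare capacity. The clusters, capacities, and demands are invented numbers, not the authors' benchmark.

```python
def first_fit(clusters, demand):
    """Assign `demand` to the first cluster with enough spare capacity."""
    for name in sorted(clusters):
        if clusters[name] >= demand:
            clusters[name] -= demand
            return name
    return None  # session rejected: no cluster can host the transceiver

# Spare capacity per cluster and per-session processing demands,
# in arbitrary units of computing capacity.
clusters = {"cluster-0": 1000.0, "cluster-1": 1000.0}
sessions = [("user-a", 400.0), ("user-b", 700.0), ("user-c", 500.0)]

for user, demand in sessions:
    placed = first_fit(clusters, demand)
    print(f"{user}: {'-> ' + placed if placed else 'rejected'}")
```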

  2. A study of computer graphics technology in application of communication resource management

    NASA Astrophysics Data System (ADS)

    Li, Jing; Zhou, Liang; Yang, Fei

    2017-08-01

    With the development of computer technology, computer graphics technology has been widely used. In particular, the success of object-oriented technology and multimedia technology has promoted the development of graphics technology in computer software systems. Computer graphics theory and application technology have therefore become an important topic in the computer field, and computer graphics technology is applied ever more extensively across many domains. In recent years, with the development of the social economy and especially the rapid development of information technology, traditional approaches to communication resource management can no longer effectively meet the needs of resource management. Communication resource management still relies on its original tools and methods for equipment management and maintenance, which has caused many problems: it is very difficult for non-professionals to understand the equipment and its status, resource utilization is relatively low, and managers cannot quickly and accurately assess resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. The introduction of computer graphics not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.

  3. Computer Network Resources for Physical Geography Instruction.

    ERIC Educational Resources Information Center

    Bishop, Michael P.; And Others

    1993-01-01

    Asserts that the use of computer networks provides an important and effective resource for geography instruction. Describes the use of the Internet network in physical geography instruction. Provides an example of the use of Internet resources in a climatology/meteorology course. (CFR)

  4. National resource for computation in chemistry, phase I: evaluation and recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1980-05-01

    The National Resource for Computation in Chemistry (NRCC) was inaugurated at the Lawrence Berkeley Laboratory (LBL) in October 1977, with joint funding by the Department of Energy (DOE) and the National Science Foundation (NSF). The chief activities of the NRCC include: assembling a staff of eight postdoctoral computational chemists, establishing an office complex at LBL, purchasing a midi-computer and graphics display system, administering grants of computer time, conducting nine workshops in selected areas of computational chemistry, compiling a library of computer programs with adaptations and improvements, initiating a software distribution system, and providing user assistance and consultation on request. This report presents assessments and recommendations of an Ad Hoc Review Committee appointed by the DOE and NSF in January 1980. The recommendations are that NRCC should: (1) not fund grants for computing time or research but leave that to the relevant agencies, (2) continue the Workshop Program in a mode similar to Phase I, (3) abandon in-house program development and establish instead a competitive external postdoctoral program in chemistry software development administered by the Policy Board and Director, and (4) not attempt a software distribution system (leaving that function to the QCPE). Furthermore, (5) DOE should continue to make its computational facilities available to outside users (at normal cost rates) and should find some way to allow the chemical community to gain occasional access to a CRAY-level computer.

  5. A computer-managed undergraduate physics laboratory

    NASA Astrophysics Data System (ADS)

    Kalman, C. S.

    1987-01-01

    Seventeen one-semester undergraduate laboratory courses are managed by a microcomputer system at Concordia University. Students may perform experiments at any time during operating hours. The computer administers pre- and post-tests. Considerable savings in manpower costs are achieved. The system also provides many pedagogical advantages.

  6. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  7. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  8. Mathematics and Computer Science | Argonne National Laboratory

    Science.gov Websites

    Genomics and Systems Biology; LCRC - Laboratory Computing Resource Center; MCSG - Midwest Center for Structural Genomics; NAISE - Northwestern-Argonne Institute of Science & Engineering; SBC - Structural Biology Center

  9. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    DTIC Science & Technology

    1991-06-01

    Proceedings of the National Conference on Artificial Intelligence, pages 181-184, American Association for Artificial Intelligence, Pittsburgh. Interim Report: Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource: Intelligent Executive Computer Communication. John Lyman and Carla J. Conaway, University of California at Los Angeles.

  10. Accuracy of a laboratory-based computer implant guiding system.

    PubMed

    Barnea, Eitan; Alt, Ido; Kolerman, Roni; Nissan, Joseph

    2010-05-01

    Computer-guided implant placement is a growing treatment modality in partially and totally edentulous patients, though data about the accuracy of some systems for computer-guided surgery are limited. The purpose of this study was to evaluate the accuracy of a laboratory-based computer guiding system. A laboratory-based computer guiding system (M Guide; MIS Technologies, Shlomi, Israel) was used to place implants in a fresh sheep mandible. A second computerized tomography (CT) scan was taken after placing the implants. The drill-plan figures of the planned implants were positioned using assigned software (Med3D, Heidelberg, Germany) on the second CT scan to compare the implant positions with the initial planning. Values representing the implant locations of the original drill plan were compared with those of the placed implants using SPSS software. Six measurements (3 vertical, 3 horizontal) were made on each implant to assess the deviation from the initial implant planning. A repeated-measurement analysis of variance was performed comparing the location of measurement (center, abutment, apex) and type of deviation (vertical vs. horizontal). The vertical deviation (mean -0.168) was significantly smaller than the horizontal deviation (mean 1.148). The laboratory computer-based guiding system may be a viable treatment concept for placing implants. Copyright (c) 2010 Mosby, Inc. All rights reserved.

  11. ACToR - Aggregated Computational Toxicology Resource (S) ...

    EPA Pesticide Factsheets

    We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.

  12. A resource-sharing model based on a repeated game in fog computing.

    PubMed

    Sun, Yan; Zhang, Nan

    2017-03-01

    With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing is proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.
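
    The incentive idea can be caricatured in a few lines: in each round of the repeated game, owners who share spare resources earn credit, free-riders spend it, and an exhausted balance forces a node back into sharing. This toy simulation is only in the spirit of the paper's mechanism; the credit values and population are invented.

```python
import random

random.seed(1)
# Six resource owners; roughly 70% start out sharing their spare capacity.
owners = {f"node{i}": {"credit": 5, "shares": random.random() > 0.3}
          for i in range(6)}

for _ in range(10):                       # repeated rounds of the game
    for state in owners.values():
        if state["shares"]:
            state["credit"] += 2          # reward for contributing resources
        else:
            state["credit"] -= 1          # consuming without contributing
            if state["credit"] <= 0:      # incentive kicks in: a broke
                state["shares"] = True    # free-rider resumes sharing

for name, state in sorted(owners.items()):
    print(name, state)
```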

  13. Industry and Academic Consortium for Computer Based Subsurface Geology Laboratory

    NASA Astrophysics Data System (ADS)

    Brown, A. L.; Nunn, J. A.; Sears, S. O.

    2008-12-01

    Twenty-two licenses for Petrel software, acquired through a grant from Schlumberger, are being used to redesign the laboratory portion of Subsurface Geology at Louisiana State University. The course redesign is a cooperative effort between LSU's Geology and Geophysics and Petroleum Engineering Departments and Schlumberger's Technical Training Division. In spring 2008, two laboratory sections were taught with 22 students in each section. The class contained geology majors, petroleum engineering majors, and geology graduate students. Limited enrollments and 3-hour labs make it possible to incorporate hands-on visualization, animation, manipulation of data and images, and access to geological data available online. 24/7 access to the laboratory and step-by-step instructions for Petrel exercises strongly promoted peer instruction and individual learning. Goals of the course redesign include: enhancing visualization of earth materials; strengthening students' ability to acquire, manage, and interpret multifaceted geological information; fostering critical thinking and the scientific method; improving student communication skills; providing cross training between geologists and engineers; and increasing the quantity, quality, and diversity of students pursuing Earth Science and Petroleum Engineering careers. IT resources available in the laboratory provide students with sophisticated visualization tools, allowing them to switch between 2-D and 3-D reconstructions more seamlessly and enabling them to manipulate larger integrated data sets, thus permitting more time for critical thinking and hypothesis testing. IT resources also enable faculty and students to simultaneously work with the software to visually interrogate a 3-D data set and immediately test hypotheses formulated in class. Preliminary evaluation of class results indicates that students found the MS-Windows-based Petrel easy to learn. By the end of the semester, students were able to not only map horizons and faults

  14. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  15. Natural Resource Management Plan for Brookhaven National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    green, T.

    This comprehensive Natural Resource Management Plan (NRMP) for Brookhaven National Laboratory (BNL) was built on the successful foundation of the Wildlife Management Plan for BNL, which it replaces. This update to the 2003 plan continues to build on successes and efforts to better understand the ecosystems and natural resources found on the BNL site. The plan establishes the basis for managing the varied natural resources located on the 5,265-acre BNL site, setting goals and actions to achieve those goals. The planning of this document is based on the knowledge and expertise gained over the past 10 years by the Natural Resources management staff at BNL in concert with local natural resource agencies including the New York State Department of Environmental Conservation, Long Island Pine Barrens Joint Planning and Policy Commission, The Nature Conservancy, and others. The development of this plan is an attempt at sound ecological management that not only benefits BNL's ecosystems but also benefits the greater Pine Barrens habitats in which BNL is situated. This plan applies equally to the Upton Ecological and Research Reserve (Upton Reserve); any differences in management between the larger BNL area and the Upton Reserve are noted in the text. The purpose of the NRMP is to provide management guidance, promote stewardship of the natural resources found at BNL, and sustainably integrate their protection with pursuit of the Laboratory's mission. The guiding principles of the NRMP are stewardship, sustainability, adaptive ecosystem management, compliance, integration with other plans and requirements, and the incorporation of community involvement, where applicable. The NRMP is periodically reviewed and updated, typically every five years. This review and update was delayed to develop documents associated with a new third-party facility, the Long Island Solar Farm. This two-hundred-acre facility will result in

  16. Utilization of Educationally Oriented Microcomputer Based Laboratories

    ERIC Educational Resources Information Center

    Fitzpatrick, Michael J.; Howard, James A.

    1977-01-01

    Describes one approach to supplying engineering and computer science educators with an economical portable digital systems laboratory centered around microprocessors. Expansion of the microcomputer based laboratory concept to include Learning Resource Aided Instruction (LRAI) systems is explored. (Author)

  17. Voting with Their Seats: Computer Laboratory Design and the Casual User

    ERIC Educational Resources Information Center

    Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David

    2007-01-01

    Student computer laboratories are provided by most teaching institutions around the world; however, what is the most effective layout for such facilities? The log-in data files from computer laboratories at a regional university in Australia were analysed to determine whether there was a pattern in student seating. In particular, it was…

  18. A resource management architecture based on complex network theory in cloud computing federation

    NASA Astrophysics Data System (ADS)

    Zhang, Zehua; Zhang, Xuejie

    2011-10-01

    Cloud Computing Federation is a main trend of Cloud Computing. Resource management has a significant effect on the design, realization, and efficiency of a Cloud Computing Federation. A Cloud Computing Federation has the typical characteristics of a complex system; therefore, we propose a resource management architecture based on complex network theory for Cloud Computing Federation (abbreviated as RMABC) in this paper, with a detailed design of the resource discovery and resource announcement mechanisms. Compared with existing resource management mechanisms in distributed computing systems, a Task Manager in RMABC can use historical information and current state data obtained from other Task Managers to evolve the complex network composed of Task Managers, and thus has advantages in resource discovery speed, fault tolerance, and adaptive ability. The results of the model experiment confirmed the advantage of RMABC in resource discovery performance.
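
    To make the discovery idea concrete, the sketch below builds a scale-free network (a common complex-network model) with networkx and walks it breadth-first from a querying Task Manager until a node advertising the wanted resource is found. Graph size, hop limit, and resource placement are arbitrary choices, not the paper's parameters.

```python
import random
from collections import deque

import networkx as nx

random.seed(7)
g = nx.barabasi_albert_graph(50, 2)              # preferential attachment
has_resource = {n: random.random() < 0.1 for n in g.nodes}

def discover(graph, start, max_hops=4):
    """BFS from `start`, up to `max_hops`; return (node, hops) or (None, None)."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if has_resource[node]:
            return node, hops
        if hops < max_hops:
            for nb in graph.neighbors(node):
                if nb not in seen:
                    seen.add(nb)
                    queue.append((nb, hops + 1))
    return None, None

print(discover(g, start=0))
```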

  19. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    NASA Astrophysics Data System (ADS)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources, however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  20. Contextuality as a Resource for Models of Quantum Computation with Qubits

    NASA Astrophysics Data System (ADS)

    Bermejo-Vega, Juan; Delfosse, Nicolas; Browne, Dan E.; Okay, Cihan; Raussendorf, Robert

    2017-09-01

    A central question in quantum computation is to identify the resources that are responsible for quantum speed-up. Quantum contextuality has been recently shown to be a resource for quantum computation with magic states for odd-prime dimensional qudits and two-dimensional systems with real wave functions. The phenomenon of state-independent contextuality poses a priori an obstruction to characterizing the case of regular qubits, the fundamental building block of quantum computation. Here, we establish contextuality of magic states as a necessary resource for a large class of quantum computation schemes on qubits. We illustrate our result with a concrete scheme related to measurement-based quantum computation.

  1. A multipurpose computing center with distributed resources

    NASA Astrophysics Data System (ADS)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and ATLAS experiments; the same group of services is used by the astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). The OSG stack is installed for the NOvA experiment. Other groups of users use the local batch system directly. Storage capacity is distributed over several locations. The DPM servers used by ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources through the standard ATLAS tools in the same way as the local storage, without noticing this geographical distribution. The computing clusters LUNA and EXMAG, dedicated to users mostly from the Solid State Physics departments, offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum, with a distributed batch system based on torque with a custom scheduler. Clusters are installed remotely by the MetaCentrum team, and a local contact helps only when needed. Users from the IoP have exclusive access to only a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic, with a capacity of more than 12000 cores in total.

  2. Exploiting volatile opportunistic computing resources with Lobster

    NASA Astrophysics Data System (ADS)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.

  3. The Advanced Labs Website: resources for upper-level laboratories

    NASA Astrophysics Data System (ADS)

    Torres-Isea, Ramon

    2012-03-01

    The Advanced Labs web resource collection is an effort to create a central, comprehensive information base for college/university faculty who teach upper-level undergraduate laboratories. The website is produced by the American Association of Physics Teachers (AAPT). It is a part of ComPADRE, the online collection of resources in physics and astronomy education, which itself is a part of the National Science Foundation-funded National Science Digital Library (NSDL). After a brief review of its history, we will discuss the current status of the website while describing the various types of resources available at the site and presenting examples of each. We will detail a step-by-step procedure for submitting resources to the website. The resource collection is designed to be a community effort and thus welcomes input and contributions from its users. We will also present plans, and will seek audience feedback, for additional website services and features. The constraints, roadblocks, and rewards of this project will also be addressed.

  4. Using OSG Computing Resources with (iLC)Dirac

    NASA Astrophysics Data System (ADS)

    Sailer, A.; Petric, M.; CLICdp Collaboration

    2017-10-01

    CPU cycles for small experiments and projects can be scarce, so making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which submit directly to the local batch system. This in turn requires additional dedicated effort for small experiments at the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were developed. Not only is the usage of these types of computing elements now completely transparent for all DIRAC instances, which makes DIRAC a flexible solution for OSG-based virtual organisations, but it also allows LCG grid sites to move to the HTCondor-CE software without shutting DIRAC-based VOs out of their site. In these proceedings we detail how we interfaced the DIRAC system to the HTCondor-CE and Globus computing elements, explain the encountered obstacles and the solutions developed, and describe how the linear collider community uses resources in the OSG.

  5. COMPUTATIONAL RESOURCES FOR BIOFUEL FEEDSTOCK SPECIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buell, Carol Robin; Childs, Kevin L

    2013-05-07

    While current production of ethanol as a biofuel relies on starch and sugar inputs, it is anticipated that sustainable production of ethanol for biofuel use will utilize lignocellulosic feedstocks. Candidate plant species to be used for lignocellulosic ethanol production include a large number of species within the Grass, Pine and Birch plant families. For these biofuel feedstock species, there are variable amounts of genome sequence resources available, ranging from complete genome sequences (e.g. sorghum, poplar) to transcriptome data sets (e.g. switchgrass, pine). These data sets are not only dispersed in location but also disparate in content. It will be essential to leverage and improve these genomic data sets for the improvement of biofuel feedstock production. The objectives of this project were to provide computational tools and resources for data-mining genome sequence/annotation and large-scale functional genomic datasets available for biofuel feedstock species. We have created a Bioenergy Feedstock Genomics Resource that provides a web-based portal or clearing house for genomic data for plant species relevant to biofuel feedstock production. Sequence data from a total of 54 plant species are included in the Bioenergy Feedstock Genomics Resource, including model plant species that permit leveraging of knowledge across taxa to biofuel feedstock species. We have generated additional computational analyses of these data, including uniform annotation, to facilitate genomic approaches to improved biofuel feedstock production. These data have been centralized in the publicly available Bioenergy Feedstock Genomics Resource (http://bfgr.plantbiology.msu.edu/).

  6. An Overview of the Computational Physics and Methods Group at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Randal Scott

    CCS Division was formed to strengthen the visibility and impact of computer science and computational physics research on strategic directions for the Laboratory. Both computer science and computational science are now central to scientific discovery and innovation, and they have become indispensable tools for all other scientific missions at the Laboratory. CCS Division forms a bridge between external partners and Laboratory programs, bringing new ideas and technologies to bear on today's important problems and attracting high-quality technical staff members to the Laboratory. The Computational Physics and Methods Group CCS-2 conducts methods research and develops scientific software aimed at the latest and emerging HPC systems.

  7. Interfacing laboratory instruments to multiuser, virtual memory computers

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.; Stang, David B.; Roth, Don J.

    1989-01-01

    Incentives, problems and solutions associated with interfacing laboratory equipment with multiuser, virtual memory computers are presented. The major difficulty concerns how to utilize these computers effectively in a medium-sized research group. This entails optimization of hardware interconnections and software to facilitate multiple instrument control, data acquisition and processing. The architecture of the system that was devised, and the associated programming and subroutines, are described. An example program involving computer-controlled hardware for ultrasonic scan imaging is provided to illustrate the operational features.

  8. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management
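
    A toy version of the meta-data query idea conveys the shape of such a repository: resources typed as data, software tools, or web-services, filtered by type and keyword. The records and field names below are invented, not the iTools schema.

```python
resources = [
    {"name": "BrainAtlasDB",   "type": "data",          "tags": ["neuroimaging"]},
    {"name": "SeqAligner",     "type": "software tool", "tags": ["genomics"]},
    {"name": "OntologyLookup", "type": "web-service",   "tags": ["ontology"]},
]

def search(kind=None, tag=None):
    """Return resources matching an optional type and/or tag."""
    return [r for r in resources
            if (kind is None or r["type"] == kind)
            and (tag is None or tag in r["tags"])]

print(search(kind="data"))
print(search(tag="genomics"))
```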

  9. A Choice of Terminals: Spatial Patterning in Computer Laboratories

    ERIC Educational Resources Information Center

    Spennemann, Dirk; Cornforth, David; Atkinson, John

    2007-01-01

    Purpose: This paper seeks to examine the spatial patterns of student use of machines in each laboratory to determine whether there are underlying commonalities. Design/methodology/approach: The research was carried out by assessing user behaviour in 16 computer laboratories at a regional university in Australia. Findings: The study found that computers…

  10. A Low Cost Microcomputer Laboratory for Investigating Computer Architecture.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    1980-01-01

    Described is a microcomputer laboratory at the United States Military Academy at West Point, New York, which provides easy access to non-volatile memory and a single input/output file system for 16 microcomputer laboratory positions. A microcomputer network that has a centralized data base is implemented using the concepts of computer network…

  11. Lawrence Livermore National Laboratory's Computer Security Short Subjects Videos: Hidden Password, The Incident, Dangerous Games and The Mess; Computer Security Awareness Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced, ranging from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.

  12. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions – the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources–data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource

  13. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    ERIC Educational Resources Information Center

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  14. Enabling Grid Computing resources within the KM3NeT computing model

    NASA Astrophysics Data System (ADS)

    Filippidis, Christos

    2016-04-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that, located at the bottom of the Mediterranean Sea, will open a new window on the universe and answer fundamental questions in both particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support these demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use case.

  15. Meta-analysis of the effectiveness of computer-based laboratory versus traditional hands-on laboratory in college and pre-college science instructions

    NASA Astrophysics Data System (ADS)

    Onuoha, Cajetan O.

    The purpose of this research study was to determine the overall effectiveness of computer-based laboratories compared with traditional hands-on laboratories for improving students' science academic achievement and attitudes towards science subjects at the college and pre-college levels of education in the United States. Meta-analysis was used to synthesize the findings from 38 primary research studies conducted and/or reported in the United States between 1996 and 2006 that compared the effectiveness of computer-based laboratories with traditional hands-on laboratories on measures related to science academic achievement and attitudes towards science subjects. The 38 primary research studies, with a total of 3,824 subjects, generated 67 weighted individual effect sizes that were used in this meta-analysis. The study found that the computer-based laboratory had a small positive effect size over the traditional hands-on laboratory on measures related to students' science academic achievement (ES = +0.26) and attitudes towards science subjects (ES = +0.22). It was also found that the computer-based laboratory produced larger effects for physical science subjects than for the biological sciences (ES = +0.34 vs. +0.17).
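
    The core arithmetic of such a synthesis is inverse-variance pooling of the per-study effect sizes. The sketch below shows that computation on three fabricated (effect size, variance) pairs; these are not the study's data.

```python
# Fixed-effect meta-analysis: pool effect sizes weighted by 1/variance.
studies = [(0.31, 0.02), (0.18, 0.05), (0.27, 0.03)]   # (ES, variance)

weights = [1.0 / var for _, var in studies]             # inverse-variance
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
print(f"pooled effect size: {pooled:+.2f}")
```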

  16. Computers as learning resources in the health sciences: impact and issues.

    PubMed Central

    Ellis, L B; Hannigan, G G

    1986-01-01

    Starting with two computer terminals in 1972, the Health Sciences Learning Resources Center of the University of Minnesota Bio-Medical Library expanded its instructional facilities to ten terminals and thirty-five microcomputers by 1985. Computer use accounted for 28% of total center circulation. The impact of these resources on health sciences curricula is described and issues related to use, support, and planning are raised and discussed. Judged by their acceptance and educational value, computers are successful health sciences learning resources at the University of Minnesota. PMID:3518843

  17. Automating usability of ATLAS Distributed Computing resources

    NASA Astrophysics Data System (ADS)

    Tupputi, S. A.; Di Girolamo, A.; Kouba, T.; Schovancová, J.; Atlas Collaboration

    2014-06-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions which improve the reliability of the system. In this perspective, a crucial case is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution by employing an inference algorithm which processes the history of storage monitoring test outcomes. SAAB accomplishes both tasks: providing global monitoring and performing automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage area monitoring and central management at all levels. This review has involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board, with both dedicated metrics and views. The resulting structure allows monitoring the status of storage resources with fine time-granularity and taking automatic actions in foreseen cases, like automatic outage handling and notifications to sites. Hence, human actions are restricted to reporting and following up on problems, where and when needed. In this work we show SAAB's working principles and features. We also present the decrease in human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
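
    The inference step can be caricatured as a threshold rule over a sliding window of test outcomes. The sketch below is a hedged illustration of that style of blacklisting, not the ATLAS SAAB code; the window length, threshold, and site names are invented.

```python
def infer_status(history, window=10, blacklist_fraction=0.5):
    """history: booleans, True = monitoring test passed, newest last."""
    recent = history[-window:]
    if not recent:
        return "unknown"
    failure_rate = recent.count(False) / len(recent)
    return "blacklisted" if failure_rate >= blacklist_fraction else "active"

storage_areas = {
    "SITE_A_DATADISK": [True] * 8 + [False] * 2,   # occasional failures
    "SITE_B_DATADISK": [True, False] * 5,          # persistently flaky
}
for area, outcomes in storage_areas.items():
    print(area, infer_status(outcomes))
```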

  18. Computer-generated reminders and quality of pediatric HIV care in a resource-limited setting.

    PubMed

    Were, Martin C; Nyandiko, Winstone M; Huang, Kristin T L; Slaven, James E; Shen, Changyu; Tierney, William M; Vreeman, Rachel C

    2013-03-01

    To evaluate the impact of clinician-targeted computer-generated reminders on compliance with HIV care guidelines in a resource-limited setting, we conducted this randomized, controlled trial in an HIV referral clinic in Kenya caring for HIV-infected and HIV-exposed children (<14 years of age). For children randomly assigned to the intervention group, printed patient summaries containing computer-generated patient-specific reminders for overdue care recommendations were provided to the clinician at the time of the child's clinic visit. For children in the control group, clinicians received the summaries but no computer-generated reminders. We compared differences between the intervention and control groups in the completion of overdue tasks, including HIV testing, laboratory monitoring, initiating antiretroviral therapy, and making referrals. During the 5-month study period, 1611 patients (49% female, 70% HIV-infected) were eligible to receive at least 1 computer-generated reminder (ie, had an overdue clinical task). We observed a fourfold increase in the completion of overdue clinical tasks when reminders were made available to providers over the course of the study (68% intervention vs 18% control, P < .001). Orders also occurred earlier for the intervention group (77 days, SD 2.4 days) than for the control group (104 days, SD 1.2 days) (P < .001). Response rates to reminders varied significantly by type of reminder and between clinicians. Clinician-targeted, computer-generated clinical reminders are associated with a significant increase in the completion of overdue clinical tasks for HIV-infected and exposed children in a resource-limited setting.
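
    The reminder logic itself is a simple comparison of each guideline interval against the patient record. The sketch below illustrates that rule with fabricated task names, intervals, and dates; it is not the study's system.

```python
from datetime import date

# Hypothetical guideline intervals (days between repetitions of each task).
GUIDELINE_INTERVAL_DAYS = {"CD4 count": 180, "HIV DNA PCR": 90}

def overdue_reminders(last_done: dict, today: date) -> list:
    """Return clinician-facing reminders for tasks past their interval."""
    reminders = []
    for task, interval in GUIDELINE_INTERVAL_DAYS.items():
        last = last_done.get(task)
        if last is None or (today - last).days > interval:
            reminders.append(f"REMINDER: {task} overdue")
    return reminders

record = {"CD4 count": date(2012, 1, 5)}   # HIV DNA PCR never performed
for line in overdue_reminders(record, today=date(2012, 9, 1)):
    print(line)
```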

  19. Performance Evaluation of Resource Management in Cloud Computing Environments.

    PubMed

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.

  20. Performance Evaluation of Resource Management in Cloud Computing Environments

    PubMed Central

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price. PMID:26555730

  1. Laboratory Sequence in Computational Methods for Introductory Chemistry

    NASA Astrophysics Data System (ADS)

    Cody, Jason A.; Wiser, Dawn C.

    2003-07-01

    A four-exercise laboratory sequence for introductory chemistry integrating hands-on, student-centered experience with computer modeling has been designed and implemented. The progression builds from exploration of molecular shapes to intermolecular forces and the impact of those forces on chemical separations made with gas chromatography and distillation. The sequence ends with an exploration of molecular orbitals. The students use the computers as a tool; they build the molecules, submit the calculations, and interpret the results. Because of the construction of the sequence and its placement spanning the semester break, good laboratory notebook practices are reinforced and the continuity of course content and methods between semesters is emphasized. The inclusion of these techniques in the first year of chemistry has had a positive impact on student perceptions and student learning.

  2. Development of Computer-Based Resources for Textile Education.

    ERIC Educational Resources Information Center

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  3. Laboratory Diagnosis of Tuberculosis in Resource-Poor Countries: Challenges and Opportunities

    PubMed Central

    Parsons, Linda M.; Somoskövi, Ákos; Gutierrez, Cristina; Lee, Evan; Paramasivan, C. N.; Abimiku, Alash'le; Spector, Steven; Roscigno, Giorgio; Nkengasong, John

    2011-01-01

    Summary: With an estimated 9.4 million new cases globally, tuberculosis (TB) continues to be a major public health concern. Eighty percent of all cases worldwide occur in 22 high-burden, mainly resource-poor settings. This devastating impact of tuberculosis on vulnerable populations is also driven by its deadly synergy with HIV. Therefore, building capacity and enhancing universal access to rapid and accurate laboratory diagnostics are necessary to control TB and HIV-TB coinfections in resource-limited countries. The present review describes several new and established methods as well as the issues and challenges associated with implementing quality tuberculosis laboratory services in such countries. Recently, the WHO has endorsed some of these novel methods, and they have been made available at discounted prices for procurement by the public health sector of high-burden countries. In addition, international and national laboratory partners and donors are currently evaluating other new diagnostics that will allow further and more rapid testing in point-of-care settings. While some techniques are simple, others have complex requirements, and therefore, it is important to carefully determine how to link these new tests and incorporate them within a country's national diagnostic algorithm. Finally, the successful implementation of these methods is dependent on key partnerships in the international laboratory community and ensuring that adequate quality assurance programs are inherent in each country's laboratory network. PMID:21482728

  4. Examining Student Outcomes in University Computer Laboratory Environments: Issues for Educational Management

    ERIC Educational Resources Information Center

    Newby, Michael; Marcoulides, Laura D.

    2008-01-01

    Purpose: The purpose of this paper is to model the relationship between student performance, student attitudes, and computer laboratory environments. Design/methodology/approach: Data were collected from 234 college students enrolled in courses that involved the use of a computer to solve problems and provided the laboratory experience by means of…

  5. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    NASA Astrophysics Data System (ADS)

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; Bagliesi, Giuseppe; Belforte, Stephano; Campana, Simone; Dimou, Maria; Flix, Jose; Forti, Alessandra; di Girolamo, A.; Karavakis, Edward; Lammel, Stephan; Litmaath, Maarten; Sciaba, Andrea; Valassi, Andrea

    2017-10-01

    The Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in the various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges: experiments rely more and more on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation, and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC must adapt quickly to new types of computing resources and new information sources, and must allow new data structures to be implemented easily as the experiments' computing models and operations evolve.
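
    The aggregation-and-validation step might look like the following sketch; the record shapes and provider payloads are simplified assumptions, not CRIC's actual schema:

      def merge_sources(sources):
          """Merge per-site records from several providers, flagging inconsistencies."""
          merged, conflicts = {}, []
          for provider, records in sources.items():
              for site, info in records.items():
                  if site in merged and merged[site] != info:
                      conflicts.append((site, provider))   # report instead of silently overwriting
                  merged.setdefault(site, info)
          return merged, conflicts

      sources = {
          "GOCDB": {"CERN-PROD": {"cpu": 10000}},
          "BDII":  {"CERN-PROD": {"cpu": 9500}},   # inconsistent copy of the same site
      }
      print(merge_sources(sources))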

  6. Teacher's Resource Guide on Acidic Precipitation with Laboratory Activities.

    ERIC Educational Resources Information Center

    Barrow, Lloyd H.

    The purpose of this teacher's resource guide is to help science teachers incorporate the topic of acidic precipitation into their curricula. A survey of recent junior high school science textbooks found a maximum of one paragraph devoted to the subject; in addition, none of these books had any related laboratory activities. It was on the basis of…

  7. Computer-Aided Drafting and Design Series. Educational Resources for the Machine Tool Industry, Course Syllabi, [and] Instructor's Handbook. Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 2-year vocational training program to prepare students for entry-level employment in computer-aided drafting and design in the machine tool industry. The program was developed through a modification of the DACUM (Developing a Curriculum)…

  8. Computer-Based Resource Accounting Model for Automobile Technology Impact Assessment

    DOT National Transportation Integrated Search

    1976-10-01

    A computer-implemented resource accounting model has been developed for assessing resource impacts of future automobile technology options. The resources tracked are materials, energy, capital, and labor. The model has been used in support of the Int...

  9. Exploiting multicore compute resources in the CMS experiment

    NASA Astrophysics Data System (ADS)

    Ramírez, J. E.; Pérez-Calero Yzquierdo, A.; Hernández, J. M.; CMS Collaboration

    2016-10-01

    CMS has developed a strategy to efficiently exploit the multicore architecture of the compute resources accessible to the experiment. A coherent use of the multiple cores available in a compute node yields substantial gains in terms of resource utilization. The implemented approach makes use of the multithreading support of the event processing framework and the multicore scheduling capabilities of the resource provisioning system. Multicore slots are acquired and provisioned by means of multicore pilot agents which internally schedule and execute single and multicore payloads. Multicore scheduling and multithreaded processing are currently used in production for online event selection and prompt data reconstruction. More workflows are being adapted to run in multicore mode. This paper presents a review of the experience gained in the deployment and operation of the multicore scheduling and processing system, the current status and future plans.
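
    The core idea of a pilot internally scheduling single- and multi-core payloads into one multicore slot can be sketched as a simple packing step (names and numbers are illustrative, not CMS code):

      def pack_payloads(slot_cores, payloads):
          """Greedily fill one multicore pilot slot; payloads are (name, cores) pairs."""
          free, scheduled = slot_cores, []
          for name, cores in sorted(payloads, key=lambda p: -p[1]):  # larger payloads first
              if cores <= free:
                  scheduled.append(name)
                  free -= cores
          return scheduled, free

      jobs = [("multithreaded-reco", 4), ("single-core-a", 1), ("single-core-b", 1)]
      print(pack_payloads(8, jobs))   # all three payloads fit; 2 cores remain free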

  10. Idaho National Laboratory Cultural Resource Management Annual Report FY 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clayton F. Marler; Julie Braun; Hollie Gilbert

    2007-04-01

    The Idaho National Laboratory Site is home to vast numbers and a wide variety of important cultural resources representing at least a 13,500-year span of human occupation in the region. As a federal agency, the Department of Energy Idaho Operations Office has legal responsibility for the management and protection of those resources and has delegated these responsibilities to its primary contractor, Battelle Energy Alliance (BEA). The INL Cultural Resource Management Office, staffed by BEA professionals, is committed to maintaining a cultural resource management program that accepts these challenges in a manner reflecting the resources’ importance in local, regional, and national history. This annual report summarizes activities performed by the INL Cultural Resource Management Office staff during Fiscal Year 2006. This work is diverse and far-reaching and, though generally confined to INL cultural resource compliance, also includes a myriad of professional and voluntary community activities. This document is intended to be both informative to internal and external stakeholders and to serve as a planning tool for future cultural resource management work to be conducted on the INL.

  11. Methods and systems for providing reconfigurable and recoverable computing resources

    NASA Technical Reports Server (NTRS)

    Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)

    2010-01-01

    A method for optimizing the use of digital computing resources to achieve reliability and availability of the computing resources is disclosed. The method comprises providing one or more processors with a recovery mechanism, the one or more processors executing one or more applications. A determination is made whether the one or more processors need to be reconfigured. A rapid recovery is employed to reconfigure the one or more processors when needed. A computing system that provides reconfigurable and recoverable computing resources is also disclosed. The system comprises one or more processors with a recovery mechanism, with the one or more processors configured to execute a first application, and an additional processor configured to execute a second application different than the first application. The additional processor is reconfigurable with rapid recovery such that the additional processor can execute the first application when one of the one or more processors fails.

  12. Environmental resource document for the Idaho National Engineering Laboratory. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Irving, J.S.

    This document contains information related to the environmental characterization of the Idaho National Engineering Laboratory (INEL). The INEL is a major US Department of Energy facility in southeastern Idaho dedicated to nuclear research, waste management, environmental restoration, and other activities related to the development of technology. Environmental information covered in this document includes land, air, water, and ecological resources; socioeconomic characteristics and land use; and cultural, aesthetic, and scenic resources.

  13. The Gain of Resource Delegation in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Fölling, Alexander; Grimme, Christian; Lepping, Joachim; Papaspyrou, Alexander

    In this paper, we address job scheduling in Distributed Computing Infrastructures, that is, a loosely coupled network of autonomously acting High Performance Computing systems. In contrast to the common approach of mutual workload exchange, we consider the more intuitive operator's viewpoint of load-dependent resource reconfiguration. In case of a site's over-utilization, the scheduling system is able to lease resources from other sites to keep up service quality for its local user community. Conversely, granting idle resources to other sites can increase utilization in times of low local workload and thus ensure higher efficiency. The evaluation considers real workload data and is done with respect to common service quality indicators. For two simple resource exchange policies and three basic setups, we show the possible gain of this approach and analyze the dynamics of the workload-adaptive reconfiguration behavior.
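
    A toy version of such a load-dependent reconfiguration policy; the thresholds and record fields are assumptions for illustration, not the paper's policies:

      def reconfigure(site, high=0.9, low=0.3):
          """Lease a node when over-utilized; return a leased node when utilization is low."""
          utilization = site["queued_hours"] / max(site["nodes"], 1)
          if utilization > high:
              site["nodes"] += 1                                   # lease from a partner site
          elif utilization < low and site["nodes"] > site["own_nodes"]:
              site["nodes"] -= 1                                   # give a leased node back
          return site

      print(reconfigure({"queued_hours": 95.0, "nodes": 100, "own_nodes": 100}))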

  14. A Virtual Embedded Microcontroller Laboratory for Undergraduate Education: Development and Evaluation

    ERIC Educational Resources Information Center

    Richardson, Jeffrey J.; Adamo-Villani, Nicoletta

    2010-01-01

    Laboratory instruction is a major component of the engineering and technology undergraduate curricula. Traditional laboratory instruction is hampered by several factors including limited access to resources by students and high laboratory maintenance cost. A photorealistic 3D computer-simulated laboratory for undergraduate instruction in…

  15. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
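
    The calibration idea can be illustrated with a small power-law fit; the model form and the numbers below are assumptions for the sketch, not SCOPE's actual calibration:

      import math

      def fit_calibration(problem_sizes, cpu_seconds):
          """Least-squares fit of cpu = a * n**b via a log-log linear regression."""
          xs = [math.log(n) for n in problem_sizes]
          ys = [math.log(t) for t in cpu_seconds]
          xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
          b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
              sum((x - xbar) ** 2 for x in xs)
          a = math.exp(ybar - b * xbar)
          return a, b

      # Calibrate once on small runs, then predict cheaply for a larger problem.
      a, b = fit_calibration([100, 200, 400], [1.1, 4.2, 16.5])
      print("predicted CPU for n = 800: %.1f s" % (a * 800 ** b))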

  16. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    DOE PAGES

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; ...

    2017-10-01

    Here, the Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in the various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges: experiments rely more and more on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation, and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC must adapt quickly to new types of computing resources and new information sources, and must allow new data structures to be implemented easily as the experiments' computing models and operations evolve.

  17. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey

    Here, the Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in the various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges: experiments rely more and more on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue), which collects information from various information providers, performs validation, and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC must adapt quickly to new types of computing resources and new information sources, and must allow new data structures to be implemented easily as the experiments' computing models and operations evolve.

  18. ACToR – Aggregated Computational Toxicology Resource

    EPA Science Inventory

    We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.

  19. ANL site response for the DOE FY1994 information resources management long-range plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boxberger, L.M.

    1992-03-01

    Argonne National Laboratory's ANL Site Response for the DOE FY1994 Information Resources Management (IRM) Long-Range Plan (ANL/TM 500) is one of many contributions to the DOE information resources management long-range planning process and, as such, is an integral part of the DOE policy and program planning system. The Laboratory has constructed this response according to instructions in a Call issued in September 1991 by the DOE Office of IRM Policy, Plans and Oversight. As one of a continuing series, this Site Response is an update and extension of the Laboratory's previous submissions. The response contains both narrative and tabular material. It covers an eight-year period consisting of the base year (FY1991), the current year (FY1992), the budget year (FY1993), the plan year (FY1994), and the out years (FY1995-FY1998). This Site Response was compiled by Argonne National Laboratory's Computing and Telecommunications Division (CTD), which has the responsibility to provide leadership in optimizing computing and information services and disseminating computer-related technologies throughout the Laboratory. The Site Response consists of five parts: (1) a site overview, which describes the ANL mission, overall organization structure, the strategic approach to meeting information resource needs, the planning process, major issues, and points of contact; (2) a software plan for DOE contractors and, in Part 2B, an FMS software plan for DOE organizations; (3) computing resources; (4) telecommunications; and (5) printing and publishing.

  20. Idaho National Laboratory Cultural Resource Management Annual Report FY 2007

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Julie Braun; Hollie Gilbert; Dino Lowrey

    2008-03-01

    The Idaho National Laboratory (INL) Site is home to vast numbers and a wide variety of important cultural resources representing at least a 13,500-year span of human land use in the region. As a federal agency, the Department of Energy Idaho Operations Office has legal responsibility for the management and protection of those resources and has delegated these responsibilities to its primary contractor, Battelle Energy Alliance (BEA). The BEA professional staff is committed to maintaining a cultural resource management program that accepts these challenges in a manner reflecting the resources’ importance in local, regional, and national history. This annual report summarizes activities performed by the INL Cultural Resource Management Office (CRMO) staff during fiscal year 2007. This work is diverse and far-reaching and, though generally confined to INL cultural resource compliance, also includes a myriad of professional and voluntary community activities. This document is intended to be both informative to internal and external stakeholders and to serve as a planning tool for future cultural resource management work to be conducted on the INL.

  1. Lawrence Berkeley Laboratory, Institutional Plan FY 1994--1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-09-01

    The Institutional Plan provides an overview of the Lawrence Berkeley Laboratory mission, strategic plan, scientific initiatives, research programs, environment and safety program plans, educational and technology transfer efforts, human resources, and facilities needs. For FY 1994-1999 the Institutional Plan reflects significant revisions based on the Laboratory's strategic planning process. The Strategic Plan section identifies long-range conditions that will influence the Laboratory, as well as potential research trends and management implications. The Initiatives section identifies potential new research programs that represent major long-term opportunities for the Laboratory, and the resources required for their implementation. The Scientific and Technical Programs section summarizes current programs and potential changes in research program activity. The Environment, Safety, and Health section describes the management systems and programs underway at the Laboratory to protect the environment, the public, and the employees. The Technology Transfer and Education programs section describes current and planned programs to enhance the nation's scientific literacy and human infrastructure and to improve economic competitiveness. The Human Resources section describes LBL's staff diversity and development programs. The section on Site and Facilities discusses resources required to sustain and improve the physical plant and its equipment. The new section on Information Resources reflects the importance of computing and communication resources to the Laboratory. The Resource Projections are estimates of required budgetary authority for the Laboratory's ongoing research programs. The Institutional Plan is a management report for integration with the Department of Energy's strategic planning activities, developed through an annual planning process.

  2. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    ERIC Educational Resources Information Center

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  3. Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory

    ERIC Educational Resources Information Center

    Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo

    2005-01-01

    We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…

  4. Childhood as a Resource and Laboratory for the Self-Project

    ERIC Educational Resources Information Center

    Buhler-Niederberger, Doris; Konig, Alexandra

    2011-01-01

    The biographies of individuals in today's societies are characterized by the need to exert effort and make decisions in planning one's life course. A "self-project" has to be worked out both retrospectively and prospectively; childhood becomes important as a resource and a laboratory for the self-project. This empirical study analyses how the…

  5. Resource Letter SPE-1: Single-Photon Experiments in the Undergraduate Laboratory

    NASA Astrophysics Data System (ADS)

    Galvez, Enrique J.

    2014-11-01

    This Resource Letter lists undergraduate-laboratory adaptations of landmark optical experiments on the fundamentals of quantum physics. Journal articles and websites give technical details of the adaptations, which offer students unique hands-on access to testing fundamental concepts and predictions of quantum mechanics. A selection of the original research articles that led to the implementations is included. These developments have motivated a rethinking of the way quantum mechanics is taught, so this Resource Letter also lists textbooks that provide these new approaches.

  6. Aggregated Computational Toxicology Online Resource

    EPA Pesticide Factsheets

    Aggregated Computational Toxicology Online Resource (ACToR) is EPA's online aggregator of all the public sources of chemical toxicity data. ACToR aggregates data from over 1,000 public sources on over 500,000 chemicals and is searchable by chemical name, other identifiers, and chemical structure. It can be used to query a specific chemical and find all publicly available hazard, exposure and risk assessment data. It also provides access to EPA's ToxCast, ToxRefDB, DSSTox, and Dashboard data.

  7. The Computer Explosion: Implications for Educational Equity. Resource Notebook.

    ERIC Educational Resources Information Center

    Denbo, Sheryl, Comp.

    This notebook was prepared to provide resources for educators interested in using computers to increase opportunities for all students. The notebook contains specially prepared materials and selected newspaper and journal articles. The first section reviews the issues related to computer equity (equal access, tracking through different…

  8. Computer Security Awareness Guide for Department of Energy Laboratories, Government Agencies, and others for use with Lawrence Livermore National Laboratory's (LLNL): Computer security short subjects videos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL), and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced, ranging from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations: after talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters, and a bit of information about their personalities in the LLNL Computer Security Short Subjects, is included in this report.

  9. Economic models for management of resources in peer-to-peer and grid computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next-generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user; these include commodity markets, posted prices, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed, which contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
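
    Deadline-and-cost scheduling of the kind used by Nimrod/G can be sketched in a few lines; the resource descriptions below are hypothetical, not WWG testbed data:

      def pick_resource(resources, jobs_left, seconds_left):
          """Among resources fast enough to meet the deadline, pick the cheapest."""
          feasible = [r for r in resources
                      if jobs_left / r["jobs_per_sec"] <= seconds_left]
          return min(feasible, key=lambda r: r["cost_per_job"], default=None)

      resources = [
          {"name": "fast-expensive", "jobs_per_sec": 5.0, "cost_per_job": 9.0},
          {"name": "slow-cheap",     "jobs_per_sec": 0.5, "cost_per_job": 1.0},
      ]
      # With a tight deadline, only the fast resource is feasible despite its price.
      print(pick_resource(resources, jobs_left=1000, seconds_left=600))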

  10. Idaho National Laboratory Cultural Resource Management Office FY 2010 Activity Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollie K. Gilbert; Clayton F. Marler; Christina L. Olson

    2011-09-01

    The Idaho National Laboratory (INL) Site is home to vast numbers and a wide variety of important cultural resources representing at least a 13,500-year span of human land use in the region. As a federal agency, the Department of Energy, Idaho Operations Office (DOE-ID) has legal responsibility for the management and protection of the resources and has contracted these responsibilities to Battelle Energy Alliance (BEA). The BEA professional staff is committed to maintaining a cultural resource management program that accepts the challenge of preserving INL cultural resources in a manner reflecting their importance in local, regional, and national history. This report summarizes activities performed by the INL Cultural Resource Management Office (CRMO) staff during fiscal year 2010. This work is diverse and far-reaching and, though generally confined to INL cultural resource compliance, also includes a myriad of professional and voluntary community activities. This document is intended to be informative to both internal and external stakeholders and to serve as a planning tool for future INL cultural resource management work.

  11. ACToR: Aggregated Computational Toxicology Resource (T)

    EPA Science Inventory

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) to serve as a repository of public toxicology information ...

  12. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  13. A computational model of the human hand 93-ERI-053

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Axelrod, T.

    1996-03-01

    The objectives of the Computational Hand Modeling project were to prove the feasibility of applying the Laboratory's NIKE3D finite element code to orthopaedic problems. Because of the great complexity of anatomical structures and the nonlinearity of their behavior, we have focused on a subset of joints of the hand and lower extremity and have developed algorithms to model their behavior. The algorithms developed here solve fundamental problems in computational biomechanics and can be expanded to describe any other joints of the human body. This kind of computational modeling has never successfully been attempted before, due in part to a lack of biomaterials data and a lack of computational resources. With the computational resources available at the National Laboratories and the collaborative relationships we have established with experimental and other modeling laboratories, we have been in a position to pursue our innovative approach to biomechanical and orthopedic modeling.

  14. A Simple and Resource-efficient Setup for the Computer-aided Drug Design Laboratory.

    PubMed

    Moretti, Loris; Sartori, Luca

    2016-10-01

    Undertaking modelling investigations for Computer-Aided Drug Design (CADD) requires a proper environment. In principle, this could be done on a single computer, but the reality of a drug discovery program requires robustness and high-throughput computing (HTC) to efficiently support the research. Therefore, a more capable alternative is needed, but its implementation has no widespread solution. Here, the realization of such a computing facility is discussed; all aspects are covered, from general layout to technical details. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    PubMed

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. Many sporadic spare resources are distributed across the devices in these networks and can be used to support mobile cloud applications. However, a single device, with only a few spare resources, cannot support resource-intensive mobile applications alone; if devices cooperate with each other and share their resources, they can support many such applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate distributed devices as the resource provider for mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, revenues are allocated among the cooperators according to each member's contribution, using the concept of the Shapley value, to enable a more impartial revenue share. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.
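
    The Shapley-value allocation mentioned above can be computed exactly for small coalitions. A minimal sketch follows; the characteristic function v is a made-up example, not the paper's model:

      from itertools import permutations

      def shapley(players, v):
          """Average each player's marginal contribution over all join orders."""
          phi = {p: 0.0 for p in players}
          orders = list(permutations(players))
          for order in orders:
              coalition = set()
              for p in order:
                  before = v(frozenset(coalition))
                  coalition.add(p)
                  phi[p] += v(frozenset(coalition)) - before
          return {p: total / len(orders) for p, total in phi.items()}

      # Example: any two devices together can host the application and earn 10 units.
      v = lambda s: 10.0 if len(s) >= 2 else 0.0
      print(shapley(["dev_a", "dev_b", "dev_c"], v))   # symmetric players each get 10/3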

  16. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing

    PubMed Central

    Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates cloud computing techniques into the mobile environment, is regarded as one of the enabler technologies for 5G mobile wireless networks. Many sporadic spare resources are distributed across the devices in these networks and can be used to support mobile cloud applications. However, a single device, with only a few spare resources, cannot support resource-intensive mobile applications alone; if devices cooperate with each other and share their resources, they can support many such applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate distributed devices as the resource provider for mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, revenues are allocated among the cooperators according to each member's contribution, using the concept of the Shapley value, to enable a more impartial revenue share. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network. PMID:28030553

  17. Elementary and Advanced Computer Projects for the Physics Classroom and Laboratory

    DTIC Science & Technology

    1992-12-01

    are SPF/PC, MS Word, n3, Symphony, Mathematics, and FORTRAN. The authors' programs assist data analysis in particular laboratory experiments and make use of the Monte Carlo and other numerical techniques in computer simulation. FORTRAN is the language of science and engineering in industry and government laboratories (although C is becoming a powerful competitor). RM/FORTRAN (cost $400

  18. Biomedical laboratory science education: standardising teaching content in resource-limited countries.

    PubMed

    Arneson, Wendy; Robinson, Cathy; Nyary, Bryan

    2013-01-01

    There is a worldwide shortage of qualified laboratory personnel to provide adequate testing for the detection and monitoring of diseases. In an effort to increase laboratory capacity in developing countries, new skills have been introduced into laboratory services. Curriculum revision with a focus on good laboratory practice is an important aspect of supplying entry-level graduates with the competencies needed to meet the current needs. Gaps in application and problem-solving competencies of newly graduated laboratory personnel were discovered in Ethiopia, Tanzania and Kenya. New medical laboratory teaching content was developed in Ethiopia, Tanzania and Kenya using national instructors, tutors, and experts and consulting medical laboratory educators from the United States of America (USA). Workshops were held in Ethiopia to create standardised biomedical laboratory science (BMLS) lessons based on recently-revised course objectives with an emphasis on application of skills. In Tanzania, course-module teaching guides with objectives were developed based on established competency outcomes and tasks. In Kenya, example interactive presentations and lesson plans were developed by the USA medical laboratory educators prior to the workshop to serve as resources and templates for the development of lessons within the country itself. The new teaching materials were implemented and faculty, students and other stakeholders reported successful outcomes. These approaches to updating curricula may be helpful as biomedical laboratory schools in other countries address gaps in the competencies of entry-level graduates.

  19. Junior High Computer Studies: Teacher Resource Manual.

    ERIC Educational Resources Information Center

    Alberta Dept. of Education, Edmonton. Curriculum Branch.

    This manual is designed to help classroom teachers in Alberta, Canada implement the Junior High Computer Studies Program. The first eight sections cover the following material: (1) introduction to the teacher resource manual; (2) program rationale and philosophy; (3) general learner expectations; (4) program framework and flexibility; (5) program…

  20. Computer validation in toxicology: historical review for FDA and EPA good laboratory practice.

    PubMed

    Brodish, D L

    1998-01-01

    The application of computer validation principles to Good Laboratory Practice is a fairly recent phenomenon. As automated data collection systems have become more common in toxicology facilities, the U.S. Food and Drug Administration and the U.S. Environmental Protection Agency have begun to focus inspections in this area. This historical review documents the development of regulatory guidance on computer validation in toxicology over the past several decades. An overview of the components of a computer life cycle is presented, including the development of systems descriptions, validation plans, validation testing, system maintenance, SOPs, change control, security considerations, and system retirement. Examples are provided for implementation of computer validation principles on laboratory computer systems in a toxicology facility.

  1. The effect of introducing computers into an introductory physics problem-solving laboratory

    NASA Astrophysics Data System (ADS)

    McCullough, Laura Ellen

    2000-10-01

    Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted

  2. ACToR – Aggregated Computational Toxicology Resource ...

    EPA Pesticide Factsheets

    ACToR (Aggregated Computational Toxicology Resource) is a collection of databases collated or developed by the US EPA National Center for Computational Toxicology (NCCT). More than 200 sources of publicly available data on environmental chemicals have been brought together and made searchable by chemical name and other identifiers, and by chemical structure. Data includes chemical structure, physico-chemical values, in vitro assay data and in vivo toxicology data. Chemicals include, but are not limited to, high and medium production volume industrial chemicals, pesticides (active and inert ingredients), and potential ground and drinking water contaminants.

  3. Diversity in computing technologies and strategies for dynamic resource allocation

    DOE PAGES

    Garzoglio, G.; Gutsche, O.

    2015-12-23

    Here, High Energy Physics (HEP) is a very data-intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer details, requiring ever-increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP has provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.

  4. Dynamic provisioning of local and remote compute resources with OpenStack

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT participates in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users; due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the use of virtualization technologies. The OpenStack project has become a widely adopted solution for virtualizing hardware and offering additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented: local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup and operational experiences are discussed.
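
    A hedged sketch of such demand-driven provisioning using the openstacksdk cloud layer; the cloud name, image, and flavor are placeholders for whatever a site defines in clouds.yaml, and error handling is omitted:

      import openstack

      def expand_cluster(queue_depth, threshold=100):
          """Boot one extra worker VM when the local batch queue grows too long."""
          if queue_depth <= threshold:
              return None
          conn = openstack.connect(cloud="private-cloud")   # named entry in clouds.yaml
          return conn.create_server(
              name="hep-worker",
              image="hep-worker-image",    # assumed pre-built image with the HEP software stack
              flavor="m1.large",
              wait=True,                   # block until the VM is active
          )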

  5. Applications of computer-aided text analysis in natural resources.

    Treesearch

    David N. Bengston

    2000-01-01

    Ten contributed papers describe the use of a variety of approaches to computer-aided text analysis and their application to a wide range of research questions related to natural resources and the environment. Taken together, these papers paint a picture of a growing and vital area of research on the human dimensions of natural resource management.

  6. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    ERIC Educational Resources Information Center

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  7. VECTR: Virtual Environment Computational Training Resource

    NASA Technical Reports Server (NTRS)

    Little, William L.

    2018-01-01

    The Westridge Middle School Curriculum and Community Night is an annual event designed to introduce students and parents to potential employers in the Central Florida area. NASA participated in the event in 2017, and has been asked to come back for the 2018 event on January 25. We will be demonstrating our Microsoft Hololens Virtual Rovers project, and the Virtual Environment Computational Training Resource (VECTR) virtual reality tool.

  8. A Review of Resources for Evaluating K-12 Computer Science Education Programs

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Hartikainen, Elina

    2004-01-01

    Since computer science education is a key to preparing students for a technologically-oriented future, it makes sense to have high quality resources for conducting summative and formative evaluation of those programs. This paper describes the results of a critical analysis of the resources for evaluating K-12 computer science education projects.…

  9. ACToR A Aggregated Computational Toxicology Resource (S)

    EPA Science Inventory

    We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.

  10. BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way

    NASA Astrophysics Data System (ADS)

    Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip

    2017-10-01

    The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is contradictory to the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge in this respect, as its pilot is more closely coupled with operations requiring X.509 authentication than the pilot implementations of its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach that detaches the payload running from the Belle II DIRAC pilot (a customized pilot that pulls and processes jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach, we developed a gateway service running on a trusted server which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. This approach can also be applied to HPC systems whose worker nodes do not have the outbound connectivity to interact with the DIRAC system in general.
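
    The gateway pattern can be summarized in a few lines; the client interface here is hypothetical (the real service speaks to DIRAC with X.509 credentials over its own protocols):

      class Gateway:
          """Trusted middleman: holds grid credentials so volunteers never need them."""

          def __init__(self, dirac_client):
              self.dirac = dirac_client    # assumed client, authenticates with X.509 internally
              self.pending = []

          def refill(self, n=10):
              # Credentialed side: pull payloads from the grid on behalf of volunteers.
              self.pending.extend(self.dirac.fetch_jobs(n))

          def hand_out(self):
              # Untrusted side: serve a payload to a volunteer host, no X.509 required.
              return self.pending.pop() if self.pending else None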

  11. Atmospheric transmission computer program CP

    NASA Technical Reports Server (NTRS)

    Pitts, D. E.; Barnett, T. L.; Korb, C. L.; Hanby, W.; Dillinger, A. E.

    1974-01-01

    A computer program is described which allows for calculation of the effects of carbon dioxide, water vapor, methane, ozone, carbon monoxide, and nitrous oxide on earth resources remote sensing techniques. A flow chart of the program and operating instructions are provided. Comparisons are made between the atmospheric transmission obtained from laboratory and spacecraft spectrometer data and that obtained from a computer prediction using a model atmosphere and radiosonde data. Limitations of the model atmosphere are discussed. The computer program listings, input card formats, and sample runs for both radiosonde data and laboratory data are included.
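
    The underlying physics reduces to the Beer-Lambert law. A one-function sketch follows (the optical depths are made-up illustrations, not the program's band model):

      import math

      def transmission(optical_depths):
          """Total transmittance for independent absorbers: T = exp(-sum of tau_i)."""
          return math.exp(-sum(optical_depths))

      # Hypothetical band optical depths for CO2, H2O, CH4, O3, CO, and N2O:
      print(transmission([0.02, 0.15, 0.01, 0.03, 0.005, 0.008]))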

  12. An Easily Assembled Laboratory Exercise in Computed Tomography

    ERIC Educational Resources Information Center

    Mylott, Elliot; Klepetka, Ryan; Dunlap, Justin C.; Widenhorn, Ralf

    2011-01-01

    In this paper, we present a laboratory activity in computed tomography (CT) primarily composed of a photogate and a rotary motion sensor that can be assembled quickly and partially automates data collection and analysis. We use an enclosure made with a light filter that is largely opaque in the visible spectrum but mostly transparent to the near…
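
    The measurement at the heart of such an exercise is a log-ratio of photogate intensities, which gives the attenuation line integral along each chord; the readings below are invented for illustration:

      import math

      def line_integral(incident, transmitted):
          """Beer-Lambert: attenuation integrated along the path = ln(I0 / I)."""
          return math.log(incident / transmitted)

      # Photogate readings at a few rotation angles of the sample:
      for angle, reading in [(0, 0.61), (45, 0.55), (90, 0.61)]:
          print("%3d deg -> %.3f" % (angle, line_integral(1.0, reading)))

    Collecting such projections over many angles and back-projecting them reconstructs the cross-section, just as in medical CT.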

  13. Idaho National Laboratory Cultural Resource Management Office FY 2011 Activity Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Julie Braun Williams; Brenda R. Pace; Hollie K. Gilbert

    The Idaho National Laboratory (INL) Site is home to vast numbers and a wide variety of important cultural resources representing at least a 13,500-year span of human land use in the region. As a federal agency, the Department of Energy, Idaho Operations Office (DOE-ID) has legal responsibility for the management and protection of the resources and has contracted these responsibilities to Battelle Energy Alliance (BEA). The BEA professional staff is committed to maintaining a cultural resource management program that accepts the challenge of preserving INL cultural resources in a manner reflecting their importance in local, regional, and national history. This report is intended as a stand-alone document that summarizes activities performed by the INL Cultural Resource Management Office (CRMO) staff during fiscal year 2011. This work is diverse and far-reaching and, though generally confined to INL cultural resource compliance, also includes a myriad of professional and voluntary community activities. This document is intended to be informative to both internal and external stakeholders, to serve as a planning tool for future INL cultural resource management work, and to meet an agreed-upon legal requirement.

  14. Biomedical laboratory science education: standardising teaching content in resource-limited countries

    PubMed Central

    Robinson, Cathy; Nyary, Bryan

    2013-01-01

    Background There is a worldwide shortage of qualified laboratory personnel to provide adequate testing for the detection and monitoring of diseases. In an effort to increase laboratory capacity in developing countries, new skills have been introduced into laboratory services. Curriculum revision with a focus on good laboratory practice is an important aspect of supplying entry-level graduates with the competencies needed to meet the current needs. Objectives Gaps in application and problem-solving competencies of newly graduated laboratory personnel were discovered in Ethiopia, Tanzania and Kenya. New medical laboratory teaching content was developed in Ethiopia, Tanzania and Kenya using national instructors, tutors, and experts and consulting medical laboratory educators from the United States of America (USA). Method Workshops were held in Ethiopia to create standardised biomedical laboratory science (BMLS) lessons based on recently-revised course objectives with an emphasis on application of skills. In Tanzania, course-module teaching guides with objectives were developed based on established competency outcomes and tasks. In Kenya, example interactive presentations and lesson plans were developed by the USA medical laboratory educators prior to the workshop to serve as resources and templates for the development of lessons within the country itself. Results The new teaching materials were implemented and faculty, students and other stakeholders reported successful outcomes. Conclusions These approaches to updating curricula may be helpful as biomedical laboratory schools in other countries address gaps in the competencies of entry-level graduates. PMID:29043162

  15. A User Assessment of Workspaces in Selected Music Education Computer Laboratories.

    ERIC Educational Resources Information Center

    Badolato, Michael Jeremy

    A study of 120 students selected from the user populations of four music education computer laboratories was conducted to determine the applicability of current ergonomic and environmental design guidelines in satisfying the needs of users of educational computing workspaces. Eleven categories of workspace factors were organized into a…

  16. ANL site response for the DOE FY1994 information resources management long-range plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boxberger, L.M.

    1992-03-01

    Argonne National Laboratory's ANL Site Response for the DOE FY1994 Information Resources Management (IRM) Long-Range Plan (ANL/TM 500) is one of many contributions to the DOE information resources management long-range planning process and, as such, is an integral part of the DOE policy and program planning system. The Laboratory has constructed this response according to instructions in a Call issued in September 1991 by the DOE Office of IRM Policy, Plans and Oversight. As one of a continuing series, this Site Response is an update and extension of the Laboratory's previous submissions. The response contains both narrative and tabular material. It covers an eight-year period consisting of the base year (FY1991), the current year (FY1992), the budget year (FY1993), the plan year (FY1994), and the out years (FY1995-FY1998). This Site Response was compiled by Argonne National Laboratory's Computing and Telecommunications Division (CTD), which has the responsibility to provide leadership in optimizing computing and information services and disseminating computer-related technologies throughout the Laboratory. The Site Response consists of five parts: (1) a site overview, which describes the ANL mission, overall organization structure, the strategic approach to meeting information resource needs, the planning process, major issues, and points of contact; (2) a software plan, comprising a software plan for DOE contractors and an FMS plan for DOE organizations (Part 2B); (3) computing resources; (4) telecommunications; and (5) printing and publishing.

  17. Creating and Using a Computer Networking and Systems Administration Laboratory Built under Relaxed Financial Constraints

    ERIC Educational Resources Information Center

    Conlon, Michael P.; Mullins, Paul

    2011-01-01

    The Computer Science Department at Slippery Rock University created a laboratory for its Computer Networks and System Administration and Security courses under relaxed financial constraints. This paper describes the department's experience designing and using this laboratory, including lessons learned and descriptions of some student projects…

  18. ACToR-Aggregated Computational Resource | Science ...

    EPA Pesticide Factsheets

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food & Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high throughput environmental chemical screening and prioritization program called ToxCast(TM).

  19. Computational Science at the Argonne Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  20. A computer-based maintenance reminder and record-keeping system for clinical laboratories.

    PubMed

    Roberts, B I; Mathews, C L; Walton, C J; Frazier, G

    1982-09-01

    "Maintenance" is all the activity an organization devotes to keeping instruments within performance specifications to assure accurate and precise operation. The increasing use of complex analytical instruments as "workhorses" in clinical laboratories requires more maintenance awareness by laboratory personnel. Record-keeping systems that document maintenance completion and that should prompt the continued performance of maintenance tasks have not kept up with instrumentation development. We report here a computer-based record-keeping and reminder system that lists weekly the maintenance items due for each work station in the laboratory, including the time required to complete each item. Written in BASIC, the system uses a DATABOSS data base management system running on a time-shared Digital Equipment Corporation PDP 11/60 computer with a RSTS V 7.0 operating system.

  1. System reliability of randomly vibrating structures: Computational modeling and laboratory testing

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.; Ammanagi, S.; Manohar, C. S.

    2015-09-01

    The problem of determination of the system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time-variant system reliability estimation. The strategy we adopt is based on the application of Girsanov's transformation to the governing stochastic differential equations, which enables estimation of the probability of failure with a significantly smaller number of samples than is needed in a direct simulation study. Notably, we show that the ideas from Monte Carlo simulation based on Girsanov's transformation can be extended to laboratory testing, so that the system reliability of engineering structures can be assessed with a reduced number of samples and hence reduced testing times. Illustrative examples include computational studies on a 10-degree-of-freedom nonlinear system model and laboratory/computational investigations of the road load response of an automotive system tested on a four-post test rig.
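
    The variance-reduction idea behind Girsanov's transformation is that of importance sampling: bias the dynamics so that failures occur often, then correct each sample by its likelihood ratio. The sketch below is a static analogue of that idea for a small exceedance probability of a standard normal variable; it is not the authors' path-space implementation.

        import numpy as np

        rng = np.random.default_rng(0)
        a = 4.0          # failure threshold; exact P(X > a) is about 3.2e-05
        n = 10_000

        # Direct Monte Carlo: with n = 10,000 samples the failure region is
        # almost never hit, so the estimate is unusable.
        x = rng.standard_normal(n)
        p_direct = np.mean(x > a)

        # Importance sampling: draw from N(a, 1) so failures are common, then
        # reweight by the likelihood ratio phi(y)/phi_a(y) = exp(-a*y + a^2/2).
        y = rng.normal(loc=a, scale=1.0, size=n)
        weights = np.exp(-a * y + 0.5 * a * a)
        p_is = np.mean((y > a) * weights)

        print(f"direct Monte Carlo estimate:  {p_direct:.2e}")
        print(f"importance sampling estimate: {p_is:.2e}")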

  2. Computer listing of the effects of drugs on laboratory data

    PubMed Central

    Young, D. S.; Thomas, D. W.; Friedman, R. B.

    1972-01-01

    A listing of approximately 10000 effects of drugs on tests performed in clinical laboratories has been developed in a time-shared computer. The list contains a directory for matching proprietary and generic names of drugs and an explanation of the mode of action of the drug on each test. Each entry is supported by a bibliographical reference that contains the authors' names and the title of the article and journal. It is possible to search for specific `character strings' (word or words, number, etc) to obtain all the effects of a particular drug, or all drugs that affect a particular test, or even to search for a specific explanation for an effect. The system is undergoing trial in the Department's own computer to permit automatic correlation of the effects of drugs with laboratory data from patients in one hospital ward. PMID:4648544
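
    The character-string search described above maps naturally onto a substring filter over structured records. A minimal sketch follows; the three entries are invented placeholders, not records from the actual listing.

        # Each entry: generic drug name, laboratory test, effect, mechanism,
        # and bibliographical reference. All three records are invented.
        ENTRIES = [
            ("acetaminophen", "serum uric acid", "false increase",
             "interferes with phosphotungstate method", "Doe et al., 1971"),
            ("penicillin", "urine protein", "false positive",
             "turbidity with sulfosalicylic acid", "Roe et al., 1970"),
            ("furosemide", "serum potassium", "decrease",
             "renal potassium loss", "Poe et al., 1969"),
        ]

        def search(term):
            """Return every entry whose fields contain the search string."""
            term = term.lower()
            return [e for e in ENTRIES
                    if any(term in field.lower() for field in e)]

        # All drugs affecting any serum test:
        for drug, test, effect, mechanism, ref in search("serum"):
            print(f"{drug}: {test} -> {effect} ({mechanism}) [{ref}]")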

  3. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    PubMed

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software was at first a utilitarian interest; now it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena that cross multiple systems, scales, and physical domains also motivates the sharing of modeling resources, as the blending of models developed by domain experts will be a required step for comprehensive simulation studies and for enhancing their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community for the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as a part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate

  4. ANL statement of site strategy for computing workstations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  5. Node Resource Manager: A Distributed Computing Software Framework Used for Solving Geophysical Problems

    NASA Astrophysics Data System (ADS)

    Lawry, B. J.; Encarnacao, A.; Hipp, J. R.; Chang, M.; Young, C. J.

    2011-12-01

    With the rapid growth of multi-core computing hardware, it is now possible for scientific researchers to run complex, computationally intensive software on affordable, in-house commodity hardware. Multi-core CPUs (Central Processing Units) and GPUs (Graphics Processing Units) are now commonplace in desktops and servers. Developers today have access to extremely powerful hardware that enables the execution of software that could previously only be run on expensive, massively-parallel systems. It is no longer cost-prohibitive for an institution to build a parallel computing cluster consisting of commodity multi-core servers. In recent years, our research team has developed a distributed, multi-core computing system and used it to construct global 3D earth models using seismic tomography. Traditionally, computational limitations forced certain assumptions and shortcuts in the calculation of tomographic models; however, with the recent rapid growth in computational hardware including faster CPUs, increased RAM, and the development of multi-core computers, we are now able to perform seismic tomography, 3D ray tracing and seismic event location using distributed parallel algorithms running on commodity hardware, thereby eliminating the need for many of these shortcuts. We describe Node Resource Manager (NRM), a system we developed that leverages the capabilities of a parallel computing cluster. NRM is a software-based parallel computing management framework that works in tandem with the Java Parallel Processing Framework (JPPF, http://www.jppf.org/), a third party library that provides a flexible and innovative way to take advantage of modern multi-core hardware. NRM enables multiple applications to use and share a common set of networked computers, regardless of their hardware platform or operating system. Using NRM, algorithms can be parallelized to run on multiple processing cores of a distributed computing cluster of servers and desktops, which results in a dramatic
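
    NRM and JPPF are Java-based; as a language-neutral illustration of the underlying pattern, splitting an embarrassingly parallel computation into tasks farmed out to a pool of workers, here is a short Python sketch using only the standard library. The trace_ray function is a stand-in for one unit of work, not code from NRM.

        import math
        from concurrent.futures import ProcessPoolExecutor

        def trace_ray(task_id):
            """Stand-in for one unit of work (e.g. tracing one seismic ray)."""
            return sum(math.sin(i) ** 2
                       for i in range(task_id * 1000, (task_id + 1) * 1000))

        if __name__ == "__main__":
            # The worker-process pool plays the role of the cluster's compute
            # nodes; a framework like JPPF additionally handles node discovery
            # and network transport between heterogeneous machines.
            with ProcessPoolExecutor() as pool:
                partials = list(pool.map(trace_ray, range(100)))
            print(f"combined result from {len(partials)} tasks: "
                  f"{sum(partials):.3f}")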

  6. ACToR - Aggregated Computational Toxicology Resource

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judson, Richard; Richard, Ann; Dix, David

    2008-11-15

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™.

  7. Computer-aided resource planning and scheduling for radiological services

    NASA Astrophysics Data System (ADS)

    Garcia, Hong-Mei C.; Yun, David Y.; Ge, Yiqun; Khan, Javed I.

    1996-05-01

    There exists tremendous opportunity in hospital-wide resource optimization based on system integration. This paper defines the resource planning and scheduling requirements integral to PACS, RIS and HIS integration. A multi-site case study is conducted to define the requirements. A well-tested planning and scheduling methodology, called the Constrained Resource Planning model, has been applied to the chosen problem of radiological service optimization. This investigation focuses on resource optimization issues for minimizing the turnaround time to increase clinical efficiency and customer satisfaction, particularly in cases where the scheduling of multiple exams is required for a patient. How best to combine information system efficiency and human intelligence in improving radiological services is described. Finally, an architecture for interfacing a computer-aided resource planning and scheduling tool with the existing PACS, HIS and RIS implementation is presented.
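
    The turnaround-time objective can be illustrated with a toy greedy scheduler: each patient's exams run in sequence, each on the earliest-free station of the required modality. This is only a schematic of the idea; the modalities, durations, and station counts below are invented, and the Constrained Resource Planning model itself handles far richer constraints.

        # Invented exam durations (minutes) and station counts per modality.
        DURATION = {"CT": 30, "MRI": 45, "X-ray": 10}
        STATIONS = {"CT": 2, "MRI": 1, "X-ray": 3}

        def schedule(patients):
            """Greedily place each exam; return each patient's turnaround."""
            free = {m: [0] * n for m, n in STATIONS.items()}  # next-free times
            turnaround = {}
            for name, exams in patients.items():
                t = 0  # patient arrives at time zero
                for exam in exams:
                    slot = min(range(len(free[exam])),
                               key=lambda i: free[exam][i])
                    start = max(t, free[exam][slot])
                    t = start + DURATION[exam]
                    free[exam][slot] = t
                turnaround[name] = t
            return turnaround

        print(schedule({"P1": ["CT", "MRI"],
                        "P2": ["CT", "X-ray"],
                        "P3": ["MRI"]}))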

  8. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    NASA Astrophysics Data System (ADS)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is two fold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16 week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent T-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  9. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP-12 and an interconnected PDP-8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
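
    Cumulative sum (cusum) charting, one of the techniques mentioned above, accumulates small deviations of control-serum results until a sustained drift crosses a decision limit. A minimal tabular-cusum sketch follows; the allowance k and threshold h are conventional textbook defaults, not values taken from the paper.

        import numpy as np

        def cusum_flags(values, target, k=0.5, h=4.0):
            """Two-sided tabular cusum; values are in units of the assay SD."""
            s_hi = s_lo = 0.0
            flags = []
            for v in values:
                z = v - target
                s_hi = max(0.0, s_hi + z - k)  # accumulates upward drift
                s_lo = max(0.0, s_lo - z - k)  # accumulates downward drift
                flags.append(s_hi > h or s_lo > h)
            return flags

        # Simulated control serum: in control for 20 runs, then a +1 SD shift.
        rng = np.random.default_rng(1)
        runs = np.concatenate([rng.normal(0, 1, 20), rng.normal(1, 1, 20)])
        flags = cusum_flags(runs, target=0.0)
        alarm = flags.index(True) if True in flags else None
        print(f"first out-of-control signal at run {alarm}")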

  10. Final Report National Laboratory Professional Development Workshop for Underrepresented Participants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Valerie

    The 2013 CMD-IT National Laboratories Professional Development Workshop for Underrepresented Participants (CMD-IT NLPDev 2013) was held at the Oak Ridge National Laboratory campus in Oak Ridge, TN. from June 13 - 14, 2013. Sponsored by the Department of Energy (DOE) Advanced Scientific Computing Research Program, the primary goal of these workshops is to provide information about career opportunities in computational science at the various national laboratories and to mentor the underrepresented participants through community building and expert presentations focused on career success. This second annual workshop offered sessions to facilitate career advancement and, in particular, the strategies and resources neededmore » to be successful at the national laboratories.« less

  11. Guidelines for Developing Computer Based Resource Units. Revised.

    ERIC Educational Resources Information Center

    State Univ. of New York, Buffalo. Coll. at Buffalo. Educational Research and Development Complex.

    Presented for use with normal and handicapped children are guidelines for the development of computer based resource units, organized into two operations: the production of software, which includes the writing of instructional objectives, content, activities, materials, and measuring devices; and the coding of the software…

  12. Cyber-workstation for computational neuroscience.

    PubMed

    Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C

    2010-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.
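
    The block-diagram idea above can be illustrated with a toy pipeline in which the user declares processing blocks and their connections, and a loop steps them once per experiment tick. This sketch shows only the wiring concept; the CW's adaptive middleware, grid deployment, and real-time guarantees are not reproduced, and the recursive least-squares block is a generic scalar version, not the CW's implementation.

        import numpy as np

        class FeaturePower:
            """Stand-in feature block: mean power of a raw signal window."""
            def step(self, x):
                return float(np.mean(np.square(x)))

        class RLSDecoder:
            """Generic scalar recursive least-squares regressor block."""
            def __init__(self, lam=0.99):
                self.w, self.p, self.lam = 0.0, 1.0, lam

            def step(self, feat, target=None):
                y = self.w * feat
                if target is not None:  # adapt while a teaching signal exists
                    g = self.p * feat / (self.lam + feat * self.p * feat)
                    self.w += g * (target - y)
                    self.p = (self.p - g * feat * self.p) / self.lam
                return y

        # "Block diagram": raw signal -> feature block -> decoder block.
        rng = np.random.default_rng(0)
        features, decoder = FeaturePower(), RLSDecoder()
        for tick in range(200):
            raw = rng.normal(size=32)                     # one signal window
            feat = features.step(raw)
            pred = decoder.step(feat, target=2.5 * feat)  # synthetic target
        print(f"learned gain: {decoder.w:.2f} (true value 2.5)")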

  13. Cyber-Workstation for Computational Neuroscience

    PubMed Central

    DiGiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C.; Fortes, Jose; Sanchez, Justin C.

    2009-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface. PMID:20126436

  14. Computer-simulated laboratory explorations for middle school life, earth, and physical Science

    NASA Astrophysics Data System (ADS)

    von Blum, Ruth

    1992-06-01

    Explorations in Middle School Science is a set of 72 computer-simulated laboratory lessons in life, earth, and physical science for grades 6-9, developed by Jostens Learning Corporation with grants from the California State Department of Education and the National Science Foundation. At the heart of each lesson is a computer-simulated laboratory that actively involves students in doing science, improving their: (1) understanding of science concepts by applying critical thinking to solve real problems; (2) skills in scientific processes and communications; and (3) attitudes about science. Students use on-line tools (notebook, calculator, word processor) to undertake in-depth investigations of phenomena (like motion in outer space, disease transmission, volcanic eruptions, or the structure of the atom) that would be too difficult, dangerous, or outright impossible to do in a “live” laboratory. Suggested extension activities lead students to hands-on investigations, away from the computer. This article presents the underlying rationale, instructional model, and process by which Explorations was designed and developed. It also describes the general courseware structure and three lessons in detail, as well as presenting preliminary data from the evaluation. Finally, it suggests a model for incorporating technology into the science classroom.

  15. Geology and mineral and energy resources, Roswell Resource Area, New Mexico; an interactive computer presentation

    USGS Publications Warehouse

    Tidball, Ronald R.; Bartsch-Winkler, S. B.

    1995-01-01

    This Compact Disc-Read Only Memory (CD-ROM) contains a program illustrating the geology and mineral and energy resources of the Roswell Resource Area, an administrative unit of the U.S. Bureau of Land Management in east-central New Mexico. The program enables the user to access information on the geology, geochemistry, geophysics, mining history, metallic and industrial mineral commodities, hydrocarbons, and assessments of the area. The program was created with the display software SuperCard, version 1.5, by Aldus. The program will run only on a Macintosh personal computer. This CD-ROM was produced in accordance with Macintosh HFS standards. The program was developed on a Macintosh II-series computer with system 7.0.1. The program is in compiled, executable form, is nonproprietary, and does not require the presence of the SuperCard software.

  16. A Novel College Network Resource Management Method using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Chen

    At present, the information construction of colleges consists mainly of campus networks and management information systems, and many problems arise during this process. Cloud computing is a development of distributed processing, parallel processing, and grid computing: data are stored in the cloud, software and services are placed in the cloud, and both are built on top of various standards and protocols so that they can be accessed through all kinds of devices. This article introduces cloud computing and its functions, analyzes the existing problems of college network resource management, and applies cloud computing technology and methods to the construction of a college information sharing platform.

  17. Williams uses computer in the U.S. Laboratory during Expedition 13

    NASA Image and Video Library

    2006-04-11

    ISS013-E-05853 (11 April 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  18. Integration of Openstack cloud resources in BES III computing cluster

    NASA Astrophysics Data System (ADS)

    Li, Haibo; Cheng, Yaodong; Huang, Qiulan; Cheng, Zhenjing; Shi, Jingyan

    2017-10-01

    Cloud computing provides a new technical means for data processing in high energy physics experiments. However, in a traditional job management system the resources of each queue are fixed and resource usage is static. In order to make the system simple and transparent for physicists to use, we developed a virtual cluster system (vpmanager) to integrate IHEPCloud and different batch systems such as Torque and HTCondor. Vpmanager provides dynamic virtual machine scheduling according to the job queue. The BES III use case results show that resource efficiency is greatly improved.
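
    The scheduling idea, polling the batch queues and growing or shrinking the virtual machine pool to match the backlog, can be sketched as a simple control loop. This is a schematic of the concept only, not vpmanager's actual interface to IHEPCloud, Torque, or HTCondor; the policy parameters and the two Fake classes are invented.

        def desired_vms(queued_jobs, jobs_per_vm=8, max_vms=100):
            """Proportional policy: enough VMs for the backlog, within quota."""
            return min(max_vms, -(-queued_jobs // jobs_per_vm))  # ceiling div

        def reconcile(cloud, batch):
            """One control-loop iteration: compare desired vs. running VMs."""
            delta = desired_vms(batch.queued()) - cloud.running()
            if delta > 0:
                cloud.boot(delta)    # new workers join the batch pool
            elif delta < 0:
                cloud.drain(-delta)  # idle workers are returned to the cloud

        class FakeCloud:             # stand-ins so the sketch runs end to end
            def __init__(self): self.n = 0
            def running(self): return self.n
            def boot(self, k): self.n += k; print(f"boot {k} -> {self.n} VMs")
            def drain(self, k): self.n -= k; print(f"drain {k} -> {self.n} VMs")

        class FakeBatch:
            def __init__(self, jobs): self.jobs = jobs
            def queued(self): return self.jobs

        cloud = FakeCloud()
        for backlog in [120, 40, 0]:   # queue length over three polling cycles
            reconcile(cloud, FakeBatch(backlog))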

  19. An Evaluation of Student Perceptions of Screen Presentations in Computer-based Laboratory Simulations.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Evaluates the importance of realism in the screen presentation of the plant in computer-based laboratory simulations for part-time engineering students. Concludes that simulations are less effective than actual laboratories but that realism minimizes the disadvantages. The schematic approach was preferred for ease of use. (AIM)

  20. Learning with Computers. AECA Resource Book Series, Volume 3, Number 2.

    ERIC Educational Resources Information Center

    Elliott, Alison

    1996-01-01

    Research has supported the idea that the use of computers in the education of young children promotes social interaction and academic achievement. This resource booklet provides an introduction to computers in early childhood settings to enrich learning opportunities and provides guidance to teachers to find developmentally appropriate software…

  1. Appropriate Use Policy | High-Performance Computing | NREL

    Science.gov Websites

    users of the National Renewable Energy Laboratory (NREL) High Performance Computing (HPC) resources government agency, National Laboratory, University, or private entity, the intellectual property terms (if issued a multifactor token which may be a physical token or a virtual token used with one-time password

  2. Computational resources for ribosome profiling: from database to Web server and software.

    PubMed

    Wang, Hongwei; Wang, Yan; Xie, Zhi

    2017-08-14

    Ribosome profiling is emerging as a powerful technique that enables genome-wide investigation of in vivo translation at sub-codon resolution. The increasing application of ribosome profiling in recent years has achieved remarkable progress toward understanding the composition, regulation and mechanism of translation. This benefits not only from the power of ribosome profiling itself but also from the extensive range of computational resources available for it. At present, however, a comprehensive review of these resources is still lacking. Here, we survey the recent computational advances guided by ribosome profiling, with a focus on databases, Web servers and software tools for storing, visualizing and analyzing ribosome profiling data. This review is intended to provide experimental and computational biologists with a reference to make appropriate choices among existing resources for the question at hand.

  3. The Laboratory for Terrestrial Physics

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The Laboratory for Terrestrial Physics is dedicated to the advancement of knowledge in Earth and planetary science, by conducting innovative research using space technology. The Laboratory's mission and activities support the work and new initiatives at NASA's Goddard Space Flight Center (GSFC). The Laboratory's success contributes to the Earth Science Directorate as a national resource for studies of Earth from space. The Laboratory is part of the Earth Science Directorate based at the GSFC in Greenbelt, MD. The Directorate itself comprises the Global Change Data Center (GCDC), the Space Data and Computing Division (SDCD), and four science laboratories, including the Laboratory for Terrestrial Physics, the Laboratory for Atmospheres, and the Laboratory for Hydrospheric Processes, all in Greenbelt, MD. The fourth research organization, the Goddard Institute for Space Studies (GISS), is in New York, NY. Relevant to NASA's Strategic Plan, the Laboratory ensures that all work undertaken and completed is within the vision of GSFC. The philosophy of the Laboratory is to balance the completion of near-term goals, while building on the Laboratory's achievements as a foundation for the scientific challenges in the future.

  4. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    NASA Astrophysics Data System (ADS)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute’s computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  5. Computational Protein Engineering: Bridging the Gap between Rational Design and Laboratory Evolution

    PubMed Central

    Barrozo, Alexandre; Borstnar, Rok; Marloie, Gaël; Kamerlin, Shina Caroline Lynn

    2012-01-01

    Enzymes are tremendously proficient catalysts, which can be used as extracellular catalysts for a whole host of processes, from chemical synthesis to the generation of novel biofuels. For them to be more amenable to the needs of biotechnology, however, it is often necessary to be able to manipulate their physico-chemical properties in an efficient and streamlined manner, and, ideally, to be able to train them to catalyze completely new reactions. Recent years have seen an explosion of interest in different approaches to achieve this, both in the laboratory, and in silico. There remains, however, a gap between current approaches to computational enzyme design, which have primarily focused on the early stages of the design process, and laboratory evolution, which is an extremely powerful tool for enzyme redesign, but will always be limited by the vastness of sequence space combined with the low frequency for desirable mutations. This review discusses different approaches towards computational enzyme design and demonstrates how combining newly developed screening approaches that can rapidly predict potential mutation “hotspots” with approaches that can quantitatively and reliably dissect the catalytic step can bridge the gap that currently exists between computational enzyme design and laboratory evolution studies. PMID:23202907

  6. 76 FR 35935 - In the Matter of: BP International, Inc., CyGene Laboratories, Inc., Delek Resources, Inc...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-20

    ... SECURITIES AND EXCHANGE COMMISSION [File No. 500-1] In the Matter of: BP International, Inc., CyGene Laboratories, Inc., Delek Resources, Inc., Flooring America, Inc., International Diversified... there is a lack of current and accurate information concerning the securities of CyGene Laboratories...

  7. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  8. Function Package for Computing Quantum Resource Measures

    NASA Astrophysics Data System (ADS)

    Huang, Zhiming

    2018-05-01

    In this paper, we present a function package for calculating quantum resource measures and the dynamics of open systems. Our package includes common operators and operator lists, and frequently-used functions for computing quantum entanglement, quantum correlation, quantum coherence, quantum Fisher information and dynamics in noisy environments. We briefly explain the functions of the package and illustrate how to use it with several typical examples. We expect that this package will be a useful tool for future research and education.
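
    As an example of the kind of function such a package provides, the entanglement entropy of a pure two-qubit state, the von Neumann entropy of the reduced density matrix, takes only a few lines. This sketch uses plain NumPy and is not code from the package described in the paper.

        import numpy as np

        def entanglement_entropy(state, dims=(2, 2)):
            """Von Neumann entropy of subsystem A for a pure bipartite state."""
            psi = np.asarray(state, dtype=complex).reshape(dims)
            # Schmidt coefficients via SVD; their squares are the eigenvalues
            # of the reduced density matrix rho_A.
            s = np.linalg.svd(psi, compute_uv=False)
            p = s ** 2
            p = p[p > 1e-12]
            return float(-np.sum(p * np.log2(p)))

        bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
        product = np.array([1, 0, 0, 0])             # |00>
        print(entanglement_entropy(bell))     # 1.0 bit: maximally entangled
        print(entanglement_entropy(product))  # 0.0: no entanglement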

  9. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    NASA Astrophysics Data System (ADS)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. The results suggest that classroom structures that

  10. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously

  11. Computational simulation of laboratory-scale volcanic jets

    NASA Astrophysics Data System (ADS)

    Solovitz, S.; Van Eaton, A. R.; Mastin, L. G.; Herzog, M.

    2017-12-01

    Volcanic eruptions produce ash clouds that may travel great distances, significantly impacting aviation and communities downwind. Atmospheric hazard forecasting relies partly on numerical models of the flow physics, which incorporate data from eruption observations and analogue laboratory tests. As numerical tools continue to increase in complexity, they must be validated to fine-tune their effectiveness. Since eruptions are relatively infrequent and challenging to observe in great detail, analogue experiments can provide important insights into expected behavior over a wide range of input conditions. Unfortunately, laboratory-scale jets cannot easily attain the high Reynolds numbers (~10^9) of natural volcanic eruption columns. Comparisons between the computational models and analogue experiments can help bridge this gap. In this study, we investigate a 3-D volcanic plume model, the Active Tracer High-resolution Atmospheric Model (ATHAM), which has been used to simulate a variety of eruptions. However, it has not been previously validated using laboratory-scale data. We conducted numerical simulations of three flows that we have studied in the laboratory: a vertical jet in a quiescent environment, a vertical jet in horizontal cross flow, and a particle-laden jet. We considered Reynolds numbers from 10,000 to 50,000, jet-to-cross flow velocity ratios of 2 to 10, and particle mass loadings of up to 25% of the exit mass flow rate. Vertical jet simulations produce Gaussian velocity profiles in the near exit region by 3 diameters downstream, matching the mean experimental profiles. Simulations of air entrainment are of the correct order of magnitude, but they show decreasing entrainment with vertical distance from the vent. Cross flow simulations reproduce experimental trajectories for the jet centerline initially, although confinement appears to impact the response later. Particle-laden simulations display minimal variation in concentration profiles between cases with
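
    The Gaussian-profile check quoted for the vertical jet can be sketched as a curve fit of mean velocity against radius. The profile data below are synthetic stand-ins, not output from ATHAM or measurements from the laboratory jets.

        import numpy as np
        from scipy.optimize import curve_fit

        def gaussian(r, u_max, b):
            """Self-similar round-jet profile: centerline speed u_max,
            1/e half-width b."""
            return u_max * np.exp(-(r / b) ** 2)

        # Synthetic radial profile of mean axial velocity, with noise added.
        r = np.linspace(-3, 3, 61)  # radial position in jet diameters
        u = gaussian(r, 10.0, 1.2)
        u += np.random.default_rng(2).normal(0, 0.2, r.size)

        (u_fit, b_fit), _ = curve_fit(gaussian, r, u, p0=(5.0, 1.0))
        print(f"fitted centerline velocity {u_fit:.2f}, width {b_fit:.2f}")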

  12. The Relative Effectiveness of Computer-Based and Traditional Resources for Education in Anatomy

    ERIC Educational Resources Information Center

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R.; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic…

  13. Roles of laboratories and laboratory systems in effective tuberculosis programmes.

    PubMed

    Ridderhof, John C; van Deun, Armand; Kam, Kai Man; Narayanan, P R; Aziz, Mohamed Abdul

    2007-05-01

    Laboratories and laboratory networks are a fundamental component of tuberculosis (TB) control, providing testing for diagnosis, surveillance and treatment monitoring at every level of the health-care system. New initiatives and resources to strengthen laboratory capacity and implement rapid and new diagnostic tests for TB will require recognition that laboratories are systems that require quality standards, appropriate human resources, and attention to safety in addition to supplies and equipment. To prepare the laboratory networks for new diagnostics and expanded capacity, we need to focus efforts on strengthening quality management systems (QMS) through additional resources for external quality assessment programmes for microscopy, culture, drug susceptibility testing (DST) and molecular diagnostics. QMS should also promote development of accreditation programmes to ensure adherence to standards to improve both the quality and credibility of the laboratory system within TB programmes. Corresponding attention must be given to addressing human resources at every level of the laboratory, with special consideration being given to new programmes for laboratory management and leadership skills. Strengthening laboratory networks will also involve setting up partnerships between TB programmes and those seeking to control other diseases in order to pool resources and to promote advocacy for quality standards, to develop strategies to integrate laboratories functions and to extend control programme activities to the private sector. Improving the laboratory system will assure that increased resources, in the form of supplies, equipment and facilities, will be invested in networks that are capable of providing effective testing to meet the goals of the Global Plan to Stop TB.

  14. Exploring the links between quality assurance and laboratory resources. An audit-based study.

    PubMed

    Singh, Navjeevan; Panwar, Aru; Masih, Vipin Fazal; Arora, Vinod K; Bhatia, Arati

    2003-01-01

    To investigate and rectify the problems related to Ziehl-Neelsen (Z-N) staining in a cytology laboratory in the context of quality assurance. An audit-based quality assurance study of 1,421 patients with clinical diagnoses of tubercular lymphadenopathy who underwent fine needle aspiration cytology. Data from 8 months were audited (group 1). Laboratory practices related to selection of smears for Z-N staining were studied. A 2-step corrective measure based on results of the audit was introduced for 2 months (group 2). Results were subjected to statistical analysis using the chi-squared test. Of 1,172 patients in group 1, 368 had diagnoses other than tuberculosis. Overall acid-fast bacillus (AFB) positivity was 42%. AFB positivity in 249 patients in group 2 was 89% (P < .0001). Several issues in the laboratory are linked to quality assurance. Solving everyday problems can have far-reaching benefits for the performance of laboratory personnel, resources and work flow.

  15. Effects of Combined Hands-on Laboratory and Computer Modeling on Student Learning of Gas Laws: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng

    2006-01-01

    Based on current theories of chemistry learning, this study intends to test a hypothesis that computer modeling enhanced hands-on chemistry laboratories are more effective than hands-on laboratories or computer modeling laboratories alone in facilitating high school students' understanding of chemistry concepts. Thirty-three high school chemistry…

  16. Roles of laboratories and laboratory systems in effective tuberculosis programmes

    PubMed Central

    van Deun, Armand; Kam, Kai Man; Narayanan, PR; Aziz, Mohamed Abdul

    2007-01-01

    Laboratories and laboratory networks are a fundamental component of tuberculosis (TB) control, providing testing for diagnosis, surveillance and treatment monitoring at every level of the health-care system. New initiatives and resources to strengthen laboratory capacity and implement rapid and new diagnostic tests for TB will require recognition that laboratories are systems that require quality standards, appropriate human resources, and attention to safety in addition to supplies and equipment. To prepare the laboratory networks for new diagnostics and expanded capacity, we need to focus efforts on strengthening quality management systems (QMS) through additional resources for external quality assessment programmes for microscopy, culture, drug susceptibility testing (DST) and molecular diagnostics. QMS should also promote development of accreditation programmes to ensure adherence to standards to improve both the quality and credibility of the laboratory system within TB programmes. Corresponding attention must be given to addressing human resources at every level of the laboratory, with special consideration being given to new programmes for laboratory management and leadership skills. Strengthening laboratory networks will also involve setting up partnerships between TB programmes and those seeking to control other diseases in order to pool resources and to promote advocacy for quality standards, to develop strategies to integrate laboratories’ functions and to extend control programme activities to the private sector. Improving the laboratory system will assure that increased resources, in the form of supplies, equipment and facilities, will be invested in networks that are capable of providing effective testing to meet the goals of the Global Plan to Stop TB. PMID:17639219

  17. A set of devices for Mechanics Laboratory assisted by a Computer

    NASA Astrophysics Data System (ADS)

    Rusu, Alexandru; Pirtac, Constantin

    2015-12-01

    The booklet gives a description of a set of devices designed for the unified performance of a number of laboratory works in mechanics by students at technical universities. It consists of a clock, interfaced to a computer, which allows times to be measured with an error no greater than 0.0001 s. It also allows the physical quantities measured in the experiment to be calculated and the final report to be compiled. The least-squares method is used throughout the workshop.
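
    The least-squares reduction mentioned above can be illustrated with a typical exercise: fitting distance-time data from the photogate clock to the free-fall law s = (1/2) g t^2 to extract g. The data points below are invented for the example.

        import numpy as np

        # Invented free-fall data: times (s) from the 0.0001 s clock and
        # corresponding drop distances (m).
        t = np.array([0.0452, 0.0639, 0.0782, 0.0907, 0.1013])
        s = np.array([0.0100, 0.0200, 0.0300, 0.0400, 0.0500])

        # s = (1/2) g t^2 is linear in the single parameter g, so ordinary
        # linear least squares applies directly.
        A = 0.5 * t[:, None] ** 2
        g, *_ = np.linalg.lstsq(A, s, rcond=None)
        print(f"g = {g[0]:.2f} m/s^2")   # close to 9.8 for these data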

  18. Williams works on computer in the U.S. Laboratory during Expedition 13

    NASA Image and Video Library

    2006-04-15

    ISS013-E-07975 (15 April 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  19. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud

    PubMed Central

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and

  20. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    PubMed

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the

  1. Strengthening LLNL Missions through Laboratory Directed Research and Development in High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willis, D. K.

    2016-12-01

    High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world’s most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration’s (NNSA’s) Stockpile Stewardship Program—designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore’s preeminence in HPC is the ongoing investments made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL’s HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL’s mission.

  2. NMRbox: A Resource for Biomolecular NMR Computation.

    PubMed

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  3. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    DOE PAGES

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; ...

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.
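
    As a toy illustration of utility-aware mapping with task dropping (not the paper's actual heuristics), the sketch below greedily assigns each arriving task to the machine that maximizes its time-decaying utility and drops tasks whose best achievable utility falls below a threshold; all utilities, decay rates, and runtimes are invented.

```python
# Toy utility-aware greedy scheduler with task dropping; an invented
# illustration, not the paper's actual heuristics. Utility decays
# linearly with completion time.
def utility(task, finish_time):
    return max(0.0, task["max_utility"] - task["decay"] * finish_time)

def schedule(tasks, machine_free, drop_threshold=1.0):
    """Assign each task to the machine maximizing its earned utility;
    drop tasks whose best achievable utility is below the threshold."""
    assignments, dropped = [], []
    for task in tasks:
        best = max(
            range(len(machine_free)),
            key=lambda m: utility(task, machine_free[m] + task["runtime"][m]),
        )
        finish = machine_free[best] + task["runtime"][best]
        if utility(task, finish) < drop_threshold:
            dropped.append(task["id"])
            continue
        machine_free[best] = finish
        assignments.append((task["id"], best, finish))
    return assignments, dropped

tasks = [
    {"id": "t1", "max_utility": 10.0, "decay": 0.5, "runtime": [4.0, 6.0]},
    {"id": "t2", "max_utility": 3.0, "decay": 1.0, "runtime": [5.0, 2.0]},
]
print(schedule(tasks, machine_free=[0.0, 0.0]))
```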

  4. Resource Manual on the Use of Computers in Schooling.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Technology Applications.

    This resource manual is designed to provide educators with timely information on the use of computers and related technology in schools. Section one includes a review of the new Bureau of Technology Applications' goal, functions, and major programs and activities; a description of the Model Schools Program, which has been conceptually derived from…

  5. Designing a Hands-On Brain Computer Interface Laboratory Course

    PubMed Central

    Khalighinejad, Bahar; Long, Laura Kathleen; Mesgarani, Nima

    2017-01-01

    Devices and systems that interact with the brain have become a growing field of research and development in recent years. Engineering students are well positioned to contribute to both hardware development and signal analysis techniques in this field. However, this area has been left out of most engineering curricula. We developed an electroencephalography (EEG) based brain computer interface (BCI) laboratory course to educate students through hands-on experiments. The course is offered jointly by the Biomedical Engineering, Electrical Engineering, and Computer Science Departments of Columbia University in the City of New York and is open to senior undergraduate and graduate students. The course provides an effective introduction to the experimental design, neuroscience concepts, data analysis techniques, and technical skills required in the field of BCI. PMID:28268946

  6. Designing a hands-on brain computer interface laboratory course.

    PubMed

    Khalighinejad, Bahar; Long, Laura Kathleen; Mesgarani, Nima

    2016-08-01

    Devices and systems that interact with the brain have become a growing field of research and development in recent years. Engineering students are well positioned to contribute to both hardware development and signal analysis techniques in this field. However, this area has been left out of most engineering curricula. We developed an electroencephalography (EEG) based brain computer interface (BCI) laboratory course to educate students through hands-on experiments. The course is offered jointly by the Biomedical Engineering, Electrical Engineering, and Computer Science Departments of Columbia University in the City of New York and is open to senior undergraduate and graduate students. The course provides an effective introduction to the experimental design, neuroscience concepts, data analysis techniques, and technical skills required in the field of BCI.

  7. Students' Cognitive Focus during a Chemistry Laboratory Exercise: Effects of a Computer-Simulated Prelab

    ERIC Educational Resources Information Center

    Winberg, T. Mikael; Berg, C. Anders R.

    2007-01-01

    To enhance the learning outcomes achieved by students, learners undertook a computer-simulated activity based on an acid-base titration prior to a university-level chemistry laboratory activity. Students were categorized with respect to their attitudes toward learning. During the laboratory exercise, questions that students asked their assistant…

  8. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    PubMed Central

    Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing, extending PC transparent computing to mobile terminals. Since resources containing different kinds of operating systems and user data are stored in a remote server, how to manage these network resources is essential. We apply quick emulator (QEMU) virtualization and mobile agent technologies for mobile transparent computing (MTC) to devise a method of shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate the method. Experiment results show that the strategy is effective and stable. PMID:24883353

  9. A novel resource management method of providing operating system as a service for mobile transparent computing.

    PubMed

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing, extending PC transparent computing to mobile terminals. Since resources containing different kinds of operating systems and user data are stored in a remote server, how to manage these network resources is essential. We apply quick emulator (QEMU) virtualization and mobile agent technologies for mobile transparent computing (MTC) to devise a method of shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate the method. Experiment results show that the strategy is effective and stable.

  10. New design for interfacing computers to the Octopus network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sloan, L.J.

    1977-03-14

    The Lawrence Livermore Laboratory has several large-scale computers which are connected to the Octopus network. Several difficulties arise in providing adequate resources along with reliable performance. To alleviate some of these problems a new method of bringing large computers into the Octopus environment is proposed.

  11. Computation of Chemical Shifts for Paramagnetic Molecules: A Laboratory Experiment for the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Pritchard, Benjamin P.; Simpson, Scott; Zurek, Eva; Autschbach, Jochen

    2014-01-01

    A computational experiment investigating the ¹H and ¹³C nuclear magnetic resonance (NMR) chemical shifts of molecules with unpaired electrons has been developed and implemented. This experiment is appropriate for an upper-level undergraduate laboratory course in computational, physical, or inorganic chemistry. The…

  12. The Center for Computational Biology: resources, achievements, and challenges

    PubMed Central

    Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2011-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221

  13. The Center for Computational Biology: resources, achievements, and challenges.

    PubMed

    Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2012-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.

  14. Optimization of analytical laboratory work using computer networking and databasing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upp, D.L.; Metcalf, R.A.

    1996-06-01

    The Health Physics Analysis Laboratory (HPAL) performs around 600,000 analyses for radioactive nuclides each year at Los Alamos National Laboratory (LANL). Analysis matrices vary from nasal swipes, air filters, work area swipes, and liquids to the bottoms of shoes and cat litter. HPAL uses 8 liquid scintillation counters, 8 gas proportional counters, and 9 high purity germanium detectors in 5 laboratories to perform these analyses. HPAL has developed a computer network between the labs and software to produce analysis results. The software and hardware package includes barcode sample tracking, log-in, chain of custody, analysis calculations, analysis result printing, and utility programs. All data are written to a database, mirrored on a central server, and eventually written to CD-ROM to provide online historical results. This system has greatly reduced the work required to provide analysis results as well as improving the quality of the work performed.
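
    Because the record describes barcode log-in, chain of custody, and result storage tied into one database, a minimal, hypothetical SQLite layout can make the data flow concrete; the table and column names below are invented, not HPAL's actual schema.

```python
# Hypothetical sample-tracking schema in the spirit of HPAL's database:
# barcode log-in, chain of custody, and per-nuclide results.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE samples (
    barcode     TEXT PRIMARY KEY,
    matrix      TEXT,    -- e.g. 'nasal swipe', 'air filter'
    received_at TEXT
);
CREATE TABLE custody (
    barcode   TEXT REFERENCES samples(barcode),
    handler   TEXT,
    action    TEXT,      -- 'login', 'count', 'archive'
    timestamp TEXT
);
CREATE TABLE results (
    barcode  TEXT REFERENCES samples(barcode),
    nuclide  TEXT,
    activity REAL,       -- invented units
    detector TEXT        -- 'LSC', 'gas proportional', or 'HPGe'
);
""")
conn.execute("INSERT INTO samples VALUES ('S0001', 'air filter', '1996-06-01')")
conn.execute("INSERT INTO results VALUES ('S0001', 'Pu-239', 0.12, 'HPGe')")
print(conn.execute("SELECT * FROM results").fetchall())
```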

  15. A guide to Laboratory practicum on oscillations assisted by a computer

    NASA Astrophysics Data System (ADS)

    Russu, A. S.; Russu, S. S.; Pitac, C.

    2013-12-01

    The booklet contains descriptions of three laboratory works on oscillations (nos. 9, 10, 11) for students of Chisinau Technical University. They are computer-assisted, modernized versions of older works first introduced in 1964. Each includes theoretical outlines, work instructions, and control questions.

  16. JSC earth resources data analysis capabilities available to EOD revision B

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A list and summary description of all Johnson Space Center electronic laboratory and photographic laboratory capabilities available to earth resources division personnel for processing earth resources data are provided. The electronic capabilities pertain to those facilities and systems that use electronic and/or photographic products as output. The photographic capabilities pertain to equipment that uses photographic images as input and produces electronic and/or photographic products as output; a table summarizes processing steps. A general hardware description is presented for each of the data processing systems, and the titles of computer programs are used to identify the capabilities and data flow.

  17. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  18. Common Accounting System for Monitoring the ATLAS Distributed Computing Resources

    NASA Astrophysics Data System (ADS)

    Karavakis, E.; Andreeva, J.; Campana, S.; Gayazov, S.; Jezequel, S.; Saiz, P.; Sargsyan, L.; Schovancova, J.; Ueda, I.; Atlas Collaboration

    2014-06-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  19. A remote laboratory for USRP-based software defined radio

    NASA Astrophysics Data System (ADS)

    Gandhinagar Ekanthappa, Rudresh; Escobar, Rodrigo; Matevossian, Achot; Akopian, David

    2014-02-01

    Electrical and computer engineering graduates need practical working skills with real-world electronic devices, which are addressed to some extent by hands-on laboratories. Deployment capacity of hands-on laboratories is typically constrained by insufficient equipment availability, facility shortages, and a lack of human resources for in-class support and maintenance. At the same time, at many sites, existing experimental systems are underutilized due to class scheduling bottlenecks. Nowadays, online education is gaining popularity, and remote laboratories have been suggested to broaden access to experimentation resources. Remote laboratories resolve many problems, as various costs can be shared and student access to instrumentation is facilitated in terms of access times and locations. Labs are converted to homework that can be done without physical presence in laboratories. Even though they do not provide the full sense of hands-on experimentation, remote labs are a viable alternative for underserved educational sites. This paper studies the remote modality of USRP-based radio-communication labs offered by National Instruments (NI). The labs are offered to graduate and undergraduate students, and tentative assessments support the feasibility of remote deployments.

  20. The Ever-Present Demand for Public Computing Resources. CDS Spotlight

    ERIC Educational Resources Information Center

    Pirani, Judith A.

    2014-01-01

    This Core Data Service (CDS) Spotlight focuses on public computing resources, including lab/cluster workstations in buildings, virtual lab/cluster workstations, kiosks, laptop and tablet checkout programs, and workstation access in unscheduled classrooms. The findings are derived from 758 CDS 2012 participating institutions. A dataset of 529…

  1. Williams uses laptop computer in the U.S. Laboratory taken during Expedition 13

    NASA Image and Video Library

    2006-06-22

    ISS013-E-40000 (22 June 2006) --- Astronaut Jeffrey N. Williams, Expedition 13 NASA space station science officer and flight engineer, uses a computer in the Destiny laboratory of the International Space Station.

  2. The importance of employing computational resources for the automation of drug discovery.

    PubMed

    Rosales-Hernández, Martha Cecilia; Correa-Basurto, José

    2015-03-01

    The application of computational tools to drug discovery helps researchers to design and evaluate new drugs swiftly and with reduced economic resources. To discover new potential drugs, computational chemistry incorporates automatization for obtaining biological data such as absorption, distribution, metabolism, excretion and toxicity (ADMET), as well as drug mechanisms of action. This editorial looks at examples of these computational tools, including docking, molecular dynamics simulation, virtual screening, quantum chemistry, quantitative structure-activity relationship, principal component analysis and drug screening workflow systems. The authors then provide their perspectives on the importance of these techniques for drug discovery. Computational tools help researchers to design and discover new drugs for the treatment of several human diseases without side effects, thus allowing for the evaluation of millions of compounds with reduced cost in both time and economic resources. The problem is that operating each program is difficult; one is required to use several programs and understand each of the properties being tested. In the future, it is possible that a single computer and software program will be capable of evaluating the complete properties (mechanisms of action and ADMET properties) of ligands. It is also possible that after submitting one target, this computer software will be capable of suggesting potential compounds along with ways to synthesize them, and presenting biological models for testing.

  3. 3D chemical imaging in the laboratory by hyperspectral X-ray computed tomography

    PubMed Central

    Egan, C. K.; Jacques, S. D. M.; Wilson, M. D.; Veale, M. C.; Seller, P.; Beale, A. M.; Pattrick, R. A. D.; Withers, P. J.; Cernik, R. J.

    2015-01-01

    We report the development of laboratory based hyperspectral X-ray computed tomography which allows the internal elemental chemistry of an object to be reconstructed and visualised in three dimensions. The method employs a spectroscopic X-ray imaging detector with sufficient energy resolution to distinguish individual elemental absorption edges. Elemental distributions can then be made by K-edge subtraction, or alternatively by voxel-wise spectral fitting to give relative atomic concentrations. We demonstrate its application to two material systems: studying the distribution of catalyst material on porous substrates for industrial scale chemical processing; and mapping of minerals and inclusion phases inside a mineralised ore sample. The method makes use of a standard laboratory X-ray source with measurement times similar to that required for conventional computed tomography. PMID:26514938
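
    The K-edge subtraction step in the record has a simple voxel-wise form: the jump in reconstructed attenuation across an element's absorption edge is proportional to that element's local concentration. Below is a sketch on synthetic numpy volumes; all shapes and values are invented.

```python
# Voxel-wise K-edge subtraction on synthetic data: the attenuation jump
# across the element's K-edge marks where the element sits.
import numpy as np

rng = np.random.default_rng(0)
shape = (32, 32, 32)
mu_below = rng.random(shape)         # attenuation reconstructed just below the edge
element = np.zeros(shape)
element[10:20, 10:20, 10:20] = 0.5   # synthetic inclusion of the target element
mu_above = mu_below + element        # the edge jump adds to attenuation above the edge

element_map = np.clip(mu_above - mu_below, 0.0, None)  # K-edge subtraction
print("recovered inclusion mean:", element_map[10:20, 10:20, 10:20].mean())
```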

  4. Expanding Your Laboratory by Accessing Collaboratory Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, David W.; Burton, Sarah D.; Peterson, Michael R.

    2004-03-01

    The Environmental Molecular Sciences Laboratory (EMSL) in Richland, Washington, is the home of a research facility set up by the United States Department of Energy (DOE). The facility is atypical because it houses over 100 cutting-edge research systems for the use of researchers all over the United States and the world. Access to the lab is requested through a peer-review proposal process and the scientists who use the facility are generally referred to as 'users'. There are six main research facilities housed in EMSL, all of which host visiting researchers. Several of these facilities also participate in the EMSL Collaboratory, a remote access capability supported by EMSL operations funds. Of these, the High-Field Magnetic Resonance Facility (HFMRF) and Molecular Science Computing Facility (MSCF) have a significant number of their users performing remote work. The HFMRF in EMSL currently houses 12 NMR spectrometers that range in magnet field strength from 7.05T to 21.1T. Staff associated with the NMR facility offer scientific expertise in the areas of structural biology, solid-state materials/catalyst characterization, and magnetic resonance imaging (MRI) techniques. The way in which the HFMRF operates, with a high level of dedication to remote operation across the full suite of High-Field NMR spectrometers, has earned it the name “Virtual NMR Facility”. This review will focus on the operational aspects of remote research done in the High-Field Magnetic Resonance Facility and the computer tools that make remote experiments possible.

  5. GANGA: A tool for computational-task management and easy access to Grid resources

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.; Brochu, F.; Ebke, J.; Egede, U.; Elmsheuser, J.; Harrison, K.; Jones, R. W. L.; Lee, H. C.; Liko, D.; Maier, A.; Muraru, A.; Patrick, G. N.; Pajchel, K.; Reece, W.; Samset, B. H.; Slater, M. W.; Soroko, A.; Tan, C. L.; van der Ster, D. C.; Williams, M.

    2009-11-01

    In this paper, we present the computational task-management tool GANGA, which allows for the specification, submission, bookkeeping and post-processing of computational tasks on a wide set of distributed resources. GANGA has been developed to solve a problem increasingly common in scientific projects, which is that researchers must regularly switch between different processing systems, each with its own command set, to complete their computational tasks. GANGA provides a homogeneous environment for processing data on heterogeneous resources. We give examples from High Energy Physics, demonstrating how an analysis can be developed on a local system and then transparently moved to a Grid system for processing of all available data. GANGA has an API that can be used via an interactive interface, in scripts, or through a GUI. Specific knowledge about types of tasks or computational resources is provided at run-time through a plugin system, making new developments easy to integrate. We give an overview of the GANGA architecture, give examples of current use, and demonstrate how GANGA can be used in many different areas of science. Catalogue identifier: AEEN_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEN_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 224 590 No. of bytes in distributed program, including test data, etc.: 14 365 315 Distribution format: tar.gz Programming language: Python Computer: personal computers, laptops Operating system: Linux/Unix RAM: 1 MB Classification: 6.2, 6.5 Nature of problem: Management of computational tasks for scientific applications on heterogeneous distributed systems, including local, batch farms, opportunistic clusters and
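
    The record notes that backend-specific knowledge enters GANGA at run time through a plugin system. Without reproducing GANGA's actual API, the sketch below shows the generic registry pattern such a design implies; all class and function names are invented.

```python
# Generic run-time plugin registry in the spirit of a pluggable
# task-management tool; names are invented, not GANGA's actual API.
BACKENDS = {}

def register(name):
    """Class decorator that records a backend under a lookup name."""
    def deco(cls):
        BACKENDS[name] = cls
        return cls
    return deco

@register("local")
class LocalBackend:
    def submit(self, job):
        return f"ran {job} locally"

@register("grid")
class GridBackend:
    def submit(self, job):
        return f"submitted {job} to the grid"

def submit(job, backend="local"):
    """Same call site regardless of where the job actually runs."""
    return BACKENDS[backend]().submit(job)

print(submit("analysis.py"))                  # develop locally...
print(submit("analysis.py", backend="grid"))  # ...then move to the Grid
```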

  6. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    NASA Astrophysics Data System (ADS)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing, first proposed by Google in the United States, is an Internet-centered approach that provides standard, open network sharing services. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs; cloud computing's use of Internet technology to provide shared resources has therefore arrived like timely rain and become an important means of sharing digital education applications in current higher education. Based on a cloud computing environment, this paper analyzes the existing problems in sharing digital educational resources among the independent colleges of Jiangxi Province. Drawing on cloud computing's characteristics of mass storage, efficient operation, and low cost, the author explores the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the design of the sharing model is put into practical application.

  7. Knowledge Retention for Computer Simulations: A study comparing virtual and hands-on laboratories

    NASA Astrophysics Data System (ADS)

    Croom, John R., III

    The use of virtual laboratories has the potential to change physics education. These low-cost, interactive computer activities interest students, allow for easy setup, and give educators a way to teach laboratory based online classes. This study investigated whether virtual laboratories could replace traditional hands-on laboratories and whether students could retain the same long-term knowledge in virtual laboratories as compared to hands-on laboratories. This study is a quantitative quasi-experiment that used a multiple posttest design to determine if students using virtual laboratories would retain the same knowledge as students who performed hands-on laboratories after 9 weeks. The study was composed of 336 students from 14 school districts. Students had their performances on the laboratories and their retention of the laboratories compared to a series of factors that might have affected their retention using a pretest and two posttests, which were compared using a t test. The results showed no significant difference in short-term learning between the hands-on laboratory groups and virtual laboratory groups. There was, however, a significant difference (p = .005) between the groups in long-term retention; students in the hands-on laboratory groups retained more information than those in the virtual laboratory groups. These results suggest that long-term learning is enhanced when a laboratory contains a hands-on component. Finally, the results showed that both groups of students felt their particular laboratory style was superior to the alternative method. The findings of this study can be used to improve the integration of virtual laboratories into science curriculum.
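
    The group comparison described above reduces to an independent-samples t test on retention scores; the sketch below uses invented scores (not the study's data) and assumes scipy is available.

```python
# Independent-samples t test of long-term retention, hands-on vs.
# virtual laboratory groups. All scores below are invented.
from scipy import stats

hands_on = [78, 82, 75, 88, 90, 84, 79, 86]
virtual = [70, 74, 69, 80, 77, 72, 75, 71]

t, p = stats.ttest_ind(hands_on, virtual)
print(f"t = {t:.2f}, p = {p:.4f}")  # p < .05 would mirror the reported retention gap
```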

  8. Chemical decontamination technical resources at Los Alamos National Laboratory (2008)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, Murray E

    This document supplies information resources for a person seeking to create planning or pre-planning documents for chemical decontamination operations. A building decontamination plan can be separated into four different sections: Pre-planning, Characterization, Decontamination (initial response and also complete cleanup), and Clearance. The identified Los Alamos resources can be matched with these four sections: Pre-planning -- Dave Seidel, EO-EPP, Emergency Planning and Preparedness; David DeCroix and Bruce Letellier, D-3, Computational fluids modeling of structures; Murray E. Moore, RP-2, Aerosol sampling and ventilation engineering. Characterization (this can include development projects) -- Beth Perry, IAT-3, Nuclear Counterterrorism Response (SNIPER database); Fernando Garzon, MPA-11, Sensors and Electrochemical Devices (development); George Havrilla, C-CDE, Chemical Diagnostics and Engineering; Kristen McCabe, B-7, Biosecurity and Public Health. Decontamination -- Adam Stively, EO-ER, Emergency Response; Dina Matz, IHS-IP, Industrial hygiene; Don Hickmott, EES-6, Chemical cleanup. Clearance (validation) -- Larry Ticknor, CCS-6, Statistical Sciences.

  9. An integrated system for land resources supervision based on the IoT and cloud computing

    NASA Astrophysics Data System (ADS)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.
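
    One concrete piece of the record is the MapReduce-based parallel processing; its essence is a map step that emits key-value pairs and a reduce step that aggregates them per key. A toy, single-process illustration with invented data follows.

```python
# Toy MapReduce-style aggregation: count suspected land-use breaches
# per region. All data invented; a real deployment would distribute
# the map and reduce steps across a cluster.
from collections import Counter
from functools import reduce

reports = [
    {"region": "Guiyang", "breach": True},
    {"region": "Zunyi", "breach": True},
    {"region": "Guiyang", "breach": False},
    {"region": "Guiyang", "breach": True},
]

# map: emit (region, 1) for each suspected breach
mapped = [(r["region"], 1) for r in reports if r["breach"]]

# reduce: sum the counts per key
def reducer(acc, pair):
    acc[pair[0]] += pair[1]
    return acc

print(reduce(reducer, mapped, Counter()))  # Counter({'Guiyang': 2, 'Zunyi': 1})
```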

  10. Computer Software Management and Information Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PRON for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.

  11. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  12. Testing a computer-based ostomy care training resource for staff nurses.

    PubMed

    Bales, Isabel

    2010-05-01

    Fragmented teaching and ostomy care provided by nonspecialized clinicians unfamiliar with state-of-the-art care and products have been identified as problems in teaching ostomy care to the new ostomate. After conducting a literature review of theories and concepts related to the impact of nurse behaviors and confidence on ostomy care, the author developed a computer-based learning resource and assessed its effect on staff nurse confidence. Of 189 staff nurses with a minimum of 1 year of acute-care experience employed in the acute care, emergency, and rehabilitation departments of an acute care facility in the Midwestern US, 103 agreed to participate and returned completed pre- and post-tests, each comprising the same eight statements about providing ostomy care. F and P values were computed for differences between pre- and post-test scores. Based on a scale where 1 = totally disagree and 5 = totally agree with the statement, baseline confidence and perceived mean knowledge scores averaged 3.8, and after viewing the resource program post-test mean scores averaged 4.51, a statistically significant improvement (P = 0.000). The largest difference between pre- and post-test scores involved feeling confident in having the resources to learn ostomy skills independently. The availability of an electronic ostomy care resource was rated highly in both pre- and post-testing. Studies to assess the effects of increased confidence and knowledge on the quality and provision of care are warranted.

  13. Anderson uses laptop computer in the U.S. Laboratory during Joint Operations

    NASA Image and Video Library

    2007-06-13

    S117-E-07134 (12 June 2007) --- Astronaut Clayton Anderson, Expedition 15 flight engineer, uses a computer near the Microgravity Science Glovebox (MSG) in the Destiny laboratory of the International Space Station while Space Shuttle Atlantis (STS-117) was docked with the station. Astronaut Sunita Williams, flight engineer, is at right.

  14. Zebrafish Health Conditions in the China Zebrafish Resource Center and 20 Major Chinese Zebrafish Laboratories.

    PubMed

    Liu, Liyue; Pan, Luyuan; Li, Kuoyu; Zhang, Yun; Zhu, Zuoyan; Sun, Yonghua

    2016-07-01

    In China, the use of zebrafish as an experimental animal in the past 15 years has widely expanded. The China Zebrafish Resource Center (CZRC), which was established in 2012, is becoming one of the major resource centers in the global zebrafish community. Large-scale use and regular exchange of zebrafish resources have put forward higher requirements on zebrafish health issues in China. This article reports the current aquatic infrastructure design, animal husbandry, and health-monitoring programs in the CZRC. Meanwhile, through a survey of 20 Chinese zebrafish laboratories, we also describe the current health status of major zebrafish facilities in China. We conclude that it is of great importance to establish a widely accepted health standard and health-monitoring strategy in the Chinese zebrafish research community.

  15. A Text-Computer Assisted Instruction Program as a Viable Alternative for Continuing Education in Laboratory Medicine.

    ERIC Educational Resources Information Center

    Bruce, A. Wayne

    1986-01-01

    Describes reasons for developing combined text and computer assisted instruction (CAI) teaching programs for delivery of continuing education to laboratory professionals, and mechanisms used for developing a CAI program on method evaluation in the clinical laboratory. Results of an evaluation of the software's cost effectiveness and instructional…

  16. The use of computer-aided learning in chemistry laboratory instruction

    NASA Astrophysics Data System (ADS)

    Allred, Brian Robert Tracy

    This research involves developing and implementing computer software for chemistry laboratory instruction. The specific goal is to design the software and investigate whether it can be used to introduce concepts and laboratory procedures without a lecture format. This would allow students to conduct an experiment even though they may not have been introduced to the chemical concept in their lecture course. This would also allow for another type of interaction for those students who respond more positively to a visual approach to instruction. The first module developed was devoted to using computer software to help introduce students to the concepts related to thin-layer chromatography and setting up and running an experiment. This was achieved through the use of digitized pictures and digitized video clips along with written information. A review quiz was used to help reinforce the learned information. The second module was devoted to the concept of the "dry lab". This module presented students with relevant information regarding the chemical concepts and then showed them the outcome of mixing solutions. By these observations, they were to determine the composition of unknown solutions based on provided descriptions and comparison with their written observations. The third piece of the software designed was a computer game. This program followed the first two modules in providing information the students were to learn. The difference here, though, was incorporating a game scenario for students to use to help reinforce the learning. Students were then assessed to see how much information they retained after playing the game. In each of the three cases, a control group exposed to the traditional lecture format was used. Their results were compared to the experimental group using the computer modules. Based upon the findings, it can be concluded that using technology to aid in the instructional process is definitely of benefit and students were more successful in

  17. Pricing the Computing Resources: Reading Between the Lines and Beyond

    NASA Technical Reports Server (NTRS)

    Nakai, Junko; Veronico, Nick (Editor); Thigpen, William W. (Technical Monitor)

    2001-01-01

    Distributed computing systems have the potential to increase the usefulness of existing facilities for computation without adding anything physical, but that is realized only when necessary administrative features are in place. In a distributed environment, the best match is sought between a computing job to be run and a computer to run the job (global scheduling), which is a function that has not been required by conventional systems. Viewing the computers as 'suppliers' and the users as 'consumers' of computing services, markets for computing services/resources have been examined as one of the most promising mechanisms for global scheduling. We first establish why economics can contribute to scheduling. We further define the criterion for a scheme to qualify as an application of economics. Many studies to date have claimed to have applied economics to scheduling. If their scheduling mechanisms do not utilize economics, contrary to their claims, their favorable results do not contribute to the assertion that markets provide the best framework for global scheduling. We examine the well-known scheduling schemes, which concern pricing and markets, using our criterion of what application of economics is. Our conclusion is that none of the schemes examined makes full use of economics.
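
    To make the suppliers/consumers framing concrete, the sketch below clears a toy single-round market in which users bid for machines and each machine goes to its highest bidder; this is an invented illustration, not one of the schemes examined in the paper.

```python
# Toy single-round market for compute resources: each machine is
# allocated to its highest bidder at the bid price. Data invented.
bids = {
    "alice": {"machine_a": 8.0, "machine_b": 5.0},
    "bob": {"machine_a": 6.0, "machine_b": 7.0},
}

def clear_market(bids):
    machines = {m for user_bids in bids.values() for m in user_bids}
    allocation = {}
    for machine in sorted(machines):
        winner = max(bids, key=lambda u: bids[u].get(machine, 0.0))
        allocation[machine] = (winner, bids[winner].get(machine, 0.0))
    return allocation

print(clear_market(bids))
# {'machine_a': ('alice', 8.0), 'machine_b': ('bob', 7.0)}
```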

  18. Ground data systems resource allocation process

    NASA Technical Reports Server (NTRS)

    Berner, Carol A.; Durham, Ralph; Reilly, Norman B.

    1989-01-01

    The Ground Data Systems Resource Allocation Process at the Jet Propulsion Laboratory provides medium- and long-range planning for the use of Deep Space Network and Mission Control and Computing Center resources in support of NASA's deep space missions and Earth-based science. Resources consist of radio antenna complexes and associated data processing and control computer networks. A semi-automated system was developed that allows operations personnel to interactively generate, edit, and revise allocation plans spanning periods of up to ten years (as opposed to only two or three weeks under the manual system) based on the relative merit of mission events. It also enhances scientific data return. A software system known as the Resource Allocation and Planning Helper (RALPH) merges the conventional methods of operations research, rule-based knowledge engineering, and advanced data base structures. RALPH employs a generic, highly modular architecture capable of solving a wide variety of scheduling and resource sequencing problems. The rule-based RALPH system has saved significant labor in resource allocation. Its successful use affirms the importance of establishing and applying event priorities based on scientific merit, and the benefit of continuity in planning provided by knowledge-based engineering. The RALPH system exhibits a strong potential for minimizing development cycles of resource and payload planning systems throughout NASA and the private sector.
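
    RALPH's core idea of granting resources by the relative merit of mission events can be shown with a toy greedy allocator: sort requests by priority and grant each one a time window unless it conflicts with a higher-merit grant. Mission names, priorities, and windows below are invented, and this sketch is far simpler than RALPH's actual rule-based approach.

```python
# Toy merit-driven allocation of antenna time windows.
def allocate(requests):
    """requests: (mission, priority, start_hour, end_hour); higher
    priority wins, overlapping lower-priority requests are skipped."""
    granted = []
    for mission, _prio, start, end in sorted(requests, key=lambda r: -r[1]):
        if all(end <= s or start >= e for _, s, e in granted):
            granted.append((mission, start, end))
    return granted

requests = [
    ("Voyager", 9, 2, 6),
    ("Magellan", 7, 4, 8),  # overlaps Voyager; lower merit, skipped
    ("Galileo", 8, 8, 12),
]
print(allocate(requests))  # [('Voyager', 2, 6), ('Galileo', 8, 12)]
```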

  19. [Human resource capacity building on TB laboratory work for TB control program--through the experience of international TB laboratory training course for TB control at the Research Institute of Tuberculosis, JATA, Japan].

    PubMed

    Fujiki, Akiko; Kato, Seiya

    2008-06-01

    The international training course on TB laboratory work for national tuberculosis program (NTP) has been conducted at the Research Institute of Tuberculosis since 1975 funded by Japan International Cooperation Agency in collaboration with WHO Western Pacific Regional Office. The aim of the course is to train key personnel in TB laboratory field for NTP in resource-limited countries. The course has trained 265 national key personnel in TB laboratory service from 57 resource-limited countries in the last 33 years. The number of participants trained may sound too small in the fight against the large TB problem in resource-limited countries. However, every participant is playing an important role as a core and catalyst for the TB control program in his/her own country when they were back home. The curriculum is composed of technical aspects on TB examination, mainly sputum microscopy in addition since microscopy service is provided at many centers that are deployed in a widely spread area, the managerial aspect of maintaining quality TB laboratory work at the field laboratory is another component of the curriculum. Effective teaching methods using materials such as artificial sputum, which is useful for panel slide preparation, and technical manuals with illustrations and pictures of training procedure have been developed through the experience of the course. These manuals are highly appreciated and widely used by the front line TB workers. The course has also contributed to the expansion of EQA (External Quality Assessment) system on AFB microscopy for the improvement of the quality of TB laboratory service of NTP. The course is well-known for not only having a long history, but also for its unique learning method emphasizing "Participatory Training", particularly for practicum sessions to master the skills on AFB microscopy. The method in learning AFB microscopy, which was developed by the course, was published as a training manual by IUATLD, RIT and USAID. As it is

  20. The WHO/PEPFAR collaboration to prepare an operations manual for HIV prevention, care, and treatment at primary health centers in high-prevalence, resource-constrained settings: defining laboratory services.

    PubMed

    Spira, Thomas; Lindegren, Mary Lou; Ferris, Robert; Habiyambere, Vincent; Ellerbrock, Tedd

    2009-06-01

    The expansion of HIV/AIDS care and treatment in resource-constrained countries, especially in sub-Saharan Africa, has generally developed in a top-down manner. Further expansion will involve primary health centers where human and other resources are limited. This article describes the World Health Organization/President's Emergency Plan for AIDS Relief collaboration formed to help scale up HIV services in primary health centers in high-prevalence, resource-constrained settings. It reviews the contents of the Operations Manual developed, with emphasis on the Laboratory Services chapter, which discusses essential laboratory services, both at the center and the district hospital level, laboratory safety, laboratory testing, specimen transport, how to set up a laboratory, human resources, equipment maintenance, training materials, and references. The chapter provides specific information on essential tests and generic job aids for them. It also includes annexes containing a list of laboratory supplies for the health center and sample forms.

  1. Adolescents, Health Education, and Computers: The Body Awareness Resource Network (BARN).

    ERIC Educational Resources Information Center

    Bosworth, Kris; And Others

    1983-01-01

    The Body Awareness Resource Network (BARN) is a computer-based system designed as a confidential, nonjudgmental source of health information for adolescents. Topics include alcohol and other drugs, diet and activity, family communication, human sexuality, smoking, and stress management; programs are available for high school and middle school…

  2. U.S. hydropower resource assessment for Idaho

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conner, A.M.; Francfort, J.E.

    1998-08-01

    The US Department of Energy is developing an estimate of the undeveloped hydropower potential in the US. The Hydropower Evaluation Software (HES) is a computer model that was developed by the Idaho National Engineering and Environmental Laboratory for this purpose. HES measures the undeveloped hydropower resources available in the US, using uniform criteria for measurement. The software was developed and tested using hydropower information and data provided by the Southwestern Power Administration. It is a menu-driven program that allows the personal computer user to assign environmental attributes to potential hydropower sites, calculate development suitability factors for each site based on the environmental attributes present, and generate reports based on these suitability factors. This report describes the resource assessment results for the State of Idaho.
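
    The suitability calculation the record describes, in which environmental attributes reduce a site's development factor, can be sketched as a simple weighted product; the attribute names and penalty weights below are invented, not HES's actual criteria.

```python
# Hypothetical HES-style suitability factor: start fully suitable
# (1.0) and discount for each environmental attribute present.
PENALTIES = {
    "wild_scenic_river": 0.9,
    "threatened_species": 0.75,
    "cultural_value": 0.5,
}

def suitability(site_attributes):
    factor = 1.0
    for attr in site_attributes:
        factor *= 1.0 - PENALTIES.get(attr, 0.0)
    return factor

print(suitability(["cultural_value"]))                       # 0.5
print(suitability(["wild_scenic_river", "cultural_value"]))  # 0.05
```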

  3. Improved dissection efficiency in the human gross anatomy laboratory by the integration of computers and modern technology.

    PubMed

    Reeves, Rustin E; Aschenbrenner, John E; Wordinger, Robert J; Roque, Rouel S; Sheedlo, Harold J

    2004-05-01

    The need to increase the efficiency of dissection in the gross anatomy laboratory has been the driving force behind the technologic changes we have recently implemented. With the introduction of an integrated systems-based medical curriculum and a reduction in laboratory teaching hours, anatomy faculty at the University of North Texas Health Science Center (UNTHSC) developed a computer-based dissection manual to adjust to these curricular changes and time constraints. At each cadaver workstation, Apple iMac computers were added and a new dissection manual, running in a browser-based format, was installed. Within the text of the manual, anatomical structures required for dissection were linked to digital images from prosected materials; in addition, for each body system, the dissection manual included images from cross sections, radiographs, CT scans, and histology. Although we have placed a high priority on computerization of the anatomy laboratory, we remain strong advocates of the importance of cadaver dissection. It is our belief that the utilization of computers for dissection is a natural evolution of technology and fosters creative teaching strategies adapted for anatomy laboratories in the 21st century. Our strategy has significantly enhanced the independence and proficiency of our students, the efficiency of their dissection time, and the quality of laboratory instruction by the faculty. Copyright 2004 Wiley-Liss, Inc.

  4. Justification of Filter Selection for Robot Balancing in Conditions of Limited Computational Resources

    NASA Astrophysics Data System (ADS)

    Momot, M. V.; Politsinskaia, E. V.; Sushko, A. V.; Semerenko, I. A.

    2016-08-01

    The paper considers the problem of selecting a mathematical filter for balancing a wheeled robot under conditions of limited computational resources. A solution based on a complementary filter is proposed.
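
    A complementary filter suits this setting because it fuses the gyroscope's drift-prone but smooth angle integral with the accelerometer's noisy but drift-free angle at almost no computational cost. A minimal sketch of the update step follows; the gain and sensor readings are invented.

```python
# Minimal complementary filter for a balancing robot's tilt estimate.
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (short-term accuracy) with the
    accelerometer angle (long-term reference) to cancel drift."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

angle = 0.0
samples = [(0.5, 0.02), (0.4, 0.03), (-0.1, 0.025)]  # (gyro deg/s, accel angle deg)
for gyro_rate, accel_angle in samples:
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt=0.01)
    print(f"estimated tilt: {angle:.5f} deg")
```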

  5. Resource Aware Intelligent Network Services (RAINS) Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehman, Tom; Yang, Xi

    generate, maintain, and distribute MRML based resource descriptions. Once all of the resource topologies are absorbed by the RCE, a connected graph of the full distributed system topology is constructed, which forms the basis for computation and workflow processing. The RCE includes a Modular Computation Element (MCE) framework which allows for tailoring of the computation process to the specific set of resources under control, and the services desired. The input and output of an MCE are both model data based on the MRS/MRML ontology and schema. Some of the RAINS project accomplishments include: development of a general and extensible multi-resource modeling framework; design of a Resource Computation Engine (RCE) system with key capabilities including the absorption of a variety of multi-resource model types to build integrated models, a novel architecture which uses model based communications across the full stack, flexible provision of abstract or intent based user facing interfaces, and workflow processing based on model descriptions; release of the RCE as open source software; deployment of the RCE in the University of Maryland/Mid-Atlantic Crossroad ScienceDMZ in prototype mode with a plan under way to transition to production; deployment at the Argonne National Laboratory DTN Facility in prototype mode; and selection of the RCE by the DOE SENSE (SDN for End-to-end Networked Science at the Exascale) project as the basis for their orchestration service.

  6. SARANA: language, compiler and run-time system support for spatially aware and resource-aware mobile computing.

    PubMed

    Hari, Pradip; Ko, Kevin; Koukoumidis, Emmanouil; Kremer, Ulrich; Martonosi, Margaret; Ottoni, Desiree; Peh, Li-Shiuan; Zhang, Pei

    2008-10-28

    Increasingly, spatial awareness plays a central role in many distributed and mobile computing applications. Spatially aware applications rely on information about the geographical position of compute devices and their supported services in order to support novel functionality. While many spatial application drivers already exist in mobile and distributed computing, very little systems research has explored how best to program these applications, to express their spatial and temporal constraints, and to allow efficient implementations on highly dynamic real-world platforms. This paper proposes the SARANA system architecture, which includes language and run-time system support for spatially aware and resource-aware applications. SARANA allows users to express spatial regions of interest, as well as trade-offs between quality of result (QoR), latency and cost. The goal is to produce applications that use resources efficiently and that can be run on diverse resource-constrained platforms ranging from laptops to personal digital assistants and to smart phones. SARANA's run-time system manages QoR and cost trade-offs dynamically by tracking resource availability and locations, brokering usage/pricing agreements and migrating programs to nodes accordingly. A resource cost model permeates the SARANA system layers, permitting users to express their resource needs and QoR expectations in units that make sense to them. Although we are still early in the system development, initial versions have been demonstrated on a nine-node system prototype.
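
    The QoR/latency/cost brokering the record describes can be reduced to a toy selection rule: among nodes meeting the user's quality and latency bounds, pick the cheapest. Node names, units, and numbers below are invented, not SARANA's actual cost model.

```python
# Toy broker: cheapest node satisfying quality-of-result (QoR)
# and latency constraints. All values invented.
nodes = [
    {"name": "phone", "qor": 0.60, "latency_s": 1.0, "cost": 0.1},
    {"name": "laptop", "qor": 0.80, "latency_s": 0.5, "cost": 0.5},
    {"name": "server", "qor": 0.95, "latency_s": 0.2, "cost": 2.0},
]

def broker(nodes, min_qor, max_latency_s):
    feasible = [n for n in nodes
                if n["qor"] >= min_qor and n["latency_s"] <= max_latency_s]
    return min(feasible, key=lambda n: n["cost"]) if feasible else None

print(broker(nodes, min_qor=0.75, max_latency_s=0.6))  # the laptop wins on cost
```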

  7. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    PubMed

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience, approaching subjective behavior as the result of mental computations instantiated in the brain, to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.
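
    To make "confidence as an objective statistical quantity" concrete, a two-hypothesis Bayesian observer reports the posterior probability that its chosen hypothesis is correct given the evidence. The sketch below assumes equal-prior Gaussian evidence distributions; the parameters are illustrative and are not taken from the paper.

        import math

        def confidence(evidence, mu0=-1.0, mu1=1.0, sigma=1.0):
            """Posterior probability that the chosen hypothesis is correct,
            for equal-prior Gaussian evidence under H0 (mean mu0) and H1 (mean mu1)."""
            l0 = math.exp(-0.5 * ((evidence - mu0) / sigma) ** 2)   # likelihood under H0
            l1 = math.exp(-0.5 * ((evidence - mu1) / sigma) ** 2)   # likelihood under H1
            p1 = l1 / (l0 + l1)        # posterior probability of H1
            return max(p1, 1.0 - p1)   # probability that the *chosen* hypothesis is right

        print(confidence(0.2))   # weak evidence   -> confidence near 0.5
        print(confidence(2.5))   # strong evidence -> confidence near 1.0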

  8. Emotor control: computations underlying bodily resource allocation, emotions, and confidence

    PubMed Central

    Kepecs, Adam; Mensh, Brett D.

    2015-01-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience, approaching subjective behavior as the result of mental computations instantiated in the brain, to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This “emotor” control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on “confidence.” Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior. PMID:26869840

  9. Modified stretched exponential model of computer system resources management limitations-The case of cache memory

    NASA Astrophysics Data System (ADS)

    Strzałka, Dominik; Dymora, Paweł; Mazurek, Mirosław

    2018-02-01

    In this paper we present some preliminary results in the field of computer systems management in relation to Tsallis thermostatistics and the ubiquitous problem of hardware-limited resources. In the case of systems with non-deterministic behaviour, management of their resources is a key point that guarantees their acceptable performance and proper working. This is a very wide problem that poses many challenges in areas such as finance, transport, water and food, and health. We focus on computer systems, with attention paid to cache memory, and propose an analytical model that is able to connect the non-extensive entropy formalism, long-range dependencies, management of system resources, and queuing theory. The analytical results are related to a practical experiment, showing interesting and valuable results.
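
    For reference, the two functional forms the abstract connects are, in standard notation, the stretched exponential and the Tsallis q-exponential; the paper's modified model is not reproduced here, so the textbook forms below are an interpretive aid only:

        f(t) = \exp\left[ -\left( t/\tau \right)^{\beta} \right], \quad 0 < \beta \le 1,

        e_q(x) = \left[ 1 + (1-q)\,x \right]^{1/(1-q)}, \quad e_q(x) \to e^{x} \text{ as } q \to 1,

    where β stretches the exponential relaxation and q measures the departure from extensive (Boltzmann-Gibbs) statistics.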

  10. Design and implementation of a hospital-based usability laboratory: insights from a Department of Veterans Affairs laboratory for health information technology.

    PubMed

    Russ, Alissa L; Weiner, Michael; Russell, Scott A; Baker, Darrell A; Fahner, W Jeffrey; Saleem, Jason J

    2012-12-01

    Although the potential benefits of more usable health information technologies (HIT) are substantial (reduced HIT support costs, increased work efficiency, and improved patient safety), human factors methods to improve usability are rarely employed. The US Department of Veterans Affairs (VA) has emerged as an early leader in establishing usability laboratories to inform the design of HIT, including its electronic health record. Experience with a usability laboratory at a VA Medical Center provides insights on how to design, implement, and leverage usability laboratories in the health care setting. The VA Health Services Research and Development Service Human-Computer Interaction & Simulation Laboratory emerged as one of the first VA usability laboratories and was intended to provide research-based findings about HIT designs. This laboratory supports rapid prototyping, formal usability testing, and analysis tools to assess existing technologies, alternative designs, and potential future technologies. Although the laboratory has maintained a research focus, it has become increasingly integrated with VA operations, both within the medical center and on a national VA level. With this resource, data-driven recommendations have been provided for the design of HIT applications before and after implementation. The demand for usability testing of HIT is increasing, and information on how to develop usability laboratories for the health care setting is often needed. This article may assist other health care organizations that want to invest in usability resources to improve HIT. The establishment and utilization of usability laboratories in the health care setting may improve HIT designs and promote safe, high-quality care for patients.

  11. LINCS: Livermore's network architecture. [Octopus computing network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.

    1982-01-01

    Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of LINCS protocol, the reasons why we have designed our own protocols, and why we are dissatisfied with the directions that current protocol standards are taking.

  12. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.
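
    The adaptive pattern described here, monitor the real-time path, detect requirement violations, and reallocate work, can be sketched in a few lines. The functions, thresholds, and node table below are hypothetical placeholders, not the DeSiDeRaTa middleware interfaces.

        import random

        DEADLINE_S = 0.5   # end-to-end latency requirement (assumed)

        def measure_path_latency():
            """Placeholder for timing the instrument-to-user real-time path."""
            return random.uniform(0.1, 0.9)

        def least_loaded(nodes):
            """Placeholder for the resource broker's allocation decision."""
            return min(nodes, key=lambda n: n["load"])

        def manage(nodes, steps=5):
            """Periodically check the real-time path; migrate on a deadline violation."""
            for _ in range(steps):
                latency = measure_path_latency()
                if latency > DEADLINE_S:
                    target = least_loaded(nodes)
                    print(f"violation ({latency:.2f} s): migrate workload to {target['name']}")

        manage([{"name": "node-a", "load": 0.7}, {"name": "node-b", "load": 0.2}])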

  13. Computation Directorate Annual Report 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L; McGraw, J R; Ashby, S F

    employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with architecture modeled on MCR. In preparation for the "after-next" computer, we are researching technology even farther out on the horizon: cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.

  14. Brain-computer interaction research at the Computer Vision and Multimedia Laboratory, University of Geneva.

    PubMed

    Pun, Thierry; Alecu, Teodor Iulian; Chanel, Guillaume; Kronegg, Julien; Voloshynovskiy, Sviatoslav

    2006-06-01

    This paper describes the work being conducted in the domain of brain-computer interaction (BCI) at the Multimodal Interaction Group, Computer Vision and Multimedia Laboratory, University of Geneva, Geneva, Switzerland. The application focus of this work is on multimodal interaction rather than on rehabilitation, that is, how to augment classical interaction by means of physiological measurements. Three main research topics are addressed. The first one concerns the more general problem of brain source activity recognition from EEGs. In contrast with classical deterministic approaches, we studied iterative, robust, stochastic reconstruction procedures that model source and noise statistics, to overcome known limitations of current techniques. We also developed procedures for optimal electroencephalogram (EEG) sensor system design in terms of placement and number of electrodes. The second topic is the study of BCI protocols and performance from an information-theoretic point of view. Various information rate measurements have been compared for assessing BCI abilities. The third research topic concerns the use of EEG and other physiological signals for assessing a user's emotional status.

  15. Strategies for combining physics videos and virtual laboratories in the training of physics teachers

    NASA Astrophysics Data System (ADS)

    Dickman, Adriana; Vertchenko, Lev; Martins, Maria Inés

    2007-03-01

    Among the multimedia resources used in physics education, the most prominent are virtual laboratories and videos. On one hand, computer simulations and applets have very attractive graphic interfaces, showing an incredible amount of detail and movement. On the other hand, videos offer the possibility of displaying high-quality images, and are becoming more feasible with the increasing availability of digital resources. We believe it is important to discuss, throughout the teacher training program, both the functionality of information and communication technology (ICT) in physics education and the varied applications of these resources. In our work we suggest introducing ICT resources in a sequence that integrates these important tools in the teacher training program, as opposed to the traditional approach, in which virtual laboratories and videos are introduced separately. In this perspective, when we introduce and utilize virtual laboratory techniques we also provide for their use in videos, taking advantage of graphic interfaces. Thus the students in our program learn to use instructional software in the production of videos for classroom use.

  16. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

    Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails utilizing a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing hydrogeological characteristics of the field. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions, and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a modeling framework for the overall error based on a joint statistical and numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified in this study by applying it to several computationally extensive examples. Having this framework at hand helps hydrogeologists achieve the optimum physical and statistical resolutions to minimize the error within a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions are investigated. We conclude that the computational
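
    A common way to formalize this trade-off (sketched here in generic form, since the paper's exact error model is not reproduced) is to write the overall error as the sum of a discretization term and a Monte Carlo sampling term and minimize it under a fixed budget:

        E(h, N) = C_1 h^{p} + \frac{C_2}{\sqrt{N}}, \qquad \text{subject to} \quad N \, W(h) \le T,

    where h is the grid spacing, p the order of the discretization, N the number of Monte Carlo realizations, W(h) the cost of a single realization, and T the computational budget; the constants C_1 and C_2 are problem-dependent. Refining the grid (smaller h) leaves fewer affordable realizations, so the minimizer balances physical and statistical error.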

  17. The relative effectiveness of computer-based and traditional resources for education in anatomy.

    PubMed

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional methods. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), and (3) a plastic model. We conducted a controlled trial in which 60 undergraduate students had ten minutes to study the names of 20 different pelvic structures. The outcome measure was a 25-item short answer test consisting of 15 nominal and 10 functional questions, based on a cadaveric pelvis. All subjects also took a brief mental rotations test (MRT) as a measure of spatial ability, used as a covariate in the analysis. Data were analyzed with repeated measures ANOVA. The group learning from the model performed significantly better than the other two groups on the nominal questions (Model 67%; KV 40%; VR 41%; effect sizes 1.19 and 1.29, respectively). There was no difference between the KV and VR groups. There was no difference between the groups on the functional questions (Model 28%; KV 23%; VR 25%). Computer-based learning resources appear to have significant disadvantages compared to traditional specimens in learning nominal anatomy. Consistent with previous research, virtual reality shows no advantage over static presentation of key views. © 2013 American Association of Anatomists.

  18. Routine operation of an Elliott 903 computer in a clinical chemistry laboratory

    PubMed Central

    Whitby, L. G.; Simpson, D.

    1973-01-01

    Experience gained in the last four years concerning the capabilities and limitations of an 8K Elliott 903 (18-bit word) computer with magnetic tape backing store in the routine operation of a clinical chemistry laboratory is described. Designed as a total system, routine operation has latterly had to be confined to data acquisition and process control functions, due primarily to limitations imposed by the choice of hardware early in the project. In this final report of a partially successful experiment the opportunity is taken to review mistakes made, especially at the start of the project, to warn potential computer users of pitfalls to be avoided. PMID:4580240

  19. Student teaching and research laboratory focusing on brain-computer interface paradigms--A creative environment for computer science students.

    PubMed

    Rutkowski, Tomasz M

    2015-08-01

    This paper presents an applied concept of a brain-computer interface (BCI) student research laboratory (BCI-LAB) at the Life Science Center of TARA, University of Tsukuba, Japan. Several successful case studies of student projects are reviewed, together with the BCI Research Award 2014 winner case. The BCI-LAB design and project-based teaching philosophy are also explained. The review concludes with future teaching and research directions.

  20. VirTUal remoTe labORatories managEment System (TUTORES): Using Cloud Computing to Acquire University Practical Skills

    ERIC Educational Resources Information Center

    Caminero, Agustín C.; Ros, Salvador; Hernández, Roberto; Robles-Gómez, Antonio; Tobarra, Llanos; Tolbaños Granjo, Pedro J.

    2016-01-01

    The use of practical laboratories is key in engineering education in order to provide our students with the resources needed to acquire practical skills. This is especially true in the case of distance education, where no physical interactions between lecturers and students take place, so virtual or remote laboratories must be used. UNED has…

  1. Introduction of Digital Computer Technology Into the Undergraduate Chemistry Laboratory. Final Technical Report.

    ERIC Educational Resources Information Center

    Perone, Sam P.

    The objective of this project has been the development of a successful approach for the incorporation of on-line computer technology into the undergraduate chemistry laboratory. This approach assumes no prior programing, electronics or instrumental analysis experience on the part of the student; it does not displace the chemistry content with…

  2. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem-solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  3. Cost-Benefit Analysis of Computer Resources for Machine Learning

    USGS Publications Warehouse

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms; in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
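
    The point of diminishing returns can be exposed by measuring goodness-of-fit and run time as the calibration sample grows. The sketch below does this with a toy Parzen-style (kernel) estimator standing in for the PNN; the data, bandwidth, and sample sizes are invented for illustration and are not the report's code.

        import time
        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 1, 5000)
        y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)   # toy pattern to learn

        def fit_error(n, bandwidth=0.05):
            """Goodness-of-fit of a kernel estimator calibrated on n sampled points."""
            idx = rng.choice(x.size, size=n, replace=False)
            xs, ys = x[idx], y[idx]
            grid = np.linspace(0, 1, 200)
            w = np.exp(-0.5 * ((grid[:, None] - xs[None, :]) / bandwidth) ** 2)
            pred = (w @ ys) / w.sum(axis=1)
            return float(np.mean((pred - np.sin(2 * np.pi * grid)) ** 2))

        for n in (50, 200, 800, 3200):   # cost grows steadily; benefit levels off
            t0 = time.perf_counter()
            err = fit_error(n)
            print(n, round(err, 5), f"{time.perf_counter() - t0:.4f} s")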

  4. Discussing sexual and relationship health with young people in a children's hospital: evaluation of a computer-based resource.

    PubMed

    Bray, Lucy; Sanders, Caroline; McKenna, Jacqueline

    2013-12-01

    To investigate health professionals' evaluation of a computer-based resource designed to improve discussions about sexual and relationship health with young people. Evidence suggests that some health professionals can experience discomfort discussing sexual health and relationship issues with young people. Professionals within hospital settings should have the knowledge, competencies and skills to be able to ask young people sexual health questions and provide accurate sexual health education. Despite some educational material being available for community and adult services, there are no resources directly relevant to holding opportunistic discussions with young people within an acute children's hospital. A descriptive survey design. One hundred and fourteen health professionals from a children's hospital in the UK were involved in evaluating a computer-based resource. All completed an online questionnaire survey comprising closed and open questions. The health professionals reported that the computer-based resource had a positive influence on their knowledge and clinical practice. The videos, as well as the concise nature of the resource, were evaluated highly. Learning was facilitated by professionals being able to control their learning through rerunning and accessing the resource on numerous occasions. An engaging, accessible computer-based resource has the capability to positively impact health professionals' knowledge of, and skills in, starting and holding sexual health conversations with young people accessing a children's hospital. Health professionals working with children and young people value accessible, relevant and short computer-based training. This can facilitate knowledge and skill acquisition despite variation in working patterns. Improving the knowledge and skills of professionals working with young people to facilitate appropriate yet opportunistic sexual health discussions is important within the public health agenda.

  5. City University of New York--Availability of Student Computer Resources. Report.

    ERIC Educational Resources Information Center

    McCall, H. Carl

    This audit reports on the availability of computer resources at the City University of New York's (CUNY) senior colleges. CUNY is the largest urban and the third largest public university system in the United States. Of the 19 CUNY campuses located throughout the five boroughs, 11 are senior colleges offering four-year degrees. For the fall 2001…

  6. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    NASA Astrophysics Data System (ADS)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo optimization, we explore the global latency for optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease of performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing defines a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
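
    To make the Metropolis step concrete, the sketch below anneals a random assignment of tasks to nodes, accepting latency-increasing moves with probability exp(-ΔL/T). The graph, task set, and latency function are toy assumptions, not the paper's model; the point is only the optimal-to-suboptimal sweep as T grows.

        import math
        import random

        random.seed(1)
        NODES = list(range(8))
        TASKS = [(random.choice(NODES), random.choice(NODES)) for _ in range(20)]  # (origin, dest)

        def latency(assign):
            """Toy global latency: distance to reach the compute node plus node overload."""
            load = {n: 0 for n in NODES}
            total = 0
            for (o, d), n in zip(TASKS, assign):
                load[n] += 1
                total += abs(o - n) + abs(n - d)               # toy "hop" distance
            return total + sum(l * l for l in load.values())   # quadratic overload penalty

        def metropolis(T, steps=20000):
            assign = [random.choice(NODES) for _ in TASKS]
            cur = latency(assign)
            for _ in range(steps):
                i, n = random.randrange(len(TASKS)), random.choice(NODES)
                old = assign[i]
                assign[i] = n
                new = latency(assign)
                if new > cur and random.random() >= math.exp((cur - new) / T):
                    assign[i] = old   # reject the latency-increasing move
                else:
                    cur = new         # accept (always, if latency decreased)
            return cur

        print(metropolis(T=0.5), metropolis(T=50.0))  # low T: near-optimal; high T: suboptimal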

  7. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Frederick National Laboratory for Cancer Research

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict dru

  8. Performance Audit of the U.S. Geological Survey, Energy Resource Program Inorganic Geochemistry Laboratory

    USGS Publications Warehouse

    Luppens, James A.; Janke, Louis G.; McCord, Jamey D.; Bullock, John H.; Brazeau, Lisa; Affronter, Ronald H.

    2007-01-01

    A performance audit of the U.S. Geological Survey (USGS), Energy Resource Program (ERP) Inorganic Geochemistry Laboratory (IGL) was conducted between August, 2003 and October, 2005. The goals were to ensure that a high level of analytical performance was maintained and identify any areas that could be enhanced. The audit was subdivided into three phases. Phase 1 was a preliminary assessment of current performance based on recent performance on CANSPEX samples. IGL performance was also compared to laboratories world-wide with similar scope. Phase 2 consisted of the implementation of the recommended changes made in Phase 1. Phase 3 of the audit consisted of a reassessment effort to evaluate the effectiveness of the recommendations made in the Phase 1 and an on-site audit of the laboratory facilities. Phases 1 and 3 required summary reports that are included in Appendices A and B of this report. The audit found that the IGL was one of the top two laboratories compared for trace element analyses. Several recommendations to enhance performance on major and minor elemental parameters were made and implemented. Demonstrated performance improvements as a result of the recommended changes were documented. Several initiatives to sustain the performance improvements gained from the audit have been implemented.

  9. Cultural Resource Protection Plan for the Remote-Handled Low-Level Waste Disposal Facility at the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pace, Brenda Ringe; Gilbert, Hollie Kae

    2015-05-01

    This plan addresses cultural resource protection procedures to be implemented during construction of the Remote Handled Low Level Waste project at the Idaho National Laboratory. The plan proposes pre-construction review of proposed ground disturbing activities to confirm avoidance of cultural resources. Depending on the final project footprint, cultural resource protection strategies might also include additional survey, protective fencing, cultural resource mapping and relocation of surface artifacts, collection of surface artifacts for permanent curation, confirmation of undisturbed historic canal segments outside the area of potential effects for construction, and/or archaeological test excavations to assess potential subsurface cultural deposits at known cultural resource locations. Additionally, all initial ground disturbing activities will be monitored for subsurface cultural resource finds, cultural resource sensitivity training will be conducted for all construction field personnel, and a stop work procedure will be implemented to guide assessment and protection of any unanticipated discoveries after initial monitoring of ground disturbance.

  10. Critical role of developing national strategic plans as a guide to strengthen laboratory health systems in resource-poor settings.

    PubMed

    Nkengasong, John N; Mesele, Tsehaynesh; Orloff, Sherry; Kebede, Yenew; Fonjungo, Peter N; Timperi, Ralph; Birx, Deborah

    2009-06-01

    Medical laboratory services are an essential, yet often neglected, component of health systems in developing countries. Their central role in public health, disease control and surveillance, and patient management is often poorly recognized by governments and donors. However, medical laboratory services in developing countries can be strengthened by leveraging funding from other sources of HIV/AIDS prevention, care, surveillance, and treatment programs. Strengthening these services will require coordinated efforts by national governments and partners and can be achieved by establishing and implementing national laboratory strategic plans and policies that integrate laboratory systems to combat major infectious diseases. These plans should take into account policy, legal, and regulatory frameworks; the administrative and technical management structure of the laboratories; human resources and retention strategies; laboratory quality management systems; monitoring and evaluation systems; procurement and maintenance of equipment; and laboratory infrastructure enhancement. Several countries have developed or are in the process of developing their laboratory plans, and others, such as Ethiopia, have implemented and evaluated their plan.

  11. Known structure, unknown function: An inquiry-based undergraduate biochemistry laboratory course.

    PubMed

    Gray, Cynthia; Price, Carol W; Lee, Christopher T; Dewald, Alison H; Cline, Matthew A; McAnany, Charles E; Columbus, Linda; Mura, Cameron

    2015-01-01

    Undergraduate biochemistry laboratory courses often do not provide students with an authentic research experience, particularly when the express purpose of the laboratory is purely instructional. However, an instructional laboratory course that is inquiry- and research-based could simultaneously impart scientific knowledge and foster a student's research expertise and confidence. We have developed a year-long undergraduate biochemistry laboratory curriculum wherein students determine, via experiment and computation, the function of a protein of known three-dimensional structure. The first half of the course is inquiry-based and modular in design; students learn general biochemical techniques while gaining preparation for research experiments in the second semester. Having learned standard biochemical methods in the first semester, students independently pursue their own (original) research projects in the second semester. This new curriculum has yielded an improvement in student performance and confidence as assessed by various metrics. To disseminate teaching resources to students and instructors alike, a freely accessible Biochemistry Laboratory Education resource is available at http://biochemlab.org. © 2015 The Authors Biochemistry and Molecular Biology Education published by Wiley Periodicals, Inc. on behalf of International Union of Biochemistry and Molecular Biology.

  12. Known structure, unknown function: An inquiry‐based undergraduate biochemistry laboratory course

    PubMed Central

    Gray, Cynthia; Price, Carol W.; Lee, Christopher T.; Dewald, Alison H.; Cline, Matthew A.; McAnany, Charles E.

    2015-01-01

    Undergraduate biochemistry laboratory courses often do not provide students with an authentic research experience, particularly when the express purpose of the laboratory is purely instructional. However, an instructional laboratory course that is inquiry‐ and research‐based could simultaneously impart scientific knowledge and foster a student's research expertise and confidence. We have developed a year‐long undergraduate biochemistry laboratory curriculum wherein students determine, via experiment and computation, the function of a protein of known three‐dimensional structure. The first half of the course is inquiry‐based and modular in design; students learn general biochemical techniques while gaining preparation for research experiments in the second semester. Having learned standard biochemical methods in the first semester, students independently pursue their own (original) research projects in the second semester. This new curriculum has yielded an improvement in student performance and confidence as assessed by various metrics. To disseminate teaching resources to students and instructors alike, a freely accessible Biochemistry Laboratory Education resource is available at http://biochemlab.org. © 2015 The Authors Biochemistry and Molecular Biology Education published by Wiley Periodicals, Inc. on behalf of International Union of Biochemistry and Molecular Biology, 43(4):245–262, 2015. PMID:26148241

  13. Evidence-based approach to the maintenance of laboratory and medical equipment in resource-poor settings.

    PubMed

    Malkin, Robert; Keane, Allison

    2010-07-01

    Much of the laboratory and medical equipment in resource-poor settings is out of service. The most commonly cited reasons are (1) a lack of spare parts and (2) a lack of highly trained technicians. However, there is little data to support these hypotheses or to generate evidence-based solutions to the problem. We studied 2,849 equipment-repair requests (of which 2,529 were for out-of-service medical equipment) from 60 resource-poor hospitals located in 11 nations in Africa, Europe, Asia, and Central America. Each piece of equipment was analyzed by an engineer or an engineering student and a repair was attempted using only locally available materials. If the piece was placed back into service, we assumed that the engineer's problem analysis was correct. A total of 1,821 pieces of medical equipment were placed back into service, or 72%, without requiring the use of imported spare parts. Of those pieces repaired, 1,704 were sufficiently documented to determine what knowledge was required to place the equipment back into service. We found that six domains of knowledge were required to accomplish 99% of the repairs: electrical (18%), mechanical (18%), power supply (14%), plumbing (19%), motors (5%), and installation or user training (25%). A further analysis of the domains shows that 66% of the out-of-service equipment was placed back into service using only 107 skills covering basic knowledge in each domain; far less knowledge than that required of a biomedical engineer or biomedical engineering technician. We conclude that a great majority of laboratory and medical equipment can be put back into service without importing spare parts and using only basic knowledge. Capacity building in resource-poor settings should first focus on a limited set of knowledge; a body of knowledge that we call the biomedical technician's assistant (BTA). This data set suggests that a supported BTA could place 66% of the out-of-service laboratory and medical equipment in their hospital back into service.

  14. A validated methodology for determination of laboratory instrument computer interface efficacy

    NASA Astrophysics Data System (ADS)

    1984-12-01

    This report is intended to provide a methodology for determining when, and for which instruments, direct interfacing of laboratory instruments and laboratory computers is beneficial. This methodology has been developed to assist the Tri-Service Medical Information Systems Program Office in making future decisions regarding laboratory instrument interfaces. We have calculated the time savings required to reach a break-even point for a range of instrument interface prices and corresponding average annual costs. The break-even analyses used empirical data to estimate the number of data points run per day that are required to meet the break-even point. The results indicate, for example, that at a purchase price of $3,000, an instrument interface will be cost-effective if the instrument is utilized for at least 154 data points per day when operated in the continuous mode, or 216 points per day when operated in the discrete mode. Although this model can help to ensure that instrument interfaces are cost-effective, additional information should be considered in making interface decisions. A reduction in results transcription errors may be a major benefit of instrument interfacing.
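
    The break-even logic reduces to simple arithmetic: the interface pays for itself once the daily transcription time saved, valued at the technologist's labor rate over the amortization period, covers the purchase price. The sketch below reproduces that calculation; the per-point time saving, labor rate, and amortization period are assumed values for illustration, not the report's figures, so the output does not match the 154/216 points quoted above.

        def breakeven_points_per_day(price, secs_saved_per_point, labor_rate_per_hr, days):
            """Daily data points at which the saved labor repays the interface price."""
            saving_per_point = (secs_saved_per_point / 3600.0) * labor_rate_per_hr
            return price / (saving_per_point * days)

        # e.g. a $3,000 interface amortized over ~1,300 working days (5 years),
        # saving 15 s of transcription per data point at $15/hr labor (all assumed):
        print(round(breakeven_points_per_day(3000, 15, 15.0, 1300)))   # about 37 points/day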

  15. Elastic Extension of a CMS Computing Centre Resources on External Clouds

    NASA Astrophysics Data System (ADS)

    Codispoti, G.; Di Maria, R.; Aiftimiei, C.; Bonacorsi, D.; Calligola, P.; Ciaschini, V.; Costantini, A.; Dal Pra, S.; DeGirolamo, D.; Grandi, C.; Michelotto, D.; Panella, M.; Peco, G.; Sapunenko, V.; Sgaravatto, M.; Taneja, S.; Zizzi, G.

    2016-10-01

    After the successful LHC data taking in Run-I and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of the computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, CMS, along the lines followed by other LHC experiments, is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present the proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the “Cloud Bursting” of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time serve as an extension of the farm for local usage. The amount of allocated resources can thus be elastically adapted to the needs of the CMS experiment and local users. Moreover, a direct access/integration of OpenStack resources to the CMS workload management system is explored. In this paper we present this approach, report on the performance of the on-demand allocated resources, and discuss the lessons learned and the next steps.
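
    The cloud-bursting pattern, spawn worker-node VMs on the external OpenStack infrastructure when the batch queue backs up, can be sketched with the openstacksdk client. The cloud name, image, flavor, network IDs, and queue-depth trigger below are placeholders; in the deployment described above, the new nodes would register with LSF rather than simply being returned.

        import openstack

        PENDING_THRESHOLD = 100   # queued jobs that trigger a burst (assumed)

        def burst_if_needed(pending_jobs, n_workers=2):
            """Spawn extra worker-node VMs on the external cloud when the queue backs up."""
            if pending_jobs <= PENDING_THRESHOLD:
                return []
            conn = openstack.connect(cloud="external-openstack")   # named cloud in clouds.yaml
            servers = []
            for i in range(n_workers):
                s = conn.compute.create_server(
                    name=f"cms-wn-{i}",
                    image_id="WORKER_IMAGE_UUID",      # placeholder
                    flavor_id="FLAVOR_UUID",           # placeholder
                    networks=[{"uuid": "NET_UUID"}],   # placeholder
                )
                servers.append(conn.compute.wait_for_server(s))
            return servers   # the VMs' boot scripts would then register with the LSF master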

  16. Technology Systems. Laboratory Activities.

    ERIC Educational Resources Information Center

    Brame, Ray; And Others

    This guide contains 43 modules of laboratory activities for technology education courses. Each module includes an instructor's resource sheet and the student laboratory activity. Instructor's resource sheets include some or all of the following elements: module number, course title, activity topic, estimated time, essential elements, objectives,…

  17. ON states as resource units for universal quantum computation with photonic architectures

    NASA Astrophysics Data System (ADS)

    Sabapathy, Krishna Kumar; Weedbrook, Christian

    2018-06-01

    Universal quantum computation using photonic systems requires gates whose Hamiltonians are of order greater than quadratic in the quadrature operators. We first review previous proposals to implement such gates, where specific non-Gaussian states are used as resources in conjunction with entangling gates such as the continuous-variable versions of controlled-phase and controlled-not gates. We then propose ON states, which are superpositions of the vacuum and the Nth Fock state, for use as non-Gaussian resource states. We show that ON states can be used to implement the cubic and higher-order quadrature phase gates to first order in gate strength. There are several advantages to this method, such as a reduced number of superpositions in the resource state preparation and greater control over the final gate. We also introduce useful figures of merit to characterize gate performance. Utilizing a supply of on-demand resource states, one can potentially scale up the implementation to greater accuracy by repeated application of the basic circuit.
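
    In standard notation, the proposed resource states and the lowest-order target gate can be written as

        |ON\rangle = \frac{1}{\sqrt{1+|a|^{2}}} \left( |0\rangle + a\,|N\rangle \right), \qquad V(\gamma) = e^{i \gamma \hat{x}^{3}},

    where a is a tunable complex amplitude and V(γ) is the cubic phase gate of strength γ; per the abstract, circuits consuming ON states realize V(γ), and higher-order quadrature phase gates, to first order in the gate strength. The normalization shown is the standard one; the specific circuit construction is in the paper and not reproduced here.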

  18. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    ERIC Educational Resources Information Center

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  19. DOE pushes for useful quantum computing

    NASA Astrophysics Data System (ADS)

    Cho, Adrian

    2018-01-01

    The U.S. Department of Energy (DOE) is joining the quest to develop quantum computers, devices that would exploit quantum mechanics to crack problems that overwhelm conventional computers. The initiative comes as Google and other companies race to build a quantum computer that can demonstrate "quantum supremacy" by beating classical computers on a test problem. But reaching that milestone will not mean practical uses are at hand, and the new $40 million DOE effort is intended to spur the development of useful quantum computing algorithms for its work in chemistry, materials science, nuclear physics, and particle physics. With the resources at its 17 national laboratories, DOE could play a key role in developing the machines, researchers say, although finding problems with which quantum computers can help isn't so easy.

  20. Conversion and improvement of the Rutherford Laboratory's magnetostatic computer code GFUN3D to the NMFECC CDC 7600

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, T.C.

    1980-06-01

    The implementation of a version of the Rutherford Laboratory's magnetostatic computer code GFUN3D on the CDC 7600 at the National Magnetic Fusion Energy Computer Center is reported. A new iteration technique that greatly increases the probability of convergence and reduces computation time by about 30% for calculations with nonlinear, ferromagnetic materials is included. The use of GFUN3D on the NMFE network is discussed, and suggestions for future work are presented. Appendix A consists of revisions to the GFUN3D User Guide (published by Rutherford Laboratory) that are necessary to use this version. Appendix B contains input and output for some sample calculations. Appendix C is a detailed discussion of the old and new iteration techniques.

  1. Fermilab computing at the Intensity Frontier

    DOE PAGES

    Group, Craig; Fuess, S.; Gutsche, O.; ...

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. The experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  2. Catalog of Research Abstracts, 1993: Partnership opportunities at Lawrence Berkeley Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-09-01

    The 1993 edition of Lawrence Berkeley Laboratory's Catalog of Research Abstracts is a comprehensive listing of ongoing research projects in LBL's ten research divisions. Lawrence Berkeley Laboratory (LBL) is a major multi-program national laboratory managed by the University of California for the US Department of Energy (DOE). LBL has more than 3000 employees, including over 1000 scientists and engineers. With an annual budget of approximately $250 million, LBL conducts a wide range of research activities, many that address the long-term needs of American industry and have the potential for a positive impact on US competitiveness. LBL actively seeks to share its expertise with the private sector to increase US competitiveness in world markets. LBL has transferable expertise in conservation and renewable energy, environmental remediation, materials sciences, computing sciences, and biotechnology, which includes fundamental genetic research and nuclear medicine. This catalog gives an excellent overview of LBL's expertise, and is a good resource for those seeking partnerships with national laboratories. Such partnerships allow private enterprise access to the exceptional scientific and engineering capabilities of the federal laboratory systems. Such arrangements also leverage the research and development resources of the private partner. Most importantly, they are a means of accessing the cutting-edge technologies and innovations being discovered every day in our federal laboratories.

  3. Mississippi Sound remote sensing study. [NASA Earth Resources Laboratory seasonal experiments

    NASA Technical Reports Server (NTRS)

    Atwell, B. H.; Thomann, G. C.

    1973-01-01

    A study of the Mississippi Sound was initiated in early 1971 by personnel of the NASA Earth Resources Laboratory. Four separate seasonal experiments, consisting of quasi-synoptic remote and surface measurements over the entire area, were planned. Approximately 80 stations distributed throughout Mississippi Sound were occupied. Surface water temperature and Secchi extinction depth were measured at each station, and water samples were collected for water quality analyses. The surface distributions of three water parameters of interest from a remote sensing standpoint (temperature, salinity, and chlorophyll content) are displayed in map form. Areal variations in these parameters are related to tides and winds. A brief discussion of the general problem of radiative measurements of water temperature is followed by a comparison of remotely measured temperatures (PRT-5) to surface vessel measurements.

  4. Comparison of nonmesonic hypernuclear decay rates computed in laboratory and center-of-mass coordinates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Conti, C.; Barbero, C.; Galeão, A. P.

    In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of $^{5}_{\Lambda}$He, $^{12}_{\Lambda}$C and $^{13}_{\Lambda}$C using a formalism based on the independent particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.
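
    For a two-particle subsystem, the laboratory and center-of-mass descriptions are linked by the standard coordinate transformation

        \mathbf{R} = \frac{m_1 \mathbf{r}_1 + m_2 \mathbf{r}_2}{m_1 + m_2}, \qquad \mathbf{r} = \mathbf{r}_1 - \mathbf{r}_2,

    so decay rates evaluated in either coordinate set must coincide when the transformation is treated exactly; agreement between the two computations is therefore a direct consistency check on the laboratory-coordinate formalism. (This is the textbook relation, stated here only to make the comparison explicit.)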

  5. Laboratory capacity building for the International Health Regulations (IHR[2005]) in resource-poor countries: the experience of the African Field Epidemiology Network (AFENET).

    PubMed

    Masanza, Monica Musenero; Nqobile, Ndlovu; Mukanga, David; Gitta, Sheba Nakacubo

    2010-12-03

    The laboratory is one of the core capacities that countries must develop for the implementation of the International Health Regulations (IHR[2005]), since laboratory services play a major role in all the key processes of detection, assessment, response, notification, and monitoring of events. While developed countries easily adapt their well-organized routine laboratory services, resource-limited countries need considerable capacity building as many gaps still exist. In this paper, we discuss some of the efforts made by the African Field Epidemiology Network (AFENET) in supporting laboratory capacity development in the Africa region. The efforts range from promoting graduate-level training programs to building advanced technical, managerial and leadership skills to in-service short course training for peripheral laboratory staff. A number of specific projects focus on external quality assurance, basic laboratory information systems, strengthening laboratory management towards accreditation, equipment calibration, harmonization of training materials, networking and provision of pre-packaged laboratory kits to support outbreak investigation. Available evidence indicates a positive effect of these efforts on laboratory capacity in the region. However, many opportunities exist, especially to support the roll-out of these projects as well as attending to some additional critical areas such as biosafety and biosecurity. We conclude that AFENET's approach of strengthening national and sub-national systems provides a model that could be adopted in resource-limited settings such as sub-Saharan Africa.

  6. Strategic establishment of an International Pharmacology Specialty Laboratory in a resource-limited setting.

    PubMed

    Mtisi, Takudzwa J; Maponga, Charles; Monera-Penduka, Tsitsi G; Mudzviti, Tinashe; Chagwena, Dexter; Makita-Chingombe, Faithful; DiFranchesco, Robin; Morse, Gene D

    2018-01-01

    A growing number of drug development studies that include pharmacokinetic evaluations are conducted in regions lacking a specialised pharmacology laboratory. This necessitated the development of an International Pharmacology Specialty Laboratory (IPSL) in Zimbabwe. The aim of this article is to describe the development of an IPSL in Zimbabwe. The IPSL was developed collaboratively by the University of Zimbabwe and the University at Buffalo Center for Integrated Global Biomedical Sciences. Key stages included infrastructure development, establishment of quality management systems and collaborative mentorship in clinical pharmacology study design and chromatographic assay development and validation. Two high performance liquid chromatography instruments were donated by an instrument manufacturer and a contract research organisation. Laboratory space was acquired through association with the Zimbabwe national drug regulatory authority. Operational policies, standard operating procedures and a document control system were established. Scientists and technicians were trained in aspects relevant to IPSL operations. A high-performance liquid chromatography method for nevirapine was developed with the guidance of the Clinical Pharmacology Quality Assurance programme and approved by the assay method review programme. The University of Zimbabwe IPSL is engaged with the United States National Institute of Allergy and Infectious Diseases Division of AIDS research networks and is poised to begin drug assays and pharmacokinetic analyses. An IPSL has been successfully established in a resource-limited setting through the efforts of an external partnership providing technical guidance and motivated internal faculty and staff. Strategic partnerships were beneficial in navigating challenges leading to laboratory development and training new investigators. The IPSL is now engaged in clinical pharmacology research.

  7. Usnic Acid and the Intramolecular Hydrogen Bond: A Computational Experiment for the Organic Laboratory

    ERIC Educational Resources Information Center

    Green, Thomas K.; Lane, Charles A.

    2006-01-01

    A computational experiment is described for the organic chemistry laboratory that allows students to estimate the relative strengths of the intramolecular hydrogen bonds of usnic and isousnic acids, two related lichen secondary metabolites. Students first extract and purify usnic acid from common lichens and obtain ¹H NMR and IR…

  8. Computational Science in Armenia (Invited Talk)

    NASA Astrophysics Data System (ADS)

    Marandjian, H.; Shoukourian, Yu.

    This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, the methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include: a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, examples of completed projects include the fields of physics (parallel computing of complex quantum systems), astrophysics (Armenian virtual laboratory), biology (molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecast Model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure uniting computing clusters of scientific and educational institutions of the country and provides the scientific community with access to local and international computational resources, which is strong support for computational science in Armenia.

  9. Computational analysis of Ebolavirus data: prospects, promises and challenges.

    PubMed

    Michaelis, Martin; Rossman, Jeremy S; Wass, Mark N

    2016-08-15

    The ongoing Ebola virus (also known as Zaire ebolavirus, a member of the genus Ebolavirus) outbreak in West Africa has so far resulted in >28,000 confirmed cases, compared with previous Ebolavirus outbreaks that affected at most a few hundred individuals. Hence, Ebolaviruses pose a much greater threat than we may have expected (or hoped). An improved understanding of the virus biology is essential to develop therapeutic and preventive measures and to be better prepared for future outbreaks caused by members of the genus. Computational investigations can complement wet-laboratory research on biosafety level 4 pathogens such as Ebolaviruses, for which wet experimental capacity is limited by the small number of appropriate containment laboratories. During the current West Africa outbreak, sequence data from many Ebola virus genomes became available, providing a rich resource for computational analysis. Here, we consider the studies that have already reported on the computational analysis of these data. A range of properties has been investigated, including Ebolavirus evolution and pathogenicity, prediction of microRNAs and identification of Ebolavirus-specific signatures. However, the accuracy of the results remains to be confirmed by wet-laboratory experiments. Therefore, communication and exchange between computational and wet-laboratory researchers is necessary to make maximum use of computational analyses and to iteratively improve these approaches. © 2016 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.
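
    The signature analyses surveyed above use dedicated pipelines, but a toy version of a genome sequence signature, a normalized k-mer profile, shows the flavor of such computational work. The fragment below is a placeholder string, not real Ebola virus sequence; a real analysis would parse genome FASTA files (for example with Biopython's Bio.SeqIO).

    ```python
    from collections import Counter

    def kmer_profile(sequence, k=4):
        """Toy sequence signature: the most common normalized k-mer frequencies."""
        seq = sequence.upper()
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        total = sum(counts.values())
        return {kmer: n / total for kmer, n in counts.most_common(5)}

    # Placeholder fragment, for illustration only; real analyses would
    # compare profiles across many full genomes.
    fragment = "ATGGACAAACATGGATTCACGAGTCGTTATCAATTAAAGGAG"
    print(kmer_profile(fragment))
    ```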

  10. Economic Education Laboratory: Initiating a Meaningful Economic Learning through Laboratory

    ERIC Educational Resources Information Center

    Noviani, Leny; Soetjipto, Budi Eko; Sabandi, Muhammad

    2015-01-01

    The laboratory is considered one of the resources supporting the learning process. The laboratory can be used as a facility to deepen concepts and learning methods and to enrich students' knowledge and skills. A learning process that utilizes laboratory facilities can help lecturers and students grasp concepts easily, constructing the…

  11. Leveraging Cloud Technology to Provide a Responsive, Reliable and Scalable Backend for the Virtual Ice Sheet Laboratory Using the Ice Sheet System Model and Amazon's Elastic Compute Cloud

    NASA Astrophysics Data System (ADS)

    Perez, G. L.; Larour, E. Y.; Halkides, D. J.; Cheng, D. L. C.

    2015-12-01

    The Virtual Ice Sheet Laboratory (VISL) is a Cryosphere outreach effort by scientists at the Jet Propulsion Laboratory (JPL) in Pasadena, CA, Earth and Space Research (ESR) in Seattle, WA, and the University of California at Irvine (UCI), with the goal of providing interactive lessons for K-12 and college level students, while conforming to STEM guidelines. At the core of VISL is the Ice Sheet System Model (ISSM), an open-source project developed jointly at JPL and UCI whose main purpose is to model the evolution of the polar ice caps in Greenland and Antarctica. By using ISSM, VISL students have access to state-of-the-art modeling software that is being used to conduct scientific research by users all over the world. However, providing this functionality is by no means simple. The modeling of ice sheets in response to sea and atmospheric temperatures, among many other possible parameters, requires significant computational resources. Furthermore, this service needs to be responsive and capable of handling burst requests produced by classrooms of students. Cloud computing providers represent a burgeoning industry. With major investments by tech giants like Amazon, Google and Microsoft, it has never been easier or more affordable to deploy computational elements on-demand. This is exactly what VISL needs and ISSM is capable of. Moreover, this is a promising alternative to investing in expensive and rapidly devaluing hardware.

  12. Advanced CNC and CAM Series. Educational Resources for the Machine Tool Industry. Course Syllabi, Instructor's Handbook [and] Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 1-year vocational training program to prepare students for entry-level positions as advanced computer numerical control (CNC) and computer-assisted manufacturing (CAM) technicians. The program was developed through a modification of the DACUM…

  13. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    PubMed

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.
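
    As a concrete, if simplified, instance of the resource-rational recipe: suppose a bounded agent decides between two options by drawing samples and taking a majority vote, where each sample favors the better option with some fixed probability and carries a fixed time cost. The sketch below, with assumed accuracy, payoff, and cost values (none from the paper), finds the sample count that maximizes expected utility, illustrating why an optimal bounded agent often stops after very few samples.

    ```python
    from math import comb

    def p_correct(n, p=0.7):
        """Probability that a majority of n samples favors the better option,
        when each sample independently does so with probability p."""
        wins = sum(comb(n, j) * p**j * (1 - p)**(n - j)
                   for j in range(n // 2 + 1, n + 1))
        tie = comb(n, n // 2) * p**(n // 2) * (1 - p)**(n // 2) if n % 2 == 0 else 0
        return wins + 0.5 * tie   # break ties at random

    # Assumed payoffs: 1 util for a correct choice, 0.02 utils per sample.
    cost = 0.02
    utility = {n: p_correct(n) - cost * n for n in range(1, 30)}
    best = max(utility, key=utility.get)
    print(best, round(utility[best], 3))   # a handful of samples is optimal
    ```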

  14. Overview of DOE Oil and Gas Field Laboratory Projects

    NASA Astrophysics Data System (ADS)

    Bromhal, G.; Ciferno, J.; Covatch, G.; Folio, E.; Melchert, E.; Ogunsola, O.; Renk, J., III; Vagnetti, R.

    2017-12-01

    America's abundant unconventional oil and natural gas (UOG) resources are critical components of our nation's energy portfolio. These resources need to be prudently developed to derive maximum benefits. In spite of the long history of hydraulic fracturing, the optimal number of fracturing stages during multi-stage fracture stimulation in horizontal wells is not known. In addition, there is a dire need for a comprehensive understanding of ways to improve the recovery of shale gas with little or no impact on the environment. Research that seeks to expand our view of effective and environmentally sustainable ways to develop our nation's oil and natural gas resources can be done in the laboratory or at a computer, but some experiments must be performed in a field setting. The Department of Energy (DOE) Field Lab Observatory projects are designed to address those research questions that must be studied in the field. DOE is developing a suite of "field laboratory" test sites to carry out collaborative research that will help find ways of improving the recovery of energy resources as much as possible, with as little environmental impact as possible, from "unconventional" formations such as shale and other low-permeability rock formations. Currently there are three field laboratories in various stages of development and operation. Work is ongoing at two of the sites: the Hydraulic Fracturing Test Site (HFTS) in the Permian Basin and the Marcellus Shale Energy and Environmental Lab (MSEEL) project in the Marcellus Shale Play. Agreement on the third site, the Utica Shale Energy and Environmental Lab (USEEL) project in the Utica Shale Play, was recently finalized. Other field site opportunities may be forthcoming. This presentation gives an overview of the three field laboratory projects.

  15. AFHRL/FT [Air Force Human Resources Laboratory/Flight Training] Capabilities in Undergraduate Pilot Training Simulation Research: Executive Summary.

    ERIC Educational Resources Information Center

    Matheny, W. G.; And Others

    The document presents a summary description of the Air Force Human Resources Laboratory's Flying Training Division (AFHRL/FT) research capabilities for undergraduate pilot training. One of the research devices investigated is the Advanced Simulator for Undergraduate Pilot Training (ASUPT). The equipment includes the ASUPT, the instrumented T-37…

  16. Theme: Laboratory Instruction.

    ERIC Educational Resources Information Center

    Bruening, Thomas H.; And Others

    1992-01-01

    A series of theme articles discuss setting up laboratory hydroponics units, the school farm at the Zuni Pueblo in New Mexico, laboratory experiences in natural resources management and urban horticulture, the development of teaching labs at Derry (PA) High School, management of instructional laboratories, and industry involvement in agricultural…

  17. Computer and laboratory simulation of interactions between spacecraft surfaces and charged-particle environments

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1979-01-01

    Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolations from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.

  18. Dynamic resource allocation scheme for distributed heterogeneous computer systems

    NASA Technical Reports Server (NTRS)

    Liu, Howard T. (Inventor); Silvester, John A. (Inventor)

    1991-01-01

    This invention relates to resource allocation in computer systems, and more particularly, to a method and associated apparatus for shortening response time and improving efficiency of a heterogeneous distributed networked computer system by reallocating the jobs queued up for busy nodes to idle, or less-busy, nodes. In accordance with the algorithm (SIDA for short), the load-sharing is initiated by the server device in a manner such that extra overhead is not imposed on the system during heavily-loaded conditions. The algorithm employed in the present invention uses a dual-mode, server-initiated approach. Jobs are transferred from heavily burdened nodes (i.e., over a high threshold limit) to low-burdened nodes at the initiation of the receiving node when: (1) a job finishes at a node which is burdened below a pre-established threshold level, or (2) a node is idle for a period of time as established by a wakeup timer at the node. The invention uses a combination of the local queue length and the local service rate ratio at each node as the workload indicator.
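
    A minimal sketch of the dual-mode, receiver-pull idea described above, with illustrative thresholds and class names that are assumptions, not the patented implementation: a node pulls a job from the most burdened node either when it finishes a job while lightly loaded or when its wakeup timer fires while idle, and workload is the local queue length scaled by the service rate.

    ```python
    # Illustrative constants; the abstract describes thresholds and a
    # wakeup timer but does not fix specific values.
    HIGH_THRESHOLD = 5    # a node above this is considered heavily burdened
    LOW_THRESHOLD = 2     # a node below this may request extra work

    class Node:
        def __init__(self, name, service_rate):
            self.name = name
            self.service_rate = service_rate  # jobs completed per tick
            self.queue = []

        def workload(self):
            # Combined indicator: queue length relative to service rate.
            return len(self.queue) / self.service_rate

        def finish_one(self, cluster):
            if self.queue:
                self.queue.pop(0)
            # Mode 1: on job completion, a lightly loaded node pulls work.
            if self.workload() < LOW_THRESHOLD:
                self.pull_from_busiest(cluster)

        def wakeup(self, cluster):
            # Mode 2: the idle node's wakeup timer fires and it pulls work.
            if not self.queue:
                self.pull_from_busiest(cluster)

        def pull_from_busiest(self, cluster):
            donor = max(cluster, key=Node.workload)
            if donor is not self and donor.workload() > HIGH_THRESHOLD:
                self.queue.append(donor.queue.pop())

    cluster = [Node("A", 1.0), Node("B", 0.5), Node("C", 2.0)]
    cluster[0].queue = list(range(12))   # node A is overloaded
    cluster[2].wakeup(cluster)           # idle node C pulls a job from A
    print([(n.name, len(n.queue)) for n in cluster])
    ```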

  19. Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services

    NASA Astrophysics Data System (ADS)

    Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui

    2017-09-01

    In order to meet the requirements of the internet of things (IoT) and 5G, the cloud radio access network is a paradigm that converges all base stations' computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRH). A precondition for centralized processing in the BBU pool is an interconnection fronthaul network with high capacity and low delay. However, interactions between RRH and BBU, and resource scheduling among BBUs in the cloud, have become more complex and frequent. A cloud radio over fiber network was proposed in our previous work. In order to overcome the complexity and latency, in this paper we first present a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme considering network survivability is introduced in the proposed architecture. The CSRP with the CSP scheme can effectively pull remote processing resources to the local site to implement cooperative radio resource management, enhance responsiveness and resilience to dynamic end-to-end 5G service demands, and globally optimize optical network, wireless and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software-defined networking testbed in terms of service latency, transmission success rate, resource occupation rate and blocking probability.

  20. Report: EPA’s Radiation and Indoor Environments National Laboratory Should Improve Its Computer Room Security Controls

    EPA Pesticide Factsheets

    Report #12-P-0847, September 21, 2012. Our review of the security posture and in-place environmental controls of EPA’s Radiation and Indoor Environments National Laboratory computer room disclosed an array of security and environmental control deficiencies.

  1. Laboratory and software applications for clinical trials: the global laboratory environment.

    PubMed

    Briscoe, Chad

    2011-11-01

    The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.

  2. QMC Goes BOINC: Using Public Resource Computing to Perform Quantum Monte Carlo Calculations

    NASA Astrophysics Data System (ADS)

    Rainey, Cameron; Engelhardt, Larry; Schröder, Christian; Hilbig, Thomas

    2008-10-01

    Theoretical modeling of magnetic molecules traditionally involves the diagonalization of quantum Hamiltonian matrices. However, as the complexity of these molecules increases, the matrices become so large that this process becomes unusable. An additional challenge to this modeling is that many repetitive calculations must be performed, further increasing the need for computing power. Both of these obstacles can be overcome by using a quantum Monte Carlo (QMC) method and a distributed computing project. We have recently implemented a QMC method within Spinhenge@home, a Public Resource Computing (PRC) project in which private citizens allow part-time usage of their PCs for scientific computing. The use of PRC for scientific computing will be described in detail, as well as how you can contribute to the project. See, e.g., L. Engelhardt et al., Angew. Chem. Int. Ed. 47, 924 (2008); C. Schröder, in Distributed & Grid Computing - Science Made Transparent for Everyone. Principles, Applications and Supporting Communities (Weber, M.H.W., ed., 2008). Project URL: http://spin.fh-bielefeld.de
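
    The contrast drawn above, between diagonalizing a Hamiltonian matrix and sampling, is easy to see in miniature. Below is a hedged stand-in: a classical Metropolis Monte Carlo estimate of the mean energy of a small Ising ring, not the quantum algorithm actually deployed on Spinhenge@home; the model, temperature, and step count are illustrative only.

    ```python
    import math, random

    random.seed(0)

    def ising_energy(spins, J=1.0):
        """Total energy of a 1-D Ising ring with nearest-neighbor coupling J."""
        n = len(spins)
        return -J * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

    def metropolis_mean_energy(n_spins=16, temperature=2.0, steps=20000):
        beta = 1.0 / temperature
        spins = [random.choice((-1, 1)) for _ in range(n_spins)]
        energy, total = ising_energy(spins), 0.0
        for _ in range(steps):
            i = random.randrange(n_spins)
            # Energy change from flipping spin i: only its two bonds change.
            de = 2.0 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
            if de <= 0 or random.random() < math.exp(-beta * de):
                spins[i] = -spins[i]
                energy += de
            total += energy
        return total / steps

    # A sampled estimate of the thermal mean energy; no Hamiltonian
    # matrix is ever built or diagonalized.
    print(metropolis_mean_energy())
    ```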

  3. "TIS": An Intelligent Gateway Computer for Information and Modeling Networks. Overview.

    ERIC Educational Resources Information Center

    Hampel, Viktor E.; And Others

    TIS (Technology Information System) is being used at the Lawrence Livermore National Laboratory (LLNL) to develop software for Intelligent Gateway Computers (IGC) suitable for the prototyping of advanced, integrated information networks. Dedicated to information management, TIS leads the user to available information resources, on TIS or…

  4. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    NASA Astrophysics Data System (ADS)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems, all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve such comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models. The outcomes also include workshop…

  5. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 1: Communication Aids. Rehab/Education Technology ResourceBook Series.

    ERIC Educational Resources Information Center

    Brandenburg, Sara A., Ed.; Vanderheiden, Gregg C., Ed.

    One of a series of three resource guides concerned with communication, control, and computer access for disabled and elderly individuals, the directory focuses on communication aids. The book's six chapters each cover products with the same primary function. Cross reference indexes allow access to listings of products by function, input/output…

  6. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimental, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual…

  7. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimental, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual…

  8. Administration of Computer Resources.

    ERIC Educational Resources Information Center

    Franklin, Gene F.

    Computing at Stanford University has, until recently, been performed at one of five facilities. The Stanford hospital operates an IBM 370/135 mainly for administrative use. The university business office has an IBM 370/145 for its administrative needs and support of the medical clinic. Under the supervision of the Stanford Computation Center are…

  9. Evaluation of hydrothermal resources of North Dakota. Phase II. Final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, K.L.; Howell, F.L.; Winczewski, L.M.

    1981-06-01

    This evaluation of the hydrothermal resources of North Dakota is based on existing data on file with the North Dakota Geological Survey (NDGS) and other state and federal agencies, and on field and laboratory studies. The principal sources of data used during the Phase II study were WELLFILE, the computer library of oil and gas well data developed during the Phase I study, and WATERCAT, a computer library system of water well data assembled during the Phase II study. A field survey of the shallow geothermal gradients present in selected groundwater observation holes was conducted. Laboratory determinations of the thermal conductivity of core samples are being done to facilitate heat-flow calculations on the cased holes-of-convenience.
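
    The heat-flow calculation these thermal conductivity measurements support is a one-line application of Fourier's law, q = k · dT/dz. A worked example with made-up but plausible values, not figures from the North Dakota study:

    ```python
    # Conductive heat flow: q = k * dT/dz. All values are illustrative.
    k = 2.1                       # thermal conductivity of a core, W/(m*K)
    t_top, t_bot = 12.0, 57.0     # temperatures at two depths, deg C
    z_top, z_bot = 100.0, 1600.0  # depths, m

    gradient = (t_bot - t_top) / (z_bot - z_top)  # K/m, 30 K/km here
    q = k * gradient                              # heat flow, W/m^2
    print(f"gradient = {gradient * 1000:.1f} K/km, q = {q * 1000:.1f} mW/m^2")
    ```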

  10. Evaluation of hydrothermal resources of North Dakota. Phase III final technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, K.L.; Howell, F.L.; Wartman, B.L.

    1982-08-01

    The hydrothermal resources of North Dakota were evaluated. This evaluation was based on existing data on file with the North Dakota Geological Survey (NDGS) and other state and federal agencies, and on field and laboratory studies. The principal sources of data used during the study were WELLFILE, the computer library of oil and gas well data developed during the Phase I study, and WATERCAT, a computer library system of water well data assembled during the Phase II study. A field survey of the shallow geothermal gradients present in selected groundwater observation holes was conducted. Laboratory determinations of the thermal conductivity of core samples were done to facilitate heat-flow calculations on the cased holes-of-convenience.

  11. Computer Processing 10-20-30. Teacher's Manual. Senior High School Teacher Resource Manual.

    ERIC Educational Resources Information Center

    Fisher, Mel; Lautt, Ray

    Designed to help teachers meet the program objectives for the computer processing curriculum for senior high schools in the province of Alberta, Canada, this resource manual includes the following sections: (1) program objectives; (2) a flowchart of curriculum modules; (3) suggestions for short- and long-range planning; (4) sample lesson plans;…

  12. A Virtual Rock Physics Laboratory Through Visualized and Interactive Experiments

    NASA Astrophysics Data System (ADS)

    Vanorio, T.; Di Bonito, C.; Clark, A. C.

    2014-12-01

    As new scientific challenges demand more comprehensive and multidisciplinary investigations, laboratory experiments are not expected to become simpler or faster. Experimental investigation is an indispensable element of scientific inquiry and must play a central role in the way current and future generations of scientists make decisions. To turn the complexity of laboratory work (and that of rocks!) into dexterity, engagement, and expanded learning opportunities, we are building an interactive virtual laboratory reproducing, in form and function, the Stanford Rock Physics Laboratory at Stanford University. The objective is to combine lectures on laboratory techniques with an online repository of visualized experiments consisting of interactive 3-D renderings of the equipment used to measure properties central to the study of rock physics (e.g., how to saturate rocks, how to measure porosity, permeability, and elastic wave velocity). We use a game creation system together with 3-D computer graphics and a narrative voice to guide the user through the different phases of the experimental protocol. The main advantage gained in employing computer graphics over video footage is that students can virtually open the instrument, single out its components, and assemble it. Most importantly, it helps describe the processes occurring within the rock. These cannot be tracked by simply recording the physical experiment, but computer animation can efficiently illustrate what happens inside rock samples (e.g., describing acoustic waves and/or fluid flow through a porous rock under pressure within an opaque core-holder; Figure 1). The repository of visualized experiments will complement lectures on laboratory techniques and constitute an online course offered through the EdX platform at Stanford. This will provide a virtual laboratory for anyone, anywhere, to facilitate teaching and learning of introductory laboratory classes in Geophysics and expand the number of courses…
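
    One of the measurements named above, porosity, reduces to a simple mass balance once the protocol (dry weighing, saturation, bulk-volume measurement) is done: φ = (m_sat − m_dry) / (ρ_fluid · V_bulk). A small worked example with illustrative numbers, not Stanford lab data:

    ```python
    # Saturation porosity: pore volume inferred from the mass of fluid a
    # rock absorbs. Sample numbers are illustrative, not lab data.
    m_dry = 210.0      # g, oven-dried core plug
    m_sat = 222.5      # g, fully water-saturated plug
    rho_fluid = 1.0    # g/cm^3, water
    v_bulk = 95.0      # cm^3, from caliper measurement of the plug

    porosity = (m_sat - m_dry) / (rho_fluid * v_bulk)
    print(f"porosity = {porosity:.1%}")   # about 13% here
    ```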

  13. Laboratory and exterior decay of wood plastic composite boards: voids analysis and computed tomography

    Treesearch

    Grace Sun; Rebecca E. Ibach; Meghan Faillace; Marek Gnatowski; Jessie A. Glaeser; John Haight

    2016-01-01

    After exposure in the field and in laboratory soil block culture testing, the void content of wood-plastic composite (WPC) decking boards was compared with that of unexposed samples. A void volume analysis was conducted based on calculations of sample density and on micro-computed tomography (microCT) data. It was found that reference WPC contains voids of different sizes from...

  14. Resource quality of a symmetry-protected topologically ordered phase for quantum computation.

    PubMed

    Miller, Jacob; Miyake, Akimasa

    2015-03-27

    We investigate entanglement naturally present in the 1D topologically ordered phase protected with the on-site symmetry group of an octahedron as a potential resource for teleportation-based quantum computation. We show that, as long as certain characteristic lengths are finite, all its ground states have the capability to implement any unit-fidelity one-qubit gate operation asymptotically as a key computational building block. This feature is intrinsic to the entire phase, in that perfect gate fidelity coincides with perfect string order parameters under a state-insensitive renormalization procedure. Our approach may pave the way toward a novel program to classify quantum many-body systems based on their operational use for quantum information processing.

  15. Resource Quality of a Symmetry-Protected Topologically Ordered Phase for Quantum Computation

    NASA Astrophysics Data System (ADS)

    Miller, Jacob; Miyake, Akimasa

    2015-03-01

    We investigate entanglement naturally present in the 1D topologically ordered phase protected with the on-site symmetry group of an octahedron as a potential resource for teleportation-based quantum computation. We show that, as long as certain characteristic lengths are finite, all its ground states have the capability to implement any unit-fidelity one-qubit gate operation asymptotically as a key computational building block. This feature is intrinsic to the entire phase, in that perfect gate fidelity coincides with perfect string order parameters under a state-insensitive renormalization procedure. Our approach may pave the way toward a novel program to classify quantum many-body systems based on their operational use for quantum information processing.

  16. Computing and information services at the Jet Propulsion Laboratory - A management approach to a diversity of needs

    NASA Technical Reports Server (NTRS)

    Felberg, F. H.

    1984-01-01

    The Jet Propulsion Laboratory, a research and development organization with about 5,000 employees, presents a complicated set of requirements for an institutional system of computing and informational services. The approach taken by JPL in meeting this challenge is one of controlled flexibility. A central communications network is provided, together with selected computing facilities for common use. At the same time, staff members are given considerable discretion in choosing the mini- and microcomputers that they believe will best serve their needs. Consultation services, computer education, and other support functions are also provided.

  17. ACToR: Aggregated Computational Toxicology Resource (T) ...

    EPA Pesticide Factsheets

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) to serve as a repository of public toxicology information on chemicals of interest to the EPA, and in particular to be a central source for the testing data on all chemicals regulated by all EPA programs; (2) to be a source of in vivo training data sets for building in vitro to in vivo computational models; (3) to serve as a central source of chemical structure and identity information for the ToxCastTM and Tox21 programs. There are 4 main databases, all linked through a common set of chemical information and a common structure linking chemicals to assay data: the public ACToR system (available at http://actor.epa.gov); the ToxMiner database holding ToxCast and Tox21 data, along with results from statistical analyses of these data; the Tox21 chemical repository, which manages the ordering and sample-tracking process for the larger Tox21 project; and the public version of ToxRefDB. The public ACToR system contains information on ~500K compounds with toxicology, exposure and chemical property information from >400 public sources. The web site is visited by ~1,000 unique users per month and generates ~1,000 page requests per day on average. The databases are built on open source technology, which has allowed us to export them to a number of col…

  18. Production Experiences with the Cray-Enabled TORQUE Resource Manager

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ezell, Matthew A; Maxwell, Don E; Beer, David

    High performance computing resources utilize batch systems to manage the user workload. Cray systems differ from typical clusters due to Cray's Application Level Placement Scheduler (ALPS). ALPS manages binary transfer, job launch and monitoring, and error handling. Batch systems require special support to integrate with ALPS using an XML protocol called BASIL. Previous versions of Adaptive Computing's TORQUE and Moab batch suite integrated with ALPS from within Moab, using Perl scripts to interface with BASIL. This would occasionally lead to problems when the components became unsynchronized. Version 4.1 of the TORQUE Resource Manager introduced new features that allow it to directly integrate with ALPS using BASIL. This paper describes production experiences at Oak Ridge National Laboratory using the new TORQUE software versions, as well as ongoing and future work to improve TORQUE.

  19. How Fifth Grade Latino/a Bilingual Students Use Their Linguistic Resources in the Classroom and Laboratory during Science Instruction

    ERIC Educational Resources Information Center

    Stevenson, Alma R.

    2013-01-01

    This qualitative, sociolinguistic research study examines how bilingual Latino/a students use their linguistic resources in the classroom and laboratory during science instruction. This study was conducted in a school in the southwestern United States serving an economically depressed, predominantly Latino population. The object of study was a…

  20. Low Cost, Scalable Proteomics Data Analysis Using Amazon's Cloud Computing Services and Open Source Search Algorithms

    PubMed Central

    Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.

    2009-01-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step by step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578
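
    The launch step for such a virtual cluster can be sketched with boto3, Amazon's Python SDK (which postdates the paper's original tooling). The AMI ID, key pair, and instance type below are placeholders, not the project's published images; consult the ViPDAC site referenced above for the actual machine images.

    ```python
    # Hedged sketch: start worker nodes from a preconfigured analysis AMI.
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")
    workers = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical preconfigured AMI
        InstanceType="c5.xlarge",         # assumed size; tune to workload
        KeyName="my-keypair",             # assumed existing EC2 key pair
        MinCount=1,
        MaxCount=4,                       # scale the virtual cluster here
    )
    for instance in workers:
        print(instance.id, instance.state["Name"])
    ```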

  1. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    PubMed

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of current available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).

  2. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    DOE PAGES

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; ...

    2015-07-08

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies.

  3. FY04 Engineering Technology Reports Laboratory Directed Research and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharpe, R M

    2005-01-27

    This report summarizes the science and technology research and development efforts in Lawrence Livermore National Laboratory's Engineering Directorate for FY2004, and exemplifies Engineering's more than 50-year history of developing the technologies needed to support the Laboratory's missions. Engineering has been a partner in every major program and project at the Laboratory throughout its existence and has prepared for this role with a skilled workforce and the technical resources developed through venues like the Laboratory Directed Research and Development Program (LDRD). This accomplishment is well summarized by Engineering's mission: "Enable program success today and ensure the Laboratory's vitality tomorrow." Engineering's investment in technologies is carried out through two programs, the "Tech Base" program and the LDRD program. LDRD is the vehicle for creating those technologies and competencies that are cutting edge. These require a significant level of research or contain some unknown that needs to be fully understood. Tech Base is used to apply technologies to a Laboratory need. The term commonly used for Tech Base projects is "reduction to practice". Therefore, the LDRD report covered here has a strong research emphasis. The areas presented all fall into those needed to accomplish our mission. For FY2004, Engineering's LDRD projects were focused on mesoscale target fabrication and characterization, development of engineering computational capability, material studies and modeling, remote sensing and communications, and microtechnology and nanotechnology for national security applications. Engineering's five Centers, in partnership with the Division Leaders and Department Heads, are responsible for guiding the long-term science and technology investments for the Directorate. The Centers represent technologies that have been identified as critical for the present and future work of the Laboratory, and are chartered to develop their…

  4. SECURITY MODELING FOR MARITIME PORT DEFENSE RESOURCE ALLOCATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.; Dunn, D.

    2010-09-07

    Redeployment of existing law enforcement resources and optimal use of geographic terrain are examined for countering the threat of a maritime-based small-vessel radiological or nuclear attack. The evaluation was based on modeling conducted by the Savannah River National Laboratory that involved the development of options for defensive resource allocation that can reduce the risk of a maritime-based radiological or nuclear threat. A diverse range of potential attack scenarios has been assessed. As a result of identifying vulnerable pathways, effective countermeasures can be deployed using current resources. The modeling involved the use of the Automated Vulnerability Evaluation for Risks of Terrorism (AVERT®) software to conduct computer-based simulation modeling. The models provided estimates for the probability of encountering an adversary based on allocated resources, including response boats, patrol boats and helicopters, over various environmental conditions including day, night, rough seas and various traffic flow rates.

  5. Exploiting short-term memory in soft body dynamics as a computational resource

    PubMed Central

    Nakajima, K.; Li, T.; Hauser, H.; Pfeifer, R.

    2014-01-01

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. PMID:25185579
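
    The recipe behind "body dynamics as a computational resource" is that of reservoir computing: drive a nonlinear physical system, record its state variables, and train only a linear readout. Below is a minimal numpy sketch in which a simulated random network stands in for the silicone arm (the paper's states are real sensor readings, and its task and parameters differ); the readout is trained by ridge regression to reproduce a delayed input, the kind of short-term-memory task the abstract describes.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in "body": a leaky random network driven by the input signal.
    # In the paper the states would be bend-sensor readings from the arm.
    n_states, T = 50, 2000
    W = rng.normal(scale=0.1, size=(n_states, n_states))
    w_in = rng.normal(size=n_states)
    u = rng.uniform(-1, 1, size=T)          # input stream

    x = np.zeros(n_states)
    states = np.zeros((T, n_states))
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])    # nonlinear, history-dependent
        states[t] = x

    # Memory task: reproduce the input from 5 steps ago using only a
    # linear readout, trained in closed form by ridge regression.
    delay = 5
    X, y = states[delay:], u[:-delay]
    w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_states), X.T @ y)
    pred = X @ w_out
    print("correlation with delayed input:", np.corrcoef(pred, y)[0, 1])
    ```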

  6. Planning health education: Internet and computer resources in southwestern Nigeria. 2000-2001.

    PubMed

    Oyadoke, Adebola A; Salami, Kabiru K; Brieger, William R

    The use of the Internet as a health education tool and as a resource in health education planning is widely accepted as the norm in industrialized countries. Unfortunately, access to computers and the Internet is quite limited in developing countries. Not all licensed service providers operate, many users are actually foreign nationals, telephone connections are unreliable, and electricity supplies are intermittent. In this context, computer, e-mail, Internet, and CD-ROM use by health and health education program officers in five states in southwestern Nigeria was assessed to document their present access and use. Eight of the 30 organizations visited were government health ministry departments, while the remainder were non-governmental organizations (NGOs). Six NGOs and four State Ministry of Health (MOH) departments had no computers, but nearly two-thirds of both types of agency had e-mail, less than one-third had Web browsing facilities, and six had CD-ROMs, all of which were NGOs. Only 25 of the 48 individual respondents had computer use skills. Narrative responses from individual employees showed a qualitative difference between computer and Internet access and use by type of agency. NGO staff in organizations with computers indicated having relatively free access to a computer and the Internet and used these for both program planning and administrative purposes. In government offices, it appeared that computers were more likely to be located in administrative or statistics offices and used for management tasks like salaries and correspondence, limiting the access of individual health staff. These two different organizational cultures must be considered when plans are made for increasing computer availability and skills for health education planning.

  7. A Distributed Laboratory for Event-Driven Coastal Prediction and Hazard Planning

    NASA Astrophysics Data System (ADS)

    Bogden, P.; Allen, G.; MacLaren, J.; Creager, G. J.; Flournoy, L.; Sheng, Y. P.; Graber, H.; Graves, S.; Conover, H.; Luettich, R.; Perrie, W.; Ramakrishnan, L.; Reed, D. A.; Wang, H. V.

    2006-12-01

    The 2005 Atlantic hurricane season was the most active in recorded history. Collectively, 2005 hurricanes caused more than 2,280 deaths and record damages of over 100 billion dollars. Of the storms that made landfall, Dennis, Emily, Katrina, Rita, and Wilma caused most of the destruction. Accurate predictions of storm-driven surge, wave height, and inundation can save lives and help keep recovery costs down, provided the information gets to emergency response managers in time. The information must be available well in advance of landfall so that responders can weigh the costs of unnecessary evacuation against the costs of inadequate preparation. The SURA Coastal Ocean Observing and Prediction (SCOOP) Program is a multi-institution collaboration implementing a modular, distributed service-oriented architecture for real-time prediction and visualization of the impacts of extreme atmospheric events. The modular infrastructure enables real-time prediction of multi-scale, multi-model, dynamic, data-driven applications. SURA institutions are working together to create a virtual and distributed laboratory integrating coastal models, simulation data, and observations with computational resources and high-speed networks. The loosely coupled architecture allows teams of computer and coastal scientists at multiple institutions to innovate complex system components that are interconnected with relatively stable interfaces. The operational system standardizes at the interface level to enable substantial innovation by complementary communities of coastal and computer scientists. This architectural philosophy solves a long-standing problem associated with the transition from research to operations. The SCOOP Program thereby implements a prototype laboratory consistent with the vision of a national, multi-agency initiative called the Integrated Ocean Observing System (IOOS). Several service-oriented components of the SCOOP enterprise architecture have already been designed and…

  8. Infrastructure Systems for Advanced Computing in E-science applications

    NASA Astrophysics Data System (ADS)

    Terzo, Olivier

    2013-04-01

    The e-science field has a growing need for computing infrastructure that is more dynamic and customizable, with an on-demand usage model that matches the exact requirements for resources and storage capacity. The integration of grid and cloud infrastructure solutions allows us to offer services that scale resources up and down as needed. The main challenge for e-science domains will be to implement infrastructure solutions for scientific computing that dynamically adapt to the demand for computing resources, with a strong emphasis on optimizing the use of those resources to reduce investment costs. Instrumentation, data volumes, algorithms, and analysis all increase the complexity of applications that require high processing power and storage for a limited time, often exceeding the computational resources available to the majority of laboratories and research units within an organization. It is often necessary to adapt, or even rethink, tools and algorithms, and to consolidate existing applications through a phase of reverse engineering, in order to deploy them on cloud infrastructure. For example, in areas such as rainfall monitoring, meteorological analysis, hydrometeorology, climatology, bioinformatics, next generation sequencing, computational electromagnetics, and radio occultation, the complexity of the analysis raises issues such as processing time, scheduling of processing tasks, storage of results, and multi-user environments. For these reasons, it is necessary to rethink how e-science applications are written so that they are already adapted to exploit the potential of cloud computing services through the IaaS, PaaS, and SaaS layers. Another important focus is creating and using hybrid infrastructure, typically a federation between private and public clouds; in this way, when all resources owned by the organization are in use, it will be easy with a federate…

  9. Tetrahymena in the laboratory: strain resources, methods for culture, maintenance, and storage.

    PubMed

    Cassidy-Hanley, Donna M

    2012-01-01

    The ciliated protozoan Tetrahymena thermophila has been an important model system for biological research for many years. During that time, a variety of useful strains, including highly inbred stocks, a collection of diverse mutant strains, and wild cultivars from a variety of geographical locations have been identified. In addition, thanks to the efforts of many different laboratories, optimal conditions for growth, maintenance, and storage of Tetrahymena have been worked out. To facilitate the efficient use of Tetrahymena, especially by those new to the system, this chapter presents a brief description of many available Tetrahymena strains and lists possible resources for obtaining viable cultures of T. thermophila and other Tetrahymena species. Descriptions of commonly used media, methods for cell culture and maintenance, and protocols for short- and long-term storage are also presented. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  11. Resource Efficient Hardware Architecture for Fast Computation of Running Max/Min Filters

    PubMed Central

    Torres-Huitzil, Cesar

    2013-01-01

    Running max/min filters on rectangular kernels are widely used in many digital signal and image processing applications. Filtering with a k × k kernel requires k² − 1 comparisons per sample for a direct implementation; thus, performance scales expensively with the kernel size k. Faster computations can be achieved by kernel decomposition and using constant time one-dimensional algorithms on custom hardware. This paper presents a hardware architecture for real-time computation of running max/min filters based on the van Herk/Gil-Werman (HGW) algorithm. The proposed architecture design uses less computation and memory resources than previously reported architectures when targeted to Field Programmable Gate Array (FPGA) devices. Implementation results show that the architecture is able to compute max/min filters, on 1024 × 1024 images with up to 255 × 255 kernels, in around 8.4 milliseconds, 120 frames per second, at a clock frequency of 250 MHz. The implementation is highly scalable for the kernel size with good performance/area tradeoff suitable for embedded applications. The applicability of the architecture is shown for local adaptive image thresholding. PMID:24288456
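
    The constant-time trick in the HGW algorithm is to split the signal into length-k blocks and precompute block-wise prefix and suffix maxima, so any length-k window is the merge of two precomputed values, roughly three comparisons per sample regardless of k. A 1-D numpy sketch of the idea the FPGA architecture implements in hardware:

    ```python
    import numpy as np

    def running_max_hgw(signal, k):
        """1-D running max over windows [i, i+k-1] via van Herk/Gil-Werman:
        a block-wise prefix max, a block-wise suffix max, and one merge."""
        n = len(signal)
        # Pad with -inf so every window starting inside the signal fits,
        # rounding the padded length up to a multiple of the block size k.
        padded_len = ((n + 2 * (k - 1)) // k) * k
        s = np.concatenate([np.asarray(signal, float),
                            np.full(padded_len - n, -np.inf)])
        blocks = s.reshape(-1, k)
        prefix = np.maximum.accumulate(blocks, axis=1).ravel()
        suffix = np.maximum.accumulate(blocks[:, ::-1], axis=1)[:, ::-1].ravel()
        # A length-k window spans at most two adjacent blocks: merge the
        # suffix max of the first with the prefix max of the second.
        return np.maximum(suffix[:n], prefix[k - 1:n + k - 1])

    x = np.array([3, 1, 4, 1, 5, 9, 2, 6], dtype=float)
    print(running_max_hgw(x, 3))   # [4. 4. 5. 9. 9. 9. 6. 6.]
    ```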

  12. Laboratory challenges in the scaling up of HIV, TB, and malaria programs: The interaction of health and laboratory systems, clinical research, and service delivery.

    PubMed

    Birx, Deborah; de Souza, Mark; Nkengasong, John N

    2009-06-01

    Strengthening national health laboratory systems in resource-poor countries is critical to meeting the United Nations Millennium Development Goals. Despite strong commitment from the international community to fight major infectious diseases, weak laboratory infrastructure remains a huge rate-limiting step. Some major challenges facing laboratory systems in resource-poor settings include dilapidated infrastructure; lack of human capacity, laboratory policies, and strategic plans; and limited synergies between clinical and research laboratories. Together, these factors compromise the quality of test results and impact patient management. With increased funding, the target of laboratory strengthening efforts in resource-poor countries should be the integration of laboratory services across major diseases to leverage resources with respect to physical infrastructure; types of assays; supply chain management of reagents and equipment; and maintenance of equipment.

  13. Aggregating Data for Computational Toxicology Applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System

    PubMed Central

    Judson, Richard S.; Martin, Matthew T.; Egeghy, Peter; Gangwal, Sumit; Reif, David M.; Kothiya, Parth; Wolf, Maritja; Cathey, Tommy; Transue, Thomas; Smith, Doris; Vail, James; Frame, Alicia; Mosher, Shad; Cohen Hubal, Elaine A.; Richard, Ann M.

    2012-01-01

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of understanding the underlying mechanistic causes of chemical toxicity and of predicting the toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built using open source tools and is freely available to download. This review describes the organization of the data repository and provides selected examples of use cases. PMID:22408426

  15. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    PubMed Central

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J; Schuster, Heiko; Ternette, Nicola; Alpízar, Adán; Schittenhelm, Ralf B; Ramarathinam, Sri H; Lindestam Arlehamn, Cecilia S; Chiek Koh, Ching; Gillet, Ludovic C; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David S; Deutsch, Eric W; Moritz, Robert L; Purcell, Anthony W; Rammensee, Hans-Georg; Stevanovic, Stefan; Aebersold, Ruedi

    2015-01-01

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies. DOI: http://dx.doi.org/10.7554/eLife.07661.001 PMID:26154972
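
    A consensus fragment ion spectrum is, in essence, an average of replicate spectra after matching fragment peaks by m/z. The Python sketch below is a deliberately simplified stand-in for the study's library-building pipeline: it groups peaks from replicate spectra within a fixed m/z tolerance and keeps the mean of groups supported by most replicates. The tolerance and majority rule are illustrative assumptions.

        # Simplified consensus-spectrum construction from replicate spectra.
        # Each spectrum is a list of (mz, intensity) pairs.
        import numpy as np

        def consensus(spectra, tol=0.02, min_fraction=0.5):
            peaks = np.concatenate([np.asarray(s, dtype=float) for s in spectra])
            peaks = peaks[np.argsort(peaks[:, 0])]          # sort by m/z
            groups, current = [], [peaks[0]]
            for p in peaks[1:]:
                if p[0] - current[-1][0] <= tol:            # same fragment
                    current.append(p)
                else:
                    groups.append(current)
                    current = [p]
            groups.append(current)
            out = []
            for g in groups:
                g = np.asarray(g)
                # crude support test: grouped-peak count vs. replicate count
                if len(g) / len(spectra) >= min_fraction:
                    out.append((g[:, 0].mean(), g[:, 1].mean()))
            return out

        replicates = [[(200.10, 50.0), (300.20, 100.0)],
                      [(200.11, 55.0), (300.21, 90.0)],
                      [(200.09, 52.0)]]
        print(consensus(replicates))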

  16. Idaho National Laboratory Cultural Resource Monitoring Report for 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Julie B.

    2013-10-01

    This report describes the cultural resource monitoring activities of the Idaho National Laboratory’s (INL) Cultural Resource Management (CRM) Office during 2013. Throughout the year, thirty-eight cultural resource localities were revisited, including: two locations with Native American human remains, one of which is also a cave; fourteen additional caves; seven prehistoric archaeological sites; four historic archaeological sites; one historic trail; one nuclear resource (Experimental Breeder Reactor-I, a designated National Historic Landmark); and nine historic structures located at the Central Facilities Area. Of the monitored resources, thirty-three were routinely monitored, and five were monitored to assess project compliance with cultural resource recommendations along with the effects of ongoing project activities. On six occasions, ground-disturbing activities within the boundaries of the Power Burst Facility/Critical Infrastructure Test Range Complex (PBF/CITRC) were observed by INL CRM staff prepared to respond to any additional finds of Native American human remains. In addition, two resources were visited more than once, either as part of the routine monitoring schedule or to monitor for additional damage. Throughout the year, most of the cultural resources monitored showed no visible adverse changes, resulting in Type 1 determinations. However, Type 2 impacts were noted at eight sites: although impacts occurred, or a project was operating outside of culturally cleared limitations, the cultural resources retained integrity and the noted impacts did not threaten National Register eligibility. No Type 3 or Type 4 impacts, which would adversely impact cultural resources and threaten National Register eligibility, were observed at the cultural resources monitored in 2013.

  17. Role of medical, technical, and administrative leadership in the human resource management life cycle: a team approach to laboratory management.

    PubMed

    Wilkinson, D S; Dilts, T J

    1999-01-01

    We believe the team approach to laboratory management achieves the best outcomes. Laboratory management requires the integration of medical, technical, and administrative expertise to achieve optimal service, quality, and cost performance. Usually, a management team of two or more individuals must be assembled to achieve all of these critical leadership functions. The individual members of the management team must possess the requisite expertise in clinical medicine, laboratory science, technology management, and administration. They also must work together in a unified and collaborative manner, regardless of where individual team members appear on the organizational chart. The management team members share in executing the entire human resource management life cycle, creating the proper environment to maximize human performance. Above all, the management team provides visionary and credible leadership.

  18. Exploiting short-term memory in soft body dynamics as a computational resource.

    PubMed

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-06

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
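
    The computational principle here is that of reservoir computing: the body's nonlinear, fading-memory dynamics act as a fixed reservoir, and only a linear readout is trained. Since the silicone arm cannot be reproduced in print, the sketch below substitutes a small simulated random recurrent network for the body and trains a least-squares readout on a delayed-memory task, the kind of function used to demonstrate short-term memory.

        # Reservoir-computing sketch: a random recurrent network stands in
        # for the soft body; only the linear readout is trained.
        import numpy as np

        rng = np.random.default_rng(0)
        N, T, delay = 100, 2000, 5
        W_in = rng.uniform(-0.1, 0.1, N)
        W = rng.normal(0, 1, (N, N))
        W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # echo-state scaling

        u = rng.uniform(-1, 1, T)                       # random input stream
        x = np.zeros(N)
        states = np.empty((T, N))
        for t in range(T):
            x = np.tanh(W @ x + W_in * u[t])            # body/reservoir dynamics
            states[t] = x

        # Target: reproduce the input from `delay` steps ago (short-term memory).
        y = np.roll(u, delay)
        W_out, *_ = np.linalg.lstsq(states[delay:], y[delay:], rcond=None)
        pred = states[delay:] @ W_out
        print("memory-task correlation:", np.corrcoef(pred, y[delay:])[0, 1])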

  19. Computer User's Guide to the Protection of Information Resources. NIST Special Publication 500-171.

    ERIC Educational Resources Information Center

    Helsing, Cheryl; And Others

    Computers have changed the way information resources are handled. Large amounts of information are stored in one central place and can be accessed from remote locations. Users have a personal responsibility for the security of the system and the data stored in it. This document outlines the user's responsibilities and provides security and control…

  20. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 2: Switches and Environmental Controls. Rehab/Education Technology ResourceBook Series.

    ERIC Educational Resources Information Center

    Brandenburg, Sara A., Ed.; Vanderheiden, Gregg C., Ed.

    One of a series of three resource guides concerned with communication, control, and computer access for disabled and elderly individuals, the directory focuses on switches and environmental controls. The book's three chapters each cover products with the same primary function. Cross reference indexes allow access to listings of products by…

  1. Institute of Laboratory Animal Resources (ILAR)

    DTIC Science & Technology

    1994-05-12

    Athens, Georgia; Muriel T. Davisson, The Jackson Laboratory, Bar Harbor, Maine; Neal L. First, University of Wisconsin, Madison, Wisconsin; James W. Glosser... Hear, Wisconsin Regional Primate Research Center, Madison, Wisconsin; Margaret Z. Jones, Michigan State University, East Lansing, Michigan; Michael D... California School of Medicine, Los Angeles, California; Henry C. Pitot III, University of Wisconsin, Madison, Wisconsin; Paul G. Risser, Miami University

  2. Computerized provider order entry in the clinical laboratory

    PubMed Central

    Baron, Jason M.; Dighe, Anand S.

    2011-01-01

    Clinicians have traditionally ordered laboratory tests using paper-based orders and requisitions. However, paper orders are becoming increasingly incompatible with the complexities, challenges, and resource constraints of our modern healthcare systems and are being replaced by electronic order entry systems. Electronic systems that allow direct provider input of diagnostic testing or medication orders into a computer system are known as Computerized Provider Order Entry (CPOE) systems. Adoption of laboratory CPOE systems may offer institutions many benefits, including reduced test turnaround time, improved test utilization, and better adherence to practice guidelines. In this review, we outline the functionality of various CPOE implementations, review the reported benefits, and discuss strategies for using CPOE to improve the test ordering process. Further, we discuss barriers to the implementation of CPOE systems that have prevented their more widespread adoption. PMID:21886891

  3. Selecting, Evaluating and Creating Policies for Computer-Based Resources in the Behavioral Sciences and Education.

    ERIC Educational Resources Information Center

    Richardson, Linda B., Comp.; And Others

    This collection includes four handouts: (1) "Selection Critria Considerations for Computer-Based Resources" (Linda B. Richardson); (2) "Software Collection Policies in Academic Libraries" (a 24-item bibliography, Jane W. Johnson); (3) "Circulation and Security of Software" (a 19-item bibliography, Sara Elizabeth Williams); and (4) "Bibliography of…

  4. Development and evaluation of a computer program to grade student performance on peripheral blood smears

    NASA Astrophysics Data System (ADS)

    Lehman, Donald Clifford

    Today's medical laboratories are dealing with cost-containment health care policies and unfilled laboratory positions. Because there may be fewer experienced clinical laboratory scientists, students graduating from clinical laboratory science (CLS) programs are expected by their employers to perform accurately in entry-level positions with minimal training. Information in the CLS field is increasing at a dramatic rate, and instructors are expected to teach more content in the same amount of time with the same resources. With this increase in teaching obligations, instructors could use a tool to facilitate grading. The research question was, "Can computer-assisted assessment evaluate students in an accurate and time-efficient way?" A computer program was developed to assess CLS students' ability to evaluate peripheral blood smears. Automated grading lets students get results more quickly and allows the laboratory instructor to devote less time to grading. This computer program could improve instruction by providing more time to students and instructors for other activities. To be valuable, the program should provide the same quality of grading as the instructor, and these benefits must outweigh potential problems such as the time necessary to develop and maintain the program, monitoring of student progress by the instructor, and the financial cost of the computer software and hardware. In this study, surveys of students and an interview with the laboratory instructor were performed to provide a formative evaluation of the computer program. In addition, the grading accuracy of the computer program was examined. These results will be used to improve the program for use in future courses.
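
    One natural way to check whether automated grading matches the instructor is chance-corrected agreement. The sketch below computes Cohen's kappa between computer-assigned and instructor-assigned cell classifications; the five classifications shown are invented placeholders, not data from the study.

        # Cohen's kappa between two raters (computer vs. instructor) on the
        # same set of graded items; all classifications here are invented.
        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            n = len(rater_a)
            observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            ca, cb = Counter(rater_a), Counter(rater_b)
            expected = sum(ca[k] * cb[k] for k in ca) / n**2   # chance agreement
            return (observed - expected) / (1 - expected)

        computer   = ["neutrophil", "lymphocyte", "monocyte", "neutrophil", "band"]
        instructor = ["neutrophil", "lymphocyte", "neutrophil", "neutrophil", "band"]
        print(round(cohens_kappa(computer, instructor), 3))   # 0.706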

  5. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    NASA Astrophysics Data System (ADS)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; Sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure that gives scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity, and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high-throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres distributed across 56 countries in Europe, the Asia-Pacific region, Canada, and Latin America. EUDAT (www.eudat.eu) is a collaborative pan-European infrastructure providing research data services, training, and consultancy for researchers, research communities, research infrastructures, and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. In the context of their flagship projects, EGI-Engage and EUDAT2020, EGI and EUDAT started a collaboration in March 2015 to harmonise the two infrastructures, covering technical interoperability; authentication, authorisation, and identity management; policy; and operations. The main objective of this work is to provide end users with seamless access to an integrated infrastructure offering both EGI and EUDAT services, thus pairing data and high-throughput computing resources. To define the roadmap of this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, to bring requirements and help assign the right priorities to each of them. In this way, this activity has been driven by the end users from the beginning. The identified user communities are

  6. Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.

    ERIC Educational Resources Information Center

    Rosenberg, R.C.; And Others

    These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…

  7. SInCRe—structural interactome computational resource for Mycobacterium tuberculosis

    PubMed Central

    Metri, Rahul; Hariharaputran, Sridhar; Ramakrishnan, Gayatri; Anand, Praveen; Raghavender, Upadhyayula S.; Ochoa-Montaño, Bernardo; Higueruelo, Alicia P.; Sowdhamini, Ramanathan; Chandra, Nagasuma R.; Blundell, Tom L.; Srinivasan, Narayanaswamy

    2015-01-01

    We have developed an integrated database for Mycobacterium tuberculosis H37Rv (Mtb) that collates information on protein sequences, domain assignments, functional annotation, and 3D structural information, along with protein–protein and protein–small molecule interactions. SInCRe (Structural Interactome Computational Resource) was developed out of the CamBan (Cambridge and Bangalore) collaboration. The motivation for developing this database is to provide an integrated platform that allows easy access to, and interpretation of, the data and results obtained by all the groups in CamBan in the field of Mtb informatics. In-house algorithms and databases developed independently by the various academic groups in CamBan are used to generate Mtb-specific datasets and are integrated in this database to provide a structural dimension to studies on tuberculosis. The SInCRe database readily provides information on the identification of functional domains, genome-scale modelling of the structures of Mtb proteins, and characterization of small-molecule binding sites within Mtb. The resource also provides structure-based function annotation; information on small-molecule binders, including FDA (Food and Drug Administration)-approved drugs; protein–protein interactions (PPIs); and natural compounds that potentially bind to pathogen proteins and weaken or eliminate host–pathogen protein–protein interactions. Together these provide prerequisites for the identification of off-target binding. Database URL: http://proline.biochem.iisc.ernet.in/sincre PMID:26130660

  8. Theme: Land Laboratories--Urban Settings, Liability, Natural Resources Labs.

    ERIC Educational Resources Information Center

    Whaley, David, Ed.; And Others

    1994-01-01

    Includes "With a Little Imagination"; "From Fallow to Fertile"; "Operating a School Enterprise in Agriculture"; "Using a Nontraditional Greenhouse to Enhance Lab Instruction"; "Risk Management for Liability in Operating Land Laboratories"; "Working Land and Water Laboratory for Natural…

  9. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    ERIC Educational Resources Information Center

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, moving from traditional learning-centred approaches to collaborative learning-centred approaches that emphasise pervasive learning anywhere and anytime. While compiling big data, cloud computing, and semantic web into OLR offers a broader spectrum of…

  10. The Laboratory-Based Economics Curriculum.

    ERIC Educational Resources Information Center

    King, Paul G.; LaRoe, Ross M.

    1991-01-01

    Describes the liberal arts, computer laboratory-based economics program at Denison University (Ohio). Includes as goals helping students to (1) understand deductive arguments, (2) learn to apply theory in real-world situations, and (3) test and modify theory when necessary. Notes that the program combines computer laboratory experiments for…

  11. Resources

    Science.gov Websites


  12. An inexpensive modification of the laboratory computer display changes emergency physicians' work habits and perceptions.

    PubMed

    Marinakis, Harry A; Zwemer, Frank L

    2003-02-01

    Little is known about how the availability of laboratory data affects emergency physicians' practice habits and satisfaction. We modified our clinical information system to display laboratory test status with continuous updates, similar to an airport arrival display. The objective of this study was to determine whether the laboratory test status display altered emergency physicians' work habits and increased satisfaction compared with the period before its implementation. A retrospective analysis was performed of emergency physicians' actual use of the clinical information system before and after implementation of the laboratory test status display. Emergency physicians were retrospectively surveyed regarding the effect of the display on their practice habits and clinical information system use. Survey responses were matched with actual use of the clinical information system. Data were analyzed using dependent t tests and Pearson correlation coefficients. The study was conducted at a university hospital. Clinical information system use by 46 emergency physicians was analyzed. Twenty-five surveys were returned (71.4% of available emergency physicians). All emergency physicians perceived fewer clinical information system log-ons per day after implementation of the display; the actual average decrease was 19%. Emergency physicians who reported the greatest decrease in log-ons per day tended to have the greatest actual decrease (r = -0.36). There was no significant correlation between actual and perceived total time logged on (r = 0.08). In regard to the effect on emergency physicians' practice habits, 95% reported increased efficiency, 80% reported improved satisfaction with data access, and 65% reported improved communication with patients. An inexpensive computer modification, the laboratory test status display, significantly increased subjective efficiency, changed work habits, and improved satisfaction regarding data access.
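
    The reported relationship between perceived and actual change reduces to a Pearson correlation over paired observations. A minimal sketch, using invented perceived- and actual-decrease values rather than the study's data, follows.

        # Pearson correlation between perceived and actual decrease in daily
        # log-ons; the paired values below are invented for illustration.
        import numpy as np

        perceived_decrease = np.array([30, 10, 25, 5, 40, 15, 20])   # percent
        actual_decrease    = np.array([22, 12, 18, 9, 28, 11, 19])   # percent
        r = np.corrcoef(perceived_decrease, actual_decrease)[0, 1]
        print(f"r = {r:.2f}")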

  13. Controlling user access to electronic resources without password

    DOEpatents

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to the associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison, in which the user-proximal environmental information is sufficiently similar to the computer-resource-proximal environmental information. In at least some embodiments, the process further includes receiving a user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
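
    At its core, the patented process compares two collections of environmental observations and grants access when they are "sufficiently similar". The sketch below illustrates one plausible reading using Jaccard similarity over sets of observed environmental identifiers (for example, nearby network beacons); the metric, threshold, and features are assumptions, since the abstract does not fix them.

        # Hypothetical environmental-similarity gate: grant access when the
        # user's observed environment overlaps enough with the environment
        # pre-associated with the protected resource.
        def jaccard(a: set, b: set) -> float:
            return len(a & b) / len(a | b) if a | b else 0.0

        def grant_access(resource_env: set, user_env: set, threshold=0.6) -> bool:
            return jaccard(resource_env, user_env) >= threshold

        resource_env = {"beacon:lab-ap-1", "beacon:lab-ap-2", "temp:21C"}
        user_env     = {"beacon:lab-ap-1", "beacon:lab-ap-2", "temp:22C"}
        print(grant_access(resource_env, user_env))   # 2/4 = 0.5 -> False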

  14. Computer soundcard as an AC signal generator and oscilloscope for the physics laboratory

    NASA Astrophysics Data System (ADS)

    Sinlapanuntakul, Jinda; Kijamnajsuk, Puchong; Jetjamnong, Chanthawut; Chotikaprakhan, Sutharat

    2018-01-01

    The purpose of this paper is to develop both an AC signal generator and a dual-channel oscilloscope based on a standard personal computer equipped with a sound card, as part of the laboratory for fundamental physics and introduction-to-electronics classes. The setup turns the computer into a two-channel measurement device providing the sample rate, simultaneous sampling, frequency range, filters, and other essential capabilities required to perform amplitude, phase, and frequency measurements of AC signals. The AC signal is generated simultaneously from the same sound card output in any waveform, such as sine, square, triangle, sawtooth, pulse, swept sine, and white noise. These capabilities convert an inexpensive PC sound card into a powerful device, allowing students to measure physical phenomena with their own PCs either at home or at the university. Graphical user interface software was developed for control and analysis, including facilities for data recording, signal processing, and real-time measurement display. The result expands self-learning opportunities for students in the field of electronics, covering both AC and DC circuits as well as sound and vibration experiments.
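
    In software terms, the generator half amounts to synthesizing a waveform array and the oscilloscope half to capturing and analyzing the input buffer. A minimal sketch along these lines, assuming the third-party sounddevice package and a physical or loopback connection from the card's output to its input, is shown below; the authors' own GUI software is not reproduced.

        # Soundcard as signal generator + oscilloscope: play a test sine on
        # the output while recording the input, then estimate the received
        # frequency. Assumes the `sounddevice` package (pip install
        # sounddevice) and that the output is wired or looped back to input.
        import numpy as np
        import sounddevice as sd

        fs = 44100                       # sample rate in Hz
        f0, duration = 1000.0, 1.0       # 1 kHz test tone, 1 second
        t = np.arange(int(fs * duration)) / fs
        tone = (0.5 * np.sin(2 * np.pi * f0 * t)).astype(np.float32)

        recorded = sd.playrec(tone, samplerate=fs, channels=1)  # play + capture
        sd.wait()                                               # block until done

        spectrum = np.abs(np.fft.rfft(recorded[:, 0]))
        freqs = np.fft.rfftfreq(len(recorded), 1 / fs)
        print("dominant frequency:", freqs[np.argmax(spectrum)], "Hz")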

  15. Reassigning the Structures of Natural Products Using NMR Chemical Shifts Computed with Quantum Mechanics: A Laboratory Exercise

    ERIC Educational Resources Information Center

    Palazzo, Teresa A.; Truong, Tiana T.; Wong, Shirley M. T.; Mack, Emma T.; Lodewyk, Michael W.; Harrison, Jason G.; Gamage, R. Alan; Siegel, Justin B.; Kurth, Mark J.; Tantillo, Dean J.

    2015-01-01

    An applied computational chemistry laboratory exercise is described in which students use modern quantum chemical calculations of chemical shifts to assign the structure of a recently isolated natural product. A pre/post assessment was used to measure student learning gains and verify that students demonstrated proficiency of key learning…
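
    The numerical heart of such an exercise is simple: computed shifts for each candidate structure are scaled against the experimental shifts, and the candidate with the smallest residual error is preferred. The sketch below performs this comparison with linear scaling and mean absolute deviation; all shift values are invented placeholders, and probabilistic scoring schemes such as DP4 are not reproduced.

        # Pick the candidate structure whose computed NMR shifts, after
        # linear scaling, best match experiment (smallest mean absolute
        # deviation). All shift values below are invented for illustration.
        import numpy as np

        def scaled_mad(computed, experimental):
            slope, intercept = np.polyfit(computed, experimental, 1)  # scaling
            return np.mean(np.abs(slope * computed + intercept - experimental))

        experimental = np.array([168.2, 134.5, 128.8, 77.1, 55.3])
        candidates = {
            "structure A": np.array([171.0, 137.2, 130.5, 79.0, 57.9]),
            "structure B": np.array([175.5, 130.1, 135.9, 71.2, 60.4]),
        }
        for name, shifts in candidates.items():
            print(name, "MAD =", round(scaled_mad(shifts, experimental), 2), "ppm")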

  16. 30 CFR 795.10 - Qualified laboratories.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 3 2011-07-01 2011-07-01 false Qualified laboratories. 795.10 Section 795.10... laboratories. (a) Basic qualifications. To be designated a qualified laboratory, a firm shall demonstrate that... necessary field samples and making hydrologic field measurements and analytical laboratory determinations by...

  17. Computing through Scientific Abstractions in SysBioPS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Stephan, Eric G.; Gracio, Deborah K.

    2004-10-13

    Today, biologists and bioinformaticists have a tremendous amount of computational power at their disposal. With the availability of supercomputers, burgeoning scientific databases and digital libraries such as GenBank and PubMed, and pervasive computational environments such as the Grid, biologists have access to a wealth of computational capabilities and scientific data at hand. Yet, the rapid development of computational technologies has far exceeded the typical biologist's ability to effectively apply the technology in their research. Computational sciences research and development efforts such as the Biology Workbench, BioSPICE (Biological Simulation Program for Intra-Cellular Evaluation), and BioCoRE (Biological Collaborative Research Environment) are important in connecting biologists and their scientific problems to computational infrastructures. On the Computational Cell Environment and Heuristic Entity-Relationship Building Environment projects at the Pacific Northwest National Laboratory, we are jointly developing a new breed of scientific problem solving environment called SysBioPSE that will allow biologists to access and apply computational resources in the scientific research context. In contrast to other computational science environments, SysBioPSE operates as an abstraction layer above a computational infrastructure. The goal of SysBioPSE is to allow biologists to apply computational resources in the context of the scientific problems they are addressing and the scientific perspectives from which they conduct their research. More specifically, SysBioPSE allows biologists to capture and represent scientific concepts and theories and experimental processes, and to link these views to scientific applications, data repositories, and computer systems.

  18. Feasibility of establishing a biosafety level 3 tuberculosis culture laboratory of acceptable quality standards in a resource-limited setting: an experience from Uganda.

    PubMed

    Ssengooba, Willy; Gelderbloem, Sebastian J; Mboowa, Gerald; Wajja, Anne; Namaganda, Carolyn; Musoke, Philippa; Mayanja-Kizza, Harriet; Joloba, Moses Lutaakome

    2015-01-15

    performance standards in resource-limited countries. With the demonstrated quality of work, the laboratory attracted more research groups and post-pioneer funding, which helped to ensure sustainability. The highly skilled experts in this research laboratory also continue to provide an excellent resource for the needed national discussion of laboratory and quality management systems.

  19. Laboratory directed research and development program FY 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Todd; Levy, Karin

    2000-03-08

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness. This is the annual report on the Laboratory Directed Research and Development (LDRD) program for FY99.

  20. Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics (LQCD)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Negele, John W.

    Building on the success of two preceding generations of Scientific Discovery through Advanced Computing (SciDAC) projects, this grant supported the MIT component (P.I. John Negele) of a multi-institutional SciDAC-3 project that also included Brookhaven National Laboratory, the lead laboratory, with P.I. Frithjof Karsch serving as Project Director; Thomas Jefferson National Accelerator Facility, with P.I. David Richards serving as Co-director; University of Washington, with P.I. Martin Savage; University of North Carolina, with P.I. Rob Fowler; and College of William and Mary, with P.I. Andreas Stathopoulos. Nationally, this multi-institutional project coordinated the software development effort that the nuclear physics lattice QCD community needs to ensure that lattice calculations can make optimal use of forthcoming leadership-class and dedicated hardware, including that at the national laboratories, and to exploit future computational resources in the Exascale era.

  1. Self managing experiment resources

    NASA Astrophysics Data System (ADS)

    Stagni, F.; Ubeda, M.; Tsaregorodtsev, A.; Romanovskiy, V.; Roiser, S.; Charpentier, P.; Graciani, R.

    2014-06-01

    Within this paper we present an autonomic computing-resource management system used by LHCb for assessing the status of its Grid resources. Virtual Organization Grids include heterogeneous resources. For example, the LHC experiments very often use resources not provided by WLCG, and Cloud Computing resources will soon provide a non-negligible fraction of their computing power. The lack of standards and procedures across experiments and sites has produced multiple information systems, monitoring tools, ticket portals, and the like, which nowadays coexist and represent a very precious source of information for running the computing systems of HEP experiments as well as sites. These two facts lead to many particular solutions for a general problem: managing the experiment resources. In this paper we present how LHCb, via the DIRAC interware, addressed these issues. With a renewed Central Information Schema hosting all resource metadata and a Resource Status System delivering real-time information, the system controls the resource topology independently of the resource types. The Resource Status System applies data mining techniques to all available information sources and assesses status changes, which are then propagated to the topology description. Obviously, giving full control to such an automated system is not risk-free; therefore, to minimise the probability of misbehaviour, a battery of tests has been developed to certify the correctness of its assessments. We demonstrate the performance and efficiency of such a system in terms of cost reduction and reliability.
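
    Conceptually, the Resource Status System maps many noisy monitoring inputs onto a small set of statuses and propagates changes to the topology description. A toy policy of that shape, with status names and thresholds invented for illustration (the production DIRAC logic is far richer), might look like this:

        # Toy status-assessment policy: combine monitor results for a
        # resource into one status and record changes in a topology table.
        # Statuses and thresholds are illustrative, not DIRAC's policies.
        topology = {}   # resource name -> last known status

        def assess(resource, monitor_results):
            """monitor_results: list of booleans, True meaning a check passed."""
            ok = sum(monitor_results) / len(monitor_results)
            if ok >= 0.9:
                status = "Active"
            elif ok >= 0.5:
                status = "Degraded"
            else:
                status = "Banned"
            if topology.get(resource) != status:      # propagate only changes
                print(f"{resource}: {topology.get(resource)} -> {status}")
                topology[resource] = status

        assess("CE.example.org", [True, True, True, False])  # 75% ok -> Degraded
        assess("CE.example.org", [True] * 10)                # recovers -> Active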

  2. PandASoft: Open Source Instructional Laboratory Administration Software

    NASA Astrophysics Data System (ADS)

    Gay, P. L.; Braasch, P.; Synkova, Y. N.

    2004-12-01

    PandASoft (Physics and Astronomy Software) is software for organizing and archiving a department's teaching resources and materials. An easy to use, secure interface allows faculty and staff to explore equipment inventories, see what laboratory experiments are available, find handouts, and track what has been used in different classes in the past. Divided into five sections: classes, equipment, laboratories, links, and media, its database cross links materials, allowing users to see what labs are used with which classes, what media and equipment are used with which labs, or simply what equipment is lurking in which room. Written in PHP and MySQL, this software can be installed on any UNIX / Linux platform, including Macintosh OS X. It is designed to allow users to easily customize the headers, footers and colors to blend with existing sites - no programming experience required. While initial data input is labor intensive, the system will save time later by allowing users to quickly answer questions related to what is in inventory, where it is located, how many are in stock, and where online they can learn more. It will also provide a central location for storing PDFs of handouts, and links to applets and cool sites at other universities. PandASoft comes with over 100 links to online resources pre-installed. We would like to thank Dr. Wolfgang Rueckner and the Harvard University Science Center for providing computers and resources for this project.
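
    The cross-linking described above is a classic many-to-many relational design: link tables joining classes to labs and labs to equipment. The sketch below reproduces that shape in SQLite via Python rather than PandASoft's actual PHP/MySQL code, so the table and column names are illustrative assumptions, not the real schema.

        # Illustrative many-to-many schema of the PandASoft kind: which labs
        # serve which classes, and what equipment each lab uses.
        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE class     (id INTEGER PRIMARY KEY, name TEXT);
            CREATE TABLE lab       (id INTEGER PRIMARY KEY, title TEXT);
            CREATE TABLE equipment (id INTEGER PRIMARY KEY, name TEXT, room TEXT);
            CREATE TABLE class_lab (class_id INTEGER, lab_id INTEGER);
            CREATE TABLE lab_equip (lab_id INTEGER, equip_id INTEGER);
        """)
        db.execute("INSERT INTO class VALUES (1, 'Intro Astronomy')")
        db.execute("INSERT INTO lab VALUES (1, 'Spectroscopy of Gas Tubes')")
        db.execute("INSERT INTO equipment VALUES (1, 'Diffraction grating', 'SC-114')")
        db.execute("INSERT INTO class_lab VALUES (1, 1)")
        db.execute("INSERT INTO lab_equip VALUES (1, 1)")

        # "What equipment does Intro Astronomy need, and where is it?"
        query = """
            SELECT e.name, e.room FROM class c
            JOIN class_lab cl ON cl.class_id = c.id
            JOIN lab_equip le ON le.lab_id = cl.lab_id
            JOIN equipment e  ON e.id = le.equip_id
            WHERE c.name = ?"""
        print(db.execute(query, ("Intro Astronomy",)).fetchall())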

  3. Simulating Laboratory Procedures.

    ERIC Educational Resources Information Center

    Baker, J. E.; And Others

    1986-01-01

    Describes the use of computer assisted instruction in a medical microbiology course. Presents examples of how computer assisted instruction can present case histories in which the laboratory procedures are simulated. Discusses an authoring system used to prepare computer simulations and provides one example of a case history dealing with fractured…

  4. Setting up a Low-Cost Lab Management System for a Multi-Purpose Computing Laboratory Using Virtualisation Technology

    ERIC Educational Resources Information Center

    Mok, Heng Ngee; Lee, Yeow Leong; Tan, Wee Kiat

    2012-01-01

    This paper describes how a generic computer laboratory equipped with 52 workstations is set up for teaching IT-related courses and other general purpose usage. The authors have successfully constructed a lab management system based on decentralised, client-side software virtualisation technology using Linux and free software tools from VMware that…

  5. A Low-Cost Computer-Controlled Arduino-Based Educational Laboratory System for Teaching the Fundamentals of Photovoltaic Cells

    ERIC Educational Resources Information Center

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-01-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental…

  6. A survey and taxonomy on energy efficient resource allocation techniques for cloud computing systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv

    In a cloud computing paradigm, energy-efficient allocation of different virtualized ICT resources (servers, storage disks, networks, and the like) is a complex problem due to the presence of heterogeneous application workloads (e.g., content delivery networks, MapReduce, web applications, and the like) with contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time, etc.). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications, with varying degrees of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides a research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify the open challenges associated with energy-efficient resource allocation. In this regard, the study first outlines the problem and the existing hardware- and software-based techniques available for this purpose. Furthermore, available techniques already presented in the literature are summarized based on the energy-efficient research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy, namely: resource adaption policy, objective function, allocation method, allocation operation, and interoperability.

  7. Computer and laboratory simulation in the teaching of neonatal nursing: innovation and impact on learning

    PubMed Central

    Fonseca, Luciana Mara Monti; Aredes, Natália Del' Angelo; Fernandes, Ananda Maria; Batalha, Luís Manuel da Cunha; Apóstolo, Jorge Manuel Amado; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2016-01-01

    ABSTRACT Objectives: to evaluate the cognitive learning of nursing students in neonatal clinical evaluation in a blended course using computer and laboratory simulation; to compare the cognitive learning of students in a control and an experimental group testing the laboratory simulation; and to assess the extracurricular blended course offered on the clinical assessment of preterm infants, according to the students. Method: a quasi-experimental study with 14 Portuguese students, with pretest, midterm test, and post-test. The technologies offered in the course were the serious game e-Baby, instructional software on semiology and semiotechnique, and laboratory simulation. Data collection tools developed for this study were used for the course evaluation and characterization of the students. Nonparametric statistics were used: Mann-Whitney and Wilcoxon. Results: the use of validated digital technologies and laboratory simulation produced a statistically significant difference (p = 0.001) in the learning of the participants. The course was evaluated by the students as very satisfactory. The laboratory simulation alone did not produce a significant difference in learning. Conclusions: the cognitive learning of participants increased significantly. The use of technology may be partly responsible for the course's success, showing it to be an important teaching tool for innovation and motivation of learning in healthcare. PMID:27737376
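
    For paired pretest/post-test scores of this kind, the Wilcoxon signed-rank test the authors cite is a one-liner in most statistics libraries. A minimal sketch with invented scores for 14 students (the study's data are not reproduced) follows.

        # Wilcoxon signed-rank test on paired pretest vs. post-test scores;
        # the 14 score pairs below are invented, not the study's data.
        from scipy.stats import wilcoxon

        pretest  = [52, 48, 60, 55, 47, 63, 50, 58, 45, 62, 49, 54, 57, 51]
        posttest = [68, 61, 72, 70, 59, 75, 66, 71, 60, 74, 63, 69, 73, 64]
        stat, p = wilcoxon(pretest, posttest)
        print(f"W = {stat}, p = {p:.4f}")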

  8. Sandia National Laboratories: Employee & Retiree Resources: Emergency

    Science.gov Websites


  9. Sandia National Laboratories: Employee & Retiree Resources: Technical

    Science.gov Websites


  10. Computer simulation of thermal and fluid systems for MIUS integration and subsystems test /MIST/ laboratory. [Modular Integrated Utility System

    NASA Technical Reports Server (NTRS)

    Rochelle, W. C.; Liu, D. K.; Nunnery, W. J., Jr.; Brandli, A. E.

    1975-01-01

    This paper describes the application of the SINDA (systems improved numerical differencing analyzer) computer program to simulate the operation of the NASA/JSC MIUS integration and subsystems test (MIST) laboratory. The MIST laboratory is designed to test the integration capability of the following subsystems of a modular integrated utility system (MIUS): (1) electric power generation, (2) space heating and cooling, (3) solid waste disposal, (4) potable water supply, and (5) waste water treatment. The SINDA/MIST computer model is designed to simulate the response of these subsystems to externally impressed loads. The computer model determines the amount of recovered waste heat from the prime mover exhaust, water jacket and oil/aftercooler and from the incinerator. This recovered waste heat is used in the model to heat potable water, for space heating, absorption air conditioning, waste water sterilization, and to provide for thermal storage. The details of the thermal and fluid simulation of MIST including the system configuration, modes of operation modeled, SINDA model characteristics and the results of several analyses are described.
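
    The heart of such a simulation is a set of energy balances: each recovery stream contributes heat in proportion to its mass flow, specific heat, and temperature drop, and the total is matched against the thermal loads. A back-of-the-envelope sketch of that bookkeeping, with entirely invented flows and temperatures rather than MIST data, is below.

        # Toy waste-heat bookkeeping in the spirit of a MIUS energy balance:
        # recovered heat = sum of m_dot * cp * dT over streams. All numbers
        # are invented placeholders, not MIST laboratory values.
        streams = {                     # (mass flow kg/s, cp J/(kg K), dT K)
            "engine exhaust":  (0.50, 1050.0, 180.0),
            "water jacket":    (1.20, 4186.0,  12.0),
            "oil/aftercooler": (0.30, 2000.0,  25.0),
        }
        recovered = {name: m * cp * dT for name, (m, cp, dT) in streams.items()}
        total_w = sum(recovered.values())

        loads_w = {"potable water heating": 40_000.0,
                   "absorption chiller":    90_000.0}
        for name, q in recovered.items():
            print(f"{name:17s} {q / 1000:7.1f} kW")
        surplus_kw = (total_w - sum(loads_w.values())) / 1000
        print(f"total recovered   {total_w / 1000:7.1f} kW; "
              f"loads {sum(loads_w.values()) / 1000:.1f} kW; "
              f"to thermal storage {surplus_kw:.1f} kW")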

  11. Applications of computational modeling in ballistics

    NASA Technical Reports Server (NTRS)

    Sturek, Walter B.

    1987-01-01

    The development of ballistics technology as applied to gun-launched Army weapon systems is the main objective of research at the U.S. Army Ballistic Research Laboratory (BRL). The primary research programs at the BRL consist of three major ballistic disciplines: exterior, interior, and terminal. Work at the BRL in these areas has traditionally been highly dependent on experimental testing. Considerable emphasis has been placed on developing computational modeling to augment experimental testing in the development cycle; to date, however, the impact of computational modeling has been modest. With the supercomputer resources recently installed at the BRL, a new emphasis on applying computational modeling to ballistics technology is taking place. The major application areas currently receiving considerable attention at the BRL are outlined, along with the modeling approaches involved, together with some indication of the degree of success achieved and the areas of greatest need.

  12. Symmetrically private information retrieval based on blind quantum computing

    NASA Astrophysics Data System (ADS)

    Sun, Zhiwei; Yu, Jianping; Wang, Ping; Xu, Lingling

    2015-05-01

    Universal blind quantum computation (UBQC) is a new secure quantum computing protocol which allows a user Alice, who does not have any sophisticated quantum technology, to delegate her computing to a server Bob without leaking any privacy. Using the features of UBQC, we propose a protocol to achieve symmetrically private information retrieval, which allows a quantum-limited Alice to query an item from Bob, who has a fully fledged quantum computer, while the privacy of both parties is preserved. The security of our protocol is based on the assumption that malicious Alice has no quantum computer, which avoids the impossibility proof of Lo. The honest Alice is almost classical, requiring only minimal quantum resources to carry out the proposed protocol; she therefore does not need an expensive laboratory that can maintain the coherence of complicated quantum experimental setups.

  13. NREL: Renewable Resource Data Center - Geothermal Resource Related Links

    Science.gov Websites

    Comprehensive geothermal resource information is also available from the following sources: the U.S. Department of Energy Geothermal Technologies Office; the National Geothermal Data System, a portal to geothermal data; and the Southern Methodist University Geothermal Laboratory.

  14. Using an Advanced Computational Laboratory Experiment to Extend and Deepen Physical Chemistry Students' Understanding of Atomic Structure

    ERIC Educational Resources Information Center

    Hoffman, Gary G.

    2015-01-01

    A computational laboratory experiment is described, which involves the advanced study of an atomic system. The students use concepts and techniques typically covered in a physical chemistry course but extend those concepts and techniques to more complex situations. The students get a chance to explore the study of atomic states and perform…

  15. Low-Cost Virtual Laboratory Workbench for Electronic Engineering

    ERIC Educational Resources Information Center

    Achumba, Ifeyinwa E.; Azzi, Djamel; Stocker, James

    2010-01-01

    The laboratory component of undergraduate engineering education poses challenges in resource-constrained engineering faculties. The cost, time, space, and physical-presence requirements of the traditional (real) laboratory approach are the contributing factors. These resource constraints may hinder the acquisition of meaningful laboratory…

  16. Pencil-and-Paper Neural Networks: An Undergraduate Laboratory Exercise in Computational Neuroscience

    PubMed Central

    Crisp, Kevin M.; Sutter, Ellen N.; Westerberg, Jacob A.

    2015-01-01

    Although it has been more than 70 years since McCulloch and Pitts published their seminal work on artificial neural networks, such models remain primarily in the domain of computer science departments in undergraduate education. This is unfortunate, as simple network models offer undergraduate students a much-needed bridge between cellular neurobiology and processes governing thought and behavior. Here, we present a very simple laboratory exercise in which students constructed, trained and tested artificial neural networks by hand on paper. They explored a variety of concepts, including pattern recognition, pattern completion, noise elimination and stimulus ambiguity. Learning gains were evident in changes in the use of language when writing about information processing in the brain. PMID:26557791
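
    The pattern-completion behaviour the students worked by hand can be reproduced in a few lines: store patterns with a Hebbian outer-product rule, then let the network relax from a corrupted cue. The sketch below is a minimal Hopfield-style network, a close cousin of the exercise's McCulloch-Pitts worksheets rather than its exact content.

        # Minimal Hopfield-style network: Hebbian storage, sign-threshold
        # updates, pattern completion from a noisy cue (+1/-1 coding).
        import numpy as np

        patterns = np.array([[1, -1, 1, -1, 1, -1],
                             [1, 1, 1, -1, -1, -1]])
        W = sum(np.outer(p, p) for p in patterns).astype(float)
        np.fill_diagonal(W, 0)                      # no self-connections

        cue = np.array([1, -1, 1, -1, 1, 1])        # pattern 0, one bit flipped
        state = cue.copy()
        for _ in range(5):                          # synchronous updates
            state = np.where(W @ state >= 0, 1, -1)
        print("recovered:", state)
        print("matches pattern 0:", np.array_equal(state, patterns[0]))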

  17. Laboratories | Energy Systems Integration Facility | NREL

    Science.gov Websites

    laboratories to be safely divided into multiple test stand locations (or "capability hubs"). Laboratories include the Energy Systems Fabrication Laboratory, Energy Systems High-Pressure Test Laboratory, Energy Systems Integration Laboratory, Energy Systems Sensor Laboratory, Fuel Cell Development and Test Laboratory, and High-Performance Computing…

  18. Assessment of physical activity with the Computer Science and Applications, Inc., accelerometer: laboratory versus field validation.

    PubMed

    Nichols, J F; Morgan, C G; Chabot, L E; Sallis, J F; Calfas, K J

    2000-03-01

    Our purpose was to compare the validity of the Computer Science and Applications (CSA), Inc., accelerometer in laboratory and field settings and to establish CSA count ranges for light, moderate, and vigorous physical activity. Validity was determined in 60 adults during treadmill exercise, using oxygen consumption (VO2) as the criterion measure, while 30 adults walked and jogged outdoors on a 400-m track. The relationship between CSA counts and VO2 was linear (R2 = .89, SEE = 3.72 ml·kg-1·min-1), as was the relationship between velocity and counts in the field (R2 = .89, SEE = 0.89 mi·hr-1). However, significant differences were found (p < .05) between laboratory and field measures of CSA counts for light and vigorous intensity. We conclude that the CSA can be used to quantify walking and jogging outdoors on level ground; however, laboratory equations may not be appropriate for use in field settings, particularly for light and vigorous activity.
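
    The calibration logic reads directly as code: regress VO2 on counts, report R2 and SEE, then invert the fitted line at chosen MET thresholds (1 MET = 3.5 ml·kg-1·min-1; moderate activity is conventionally 3 to 6 METs) to obtain count cut-points. The sketch below does this with synthetic data standing in for the study's 60 subjects.

        # Counts-to-VO2 calibration sketch: fit a line, report R^2 and SEE,
        # and invert it at MET thresholds to get count cut-points. The data
        # are synthetic stand-ins, not the study's measurements.
        import numpy as np

        rng = np.random.default_rng(1)
        counts = rng.uniform(500, 10000, 60)                 # counts per minute
        vo2 = 5.0 + 0.004 * counts + rng.normal(0, 3.7, 60)  # ml/kg/min

        slope, intercept = np.polyfit(counts, vo2, 1)
        pred = slope * counts + intercept
        ss_res = np.sum((vo2 - pred) ** 2)
        r2 = 1 - ss_res / np.sum((vo2 - vo2.mean()) ** 2)
        see = np.sqrt(ss_res / (len(vo2) - 2))
        print(f"R^2 = {r2:.2f}, SEE = {see:.2f} ml/kg/min")

        for mets in (3, 6):                                  # activity cut-points
            vo2_cut = mets * 3.5                             # 1 MET = 3.5 ml/kg/min
            print(f"{mets} METs -> {(vo2_cut - intercept) / slope:.0f} counts/min")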

  19. Realizing the Potential of Information Resources: Information, Technology, and Services. Track 8: Academic Computing and Libraries.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    Eight papers are presented from the 1995 CAUSE conference track on academic computing and library issues faced by managers of information technology at colleges and universities. The papers include: (1) "Where's the Beef?: Implementation of Discipline-Specific Training on Internet Resources" (Priscilla Hancock and others); (2)…

  20. Idaho National Laboratory Cultural Resource Monitoring Report for FY 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brenda R. Pace; Julie B. Braun

    2009-10-01

    This report describes the cultural resource monitoring activities of the Idaho National Laboratory’s (INL) Cultural Resource Management (CRM) Office during fiscal year 2009 (FY 2009). Throughout the year, thirty-eight cultural resource localities were revisited, including: two locations with Native American human remains, one of which is a cave, two additional caves, twenty-two prehistoric archaeological sites, six historic homesteads, two historic stage stations, two historic trails, and two nuclear resources, including Experimental Breeder Reactor-I, which is a designated National Historic Landmark. Several INL project areas were also monitored in FY 2009 to assess project compliance with cultural resource recommendations and monitor the effects of ongoing project activities. Although impacts were documented at a few locations and trespassing citations were issued in one instance, no significant adverse effects that would threaten the National Register eligibility of any resources were observed. Monitoring also demonstrated that several INL projects generally remain in compliance with recommendations to protect cultural resources.

  1. A comparison of traditional physical laboratory and computer-simulated laboratory experiences in relation to engineering undergraduate students' conceptual understandings of a communication systems topic

    NASA Astrophysics Data System (ADS)

    Javidi, Giti

    2005-07-01

    This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, it examined whether computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering students the concepts of signal transmission, modulation, and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups, which were compared on understanding of the concepts, retention of the concepts, completion time of the lab experiments, and perception of the laboratory experiments. The physical group's (n = 40) treatment was to conduct laboratory experiments using equipment in a controlled electronics laboratory. The simulation group's (n = 40) treatment was to conduct similar experiments using a simulation program in a controlled PC laboratory. At the completion of the treatment, scores on a validated conceptual test were collected, once immediately after the treatment and again three weeks later. Attitude surveys and a qualitative study were administered at the completion of the treatment. The findings revealed significant differences between the two groups, in favor of the simulation group, on both the conceptual post-test and the follow-up test. The findings also revealed a significant correlation between the simulation group's attitude toward the simulation program and their post-test scores. Moreover, there was a significant difference between the two groups in attitude toward their laboratory experience, in favor of the simulation group, as well as a significant difference in lab completion time, also in favor of the simulation group. At the same time, the

  2. Lawrence Berkeley Laboratory Institutional Plan, FY 1993--1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chew, Joseph T.; Stroh, Suzanne C.; Maio, Linda R.

    1992-10-01

    The FY 1993--1998 Institutional Plan provides an overview of the Lawrence Berkeley Laboratory mission, strategic plan, scientific initiatives, research programs, environment and safety program plans, educational and technology transfer efforts, human resources, and facilities needs. The Strategic Plan section identifies long-range conditions that can influence the Laboratory, potential research trends, and several management implications. The Initiatives section identifies potential new research programs that represent major long-term opportunities for the Laboratory and the resources required for their implementation. The Scientific and Technical Programs section summarizes current programs and potential changes in research program activity. The Environment, Safety, and Health section describes the management systems and programs underway at the Laboratory to protect the environment, the public, and the employees. The Technology Transfer and Education programs section describes current and planned programs to enhance the nation's scientific literacy and human infrastructure and to improve economic competitiveness. The Human Resources section identifies LBL staff composition and development programs. The section on Site and Facilities discusses resources required to sustain and improve the physical plant and its equipment. The Resource Projections are estimates of required budgetary authority for the Laboratory's ongoing research programs. The plan is an institutional management report for integration with the Department of Energy's strategic planning activities that is developed through an annual planning process. The plan identifies technical and administrative directions in the context of the National Energy Strategy and the Department of Energy's program planning initiatives. Preparation of the plan is coordinated by the Office for Planning and Development from information contributed by the Laboratory's scientific and support divisions.

  3. Combining a Laboratory Practical Class with a Computer Simulation: Studies on the Synthesis of Urea in Isolated Hepatocytes.

    ERIC Educational Resources Information Center

    Bender, David A.

    1986-01-01

    Describes how a computer simulation is used with a laboratory experiment on the synthesis of urea in isolated hepatocytes. The simulation calculates the amount of urea formed and the amount of ammonium remaining as the concentrations of ornithine, citrulline, argininosuccinate, arginine, and aspartate are altered. (JN)

  4. Rationale for cost-effective laboratory medicine.

    PubMed Central

    Robinson, A

    1994-01-01

    There is virtually universal consensus that the health care system in the United States is too expensive and that costs need to be limited. Similar to health care costs in general, clinical laboratory expenditures have increased rapidly as a result of increased utilization and inflationary trends within the national economy. Economic constraints require that a compromise be reached between individual welfare and limited societal resources. Public pressure and changing health care needs have precipitated both subtle and radical laboratory changes to more effectively use allocated resources. Responsibility for excessive laboratory use can be assigned primarily to the following four groups: practicing physicians, physicians in training, patients, and the clinical laboratory. The strategies to contain escalating health care costs have ranged from individualized physician education programs to government intervention. Laboratories have responded to the fiscal restraints imposed by prospective payment systems by attempting to reduce operational costs without adversely impacting quality. Although cost containment directed at misutilization and overutilization of existing services has conserved resources, to date, an effective cost control mechanism has yet to be identified and successfully implemented on a grand enough scale to significantly impact health care expenditures in the United States. PMID:8055467

  5. Computational Science News | Computational Science | NREL

    Science.gov Websites

    …-Cooled High-Performance Computing Technology at the ESIF. February 28, 2018: NREL Launches New Website for High-Performance Computing System Users. The National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) system…

  6. Idaho National Laboratory Cultural Resource Monitoring Report for FY 2008

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brenda R. Pace

    2009-01-01

    This report describes the cultural resource monitoring activities of the Idaho National Laboratory’s (INL) Cultural Resource Management (CRM) Office during fiscal year 2008 (FY 2008). Throughout the year, 45 cultural resource localities were revisited, including: two locations of heightened Shoshone-Bannock tribal sensitivity, four caves, one butte, twenty-eight prehistoric archaeological sites, three historic homesteads, two historic stage stations, one historic canal construction camp, three historic trails, and Experimental Breeder Reactor-I, which is a designated National Historic Landmark. Several INL project areas were also monitored in FY 2008 to assess project compliance with cultural resource recommendations, confirm the locations of previously recorded cultural resources in relation to project activities, assess the damage caused by fire-fighting efforts, and watch for cultural materials during ground-disturbing activities. Although impacts were documented at a few locations, no significant adverse effects that would threaten the National Register eligibility of any resource were observed. Monitoring also demonstrated that INL projects generally remain in compliance with recommendations to protect cultural resources.

  7. Laboratory Directed Research and Development Annual Report for 2009

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Pamela J.

    This report documents progress made on all LDRD-funded projects during fiscal year 2009. As a US Department of Energy (DOE) Office of Science (SC) national laboratory, Pacific Northwest National Laboratory (PNNL) has an enduring mission to bring molecular and environmental sciences and engineering strengths to bear on DOE missions and national needs. Their vision is to be recognized worldwide and valued nationally for leadership in accelerating the discovery and deployment of solutions to challenges in energy, national security, and the environment. To achieve this mission and vision, they provide distinctive, world-leading science and technology in: (1) the design and scalable synthesis of materials and chemicals; (2) climate change science and emissions management; (3) efficient and secure electricity management from generation to end use; and (4) signature discovery and exploitation for threat detection and reduction. PNNL leadership also extends to operating EMSL: the Environmental Molecular Sciences Laboratory, a national scientific user facility dedicated to providing integrated experimental and computational resources for discovery and technological innovation in the environmental molecular sciences.

  8. Earth System Grid II (ESG): Turning Climate Model Datasets Into Community Resources

    NASA Astrophysics Data System (ADS)

    Williams, D.; Middleton, D.; Foster, I.; Nefedova, V.; Kesselman, C.; Chervenak, A.; Bharathi, S.; Drach, B.; Cinquini, L.; Brown, D.; Strand, G.; Fox, P.; Garcia, J.; Bernholdt, D.; Chanchio, K.; Pouchard, L.; Chen, M.; Shoshani, A.; Sim, A.

    2003-12-01

    High-resolution, long-duration simulations performed with advanced DOE SciDAC/NCAR climate models will produce tens of petabytes of output. To be useful, this output must be made available to global change impacts researchers nationwide, both at national laboratories and at universities, other research laboratories, and other institutions. To this end, we propose to create a new Earth System Grid, ESG-II - a virtual collaborative environment that links distributed centers, users, models, and data. ESG-II will provide scientists with virtual proximity to the distributed data and resources that they require to perform their research. The creation of this environment will significantly increase the scientific productivity of U.S. climate researchers by turning climate datasets into community resources. In creating ESG-II, we will integrate and extend a range of Grid and collaboratory technologies, including the DODS remote access protocols for environmental data, Globus Toolkit technologies for authentication, resource discovery, and resource access, and Data Grid technologies developed in other projects. We will develop new technologies for (1) creating and operating "filtering servers" capable of performing sophisticated analyses, and (2) delivering results to users. In so doing, we will simultaneously contribute to climate science and advance the state of the art in collaboratory technology. We expect our results to be useful to numerous other DOE projects. The three-year R&D program will be undertaken by a talented and experienced team of computer scientists at five laboratories (ANL, LBNL, LLNL, NCAR, ORNL) and one university (ISI), working in close collaboration with climate scientists at several sites.

  9. Simulated Laboratory in Digital Logic.

    ERIC Educational Resources Information Center

    Cleaver, Thomas G.

    Design of computer circuits used to be a pencil and paper task followed by laboratory tests, but logic circuit design can now be done in half the time as the engineer accesses a program which simulates the behavior of real digital circuits, and does all the wiring and testing on his computer screen. A simulated laboratory in digital logic has been…

  10. 30 CFR 6.10 - Use of independent laboratories.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    30 Mineral Resources 1, 2010-07-01. Mineral Resources; MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; TESTING, EVALUATION... PRODUCT SAFETY STANDARDS. § 6.10 Use of independent laboratories. (a) MSHA will accept testing and...

  11. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projects is crucial to not only scientists but to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future AR is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API, its capabilities, provide implementation details, and discuss future developments.
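    The CWT API follows the OGC Web Processing Service pattern, so a server-side operation such as an average can in principle be requested without downloading the underlying data. Below is a minimal sketch of such a call from Python; the endpoint URL, operation identifier, and datainputs encoding are assumptions for illustration, not the actual ESGF deployment.

```python
import requests

# Hypothetical ESGF data-node WPS endpoint; URL and operation name are
# illustrative, not the real CWT service.
WPS_URL = "https://esgf-node.example.org/wps"

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "average",          # server-side operation to run
    # One datainputs string, per OGC WPS key-value encoding; the
    # variable/domain syntax here is a guess at the CWT conventions.
    "datainputs": "domain={'time':('1980-01','2010-12')};"
                  "variable=tas;dataset=cmip5.some.dataset.id",
}

response = requests.get(WPS_URL, params=params, timeout=60)
response.raise_for_status()
print(response.text)  # WPS ExecuteResponse XML with a link to the result
```

    The point of the design is visible in the sketch: only the small request and the reduced result cross the network, while the parallel computation runs next to the archive.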

  12. Laboratory Resources Management in Manufacturing Systems Programs

    ERIC Educational Resources Information Center

    Obi, Samuel C.

    2004-01-01

    Most, if not all, industrial technology (IT) programs have laboratories or workshops. Often equipped with modern equipment, tools, materials, and measurement and test instruments, these facilities constitute a major investment for IT programs. Improper use or overuse of program facilities may result in dirty lab equipment, lost or damaged tools,…

  13. Relating Solar Resource Variability to Cloud Type

    NASA Astrophysics Data System (ADS)

    Hinkelman, L. M.; Sengupta, M.

    2012-12-01

    Power production from renewable energy (RE) resources is rapidly increasing. Generation of renewable energy is quite variable since the solar and wind resources that form the inputs are, themselves, inherently variable. There is thus a need to understand the impact of renewable generation on the transmission grid. Such studies require estimates of high temporal and spatial resolution power output under various scenarios, which can be created from corresponding solar resource data. Satellite-based solar resource estimates are the best source of long-term solar irradiance data for the typically large areas covered by transmission studies. As satellite-based resource datasets are generally available at lower temporal and spatial resolution than required, there is, in turn, a need to downscale these resource data. Downscaling in both space and time requires information about solar irradiance variability, which is primarily a function of cloud types and properties. In this study, we analyze the relationship between solar resource variability and satellite-based cloud properties. One-minute resolution surface irradiance data were obtained from a number of stations operated by the National Oceanic and Atmospheric Administration (NOAA) under the Surface Radiation (SURFRAD) and Integrated Surface Irradiance Study (ISIS) networks as well as from NREL's Solar Radiation Research Laboratory (SRRL) in Golden, Colorado. Individual sites were selected so that a range of meteorological conditions would be represented. Cloud information at a nominal 4 km resolution and half hour intervals was derived from NOAA's Geostationary Operation Environmental Satellite (GOES) series of satellites. Cloud class information from the GOES data set was then used to select and composite irradiance data from the measurement sites. The irradiance variability for each cloud classification was characterized using general statistics of the fluxes themselves and their variability in time, as represented
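    As an illustration of the compositing step this study describes (grouping surface irradiance variability by satellite-derived cloud class), the sketch below computes ramp statistics per cloud class with pandas. The file and column names are hypothetical stand-ins, not the SURFRAD/GOES data formats.

```python
import pandas as pd

# Sketch of compositing 1-minute irradiance by satellite cloud class.
# "ghi" is global horizontal irradiance (W/m^2); columns are assumed.
obs = pd.read_csv("surfrad_with_goes_cloud_class.csv",
                  parse_dates=["time"]).set_index("time")

# Ramp = minute-to-minute change in irradiance, a simple variability measure.
obs["ghi_ramp"] = obs["ghi"].diff()

stats = obs.groupby("cloud_class")["ghi_ramp"].agg(["mean", "std", "count"])
print(stats.sort_values("std", ascending=False))  # most variable classes first
```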

  14. Laboratory Directed Research and Development Program FY 2006

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen

    2007-03-08

    The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab or LBNL) is a multi-program national research facility operated by the University of California for the Department of Energy (DOE). As an integral element of DOE's National Laboratory System, Berkeley Lab supports DOE's missions in fundamental science, energy resources, and environmental quality. Berkeley Lab programs advance four distinct goals for DOE and the nation: (1) To perform leading multidisciplinary research in the computing sciences, physical sciences, energy sciences, biosciences, and general sciences in a manner that ensures employee and public safety and protection of the environment. (2) To develop and operate unique national experimental facilities for qualified investigators. (3) To educate and train future generations of scientists and engineers to promote national science and education goals. (4) To transfer knowledge and technological innovations and to foster productive relationships among Berkeley Lab's research programs, universities, and industry in order to promote national economic competitiveness.

  15. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity, and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability, and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81%, and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
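    The core decision every such offloading framework makes is whether a component runs faster locally or remotely once transfer costs are counted. The sketch below shows one common form of that trade-off; the cost model and every parameter are hypothetical illustrations, not the authors' framework.

```python
# Toy offloading decision in the spirit of MCC frameworks like the one
# above. All cost parameters are assumptions and would be measured by
# profiling the device, the network, and the cloud in practice.

def should_offload(cycles, input_bytes, output_bytes,
                   local_mips=2_000, cloud_mips=50_000,
                   bandwidth_bps=5_000_000, rtt_s=0.05):
    """Offload when estimated remote turnaround beats local execution."""
    local_time = cycles / (local_mips * 1e6)
    transfer_time = (input_bytes + output_bytes) * 8 / bandwidth_bps
    remote_time = rtt_s + transfer_time + cycles / (cloud_mips * 1e6)
    return remote_time < local_time

# A compute-heavy component with a small payload is a good candidate:
print(should_offload(cycles=5e9, input_bytes=200_000, output_bytes=10_000))
```

    A lightweight framework in the paper's sense keeps this decision (and any required state synchronization) cheap enough that the decision itself does not erode the savings.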

  16. Idaho National Laboratory Cultural Resource Monitoring Report for FY 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    INL Cultural Resource Management Office

    2010-10-01

    This report describes the cultural resource monitoring activities of the Idaho National Laboratory’s (INL) Cultural Resource Management (CRM) Office during fiscal year 2010 (FY 2010). Throughout the year, thirty-three cultural resource localities were revisited, some more than once, including: two locations with Native American human remains, one of which is a cave, two additional caves, twenty-six prehistoric archaeological sites, two historic stage stations, and Experimental Breeder Reactor-I, which is a designated National Historic Landmark. The resources that were monitored included seventeen that are routinely visited and sixteen that are located in INL project areas. Although impacts were documented at a few locations and one trespassing incident (albeit sans formal charges) was discovered, no significant adverse effects that would threaten the National Register eligibility of any resources were observed. Monitoring also demonstrated that several INL projects generally remain in compliance with recommendations to protect cultural resources.

  17. Genome Resource Banking of Biomedically Important Laboratory Animals

    PubMed Central

    Agca, Yuksel

    2014-01-01

    Genome resource banking (GRB) is the systematic collection, storage, and re-distribution of biomaterials in an organized, logistical, and secure manner. Genome cryobanks usually contain biomaterials and associated genomic information essential for the progression of biomedicine, human health, and research. In that regard, appropriate genome cryobanks could provide essential biomaterials for both current and future research projects in the form of various cell types and tissues, including sperm, oocytes, embryos, embryonic or adult stem cells, induced pluripotent stem cells, and gonadal tissues. In addition to cryobanked germplasm, cryobanking of DNA, serum, blood products, and tissues from scientifically, economically and ecologically important species has become a common practice. For revitalization of the whole organism, cryopreserved germplasm, in conjunction with assisted reproductive technologies (ART), offers a powerful approach for research model management, as well as assisting in animal production for agriculture, conservation, and human reproductive medicine. Recently, many developed and developing countries have allocated substantial resources to establish genome resource banks which are responsible for safeguarding scientifically, economically and ecologically important wild-type, mutant, and transgenic plants, fish, and local livestock breeds, as well as wildlife species. This review is dedicated to the memory of Dr. John K. Critser, who made profound contributions to the science of cryobiology and the establishment of genome research and resources centers for mice, rats and swine. Emphasis will be given to the application of GRBs to species with substantial contributions to the advancement of biomedicine and human health. PMID:22981880

  18. Laboratory x-ray micro-computed tomography: a user guideline for biological samples

    PubMed Central

    2017-01-01

    Laboratory x-ray micro-computed tomography (micro-CT) is a fast-growing method in scientific research applications that allows for non-destructive imaging of morphological structures. This paper provides an easily operated “how to” guide for new potential users and describes the various steps required for successful planning of research projects that involve micro-CT. Background information on micro-CT is provided, followed by relevant setup, scanning, reconstruction, and visualization methods and considerations. Throughout the guide, a Jackson's chameleon specimen, which was scanned at different settings, is used as an interactive example. The ultimate aim of this paper is to make new users familiar with the concepts and applications of micro-CT in an attempt to promote its use in future scientific studies. PMID:28419369

  19. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will come in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  20. Optimizing physician access to surgical intensive care unit laboratory information through mobile computing.

    PubMed

    Strain, J J; Felciano, R M; Seiver, A; Acuff, R; Fagan, L

    1996-01-01

    Approximately 30 minutes of computer access time are required by surgical residents at Stanford University Medical Center (SUMC) to examine the lab values of all patients on a surgical intensive care unit (ICU) service, a task that must be performed several times a day. To reduce the time accessing this information and simultaneously increase the readability and currency of the data, we have created a mobile, pen-based user interface and software system that delivers lab results to surgeons in the ICU. The ScroungeMaster system, loaded on a portable tablet computer, retrieves lab results for a subset of patients from the central laboratory computer and stores them in a local database cache. The cache can be updated on command; this update takes approximately 2.7 minutes for all ICU patients being followed by the surgeon, and can be performed as a background task while the user continues to access selected lab results. The user interface presents lab results according to physiologic system. Which labs are displayed first is governed by a layout selection algorithm based on previous accesses to the patient's lab information, physician preferences, and the nature of the patient's medical condition. Initial evaluation of the system has shown that physicians prefer the ScroungeMaster interface to that of existing systems at SUMC and are satisfied with the system's performance. We discuss the evolution of ScroungeMaster and make observations on changes to physician work flow with the presence of mobile, pen-based computing in the ICU.

  1. Monitoring of computing resource use of active software releases at ATLAS

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; ATLAS Collaboration

    2017-10-01

    The LHC is the world’s most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for the computing resources needed for event reconstruction. We report on the evolution of resource usage, in terms of CPU and RAM, in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, which is the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded to include all workflows, from Monte Carlo generation through to end-user physics analysis, beyond that of event reconstruction. Moreover, the move to a multiprocessor mode in production jobs has facilitated the use of tools, such as “MemoryMonitor”, to measure the memory shared across processors in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is, however, preferentially filtered to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.
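    For readers who want a feel for this kind of job-level sampling, here is a minimal sketch using the cross-platform psutil package rather than ATLAS's internal PerfMon/MemoryMonitor tools; the PID and sampling cadence are hypothetical.

```python
import time
import psutil  # generic substitute for ATLAS-internal monitoring tools

def sample_job(pid, interval=5.0, samples=10):
    """Periodically report RSS and accumulated CPU time for a job's process tree."""
    root = psutil.Process(pid)
    for _ in range(samples):
        procs = [root] + root.children(recursive=True)
        rss = sum(p.memory_info().rss for p in procs)
        cpu = sum(p.cpu_times().user + p.cpu_times().system for p in procs)
        print(f"rss={rss / 2**20:.0f} MiB  cpu={cpu:.1f} s  nproc={len(procs)}")
        time.sleep(interval)

sample_job(pid=12345)  # hypothetical job PID
```

    Aggregating such samples per software domain, as the abstract describes, is then a matter of mapping processes (or instrumented code regions) to domains before summing.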

  2. Controlling Laboratory Processes From A Personal Computer

    NASA Technical Reports Server (NTRS)

    Will, H.; Mackin, M. A.

    1991-01-01

    Computer program provides natural-language process control from an IBM PC or compatible computer. Sets up a process-control system that either runs without an operator or is run by workers who have limited programming skills. Includes three smaller programs. Two of them, written in FORTRAN 77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify user commands with device-driving routines written by the user. Also includes a set of input data allowing the user to define the user commands to be executed by the computer. Requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. Also requires a FORTRAN 77 compiler and device drivers written by the user.

  3. Computing with Beowulf

    NASA Technical Reports Server (NTRS)

    Cohen, Jarrett

    1999-01-01

    Parallel computers built out of mass-market parts are cost-effectively performing data processing and simulation tasks. The Supercomputing (now known as "SC") series of conferences celebrated its 10th anniversary last November. While vendors have come and gone, the dominant paradigm for tackling big problems still is a shared-resource, commercial supercomputer. Growing numbers of users needing a cheaper or dedicated-access alternative are building their own supercomputers out of mass-market parts. Such machines are generally called Beowulf-class systems after the 11th century epic. This modern-day Beowulf story began in 1994 at NASA's Goddard Space Flight Center. A laboratory for the Earth and space sciences, computing managers there threw down a gauntlet to develop a $50,000 gigaFLOPS workstation for processing satellite data sets. Soon, Thomas Sterling and Don Becker were working on the Beowulf concept at the University Space Research Association (USRA)-run Center of Excellence in Space Data and Information Sciences (CESDIS). Beowulf clusters mix three primary ingredients: commodity personal computers or workstations, low-cost Ethernet networks, and the open-source Linux operating system. One of the larger Beowulfs is Goddard's Highly-parallel Integrated Virtual Environment, or HIVE for short.

  4. BROOKHAVEN NATIONAL LABORATORY WILDLIFE MANAGEMENT PLAN.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NAIDU,J.R.

    2002-10-22

    The purpose of the Wildlife Management Plan (WMP) is to promote stewardship of the natural resources found at the Brookhaven National Laboratory (BNL), and to integrate their protection with pursuit of the Laboratory's mission.

  5. Modular laboratories--cost-effective and sustainable infrastructure for resource-limited settings.

    PubMed

    Bridges, Daniel J; Colborn, James; Chan, Adeline S T; Winters, Anna M; Dengala, Dereje; Fornadel, Christen M; Kosloff, Barry

    2014-12-01

    High-quality laboratory space to support basic science, clinical research projects, or health services is often severely lacking in the developing world. Moreover, the construction of suitable facilities using traditional methods is time-consuming, expensive, and challenging to implement. Three real-world examples showing how shipping containers can be converted into modern laboratories are highlighted: use as an insectary, a molecular laboratory, and a BSL-3 containment laboratory. These modular conversions have a number of advantages over brick-and-mortar construction and provide a cost-effective and timely solution offering high-quality, user-friendly laboratory space within the developing world. © The American Society of Tropical Medicine and Hygiene.

  6. Computer-assisted enzyme immunoassays and simplified immunofluorescence assays: applications for the diagnostic laboratory and the veterinarian's office.

    PubMed

    Jacobson, R H; Downing, D R; Lynch, T J

    1982-11-15

    A computer-assisted enzyme-linked immunosorbent assay (ELISA) system, based on kinetics of the reaction between substrate and enzyme molecules, was developed for testing large numbers of sera in laboratory applications. Systematic and random errors associated with conventional ELISA technique were identified leading to results formulated on a statistically validated, objective, and standardized basis. In a parallel development, an inexpensive system for field and veterinary office applications contained many of the qualities of the computer-assisted ELISA. This system uses a fluorogenic indicator (rather than the enzyme-substrate interaction) in a rapid test (15 to 20 minutes' duration) which promises broad application in serodiagnosis.
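    A kinetic assay of this sort reads the reaction rate as the slope of signal versus time rather than as a single endpoint measurement, which is what lends itself to statistical validation. A minimal sketch of that slope estimate, with hypothetical readings, follows.

```python
import numpy as np

# Sketch of the kinetic-ELISA idea: estimate the enzyme-substrate
# reaction rate as the slope of absorbance vs. time. Data values
# are hypothetical, not from the paper.
times = np.array([0, 30, 60, 90, 120, 150])           # seconds
absorbance = np.array([0.05, 0.11, 0.18, 0.24, 0.30, 0.37])

slope, intercept = np.polyfit(times, absorbance, 1)   # least-squares line
print(f"reaction rate: {slope * 1000:.2f} mOD/s")

# Replicate slopes (not raw optical densities) are what a statistically
# validated, standardized readout would be built on.
```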

  7. Designing Online Resources in Preparation for Authentic Laboratory Experiences

    PubMed Central

    Boulay, Rachel; Parisky, Alex; Leong, Peter

    2013-01-01

    Professional development for science teachers can benefit from active learning in science laboratories. However, how online training materials can be used to complement traditional laboratory training is less well understood. This paper explores the design of online training modules to teach molecular biology, and user perception of those modules, which were part of an intensive molecular biology “boot camp” targeting high school biology teachers in the State of Hawaii. The John A. Burns School of Medicine at the University of Hawaii had an opportunity to design and develop professional development that prepares science teachers with an introduction to the skills, techniques, and applications their students need to conduct medical research in a laboratory setting. A group of 29 experienced teachers shared their opinions of the online materials and reported on how they used the online materials in their learning process or teaching. PMID:24319698

  8. Using Mosix for Wide-Area Computational Resources

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.

  9. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity, and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability, and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81%, and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  10. Optimisation of the usage of LHC and local computing resources in a multidisciplinary physics department hosting a WLCG Tier-2 centre

    NASA Astrophysics Data System (ADS)

    Barberis, Stefano; Carminati, Leonardo; Leveraro, Franco; Mazza, Simone Michele; Perini, Laura; Prelz, Francesco; Rebatto, David; Turra, Ruggero; Vaccarossa, Luca; Villaplana, Miguel

    2015-12-01

    We present the approach of the University of Milan Physics Department and the local unit of INFN to allow and encourage the sharing among different research areas of computing, storage and networking resources (the largest ones being those composing the Milan WLCG Tier-2 centre and tailored to the needs of the ATLAS experiment). Computing resources are organised as independent HTCondor pools, with a global master in charge of monitoring them and optimising their usage. The configuration has to provide satisfactory throughput for both serial and parallel (multicore, MPI) jobs. A combination of local, remote, and cloud storage options is available. The experience of users from different research areas operating on this shared infrastructure is discussed. The promising direction of improving scientific computing throughput by federating access to distributed computing and storage also seems to fit very well with the objectives listed in the European Horizon 2020 framework for research and development.

  11. UBioLab: a web-laboratory for ubiquitous in-silico experiments.

    PubMed

    Bartocci, Ezio; Cacciagrano, Diletta; Di Berardini, Maria Rita; Merelli, Emanuela; Vito, Leonardo

    2012-07-09

    The huge and dynamic amount of bioinformatic resources (e.g., data and tools) available nowadays on the Internet represents a big challenge for biologists –as concerns their management and visualization– and for bioinformaticians –as concerns the possibility of rapidly creating and executing in-silico experiments involving resources and activities spread over the WWW hyperspace. Any framework aiming at integrating such resources as in a physical laboratory must tackle –and possibly handle in a transparent and uniform way– aspects concerning physical distribution, semantic heterogeneity, and the co-existence of different computational paradigms and, as a consequence, of different invocation interfaces (i.e., OGSA for Grid nodes, SOAP for Web Services, Java RMI for Java objects, etc.). The framework UBioLab has been designed and developed as a prototype with the above objective. Several architectural features –such as being fully Web-based and combining domain ontologies, Semantic Web, and workflow techniques– give evidence of an effort in this direction. The integration of a semantic knowledge management system for distributed (bioinformatic) resources, a semantic-driven graphic environment for defining and monitoring ubiquitous workflows, and an intelligent agent-based technology for their distributed execution allows UBioLab to be a semantic guide for bioinformaticians and biologists, providing (i) a flexible environment for visualizing, organizing and inferring any (semantic and computational) "type" of domain knowledge (e.g., resources and activities, expressed in a declarative form), (ii) a powerful engine for defining and storing semantic-driven ubiquitous in-silico experiments on the domain hyperspace, as well as (iii) a transparent, automatic and distributed environment for correct experiment executions.

  12. UBioLab: a web-LABoratory for Ubiquitous in-silico experiments.

    PubMed

    Bartocci, E; Di Berardini, M R; Merelli, E; Vito, L

    2012-03-01

    The huge and dynamic amount of bioinformatic resources (e.g., data and tools) available nowadays on the Internet represents a big challenge for biologists -as concerns their management and visualization- and for bioinformaticians -as concerns the possibility of rapidly creating and executing in-silico experiments involving resources and activities spread over the WWW hyperspace. Any framework aiming at integrating such resources as in a physical laboratory must tackle -and possibly handle in a transparent and uniform way- aspects concerning physical distribution, semantic heterogeneity, and the co-existence of different computational paradigms and, as a consequence, of different invocation interfaces (i.e., OGSA for Grid nodes, SOAP for Web Services, Java RMI for Java objects, etc.). The framework UBioLab has been designed and developed as a prototype with the above objective. Several architectural features -such as being fully Web-based and combining domain ontologies, Semantic Web, and workflow techniques- give evidence of an effort in this direction. The integration of a semantic knowledge management system for distributed (bioinformatic) resources, a semantic-driven graphic environment for defining and monitoring ubiquitous workflows, and an intelligent agent-based technology for their distributed execution allows UBioLab to be a semantic guide for bioinformaticians and biologists, providing (i) a flexible environment for visualizing, organizing and inferring any (semantic and computational) "type" of domain knowledge (e.g., resources and activities, expressed in a declarative form), (ii) a powerful engine for defining and storing semantic-driven ubiquitous in-silico experiments on the domain hyperspace, as well as (iii) a transparent, automatic and distributed environment for correct experiment executions.

  13. The use of computers to teach human anatomy and physiology to allied health and nursing students

    NASA Astrophysics Data System (ADS)

    Bergeron, Valerie J.

    Educational institutions are under tremendous pressure to adopt the newest technologies in order to prepare their students to meet the challenges of the twenty-first century. For the last twenty years huge amounts of money have been spent on computers, printers, software, multimedia projection equipment, and so forth. A reasonable question is, "Has it worked?" Has this infusion of resources, financial as well as human, resulted in improved learning? Are the students meeting the intended learning goals? Any attempt to develop answers to these questions should include examining the intended goals and exploring the effects of the changes on students and faculty. This project investigated the impact of a specific application of a computer program in a community college setting on students' attitudes and understanding of human anatomy and physiology. In this investigation, two sites of the same community college, seven miles apart and with seemingly similar student populations, used different laboratory activities to teach human anatomy and physiology. At one site nursing students were taught using traditional dissections and laboratory activities; at the other site two of the dissections, specifically cat and sheep pluck, were replaced with the A.D.A.M.™ (Animated Dissection of Anatomy for Medicine) computer program. Analysis of the attitude data indicated that students at both sites were extremely positive about their laboratory experiences. Analysis of the content data indicated a statistically significant difference in performance between the two sites in two of the eight content areas that were studied. For both topics the students using the computer program scored higher. A detailed analysis of the surveys, interviews with faculty and students, examination of laboratory materials, observations of laboratory facilities at both sites, and a cost-benefit analysis led to the development of seven recommendations. The recommendations call for action at the level of the

  14. Enhanced delegated computing using coherence

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Dunjko, Vedran; Schlederer, Florian; Moore, Merritt; Kashefi, Elham; Walmsley, Ian A.

    2016-03-01

    A longstanding question is whether it is possible to delegate computational tasks securely—such that neither the computation nor the data is revealed to the server. Recently, both a classical and a quantum solution to this problem were found [C. Gentry, in Proceedings of the 41st Annual ACM Symposium on the Theory of Computing (Association for Computing Machinery, New York, 2009), pp. 167-178; A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, Los Alamitos, CA, 2009), pp. 517-526]. Here, we study the first step towards the interplay between classical and quantum approaches and show how coherence can be used as a tool for secure delegated classical computation. We show that a client with limited computational capacity—restricted to an XOR gate—can perform universal classical computation by manipulating information carriers that may occupy superpositions of two states. Using single photonic qubits or coherent light, we experimentally implement secure delegated classical computations between an independent client and a server, which are installed in two different laboratories and separated by 50 m. The server has access to the light sources and measurement devices, whereas the client may use only a restricted set of passive optical devices to manipulate the information-carrying light beams. Thus, our work highlights how minimal quantum and classical resources can be combined and exploited for classical computing.

  15. Universal computer test stand (recommended computer test requirements). [for space shuttle computer evaluation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Techniques are considered which would be used to characterize aerospace computers, with the space shuttle application as the end usage. The system-level digital problems which have been encountered and documented are surveyed. From the large cross section of tests, an optimum set is recommended that has a high probability of discovering documented system-level digital problems within laboratory environments. A baseline hardware/software system, required as a laboratory tool to test aerospace computers, is defined. Hardware and software baselines and the additions necessary to interface the UTE to aerospace computers for test purposes are outlined.

  16. Reducing usage of the computational resources by event driven approach to model predictive control

    NASA Astrophysics Data System (ADS)

    Misik, Stefan; Bradac, Zdenek; Cela, Arben

    2017-08-01

    This paper deals with real-time and optimal control of dynamic systems while also considering the constraints to which these systems might be subject. The main objective of this work is to propose a simple modification of the existing Model Predictive Control approach to better suit the needs of computationally resource-constrained real-time systems. An example using a model of a mechanical system is presented, and the performance of the proposed method is evaluated in a simulated environment.
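    To make the event-driven idea concrete, the sketch below re-solves a finite-horizon optimal control problem only when the measured state drifts from the last prediction beyond a threshold, and otherwise replays the stored plan. The scalar plant, horizon, threshold, and tuning are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Event-driven MPC sketch: skip re-optimization while the old plan is
# still valid, saving computation on resource-constrained systems.
A, B, Q, R, N = 1.0, 0.5, 1.0, 0.1, 20  # illustrative scalar plant and costs

def solve_horizon(x0):
    """Finite-horizon LQR via a backward Riccati recursion (scalar case)."""
    P, gains = Q, []
    for _ in range(N):
        K = (B * P * A) / (R + B * P * B)
        P = Q + A * P * (A - B * K)
        gains.append(K)
    gains.reverse()
    xs, us, x = [x0], [], x0
    for K in gains:
        u = -K * x
        x = A * x + B * u
        us.append(u)
        xs.append(x)
    return us, xs  # planned inputs and predicted states

rng = np.random.default_rng(0)
x, plan_u, plan_x, k, solves = 5.0, [], [], 0, 0
for step in range(50):
    # Event condition: plan exhausted, or drift from prediction too large.
    drift = abs(x - plan_x[k]) if k < len(plan_u) else float("inf")
    if drift > 0.2:
        plan_u, plan_x = solve_horizon(x)
        k, solves = 0, solves + 1
    u = plan_u[k]
    k += 1
    x = A * x + B * u + rng.normal(0.0, 0.02)  # plant with a small disturbance
print(f"re-optimizations: {solves}/50 steps, final |x| = {abs(x):.3f}")
```

    With a small disturbance the controller re-optimizes only occasionally, which is the computational saving the paper targets; a larger threshold trades control performance for fewer solves.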

  17. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

    Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer,""Interfacing the Spectronic 20 to a Computer,""A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  18. Laboratory Directed Research and Development FY2010 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, K J

    2011-03-22

    A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has at its core a primary national security mission - to ensure the safety, security, and reliability of the nation's nuclear weapons stockpile without nuclear testing, and to prevent and counter the spread and use of weapons of mass destruction: nuclear, chemical, and biological. The Laboratory uses the scientific and engineering expertise and facilities developed for its primary mission to pursue advanced technologies to meet other important national security needs - homeland defense, military operations, and missile defense, for example - that evolve in response to emerging threats. For broader national needs, LLNL executes programs in energy security, climate change and long-term energy needs, environmental assessment and management, bioscience and technology to improve human health, and breakthroughs in fundamental science and technology. With this multidisciplinary expertise, the Laboratory serves as a science and technology resource to the U.S. government and as a partner with industry and academia. This annual report discusses the following topics: (1) Advanced Sensors and Instrumentation; (2) Biological Sciences; (3) Chemistry; (4) Earth and Space Sciences; (5) Energy Supply and Use; (6) Engineering and Manufacturing Processes; (7) Materials Science and Technology; (8) Mathematics and Computing Science; (9) Nuclear Science and Engineering; and (10) Physics.

  19. Laboratory Directed Research and Development FY2011 Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, W; Sketchley, J; Kotta, P

    2012-03-22

    A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has earned the reputation as a leader in providing science and technology solutions to the most pressing national and global security problems. The LDRD Program, established by Congress at all DOE national laboratories in 1991, is LLNL's most important single resource for fostering excellent science and technology for today's needs and tomorrow's challenges. The LDRD internally directed research and development funding at LLNL enables high-risk, potentially high-payoff projects at the forefront of science and technology. The LDRD Program at Livermore serves to: (1) Support the Laboratory's missions, strategic plan, and foundational science; (2) Maintain the Laboratory's science and technology vitality; (3) Promote recruiting and retention; (4) Pursue collaborations; (5) Generate intellectual property; and (6) Strengthen the U.S. economy. Myriad LDRD projects over the years have made important contributions to every facet of the Laboratory's mission and strategic plan, including its commitment to nuclear, global, and energy and environmental security, as well as cutting-edge science and technology and engineering in high-energy-density matter, high-performance computing and simulation, materials and chemistry at the extremes, information systems, measurements and experimental science, and energy manipulation. A summary of each project was submitted by the principal investigator. Project summaries include the scope, motivation, goals, relevance to DOE/NNSA and LLNL mission areas, the technical progress achieved in FY11, and a list of publications that resulted from the research. The projects are: (1) Nuclear Threat Reduction; (2) Biosecurity; (3) High-Performance Computing and Simulation; (4) Intelligence; (5) Cybersecurity; (6) Energy Security; (7) Carbon Capture; (8) Material Properties, Theory, and Design; (9) Radiochemistry; (10) High-Energy-Density Science; (11) Laser

  20. DB4US: A Decision Support System for Laboratory Information Management

    PubMed Central

    Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-01-01

    are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results: DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions: The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745
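    The "parallel processes precalculate indicators" design can be sketched as follows; the indicator functions and request records below are hypothetical stand-ins, not the DB4US schema.

```python
from concurrent.futures import ProcessPoolExecutor

# Each indicator runs in its own worker process, so the dashboard only
# ever reads ready-made values instead of computing on demand.

def turnaround_minutes(requests):
    done = [r for r in requests if r["reported_at"] is not None]
    return sum((r["reported_at"] - r["received_at"]) / 60.0 for r in done) / len(done)

def rejection_rate(requests):
    return sum(1 for r in requests if r["rejected"]) / len(requests)

INDICATORS = {"turnaround_min": turnaround_minutes,
              "rejection_rate": rejection_rate}

def precalculate(requests):
    with ProcessPoolExecutor() as pool:
        futures = {n: pool.submit(fn, requests) for n, fn in INDICATORS.items()}
        return {n: f.result() for n, f in futures.items()}

if __name__ == "__main__":
    sample = [
        {"received_at": 0, "reported_at": 3600, "rejected": False},
        {"received_at": 0, "reported_at": 7200, "rejected": True},
    ]
    print(precalculate(sample))  # {'turnaround_min': 90.0, 'rejection_rate': 0.5}
```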

  1. DB4US: A Decision Support System for Laboratory Information Management.

    PubMed

    Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael

    2012-11-14

    needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automates the processing of information related to laboratory quality indicators and offers a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.

  2. [Medical support on human resources and clinical laboratory in Myanmar].

    PubMed

    Koide, Norio

    2012-03-01

    I have been involved in medical cooperation programs between Myanmar and Japan for over 10 years. The purpose of my first visit to Myanmar was to investigate hepatitis C spreading among thalassemia patients. I learned that the medical system was underdeveloped in this country, and have since initiated several cooperation programs together with Professor Shigeru Okada, such as "Protection against hepatitis C in Myanmar", "Scientist exchange between the Ministry of Health, Myanmar and Okayama University", and various activities sponsored by a non-profit organization. As for clinical laboratories, the laboratory system itself is still at a pre-construction stage, and the benefit of a clinical laboratory in modern medicine is not yet available to patients in Myanmar. The donation of drugs and reagents for laboratory tests is helpful, but it will be more helpful to assist future leaders in learning modern medicine and developing their own systems to support modern medicine. Our activities in the cooperation program are described.

  3. e-Science and data management resources on the Web.

    PubMed

    Gore, Sally A

    2011-01-01

    The way research is conducted has changed over time, from simple experiments to computer modeling and simulation, from individuals working in isolated laboratories to global networks of researchers collaborating on a single topic. Often, this new paradigm results in the generation of staggering amounts of data. The intensive use of data and the existence of networks of researchers characterize e-Science. The role of libraries and librarians in e-Science has been a topic of interest for some time now. This column looks at tools, resources, and projects that demonstrate successful collaborations between libraries and researchers in e-Science.

  4. Special Education Teacher Computer Literacy Training. Project STEEL. A Special Project To Develop and Implement a Computer-Based Special Teacher Education and Evaluation Laboratory. Volume II. Final Report.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; And Others

    The document is part of the final report on Project STEEL (Special Teacher Education and Evaluation Laboratory) intended to extend the utilization of technology in the training of preservice special education teachers. This volume focuses on the second of four project objectives, the development of a special education teacher computer literacy…

  5. 30 CFR 14.21 - Laboratory-scale flame test apparatus.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    30 Mineral Resources 1, 2010-07-01. Mineral Resources; MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; TESTING... Technical Requirements. § 14.21 Laboratory-scale flame test apparatus. The principal parts of the apparatus...

  6. 30 CFR 14.21 - Laboratory-scale flame test apparatus.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    30 Mineral Resources 1, 2011-07-01. Mineral Resources; MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR; TESTING... Technical Requirements. § 14.21 Laboratory-scale flame test apparatus. The principal parts of the apparatus...

  7. The Collaborative Cross at Oak Ridge National Laboratory: developing a powerful resource for systems genetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chesler, Elissa J; Branstetter, Lisa R; Churchill, Gary A

    2008-01-01

    Complex traits and disease co-morbidity in humans and in model organisms are the result of naturally occurring polymorphisms that interact with each other and with the environment. To ensure the availability of the resources needed to investigate biomolecular networks and ultimately systems-level phenotypes, we have initiated breeding of a new genetic reference population of mice, the Collaborative Cross. This population has been designed to optimally support systems genetics analysis. Its novel and important features include high levels of genetic diversity, a large population size to ensure sufficient power in high-dimensional studies, and high mapping precision through accumulation of independent recombination events. Implementation of the Collaborative Cross has been in progress at the Oak Ridge National Laboratory (ORNL) since May 2005. This is achieved through a software-assisted breeding program with fully traceable lineages, performed in a uniform environment. Currently, there are 650 lines in production, with almost 200 lines over seven generations of inbreeding. Retired breeders enter a high-throughput phenotyping protocol and DNA samples are banked for analysis of recombination history, allele loss, and population structure. Herein we present a progress report of the Collaborative Cross breeding program at ORNL and a description of the kinds of investigations that this resource will support.

  8. Lawrence Berkeley Laboratory Institutional Plan, FY 1993--1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-10-01

    The FY 1993--1998 Institutional Plan provides an overview of the Lawrence Berkeley Laboratory mission, strategic plan, scientific initiatives, research programs, environment and safety program plans, educational and technology transfer efforts, human resources, and facilities needs. The Strategic Plan section identifies long-range conditions that can influence the Laboratory, potential research trends, and several management implications. The Initiatives section identifies potential new research programs that represent major long-term opportunities for the Laboratory and the resources required for their implementation. The Scientific and Technical Programs section summarizes current programs and potential changes in research program activity. The Environment, Safety, and Health section describes the management systems and programs underway at the Laboratory to protect the environment, the public, and the employees. The Technology Transfer and Education programs section describes current and planned programs to enhance the nation's scientific literacy and human infrastructure and to improve economic competitiveness. The Human Resources section identifies LBL staff composition and development programs. The section on Site and Facilities discusses resources required to sustain and improve the physical plant and its equipment. The Resource Projections are estimates of required budgetary authority for the Laboratory's ongoing research programs. The plan is an institutional management report for integration with the Department of Energy's strategic planning activities that is developed through an annual planning process. The plan identifies technical and administrative directions in the context of the National Energy Strategy and the Department of Energy's program planning initiatives. Preparation of the plan is coordinated by the Office for Planning and Development from information contributed by the Laboratory's scientific and support divisions.

  9. Computer Security for Commercial Nuclear Power Plants - Literature Review for Korea Hydro Nuclear Power Central Research Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duran, Felicia Angelica; Waymire, Russell L.

    2013-10-01

    Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations [U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency] related to protection of information technology resources, primarily digital controls and computer resources and their data networks. Copies of the key documents have also been provided to KHNP-CRI.

  10. Interactive computer methods for generating mineral-resource maps

    USGS Publications Warehouse

    Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.

    1980-01-01

    Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used (batch processing and off-line plotting), the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: (1) Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and (2) the Interactive CAM (Cartographic Automatic Mapping system), which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in the batch mode and stores plotting instructions on a disk, rather than on a tape. The disk can be accessed by a CRT, and, thus, the user can view and evaluate the map output on a CRT immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can, therefore, do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.
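
    The Quick-Plot idea translates directly into modern terms. The sketch below is hypothetical throughout (the record fields, the quick_plot helper, and matplotlib standing in for the CRT device are all assumptions, not the 1980 USGS program): filter a retrieved subset of records by commodity and draw the points immediately for evaluation, instead of waiting days for an off-line plot.

        # Hypothetical sketch of the Quick-Plot pattern; not the USGS code.
        import matplotlib.pyplot as plt

        records = [                               # stand-in for a CRIB retrieval
            {"commodity": "Cu", "lon": -110.5, "lat": 44.2},
            {"commodity": "Au", "lon": -111.0, "lat": 43.7},
            {"commodity": "Cu", "lon": -109.8, "lat": 44.9},
        ]

        def quick_plot(records, commodity, symbol="^"):
            """Plot one commodity's occurrences straight to the screen."""
            pts = [r for r in records if r["commodity"] == commodity]
            plt.scatter([r["lon"] for r in pts], [r["lat"] for r in pts],
                        marker=symbol, label=commodity)
            plt.xlabel("longitude")
            plt.ylabel("latitude")
            plt.legend()
            plt.show()                            # immediate on-screen evaluation

        quick_plot(records, "Cu")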

  11. Accomplishment Summary 1968-1969. Biological Computer Laboratory.

    ERIC Educational Resources Information Center

    Von Foerster, Heinz; And Others

    This report summarizes theoretical, applied, and experimental studies in the areas of computational principles in complex intelligent systems, cybernetics, multivalued logic, and the mechanization of cognitive processes. This work is summarized under the following topic headings: properties of complex dynamic systems; computers and the language…

  12. Sandia National Laboratories: News: Media Resources: Media Contacts

    Science.gov Websites

  13. Sandia National Laboratories: Employee & Retiree Resources: Remote Access

    Science.gov Websites

  14. A Resource Service Model in the Industrial IoT System Based on Transparent Computing.

    PubMed

    Li, Weimin; Wang, Bin; Sheng, Jinfang; Dong, Ke; Li, Zitong; Hu, Yixiang

    2018-03-26

    The Internet of Things (IoT) has received a lot of attention, especially in industrial scenarios. One of the typical applications is the intelligent mine, which actually constructs the Six-Hedge underground systems with IoT platforms. Based on a case study of the Six Systems in the underground metal mine, this paper summarizes the main challenges of industrial IoT from the aspects of heterogeneity in devices and resources, security, reliability, and deployment and maintenance costs. Then, a novel resource service model for industrial IoT applications based on Transparent Computing (TC) is presented, which supports centralized management of all resources, including the operating system (OS), programs and data, on the server side for the IoT devices, thus offering an effective, reliable, secure and cross-OS IoT service and reducing the costs of IoT system deployment and maintenance. The model has five layers: sensing layer, aggregation layer, network layer, service and storage layer, and interface and management layer. We also present a detailed analysis of the system architecture and key technologies of the model. Finally, the efficiency of the model is shown by an experimental prototype system.
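
    The server-side resource model the paper builds on can be pictured with a small sketch. Everything here is hypothetical (class names, block layout); it only illustrates the transparent-computing idea that devices hold no permanent OS or program images and fetch blocks from a central server on demand, which is what makes management, updates, and security server-side concerns.

        # Hypothetical sketch of on-demand, server-side resource delivery.
        class BlockServer:
            """Central store for OS/program/data blocks (assumed layout)."""
            def __init__(self, images):
                self.images = images              # {"image-name": [block, ...]}

            def read(self, image, i):
                return self.images[image][i]

        class ThinDevice:
            """IoT device with no local image; fetches blocks and caches them."""
            def __init__(self, server):
                self.server = server
                self.cache = {}

            def read(self, image, i):
                if (image, i) not in self.cache:  # cache miss: fetch from server
                    self.cache[(image, i)] = self.server.read(image, i)
                return self.cache[(image, i)]

        server = BlockServer({"sensor-os": [b"boot", b"kernel", b"app"]})
        device = ThinDevice(server)
        print(device.read("sensor-os", 0))        # b'boot': fetched once, then cached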

  15. A Resource Service Model in the Industrial IoT System Based on Transparent Computing

    PubMed Central

    Wang, Bin; Sheng, Jinfang; Dong, Ke; Li, Zitong; Hu, Yixiang

    2018-01-01

    The Internet of Things (IoT) has received a lot of attention, especially in industrial scenarios. One of the typical applications is the intelligent mine, which actually constructs the Six-Hedge underground systems with IoT platforms. Based on a case study of the Six Systems in the underground metal mine, this paper summarizes the main challenges of industrial IoT from the aspects of heterogeneity in devices and resources, security, reliability, and deployment and maintenance costs. Then, a novel resource service model for industrial IoT applications based on Transparent Computing (TC) is presented, which supports centralized management of all resources, including the operating system (OS), programs and data, on the server side for the IoT devices, thus offering an effective, reliable, secure and cross-OS IoT service and reducing the costs of IoT system deployment and maintenance. The model has five layers: sensing layer, aggregation layer, network layer, service and storage layer, and interface and management layer. We also present a detailed analysis of the system architecture and key technologies of the model. Finally, the efficiency of the model is shown by an experimental prototype system. PMID:29587450

  16. Ernest Orlando Lawrence Berkeley National Laboratory institutional plan, FY 1996--2001

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    The FY 1996--2001 Institutional Plan provides an overview of the Ernest Orlando Lawrence Berkeley National Laboratory mission, strategic plan, core business areas, critical success factors, and the resource requirements to fulfill its mission in support of national needs in fundamental science and technology, energy resources, and environmental quality. The Laboratory Strategic Plan section identifies long-range conditions that will influence the Laboratory, as well as potential research trends and management implications. The Core Business Areas section identifies those initiatives that are potential new research programs representing major long-term opportunities for the Laboratory, and the resources required for their implementation. It also summarizes current programs and potential changes in research program activity, science and technology partnerships, and university and science education. The Critical Success Factors section reviews human resources; work force diversity; environment, safety, and health programs; management practices; site and facility needs; and communications and trust. The Resource Projections are estimates of required budgetary authority for the Laboratory's ongoing research programs. The Institutional Plan is a management report for integration with the Department of Energy's strategic planning activities, developed through an annual planning process. The plan identifies technical and administrative directions in the context of the national energy policy and research needs and the Department of Energy's program planning initiatives. Preparation of the plan is coordinated by the Office of Planning and Communications from information contributed by the Laboratory's scientific and support divisions.

  17. The Tanzania experience: clinical laboratory testing harmonization and equipment standardization at different levels of a tiered health laboratory system.

    PubMed

    Massambu, Charles; Mwangi, Christina

    2009-06-01

    The rapid scale-up of the care and treatment programs in Tanzania during the preceding 4 years has greatly increased the demand for quality laboratory services for diagnosis of HIV and monitoring of patients during antiretroviral therapy. Laboratory services were not in a position to cope with this demand owing to poor infrastructure, lack of human resources, erratic or absent supply of reagents and commodities, and slow manual technologies. With the limited human resources in the laboratory and the need for scaling up the care and treatment program, it became necessary to install automated equipment and train personnel for the increased volume of testing and new tests across all laboratory levels. With the numerous partners procuring equipment, the possibility arose of a multitude of equipment platforms with attendant challenges for procurement of reagents, maintenance of equipment, and quality assurance. Tanzania, therefore, had to harmonize laboratory tests and standardize laboratory equipment at different levels of the laboratory network. The process of harmonization of tests and standardization of equipment included assessment of laboratories, review of guidelines, development of a national laboratory operational plan, and stakeholder advocacy. This document outlines this process.

  18. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004, including wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefronts on pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by research investigators in their respective areas of expertise cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly relevant to petroleum applications.

  19. Research in remote sensing of agriculture, earth resources, and man's environment

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A.

    1974-01-01

    Research performed on NASA and USDA remote sensing projects is reviewed and includes: (1) the 1971 Corn Blight Watch Experiment; (2) crop identification; (3) soil mapping; (4) land use inventories; (5) geologic mapping; and (6) forest and water resources data collection. The extent to which ERTS images and airborne data were used is indicated along with computer implementation. A field and laboratory spectroradiometer system is described together with the LARSYS software system, both of which were widely used during the research. Abstracts are included of 160 technical reports published as a result of the work.

  20. Issues in undergraduate education in computational science and high performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marchioro, T.L. II; Martin, D.

    1994-12-31

    The ever-increasing need for mathematical and computational literacy within society and among members of the work force has generated enormous pressure to revise and improve the teaching of related subjects throughout the curriculum, particularly at the undergraduate level. The Calculus Reform movement is perhaps the best-known example of an organized initiative in this regard. The UCES (Undergraduate Computational Engineering and Science) project, an effort funded by the Department of Energy and administered through the Ames Laboratory, is sponsoring an informal and open discussion of the salient issues confronting efforts to improve and expand the teaching of computational science as a problem-oriented, interdisciplinary approach to scientific investigation. Although the format is open, the authors hope to consider pertinent questions such as: (1) How can faculty and research scientists obtain the recognition necessary to further excellence in teaching the mathematical and computational sciences? (2) What sort of educational resources--both hardware and software--are needed to teach computational science at the undergraduate level? Are traditional procedural languages sufficient? Are PCs enough? Are massively parallel platforms needed? (3) How can electronic educational materials be distributed in an efficient way? Can they be made interactive in nature? How should such materials be tied to the World Wide Web and the growing "Information Superhighway"?

  1. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  2. Computer Engineers.

    ERIC Educational Resources Information Center

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  3. The Development of an Individualized Instructional Program in Beginning College Mathematics Utilizing Computer Based Resource Units. Final Report.

    ERIC Educational Resources Information Center

    Rockhill, Theron D.

    Reported is an attempt to develop and evaluate an individualized instructional program in pre-calculus college mathematics. Four computer-based resource units were developed in the areas of set theory, relations and functions, algebra, trigonometry, and analytic geometry. Objectives were determined by experienced calculus teachers, and…

  4. Exploring Electronics Laboratory Experiments Using Computer Software

    ERIC Educational Resources Information Center

    Gandole, Yogendra Babarao

    2011-01-01

    The roles of teachers and students are changing, and there are undoubtedly ways of learning not yet discovered. However, computer and software technology may play a significant role in identifying problems, presenting solutions, and supporting life-long learning. It is clear that computer-based educational technology has reached the point where…

  5. To Compare the Effects of Computer Based Learning and the Laboratory Based Learning on Students' Achievement Regarding Electric Circuits

    ERIC Educational Resources Information Center

    Bayrak, Bekir; Kanli, Uygar; Kandil Ingeç, Sebnem

    2007-01-01

    In this study, the research problem was: "Is computer-based physics instruction as effective as laboratory-intensive physics instruction with regard to the academic success of 9th-grade students on electric circuits?" For this experimental study, a pre-test and post-test design was applied with an experiment and a control…

  6. Computer-aided analysis of Skylab scanner data for land use mapping, forestry and water resource applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1975-01-01

    Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.

  7. Bounding the Resource Availability of Partially Ordered Events with Constant Resource Impact

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy

    2004-01-01

    We compare existing techniques to bound the resource availability of partially ordered events. We first show that, contrary to intuition, two existing techniques, one due to Laborie and one due to Muscettola, are not strictly comparable in terms of the size of the search trees generated under chronological search with a fixed heuristic. We describe a generalization of these techniques, called the Flow Balance Constraint, to tightly bound the amount of available resource for a set of partially ordered events with piecewise constant resource impact. We prove that the new technique generates smaller proof trees under chronological search with a fixed heuristic, at little increase in computational expense. We then show how to construct tighter resource bounds but at increased computational cost.
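
    For contrast with the paper's contribution, the sketch below implements only the naive optimistic bound that such techniques tighten: assume every unordered producer runs before event e and every unordered consumer runs after it. All names are invented, the precedence set is assumed transitively closed, and this is not the Flow Balance Constraint itself.

        # Naive optimistic upper bound on resource availability just before `e`.
        def optimistic_bound(events, before, e, initial=0.0):
            """events: {name: resource delta}; before: transitively closed
            set of (a, b) pairs meaning a must precede b."""
            must_precede = {a for (a, b) in before if b == e}
            level = initial + sum(events[a] for a in must_precede)
            for other, delta in events.items():
                if other == e or other in must_precede:
                    continue
                if (e, other) not in before and delta > 0:
                    level += delta            # unordered producer may precede e
            return level

        events = {"fill": 2.0, "top_up": 1.0, "drain": -1.0}
        before = {("fill", "drain")}
        print(optimistic_bound(events, before, "drain"))  # 2.0 forced + 1.0 possible = 3.0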

  8. Psychiatrists’ Comfort Using Computers and Other Electronic Devices in Clinical Practice

    PubMed Central

    Fochtmann, Laura J.; Clarke, Diana E.; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K.; Plovnick, Robert M.

    2015-01-01

    This report highlights findings from the Study of Psychiatrists’ Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists’ comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted. PMID:26667248

  9. Psychiatrists' Comfort Using Computers and Other Electronic Devices in Clinical Practice.

    PubMed

    Duffy, Farifteh F; Fochtmann, Laura J; Clarke, Diana E; Barber, Keila; Hong, Seung-Hee; Yager, Joel; Mościcki, Eve K; Plovnick, Robert M

    2016-09-01

    This report highlights findings from the Study of Psychiatrists' Use of Informational Resources in Clinical Practice, a cross-sectional Web- and paper-based survey that examined psychiatrists' comfort using computers and other electronic devices in clinical practice. One-thousand psychiatrists were randomly selected from the American Medical Association Physician Masterfile and asked to complete the survey between May and August, 2012. A total of 152 eligible psychiatrists completed the questionnaire (response rate 22.2 %). The majority of psychiatrists reported comfort using computers for educational and personal purposes. However, 26 % of psychiatrists reported not using or not being comfortable using computers for clinical functions. Psychiatrists under age 50 were more likely to report comfort using computers for all purposes than their older counterparts. Clinical tasks for which computers were reportedly used comfortably, specifically by psychiatrists younger than 50, included documenting clinical encounters, prescribing, ordering laboratory tests, accessing read-only patient information (e.g., test results), conducting internet searches for general clinical information, accessing online patient educational materials, and communicating with patients or other clinicians. Psychiatrists generally reported comfort using computers for personal and educational purposes. However, use of computers in clinical care was less common, particularly among psychiatrists 50 and older. Information and educational resources need to be available in a variety of accessible, user-friendly, computer and non-computer-based formats, to support use across all ages. Moreover, ongoing training and technical assistance with use of electronic and mobile device technologies in clinical practice is needed. Research on barriers to clinical use of computers is warranted.

  10. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, Rd

    2011-07-01

    Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory from the business benefits that computer validation can bring. Ask yourself the question: have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  11. Discovery & Interaction in Astro 101 Laboratory Experiments

    NASA Astrophysics Data System (ADS)

    Maloney, Frank Patrick; Maurone, Philip; DeWarf, Laurence E.

    2016-01-01

    The availability of low-cost, high-performance computing hardware and software has transformed the manner by which astronomical concepts can be re-discovered and explored in a laboratory that accompanies an astronomy course for arts students. We report on a strategy, begun in 1992, for allowing each student to understand fundamental scientific principles by interactively confronting astronomical and physical phenomena, through direct observation and by computer simulation. These experiments have evolved as: (a) the quality and speed of the hardware has greatly increased; (b) the corresponding hardware costs have decreased; (c) the students have become computer and Internet literate; and (d) the importance of computationally and scientifically literate arts graduates in the workplace has increased. We present the current suite of laboratory experiments, and describe the nature, procedures, and goals in this two-semester laboratory for liberal arts majors at the Astro 101 university level.

  12. Personal computer versus personal computer/mobile device combination users' preclinical laboratory e-learning activity.

    PubMed

    Kon, Haruka; Kobayashi, Hiroshi; Sakurai, Naoki; Watanabe, Kiyoshi; Yamaga, Yoshiro; Ono, Takahiro

    2017-11-01

    The aim of the present study was to clarify differences between personal computer (PC)/mobile device combination and PC-only user patterns. We analyzed access frequency and time spent on a complete denture preclinical website in order to maximize website effectiveness. Fourth-year undergraduate students (N=41) in the preclinical complete denture laboratory course were invited to participate in this survey during the final week of the course to track login data. Students accessed video demonstrations and quizzes via our e-learning site/course program, and were instructed to view online demonstrations before classes. When the course concluded, participating students filled out a questionnaire about the program, their opinions, and devices they had used to access the site. Combination user access was significantly more frequent than PC-only during supplementary learning time, indicating that students with mobile devices studied during lunch breaks and before morning classes. Most students had favorable opinions of the e-learning site, but a few combination users commented that some videos were too long and that descriptive answers were difficult on smartphones. These results imply that mobile devices' increased accessibility encouraged learning by enabling more efficient time use between classes. They also suggest that e-learning system improvements should cater to mobile device users by reducing video length and including more short-answer questions. © 2016 John Wiley & Sons Australia, Ltd.

  13. Mouse Genome Informatics (MGI) Is the International Resource for Information on the Laboratory Mouse.

    PubMed

    Law, MeiYee; Shaw, David R

    2018-01-01

    Mouse Genome Informatics (MGI, http://www.informatics.jax.org/ ) web resources provide free access to meticulously curated information about the laboratory mouse. MGI's primary goal is to help researchers investigate the genetic foundations of human diseases by translating information from mouse phenotypes and disease models studies to human systems. MGI provides comprehensive phenotypes for over 50,000 mutant alleles in mice and provides experimental model descriptions for over 1500 human diseases. Curated data from scientific publications are integrated with those from high-throughput phenotyping and gene expression centers. Data are standardized using defined, hierarchical vocabularies such as the Mammalian Phenotype (MP) Ontology, Mouse Developmental Anatomy and the Gene Ontologies (GO). This chapter introduces you to Gene and Allele Detail pages and provides step-by-step instructions for simple searches and those that take advantage of the breadth of MGI data integration.

  14. A resource facility for kinetic analysis: modeling using the SAAM computer programs.

    PubMed

    Foster, D M; Boston, R C; Jacquez, J A; Zech, L

    1989-01-01

    Kinetic analysis and integrated system modeling have contributed significantly to understanding the physiology and pathophysiology of metabolic systems in humans and animals. Many experimental biologists are aware of the usefulness of these techniques and recognize that kinetic modeling requires special expertise. The Resource Facility for Kinetic Analysis (RFKA) provides this expertise through: (1) development and application of modeling technology for biomedical problems, and (2) development of computer-based kinetic modeling methodologies concentrating on the computer program Simulation, Analysis, and Modeling (SAAM) and its conversational version, CONversational SAAM (CONSAM). The RFKA offers consultation to the biomedical community in the use of modeling to analyze kinetic data and trains individuals in using this technology for biomedical research. Early versions of SAAM were widely applied in solving dosimetry problems; many users, however, are not familiar with recent improvements to the software. The purpose of this paper is to acquaint biomedical researchers in the dosimetry field with RFKA, which, together with the joint National Cancer Institute-National Heart, Lung and Blood Institute project, is overseeing SAAM development and applications. In addition, RFKA provides many service activities to the SAAM user community that are relevant to solving dosimetry problems.

  15. Visual interaction: models, systems, prototypes. The Pictorial Computing Laboratory at the University of Rome La Sapienza.

    PubMed

    Bottoni, Paolo; Cinque, Luigi; De Marsico, Maria; Levialdi, Stefano; Panizzi, Emanuele

    2006-06-01

    This paper reports on the research activities performed by the Pictorial Computing Laboratory at the University of Rome La Sapienza during the last 5 years. This work, essentially based on the study of human-computer interaction, spans from metamodels of interaction down to prototypes of interactive systems for synchronous multimedia communication and groupwork and for annotation of web pages, and encompasses theoretical and practical issues of visual languages and environments, including pattern recognition algorithms. Some applications, such as e-learning and collaborative work, are also considered.

  16. Laboratories for Teaching of Mathematical Subjects

    ERIC Educational Resources Information Center

    Berežný, Štefan

    2017-01-01

    We have adapted our two laboratories at our department based on our research results, which were presented at the conference CADGME 2014 in Halle and published in the journal. In this article we describe the hardware and software structure of the Laboratory 1: LabIT4KT-1: Laboratory of Computer Modelling and the Laboratory 2: LabIT4KT-2:…

  17. Comparison of Mars Science Laboratory Reaction Control System Jet Computations With Flow Visualization and Velocimetry

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Johansen, Craig T.; Ashcraft, Scott W.; Novak, Luke A.

    2013-01-01

    Numerical predictions of the Mars Science Laboratory reaction control system jets interacting with a Mach 10 hypersonic flow are compared to experimental nitric oxide planar laser-induced fluorescence data. The steady Reynolds Averaged Navier Stokes equations using the Baldwin-Barth one-equation turbulence model were solved using the OVERFLOW code. The experimental fluorescence data used for comparison consists of qualitative two-dimensional visualization images, qualitative reconstructed three-dimensional flow structures, and quantitative two-dimensional distributions of streamwise velocity. Through modeling of the fluorescence signal equation, computational flow images were produced and directly compared to the qualitative fluorescence data.

  18. Air Force Weapons Laboratory Computational Requirements for 1976 Through 1980

    DTIC Science & Technology

    1976-01-01

    Air Force Weapons Laboratory, Attn: DYS, Kirtland AFB, NM 87117. The final report was prepared by the Air Force Weapons Laboratory, Kirtland Air Force Base, New Mexico, under Job Order 06CB (62601F; Dr. Clifford E. Rhoades, Jr.), January 1976.

  19. Distributed Energy Resource (DER) Cybersecurity Standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saleem, Danish; Johnson, Jay

    This presentation covers the work that Sandia National Laboratories and National Renewable Energy Laboratory are doing for distributed energy resource cybersecurity standards, prepared for NREL's Annual Cybersecurity & Resilience Workshop on October 9-10, 2017.

  20. Sandia National Laboratories: Advanced Simulation and Computing

    Science.gov Websites

  1. Analog Computer Laboratory with Biological Examples.

    ERIC Educational Resources Information Center

    Strebel, Donald E.

    1979-01-01

    The use of biological examples in teaching applications of the analog computer is discussed and several examples from mathematical ecology, enzyme kinetics, and tracer dynamics are described. (Author/GA)

  2. A Model for Designing Adaptive Laboratory Evolution Experiments.

    PubMed

    LaCroix, Ryan A; Palsson, Bernhard O; Feist, Adam M

    2017-04-15

    The occurrence of mutations is a cornerstone of the evolutionary theory of adaptation, capitalizing on the rare chance that a mutation confers a fitness benefit. Natural selection is increasingly being leveraged in laboratory settings for industrial and basic science applications. Despite increasing deployment, there are no standardized procedures available for designing and performing adaptive laboratory evolution (ALE) experiments. Thus, there is a need to optimize the experimental design, specifically for determining when to consider an experiment complete and for balancing outcomes with available resources (i.e., laboratory supplies, personnel, and time). To design and to better understand ALE experiments, a simulator, ALEsim, was developed, validated, and applied to the optimization of ALE experiments. The effects of various passage sizes were experimentally determined and subsequently evaluated with ALEsim, to explain differences in experimental outcomes. Furthermore, a beneficial mutation rate of 10^-6.9 to 10^-8.4 mutations per cell division was derived. A retrospective analysis of ALE experiments revealed that passage sizes typically employed in serial passage batch culture ALE experiments led to inefficient production and fixation of beneficial mutations. ALEsim and the results described here will aid in the design of ALE experiments to fit the exact needs of a project while taking into account the resources required and will lower the barriers to entry for this experimental technique. IMPORTANCE: ALE is a widely used scientific technique to increase scientific understanding, as well as to create industrially relevant organisms. The manner in which ALE experiments are conducted is highly manual and uniform, with little optimization for efficiency. Such inefficiencies result in suboptimal experiments that can take multiple months to complete. With the availability of automation and computer simulations, we can now perform these experiments in an optimized…
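
    The flavor of such a simulator is easy to convey, though the sketch below is not ALEsim: the passage model, fitness advantage, and mutation rate are all invented (the rate is inflated well above the paper's derived 10^-6.9 to 10^-8.4 so the demo runs in seconds). It illustrates why small passage sizes are inefficient: the bottleneck step usually discards rare beneficial mutants.

        # Hypothetical serial-passage ALE simulation; not the paper's ALEsim.
        import math
        import random

        MU = 1e-5     # beneficial mutations per division (inflated for the demo)
        ADV = 1.10    # per-generation relative fitness of mutants (assumed)

        def passage(wt, mut, final_size, bottleneck):
            """One cycle: grow to final_size, mutate, then bottleneck."""
            g = final_size / (wt + mut)               # bulk growth factor
            gens = math.log2(g)                       # generations during growth
            wt, mut = wt * g, mut * g * ADV ** gens   # advantage compounds per gen
            new = wt * MU                             # divisions ~ final cell count
            wt, mut = wt - new, mut + new
            frac = mut / (wt + mut)
            kept = sum(random.random() < frac for _ in range(bottleneck))
            return float(bottleneck - kept), float(kept)

        wt, mut = 1e4, 0.0
        for day in range(1, 101):
            wt, mut = passage(wt, mut, final_size=1e8, bottleneck=10_000)
            if mut > wt:
                print(f"beneficial mutants reached majority at passage {day}")
                break
        else:
            print("mutants never fixed: small bottlenecks often lose them")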

  3. Powder X-ray diffraction laboratory, Reston, Virginia

    USGS Publications Warehouse

    Piatak, Nadine M.; Dulong, Frank T.; Jackson, John C.; Folger, Helen W.

    2014-01-01

    The powder x-ray diffraction (XRD) laboratory is managed jointly by the Eastern Mineral and Environmental Resources and Eastern Energy Resources Science Centers. Laboratory scientists collaborate on a wide variety of research problems involving other U.S. Geological Survey (USGS) science centers and government agencies, universities, and industry. Capabilities include identification and quantification of crystalline and amorphous phases, and crystallographic and atomic structure analysis for a wide variety of sample media. Customized laboratory procedures and analyses commonly are used to characterize non-routine samples including, but not limited to, organic and inorganic components in petroleum source rocks, ore and mine waste, clay minerals, and glassy phases. Procedures can be adapted to meet a variety of research objectives.

  4. The Use and Benefits of Computer Aided Learning in the Assessment of the Laboratory Exercise "Enzyme Induction in Escherichia coli".

    ERIC Educational Resources Information Center

    Pamula, F.; And Others

    1995-01-01

    Describes an interactive computer program written to provide accurate and immediate feedback to students while they are processing experimental data. Discusses the problems inherent in laboratory courses that led to the development of this program. Advantages of the software include allowing students to work at their own pace in a nonthreatening…

  5. Teaching pediatric laboratory medicine to pathology residents.

    PubMed

    Pysher, Theodore J; Bach, Philip R; Geaghan, Sharon M; Hamilton, Marilyn S; Laposata, Michael; Lockitch, Gillian; Brugnara, Carlo; Coffin, Cheryl M; Pasquali, Marzia; Rinaldo, Piero; Roberts, William L; Rutledge, Joe C; Ashwood, Edward R; Blaylock, Robert C; Campos, Joseph M; Goldsmith, Barbara; Jones, Patricia M; Lim, Megan; Meikle, A Wayne; Perkins, Sherrie L; Perry, Deborah A; Petti, Cathy A; Rogers, Beverly B; Steele, Paul E; Weiss, Ronald L; Woods, Gail

    2006-07-01

    Laboratory data are essential to the medical care of fetuses, infants, children, and adolescents. However, the performance and interpretation of laboratory tests on specimens from these patients, which may constitute a significant component of the workload in general hospitals and integrated health care systems as well as specialized perinatal or pediatric centers, present unique challenges to the clinical pathologist and the laboratory. Therefore, pathology residents should receive training in pediatric laboratory medicine. Children's Health Improvement through Laboratory Diagnostics, a group of pathologists and laboratory scientists with interest and expertise in pediatric laboratory medicine, convened a task force to develop a list of curriculum topics, key resources, and training experiences in pediatric laboratory medicine for trainees in anatomic and clinical pathology or straight clinical pathology residency programs and in pediatric pathology fellowship programs. Based on the experiences of 11 training programs, we have compiled a comprehensive list of pediatric topics in the areas of clinical chemistry, endocrinology, hematology, urinalysis, coagulation medicine, transfusion medicine, immunology, microbiology and virology, biochemical genetics, cytogenetics and molecular diagnostics, point of care testing, and laboratory management. This report also includes recommendations for training experiences and a list of key texts and other resources in pediatric laboratory medicine. Clinical pathologists should be trained to meet the laboratory medicine needs of pediatric patients and to assist the clinicians caring for these patients with the selection and interpretation of laboratory studies. This review helps program directors tailor their curricula to more effectively provide this training.

  6. Natural and laboratory compaction bands in porous carbonates: a three-dimensional characterization using synchrotron X-ray computed microtomography

    NASA Astrophysics Data System (ADS)

    Cilona, A.; Arzilli, F.; Mancini, L.; Emanuele, T.

    2014-12-01

    Porous carbonates form important reservoirs for water and hydrocarbons. The fluid flow properties of carbonate reservoirs may be affected by post-depositional processes (e.g., mechanical and chemical), which need to be quantified. Field-based studies described bed-parallel compaction bands (CBs) within carbonates with a wide range of porosities. These burial-related structures accommodate volumetric strain by grain rotation, translation, pore collapse and pressure solution. Recently, the same structures have been reproduced for the first time in the laboratory by performing triaxial compaction experiments on porous grainstones. These laboratory studies characterized and compared the microstructures of natural and laboratory CBs, but no analysis of pore connectivity has been performed. In this paper, we use an innovative approach to characterize the pore networks (e.g., porosity, connectivity) of natural and laboratory CBs and compare them with that of the host rock. We collected the data using the synchrotron X-ray computed microtomography technique at the SYRMEP beamline of the Elettra-Sincrotrone Trieste Laboratory (Italy). Quantitative analyses of the samples were performed with the Pore3D software library. The porosity was calculated from segmented 3D images of pristine and deformed carbonates. A process of skeletonization was then applied to quantify the number of connected pores within the rock volume. The analysis of the skeleton allowed us to highlight the differences between natural and laboratory CBs, and to investigate how pore connectivity evolves as a function of different deformation pathways. Both pore volume and connectivity are reduced within the CBs with respect to the pristine rock, and the natural CB has a lower porosity than the laboratory one. The grain contacts in the natural CB are welded, whereas in the laboratory one they have more irregular shapes and grain crushing is the predominant process.
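
    The two bulk quantities compared here, porosity and pore connectivity, are straightforward to extract from a segmented volume. The sketch below is a generic stand-in, not the Pore3D workflow used in the study; the random array substitutes for a real segmented tomogram.

        # Generic porosity/connectivity extraction from a binary pore volume.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        pores = rng.random((64, 64, 64)) < 0.15   # stand-in segmented volume
                                                  # (True = pore voxel)
        porosity = pores.mean()                   # pore-voxel fraction

        # Label connected pore clusters (default = 6-connectivity across faces).
        labels, n_clusters = ndimage.label(pores)
        sizes = np.bincount(labels.ravel())[1:]   # voxels per cluster

        print(f"porosity = {porosity:.3f}, clusters = {n_clusters}, "
              f"largest = {sizes.max()} voxels")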

  7. Animal Resource Program | Center for Cancer Research

    Cancer.gov

    CCR Animal Resource Program The CCR Animal Resource Program plans, develops, and coordinates laboratory animal resources for CCR’s research programs. We also provide training, imaging, and technology development in support of moving basic discoveries to the clinic. The ARP Office:

  8. Animal Resource Program | Center for Cancer Research

    Cancer.gov

    CCR Animal Resource Program The CCR Animal Resource Program plans, develops, and coordinates laboratory animal resources for CCR’s research programs. We also provide training, imaging, and technology development in support of moving basic discoveries to the clinic. The ARP Manager:

  9. Spacecraft computer resource margin management. [of Project Galileo Orbiter in-flight reprogramming task

    NASA Technical Reports Server (NTRS)

    Larman, B. T.

    1981-01-01

    The Project Galileo Orbiter, with 18 microcomputers and the equivalent of 360K 8-bit bytes of memory contained within two major engineering subsystems and eight science instruments, requires that the key onboard computer system resources be managed in a very rigorous manner. Attention is given to the rationale behind the project policy, the development stage, the preliminary design stage, the design/implementation stage, and the optimization or 'scrubbing' stage. The implementation of the policy is discussed, taking into account the development of the Attitude and Articulation Control Subsystem (AACS) and the Command and Data Subsystem (CDS), the reporting of margin status, and the response to allocation oversubscription.

  10. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  11. Integrating Reservations and Queuing in Remote Laboratory Scheduling

    ERIC Educational Resources Information Center

    Lowe, D.

    2013-01-01

    Remote laboratories (RLs) have become increasingly seen as a useful tool in supporting flexible shared access to scarce laboratory resources. An important element in supporting shared access is coordinating the scheduling of the laboratory usage. Optimized scheduling can significantly decrease access waiting times and improve the utilization level…
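
    One simple way to integrate the two access modes, sketched below with an invented API (not the paper's system), is an admission rule: a queued job starts only if the rig is not reserved now and the job would finish before the next reservation begins, so reservation holders are never displaced while queue users absorb the idle gaps.

        # Hypothetical admission check combining reservations with a queue.
        class RigScheduler:
            def __init__(self, reservations):
                self.reservations = sorted(reservations)  # (start, end) minutes

            def next_start_after(self, t):
                return min((s for s, e in self.reservations if s >= t),
                           default=float("inf"))

            def can_admit(self, t, duration):
                """True if a queued job of `duration` fits at time t."""
                busy = any(s <= t < e for s, e in self.reservations)
                return not busy and t + duration <= self.next_start_after(t)

        sched = RigScheduler([(60, 90)])   # one booking from t=60 to t=90
        print(sched.can_admit(10, 30))     # True: ends at 40, before the booking
        print(sched.can_admit(40, 30))     # False: would overlap the booking
        print(sched.can_admit(70, 10))     # False: rig is reserved at t=70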

  12. 30 CFR 6.10 - Use of independent laboratories.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Title 30 (Mineral Resources), Volume 1, revised as of 2011-07-01. Testing, Evaluation, and Approval of Mining Products: Testing and Evaluation by Independent Laboratories and Non-MSHA Product Safety Standards, § 6.10 Use of independent laboratories. (a) MSHA will accept testing and…

  13. Training strategies for laboratory animal veterinarians: challenges and opportunities.

    PubMed

    Colby, Lesley A; Turner, Patricia V; Vasbinder, Mary Ann

    2007-01-01

    The field of laboratory animal medicine is experiencing a serious shortage of appropriately trained veterinarians for both clinically related and research-oriented positions within academia, industry, and government. Recent outreach efforts sponsored by professional organizations have stimulated increased interest in the field. It is an opportune time to critically review and evaluate postgraduate training opportunities in the United States and Canada, including formal training programs, informal training, publicly accessible training resources and educational opportunities, and newly emerging training resources such as Internet-based learning aids. Challenges related to each of these training opportunities exist and include increasing enrollment in formal programs, securing adequate funding support, ensuring appropriate content between formal programs that may have diverse objectives, and accommodating the training needs of veterinarians who enter the field by the experience route. Current training opportunities and resources that exist for veterinarians who enter and are established within the field of laboratory animal science are examined. Strategies for improving formal laboratory animal medicine training programs and for developing alternative programs more suited to practicing clinical veterinarians are discussed. In addition, the resources for high-quality continuing education of experienced laboratory animal veterinarians are reviewed.

  14. The Virtual Geophysics Laboratory (VGL): Scientific Workflows Operating Across Organizations and Across Infrastructures

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Wyborn, L. A.; Fraser, R.; Rankine, T.; Woodcock, R.; Vote, J.; Evans, B.

    2012-12-01

    The Virtual Geophysics Laboratory (VGL) is a web portal that provides geoscientists with an integrated online environment that: seamlessly accesses geophysical and geoscience data services from the AuScope national geoscience information infrastructure; loosely couples these data to a variety of geoscience software tools; and provides large-scale processing facilities via cloud computing. VGL is a collaboration between CSIRO, Geoscience Australia, National Computational Infrastructure, Monash University, Australian National University and the University of Queensland. The VGL provides a distributed system whereby a user can enter an online virtual laboratory to seamlessly connect to OGC web services for geoscience data. The data is supplied in open standards formats using international standards like GeoSciML. A VGL user uses a web mapping interface to discover and filter the data sources using spatial and attribute filters to define a subset. Once the data is selected, the user is not required to download it. VGL collates the service query information for later in the processing workflow, where it will be staged directly to the computing facilities. The combination of deferring data download and access to cloud computing enables VGL users to access their data at higher resolutions and to undertake larger-scale inversions, more complex models and simulations than their own local computing facilities might allow. Inside the Virtual Geophysics Laboratory, the user has access to a library of existing models, complete with exemplar workflows for specific scientific problems based on those models. For example, the user can load a geological model published by Geoscience Australia, apply a basic deformation workflow provided by a CSIRO scientist, and have it run in a scientific code from Monash. Finally the user can publish these results to share with a colleague or cite in a paper. This opens new opportunities for access and collaboration as all the resources (models…
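
    The data-access pattern described above (standards-based services queried with spatial filters, with download deferred) can be sketched with OWSLib. The endpoint URL and feature type below are placeholders, not AuScope's actual service addresses.

        # Sketch of querying an OGC Web Feature Service for GeoSciML features.
        from owslib.wfs import WebFeatureService

        wfs = WebFeatureService("https://example.org/geoserver/wfs",  # placeholder
                                version="1.1.0")
        response = wfs.getfeature(
            typename=["gsml:MappedFeature"],     # GeoSciML type (placeholder)
            bbox=(130.0, -30.0, 140.0, -20.0),   # lon/lat window over the AOI
            maxfeatures=100,
        )
        gml = response.read()                     # GML payload for the workflow
        print(len(gml), "bytes of GML retrieved")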

  15. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE PAGES

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar; ...

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in…
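
    The three-stage model reduces to a simple sum, and a back-of-envelope estimator makes the trade-off concrete. The function and every number below are invented for illustration; they are not the paper's measured parameters.

        # Toy three-stage turnaround estimate: transfer + queue + compute.
        def turnaround_s(data_gb, link_gbps, queue_wait_s,
                         n_slices, s_per_slice, n_nodes):
            transfer = data_gb * 8 / link_gbps            # move the dataset
            compute = n_slices * s_per_slice / n_nodes    # parallel slice recon
            return transfer + queue_wait_s + compute

        # 80 GB over 10 Gb/s, 5 min queue, 2048 slices at 30 s/slice on 64 nodes:
        print(turnaround_s(80, 10, 300, 2048, 30, 64))    # 64 + 300 + 960 = 1324 s

    Under such a model, a remote cluster with more nodes wins only when its compute savings exceed the extra transfer and queue time, which is exactly the resource-selection question the workflow system automates.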

  16. Optimization of tomographic reconstruction workflows on geographically distributed resources

    PubMed Central

    Bicer, Tekin; Gürsoy, Doǧa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T.

    2016-01-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can…

  17. Optimization of tomographic reconstruction workflows on geographically distributed resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bicer, Tekin; Gursoy, Doga; Kettimuthu, Rajkumar

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing as in tomographic reconstruction methods require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in…

  18. Synchronization of Finite State Shared Resources

    DTIC Science & Technology

    1976-03-01

    Synchronization of Finite State Shared Resources, Edward A. Schneider, Department of Computer Science (AFOSR technical report; scanned pages partly illegible). Abstract: The problem of synchronizing a set of operations defined on a shared resource…

  19. A Framework for CS1 Closed Laboratories

    ERIC Educational Resources Information Center

    Soh, Leen-Kiat; Samal, Ashok; Nugent, Gwen

    2005-01-01

    Closed laboratories are becoming an increasingly popular approach to teaching introductory computer science courses, as they facilitate structured problem-solving and cooperation. However, most closed laboratories have been designed and implemented without embedded instructional research components for constant evaluation of the laboratories'…

  20. Communication and computing technology in biocontainment laboratories using the NEIDL as a model.

    PubMed

    McCall, John; Hardcastle, Kath

    2014-07-01

    The National Emerging Infectious Diseases Laboratories (NEIDL), Boston University, is a globally unique biocontainment research facility housing biosafety level 2 (BSL-2), BSL-3, and BSL-4 laboratories. Located in the BioSquare area at the University's Medical Campus, it is part of a national network of secure facilities constructed to study infectious diseases of major public health concern. The NEIDL allows for basic, translational, and clinical phases of research to be carried out in a single facility with the overall goal of accelerating understanding, treatment, and prevention of infectious diseases. The NEIDL will also act as a center of excellence providing training and education in all aspects of biocontainment research. Within every detail of NEIDL operations is a primary emphasis on safety and security. The ultramodern NEIDL has required a new approach to communications technology solutions in order to ensure safety and security and meet the needs of investigators working in this complex building. This article discusses the implementation of secure wireless networks and private cloud computing to promote operational efficiency, biosecurity, and biosafety with additional energy-saving advantages. The utilization of a dedicated data center, virtualized servers, virtualized desktop integration, multichannel secure wireless networks, and a NEIDL-dedicated Voice over Internet Protocol (VoIP) network are all discussed.

  1. Prediction of Wind Energy Resources (PoWER) Users Guide

    DTIC Science & Technology

    2016-01-01

    ARL-TR-7573, January 2016, US Army Research Laboratory. Prediction of Wind Energy Resources (PoWER) User's Guide, by David P. Sauter. Report type: Final; dates covered: 09/2015–11/2015.

  2. The Impact of Internet Virtual Physics Laboratory Instruction on the Achievement in Physics, Science Process Skills and Computer Attitudes of 10th-Grade Students

    NASA Astrophysics Data System (ADS)

    Yang, Kun-Yuan; Heh, Jia-Sheng

    2007-10-01

    The purpose of this study was to investigate and compare the impact of Internet Virtual Physics Laboratory (IVPL) instruction with traditional laboratory instruction on the physics academic achievement, performance of science process skills, and computer attitudes of tenth grade students. One hundred and fifty students from four classes at one private senior high school in Taoyuan County, Taiwan, R.O.C. were sampled. The students were divided equally into an experimental group and a control group of 75 students each. The pre-test results indicated that the students' entry-level physics academic achievement, science process skills, and computer attitudes were equal for both groups. On the post-test, the experimental group achieved significantly higher mean scores in physics academic achievement and science process skills. There was no significant difference in computer attitudes between the groups. We concluded that the IVPL had potential to help tenth graders improve their physics academic achievement and science process skills.

  3. A professional development model for medical laboratory scientists working in the microbiology laboratory.

    PubMed

    Amerson, Megan H; Pulido, Lila; Garza, Melinda N; Ali, Faheem A; Greenhill, Brandy; Einspahr, Christopher L; Yarsa, Joseph; Sood, Pramilla K; Hu, Peter C

    2012-01-01

    The University of Texas M.D. Anderson Cancer Center, Division of Pathology and Laboratory Medicine, is committed to providing the best pathology and medicine through state-of-the-art techniques, progressive ground-breaking research, and education and training for the clinical diagnosis and research of cancer and related diseases. After surveying the laboratory staff and other hospital professionals, the Department administrators and Human Resource generalists developed a professional development model (PDM) for Microbiology to support laboratory skills, behavior, certification, and continuing education within its staff. This model sets high standards for laboratory professionals, allowing the labs to work at their fullest potential; it organizes technologist training around complete laboratory needs, rather than training technologists in individual areas and then requiring more training whenever the laboratory needs them to work in other areas. This model is a working example for all microbiology-based laboratories that want to set high standards and want their staff to be acknowledged for demonstrated excellence and professional development in the laboratory. The PDM is designed to focus on the needs of the laboratory as well as those of the laboratory professionals.

  4. Radar Control Optimal Resource Allocation

    DTIC Science & Technology

    2015-07-13

    other tunable parameters of radars [17, 18]. Such radar resource scheduling usually demands massive computation. Even myopic ... reduced validity of the optimal choice of radar resource. In the non-myopic context, the computational problem becomes exponentially more difficult ... computed as

        t^{\star} = \frac{\alpha\sigma^{2}q + \sigma r\sqrt{\alpha q(\sigma + r + \alpha q)}}{\alpha q^{2}r} - \frac{1}{\alpha\sigma q^{2} + qr^{2}}    (19)

    We are only interested in t^{\star} > 1 and, solving the inequality, we obtain the…

  5. Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savage, Martin J.

    This project was part of a coordinated software development effort which the nuclear physics lattice QCD community pursues in order to ensure that lattice calculations can make optimal use of present and forthcoming leadership-class and dedicated hardware, including that of the national laboratories, and to prepare for the exploitation of future computational resources in the exascale era. The UW team improved and extended software libraries used in lattice QCD calculations related to multi-nucleon systems, enhanced production running codes related to load balancing multi-nucleon production on large-scale computing platforms, developed SQLite (addressable database) interfaces to efficiently archive and analyze multi-nucleon data, and developed a Mathematica interface for the SQLite databases.
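
    The SQLite idea is simple to picture: results are written into a single addressable file that analysis tools can query later. A minimal sketch, with invented table and column names (the project's actual schema is not described in this record):

        # Archive and query analysis results in a single SQLite file.
        import sqlite3

        conn = sqlite3.connect("multi_nucleon.db")
        conn.execute("""CREATE TABLE IF NOT EXISTS correlators
                        (ensemble TEXT, source TEXT, t INTEGER, value REAL)""")
        conn.executemany("INSERT INTO correlators VALUES (?, ?, ?, ?)",
                         [("ens_a", "smeared", t, 1.0 / (t + 1)) for t in range(4)])
        conn.commit()
        for row in conn.execute("SELECT t, value FROM correlators WHERE ensemble = ?",
                                ("ens_a",)):
            print(row)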

  6. Family History Resources.

    ERIC Educational Resources Information Center

    Bookmark, 1991

    1991-01-01

    The 12 articles in this issue focus on the theme of family history resources: (1) "Introduction: Family History Resources" (Joseph F. Shubert); (2) "Work, Credentials, and Expectations of a Professional Genealogist" (Coreen P. Hallenbeck and Lewis W. Hallenbeck); (3) "Computers and Genealogy" (Theresa C. Strasser);…

  7. Operational Changes in a Shared Resource Laboratory with the Use of a Product Lifecycle Management Approach: A Case Study.

    PubMed

    Hexley, Philip; Smith, Victoria; Wall, Samantha

    2016-04-01

    Shared Resource Laboratories (SRLs) provide investigators access to the scientific and resource expertise needed to fully leverage complex technologies for advancing high-quality biomedical research in a cost-effective manner. At the University of Nebraska Medical Center, the Flow Cytometry Research Facility (FCRF) offered access to exceptional technology, but its methods of operation were outdated and unsustainable. Whereas technology had advanced and the institute had expanded, the operations at the facility had remained unchanged for 35 yr. To rectify this, at the end of 2013, we took a product lifecycle management approach to effect large operational changes and align the services offered with the SRL goal of education, as well as to provide service to researchers. These disruptive operational changes took over 10 mo to complete and allowed for independent end-user acquisition of flow cytometry data. The results have been monitored for the past 12 mo. The operational changes have had a positive impact on the quality of research, increased investigator-facility interaction, reduced the stress of facility staff, and increased overall use of the resources. This product lifecycle management approach to facility operations allowed us to conceive of, design, implement, and effectively monitor the changes at the FCRF. This approach should be considered by SRL management when faced with the need for operationally disruptive measures.

  8. How fifth grade Latino/a bilingual students use their linguistic resources in the classroom and laboratory during science instruction

    NASA Astrophysics Data System (ADS)

    Stevenson, Alma R.

    2013-12-01

    This qualitative, sociolinguistic research study examines how bilingual Latino/a students use their linguistic resources in the classroom and laboratory during science instruction. This study was conducted in a school in the southwestern United States serving an economically depressed, predominantly Latino population. The object of study was a fifth grade science class composed entirely of language minority students transitioning out of bilingual education. Therefore, English was the means of instruction in science, supported by informal peer-to-peer Spanish-language communication. This study is grounded in a social constructivist paradigm. From this standpoint, learning science is a social process where social, cultural, and linguistic factors are all considered crucial to the process of acquiring scientific knowledge. The study was descriptive in nature, examining specific linguistic behaviors with the purpose of identifying and analyzing the linguistic functions of students' utterances while participating in science learning. The results suggest that students purposefully adapt their use of linguistic resources in order to facilitate their participation in science learning. What is underscored in this study is the importance of explicitly acknowledging, supporting, and incorporating bilingual students' linguistic resources, both in Spanish and English, into the science classroom in order to optimize students' participation and facilitate their understanding.

  9. Single-Shot X-Ray Phase-Contrast Computed Tomography with Nonmicrofocal Laboratory Sources

    NASA Astrophysics Data System (ADS)

    Diemoz, P. C.; Hagen, C. K.; Endrizzi, M.; Minuti, M.; Bellazzini, R.; Urbani, L.; De Coppi, P.; Olivo, A.

    2017-04-01

    We present a method that enables performing x-ray phase-contrast imaging (XPCI) computed tomography with a laboratory setup using a single image per projection angle, eliminating the need to move optical elements during acquisition. Theoretical derivation of the method is presented, and its validity conditions are provided. The object is assumed to be quasihomogeneous, i.e., to feature a ratio between the refractive index and the linear attenuation coefficient that is approximately constant across the field of view. The method is experimentally demonstrated on a plastics phantom and on biological samples using a continuous rotation acquisition scheme achieving scan times of a few minutes. Moreover, we show that such acquisition times can be further reduced with the use of a high-efficiency photon-counting detector. Thanks to its ability to substantially simplify the image-acquisition procedure and greatly reduce collection times, we believe this method represents a very important step towards the application of XPCI to real-world problems.
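
    Stated symbolically, in the notation common in the XPCI literature (where the refractive index is written n = 1 − δ + iβ), the quasihomogeneity assumption above is that the ratio of the refractive-index decrement δ to the linear attenuation coefficient μ is approximately constant over the field of view,

        \delta(\mathbf{r}) / \mu(\mathbf{r}) \approx \text{const},

    which is what allows phase and attenuation to be disentangled from a single image per projection angle.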

  10. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Sometimes process control schedules require changes frequently, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator, or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up the operator writes device driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural language command lines which tell the system what to do and when to do it. The operator can make up custom commands for operating and taking data from external research equipment at any time of the day or night without the operator in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
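
    The command-dispatch idea generalizes beyond FORTRAN. A toy analog in Python, assuming invented command phrases and device-driver stubs (the original system instead generated FORTRAN subroutines linking user commands to user-written drivers):

        # Map user-defined natural-language command lines to driver routines,
        # then run a schedule of such lines unattended.
        def open_valve():
            print("valve opened")

        def read_sensor():
            print("sensor value recorded")

        COMMANDS = {
            "open the valve": open_valve,
            "read the sensor": read_sensor,
        }

        schedule = ["open the valve", "read the sensor"]
        for line in schedule:
            COMMANDS[line.strip().lower()]()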

  11. Neutron Radiography and Computed Tomography at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raine, Dudley A. III; Hubbard, Camden R.; Whaley, Paul M.

    1997-12-31

    The capability to perform neutron radiography and computed tomography is being developed at Oak Ridge National Laboratory. The facility will be located at the High Flux Isotope Reactor (HFIR), which has the highest steady state neutron flux of any reactor in the world. The Monte Carlo N-Particle transport code (MCNP), versions 4A and 4B, has been used extensively in the design phase of the facility to predict and optimize the operating characteristics, and to ensure the safety of personnel working in and around the blockhouse. Neutrons are quite penetrating in most engineering materials and can be useful to detect internal flaws and features. Hydrogen atoms, such as in a hydrocarbon fuel, lubricant, or a metal hydride, are relatively opaque to neutron transmission. Thus, neutron based tomography or radiography is ideal to image their presence. The source flux also provides unparalleled flexibility for future upgrades, including real time radiography where dynamic processes can be observed. A novel tomography detector has been designed using optical fibers and digital technology to provide a large dynamic range for reconstructions. Film radiography is also available for high resolution imaging applications. This paper summarizes the results of the design phase of this facility and the potential benefits to science and industry.
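
    The remark about hydrogen follows from simple attenuation: transmitted intensity falls off exponentially with the material's macroscopic cross section. A small sketch, using rough illustrative cross-section values rather than facility data:

        # Beer-Lambert-style neutron transmission: I/I0 = exp(-Sigma * t), with
        # Sigma the macroscopic cross section (1/cm) and t the thickness (cm).
        import math

        def transmission(sigma_per_cm, thickness_cm):
            return math.exp(-sigma_per_cm * thickness_cm)

        print(transmission(3.5, 1.0))  # hydrogen-rich material: strongly attenuating
        print(transmission(0.1, 1.0))  # metal such as aluminum: nearly transparent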

  12. Cultural resources regulatory analysis, area overview, and assessment of previous Department of Energy and Kirtland Air Force Base inventories for Sandia National Laboratories, New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoagland, S.R.; Lord, K.J.

    The following regulatory analysis and literature review of archaeological and historic resources on the Sandia National Laboratories/New Mexico (SNL/NM) occupied properties was prepared by the Chambers Group Inc. in January 1992. Based upon compliance surveys of Technical Areas I through V undertaken in 1990 and 1991, the report concludes that, although consultation with the Department of Energy and the State Historic Preservation Officer will still be required for particular projects, cultural resources should not affect the overall planning and development of future SNL/NM projects. As SNL/NM buildings approach 50 years in age, additional analysis and consultations may be required. In order to protect sensitive resources, the location coordinates and maps provided in the original report are not included here.

  13. Oak Ridge National Laboratory Institutional Plan, FY 1995--FY 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-11-01

    This report discusses the institutional plan for Oak Ridge National Laboratory for the next five years (1995-2000). Included in this report are: the laboratory director's statement; laboratory mission, vision, and core competencies; laboratory plan; major laboratory initiatives; scientific and technical programs; critical success factors; summaries of other plans; and resource projections.

  14. Acausal measurement-based quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-07-01

    In measurement-based quantum computing, there is a natural "causal cone" among qubits of the resource state, since the measurement angle on a qubit has to depend on previous measurement results in order to correct the effect of by-product operators. If we respect the no-signaling principle, by-product operators cannot be avoided. Here we study the possibility of acausal measurement-based quantum computing by using the process matrix framework [Oreshkov, Costa, and Brukner, Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076]. We construct a resource process matrix for acausal measurement-based quantum computing restricting local operations to projective measurements. The resource process matrix is an analog of the resource state of the standard causal measurement-based quantum computing. We find that if we restrict local operations to projective measurements the resource process matrix is (up to a normalization factor and trivial ancilla qubits) equivalent to the decorated graph state created from the graph state of the corresponding causal measurement-based quantum computing. We also show that it is possible to consider a causal game whose causal inequality is violated by acausal measurement-based quantum computing.
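
    For context, the adaptive-angle dependence the abstract refers to is the standard byproduct correction in causal MBQC: if the byproduct operator accumulated on a qubit is X^s Z^t, an XY-plane measurement intended at angle θ is instead performed at the adapted angle

        \theta' = (-1)^{s}\,\theta + t\pi,

    so later measurement angles necessarily depend on earlier outcomes s and t, and a causal order among the qubits emerges.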

  15. Student research laboratory for optical engineering

    NASA Astrophysics Data System (ADS)

    Tolstoba, Nadezhda D.; Saitgalina, Azaliya; Abdula, Polina; Butova, Daria

    2015-10-01

    A student research laboratory for optical engineering is a comfortable place for students' scientific and educational activity. This article describes the main ideas behind the laboratory, the process of its creation, and its activities. Many research laboratories were formed at ITMO University in 2013-2014. SNLO is a student research (scientific) laboratory formed by the Department of Applied and Computer Optics of ITMO University (Information Technologies, Mechanics and Optics). The laboratory's activity is career guidance for prospective and current students in the field of optical engineering. The student research laboratory for optical engineering is a place where students can work in an interesting and engaging scientific atmosphere.

  16. The value of point-of-care CD4+ and laboratory viral load in tailoring antiretroviral therapy monitoring strategies to resource limitations.

    PubMed

    Hyle, Emily P; Jani, Ilesh V; Rosettie, Katherine L; Wood, Robin; Osher, Benjamin; Resch, Stephen; Pei, Pamela P; Maggiore, Paolo; Freedberg, Kenneth A; Peter, Trevor; Parker, Robert A; Walensky, Rochelle P

    2017-09-24

    To examine the clinical and economic value of point-of-care CD4 (POC-CD4) or viral load monitoring compared with current practices in Mozambique, a country representative of the diverse resource limitations encountered by HIV treatment programs in sub-Saharan Africa. We use the Cost-Effectiveness of Preventing AIDS Complications-International model to examine the clinical impact, cost (2014 US$), and incremental cost-effectiveness ratio [$/year of life saved (YLS)] of ART monitoring strategies in Mozambique. We compare: monitoring for clinical disease progression [clinical ART monitoring strategy (CLIN)] vs. annual POC-CD4 in rural settings without laboratory services and biannual laboratory CD4 (LAB-CD4), biannual POC-CD4, and annual viral load in urban settings with laboratory services. We examine the impact of a range of values in sensitivity analyses, using Mozambique's 2014 per capita gross domestic product ($620) as a benchmark cost-effectiveness threshold. In rural settings, annual POC-CD4 compared to CLIN improves life expectancy by 2.8 years, reduces time on failed ART by 0.6 years, and yields an incremental cost-effectiveness ratio of $480/YLS. In urban settings, biannual POC-CD4 is more expensive and less effective than viral load. Compared to biannual LAB-CD4, viral load improves life expectancy by 0.6 years, reduces time on failed ART by 1.0 year, and is cost-effective ($440/YLS). In rural settings, annual POC-CD4 improves clinical outcomes and is cost-effective compared to CLIN. In urban settings, viral load has the greatest clinical benefit and is cost-effective compared to biannual POC-CD4 or LAB-CD4. Tailoring ART monitoring strategies to specific settings with different available resources can improve clinical outcomes while remaining economically efficient.
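
    The comparisons above all reduce to the standard incremental cost-effectiveness ratio, ICER = Δcost / Δ(years of life saved), judged against a willingness-to-pay threshold (here Mozambique's 2014 per-capita GDP of $620/YLS). A minimal sketch, with placeholder lifetime costs chosen only so the ratio reproduces the reported $480/YLS for rural POC-CD4:

        # ICER = (cost_new - cost_comparator) / (life_years_new - life_years_comparator)
        def icer(cost_new, cost_old, ly_new, ly_old):
            return (cost_new - cost_old) / (ly_new - ly_old)

        THRESHOLD = 620.0  # $/YLS benchmark used in the study

        value = icer(cost_new=2400.0, cost_old=1056.0,  # placeholder costs
                     ly_new=12.8, ly_old=10.0)          # +2.8 life-years, as reported
        print(value, "cost-effective" if value <= THRESHOLD else "not cost-effective")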

  17. Instructional computing in space physics moves ahead

    NASA Astrophysics Data System (ADS)

    Russell, C. T.; Omidi, N.

    As the number of spacecraft stationed in the Earth's magnetosphere grows exponentially and society becomes more technologically sophisticated and dependent on these space-based resources, both the importance of space physics and the need to train people in this field will increase. Space physics is a very difficult subject for students to master. Both mechanical and electromagnetic forces are important. The treatment of problems can be very mathematical, and the scale sizes of phenomena are usually such that laboratory studies become impossible and experimentation, when possible at all, must be carried out in deep space. Fortunately, computers have evolved to the point that they are able to greatly facilitate instruction in space physics.

  18. Conventional Microscopy vs. Computer Imagery in Chiropractic Education.

    PubMed

    Cunningham, Christine M; Larzelere, Elizabeth D; Arar, Ilija

    2008-01-01

    As human tissue pathology slides become increasingly difficult to obtain, other methods of teaching microscopy in educational laboratories must be considered. The purpose of this study was to evaluate our students' satisfaction with newly implemented computer imagery based laboratory instruction and to obtain input from their perspective on the advantages and disadvantages of computerized vs. traditional microscope laboratories. This undertaking involved the creation of a new computer laboratory. Robbins and Cotran Pathologic Basis of Disease, 7th ed., was chosen as the required text, which gave students access to the Robbins Pathology website, including the complete content of the text, the Interactive Case Study Companion, and the Virtual Microscope. Students had experience with traditional microscopes in their histology and microbiology laboratory courses. Student satisfaction with computer based learning was assessed using a 28-question survey which was administered to three successive trimesters of pathology students (n=193) using the computer survey website Zoomerang. Answers were given on a scale of 1-5 and statistically analyzed using weighted averages. The survey data indicated that students were satisfied with computer based learning activities during pathology laboratory instruction. The most favorable aspect of computer imagery was 24-7 availability (weighted avg. 4.16), followed by clarification offered by accompanying text and captions (weighted avg. 4.08). Although advantages and disadvantages exist in using conventional microscopy and computer imagery, current pathology teaching environments warrant investigation of replacing traditional microscope exercises with computer applications. Chiropractic students supported the adoption of computer-assisted instruction in pathology laboratories.
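
    The "weighted averages" reported above are level-weighted means of the 1-5 responses. A small sketch, with invented response counts scaled so the example reproduces the reported 4.16 across n=193 respondents:

        # Weighted average of Likert responses: sum(level * count) / total count.
        def weighted_average(counts_by_level):
            total = sum(counts_by_level.values())
            return sum(level * n for level, n in counts_by_level.items()) / total

        counts = {5: 90, 4: 60, 3: 30, 2: 10, 1: 3}  # invented; sums to n=193
        print(round(weighted_average(counts), 2))    # -> 4.16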

  19. Professional Computer Education Organizations--A Resource for Administrators.

    ERIC Educational Resources Information Center

    Ricketts, Dick

    Professional computer education organizations serve a valuable function by generating, collecting, and disseminating information concerning the role of the computer in education. This report touches briefly on the reasons for the rapid and successful development of professional computer education organizations. A number of attributes of effective…

  20. How to Motivate Students to Work in the Laboratory: A New Approach for an Electrical Machines Laboratory

    ERIC Educational Resources Information Center

    Saavedra Montes, A. J.; Botero Castro, H. A.; Hernandez Riveros, J. A.

    2010-01-01

    Many laboratory courses have become iterative processes in which students only seek to meet the requirements and pass the course. Some students believe these courses are boring and do not give them training as engineers. To provide a solution to the poor motivation of students in laboratories with few resources, this work proposes the method…