Science.gov

Sample records for additional computational resources

  1. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  2. Calculators and Computers: Graphical Addition.

    ERIC Educational Resources Information Center

    Spero, Samuel W.

    1978-01-01

    A computer program is presented that generates problem sets involving sketching graphs of trigonometric functions using graphical addition. The students use calculators to sketch the graphs, and a computer solution is used to check them. (MP)

  3. CERN Computing Resources Lifecycle Management

    NASA Astrophysics Data System (ADS)

    Tselishchev, Alexey; Tedesco, Paolo; Ormancey, Emmanuel; Isnard, Christian

    2011-12-01

    Computing environments in High Energy Physics are typically complex and heterogeneous, with a wide variety of hardware resources, operating systems and applications. The research activity in all its aspects is carried out by international collaborations constituted by a growing number of participants with a high manpower turnover. These factors can increase the administrative workload required to manage the computing infrastructure and to track resource usage and inheritance. It is therefore necessary to rationalize and formalize the computing resources management, while respecting the requirement of flexibility of scientific applications and services. This paper shows how during the last years the CERN computing infrastructure has been moving in this direction, establishing well-defined policies and lifecycles for resource management. Applications are being migrated towards proposed common identity, authentication and authorization models, reducing their complexity while increasing security and usability. Regular tasks like the creation of primary user accounts are being automated, and self-service facilities are being introduced for common operations, like creation of additional accounts, group subscriptions and password reset. This approach is leading to more efficient and manageable systems.

  4. Enabling opportunistic resources for CMS Computing Operations

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize opportunistic resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.
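
    The strategy sketched above (fill dedicated resources first, spill over to opportunistic providers only at peak demand) can be illustrated with a short, purely hypothetical Python sketch; the pool names, slot counts and the place_jobs() helper are invented for illustration and are not part of CMS's actual glideinWMS/Bosco/Parrot tooling.

      # Hypothetical sketch of peak-demand spill-over to opportunistic resources.
      # Pool names, capacities and job counts are illustrative only.

      DEDICATED = {"T1_US_FNAL": 4000, "T2_DE_DESY": 2000}        # owned slots
      OPPORTUNISTIC = {"campus_grid": 1500, "cloud_burst": 3000,  # borrowed slots
                       "hpc_center": 5000}

      def place_jobs(n_jobs):
          """Fill dedicated pools first, then spill the remainder opportunistically."""
          plan = {}
          for pools in (DEDICATED, OPPORTUNISTIC):
              for name, slots in pools.items():
                  if n_jobs <= 0:
                      break
                  take = min(slots, n_jobs)
                  plan[name] = take
                  n_jobs -= take
          if n_jobs > 0:
              plan["queued"] = n_jobs   # demand exceeds all available resources
          return plan

      print(place_jobs(9000))
      # e.g. {'T1_US_FNAL': 4000, 'T2_DE_DESY': 2000, 'campus_grid': 1500, 'cloud_burst': 1500}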

  5. Enabling opportunistic resources for CMS Computing Operations

    SciTech Connect

    Hufnagel, Dirk

    2015-11-19

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  6. Enabling opportunistic resources for CMS Computing Operations

    NASA Astrophysics Data System (ADS)

    Hufnagel, D.; CMS Collaboration

    2015-12-01

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize opportunistic resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  7. Enabling opportunistic resources for CMS Computing Operations

    SciTech Connect

    Hufnagel, Dirk

    2015-12-23

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize opportunistic resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  8. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students’ intuition and enhance their learning. PMID:21451741

  9. Framework Resources Multiply Computing Power

    NASA Technical Reports Server (NTRS)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  10. COMPUTATIONAL RESOURCES FOR BIOFUEL FEEDSTOCK SPECIES

    SciTech Connect

    Buell, Carol Robin; Childs, Kevin L

    2013-05-07

    While current production of ethanol as a biofuel relies on starch and sugar inputs, it is anticipated that sustainable production of ethanol for biofuel use will utilize lignocellulosic feedstocks. Candidate plant species to be used for lignocellulosic ethanol production include a large number of species within the Grass, Pine and Birch plant families. For these biofuel feedstock species, there are variable amounts of genome sequence resources available, ranging from complete genome sequences (e.g. sorghum, poplar) to transcriptome data sets (e.g. switchgrass, pine). These data sets are not only dispersed in location but also disparate in content. It will be essential to leverage and improve these genomic data sets for the improvement of biofuel feedstock production. The objectives of this project were to provide computational tools and resources for data-mining genome sequence/annotation and large-scale functional genomic datasets available for biofuel feedstock species. We have created a Bioenergy Feedstock Genomics Resource that provides a web-based portal or clearing house for genomic data for plant species relevant to biofuel feedstock production. Sequence data from a total of 54 plant species are included in the Bioenergy Feedstock Genomics Resource including model plant species that permit leveraging of knowledge across taxa to biofuel feedstock species. We have generated additional computational analyses of these data, including uniform annotation, to facilitate genomic approaches to improved biofuel feedstock production. These data have been centralized in the publicly available Bioenergy Feedstock Genomics Resource (http://bfgr.plantbiology.msu.edu/).

  11. Statistics Online Computational Resource for Education

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicolas

    2009-01-01

    The Statistics Online Computational Resource (http://www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials--instructional resources and computational libraries. (Contains 2 figures.)

  12. Statistics Online Computational Resource for Education

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas

    2011-01-01

    Summary The Statistics Online Computational Resource (www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials – instructional resources and computational libraries. PMID:21297884

  13. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  14. Resources for Teaching with Computers in History.

    ERIC Educational Resources Information Center

    Seiter, David M.

    1988-01-01

    Highlights seven resources for teaching history with computers. Titles include "Technology and the Social Studies: A Vision"; "Using Databases in History Teaching"; "Computers and Database Management in the History Curriculum"; "Social Studies. Microsoft Courseware Evaluations"; "Teaching Comparative…

  15. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational Process and Material Modeling of Powder Bed additive manufacturing of IN 718. Optimize material build parameters with reduced time and cost through modeling. Increase understanding of build properties. Increase reliability of builds. Decrease time to adoption of process for critical hardware. Potential to decrease post-build heat treatments. Conduct single-track and coupon builds at various build parameters. Record build parameter information and QM Meltpool data. Refine Applied Optimization powder bed AM process model using data. Report thermal modeling results. Conduct metallography of build samples. Calibrate STK models using metallography findings. Run STK models using AO thermal profiles and report STK modeling results. Validate modeling with additional build. Photodiode Intensity measurements highly linear with power input. Melt Pool Intensity highly correlated to Melt Pool Size. Melt Pool size and intensity increase with power. Applied Optimization will use data to develop powder bed additive manufacturing process model.

  16. Production scheduling with discrete and renewable additional resources

    NASA Astrophysics Data System (ADS)

    Kalinowski, K.; Grabowik, C.; Paprocka, I.; Kempa, W.

    2015-11-01

    In this paper an approach to planning additional resources when scheduling operations is discussed. The considered resources are assumed to be discrete and renewable. In most scheduling research, the basic and often the only type of resource considered is a workstation, which can be understood as a machine, a device, or even a separated space on the shop floor. In many cases, however, detailed scheduling of an operation reveals that more than one resource is required for its execution; these requirements may concern different resources or several resources of the same type. Additional resources most often refer to the human resources, tools or equipment whose limited availability in the manufacturing system may influence the execution dates of some operations. The paper presents the concept of dividing resources into basic and additional ones, together with a planning method, and also considers the situation in which the sets of basic and additional resources are not separable (the same additional resource may be a basic resource for another operation). Scheduling operations that involve a greater number of resources can cause many difficulties, depending on whether a resource is engaged for the entire duration of the operation, only in selected parts of it (e.g. as auxiliary staff at setup time), or cyclically (e.g. when an operator supports more than one machine or supervises the execution of several operations). For this reason the dates and durations of resource participation in an operation can differ. These issues are crucial when modelling a production scheduling environment and designing data structures for scheduling software development.
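
    As a purely illustrative sketch (not the implementation described in the paper), an operation can be modelled with one basic resource engaged for its full duration and additional resources engaged only for parts of it, which is the information a scheduler needs in order to check availability:

      # Illustrative sketch only: operation with a basic resource plus additional
      # resources engaged for selected parts of its duration.
      from dataclasses import dataclass, field

      @dataclass
      class ResourceUse:
          resource: str        # e.g. "operator_A", "crane_1"
          start_offset: int    # minutes after the operation starts
          duration: int        # minutes the resource is actually engaged

      @dataclass
      class Operation:
          name: str
          duration: int
          basic: str                                        # workstation, engaged for the full duration
          additional: list = field(default_factory=list)    # ResourceUse entries

          def resource_intervals(self, start_time):
              """Return (resource, start, end) tuples needed to check availability."""
              intervals = [(self.basic, start_time, start_time + self.duration)]
              for use in self.additional:
                  s = start_time + use.start_offset
                  intervals.append((use.resource, s, s + use.duration))
              return intervals

      op = Operation("milling_010", duration=60, basic="machine_M3",
                     additional=[ResourceUse("setter_J", 0, 10),       # setup only
                                 ResourceUse("operator_K", 0, 60)])    # whole operation
      print(op.resource_intervals(start_time=480))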

  17. Resourceful Computing in Unstructured Environments

    DTIC Science & Technology

    1991-07-31

    solutions for ultra-large scale computing problems. "* Analogical and common sense reasoning for analysis and design. The remainder of this report...an analog VLSI chip embedding one of the earliest Vision Machine algorithms--edge detection-integrated with a CCD imager on the same chip. 4 Hardware...Behaviors from a Carefully Evolved Network." Neural Computation 1 (1989). Brooks, R.A. "AL: Great Expectations." Guest editorial, Manufacturing Engineering

  18. Software and resources for computational medicinal chemistry

    PubMed Central

    Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C

    2011-01-01

    Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404

  19. Methods and systems for providing reconfigurable and recoverable computing resources

    NASA Technical Reports Server (NTRS)

    Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)

    2010-01-01

    A method for optimizing the use of digital computing resources to achieve reliability and availability of the computing resources is disclosed. The method comprises providing one or more processors with a recovery mechanism, the one or more processors executing one or more applications. A determination is made whether the one or more processors need to be reconfigured. A rapid recovery is employed to reconfigure the one or more processors when needed. A computing system that provides reconfigurable and recoverable computing resources is also disclosed. The system comprises one or more processors with a recovery mechanism, with the one or more processors configured to execute a first application, and an additional processor configured to execute a second application different from the first application. The additional processor is reconfigurable with rapid recovery such that the additional processor can execute the first application when one of the one or more processors fails.
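
    A minimal, hypothetical sketch of the failover idea in this record, assuming an additional (spare) processor that takes over the first application when a primary processor fails; the names and the reconfigure() helper are illustrative only, not the patented mechanism:

      # Hypothetical sketch: a spare processor recovers the application of a failed primary.
      class Processor:
          def __init__(self, name, app=None):
              self.name, self.app, self.healthy = name, app, True

      def reconfigure(primaries, spare):
          """If a primary processor has failed, rapidly recover its application on the spare."""
          for p in primaries:
              if not p.healthy and p.app is not None:
                  recovered, p.app = p.app, None
                  spare.app = recovered
                  return f"{spare.name} now runs {recovered} (recovered from {p.name})"
          return "no reconfiguration needed"

      primaries = [Processor("P1", "flight_control"), Processor("P2", "flight_control")]
      spare = Processor("P3", "maintenance_app")
      primaries[0].healthy = False
      print(reconfigure(primaries, spare))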

  20. Estimating Z-Pinch computing resources.

    SciTech Connect

    Brunner, Thomas A.

    2007-04-01

    The Z facility at Sandia National Laboratories produces high energy density environments. Computer simulations of the experiments provide key insights and help make the most efficient use of the facility. This document estimates the computer resources needed in order to support the experimental program. The resource estimate is what we would like to have in about five years and assumes that we will have a robust, scalable simulation capability as well as enough physicists to run the simulations.

  1. Computer Network Resources for Physical Geography Instruction.

    ERIC Educational Resources Information Center

    Bishop, Michael P.; And Others

    1993-01-01

    Asserts that the use of computer networks provides an important and effective resource for geography instruction. Describes the use of the Internet network in physical geography instruction. Provides an example of the use of Internet resources in a climatology/meteorology course. (CFR)

  2. Computers and the Classroom. A Resource Guide.

    ERIC Educational Resources Information Center

    CEMREL, Inc., St. Louis, MO.

    Designed for use by educators trying to establish or find networks providing access to educational computing information and avenues for the exchange of ideas and experiences, this guide brings together and describes several different types of resources to provide a base from which other contacts can be made. The resources listed focus on…

  3. Distributing Computer Resources in Education and Training.

    ERIC Educational Resources Information Center

    Bell, Wynne

    1982-01-01

    The future direction of computers in educational settings is the topic of speculation. It is noted that resources in education are so meagre that only practical ventures can be considered. Suggestions are made for stretching available resources and maximizing the benefits to be gained through the new technology. (MP)

  4. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  5. Modern meteorological computing resources - The Maryland experience

    NASA Technical Reports Server (NTRS)

    Huffman, George J.

    1988-01-01

    The Department of Meteorology at the University of Maryland is developing one of the first computer systems in meteorology to take advantage of the new networked computer architecture that has been made possible by recent advances in computer and communication technology. Elements of the department's system include scientific workstations, local mainframe computers, remote mainframe computers, local-area networks, 'long-haul' computer-to-computer communications, and 'receive-only' communications. Some background is provided, together with highlights of some lessons that were learned in carrying out the design. In agreement with work in the Unidata Project, this work shows that the networked computer architecture discussed here presents a new style of resources for solving problems that arise in meteorological research and education.

  6. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until layer-by-layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  7. Computed tomography characterisation of additive manufacturing materials.

    PubMed

    Bibb, Richard; Thompson, Darren; Winder, John

    2011-06-01

    Additive manufacturing, covering processes frequently referred to as rapid prototyping and rapid manufacturing, provides new opportunities in the manufacture of highly complex and custom-fitting medical devices and products. Whilst many medical applications of AM have been explored and physical properties of the resulting parts have been studied, the characterisation of AM materials in computed tomography has not been explored. The aim of this study was to determine the CT number of commonly used AM materials. There are many potential applications of the information resulting from this study in the design and manufacture of wearable medical devices, implants, prostheses and medical imaging test phantoms. A selection of 19 AM material samples were CT scanned and the resultant images analysed to ascertain the materials' CT number and appearance in the images. It was found that some AM materials have CT numbers very similar to human tissues, that FDM, SLA and SLS produce samples that appear uniform on CT images, and that 3D-printed materials show a variation in internal structure.

  8. Argonne's Laboratory computing resource center : 2006 annual report.

    SciTech Connect

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    computing facilities, and improving the scientific reach and performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has framed a 'path forward' for additional computing resources.

  9. Mission Critical Computer Resources Management Guide

    DTIC Science & Technology

    1988-09-01

    Table of contents excerpt: Chapter 6, Software Test and Evaluation (6.1 Test Planning; 6.1.1 System Support Computer Resources); Chapter 8, Planning for Computer Software (8.1 Introduction; 8.2 Plans and Documentation; 8.2.1 Program Management Plan (PMP); 8.2.2 Test and Evaluation Master Plan (TEMP)).

  10. Argonne Laboratory Computing Resource Center - FY2004 Report.

    SciTech Connect

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  11. Argonne's Laboratory Computing Resource Center 2009 annual report.

    SciTech Connect

    Bair, R. B.

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz is a system researchers can count on to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  12. Optimised resource construction for verifiable quantum computation

    NASA Astrophysics Data System (ADS)

    Kashefi, Elham; Wallden, Petros

    2017-04-01

    Recent developments have brought the possibility of achieving scalable quantum networks and quantum devices closer. From the computational point of view these emerging technologies become relevant when they are no longer classically simulatable. Hence a pressing challenge is the construction of practical methods to verify the correctness of the outcome produced by universal or non-universal quantum devices. A promising approach that has been extensively explored is the scheme of verification via encryption through blind quantum computation. We present here a new construction that simplifies the required resources for any such verifiable protocol. We obtain an overhead that is linear in the size of the input (computation), while the security parameter remains independent of the size of the computation and can be made exponentially small (with a small extra cost). Furthermore our construction is generic and could be applied to any universal or non-universal scheme with a given underlying graph.

  13. Exploiting volatile opportunistic computing resources with Lobster

    NASA Astrophysics Data System (ADS)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools have been developed to facilitate system optimisation. We have tested Lobster using the 20 000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.

  14. Automating usability of ATLAS Distributed Computing resources

    NASA Astrophysics Data System (ADS)

    Tupputi, S. A.; Di Girolamo, A.; Kouba, T.; Schovancová, J.; Atlas Collaboration

    2014-06-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. In this perspective a crucial case is the automatic handling of outages of ATLAS computing sites storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool has provided a suitable solution, by employing an inference algorithm which processes history of storage monitoring tests outcome. SAAB accomplishes both the tasks of providing global monitoring as well as automatic operations on single sites. The implementation of the SAAB tool has been the first step in a comprehensive review of the storage areas monitoring and central management at all levels. Such review has involved the reordering and optimization of SAM tests deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows monitoring the storage resources status with fine time-granularity and automatic actions to be taken in foreseen cases, like automatic outage handling and notifications to sites. Hence, the human actions are restricted to reporting and following up problems, where and when needed. In this work we show SAAB working principles and features. We present also the decrease of human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
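
    The automatic blacklisting decision can be illustrated with a toy sketch, assuming a simple failure-count rule over the most recent monitoring tests; the window and thresholds are invented and do not reproduce the actual SAAB inference algorithm:

      # Illustrative sketch only: blacklist a storage area after a sustained failure pattern.
      def evaluate(history, window=6, max_failures=4):
          """history: list of booleans (True = monitoring test passed), newest last."""
          recent = history[-window:]
          failures = recent.count(False)
          if len(recent) == window and failures >= max_failures:
              return "blacklist"          # automatic outage handling + notification to the site
          if failures == 0:
              return "online"
          return "degraded"               # keep online, flag for human follow-up

      print(evaluate([True, True, False, False, False, True, False]))   # -> 'blacklist'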

  15. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    SciTech Connect

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    comprehensive scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has begun developing a 'path forward' plan for additional computing resources.

  16. Addition of multiple limiting resources reduces grassland diversity.

    PubMed

    Harpole, W Stanley; Sullivan, Lauren L; Lind, Eric M; Firn, Jennifer; Adler, Peter B; Borer, Elizabeth T; Chase, Jonathan; Fay, Philip A; Hautier, Yann; Hillebrand, Helmut; MacDougall, Andrew S; Seabloom, Eric W; Williams, Ryan; Bakker, Jonathan D; Cadotte, Marc W; Chaneton, Enrique J; Chu, Chengjin; Cleland, Elsa E; D'Antonio, Carla; Davies, Kendi F; Gruner, Daniel S; Hagenah, Nicole; Kirkman, Kevin; Knops, Johannes M H; La Pierre, Kimberly J; McCulley, Rebecca L; Moore, Joslin L; Morgan, John W; Prober, Suzanne M; Risch, Anita C; Schuetz, Martin; Stevens, Carly J; Wragg, Peter D

    2016-09-01

    Niche dimensionality provides a general theoretical explanation for biodiversity: more niches, defined by more limiting factors, allow for more ways that species can coexist. Because plant species compete for the same set of limiting resources, theory predicts that addition of a limiting resource eliminates potential trade-offs, reducing the number of species that can coexist. Multiple nutrient limitation of plant production is common and therefore fertilization may reduce diversity by reducing the number or dimensionality of belowground limiting factors. At the same time, nutrient addition, by increasing biomass, should ultimately shift competition from belowground nutrients towards a one-dimensional competitive trade-off for light. Here we show that plant species diversity decreased when a greater number of limiting nutrients were added across 45 grassland sites from a multi-continent experimental network. The number of added nutrients predicted diversity loss, even after controlling for effects of plant biomass, and even where biomass production was not nutrient-limited. We found that elevated resource supply reduced niche dimensionality and diversity and increased both productivity and compositional turnover. Our results point to the importance of understanding dimensionality in ecological systems that are undergoing diversity loss in response to multiple global change factors.

  17. Addition of multiple limiting resources reduces grassland diversity

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Niche dimensionality is the most general theoretical explanation for biodiversity: more niches allow for more ecological tradeoffs between species and thus greater opportunities for coexistence. Resource competition theory predicts that removing resource limitations, by increasing resource availabil...

  18. Scalable resource management in high performance computers.

    SciTech Connect

    Frachtenberg, E.; Petrini, F.; Fernandez Peinador, J.; Coll, S.

    2002-01-01

    Clusters of workstations have emerged as an important platform for building cost-effective, scalable and highly-available computers. Although many hardware solutions are available today, the largest challenge in making large-scale clusters usable lies in the system software. In this paper we present STORM, a resource management tool designed to provide scalability, low overhead and the flexibility necessary to efficiently support and analyze a wide range of job scheduling algorithms. STORM achieves these feats by closely integrating the management daemons with the low-level features that are common in state-of-the-art high-performance system area networks. The architecture of STORM is based on three main technical innovations. First, a sizable part of the scheduler runs in the thread processor located on the network interface. Second, we use hardware collectives that are highly scalable both for implementing control heartbeats and to distribute the binary of a parallel job in near-constant time, irrespective of job and machine sizes. Third, we use an I/O bypass protocol that allows fast data movements from the file system to the communication buffers in the network interface and vice versa. The experimental results show that STORM can launch a job with a binary of 12MB on a 64 processor/32 node cluster in less than 0.25 sec on an empty network, in less than 0.45 sec when all the processors are busy computing other jobs, and in less than 0.65 sec when the network is flooded with a background traffic. This paper provides experimental and analytical evidence that these results scale to a much larger number of nodes. To the best of our knowledge, STORM is at least two orders of magnitude faster than existing production schedulers in launching jobs, performing resource management tasks and gang scheduling.

  19. Cost-Benefit Analysis of Computer Resources for Machine Learning

    USGS Publications Warehouse

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
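
    The stratified sampling idea can be sketched in a few lines of Python, assuming a hypothetical error measure per region that steers where calibration points are drawn; the regions, error values and budget are made up for illustration and are not the report's data:

      # Hypothetical illustration: sample more calibration points where the error is large.
      import random
      random.seed(0)

      def stratified_sample(points_by_region, error_by_region, budget):
          """Allocate a fixed sampling budget across regions in proportion to their error."""
          total_err = sum(error_by_region.values())
          sample = []
          for region, points in points_by_region.items():
              share = error_by_region[region] / total_err
              n = min(len(points), max(1, round(budget * share)))
              sample.extend(random.sample(points, n))
          return sample

      regions = {"urban": list(range(1000)), "suburban": list(range(1000)), "rural": list(range(1000))}
      errors = {"urban": 0.9, "suburban": 0.4, "rural": 0.1}
      calib = stratified_sample(regions, errors, budget=300)
      print(len(calib), "calibration points selected")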

  20. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  1. Resource estimation in high performance medical image computing.

    PubMed

    Banalagay, Rueben; Covington, Kelsie Jade; Wilkes, D M; Landman, Bennett A

    2014-10-01

    Medical imaging analysis processes often involve the concatenation of many steps (e.g., multi-stage scripts) to integrate and realize advancements from image acquisition, image processing, and computational analysis. With the dramatic increase in data size for medical imaging studies (e.g., improved resolution, higher throughput acquisition, shared databases), interesting study designs are becoming intractable or impractical on individual workstations and servers. Modern pipeline environments provide control structures to distribute computational load in high performance computing (HPC) environments. However, high performance computing environments are often shared resources, and scheduling computation across these resources necessitates higher level modeling of resource utilization. Submission of 'jobs' requires an estimate of the CPU runtime and memory usage. The resource requirements for medical image processing algorithms are difficult to predict since the requirements can vary greatly between different machines, different execution instances, and different data inputs. Poor resource estimates can lead to wasted resources in high performance environments due to incomplete executions and extended queue wait times. Hence, resource estimation is becoming a major hurdle for medical image processing algorithms to efficiently leverage high performance computing environments. Herein, we present our implementation of a resource estimation system to overcome these difficulties and ultimately provide users with the ability to more efficiently utilize high performance computing resources.
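
    A minimal sketch of the estimation idea, assuming a small history of previous executions of a pipeline step is available; the data, the linear model and the safety factor are illustrative assumptions rather than the system described in the paper:

      # Sketch: predict CPU hours and memory for a new input from past runs, with a margin.
      import numpy as np

      # (input_megavoxels, cpu_hours, memory_gb) from hypothetical previous executions
      history = np.array([[10, 0.5, 2.1], [20, 1.1, 3.9], [40, 2.3, 8.2], [80, 4.4, 15.8]])

      def estimate(input_size, safety_factor=1.25):
          """Predict CPU hours and memory for a new input, padded by a safety factor."""
          x = history[:, 0]
          cpu_fit = np.polyfit(x, history[:, 1], 1)      # linear fit: cpu ~ a*size + b
          mem_fit = np.polyfit(x, history[:, 2], 1)
          cpu = np.polyval(cpu_fit, input_size) * safety_factor
          mem = np.polyval(mem_fit, input_size) * safety_factor
          return round(float(cpu), 2), round(float(mem), 1)

      print(estimate(60))   # request these limits when submitting the job to the cluster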

  2. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  3. Diversity in computing technologies and strategies for dynamic resource allocation

    DOE PAGES

    Garzoglio, G.; Gutsche, O.

    2015-12-23

    High Energy Physics (HEP) is a very data-intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer detail, requiring ever-increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP has provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.

  4. Computing the Envelope for Stepwise-Constant Resource Allocations

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Computing tight resource-level bounds is a fundamental problem in the construction of flexible plans with resource utilization. In this paper we describe an efficient algorithm that builds a resource envelope, the tightest possible such bound. The algorithm is based on transforming the temporal network of resource consuming and producing events into a flow network with nodes equal to the events and edges equal to the necessary predecessor links between events. A staged maximum flow problem on the network is then used to compute the time of occurrence and the height of each step of the resource envelope profile. Each stage has the same computational complexity of solving a maximum flow problem on the entire flow network. This makes this method computationally feasible and promising for use in the inner loop of flexible-time scheduling algorithms.

  5. Computing the Envelope for Stepwise Constant Resource Allocations

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Estimating tight resource levels is a fundamental problem in the construction of flexible plans with resource utilization. In this paper we describe an efficient algorithm that builds a resource envelope, the tightest possible such bound. The algorithm is based on transforming the temporal network of resource consuming and producing events into a flow network with nodes equal to the events and edges equal to the necessary predecessor links between events. The incremental solution of a staged maximum flow problem on the network is then used to compute the time of occurrence and the height of each step of the resource envelope profile. The staged algorithm has the same computational complexity as solving a maximum flow problem on the entire flow network. This makes this method computationally feasible for use in the inner loop of search-based scheduling algorithms.
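
    The max-flow connection can be illustrated with a small sketch: among the pending events at a time point, choosing a predecessor-closed subset of resource-changing events that maximises the level is an instance of the maximum-weight closure problem, which reduces to a minimum cut. The code below is only an illustration of that reduction (assuming networkx is available), not the staged envelope algorithm of these papers:

      # Illustration of the maximum-weight closure reduction used to bound a resource level.
      import networkx as nx

      def max_pending_contribution(deltas, precedences):
          """deltas: {event: resource change}; precedences: (u, v) means u must precede v."""
          g = nx.DiGraph()
          g.add_node("s")
          g.add_node("t")
          gain = 0
          for e, d in deltas.items():
              if d > 0:
                  g.add_edge("s", e, capacity=d)     # producers hang off the source
                  gain += d
              elif d < 0:
                  g.add_edge(e, "t", capacity=-d)    # consumers connect to the sink
          for u, v in precedences:
              g.add_edge(v, u)                       # no 'capacity' attribute = infinite capacity
          cut_value, _ = nx.minimum_cut(g, "s", "t")
          return gain - cut_value                    # best net contribution of a predecessor-closed set

      # Producer p1 (+3) must precede consumer c1 (-2); producer p2 (+1) is independent.
      print(max_pending_contribution({"p1": 3, "c1": -2, "p2": 1}, [("p1", "c1")]))   # -> 4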

  6. ACToR: Aggregated Computational Toxicology Resource (S)

    EPA Science Inventory

    We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.

  7. Renewable Fuel Standard Program (RFS1): Final Rule Additional Resources

    EPA Pesticide Factsheets

    The final rule on fuels and fuel additives (Renewable Fuel Standard Program) was published on May 1, 2007 and became effective on September 1, 2007. Links to this final rule and to the technical amendments supporting it are provided.

  8. Renewable Fuel Standard (RFS2): Final Rule Additional Resources

    EPA Pesticide Factsheets

    The final rule on fuels and fuel additives (Renewable Fuel Standard Program) was published on March 26, 2010 and became effective on July 1, 2010. Links to this final rule and to the technical amendments supporting it are provided.

  9. Decentralized Resource Management in Distributed Computer Systems.

    DTIC Science & Technology

    1982-02-01

    processor at a given point in time. The second activity is the cooperation of processors in balancing their loads. The method by which processors...given synchronization method will be practical if it is easy to use at the user level. At the user level, the user should be required only to specify...access the shared resources. Such a method, however, neither supports concurrent access nor supports any priority policy other than round-robin

  10. ACToR: Aggregated Computational Toxicology Resource (S) ...

    EPA Pesticide Factsheets

    We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.

  11. Resource requirements for digital computations on electrooptical systems

    NASA Astrophysics Data System (ADS)

    Eshaghian, Mary M.; Panda, Dhabaleswar K.; Kumar, V. K. Prasanna

    1991-03-01

    The resource requirements of electrooptical organizations in performing digital computing tasks are studied via a generic model of parallel computation using optical interconnects, called the 'optical model of computation' (OMC). In this model, computation is performed in digital electronics and communication is performed using free space optics. Relationships between information transfer and computational resources in solving a given problem are derived. A computationally intensive operation, two-dimensional digital image convolution is undertaken. Irrespective of the input/output scheme and the order of computation, a lower bound of Omega(nw) is obtained on the optical volume required for convolving a w x w kernel with an n x n image, if the input bits are given to the system only once.

  12. Computer Resources Handbook for Flight Critical Systems.

    DTIC Science & Technology

    1985-01-01

    in avionic systems are suspected of being due to software. In a study of software reliability for digital flight controls conducted by SoHaR for the...aircraft and flight crew -- the use of computers in flight critical applications. Special reliability and fault tolerance (RAFT) techniques are being Used...tolerance in flight critical systems. Conventional reliability techniques and analysis and reliability improvement techniques at the system level are

  13. Computer Usage as Instructional Resources for Vocational Training in Nigeria

    ERIC Educational Resources Information Center

    Oguzor, Nkasiobi Silas

    2011-01-01

    The use of computers has become the driving force in the delivery of instruction in today's vocational education and training (VET) in Nigeria. Though computers have become an increasingly accessible resource for educators to use in their teaching activities, most teachers are still unable to integrate them into their teaching and learning processes.…

  14. Performance Evaluation of Resource Management in Cloud Computing Environments.

    PubMed

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price.
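
    A toy sketch of the on-the-fly scaling decision evaluated in the paper, assuming a single SLA metric (response time); the thresholds, step sizes and prices are invented for illustration:

      # Toy sketch: scale the allocation (and its price) against an SLA response time.
      def rescale(current_vcpus, observed_ms, sla_ms, price_per_vcpu=0.04):
          if observed_ms > sla_ms:                                   # SLA violated: scale up
              new = current_vcpus + 2
          elif observed_ms < 0.5 * sla_ms and current_vcpus > 2:     # over-provisioned: scale down
              new = current_vcpus - 1
          else:
              new = current_vcpus
          return new, round(new * price_per_vcpu, 2)

      print(rescale(current_vcpus=4, observed_ms=950, sla_ms=800))   # -> (6, 0.24)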

  15. Shared resource control between human and computer

    NASA Technical Reports Server (NTRS)

    Hendler, James; Wilson, Reid

    1989-01-01

    The advantages of an AI system actively monitoring human control of a shared resource (such as a telerobotic manipulator) are presented. A system is described in which a simple AI planning program gains efficiency by monitoring human actions and recognizing when the actions cause a change in the system's assumed state of the world. This enables the planner to recognize when an interaction occurs between human actions and system goals and to maintain up-to-date knowledge of the state of the world; the system can thus inform the operator when a human action would undo a goal achieved by the system or would render a system goal unachievable, and it can efficiently replan the establishment of goals after human intervention.

  16. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of super-computers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation to support the development of strategies improving aviation safety, identifying precursors to component failure.

  17. Computational resources and tools for antimicrobial peptides.

    PubMed

    Liu, Shicai; Fan, Linlin; Sun, Jian; Lao, Xingzhen; Zheng, Heng

    2017-01-01

    Antimicrobial peptides (AMPs), as evolutionarily conserved components of the innate immune system, protect against pathogens including bacteria, fungi, viruses, and parasites. In general, AMPs are relatively small peptides (<10 kDa) with cationic nature and amphipathic structure and have modes of action different from traditional antibiotics. Up to now, more than 19 000 AMPs have been reported, including those isolated from natural sources or obtained by synthesis. They have been considered to be promising substitutes for conventional antibiotics in the quest to address the increasing occurrence of antibiotic resistance. However, most AMPs have modest direct antimicrobial activity, and their mechanisms of action, as well as their structure-activity relationships, are still poorly understood. Computational strategies are invaluable assets to provide insight into the activity of AMPs and thus exploit their potential as a new generation of antimicrobials. This article reviews the advances of AMP databases and computational tools for the prediction and design of new active AMPs. Copyright © 2016 European Peptide Society and John Wiley & Sons, Ltd.

  18. Incorporating computational resources in a cancer research program

    PubMed Central

    Woods, Nicholas T.; Jhuraney, Ankita; Monteiro, Alvaro N.A.

    2015-01-01

    Recent technological advances have transformed cancer genetics research. These advances have served as the basis for the generation of a number of richly annotated datasets relevant to the cancer geneticist. In addition, many of these technologies are now within reach of smaller laboratories to answer specific biological questions. Thus, one of the most pressing issues facing an experimental cancer biology research program in genetics is incorporating data from multiple sources to annotate, visualize, and analyze the system under study. Fortunately, there are several computational resources to aid in this process. However, a significant effort is required to adapt a molecular biology-based research program to take advantage of these datasets. Here, we discuss the lessons learned in our laboratory and share several recommendations to make this transition effectively. This article is not meant to be a comprehensive evaluation of all the available resources, but rather highlight those that we have incorporated into our laboratory and how to choose the most appropriate ones for your research program. PMID:25324189

  19. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  20. The Job Demands-Resources Model: An Analysis of Additive and Joint Effects of Demands and Resources

    ERIC Educational Resources Information Center

    Hu, Qiao; Schaufeli, Wilmar B.; Taris, Toon W.

    2011-01-01

    The present study investigated the additive, synergistic, and moderating effects of job demands and job resources on well-being (burnout and work engagement) and organizational outcomes, as specified by the Job Demands-Resources (JD-R) model. A survey was conducted among two Chinese samples: 625 blue collar workers and 761 health professionals. A…

  1. The Gain of Resource Delegation in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Fölling, Alexander; Grimme, Christian; Lepping, Joachim; Papaspyrou, Alexander

    In this paper, we address job scheduling in Distributed Computing Infrastructures, that is, loosely coupled networks of autonomously acting High Performance Computing systems. In contrast to the common approach of mutual workload exchange, we consider the more intuitive operator's viewpoint of load-dependent resource reconfiguration. In the case of a site's over-utilization, the scheduling system is able to lease resources from other sites to keep up service quality for its local user community. Conversely, granting idle resources to other sites can increase utilization in times of low local workload and thus ensure higher efficiency. The evaluation considers real workload data and is done with respect to common service quality indicators. For two simple resource exchange policies and three basic setups we show the possible gain of this approach and analyze the dynamics of the workload-adaptive reconfiguration behavior.

  2. Dynamic computing resource allocation in online flood monitoring and prediction

    NASA Astrophysics Data System (ADS)

    Kuchar, S.; Podhoranyi, M.; Vavrik, R.; Portero, A.

    2016-08-01

    This paper presents tools and methodologies for dynamic allocation of high performance computing resources during operation of the Floreon+ online flood monitoring and prediction system. The resource allocation is done throughout the execution of supported simulations to meet the required service quality levels for system operation. It also ensures flexible reactions to changing weather and flood situations, as it is not economically feasible to operate online flood monitoring systems in the full performance mode during non-flood seasons. Different service quality levels are therefore described for different flooding scenarios, and the runtime manager controls them by allocating only minimal resources currently expected to meet the deadlines. Finally, an experiment covering all presented aspects of computing resource allocation in rainfall-runoff and Monte Carlo uncertainty simulation is performed for the area of the Moravian-Silesian region in the Czech Republic.
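
    A runtime manager of the kind described above can be approximated by a lookup from the current flood scenario and its deadline to the smallest resource allocation expected to meet that deadline. The quality levels, node counts, and scaling assumption below are illustrative placeholders, not values from the Floreon+ system.

        # Illustrative sketch: pick the minimal number of nodes whose estimated
        # runtime still meets the deadline of the current service-quality level.
        # All numbers are hypothetical; this is not the Floreon+ runtime manager.

        QUALITY_LEVELS = {                  # scenario -> (deadline in s, Monte Carlo samples)
            "standard": (3600, 100),
            "flood_watch": (1800, 500),
            "flood_emergency": (600, 2000),
        }

        def estimated_runtime(samples, nodes, sec_per_sample_per_node=2.0):
            # assume near-linear strong scaling for the uncertainty ensemble
            return samples * sec_per_sample_per_node / nodes

        def allocate(scenario, available_nodes):
            deadline, samples = QUALITY_LEVELS[scenario]
            for nodes in range(1, available_nodes + 1):
                if estimated_runtime(samples, nodes) <= deadline:
                    return nodes            # smallest allocation meeting the deadline
            return available_nodes          # best effort if the deadline cannot be met

        print(allocate("flood_emergency", available_nodes=64))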

  3. Economic models for management of resources in peer-to-peer and grid computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage models in these environments are a complex undertaking. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of each of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real-world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. These include commodity markets, posted prices, tenders, and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the necessary infrastructure to realize them. In addition to normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed that contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
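
    Deadline- and cost-based brokering of the kind used by Nimrod/G can be illustrated with a simple selection rule: among resources whose posted prices and speeds allow the remaining jobs to finish before the deadline, choose the cheapest one (cost optimization) or the fastest one (time optimization). The prices, speeds, and selection function below are illustrative assumptions, not the actual Nimrod/G scheduler.

        # Toy sketch of deadline/cost-based resource selection (not the Nimrod/G code).
        from dataclasses import dataclass

        @dataclass
        class Resource:
            name: str
            jobs_per_hour: float    # advertised throughput
            price_per_job: float    # posted price in some currency unit

        def pick_resource(resources, jobs_left, hours_to_deadline, strategy="cost"):
            feasible = [r for r in resources
                        if jobs_left / r.jobs_per_hour <= hours_to_deadline]
            if not feasible:
                return None                                     # deadline cannot be met by one site
            if strategy == "cost":                              # cheapest feasible resource
                return min(feasible, key=lambda r: r.price_per_job)
            return max(feasible, key=lambda r: r.jobs_per_hour) # fastest feasible resource

        grid = [Resource("site-A", 120, 0.04), Resource("site-B", 300, 0.09)]
        print(pick_resource(grid, jobs_left=1000, hours_to_deadline=6, strategy="cost"))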

  4. EST analysis pipeline: use of distributed computing resources.

    PubMed

    González, Francisco Javier; Vizcaíno, Juan Antonio

    2011-01-01

    This chapter describes how a pipeline for the analysis of expressed sequence tag (EST) data can be implemented, based on our previous experience generating ESTs from Trichoderma spp. We focus on key steps in the workflow, such as the processing of raw data from the sequencers, the clustering of ESTs, and the functional annotation of the sequences using BLAST, InterProScan, and BLAST2GO. Some of the steps require the use of intensive computing power. Since these resources are not available for small research groups or institutes without bioinformatics support, an alternative will be described: the use of distributed computing resources (local grids and Amazon EC2).
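
    A workflow like this lends itself to a thin driver script in which each stage (read cleaning, clustering, annotation) is an independent command that can be dispatched to local machines, a grid, or EC2 instances. The sketch below only shows the shape of such a driver; the tool choices and command-line options are hedged examples that depend on the installed versions, and clean_reads.py and submit_to_grid are hypothetical placeholders rather than parts of the chapter's pipeline.

        # Skeleton of an EST-analysis driver (illustrative; not the pipeline from the chapter).
        import subprocess

        def run(cmd):
            print("running:", cmd)
            subprocess.run(cmd, shell=True, check=True)

        def submit_to_grid(cmd):
            # hypothetical hook: replace with a grid or EC2 job submission mechanism
            run(cmd)

        # 1. cleaning/trimming of raw sequencer output (placeholder script, assumed)
        run("python clean_reads.py raw_ests.fasta ests.clean.fasta")
        # 2. clustering of ESTs (cd-hit-est options are an assumption)
        run("cd-hit-est -i ests.clean.fasta -o ests.clustered.fasta -c 0.95")
        # 3. functional annotation -- the compute-heavy steps, farmed out to remote resources
        submit_to_grid("blastx -query ests.clustered.fasta -db nr -out ests.blast.xml -outfmt 5")
        submit_to_grid("interproscan.sh -i ests.clustered.fasta -f TSV -o ests.ipr.tsv")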

  5. Aggregation of Cricket Activity in Response to Resource Addition Increases Local Diversity.

    PubMed

    Szinwelski, Neucir; Rosa, Cassiano Sousa; Solar, Ricardo Ribeiro de Castro; Sperber, Carlos Frankl

    2015-01-01

    Crickets are often found feeding on fallen fruits among forest litter. Fruits and other sugar-rich resources are not homogeneously distributed, nor are they always available. We therefore expect that crickets dwelling in forest litter have a limited supply of sugar-rich resource, and will perceive this and displace towards resource-supplemented sites. Here we evaluate how sugar availability affects cricket species richness and abundance in old-growth Atlantic forest by spraying sugarcane syrup on leaf litter, simulating increasing availability, and collecting crickets via pitfall trapping. We found an asymptotic positive association between resource addition and species richness, and an interaction between resource addition and species identity on cricket abundance, which indicates differential effects of resource addition among cricket species. Our results indicate that 12 of the 13 cricket species present in forest litter are maintained at low densities by resource scarcity; this highlights sugar-rich resource as a short-term driver of litter cricket community structure in tropical forests. When resource was experimentally increased, species richness increased due to behavioral displacement. We present evidence that the density of many species is limited by resource scarcity and, when resources are added, behavioral displacement promotes increased species packing and alters species composition. Further, our findings have technical applicability for increasing sampling efficiency of local cricket diversity in studies aiming to estimate species richness, but with no regard to local environmental drivers or species-abundance characteristics.

  6. Computer Technology in Curriculum and Instruction Handbook: Resources.

    ERIC Educational Resources Information Center

    Collins, Sue; And Others

    This annotated bibliography and resource list is the fourth guide in a series that constitutes a handbook on educational computing for teachers and administrators. Citations include the title, author, publisher, copyright date, current price, and publisher's address. Research and journal articles are listed under the following topic areas:…

  7. ACToR: Aggregated Computational Toxicology Resource (T)

    EPA Science Inventory

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) to serve as a repository of public toxicology information ...

  8. X-ray computed tomography for additive manufacturing: a review

    NASA Astrophysics Data System (ADS)

    Thompson, A.; Maskery, I.; Leach, R. K.

    2016-07-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

  9. Additional support for the TDK/MABL computer program

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dunn, Stuart S.

    1993-01-01

    An advanced version of the Two-Dimensional Kinetics (TDK) computer program was developed under contract and released to the propulsion community in early 1989. Exposure of the code to this community indicated a need for improvements in certain areas. In particular, the TDK code needed to be adapted to the special requirements imposed by the Space Transportation Main Engine (STME) development program. This engine utilizes injection of the gas generator exhaust into the primary nozzle by means of a set of slots. The subsequent mixing of this secondary stream with the primary stream with finite rate chemical reaction can have a major impact on the engine performance and the thermal protection of the nozzle wall. In attempting to calculate this reacting boundary layer problem, the Mass Addition Boundary Layer (MABL) module of TDK was found to be deficient in several respects. For example, when finite rate chemistry was used to determine gas properties, (MABL-K option) the program run times became excessive because extremely small step sizes were required to maintain numerical stability. A robust solution algorithm was required so that the MABL-K option could be viable as a rocket propulsion industry design tool. Solving this problem was a primary goal of the phase 1 work effort.

  10. Exploiting multicore compute resources in the CMS experiment

    NASA Astrophysics Data System (ADS)

    Ramírez, J. E.; Pérez-Calero Yzquierdo, A.; Hernández, J. M.; CMS Collaboration

    2016-10-01

    CMS has developed a strategy to efficiently exploit the multicore architecture of the compute resources accessible to the experiment. A coherent use of the multiple cores available in a compute node yields substantial gains in terms of resource utilization. The implemented approach makes use of the multithreading support of the event processing framework and the multicore scheduling capabilities of the resource provisioning system. Multicore slots are acquired and provisioned by means of multicore pilot agents which internally schedule and execute single and multicore payloads. Multicore scheduling and multithreaded processing are currently used in production for online event selection and prompt data reconstruction. More workflows are being adapted to run in multicore mode. This paper presents a review of the experience gained in the deployment and operation of the multicore scheduling and processing system, the current status and future plans.
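
    The scheduling pattern described, a multicore pilot that internally packs single-core and multicore payloads into the cores of its slot, can be reduced to a small bin-packing loop. The sketch below is a simplified illustration under that assumption; it is not the actual glideinWMS/HTCondor implementation, and the payload names and core counts are invented.

        # Simplified sketch of a multicore pilot packing payloads into an 8-core slot.
        # Greedy first-fit illustration; not the glideinWMS scheduling logic.
        from collections import deque

        SLOT_CORES = 8
        queue = deque([("reco_multithreaded", 4), ("analysis_1", 1), ("analysis_2", 1),
                       ("simulation", 4), ("analysis_3", 1)])   # (payload, cores needed)

        free_cores = SLOT_CORES
        running, deferred = [], []
        while queue:
            name, cores = queue.popleft()
            if cores <= free_cores:
                running.append((name, cores))    # start payload inside the pilot
                free_cores -= cores
            else:
                deferred.append((name, cores))   # wait until enough cores are released

        print("running now:", running)
        print("deferred until cores free up:", deferred)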

  11. Computing Bounds on Resource Levels for Flexible Plans

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute the looser bounds currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The asymptotic complexity of the algorithm is O(N · maxflow(N)), where maxflow(N) denotes the cost of a maximum-flow computation on a network with on the order of N nodes.
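
    The envelope concept itself, before any efficient algorithm, can be stated as a brute-force computation: for every fixed-time instantiation consistent with the flexible plan, evaluate the resource profile and keep the pointwise minimum and maximum. The toy sketch below does exactly that by enumeration over a tiny invented plan; it illustrates what the envelope is, not the polynomial-time shortest-path/max-flow algorithm of the paper, and it omits inter-activity temporal constraints for brevity.

        # Brute-force resource-level envelope for a tiny flexible plan (illustration only).
        # Each activity changes the resource level at a single event whose time is flexible
        # within a window; the efficient algorithm in the paper avoids this enumeration.
        from itertools import product

        activities = [          # (earliest time, latest time, resource change at the event)
            (0, 2, +1),         # production event, may happen at t = 0, 1 or 2
            (1, 3, -1),         # consumption event, may happen at t = 1, 2 or 3
        ]
        horizon = 5

        lower = [float("inf")] * horizon
        upper = [float("-inf")] * horizon
        windows = [range(lo, hi + 1) for lo, hi, _ in activities]

        for times in product(*windows):     # every fixed-time instantiation
            level, profile = 0, []
            for t in range(horizon):
                level += sum(d for (_, _, d), et in zip(activities, times) if et == t)
                profile.append(level)
            lower = [min(a, b) for a, b in zip(lower, profile)]
            upper = [max(a, b) for a, b in zip(upper, profile)]

        print("lower envelope:", lower)
        print("upper envelope:", upper)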

  12. 15 CFR 270.204 - Provision of additional resources and services needed by a Team.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Code of Federal Regulations, Title 15, Section 270.204 (Commerce and Foreign Trade; National Construction Safety Teams; Investigations): Provision of additional resources and services needed by a Team. The Director will determine the appropriate resources that a...

  13. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  14. Resource Costs for Fault-Tolerant Linear Optical Quantum Computing

    NASA Astrophysics Data System (ADS)

    Li, Ying; Humphreys, Peter C.; Mendoza, Gabriel J.; Benjamin, Simon C.

    2015-10-01

    Linear optical quantum computing (LOQC) seems attractively simple: Information is borne entirely by light and processed by components such as beam splitters, phase shifters, and detectors. However, this very simplicity leads to limitations, such as the lack of deterministic entangling operations, which are compensated for by using substantial hardware overheads. Here, we quantify the resource costs for full-scale LOQC by proposing a specific protocol based on the surface code. With the caveat that our protocol can be further optimized, we report that the required number of physical components is at least 5 orders of magnitude greater than in comparable matter-based systems. Moreover, the resource requirements grow further if the per-component photon-loss rate is worse than 10^-3 or the per-component noise rate is worse than 10^-5. We identify the performance of switches in the network as the single most important factor influencing resource scaling.

  15. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    PubMed

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience - approaching subjective behavior as the result of mental computations instantiated in the brain - to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.

  16. Pricing the Computing Resources: Reading Between the Lines and Beyond

    NASA Technical Reports Server (NTRS)

    Nakai, Junko; Veronico, Nick (Editor); Thigpen, William W. (Technical Monitor)

    2001-01-01

    Distributed computing systems have the potential to increase the usefulness of existing facilities for computation without adding anything physical, but that is realized only when necessary administrative features are in place. In a distributed environment, the best match is sought between a computing job to be run and a computer to run the job (global scheduling), which is a function that has not been required by conventional systems. Viewing the computers as 'suppliers' and the users as 'consumers' of computing services, markets for computing services/resources have been examined as one of the most promising mechanisms for global scheduling. We first establish why economics can contribute to scheduling. We further define the criterion for a scheme to qualify as an application of economics. Many studies to date have claimed to have applied economics to scheduling. If their scheduling mechanisms do not utilize economics, contrary to their claims, their favorable results do not contribute to the assertion that markets provide the best framework for global scheduling. We examine well-known scheduling schemes that concern pricing and markets, using our criterion for what constitutes an application of economics. Our conclusion is that none of the schemes examined makes full use of economics.

  17. Emotor control: computations underlying bodily resource allocation, emotions, and confidence

    PubMed Central

    Kepecs, Adam; Mensh, Brett D.

    2015-01-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience—approaching subjective behavior as the result of mental computations instantiated in the brain—to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This “emotor” control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on “confidence.” Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior. PMID:26869840

  18. Dynamic provisioning of local and remote compute resources with OpenStack

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive use of computing resources, both for the reconstruction of measured events and for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the use of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
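
    Provisioning worker virtual machines on such a private OpenStack cloud is typically scripted against the OpenStack API. The snippet below is a minimal sketch using the openstacksdk client library; the cloud name, image, flavor and network are site-specific placeholders and are not values from the EKP setup.

        # Minimal sketch: boot a batch-worker VM on an OpenStack cloud with openstacksdk.
        # Cloud name, image, flavor and network are hypothetical placeholders.
        import openstack

        conn = openstack.connect(cloud="ekp-desktop-cloud")   # credentials from clouds.yaml

        server = conn.create_server(
            name="worker-node-01",
            image="SLC6-worker",      # image carrying the experiment software environment
            flavor="m1.large",        # cores / memory of the virtual worker node
            network="batch-net",
            wait=True,                # block until the VM reaches ACTIVE state
        )
        print(server.name, server.status)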

  19. Optimal allocation of computational resources in hydrogeological models under uncertainty

    NASA Astrophysics Data System (ADS)

    Moslehi, Mahsa; Rajagopal, Ram; de Barros, Felipe P. J.

    2015-09-01

    Flow and transport models in heterogeneous geological formations are usually large-scale with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting subsurface flow and transport often entails utilizing a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field parameter representing hydrogeological characteristics of the aquifer. The physical resolution (e.g. spatial grid resolution) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We develop an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model prediction and physical errors corresponding to numerical grid resolution. Computational resources are allocated by considering the overall error based on a joint statistical-numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The performance of the framework is tested against computationally extensive simulations of flow and transport in spatially heterogeneous aquifers. Results show that modelers can achieve optimum physical and statistical resolutions while keeping a minimum error for a given computational time. The physical and statistical resolutions obtained through our analysis yield lower computational costs when compared to the results obtained with prevalent recommendations in the literature. Lastly, we highlight the significance of the geometrical characteristics of the contaminant source zone on the
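
    The trade-off described can be made concrete with a simple error model in which the discretization error decreases with grid refinement while the statistical error decreases with the number of Monte Carlo realizations, and the computational cost grows with both. The functional forms, exponents, and cost constants in the sketch below are assumptions chosen for illustration, not the error model derived in the paper.

        # Illustrative joint numerical-statistical error model (assumed forms, not the paper's):
        #   physical error      ~ C1 * h**p        (h = grid spacing)
        #   statistical error   ~ C2 / sqrt(M)     (M = Monte Carlo realizations)
        #   computational cost  ~ c * M / h**d     (d = spatial dimension)
        # Find (h, M) minimizing total error subject to a time budget, by grid search.
        import numpy as np

        C1, C2, p, d, c = 1.0, 5.0, 2.0, 2.0, 1e-4
        budget = 3600.0                                  # seconds of compute time

        best = None
        for h in np.logspace(-3, -1, 40):
            for M in np.unique(np.logspace(1, 5, 60).astype(int)):
                if c * M / h**d > budget:
                    continue                             # allocation exceeds the budget
                err = C1 * h**p + C2 / np.sqrt(M)
                if best is None or err < best[0]:
                    best = (err, h, M)

        print("total error %.4f at h=%.4g with M=%d realizations" % best)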

  20. MX Siting Investigation. Mineral Resources Survey, Seven Additional Valleys, Nevada/Utah Siting Area. Volume IV.

    DTIC Science & Technology

    1981-06-23

    OCR-damaged report header; recoverable information: report AD-A113 146, Ertec Western Inc., Long Beach, CA; Mineral Resources Survey, Seven Additional Valleys, Nevada/Utah Siting Area, Volume IV; MX Siting Investigation, prepared for the U.S. Department of the Air Force (Ballistic...), 1981.

  1. Geology and mineral and energy resources, Roswell Resource Area, New Mexico; an interactive computer presentation

    USGS Publications Warehouse

    Tidball, Ronald R.; Bartsch-Winkler, S. B.

    1995-01-01

    This Compact Disc-Read Only Memory (CD-ROM) contains a program illustrating the geology and mineral and energy resources of the Roswell Resource Area, an administrative unit of the U.S. Bureau of Land Management in east-central New Mexico. The program enables the user to access information on the geology, geochemistry, geophysics, mining history, metallic and industrial mineral commodities, hydrocarbons, and assessments of the area. The program was created with the display software, SuperCard, version 1.5, by Aldus. The program will run only on a Macintosh personal computer. This CD-ROM was produced in accordance with Macintosh HFS standards. The program was developed on a Macintosh II-series computer with system 7.0.1. The program is a compiled, executable form that is nonproprietary and does not require the presence of the SuperCard software.

  2. The Center for Computational Biology: resources, achievements, and challenges.

    PubMed

    Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2012-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators include the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.

  3. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  4. The Use of Passwords for Controlled Access to Computer Resources. Computer Science & Technology.

    ERIC Educational Resources Information Center

    Wood, Helen M.

    This paper considers the generation of passwords and their effective application to the problem of controlling access to computer resources. After describing the need for and uses of passwords, password schemes are categorized according to selection technique, lifetime, physical characteristics, and information content. Password protection, both…

  5. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

    Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  6. Enabling Grid Computing resources within the KM3NeT computing model

    NASA Astrophysics Data System (ADS)

    Filippidis, Christos

    2016-04-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that - located at the bottom of the Mediterranean Sea - will open a new window on the universe and answer fundamental questions both in particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. Most of these experiments adopt computing models consisting of different Tiers with several computing centres providing a specific set of services for the different steps of data processing, such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and, usually, span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  7. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that the 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, implant design and simulation to show the potential of the FDM technology in the medical field. It will also improve communication between medical staff and patients. The current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models.

  8. Computational calculation of equilibrium constants: addition to carbonyl compounds.

    PubMed

    Gómez-Bombarelli, Rafael; González-Pérez, Marina; Pérez-Prior, María Teresa; Calle, Emilio; Casado, Julio

    2009-10-22

    Hydration reactions are relevant for understanding many organic mechanisms. Since the experimental determination of hydration and hemiacetalization equilibrium constants is fairly complex, computational calculations now offer a useful alternative to experimental measurements. In this work, carbonyl hydration and hemiacetalization constants were calculated from the free energy differences between compounds in solution, using absolute and relative approaches. The following conclusions can be drawn: (i) The use of a relative approach in the calculation of hydration and hemiacetalization constants allows compensation of systematic errors in the solvation energies. (ii) On average, the methodology proposed here can predict hydration constants within +/- 0.5 log K(hyd) units for aldehydes. (iii) Hydration constants can be calculated for ketones and carboxylic acid derivatives within less than +/- 1.0 log K(hyd), on average, at the CBS-Q level of theory. (iv) The proposed methodology can predict hemiacetal formation constants accurately at the MP2 6-31++G(d,p) level using a common reference. If group references are used, the results obtained using the much cheaper DFT-B3LYP 6-31++G(d,p) level are almost as accurate. (v) In general, the best results are obtained if a common reference for all compounds is used. The use of group references improves the results at the lower levels of theory, but at higher levels, this becomes unnecessary.
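
    The core relation underlying such calculations is the link between the reaction free energy in solution and the equilibrium constant, K = exp(-ΔG/RT); once ΔG for the addition reaction is obtained at a given level of theory (absolutely, or relative to a reference compound so that systematic solvation errors cancel), log K follows directly. The numbers in the sketch below are placeholders, not values from the study.

        # Convert a computed reaction free energy (kcal/mol) into log10 K at 298.15 K.
        import math

        R = 1.987204e-3      # gas constant in kcal/(mol*K)
        T = 298.15           # temperature in K

        def log10_K(delta_G_kcal):
            return -delta_G_kcal / (math.log(10) * R * T)

        # hypothetical example: hydration free energy of -1.5 kcal/mol
        print(round(log10_K(-1.5), 2))    # ~= 1.10, i.e. K_hyd on the order of 12

        # relative approach: anchor to a reference compound with a known experimental log K
        def log10_K_relative(dG_target, dG_ref, logK_ref_exp):
            return logK_ref_exp + log10_K(dG_target - dG_ref)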

  9. Subjective and objective components of resource value additively increase aggression in parasitoid contests

    PubMed Central

    Stockermans, Bernard C.; Hardy, Ian C. W.

    2013-01-01

    Two major categories of factors are predicted to influence behaviour in dyadic contests; differences in the abilities of the contestants to acquire and retain resources (resource holding potential), and the value of the contested resource (resource value, RV; which comprises objective and subjective components). Recent studies indicate that subjective components affect contest behaviour in several animal taxa but few have simultaneously investigated objective RV components. We find that both an objective (host size) and a subjective (contestant age) component of RV affect contest intensity in the parasitoid wasp Goniozus legneri. These additively influence aggressiveness, with a larger effect from the subjective component than the objective component. The greater influence of subjective RV adds weight to the recent surge of recognition of this RV component's importance in contest behaviour. PMID:23697643

  10. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    ERIC Educational Resources Information Center

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  11. SInCRe—structural interactome computational resource for Mycobacterium tuberculosis

    PubMed Central

    Metri, Rahul; Hariharaputran, Sridhar; Ramakrishnan, Gayatri; Anand, Praveen; Raghavender, Upadhyayula S.; Ochoa-Montaño, Bernardo; Higueruelo, Alicia P.; Sowdhamini, Ramanathan; Chandra, Nagasuma R.; Blundell, Tom L.; Srinivasan, Narayanaswamy

    2015-01-01

    We have developed an integrated database for Mycobacterium tuberculosis H37Rv (Mtb) that collates information on protein sequences, domain assignments, functional annotation and 3D structural information along with protein–protein and protein–small molecule interactions. SInCRe (Structural Interactome Computational Resource) was developed out of the CamBan (Cambridge and Bangalore) collaboration. The motivation for developing this database is to provide an integrated platform that allows easy access to, and interpretation of, the data and results obtained by all the groups in CamBan in the field of Mtb informatics. In-house algorithms and databases developed independently by various academic groups in CamBan are used to generate Mtb-specific datasets and are integrated in this database to provide a structural dimension to studies on tuberculosis. The SInCRe database readily provides information on identification of functional domains, genome-scale modelling of structures of Mtb proteins and characterization of the small-molecule binding sites within Mtb. The resource also provides structure-based function annotation, information on small-molecule binders including FDA (Food and Drug Administration)-approved drugs, protein–protein interactions (PPIs) and natural compounds that potentially bind to pathogen proteins and weaken or eliminate host–pathogen protein–protein interactions. Together they provide prerequisites for identification of off-target binding. Database URL: http://proline.biochem.iisc.ernet.in/sincre PMID:26130660

  12. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  13. A resource-sharing model based on a repeated game in fog computing.

    PubMed

    Sun, Yan; Zhang, Nan

    2017-03-01

    With the rapid development of cloud computing techniques, the number of users is undergoing exponential growth. It is difficult for traditional data centers to perform many tasks in real time because of the limited bandwidth of resources. The concept of fog computing is proposed to support traditional cloud computing and to provide cloud services. In fog computing, the resource pool is composed of sporadic distributed resources that are more flexible and movable than a traditional data center. In this paper, we propose a fog computing structure and present a crowd-funding algorithm to integrate spare resources in the network. Furthermore, to encourage more resource owners to share their resources with the resource pool and to supervise the resource supporters as they actively perform their tasks, we propose an incentive mechanism in our algorithm. Simulation results show that our proposed incentive mechanism can effectively reduce the SLA violation rate and accelerate the completion of tasks.
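
    The incentive idea, rewarding resource owners that complete the tasks they accept and penalizing those that defect, can be illustrated with a minimal repeated-game bookkeeping loop in which each owner's future share of work depends on its past cooperation. The scoring rule, probabilities, and payoff numbers below are illustrative assumptions and do not reproduce the mechanism proposed in the paper.

        # Toy repeated-game incentive bookkeeping for fog resource owners (illustrative only).
        import random

        reputation = {"owner_A": 1.0, "owner_B": 1.0, "owner_C": 1.0}
        coop_prob = {"owner_A": 0.95, "owner_B": 0.60, "owner_C": 0.85}   # assumed behaviour

        def assign_task():
            # owners with higher reputation are proportionally more likely to receive work
            owners, weights = zip(*reputation.items())
            return random.choices(owners, weights=weights)[0]

        for round_ in range(200):
            owner = assign_task()
            completed = random.random() < coop_prob[owner]
            # reward completion, penalize defection; reputation is bounded below
            reputation[owner] = max(0.1, reputation[owner] + (0.05 if completed else -0.20))

        print({k: round(v, 2) for k, v in reputation.items()})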

  14. Computational resources for cryo-electron tomography in Bsoft.

    PubMed

    Heymann, J Bernard; Cardone, Giovanni; Winkler, Dennis C; Steven, Alasdair C

    2008-03-01

    The Bsoft package [Heymann, J.B., Belnap, D.M., 2007. Bsoft: image processing and molecular modeling for electron microscopy. J. Struct. Biol. 157, 3-18] has been enhanced by adding utilities for processing electron tomographic (ET) data; in particular, cryo-ET data characterized by low contrast and high noise. To handle the high computational load efficiently, a workflow was developed, based on the database-like parameter handling in Bsoft, aimed at minimizing user interaction and facilitating automation. To the same end, scripting elements distribute the processing among multiple processors on the same or different computers. The resolution of a tomogram depends on the precision of projection alignment, which is usually based on pinpointing fiducial markers (electron-dense gold particles). Alignment requires accurate specification of the tilt axis, and our protocol includes a procedure for determining it to adequate accuracy. Refinement of projection alignment provides information that allows assessment of its precision, as well as projection quality control. We implemented a reciprocal space algorithm that affords an alternative to back-projection or real space algorithms for calculating tomograms. Resources are also included that allow resolution assessment by cross-validation (NLOO2D); denoising and interpretation; and the extraction, mutual alignment, and averaging of tomographic sub-volumes.

  15. Dynamic resource allocation scheme for distributed heterogeneous computer systems

    NASA Technical Reports Server (NTRS)

    Liu, Howard T. (Inventor); Silvester, John A. (Inventor)

    1991-01-01

    This invention relates to resource allocation in computer systems, and more particularly, to a method and associated apparatus for shortening response time and improving efficiency of a heterogeneous distributed networked computer system by reallocating the jobs queued up for busy nodes to idle or less-busy nodes. In accordance with the algorithm (SIDA for short), the load-sharing is initiated by the server device in a manner such that extra overhead is not imposed on the system during heavily-loaded conditions. The algorithm employed in the present invention uses a dual-mode, server-initiated approach. Jobs are transferred from heavily burdened nodes (i.e., over a high threshold limit) to lightly burdened nodes at the initiation of the receiving node when: (1) a job finishes at a node which is burdened below a pre-established threshold level, or (2) a node is idle for a period of time as established by a wakeup timer at the node. The invention uses a combination of the local queue length and the local service rate ratio at each node as the workload indicator.
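
    The receiver-initiated transfer rule described above can be paraphrased as a small decision function: a node that has just finished a job, or whose wake-up timer fires while it is idle, requests work only if its own load indicator is below the low threshold, and work is pulled from a node whose indicator exceeds the high threshold. The thresholds and the particular composite load indicator below are illustrative stand-ins, not the patented formula.

        # Illustrative receiver-initiated load sharing (simplified; not the patented SIDA algorithm).
        HIGH, LOW = 0.75, 0.25            # hypothetical thresholds

        def load_indicator(queue_length, service_rate_ratio):
            # the invention combines local queue length and local service-rate ratio;
            # this particular combination is an assumption chosen for illustration
            return queue_length * service_rate_ratio / (1.0 + queue_length * service_rate_ratio)

        def maybe_pull_job(me, nodes):
            """Called when 'me' finishes a job or its wake-up timer expires while idle."""
            if load_indicator(*me) >= LOW:
                return None                               # not under-loaded, do nothing
            overloaded = [n for n in nodes if load_indicator(*nodes[n]) > HIGH]
            return max(overloaded, key=lambda n: load_indicator(*nodes[n]), default=None)

        cluster = {"node1": (9, 1.2), "node2": (1, 0.8), "node3": (6, 1.0)}
        print(maybe_pull_job((0, 1.0), cluster))          # idle node pulls from the busiest node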

  16. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    ERIC Educational Resources Information Center

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  17. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method for helping resource re-allocation in healthcare linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources because aged people require more specialised medical care, due notably to cancer. We propose a method useful to monitor changes of cancer incidence in space and time taking into account two age categories, in line with the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, according to the methodology presented in Wood (Generalised Additive Models: An Introduction with R, Chapman and Hall/CRC, 2006). Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase of the incidence of cancer along time and an obvious spatial pattern for people aged over 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for the cancer care of older people in this sub-region.
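
    In model terms, the approach amounts to a Poisson generalised additive mixed model for the expected case count in each small area, age category and year, with an offset for population, smooth main effects, and a scale-invariant space-time interaction. The decomposition written below is a plausible reading of that description rather than the exact formula fitted in the paper; y_{iat} is the count for area i, age category a (under/over 70) and year t, pop is the population offset, f_1 and f_2 are smooth effects of time and space, f_3 is the space-time interaction, and b_i is an area-level random effect.

        \log \mathbb{E}[y_{iat}] \;=\; \log(\mathrm{pop}_{iat}) + \beta_a + f_1(t) + f_2(s_i) + f_3(s_i, t) + b_i,
        \qquad y_{iat} \sim \mathrm{Poisson}.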

  18. Effects of native species diversity and resource additions on invader impact.

    PubMed

    Maron, John L; Marler, Marilyn

    2008-07-01

    Theory and empirical work have demonstrated that diverse communities can inhibit invasion. Yet, it is unclear how diversity influences invader impact, how impact varies among exotics, and what the relative importance of diversity is versus extrinsic factors that themselves can influence invasion. To address these issues, we established plant assemblages that varied in native species and functional richness and crossed this gradient in diversity with resource (water) addition. Identical assemblages were either uninvaded or invaded with one of three exotic forbs: spotted knapweed (Centaurea maculosa), dalmatian toadflax (Linaria dalmatica), or sulfur cinquefoil (Potentilla recta). To determine impacts, we measured the effects of exotics on native biomass and, for spotted knapweed, on soil moisture and nitrogen levels. Assemblages with high species richness were less invaded and less impacted than less diverse assemblages. Impact scaled with exotic biomass; spotted knapweed had the largest impact on native biomass compared with the other exotics. Although invasion depressed native biomass, the net result was to increase total community yield. Water addition increased invasibility (for knapweed only) but had no effect on invader impact. Together, these results suggest that diversity inhibits invasion and reduces impact more than resource additions facilitate invasion or impact.

  19. wolfPAC: building a high-performance distributed computing network for phylogenetic analysis using 'obsolete' computational resources.

    PubMed

    Reeves, Patrick A; Friedman, Philip H; Richards, Christopher M

    2005-01-01

    wolfPAC is an AppleScript-based software package that facilitates the use of numerous, remotely located Macintosh computers to perform computationally-intensive phylogenetic analyses using the popular application PAUP* (Phylogenetic Analysis Using Parsimony). It has been designed to utilise readily available, inexpensive processors and to encourage sharing of computational resources within the worldwide phylogenetics community.

  20. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    NASA Astrophysics Data System (ADS)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention to engage children and increase interest, rather than to formally teach concepts and skills. What is the educational quality of existing Computer Science resources and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources, however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  1. Interactive computer methods for generating mineral-resource maps

    USGS Publications Warehouse

    Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.

    1980-01-01

    Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used-batch processing and off-line plotting-the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: 1. Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and 2. The Interactive CAM (Cartographic Automatic Mapping system), which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in the batch mode and stores plotting instructions on a disk, rather than on a tape. The disk can be accessed by a CRT, and, thus, the user can view and evaluate the map output on a CRT immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can, therefore, do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.

  2. The Relative Effectiveness of Computer-Based and Traditional Resources for Education in Anatomy

    ERIC Educational Resources Information Center

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R.; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic…

  3. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  4. Incrementality and additionality: A new dimension to North-South resource transfers

    SciTech Connect

    Jordan, A. . School of Environmental Sciences); Werksman, J. . Foundation for International Environmental Law and Development)

    1994-06-01

    In the last four years, "incrementality" and "additionality" have emerged as new terms in the evolving lexicon of international environmental diplomacy. As Parties to the Conventions on Climate Change, Biodiversity and the Ozone Layer, industrialized states undertake to provide sufficient additional resources (the principle of additionality) to meet the incremental cost (the concept of incrementality) of measures undertaken by the developing countries to tackle global environmental problems. Issues of incrementality and additionality go to the heart of a much deeper and highly contentious debate on who should pay the costs of responding to global environmental problems; on how the payment should be made; on which agency or agencies should manage the transfers; and on which parties should be compensated. Every sign is that if the overall North to South transfer breaks down or is retarded, then the process of implementing the aforementioned agreements may be jeopardized. This paper reviews the emergence of the two terms in international environmental politics; it pinpoints the theoretical and practical difficulties of defining and implementing them; and it assesses whether these difficulties and conflicts of opinion may, in some manner, be resolved.

  5. Effective Computer Resource Management: Keeping the Tail from Wagging the Dog.

    ERIC Educational Resources Information Center

    Sampson, James P., Jr.

    1982-01-01

    Predicts that student services will be increasingly influenced by computer technology. Suggests this resource be managed effectively to minimize potential problems and prevent a mechanistic and impersonal environment. Urges student personnel workers to assume active responsibility for planning, evaluating, and operating computer resources. (JAC)

  6. A Review of Resources for Evaluating K-12 Computer Science Education Programs

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Hartikainen, Elina

    2004-01-01

    Since computer science education is a key to preparing students for a technologically-oriented future, it makes sense to have high quality resources for conducting summative and formative evaluation of those programs. This paper describes the results of a critical analysis of the resources for evaluating K-12 computer science education projects.…

  7. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  8. A software framework for building biomedical machine learning classifiers through grid computing resources.

    PubMed

    Ramos-Pollán, Raúl; Guevara-López, Miguel Angel; Oliveira, Eugénio

    2012-08-01

    This paper describes the BiomedTK software framework, created to perform massive explorations of machine learning classifier configurations for biomedical data analysis over distributed Grid computing resources. BiomedTK integrates ROC analysis throughout the complete classifier construction process and enables explorations of large parameter sweeps for training third-party classifiers such as artificial neural networks and support vector machines, offering the capability to harness the vast amount of computing power serviced by Grid infrastructures. In addition, it includes classifiers modified by the authors for ROC optimization and functionality to build ensemble classifiers and manipulate datasets (import/export, extract and transform data, etc.). BiomedTK was experimentally validated by training thousands of classifier configurations for representative biomedical UCI datasets, reaching in a short time classification levels comparable to those reported in the existing literature. The comprehensive method presented here improves biomedical data analysis in both methodology and the potential reach of machine-learning-based experimentation.
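
    The kind of exploration BiomedTK automates can be pictured with a small sketch: sweep a handful of neural-network and support-vector-machine configurations on a representative dataset and rank them by ROC AUC. The sketch below uses scikit-learn purely for illustration; it is not the BiomedTK API, and the parameter grids are arbitrary.

      # Illustrative sketch (not BiomedTK): sweep classifier configurations and
      # rank them by cross-validated ROC AUC.
      from itertools import product
      from sklearn.datasets import load_breast_cancer
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPClassifier
      from sklearn.svm import SVC

      X, y = load_breast_cancer(return_X_y=True)  # a representative UCI-style dataset

      configs = []
      for hidden, lr in product([(10,), (50,)], [1e-3, 1e-2]):
          configs.append(("mlp", MLPClassifier(hidden_layer_sizes=hidden,
                                               learning_rate_init=lr, max_iter=500)))
      for c, gamma in product([0.1, 1.0], ["scale", "auto"]):
          configs.append(("svm", SVC(C=c, gamma=gamma)))

      results = []
      for name, clf in configs:
          auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
          results.append((auc, name, clf.get_params()))

      # Report the three best-scoring configurations.
      for auc, name, params in sorted(results, key=lambda r: r[0], reverse=True)[:3]:
          print(name, "mean ROC AUC =", round(auc, 3))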

  9. Professional Computer Education Organizations--A Resource for Administrators.

    ERIC Educational Resources Information Center

    Ricketts, Dick

    Professional computer education organizations serve a valuable function by generating, collecting, and disseminating information concerning the role of the computer in education. This report touches briefly on the reasons for the rapid and successful development of professional computer education organizations. A number of attributes of effective…

  10. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  11. Computers in the Classroom: Teacher's Resource Manual for Algebra.

    ERIC Educational Resources Information Center

    Koetke, Walter

    Demonstration programs, possible assignments for students (with solutions), and remedial drill programs for students to use are presented to aid teachers using a computer or a computer terminal in the teaching of algebra. The text can be followed page by page or used as a well-indexed reference work, and specific suggestions are made on how and…

  12. Science and Technology Resources on the Internet: Computer Security.

    ERIC Educational Resources Information Center

    Kinkus, Jane F.

    2002-01-01

    Discusses issues related to computer security, including confidentiality, integrity, and authentication or availability; and presents a selected list of Web sites that cover the basic issues of computer security under subject headings that include ethics, privacy, kids, antivirus, policies, cryptography, operating system security, and biometrics.…

  13. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing

    PubMed Central

    Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabling technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, revenues can be allocated among the providers according to their contributions, based on the concept of the "Shapley value", to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network. PMID:28030553
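
    The "Shapley value" mentioned above assigns each provider its average marginal contribution over all orders in which the coalition could form. A minimal sketch with an invented revenue function for three providers is given below; it is illustrative only and not taken from the paper.

      # Illustrative sketch: exact Shapley-value revenue split for a small
      # coalition of resource providers. The revenue function v() is invented.
      from itertools import permutations

      providers = ["A", "B", "C"]

      def v(coalition):
          """Revenue a set of providers can earn together (toy numbers)."""
          values = {frozenset(): 0, frozenset("A"): 10, frozenset("B"): 20,
                    frozenset("C"): 30, frozenset("AB"): 50, frozenset("AC"): 60,
                    frozenset("BC"): 70, frozenset("ABC"): 100}
          return values[frozenset(coalition)]

      def shapley(providers, v):
          """Average each provider's marginal contribution over all join orders."""
          phi = {p: 0.0 for p in providers}
          orders = list(permutations(providers))
          for order in orders:
              seen = set()
              for p in order:
                  phi[p] += v(seen | {p}) - v(seen)
                  seen.add(p)
          return {p: phi[p] / len(orders) for p in providers}

      print(shapley(providers, v))  # shares sum to the grand-coalition revenue (100)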

  14. Crowd-Funding: A New Resource Cooperation Mode for Mobile Cloud Computing.

    PubMed

    Zhang, Nan; Yang, Xiaolong; Zhang, Min; Sun, Yan

    2016-01-01

    Mobile cloud computing, which integrates the cloud computing techniques into the mobile environment, is regarded as one of the enabling technologies for 5G mobile wireless networks. There are many sporadic spare resources distributed within various devices in the networks, which can be used to support mobile cloud applications. However, these devices, with only a few spare resources, cannot support some resource-intensive mobile applications alone. If some of them cooperate with each other and share their resources, then they can support many applications. In this paper, we propose a resource cooperative provision mode referred to as "Crowd-funding", which is designed to aggregate the distributed devices together as the resource provider of mobile applications. Moreover, to facilitate high-efficiency resource management via dynamic resource allocation, different resource providers should be selected to form a stable resource coalition for different requirements. Thus, considering different requirements, we propose two different resource aggregation models for coalition formation. Finally, revenues can be allocated among the providers according to their contributions, based on the concept of the "Shapley value", to enable a more impartial revenue share among the cooperators. It is shown that a dynamic and flexible resource-management method can be developed based on the proposed Crowd-funding model, relying on the spare resources in the network.

  15. Computer Simulation and Digital Resources for Plastic Surgery Psychomotor Education.

    PubMed

    Diaz-Siso, J Rodrigo; Plana, Natalie M; Stranix, John T; Cutting, Court B; McCarthy, Joseph G; Flores, Roberto L

    2016-10-01

    Contemporary plastic surgery residents are increasingly challenged to learn a greater number of complex surgical techniques within a limited period. Surgical simulation and digital education resources have the potential to address some limitations of the traditional training model, and have been shown to accelerate knowledge and skills acquisition. Although animal, cadaver, and bench models are widely used for skills and procedure-specific training, digital simulation has not been fully embraced within plastic surgery. Digital educational resources may play a future role in a multistage strategy for skills and procedures training. The authors present two virtual surgical simulators addressing procedural cognition for cleft repair and craniofacial surgery. Furthermore, the authors describe how partnerships among surgical educators, industry, and philanthropy can be a successful strategy for the development and maintenance of digital simulators and educational resources relevant to plastic surgery training. It is our responsibility as surgical educators not only to create these resources, but to demonstrate their utility for enhanced trainee knowledge and technical skills development. Currently available digital resources should be evaluated in partnership with plastic surgery educational societies to guide trainees and practitioners toward effective digital content.

  16. Quantum computing with incoherent resources and quantum jumps.

    PubMed

    Santos, M F; Cunha, M Terra; Chaves, R; Carvalho, A R R

    2012-04-27

    Spontaneous emission and the inelastic scattering of photons are two natural processes usually associated with decoherence and the reduction in the capacity to process quantum information. Here we show that, when suitably detected, these photons are sufficient to build all the fundamental blocks needed to perform quantum computation in the emitting qubits while protecting them from deleterious dissipative effects. We exemplify this by showing how to efficiently prepare graph states for the implementation of measurement-based quantum computation.

  17. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management

  18. Curriculum and Resources: Computer Provision in a CTC.

    ERIC Educational Resources Information Center

    Denholm, Lawrence

    The program for City Technical Colleges (CTCs) draws on ideas and resources from government, private industry, and education to focus on the educational needs of inner city and urban children. Mathematics, science, and technology are at the center of the CTCs' mission, in a context which includes economic awareness and a commitment to enterprise…

  19. CHARMM additive and polarizable force fields for biophysics and computer-aided drug design

    PubMed Central

    Vanommeslaeghe, K.

    2014-01-01

    Background Molecular Mechanics (MM) is the method of choice for computational studies of biomolecular systems owing to its modest computational cost, which makes it possible to routinely perform molecular dynamics (MD) simulations on chemical systems of biophysical and biomedical relevance. Scope of Review As one of the main factors limiting the accuracy of MD results is the empirical force field used, the present paper offers a review of recent developments in the CHARMM additive force field, one of the most popular biomolecular force fields. Additionally, we present a detailed discussion of the CHARMM Drude polarizable force field, anticipating a growth in the importance and utilization of polarizable force fields in the near future. Throughout the discussion emphasis is placed on the force fields’ parametrization philosophy and methodology. Major Conclusions Recent improvements in the CHARMM additive force field are mostly related to newly found weaknesses in the previous generation of additive force fields. Beyond the additive approximation is the newly available CHARMM Drude polarizable force field, which allows for MD simulations of up to 1 microsecond on proteins, DNA, lipids and carbohydrates. General Significance Addressing the limitations ensures the reliability of the new CHARMM36 additive force field for the types of calculations that are presently coming into routine computational reach, while the availability of the Drude polarizable force fields offers an inherently more accurate model of the underlying physical forces driving macromolecular structures and dynamics. PMID:25149274

  20. Utilizing a Collaborative Cross Number Puzzle Game to Develop the Computing Ability of Addition and Subtraction

    ERIC Educational Resources Information Center

    Chen, Yen-Hua; Looi, Chee-Kit; Lin, Chiu-Pin; Shao, Yin-Juan; Chan, Tak-Wai

    2012-01-01

    While addition and subtraction is a key mathematical skill for young children, a typical activity for them in classrooms involves doing repetitive arithmetic calculation exercises. In this study, we explore a collaborative way for students to learn these skills in a technology-enabled way with wireless computers. Two classes, comprising a total of…

  1. A novel resource management method of providing operating system as a service for mobile transparent computing.

    PubMed

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method for shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experimental results show that the strategy is effective and stable.

  2. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    PubMed Central

    Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agent for mobile transparent computing (MTC) to devise a method for shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experimental results show that the strategy is effective and stable. PMID:24883353

  3. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    NASA Astrophysics Data System (ADS)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent T-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in

  4. Web application for simplifying access to computer center resources and information.

    SciTech Connect

    Long, J. W.

    2013-05-01

    Lorenz is a product of the ASC Scientific Data Management effort. Lorenz is a web-based application designed to help computer centers make information and resources more easily available to their users.

  5. Justification of Filter Selection for Robot Balancing in Conditions of Limited Computational Resources

    NASA Astrophysics Data System (ADS)

    Momot, M. V.; Politsinskaia, E. V.; Sushko, A. V.; Semerenko, I. A.

    2016-08-01

    The paper considers the problem of selecting a mathematical filter for balancing a wheeled robot under limited computational resources. A solution based on a complementary filter is proposed.
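
    For reference, a complementary filter blends the integrated gyroscope rate (trusted over short horizons) with the accelerometer tilt angle (trusted over long horizons) using a single weighting constant, which is what makes it cheap enough for a processor with limited resources. The sketch below is a generic illustration with assumed values for the weight and sample period, not code from the paper.

      # Illustrative sketch of a complementary filter for tilt estimation on a
      # balancing robot; alpha and dt are example values, not values from the paper.
      def complementary_filter(angle, gyro_rate, accel_angle, dt=0.01, alpha=0.98):
          """Blend integrated gyro rate (short term) with accelerometer angle (long term)."""
          return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

      # Example: start level, small constant gyro rate, accelerometer reads 2 degrees.
      angle = 0.0
      for _ in range(100):
          angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=2.0)
      print(round(angle, 3))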

  6. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource

  7. Dynamic remapping of parallel computations with varying resource demands

    NASA Technical Reports Server (NTRS)

    Nicol, D. M.; Saltz, J. H.

    1986-01-01

    A large class of computational problems is characterized by frequent synchronization, and computational requirements which change as a function of time. When such a problem must be solved on a message-passing multiprocessor machine, the combination of these characteristics leads to system performance which decreases in time. Performance can be improved with periodic redistribution of computational load; however, redistribution can exact a sometimes large delay cost. We study the issue of deciding when to invoke a global load remapping mechanism. Such a decision policy must effectively weigh the costs of remapping against the performance benefits. We treat this problem by constructing two analytic models which exhibit stochastically decreasing performance. One model is quite tractable; we are able to describe the optimal remapping algorithm, and the optimal decision policy governing when to invoke that algorithm. However, computational complexity prohibits the use of the optimal remapping decision policy. We then study the performance of a general remapping policy on both analytic models. This policy attempts to minimize a statistic W(n) which measures the system degradation (including the cost of remapping) per computation step over a period of n steps. We show that as a function of time, the expected value of W(n) has at most one minimum, and that when this minimum exists it defines the optimal fixed-interval remapping policy. Our decision policy appeals to this result by remapping when it estimates that W(n) is minimized. Our performance data suggests that this policy effectively finds the natural frequency of remapping. We also use the analytic models to express the relationship between performance and remapping cost, number of processors, and the computation's stochastic activity.
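
    A hedged sketch of the decision rule described above: accumulate the degradation observed since the last remapping, form W(n) as (accumulated degradation plus remapping cost) divided by n, and trigger a remap when W(n) stops decreasing. The linear degradation model and cost constant below are stand-ins, not values from the paper.

      # Illustrative sketch of the W(n)-based remapping policy. W(n) is the
      # degradation accumulated over n steps, plus the one-time remap cost,
      # averaged per step; remap when W(n) passes its minimum.
      REMAP_COST = 50.0

      def degradation(steps_since_remap):
          # Stand-in model: per-step loss grows as load imbalance accumulates.
          return 0.5 * steps_since_remap

      accumulated, prev_w = 0.0, float("inf")
      for n in range(1, 200):
          accumulated += degradation(n)
          w = (accumulated + REMAP_COST) / n
          if w > prev_w:  # W(n) has passed its single minimum
              print("remap after", n - 1, "steps, W =", round(prev_w, 2))
              break
          prev_w = w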

  8. BioSCAN: a network sharable computational resource for searching biosequence databases.

    PubMed

    Singh, R K; Hoffman, D L; Tell, S G; White, C T

    1996-06-01

    We describe a network sharable, interactive computational tool for rapid and sensitive search and analysis of biomolecular sequence databases such as GenBank, GenPept, Protein Identification Resource, and SWISS-PROT. The resource is accessible via the World Wide Web using popular client software such as Mosaic and Netscape. The client software is freely available on a number of computing platforms including Macintosh, IBM-PC, and Unix workstations.

  9. Estimating the Resources for Quantum Computation with the QuRE Toolbox

    DTIC Science & Technology

    2013-05-31

    Suchara, Martin; Faruque, Arvin; Lai, Ching-Yi; Paz, Gerardo; Chong, Frederic; John D...

  10. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    PubMed Central

    Ehsan, Shoaib; Clark, Adrian F.; ur Rehman, Naveed; McDonald-Maier, Klaus D.

    2015-01-01

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems. PMID:26184211
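
    For context, the recursive equations referred to above are the standard serial integral-image recurrence S(x, y) = i(x, y) + S(x-1, y) + S(x, y-1) - S(x-1, y-1), which the paper decomposes to obtain row-parallel hardware evaluation. The sketch below shows only the plain serial computation and a constant-time box sum, not the proposed hardware algorithms.

      # Illustrative sketch: serial integral image and constant-time box sum.
      def integral_image(img):
          h, w = len(img), len(img[0])
          S = [[0] * w for _ in range(h)]
          for y in range(h):
              for x in range(w):
                  S[y][x] = (img[y][x]
                             + (S[y][x - 1] if x else 0)
                             + (S[y - 1][x] if y else 0)
                             - (S[y - 1][x - 1] if x and y else 0))
          return S

      def box_sum(S, x0, y0, x1, y1):
          """Sum of pixels in the rectangle (x0, y0)..(x1, y1), inclusive."""
          total = S[y1][x1]
          if x0:
              total -= S[y1][x0 - 1]
          if y0:
              total -= S[y0 - 1][x1]
          if x0 and y0:
              total += S[y0 - 1][x0 - 1]
          return total

      img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
      S = integral_image(img)
      print(box_sum(S, 1, 1, 2, 2))  # 5 + 6 + 8 + 9 = 28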

  11. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    PubMed

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-07-10

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems.

  12. Multi-Programmatic and Institutional Computing Capacity Resource Attachment 2 Statement of Work

    SciTech Connect

    Seager, M

    2002-04-15

    Lawrence Livermore National Laboratory (LLNL) has identified high-performance computing as a critical competency necessary to meet the goals of LLNL's scientific and engineering programs. Leadership in scientific computing demands the availability of a stable, powerful, well-balanced computational infrastructure, and it requires research directed at advanced architectures, enabling numerical methods and computer science. To encourage all programs to benefit from the huge investment being made by the Advanced Simulation and Computing Program (ASCI) at LLNL, and to provide a mechanism to facilitate multi-programmatic leveraging of resources and access to high-performance equipment by researchers, M&IC was created. The Livermore Computing (LC) Center, a part of the Computations Directorate Integrated Computing and Communications (ICC) Department, can be viewed as composed of two facilities, one open and one secure. This acquisition is focused on the M&IC resources in the Open Computing Facility (OCF). For the M&IC program, recent efforts and expenditures have focused on enhancing capacity and stabilizing the TeraCluster 2000 (TC2K) resource. Capacity is a measure of the ability to process a varied workload from many scientists simultaneously. Capability represents the ability to deliver a very large system to run scientific calculations at large scale. In this procurement action, we intend to significantly increase the capability of the M&IC resource to address multiple teraFLOP/s problems, as well as increase the capacity to do many 100 gigaFLOP/s calculations.

  13. Computers and Resource-Based History Teaching: A UK Perspective.

    ERIC Educational Resources Information Center

    Spaeth, Donald A.; Cameron, Sonja

    2000-01-01

    Presents an overview of developments in computer-aided history teaching for higher education in the United Kingdom and the United States. Explains that these developments have focused on providing students with access to primary sources to enhance their understanding of historical methods and content. (CMK)

  14. Novel Techniques for Secure Use of Public Cloud Computing Resources

    DTIC Science & Technology

    2015-09-17

  15. Computer Modelling of Biological Molecules: Free Resources on the Internet.

    ERIC Educational Resources Information Center

    Millar, Neil

    1996-01-01

    Describes a three-dimensional computer modeling system for biological molecules which is suitable for sixth-form teaching. Consists of the modeling program "RasMol" together with structure files of proteins, DNA, and small biological molecules. Describes how the whole system can be downloaded from various sites on the Internet.…

  16. MCPLOTS: a particle physics resource based on volunteer computing

    NASA Astrophysics Data System (ADS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P. Z.

    2014-02-01

    The mcplots.cern.ch web site (mcplots) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the hepdata online database of experimental results and on the rivet Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the lhc@home 2.0 platform.

  17. MX Siting Investigation. Mineral Resources Survey, Seven Additional Valleys, Nevada/Utah Siting Area. Volume II.

    DTIC Science & Technology

    1981-06-23

    ERTEC Western Inc., Long Beach, CA. Report E-TR-50.

  18. Distributed Computation Resources for Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Doutriaux, C.; Williams, D. N.

    2014-12-01

    The Intergovernmental Panel on Climate Change (IPCC), prompted by the United Nations General Assembly, published a series of papers in its Fifth Assessment Report (AR5) on processes, impacts, and mitigation of climate change in 2013. The science used in these reports was generated by an international group of domain experts. They studied various scenarios of climate change through the use of highly complex computer models to simulate the Earth's climate over long periods of time. The resulting total data of approximately five petabytes are stored in a distributed data grid known as the Earth System Grid Federation (ESGF). Through the ESGF, consumers of the data can find and download data with limited capabilities for server-side processing. The Sixth Assessment Report (AR6) is already in the planning stages and is estimated to create as much as two orders of magnitude more data than the AR5 distributed archive. It is clear that data analysis capabilities currently in use will be inadequate to allow for the necessary science to be done with AR6 data—the data will just be too big. A major paradigm shift is required: rather than downloading data to local systems for analysis, the analysis routines must move to the data, with computations performed on distributed platforms. In preparation for this need, the ESGF has started a Compute Working Team (CWT) to create solutions that allow users to perform distributed, high-performance data analytics on the AR6 data. The team will be designing and developing a general Application Programming Interface (API) to enable highly parallel, server-side processing throughout the ESGF data grid. This API will be integrated with multiple analysis and visualization tools, such as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), netCDF Operator (NCO), and others. This presentation will provide an update on the ESGF CWT's overall approach toward enabling the necessary storage proximal computational

  19. Use of School Resource Centre-Based Computers in Leisure Time by Teenage Pupils

    ERIC Educational Resources Information Center

    Shenton, Andrew K.

    2008-01-01

    Little research has investigated young people's use of school computers outside lessons, a deficiency that prompted the writing of this paper. The ICT-related behaviour of pupils within the Resource Centre of an English high school is scrutinized, with data collected through observation and online questionnaire. The computers, which were…

  20. Orchestrating the XO Computer with Digital and Conventional Resources to Teach Mathematics

    ERIC Educational Resources Information Center

    Díaz, A.; Nussbaum, M.; Varela, I.

    2015-01-01

    Recent research has suggested that simply providing each child with a computer does not lead to an improvement in learning. Given that dozens of countries across the world are purchasing computers for their students, we ask which elements are necessary to improve learning when introducing digital resources into the classroom. Understood the…

  1. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    SciTech Connect

    Michael Pernice

    2010-09-01

    INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  2. Computer Monte Carlo simulation in quantitative resource estimation

    USGS Publications Warehouse

    Root, D.H.; Menzie, W.D.; Scott, W.A.

    1992-01-01

    The method of making quantitative assessments of mineral resources sufficiently detailed for economic analysis is outlined in three steps. The steps are (1) determination of types of deposits that may be present in an area, (2) estimation of the numbers of deposits of the permissible deposit types, and (3) combination by Monte Carlo simulation of the estimated numbers of deposits with the historical grades and tonnages of these deposits to produce a probability distribution of the quantities of contained metal. Two examples of the estimation of the number of deposits (step 2) are given. The first example is for mercury deposits in southwestern Alaska and the second is for lode tin deposits in the Seward Peninsula. The flow of the Monte Carlo simulation program is presented with particular attention to the dependencies between grades and tonnages of deposits and between grades of different metals in the same deposit. © 1992 Oxford University Press.
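
    Step (3) above can be pictured with a small Monte Carlo sketch: draw a number of deposits for a trial, draw a grade and tonnage for each deposit from lognormal models, and accumulate contained metal over many trials to build a probability distribution. All parameter values below are invented, and the grade-tonnage dependencies the paper accounts for are omitted for brevity.

      # Illustrative sketch of the Monte Carlo combination step; all numbers are
      # hypothetical and grade-tonnage correlations are ignored.
      import random

      random.seed(0)
      TRIALS = 10_000
      results = []
      for _ in range(TRIALS):
          n_deposits = random.choice([0, 1, 1, 2, 2, 3, 5])  # elicited deposit counts
          metal = 0.0
          for _ in range(n_deposits):
              tonnage = random.lognormvariate(13.0, 1.5)   # tonnes of ore
              grade = random.lognormvariate(-5.5, 0.6)     # metal fraction
              metal += tonnage * grade
          results.append(metal)

      results.sort()
      for q in (0.1, 0.5, 0.9):
          print("P" + str(int(q * 100)), "contained metal:",
                round(results[int(q * TRIALS)]), "t")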

  3. Monitoring of computing resource utilization of the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Rousseau, David; Dimitrov, Gancho; Vukotic, Ilija; Aidel, Osman; Schaffer, Rd; Albrand, Solveig

    2012-12-01

    Due to the good performance of the LHC accelerator, the ATLAS experiment has seen higher than anticipated levels for both the event rate and the average number of interactions per bunch crossing. In order to respond to these changing requirements, the current and future usage of CPU, memory and disk resources has to be monitored, understood and acted upon. This requires data collection at a fairly fine level of granularity: the performance of each object written and each algorithm run, as well as a dozen per-job variables, are gathered for the different processing steps of Monte Carlo generation and simulation and the reconstruction of both data and Monte Carlo. We present a system to collect and visualize the data from both the online Tier-0 system and distributed grid production jobs. Around 40 GB of performance data are expected from up to 200k jobs per day, thus making performance optimization of the underlying Oracle database of utmost importance.

  4. Using High Performance Computing to Support Water Resource Planning

    SciTech Connect

    Groves, David G.; Lembert, Robert J.; May, Deborah W.; Leek, James R.; Syme, James

    2015-10-22

    In recent years, decision support modeling has embraced deliberation with analysis, an iterative process in which decisionmakers come together with experts to evaluate a complex problem and alternative solutions in a scientifically rigorous and transparent manner. Simulation modeling supports decisionmaking throughout this process; visualizations enable decisionmakers to assess how proposed strategies stand up over time in uncertain conditions. But running these simulation models over standard computers can be slow. This, in turn, can slow the entire decisionmaking process, interrupting valuable interaction between decisionmakers and analytics.

  5. Analysis of the effects of section 29 tax credits on reserve additions and production of gas from unconventional resources

    SciTech Connect

    Not Available

    1990-09-01

    Federal tax credits for production of natural gas from unconventional resources can stimulate drilling and reserves additions at a relatively low cost to the Treasury. This report presents the results of an analysis of the effects of a proposed extension of the Section 29 alternative fuels production credit specifically for unconventional gas. ICF Resources estimated the net effect of the extension of the credit (the difference between development activity expected with the extension of the credit and that expected if the credit expires in December 1990 as scheduled). The analysis addressed the effect of tax credits on project economics and capital formation, drilling and reserve additions, production, impact on the US and regional economies, and the net public sector costs and incremental revenues. The analysis was based on explicit modeling of the three dominant unconventional gas resources: Tight sands, coalbed methane, and Devonian shales. It incorporated the most current data on resource size, typical well recoveries and economics, and anticipated activity of the major producers. Each resource was further disaggregated for analysis based on distinct resource characteristics, development practices, regional economics, and historical development patterns.

  6. Computational tools and resources for prediction and analysis of gene regulatory regions in the chick genome.

    PubMed

    Khan, Mohsin A F; Soto-Jimenez, Luz Mayela; Howe, Timothy; Streit, Andrea; Sosinsky, Alona; Stern, Claudio D

    2013-05-01

    The discovery of cis-regulatory elements is a challenging problem in bioinformatics, owing to distal locations and context-specific roles of these elements in controlling gene regulation. Here we review the current bioinformatics methodologies and resources available for systematic discovery of cis-acting regulatory elements and conserved transcription factor binding sites in the chick genome. In addition, we propose and make available a novel workflow using computational tools that integrate CTCF analysis to predict putative insulator elements, enhancer prediction, and TFBS analysis. To demonstrate the usefulness of this computational workflow, we then use it to analyze the locus of the gene Sox2, whose developmental expression is known to be controlled by a complex array of cis-acting regulatory elements. The workflow accurately predicts most of the experimentally verified elements along with some that have not yet been discovered. A web version of the CTCF tool, together with instructions for using the workflow, can be accessed from http://toolshed.g2.bx.psu.edu/view/mkhan1980/ctcf_analysis. For local installation of the tool, relevant Perl scripts and instructions are provided in the directory named "code" in the supplementary materials.

  7. Computational tools and resources for prediction and analysis of gene regulatory regions in the chick genome

    PubMed Central

    Khan, Mohsin A. F.; Soto-Jimenez, Luz Mayela; Howe, Timothy; Streit, Andrea; Sosinsky, Alona; Stern, Claudio D.

    2013-01-01

    The discovery of cis-regulatory elements is a challenging problem in bioinformatics, owing to distal locations and context-specific roles of these elements in controlling gene regulation. Here we review the current bioinformatics methodologies and resources available for systematic discovery of cis-acting regulatory elements and conserved transcription factor binding sites in the chick genome. In addition, we propose and make available a novel workflow using computational tools that integrate CTCF analysis to predict putative insulator elements, enhancer prediction and TFBS analysis. To demonstrate the usefulness of this computational workflow, we then use it to analyze the locus of the gene Sox2, whose developmental expression is known to be controlled by a complex array of cis-acting regulatory elements. The workflow accurately predicts most of the experimentally verified elements along with some that have not yet been discovered. A web version of the CTCF tool, together with instructions for using the workflow, can be accessed from http://toolshed.g2.bx.psu.edu/view/mkhan1980/ctcf_analysis. For local installation of the tool, relevant Perl scripts and instructions are provided in the directory named “code” in the supplementary materials. PMID:23355428

  8. A comprehensive overview of computational resources to aid in precision genome editing with engineered nucleases.

    PubMed

    Periwal, Vinita

    2016-07-03

    Genome editing with engineered nucleases (zinc finger nucleases, TAL effector nucleases, and clustered regularly interspaced short palindromic repeats/CRISPR-associated nucleases) has recently been shown to have great promise in a variety of therapeutic and biotechnological applications. However, the exploitation of these nucleases in genetic analysis and clinical settings largely depends on their specificity for the intended genomic target. Large and complex genomes often contain highly homologous/repetitive sequences, which limits the specificity of genome editing tools and could result in off-target activity. Over the past few years, various computational approaches have been developed to assist the design process and predict/reduce the off-target activity of these nucleases. These tools could be efficiently used to guide the design of constructs for engineered nucleases and evaluate results after genome editing. This review provides a comprehensive overview of various databases, tools, web servers and resources for genome editing and compares their features and functionalities. Additionally, it describes tools that have been developed to analyse post-genome editing results. The article also discusses important design parameters that could be considered while designing these nucleases. This review is intended to be a quick reference guide for experimentalists as well as computational biologists working in the field of genome editing with engineered nucleases.

  9. Education for Homeless Adults: Strategies for Implementation. Volume II - Resources and Additional Lessons.

    ERIC Educational Resources Information Center

    Hudson River Center for Program Development, Glenmont, NY.

    This document, the second in a series of guidebooks that were developed for educators of homeless adults in New York, offers strategies and plans for sample lessons in which a holistic approach is used to help homeless adults and families improve their lives through education. The guidebook begins with lists of print and nonprint resources,…

  10. Categorization of Computing Education Resources into the ACM Computing Classification System

    SciTech Connect

    Chen, Yinlin; Bogen, Paul Logasa; Fox, Dr. Edward A.; Hsieh, Dr. Haowei; Cassel, Dr. Lillian N.

    2012-01-01

    The Ensemble Portal harvests resources from multiple heterogeneous federated collections. Managing these dynamically increasing collections requires an automatic mechanism to categorize records into corresponding topics. We propose an approach that uses existing ACM DL metadata to build classifiers for harvested resources in the Ensemble project. We also present our experience in utilizing the Amazon Mechanical Turk platform to build ground truth training data sets from Ensemble collections.

  11. Addition of flexible body option to the TOLA computer program, part 1

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    This report describes a flexible body option that was developed and added to the Takeoff and Landing Analysis (TOLA) computer program. The addition of the flexible body option to TOLA allows it to be used to study essentially any conventional-type airplane in the ground operating environment. It provides the capability to predict the total motion of selected points on the airplane. The analytical methods incorporated in the program and operating instructions for the option are described. A program listing is included along with several example problems to aid in interpretation of the operating instructions and to illustrate program usage.

  12. Resources

    MedlinePlus

    ... can be found on the web, through local libraries, your health care provider, and the yellow pages under "social service organizations."

  13. A survey on resource allocation in high performance distributed computing systems

    SciTech Connect

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul; Khan, Samee Ullah; Bickler, Gage; Min-Allah, Nasro; Qureshi, Muhammad Bilal; Zhang, Limin; Yongji, Wang; Ghani, Nasir; Kolodziej, Joanna; Zomaya, Albert Y.; Xu, Cheng-Zhong; Balaji, Pavan; Vishnu, Abhinav; Pinel, Fredric; Pecero, Johnatan E.; Kliazovich, Dzmitry; Bouvry, Pascal; Li, Hongxiang; Wang, Lizhe; Chen, Dan; Rayes, Ammar

    2013-11-01

    Efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects are dedicated to large-scale distributed computing systems that have designed and developed resource allocation mechanisms with a variety of architectures and services. In our study, we report a comprehensive survey describing resource allocation in various HPC systems. The aim of the work is to aggregate the existing HPC solutions under a joint framework and to provide a thorough analysis and characterization of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all HPC classes. Therefore, a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified the HPC systems into three broad categories, namely: (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  14. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    PubMed Central

    Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca

    2013-01-01

    Summary The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1 and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  15. Computer User's Guide to the Protection of Information Resources. NIST Special Publication 500-171.

    ERIC Educational Resources Information Center

    Helsing, Cheryl; And Others

    Computers have changed the way information resources are handled. Large amounts of information are stored in one central place and can be accessed from remote locations. Users have a personal responsibility for the security of the system and the data stored in it. This document outlines the user's responsibilities and provides security and control…

  16. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    ERIC Educational Resources Information Center

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, moving from a traditional learning-centred approach to a collaborative one that emphasises pervasive learning anywhere and anytime. While compiling big data, cloud computing, and semantic web into OLR offers a broader spectrum of…

  17. Computer Technology for the Handicapped in Special Education and Rehabilitation: A Resource Guide. Volume II.

    ERIC Educational Resources Information Center

    Browning, Philip; And Others

    The guide presents annotations on 335 resources, journal articles, books, associations, and reports dealing with computer utilization for handicapped persons in rehabilitation and education. Author and subject indexes precede the annotations which are arranged alphabetically. Citations usually include information on title, author, source, date,…

  18. Center for the Vocationally Challenged: Business PC Specialist and Computer Programmer Training. Vocational Education Resource Package.

    ERIC Educational Resources Information Center

    Evaluation and Training Inst., Los Angeles, CA.

    This Vocational Education Resource Package (VERP) was developed to provide materials useful in replicating an exemplary vocational education program for special student populations in the California Community Colleges. This VERP provides information on an intensive computer training program for the physically disabled, originally developed at…

  19. The Ever-Present Demand for Public Computing Resources. CDS Spotlight

    ERIC Educational Resources Information Center

    Pirani, Judith A.

    2014-01-01

    This Core Data Service (CDS) Spotlight focuses on public computing resources, including lab/cluster workstations in buildings, virtual lab/cluster workstations, kiosks, laptop and tablet checkout programs, and workstation access in unscheduled classrooms. The findings are derived from 758 CDS 2012 participating institutions. A dataset of 529…

  20. Adolescents, Health Education, and Computers: The Body Awareness Resource Network (BARN).

    ERIC Educational Resources Information Center

    Bosworth, Kris; And Others

    1983-01-01

    The Body Awareness Resource Network (BARN) is a computer-based system designed as a confidential, nonjudgmental source of health information for adolescents. Topics include alcohol and other drugs, diet and activity, family communication, human sexuality, smoking, and stress management; programs are available for high school and middle school…

  1. PAH growth initiated by propargyl addition: mechanism development and computational kinetics.

    PubMed

    Raj, Abhijeet; Al Rashidi, Mariam J; Chung, Suk Ho; Sarathy, S Mani

    2014-04-24

    Polycyclic aromatic hydrocarbon (PAH) growth is known to be the principal pathway to soot formation during fuel combustion; as such, a physical understanding of the PAH growth mechanism is needed to effectively assess, predict, and control soot formation in flames. Although the hydrogen-abstraction/C2H2-addition (HACA) mechanism is believed to be the main contributor to PAH growth, it has been shown to under-predict some of the experimental data on PAH and soot concentrations in flames. This article presents a submechanism of PAH growth that is initiated by propargyl (C3H3) addition onto naphthalene (A2) and the naphthyl radical. C3H3 was chosen because it is a known precursor of benzene in combustion and has appreciable concentrations in flames. The mechanism has been developed up to the formation of pyrene (A4), and the temperature-dependent kinetics of each elementary reaction has been determined using density functional theory (DFT) computations at the B3LYP/6-311++G(d,p) level of theory and transition state theory (TST). H-abstraction, H-addition, H-migration, β-scission, and intramolecular addition reactions were taken into account. The energy barriers of the two main pathways (H-abstraction and H-addition) were found to be small, if not negative, whereas the energy barriers of the other pathways were in the range of 6-89 kcal·mol−1. The rates reported in this study may be extrapolated to larger PAH molecules that have a zigzag site similar to that in naphthalene, and the mechanism presented herein may be used as a complement to the HACA mechanism to improve prediction of PAH and soot formation.
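
    Temperature-dependent rate constants of the kind computed above are commonly reported as modified-Arrhenius fits, k(T) = A T^n exp(-Ea/RT). A minimal sketch of such a fit is shown below; the tabulated k(T) values are synthetic placeholders, not rates from the paper.

    ```python
    import numpy as np

    R = 1.987204e-3  # gas constant in kcal/(mol*K)

    # Hypothetical TST rate constants (1/s) at several temperatures (K).
    T = np.array([800.0, 1000.0, 1200.0, 1500.0, 1800.0, 2000.0])
    k = np.array([2.1e4, 3.5e6, 9.8e7, 2.7e9, 2.2e10, 6.0e10])

    # Linearize: ln k = ln A + n*ln T - Ea/(R*T), then solve by least squares.
    X = np.column_stack([np.ones_like(T), np.log(T), -1.0 / (R * T)])
    coeffs, *_ = np.linalg.lstsq(X, np.log(k), rcond=None)
    lnA, n, Ea = coeffs
    print(f"A = {np.exp(lnA):.3e}, n = {n:.2f}, Ea = {Ea:.1f} kcal/mol")
    ```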

  2. Teaching English as an Additional Language 5-11: A Whole School Resource File

    ERIC Educational Resources Information Center

    Scott, Caroline

    2011-01-01

    There are increasing numbers of children with little or no English entering English-speaking mainstream lessons. This often leaves them with unique frustrations due to limited English language proficiency and disorientation. Teachers often feel unable to cater sufficiently for these new arrivals. "Teaching English as an Additional Language Ages…

  3. Allocating Tactical High-Performance Computer (HPC) Resources to Offloaded Computation in Battlefield Scenarios

    DTIC Science & Technology

    2013-12-01

    …devices. Offloading solutions such as Cuckoo (12), MAUI (13), COMET (14), and ThinkAir (15) offload applications via Wi-Fi or 3G networks to servers or… Soldier Smartphone Program. Information Week, 2010. 12. Kemp, R.; Palmer, N.; Kielmann, T.; Bal, H. Cuckoo: A Computation Offloading Framework for…

  4. Effects of resource addition on recovery of production and plant functional composition in degraded semiarid grasslands.

    PubMed

    Chen, Qing; Hooper, David U; Li, Hui; Gong, Xiao Ying; Peng, Fei; Wang, Hong; Dittert, Klaus; Lin, Shan

    2017-02-28

    Degradation of semiarid ecosystems from overgrazing threatens a variety of ecosystem services. Rainfall and nitrogen commonly co-limit production in semiarid grassland ecosystems; however, few studies have reported how interactive effects of precipitation and nitrogen addition influence the recovery of grasslands degraded by overgrazing. We conducted a 6-year experiment manipulating precipitation (natural precipitation and simulated wet year precipitation) and nitrogen (0, 25 and 50 kg N ha(-1)) addition at two sites with different histories of livestock grazing (moderately and heavily grazed) in Inner Mongolian steppe. Our results suggest that recovery of plant community composition and recovery of production can be decoupled. Perennial grasses provide long-term stability of high-quality forage production in this system. Supplemental water combined with exclosures led, in the heavily grazed site, to the strongest recovery of perennial grasses, although widespread irrigation of rangeland is not a feasible management strategy in many semiarid and arid regions. N fertilization combined with exclosures, but without water addition, increased dominance of unpalatable annual species, which in turn retarded growth of perennial species and increased inter-annual variation in primary production at both sites. Alleviation of grazing pressure alone allowed recovery of desired perennial species via successional processes in the heavily grazed site. Our experiments suggest that recovery of primary production and desirable community composition are not necessarily correlated. The use of N fertilization for the management of overgrazed grassland needs careful and systematic evaluation, as it has potential to impede, rather than aid, recovery.

  5. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    NASA Astrophysics Data System (ADS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  6. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubencik, A. M.

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  7. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examine their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  8. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  9. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    DOE PAGES

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; ...

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  10. Distributed Problem Solving: Adaptive Networks with a Computer Intermediary Resource. Intelligent Executive Computer Communication

    DTIC Science & Technology

    1991-06-01

    …Alto Research Center, August 1982. [82] Ernst W. Mayr. Well Structured Parallel Programs Are Not Easier to Schedule. Technical Report No. STAN-CS-81… (articles on scheduling) [Dolev, 80; Graham, 69; Helmbold and Mayr, 84; Mayr, 81], in systems [Ackerman, 82] and in artificial intelligence [Rosenschein…] Flight Control. NASA Technical Memorandum 58258, May 1984. [60] D. Helmbold and E. Mayr. Fast Scheduling Algorithms on Parallel Computers. Technical…

  11. Current status and prospects of computational resources for natural product dereplication: a review.

    PubMed

    Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi

    2016-03-01

    Research in natural products has always enhanced drug discovery by providing new and unique chemical compounds. However, recently, drug discovery from natural products is slowed down by the increasing chance of re-isolating known compounds. Rapid identification of previously isolated compounds in an automated manner, called dereplication, steers researchers toward novel findings, thereby reducing the time and effort for identifying new drug leads. Dereplication identifies compounds by comparing processed experimental data with those of known compounds, and so, diverse computational resources such as databases and tools to process and compare compound data are necessary. Automating the dereplication process through the integration of computational resources has always been an aspired goal of natural product researchers. To increase the utilization of current computational resources for natural products, we first provide an overview of the dereplication process, and then list useful resources, categorizing into databases, methods and software tools and further explaining them from a dereplication perspective. Finally, we discuss the current challenges to automating dereplication and proposed solutions.
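
    Dereplication hinges on comparing processed experimental data (for example, mass spectra) against records for known compounds. The sketch below shows one common comparison step, cosine similarity between binned spectra; the peak lists and bin width are hypothetical and are not taken from any specific resource discussed in the review.

    ```python
    import numpy as np

    def bin_spectrum(peaks, mz_min=0.0, mz_max=1000.0, bin_width=1.0):
        """Convert a list of (m/z, intensity) peaks into a fixed-length intensity vector."""
        n_bins = int((mz_max - mz_min) / bin_width)
        vec = np.zeros(n_bins)
        for mz, intensity in peaks:
            idx = int((mz - mz_min) / bin_width)
            if 0 <= idx < n_bins:
                vec[idx] += intensity
        return vec

    def cosine_similarity(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0

    # Hypothetical query spectrum and a library spectrum of a known compound.
    query = [(105.0, 40.0), (152.1, 100.0), (210.2, 15.0)]
    library = [(105.1, 35.0), (152.0, 90.0), (231.0, 20.0)]

    score = cosine_similarity(bin_spectrum(query), bin_spectrum(library))
    print(f"spectral similarity: {score:.3f}")
    ```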

  12. Perspectives on the utilization of aquaculture coproduct in Europe and Asia: prospects for value addition and improved resource efficiency.

    PubMed

    Newton, Richard; Telfer, Trevor; Little, Dave

    2014-01-01

    Aquaculture has often been criticized for its environmental impacts, especially its efficiency in the use of global fisheries resources in aquafeeds, among other concerns. However, little attention has been paid to the contribution of coproducts from aquaculture, which can represent between 40% and 70% of production. These have often been underutilized and could be redirected to maximize the efficient use of resource inputs, including reducing the burden on fisheries resources. In this review, we identify strategies to enhance the overall value of the harvested yield, including non-effluent processing coproducts, for three of the most important global aquaculture species, and discuss the current and prospective utilization of these resources for value addition and environmental impact reduction. The review concludes that in Europe coproducts are often underutilized for logistical reasons, such as the disconnected nature of the value chain and perceived legislative barriers. In Asia, by contrast, most coproducts are used, often innovatively, but not to their full economic potential and sometimes with possible human health and biosecurity risks, including the possible spread of diseased material and low traceability in some circumstances. A full economic and environmental appraisal of current and potential coproduct utilization strategies is long overdue.

  13. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct-access storage may play a major role in the effort required to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects, all related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  14. Computation of octanol-water partition coefficients by guiding an additive model with knowledge.

    PubMed

    Cheng, Tiejun; Zhao, Yuan; Li, Xun; Lin, Fu; Xu, Yong; Zhang, Xinglong; Li, Yan; Wang, Renxiao; Lai, Luhua

    2007-01-01

    We have developed a new method, i.e., XLOGP3, for logP computation. XLOGP3 predicts the logP value of a query compound by using the known logP value of a reference compound as a starting point. The difference in the logP values of the query compound and the reference compound is then estimated by an additive model. The additive model implemented in XLOGP3 uses a total of 87 atom/group types and two correction factors as descriptors. It is calibrated on a training set of 8199 organic compounds with reliable logP data through a multivariate linear regression analysis. For a given query compound, the compound showing the highest structural similarity in the training set will be selected as the reference compound. Structural similarity is quantified based on topological torsion descriptors. XLOGP3 has been tested along with its predecessor, i.e., XLOGP2, as well as several popular logP methods on two independent test sets: one contains 406 small-molecule drugs approved by the FDA and the other contains 219 oligopeptides. On both test sets, XLOGP3 produces more accurate predictions than most of the other methods with average unsigned errors of 0.24-0.51 units. Compared to conventional additive methods, XLOGP3 does not rely on an extensive classification of fragments and correction factors in order to improve accuracy. It is also able to utilize the ever-increasing experimentally measured logP data more effectively.
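
    The core idea described above — start from a structurally similar reference compound with a known logP and apply an additive model only to the difference — can be sketched in a few lines. The atom-type contributions, compound compositions, and logP values below are invented placeholders; XLOGP3 itself uses 87 atom/group types, two correction factors, and topological-torsion similarity for reference selection.

    ```python
    # Hypothetical atom/group contributions (not the calibrated XLOGP3 values).
    CONTRIB = {"C.aromatic": 0.34, "C.sp3": 0.26, "O.hydroxyl": -0.42, "N.amine": -0.60}

    def additive_logp(atom_counts):
        """Pure additive estimate: sum of per-atom-type contributions."""
        return sum(CONTRIB[a] * n for a, n in atom_counts.items())

    def reference_guided_logp(query_counts, ref_counts, ref_logp):
        """Estimate the query logP as reference logP plus the additive model of the difference."""
        atom_types = set(query_counts) | set(ref_counts)
        diff = {a: query_counts.get(a, 0) - ref_counts.get(a, 0) for a in atom_types}
        return ref_logp + additive_logp(diff)

    # Toy example: the query differs from the reference by one extra hydroxyl group.
    reference = {"C.aromatic": 6, "C.sp3": 1}                        # hypothetical reference compound
    query = {"C.aromatic": 6, "C.sp3": 1, "O.hydroxyl": 1}
    print(round(reference_guided_logp(query, reference, ref_logp=2.73), 2))  # 2.73 - 0.42 = 2.31
    ```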

  15. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    SciTech Connect

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; Schuster, Heiko; Ternette, Nicola; Alpizar, Adan; Schittenhelm, Ralf B.; Ramarathinam, Sri Harsha; Lindestam-Arlehamn, Cecilia S.; Koh, Ching Chiek; Gillet, Ludovic; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David; Deutsch, Eric W.; Moritz, Robert L.; Purcell, Anthony; Rammensee, Hans-Georg; Stevanovic, Stevan; Aebersold, Ruedi

    2015-07-08

    We present a novel proteomics-based workflow and an open source data and computational resource for reproducibly identifying and quantifying HLA-associated peptides at high-throughput. The provided resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra and the analysis of quantitative digital maps of HLA peptidomes generated by SWATH mass spectrometry (MS). This is the first community-based study towards the development of a robust platform for the reproducible and quantitative measurement of HLA peptidomes, an essential step towards the design of efficient immunotherapies.

  16. An open-source computational and data resource to analyze digital maps of immunopeptidomes.

    PubMed

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J; Schuster, Heiko; Ternette, Nicola; Alpízar, Adán; Schittenhelm, Ralf B; Ramarathinam, Sri H; Lindestam Arlehamn, Cecilia S; Chiek Koh, Ching; Gillet, Ludovic C; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David S; Deutsch, Eric W; Moritz, Robert L; Purcell, Anthony W; Rammensee, Hans-Georg; Stevanovic, Stefan; Aebersold, Ruedi

    2015-07-08

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies.

  17. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    SciTech Connect

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; Schuster, Heiko; Ternette, Nicola; Alpízar, Adán; Schittenhelm, Ralf B.; Ramarathinam, Sri H.; Lindestam Arlehamn, Cecilia S.; Chiek Koh, Ching; Gillet, Ludovic C.; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David S.; Deutsch, Eric W.; Moritz, Robert L.; Purcell, Anthony W.; Rammensee, Hans -Georg; Stevanovic, Stefan; Aebersold, Ruedi

    2015-07-08

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies.

  18. The relative effectiveness of computer-based and traditional resources for education in anatomy.

    PubMed

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional resources. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), and (3) a plastic model. We conducted a controlled trial in which 60 undergraduate students had ten minutes to study the names of 20 different pelvic structures. The outcome measure was a 25-item short-answer test consisting of 15 nominal and 10 functional questions, based on a cadaveric pelvis. All subjects also took a brief mental rotations test (MRT) as a measure of spatial ability, used as a covariate in the analysis. Data were analyzed with repeated measures ANOVA. The group learning from the model performed significantly better than the other two groups on the nominal questions (Model 67%; KV 40%; VR 41%; effect sizes 1.19 and 1.29, respectively). There was no difference between the KV and VR groups. There was no difference between the groups on the functional questions (Model 28%; KV 23%; VR 25%). Computer-based learning resources appear to have significant disadvantages compared to traditional specimens in learning nominal anatomy. Consistent with previous research, virtual reality shows no advantage over static presentation of key views.

  19. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    NASA Astrophysics Data System (ADS)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination, and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo procedure, we explore the global latency for an optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing defines a baseline for a future comparison of the transition behavior with existing routing strategies [3,4] for different network topologies.
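
    A minimal sketch of the kind of Metropolis Monte Carlo search described above — assigning tasks to nodes and accepting latency-increasing moves with a temperature-dependent probability — is given below. The graph, task set, and congestion cost are simplified placeholders, not the model used in the paper.

    ```python
    import math
    import random

    random.seed(0)

    N_NODES, N_TASKS = 20, 100
    CAPACITY = 10.0

    # Random task loads and an initial random assignment of tasks to nodes.
    loads = [random.uniform(0.5, 2.0) for _ in range(N_TASKS)]
    assign = [random.randrange(N_NODES) for _ in range(N_TASKS)]

    def node_utilization(assign):
        util = [0.0] * N_NODES
        for task, node in enumerate(assign):
            util[node] += loads[task]
        return util

    def latency(assign):
        """Toy congestion cost: overload above capacity grows quadratically."""
        return sum(max(0.0, u - CAPACITY) ** 2 for u in node_utilization(assign))

    def metropolis_step(assign, temperature):
        task = random.randrange(N_TASKS)
        old_node, new_node = assign[task], random.randrange(N_NODES)
        before = latency(assign)
        assign[task] = new_node
        delta = latency(assign) - before
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if delta > 0 and random.random() >= math.exp(-delta / temperature):
            assign[task] = old_node  # reject: restore the previous assignment

    for temperature in (5.0, 1.0, 0.2):          # simple cooling schedule
        for _ in range(2000):
            metropolis_step(assign, temperature)
        print(f"T = {temperature:4.1f}  latency = {latency(assign):.2f}")
    ```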

  20. A Formal Method for Specifying Computer Resources in an Implementation Independent Manner

    DTIC Science & Technology

    1984-11-01

    This paper is an investigation of a methodology for the formal specification of computer software or hardware resource interfaces. The objective of the methodology is to make possible the specification…

  1. Using additive manufacturing in accuracy evaluation of reconstructions from computed tomography.

    PubMed

    Smith, Erin J; Anstey, Joseph A; Venne, Gabriel; Ellis, Randy E

    2013-05-01

    Bone models derived from patient imaging and fabricated using additive manufacturing technology have many potential uses including surgical planning, training, and research. This study evaluated the accuracy of bone surface reconstruction of two diarthrodial joints, the hip and shoulder, from computed tomography. Image segmentation of the tomographic series was used to develop a three-dimensional virtual model, which was fabricated using fused deposition modelling. Laser scanning was used to compare cadaver bones, printed models, and intermediate segmentations. The overall bone reconstruction process had a reproducibility of 0.3 ± 0.4 mm. Production of the model had an accuracy of 0.1 ± 0.1 mm, while the segmentation had an accuracy of 0.3 ± 0.4 mm, indicating that segmentation accuracy was the key factor in reconstruction. Generally, the shape of the articular surfaces was reproduced accurately, with poorer accuracy near the periphery of the articular surfaces, particularly in regions with periosteum covering and where osteophytes were apparent.

  2. A survey and taxonomy on energy efficient resource allocation techniques for cloud computing systems

    SciTech Connect

    Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv; Jayaraman, Prem Prakash; Kolodziej, Joanna; Balaji, Pavan; Zeadally, Sherali; Malluhi, Qutaibah Marwan; Tziritas, Nikos; Vishnu, Abhinav; Khan, Samee U.; Zomaya, Albert

    2014-06-06

    In a cloud computing paradigm, energy-efficient allocation of different virtualized ICT resources (servers, storage disks, networks, and the like) is a complex problem due to the presence of heterogeneous application workloads (e.g., content delivery networks, MapReduce, web applications, and the like) having contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time, etc.). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications, with varying degrees of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides a research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify open challenges associated with energy-efficient resource allocation. In this regard, the study first outlines the problem and the existing hardware- and software-based techniques available for this purpose. Furthermore, the techniques already presented in the literature are summarized based on the energy-efficient research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy, namely: resource adaptation policy, objective function, allocation method, allocation operation, and interoperability.

  3. Planning health education: Internet and computer resources in southwestern Nigeria. 2000-2001.

    PubMed

    Oyadoke, Adebola A; Salami, Kabiru K; Brieger, William R

    The use of the Internet as a health education tool and as a resource in health education planning is widely accepted as the norm in industrialized countries. Unfortunately, access to computers and the Internet is quite limited in developing countries. Not all licensed service providers operate, many users are actually foreign nationals, telephone connections are unreliable, and electricity supplies are intermittent. In this context, computer, e-mail, Internet, and CD-ROM use by health and health education program officers in five states in southwestern Nigeria was assessed to document their present access and use. Eight of the 30 organizations visited were government health ministry departments, while the remainder were non-governmental organizations (NGOs). Six NGOs and four State Ministry of Health (MOH) departments had no computers, but nearly two-thirds of both types of agency had e-mail, less than one-third had Web browsing facilities, and six, all of them NGOs, had CD-ROMs. Only 25 of the 48 individual respondents had computer skills. Narrative responses from individual employees showed a qualitative difference in computer and Internet access and use by type of agency. NGO staff in organizations with computers indicated having relatively free access to a computer and the Internet and used these for both program planning and administrative purposes. In government offices it appeared that computers were more likely to be located in administrative or statistics offices and used for management tasks like salaries and correspondence, limiting the access of individual health staff. These two different organizational cultures must be considered when plans are made for increasing computer availability and skills for health education planning.

  4. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    NASA Astrophysics Data System (ADS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-07-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi-Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources.
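
    A simple way to see the qubit savings offered by symmetry blocks is to count the states in a fixed-particle-number sector: encoding only that block needs ceil(log2(dim)) qubits instead of one qubit per spin-orbital. The sketch below makes that count for toy sizes; the orbital and electron numbers are illustrative and this counting argument is a simplification of the operator-space reduction described in the paper.

    ```python
    from math import comb, ceil, log2

    def sector_qubits(n_spin_orbitals, n_electrons):
        """Qubits for the full Fock space vs. only the fixed-particle-number block."""
        full = n_spin_orbitals                          # one qubit per spin-orbital (Jordan-Wigner style)
        block_dim = comb(n_spin_orbitals, n_electrons)  # dimension of the particle-number block
        reduced = ceil(log2(block_dim)) if block_dim > 1 else 1
        return full, reduced

    # Illustrative sizes only.
    for orbitals, electrons in [(4, 2), (8, 4), (12, 6)]:
        full, reduced = sector_qubits(orbitals, electrons)
        print(f"{orbitals} spin-orbitals, {electrons} electrons: {full} qubits -> {reduced} qubits")
    ```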

  5. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to work on a single computer, which is a major limitation in many ways: limited processing power, storage capacity, accessibility, availability, and so on. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards, and prototype code, and it presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. The cloud application serves as a collaborative geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The application is available all the time, accessible from everywhere, scalable, runs in a distributed computing environment, provides a real-time multiuser collaboration platform, uses interoperable programming-language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state…

  6. Addition of flexible body option to the TOLA computer program. Part 2: User and programmer documentation

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    User and programmer oriented documentation for the flexible body option of the Takeoff and Landing Analysis (TOLA) computer program are provided. The user information provides sufficient knowledge of the development and use of the option to enable the engineering user to successfully operate the modified program and understand the results. The programmer's information describes the option structure and logic enabling a programmer to make major revisions to this part of the TOLA computer program.

  7. ELAS - A geobased information system that is transferable to several computers. [Earth resources Laboratory Applications Software

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.; Pearson, R. W.; Seyfarth, B. R.; Graham, M. H.

    1981-01-01

    In the early years of remote sensing, emphasis was placed on the processing and analysis of data from a single multispectral sensor, such as the Landsat Multispectral Scanner System (MSS). However, in connection with attempts to use the data for resource management, it was realized that many deficiencies existed in single data sets. A need was established to geographically reference the MSS data and to register with it data from disparate sources. Technological transfer activities have required systems concepts that can be easily transferred to computers of different types in other organizations. ELAS (Earth Resources Laboratory Applications Software), a geographically based information system, was developed to meet the considered needs. ELAS accepts data from a variety of sources. It contains programs to geographically reference the data to the Universal Transverse Mercator grid. One of the primary functions of ELAS is to produce a surface cover map.

  8. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    PubMed Central

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J; Schuster, Heiko; Ternette, Nicola; Alpízar, Adán; Schittenhelm, Ralf B; Ramarathinam, Sri H; Lindestam Arlehamn, Cecilia S; Chiek Koh, Ching; Gillet, Ludovic C; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David S; Deutsch, Eric W; Moritz, Robert L; Purcell, Anthony W; Rammensee, Hans-Georg; Stevanovic, Stefan; Aebersold, Ruedi

    2015-01-01

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies. DOI: http://dx.doi.org/10.7554/eLife.07661.001 PMID:26154972

  9. Elastic Extension of a CMS Computing Centre Resources on External Clouds

    NASA Astrophysics Data System (ADS)

    Codispoti, G.; Di Maria, R.; Aiftimiei, C.; Bonacorsi, D.; Calligola, P.; Ciaschini, V.; Costantini, A.; Dal Pra, S.; DeGirolamo, D.; Grandi, C.; Michelotto, D.; Panella, M.; Peco, G.; Sapunenko, V.; Sgaravatto, M.; Taneja, S.; Zizzi, G.

    2016-10-01

    After the successful LHC data taking in Run-I and in view of the future runs, the LHC experiments are facing new challenges in the design and operation of the computing facilities. The computing infrastructure for Run-II is dimensioned to cope at most with the average amount of data recorded. The usage peaks, as already observed in Run-I, may however originate large backlogs, thus delaying the completion of the data reconstruction and ultimately the data availability for physics analysis. In order to cope with the production peaks, CMS - along the lines followed by other LHC experiments - is exploring the opportunity to access Cloud resources provided by external partners or commercial providers. Specific use cases have already been explored and successfully exploited during Long Shutdown 1 (LS1) and the first part of Run 2. In this work we present the proof of concept of the elastic extension of a CMS site, specifically the Bologna Tier-3, on an external OpenStack infrastructure. We focus on the “Cloud Bursting” of a CMS Grid site using a newly designed LSF configuration that allows the dynamic registration of new worker nodes to LSF. In this approach, the dynamically added worker nodes instantiated on the OpenStack infrastructure are transparently accessed by the LHC Grid tools and at the same time serve as an extension of the farm for local usage. The amount of allocated resources can thus be elastically adjusted to cope with the needs of the CMS experiment and local users. Moreover, a direct access/integration of OpenStack resources into the CMS workload management system is explored. In this paper we present this approach, report on the performance of the on-demand allocated resources, and discuss the lessons learned and the next steps.
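
    The "cloud bursting" idea — dynamically instantiating cloud VMs that register themselves with the site's batch system — can be sketched with the OpenStack SDK. The cloud name, image, flavor, network, and the register-to-batch script below are placeholders; the actual CMS/Bologna setup relies on a site-specific LSF configuration that is not reproduced here.

    ```python
    import base64
    import openstack  # openstacksdk

    # Cloud-init script executed on boot; in a real deployment this would configure
    # the batch client and register the node with the site scheduler (placeholder).
    USER_DATA = """#cloud-config
    runcmd:
      - /opt/site/register_worker.sh   # hypothetical site-provided registration script
    """

    def burst_workers(n_workers):
        conn = openstack.connect(cloud="my-cloud")               # named cloud from clouds.yaml (assumption)
        image = conn.compute.find_image("worker-node-image")     # placeholder image name
        flavor = conn.compute.find_flavor("m1.large")            # placeholder flavor
        network = conn.network.find_network("private")           # placeholder network
        for i in range(n_workers):
            conn.compute.create_server(
                name=f"burst-worker-{i}",
                image_id=image.id,
                flavor_id=flavor.id,
                networks=[{"uuid": network.id}],
                user_data=base64.b64encode(USER_DATA.encode()).decode(),
            )

    if __name__ == "__main__":
        burst_workers(4)   # elastically add four worker nodes during a production peak
    ```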

  10. Exploiting short-term memory in soft body dynamics as a computational resource

    PubMed Central

    Nakajima, K.; Li, T.; Hauser, H.; Pfeifer, R.

    2014-01-01

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems. PMID:25185579
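
    The "body as computational resource" idea follows the reservoir-computing recipe: drive a fixed nonlinear dynamical system with an input, record its state, and train only a linear readout. The sketch below uses a small random echo-state network as a stand-in for the physical arm and trains the readout on a short-term-memory task (reproducing a delayed input); all sizes and the task are illustrative, not the experimental setup of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, T, DELAY = 100, 2000, 5        # reservoir size, time steps, memory delay

    # Fixed random reservoir (stand-in for the physical body dynamics).
    W_in = rng.uniform(-0.5, 0.5, size=(N, 1))
    W = rng.normal(size=(N, N))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

    u = rng.uniform(-1, 1, size=T)               # random input stream
    x = np.zeros(N)
    states = np.zeros((T, N))
    for t in range(T):
        x = np.tanh(W @ x + W_in[:, 0] * u[t])   # reservoir state update
        states[t] = x

    # Target: the input DELAY steps in the past (a pure short-term-memory task).
    target = np.roll(u, DELAY)
    states, target = states[DELAY:], target[DELAY:]

    # Ridge-regression readout (the only trained part).
    ridge = 1e-6
    W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ target)
    pred = states @ W_out
    print("readout correlation with delayed input:", np.corrcoef(pred, target)[0, 1])
    ```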

  11. Resource quality of a symmetry-protected topologically ordered phase for quantum computation.

    PubMed

    Miller, Jacob; Miyake, Akimasa

    2015-03-27

    We investigate entanglement naturally present in the 1D topologically ordered phase protected with the on-site symmetry group of an octahedron as a potential resource for teleportation-based quantum computation. We show that, as long as certain characteristic lengths are finite, all its ground states have the capability to implement any unit-fidelity one-qubit gate operation asymptotically as a key computational building block. This feature is intrinsic to the entire phase, in that perfect gate fidelity coincides with perfect string order parameters under a state-insensitive renormalization procedure. Our approach may pave the way toward a novel program to classify quantum many-body systems based on their operational use for quantum information processing.

  12. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projections is crucial not only to scientists but also to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data, with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API and its capabilities, provide implementation details, and discuss future developments.
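
    The operations exposed by the WPS API (maximum, minimum, average, anomalies) are ordinary reductions over large datasets; the point is that they run next to the data. The snippet below shows, with xarray, the kind of computation an "anomaly" request would perform server-side; the file name, variable, and reference period are placeholders, and this is not the ESGF CWT API itself.

    ```python
    import xarray as xr

    # Placeholder dataset; on an ESGF data node this would be the archived model output.
    ds = xr.open_dataset("tas_monthly.nc")          # hypothetical file with a "tas" variable
    tas = ds["tas"]

    # Monthly climatology over a reference period, then anomalies relative to it.
    climatology = tas.sel(time=slice("1981", "2010")).groupby("time.month").mean("time")
    anomalies = tas.groupby("time.month") - climatology

    # A couple of the simple reductions the API exposes.
    global_mean_anomaly = anomalies.mean(dim=("lat", "lon"))
    print(global_mean_anomaly.max().item(), global_mean_anomaly.min().item())
    ```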

  13. Research and Demonstration for a Comprehensive Package of Computer Programs to Serve Community College Learning Resource Centers. Final Report.

    ERIC Educational Resources Information Center

    Weiss, Jack A.

    One of 15 members of the Northern Illinois Learning Resources Cooperative (NILRC), Elgin Community College served as host institution for a project to design, develop, test, and install computer programs in a community college resource center environment. The service functions identified for systems development included circulation, serial…

  14. Resource Efficient Hardware Architecture for Fast Computation of Running Max/Min Filters

    PubMed Central

    Torres-Huitzil, Cesar

    2013-01-01

    Running max/min filters on rectangular kernels are widely used in many digital signal and image processing applications. Filtering with a k × k kernel requires k² − 1 comparisons per sample in a direct implementation; thus, performance scales expensively with the kernel size k. Faster computation can be achieved by kernel decomposition and by using constant-time one-dimensional algorithms on custom hardware. This paper presents a hardware architecture for real-time computation of running max/min filters based on the van Herk/Gil-Werman (HGW) algorithm. The proposed architecture design uses less computation and memory resources than previously reported architectures when targeted to Field Programmable Gate Array (FPGA) devices. Implementation results show that the architecture is able to compute max/min filters on 1024 × 1024 images with up to 255 × 255 kernels in around 8.4 milliseconds, 120 frames per second, at a clock frequency of 250 MHz. The implementation is highly scalable in the kernel size, with a good performance/area tradeoff suitable for embedded applications. The applicability of the architecture is shown for local adaptive image thresholding. PMID:24288456
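
    The van Herk/Gil-Werman trick computes a running maximum with roughly three comparisons per sample regardless of window size, by combining per-block prefix and suffix maxima. A minimal one-dimensional Python sketch is below (the paper's hardware design pipelines the same idea); the input values are arbitrary.

    ```python
    def running_max_hgw(x, w):
        """Sliding-window maximum of window size w using the van Herk/Gil-Werman scheme."""
        n = len(x)
        if w <= 1 or n == 0:
            return list(x)
        # Suffix maxima within each block of length w (scanned right to left).
        r = list(x)
        for i in range(n - 2, -1, -1):
            if (i + 1) % w != 0:            # not at a block boundary
                r[i] = max(r[i], r[i + 1])
        # Prefix maxima within each block (scanned left to right).
        s = list(x)
        for i in range(1, n):
            if i % w != 0:                  # not at a block boundary
                s[i] = max(s[i], s[i - 1])
        # A window starting at j spans at most two blocks: combine the two partial maxima.
        return [max(r[j], s[min(j + w - 1, n - 1)]) for j in range(n - w + 1)]

    data = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
    print(running_max_hgw(data, 3))   # -> [4, 4, 5, 9, 9, 9, 6, 6, 5]
    ```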

  15. Can Computer-Assisted Discovery Learning Foster First Graders' Fluency with the Most Basic Addition Combinations?

    ERIC Educational Resources Information Center

    Baroody, Arthur J.; Eiland, Michael D.; Purpura, David J.; Reid, Erin E.

    2013-01-01

    In a 9-month training experiment, 64 first graders with a risk factor were randomly assigned to computer-assisted structured discovery of the add-1 rule (e.g., the sum of 7 + 1 is the number after "seven" when we count), unstructured discovery learning of this regularity, or an active-control group. Planned contrasts revealed that the…

  16. Improving hospital bed occupancy and resource utilization through queuing modeling and evolutionary computation.

    PubMed

    Belciug, Smaranda; Gorunescu, Florin

    2015-02-01

    Scarce healthcare resources require carefully made policies ensuring optimal bed allocation, quality healthcare service, and adequate financial support. This paper proposes a complex analysis of the resource allocation in a hospital department by integrating in the same framework a queuing system, a compartmental model, and an evolutionary-based optimization. The queuing system shapes the flow of patients through the hospital, the compartmental model offers a feasible structure of the hospital department in accordance to the queuing characteristics, and the evolutionary paradigm provides the means to optimize the bed-occupancy management and the resource utilization using a genetic algorithm approach. The paper also focuses on a "What-if analysis" providing a flexible tool to explore the effects on the outcomes of the queuing system and resource utilization through systematic changes in the input parameters. The methodology was illustrated using a simulation based on real data collected from a geriatric department of a hospital from London, UK. In addition, the paper explores the possibility of adapting the methodology to different medical departments (surgery, stroke, and mental illness). Moreover, the paper also focuses on the practical use of the model from the healthcare point of view, by presenting a simulated application.
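
    One standard queueing building block for bed-occupancy questions is the Erlang loss formula: with c beds and offered load a (arrival rate times mean length of stay), it gives the probability that an arriving patient finds all beds full. The sketch below uses the usual numerically stable recursion; the arrival rate, length of stay, and bed counts are hypothetical, and this is far simpler than the compartmental/evolutionary model developed in the paper.

    ```python
    def erlang_b(beds, offered_load):
        """Blocking probability for an M/G/c/c (Erlang loss) system, via the stable recursion."""
        b = 1.0
        for k in range(1, beds + 1):
            b = offered_load * b / (k + offered_load * b)
        return b

    # Hypothetical department: 3.2 admissions/day, 9-day mean stay -> offered load 28.8 beds.
    arrival_rate, mean_stay = 3.2, 9.0
    load = arrival_rate * mean_stay
    for beds in (30, 35, 40):
        p_block = erlang_b(beds, load)
        occupancy = load * (1 - p_block) / beds   # carried load divided by bed count
        print(f"{beds} beds: blocking {p_block:.1%}, mean occupancy {occupancy:.1%}")
    ```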

  17. Review: Computer-based models for managing the water-resource problems of irrigated agriculture

    NASA Astrophysics Data System (ADS)

    Singh, Ajay

    2015-09-01

    Irrigation is essential for achieving food security for the burgeoning global population, but unplanned and injudicious expansion of irrigated areas causes waterlogging and salinization problems. Against this backdrop, groundwater resources management is a critical issue for fulfilling the increasing water demand for agricultural, industrial, and domestic uses. Various simulation and optimization approaches have been used to solve groundwater management problems. This paper presents a review of the individual and combined applications of simulation and optimization modeling for the management of groundwater-resource problems associated with irrigated agriculture. The study revealed that the combined use of simulation-optimization modeling is very suitable for achieving an optimal solution for groundwater-resource problems, even with a large number of variables. Independent model tools were used to solve the problems of uncertainty analysis and parameter estimation in groundwater modeling studies. Artificial neural networks were used to minimize the problem of computational complexity. The incorporation of socioeconomic aspects into groundwater management modeling would be an important development in future studies.

  18. The Effects of Computer-Assisted Instruction on Student Achievement in Addition and Subtraction at First Grade Level.

    ERIC Educational Resources Information Center

    Spivey, Patsy M.

    This study was conducted to determine whether the traditional classroom approach to instruction involving the addition and subtraction of number facts (digits 0-6) is more or less effective than the traditional classroom approach plus a commercially-prepared computer game. A pretest-posttest control group design was used with two groups of first…

  19. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    17 Commodity and Securities Exchanges, Vol. 1 (revised as of 2010-04-01): Adjustments for Additions and Withdrawals in the Computation of Rate of Return. Appendix B to Part 4 — Commodity Futures Trading Commission, Commodity Pool Operators and Commodity Trading Advisors. Pt. 4, App. …

  20. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    17 Commodity and Securities Exchanges, Vol. 1 (revised as of 2011-04-01): Adjustments for Additions and Withdrawals in the Computation of Rate of Return. Appendix B to Part 4 — Commodity Futures Trading Commission, Commodity Pool Operators and Commodity Trading Advisors. Pt. 4, App. …

  1. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    17 Commodity and Securities Exchanges, Vol. 1 (revised as of 2013-04-01): Adjustments for Additions and Withdrawals in the Computation of Rate of Return. Appendix B to Part 4 — Commodity Futures Trading Commission, Commodity Pool Operators and Commodity Trading Advisors. Pt. 4, App. …

  2. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    17 Commodity and Securities Exchanges, Vol. 1 (revised as of 2012-04-01): Adjustments for Additions and Withdrawals in the Computation of Rate of Return. Appendix B to Part 4 — Commodity Futures Trading Commission, Commodity Pool Operators and Commodity Trading Advisors. Pt. 4, App. …

  3. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    17 Commodity and Securities Exchanges, Vol. 1 (revised as of 2014-04-01): Adjustments for Additions and Withdrawals in the Computation of Rate of Return. Appendix B to Part 4 — Commodity Futures Trading Commission, Commodity Pool Operators and Commodity Trading Advisors. Pt. 4, App. …
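
    The appendix entries above concern how a pool's periodic rate of return should account for money flowing in or out during the period. One widely used approach consistent with that goal is time-weighting: chain the sub-period returns computed between each addition or withdrawal. The sketch below is only an illustration of that general idea, with invented figures; it is not the CFTC-prescribed computation from Appendix B.

    ```python
    def time_weighted_return(begin_nav, cash_flows, end_nav):
        """Chain sub-period returns; cash_flows is a list of (nav_just_before_flow, flow_amount),
        where flow_amount is positive for an addition and negative for a withdrawal."""
        r = 1.0
        prev_nav = begin_nav
        for nav_before_flow, flow in cash_flows:
            r *= nav_before_flow / prev_nav          # sub-period growth before the flow
            prev_nav = nav_before_flow + flow        # next sub-period starts after the flow
        r *= end_nav / prev_nav                      # final sub-period
        return r - 1.0

    # Hypothetical month: start at 1,000,000; an addition of 200,000 when NAV is 1,050,000;
    # a withdrawal of 100,000 when NAV is 1,300,000; the month ends at 1,230,000.
    flows = [(1_050_000, 200_000), (1_300_000, -100_000)]
    print(f"{time_weighted_return(1_000_000, flows, 1_230_000):.2%}")
    ```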

  4. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    SciTech Connect

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.; Marinak, M. M.; Verdon, C. P.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  5. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 1: Communication Aids. Rehab/Education Technology ResourceBook Series.

    ERIC Educational Resources Information Center

    Brandenburg, Sara A., Ed.; Vanderheiden, Gregg C., Ed.

    One of a series of three resource guides concerned with communication, control, and computer access for disabled and elderly individuals, the directory focuses on communication aids. The book's six chapters each cover products with the same primary function. Cross reference indexes allow access to listings of products by function, input/output…

  6. Computational methods and resources for the interpretation of genomic variants in cancer.

    PubMed

    Tian, Rui; Basu, Malay K; Capriotti, Emidio

    2015-01-01

    The recent improvement of high-throughput sequencing technologies is having a strong impact on the detection of genetic variations associated with cancer. Several institutions worldwide have been sequencing the whole exomes and/or genomes of cancer patients in the thousands, thereby providing an invaluable collection of new somatic mutations in different cancer types. These initiatives have promoted the development of methods and tools for the analysis of cancer genomes aimed at studying the relationship between genotype and phenotype in cancer. In this article we review the online resources and computational tools for the analysis of the cancer genome. First, we describe the available repositories of cancer genome data. Next, we provide an overview of the methods for the detection of genetic variation and the computational tools for the prioritization of cancer-related genes and causative somatic variations. Finally, we discuss the future perspectives in cancer genomics, focusing on the impact of computational methods and quantitative approaches for defining personalized strategies to improve the diagnosis and treatment of cancer.

  7. Computational methods and resources for the interpretation of genomic variants in cancer

    PubMed Central

    2015-01-01

    Recent improvements in high-throughput sequencing technologies are having a strong impact on the detection of genetic variations associated with cancer. Several institutions worldwide have been sequencing the whole exomes and/or genomes of thousands of cancer patients, thereby providing an invaluable collection of new somatic mutations in different cancer types. These initiatives have promoted the development of methods and tools for the analysis of cancer genomes that are aimed at studying the relationship between genotype and phenotype in cancer. In this article we review the online resources and computational tools for the analysis of cancer genomes. First, we describe the available repositories of cancer genome data. Next, we provide an overview of the methods for the detection of genetic variation and of computational tools for the prioritization of cancer-related genes and causative somatic variations. Finally, we discuss future perspectives in cancer genomics, focusing on the impact of computational methods and quantitative approaches in defining personalized strategies to improve the diagnosis and treatment of cancer. PMID:26111056

  8. Subsonic flutter analysis addition to NASTRAN. [for use with CDC 6000 series digital computers

    NASA Technical Reports Server (NTRS)

    Doggett, R. V., Jr.; Harder, R. L.

    1973-01-01

    A subsonic flutter analysis capability has been developed for NASTRAN, and a developmental version of the program has been installed on the CDC 6000 series digital computers at the Langley Research Center. The flutter analysis is of the modal type, uses doublet lattice unsteady aerodynamic forces, and solves the flutter equations by using the k-method. Surface and one-dimensional spline functions are used to transform from the aerodynamic degrees of freedom to the structural degrees of freedom. Some preliminary applications of the method to a beamlike wing, a platelike wing, and a platelike wing with a folded tip are compared with existing experimental and analytical results.

  9. Postprocessing of Voxel-Based Topologies for Additive Manufacturing Using the Computational Geometry Algorithms Library (CGAL)

    DTIC Science & Technology

    2015-06-01

    that a structure is built up by layers. Typically, additive manufacturing devices (3-dimensional [3-D] printers, e.g.), use the stereolithography (STL...begin with a standard, voxel-based topology optimization scheme and end with an STL file, ready for use in a 3-D printer or other additive manufacturing...S, Yvinec M. Cgal 4.6 - 3d alpha shapes. 2015 [accessed 2015 May 18]. http://doc.cgal.org/latest/Alpha_shapes_3/index.html#Chapter_3D_Alpha_Shapes
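
    The pipeline these fragments describe (taking a voxel-based topology-optimization result and turning it into a surface mesh suitable for an STL file) can be sketched with off-the-shelf tools. The snippet below is only a stand-in illustration that uses scikit-image's marching-cubes isosurface extraction rather than the CGAL alpha-shape postprocessing the report actually covers; the voxel array is synthetic.

```python
import numpy as np
from skimage import measure

# Synthetic voxel "design": a solid sphere on a 40^3 grid (1 = material).
n = 40
x, y, z = np.mgrid[:n, :n, :n]
voxels = (((x - n / 2) ** 2 + (y - n / 2) ** 2 + (z - n / 2) ** 2)
          <= (n / 3) ** 2).astype(float)

# Extract a triangulated isosurface at the material boundary.
verts, faces, normals, values = measure.marching_cubes(voxels, level=0.5)
print(verts.shape, faces.shape)  # vertex coordinates and triangle indices

# The (verts, faces) pair is what an STL writer (e.g. numpy-stl) would consume.
```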

  10. Addition of higher order plate and shell elements into NASTRAN computer program

    NASA Technical Reports Server (NTRS)

    Narayanaswami, R.; Goglia, G. L.

    1976-01-01

    Two higher order plate elements, the linear strain triangular membrane element and the quintic bending element, along with a shallow shell element, suitable for inclusion into the NASTRAN (NASA Structural Analysis) program are described. Additions to the NASTRAN Theoretical Manual, Users' Manual, Programmers' Manual and the NASTRAN Demonstration Problem Manual, for inclusion of these elements into the NASTRAN program are also presented.

  11. Node Resource Manager: A Distributed Computing Software Framework Used for Solving Geophysical Problems

    NASA Astrophysics Data System (ADS)

    Lawry, B. J.; Encarnacao, A.; Hipp, J. R.; Chang, M.; Young, C. J.

    2011-12-01

    With the rapid growth of multi-core computing hardware, it is now possible for scientific researchers to run complex, computationally intensive software on affordable, in-house commodity hardware. Multi-core CPUs (Central Processing Units) and GPUs (Graphics Processing Units) are now commonplace in desktops and servers. Developers today have access to extremely powerful hardware that enables the execution of software that could previously only be run on expensive, massively-parallel systems. It is no longer cost-prohibitive for an institution to build a parallel computing cluster consisting of commodity multi-core servers. In recent years, our research team has developed a distributed, multi-core computing system and used it to construct global 3D earth models using seismic tomography. Traditionally, computational limitations forced certain assumptions and shortcuts in the calculation of tomographic models; however, with the recent rapid growth in computational hardware including faster CPUs, increased RAM, and the development of multi-core computers, we are now able to perform seismic tomography, 3D ray tracing and seismic event location using distributed parallel algorithms running on commodity hardware, thereby eliminating the need for many of these shortcuts. We describe Node Resource Manager (NRM), a system we developed that leverages the capabilities of a parallel computing cluster. NRM is a software-based parallel computing management framework that works in tandem with the Java Parallel Processing Framework (JPPF, http://www.jppf.org/), a third-party library that provides a flexible and innovative way to take advantage of modern multi-core hardware. NRM enables multiple applications to use and share a common set of networked computers, regardless of their hardware platform or operating system. Using NRM, algorithms can be parallelized to run on multiple processing cores of a distributed computing cluster of servers and desktops, which results in a dramatic
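
    NRM and JPPF are Java frameworks, but the pattern the abstract describes (splitting an independent, compute-heavy workload into tasks that run concurrently on many cores) can be sketched generically. The following sketch uses a local process pool as a small-scale analogue; the task function and chunk sizes are invented for illustration, and a cluster framework such as NRM/JPPF extends the same idea across networked machines.

```python
from concurrent.futures import ProcessPoolExecutor

def trace_rays(chunk):
    """Placeholder for a compute-heavy kernel, e.g. 3-D ray tracing for
    one batch of source-receiver pairs (stand-in arithmetic only)."""
    return sum(i * i for i in chunk)

if __name__ == "__main__":
    # 32 independent work units; each could be one batch of ray paths.
    work = [range(i * 100_000, (i + 1) * 100_000) for i in range(32)]
    # Distribute the tasks across the available CPU cores.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(trace_rays, work))
    print(len(results), results[0])
```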

  12. Addition of visual noise boosts evoked potential-based brain-computer interface

    PubMed Central

    Xie, Jun; Xu, Guanghua; Wang, Jing; Zhang, Sicong; Zhang, Feng; Li, Yeping; Han, Chengcheng; Li, Lili

    2014-01-01

    Although noise has a proven beneficial role in brain function, there have been no attempts to exploit the stochastic resonance effect in neural engineering applications, especially in research on brain-computer interfaces (BCIs). In our study, a steady-state motion visual evoked potential (SSMVEP)-based BCI with periodic visual stimulation plus moderate spatiotemporal noise achieved better offline and online performance due to enhancement of periodic components in brain responses, accompanied by suppression of high harmonics. Offline results showed a bell-shaped, resonance-like profile, and online performance improvements of 7–36% were achieved when identical visual noise was adopted for different stimulation frequencies. Using neural encoding modeling, these phenomena can be explained as noise-induced input-output synchronization in human sensory systems, which commonly possess a low-pass property. Our work demonstrates that noise can boost BCIs in addressing human needs. PMID:24828128

  13. Efficient method for computing the maximum-likelihood quantum state from measurements with additive Gaussian noise.

    PubMed

    Smolin, John A; Gambetta, Jay M; Smith, Graeme

    2012-02-17

    We provide an efficient method for computing the maximum-likelihood mixed quantum state (with density matrix ρ) given a set of measurement outcomes in a complete orthonormal operator basis subject to Gaussian noise. Our method works by first changing basis, yielding a candidate density matrix μ which may have nonphysical (negative) eigenvalues, and then finding the nearest physical state under the 2-norm. Our algorithm takes at worst O(d^4) for the basis change plus O(d^3) for finding ρ, where d is the dimension of the quantum state. In the special case where the measurement basis is strings of Pauli operators, the basis change takes only O(d^3) as well. The workhorse of the algorithm is a new linear-time method for finding the closest probability distribution (in Euclidean distance) to a set of real numbers summing to one.
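
    The final step the abstract mentions, finding the closest probability distribution (in Euclidean distance) to a set of real numbers summing to one, is the projection of the eigenvalue vector onto the probability simplex. The sketch below applies that projection to the eigenvalues of a candidate matrix and rebuilds the state; it uses a common sorting-based O(d log d) projection rather than the authors' linear-time variant, and the function name and toy 2x2 input are illustrative only.

```python
import numpy as np

def closest_density_matrix(mu):
    """Return the physical density matrix nearest to mu in the 2-norm.

    Diagonalize mu, project its eigenvalues onto the probability simplex
    (the closest probability distribution in Euclidean distance), and
    rebuild the matrix in the original eigenbasis.
    """
    evals, evecs = np.linalg.eigh(mu)
    # Sorting-based Euclidean projection onto the simplex.
    u = np.sort(evals)[::-1]
    css = np.cumsum(u)
    ks = np.arange(1, len(u) + 1)
    k = np.nonzero(u + (1.0 - css) / ks > 0)[0].max()
    shift = (1.0 - css[k]) / (k + 1)
    new_evals = np.maximum(evals + shift, 0.0)
    return (evecs * new_evals) @ evecs.conj().T

# Toy example: a unit-trace Hermitian matrix with one negative eigenvalue.
mu = np.array([[1.1, 0.2], [0.2, -0.1]])
rho = closest_density_matrix(mu)
print(np.linalg.eigvalsh(rho), np.trace(rho).real)
```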

  14. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that will permit integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, antenna primary beam, and attitude control requirements.

  15. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing.

    PubMed

    Howard, Mark; Campbell, Earl

    2017-03-03

    Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas-the most general synthesis scenario-then the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.
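
    For context, the robustness of magic referred to above is usually written as the smallest one-norm over affine (quasi-probability) decompositions of the state into pure stabilizer states; the form below is a standard statement of that definition and should be checked against the paper for conventions.

```latex
\mathcal{R}(\rho) \;=\; \min_{x}\Big\{ \sum_i |x_i| \;:\; \rho = \sum_i x_i\,\sigma_i,\ \ \sum_i x_i = 1,\ \ \sigma_i \in \mathrm{STAB} \Big\}
```

    Here STAB denotes the set of pure stabilizer states; R(ρ) equals 1 exactly when ρ is a convex mixture of stabilizer states, and larger values translate into a larger classical simulation overhead in the Gottesman-Knill-type scheme mentioned in the abstract.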

  16. DtiStudio: resource program for diffusion tensor computation and fiber bundle tracking.

    PubMed

    Jiang, Hangyi; van Zijl, Peter C M; Kim, Jinsuh; Pearlson, Godfrey D; Mori, Susumu

    2006-02-01

    A versatile resource program was developed for diffusion tensor image (DTI) computation and fiber tracking. The software can read data formats from a variety of MR scanners. Tensor calculation is performed by solving an over-determined linear equation system using least-squares fitting. Various types of map data, such as tensor elements, eigenvalues, eigenvectors, diffusion anisotropy, diffusion constants, and color-coded orientations can be calculated. The results are visualized interactively in orthogonal views and in three-dimensional mode. Three-dimensional tract reconstruction is based on the Fiber Assignment by Continuous Tracking (FACT) algorithm and a brute-force reconstruction approach. To improve the time and memory efficiency, a rapid algorithm to perform the FACT is adopted. An index matrix for the fiber data is introduced to facilitate various types of fiber bundle selection based on approaches employing multiple regions of interest (ROIs). The program is developed using C++ and OpenGL on a Windows platform.
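
    As a minimal illustration of the least-squares tensor fit described above (not DtiStudio's actual code), the sketch below solves the standard linearized diffusion-tensor signal model ln(S_i/S0) = -b_i g_i^T D g_i for a single voxel. The gradient table, b-values, and signal values are hypothetical; real acquisitions use many more directions, which is what makes the system over-determined.

```python
import numpy as np

def fit_diffusion_tensor(signals, s0, bvals, bvecs):
    """Least-squares fit of the six unique diffusion-tensor elements.

    Unknowns are ordered [Dxx, Dyy, Dzz, Dxy, Dxz, Dyz]; each row of the
    design matrix encodes -b_i * g_i^T D g_i for one gradient direction.
    """
    gx, gy, gz = bvecs[:, 0], bvecs[:, 1], bvecs[:, 2]
    A = -bvals[:, None] * np.column_stack(
        [gx**2, gy**2, gz**2, 2 * gx * gy, 2 * gx * gz, 2 * gy * gz])
    y = np.log(signals / s0)
    d, *_ = np.linalg.lstsq(A, y, rcond=None)
    D = np.array([[d[0], d[3], d[4]],
                  [d[3], d[1], d[5]],
                  [d[4], d[5], d[2]]])
    return D, np.linalg.eigvalsh(D)  # tensor and its eigenvalues

# Hypothetical single-voxel data: 6 gradient directions, b = 1000 s/mm^2.
bvecs = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0, 1, 1]], dtype=float)
bvecs /= np.linalg.norm(bvecs, axis=1, keepdims=True)
bvals = np.full(6, 1000.0)
signals = np.array([450.0, 430.0, 300.0, 380.0, 340.0, 330.0])
D, evals = fit_diffusion_tensor(signals, s0=500.0, bvals=bvals, bvecs=bvecs)
print(evals)
```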

  17. Development of a Computer-Based Resource for Inclusion Science Classrooms

    NASA Astrophysics Data System (ADS)

    Olsen, J. K.; Slater, T.

    2005-12-01

    Current instructional issues necessitate that educators start with the curriculum and determine how educational technology can assist students in achieving positive learning goals, functionally supplementing classroom instruction. Technology projects incorporating principles of situated learning have been shown to provide an effective framework for learning, and computer technology has been shown to facilitate learning among special-needs students. Students with learning disabilities may benefit from assistive technology, but these resources are not always utilized during classroom instruction: technology is only effective if teachers view it as an integral part of the learning process. The materials currently under development are in the domain of earth and space science, part of the Arizona 5-8 Science Content Standards. The concern of this study is to determine a means of assisting inclusive education that is both feasible and effective in ensuring successful science learning outcomes for all students, whether in regular or special education.

  18. Application of a Resource Theory for Magic States to Fault-Tolerant Quantum Computing

    NASA Astrophysics Data System (ADS)

    Howard, Mark; Campbell, Earl

    2017-03-01

    Motivated by their necessity for most fault-tolerant quantum computation schemes, we formulate a resource theory for magic states. First, we show that robustness of magic is a well-behaved magic monotone that operationally quantifies the classical simulation overhead for a Gottesman-Knill-type scheme using ancillary magic states. Our framework subsequently finds immediate application in the task of synthesizing non-Clifford gates using magic states. When magic states are interspersed with Clifford gates, Pauli measurements, and stabilizer ancillas—the most general synthesis scenario—then the class of synthesizable unitaries is hard to characterize. Our techniques can place nontrivial lower bounds on the number of magic states required for implementing a given target unitary. Guided by these results, we have found new and optimal examples of such synthesis.

  19. Resources.

    ERIC Educational Resources Information Center

    Aviation/Space, 1980

    1980-01-01

    The resources listed cover different types of materials related to aerospace science under specified categories: free and inexpensive materials, selected government publications, audiovisual materials (government, nongovernment), aviation books, and space books. The list includes the publisher's name and the price for each publication. (SK)

  20. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction.

    PubMed

    Nezarat, Amin; Dastghaibifard, G H

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game-theoretic mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium, where players are no longer inclined to alter their bid for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, produces the fewest service-level-agreement violations, and provides the most utility to the provider.

  1. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction

    PubMed Central

    Nezarat, Amin; Dastghaibifard, GH

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game-theoretic mechanism and holding a repetitive game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium, where players are no longer inclined to alter their bid for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, produces the fewest service-level-agreement violations, and provides the most utility to the provider. PMID:26431035

  2. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    NASA Astrophysics Data System (ADS)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; Sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure put together to give scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres which are distributed across 56 countries in Europe, the Asia-Pacific region, Canada and Latin America. EUDAT (www.eudat.eu) is a collaborative Pan-European infrastructure providing research data services, training and consultancy for researchers, research communities, research infrastructures and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. EGI and EUDAT, in the context of their flagship projects, EGI-Engage and EUDAT2020, started a collaboration in March 2015 to harmonise the two infrastructures, including technical interoperability, authentication, authorisation and identity management, policy and operations. The main objective of this work is to provide end users with seamless access to an integrated infrastructure offering both EGI and EUDAT services, thereby pairing data and high-throughput computing resources. To define the roadmap of this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, which could bring requirements and help assign the right priorities to each of them. In this way, this activity has been driven by the end users from the beginning. The identified user communities are

  3. Enantioselective conjugate addition of nitro compounds to α,β-unsaturated ketones: an experimental and computational study.

    PubMed

    Manzano, Rubén; Andrés, José M; Álvarez, Rosana; Muruzábal, María D; de Lera, Ángel R; Pedrosa, Rafael

    2011-05-16

    A series of chiral thioureas derived from easily available diamines, prepared from α-amino acids, has been tested as catalysts in the enantioselective Michael additions of nitroalkanes to α,β-unsaturated ketones. The best results are obtained with the bifunctional catalyst prepared from L-valine. This thiourea promotes the reaction with high enantioselectivities and chemical yields for aryl/vinyl ketones, but the enantiomeric ratio for alkyl/vinyl derivatives is very modest. The addition of substituted nitromethanes led to the corresponding adducts with excellent enantioselectivity but very poor diastereoselectivity. Evidence for the isomerization of the addition products has been obtained from the reaction of chalcone with [D3]nitromethane, which shows that the final addition products epimerize under the reaction conditions. The epimerization explains the low diastereoselectivity observed in the formation of adducts with two adjacent tertiary stereocenters. Density functional studies of the transition structures corresponding to two alternative activation modes of the nitroalkanes and α,β-unsaturated ketones by the bifunctional organocatalyst have been carried out at the B3LYP/3-21G* level. The computations are consistent with a reaction model involving the Michael addition of the thiourea-activated nitronate to the ketone activated by the protonated amine of the organocatalyst. The enantioselectivities predicted by the computations are consistent with the experimental values obtained for aryl- and alkyl-substituted α,β-unsaturated ketones.

  4. Definition and computation of intermolecular contact in liquids using additively weighted Voronoi tessellation.

    PubMed

    Isele-Holder, Rolf E; Rabideau, Brooks D; Ismail, Ahmed E

    2012-05-10

    We present a definition of intermolecular surface contact by applying weighted Voronoi tessellations to configurations of various organic liquids and water obtained from molecular dynamics simulations. This definition of surface contact is used to link the COSMO-RS model and molecular dynamics simulations. We demonstrate that additively weighted tessellation is the superior tessellation type to define intermolecular surface contact. Furthermore, we fit a set of weights for the elements C, H, O, N, F, and S for this tessellation type to obtain optimal agreement between the models. We use these radii to successfully predict contact statistics for compounds that were excluded from the fit and mixtures. The observed agreement between contact statistics from COSMO-RS and molecular dynamics simulations confirms the capability of the presented method to describe intermolecular contact. Furthermore, we observe that increasing polarity of the surfaces of the examined molecules leads to weaker agreement in the contact statistics. This is especially pronounced for pure water.
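
    To make the tessellation concrete, the brute-force sketch below assigns sample points to atoms using the additively weighted distance ||x - p_i|| - w_i, which is the rule defining the cells of an additively weighted Voronoi diagram. The atom positions and weights are invented, and a production implementation would use a proper computational-geometry construction rather than a grid.

```python
import numpy as np

# Hypothetical atom centres (nm) and additive weights (element-dependent radii).
centres = np.array([[0.00, 0.00, 0.00],
                    [0.15, 0.00, 0.00],
                    [0.00, 0.20, 0.10]])
weights = np.array([0.077, 0.032, 0.073])  # illustrative values only

# Sample points on a small grid around the "molecule".
axis = np.linspace(-0.3, 0.4, 30)
grid = np.array(np.meshgrid(axis, axis, axis)).reshape(3, -1).T

# Additively weighted distance: d_i(x) = ||x - p_i|| - w_i.
d = np.linalg.norm(grid[:, None, :] - centres[None, :, :], axis=2) - weights
owner = d.argmin(axis=1)  # index of the cell each grid point falls in

# Relative cell volumes approximate how space is partitioned among the atoms.
print(np.bincount(owner, minlength=len(centres)) / len(grid))
```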

  5. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously

  6. SARANA: language, compiler and run-time system support for spatially aware and resource-aware mobile computing.

    PubMed

    Hari, Pradip; Ko, Kevin; Koukoumidis, Emmanouil; Kremer, Ulrich; Martonosi, Margaret; Ottoni, Desiree; Peh, Li-Shiuan; Zhang, Pei

    2008-10-28

    Increasingly, spatial awareness plays a central role in many distributed and mobile computing applications. Spatially aware applications rely on information about the geographical position of compute devices and their supported services in order to support novel functionality. While many spatial application drivers already exist in mobile and distributed computing, very little systems research has explored how best to program these applications, to express their spatial and temporal constraints, and to allow efficient implementations on highly dynamic real-world platforms. This paper proposes the SARANA system architecture, which includes language and run-time system support for spatially aware and resource-aware applications. SARANA allows users to express spatial regions of interest, as well as trade-offs between quality of result (QoR), latency and cost. The goal is to produce applications that use resources efficiently and that can be run on diverse resource-constrained platforms ranging from laptops to personal digital assistants and to smart phones. SARANA's run-time system manages QoR and cost trade-offs dynamically by tracking resource availability and locations, brokering usage/pricing agreements and migrating programs to nodes accordingly. A resource cost model permeates the SARANA system layers, permitting users to express their resource needs and QoR expectations in units that make sense to them. Although we are still early in the system development, initial versions have been demonstrated on a nine-node system prototype.

  7. Aging as an evolvability-increasing program which can be switched off by organism to mobilize additional resources for survival.

    PubMed

    Skulachev, Maxim V; Severin, Fedor F; Skulachev, Vladimir P

    2015-01-01

    During the last decade, several pieces of convincing evidence were published indicating that aging of living organisms is programmed, being a particular case of programmed death of organism (phenoptosis). Among them, the following observations can be mentioned. (1) Species were described that show negligible aging. In mammals, the naked mole rat is the most impressive example. This is a rodent of mouse size living at least 10-fold longer than a mouse and having fecundity higher than a mouse and no age-related diseases. (2) In some species with high aging rate, genes responsible for active organization of aging by poisoning of the organism with endogenous metabolites have been identified. (3) In women, standard deviations divided by the mean are the same for age of menarche (an event controlled by the ontogenetic program) and for age of menopause (an aging-related event). (4) Inhibitors of programmed cell death (apoptosis and necrosis) retard and in certain cases even reverse the development of age-dependent pathologies. (5) In aging species, the rate of aging is regulated by the individual which responds by changes in this rate to changes in the environmental conditions. In this review, we consider point (5) in detail. Data are summarized suggesting that inhibition of aging rate by moderate food restriction can be explained assuming that such restriction is perceived by the organism as a signal of future starvation. In response to this dramatic signal, the organism switches off such an optional program as aging, mobilizing in such a way additional reserves for survival. A similar explanation is postulated for geroprotective effects of heavy muscle work, a lowering or a rise in the external temperature, small amounts of metabolic poisons (hormesis), low doses of radiation, and other deleterious events. On the contrary, sometimes certain positive signals can prolong life by inhibiting the aging program in individuals who are useful for the community (e

  8. Aging As An Evolvability-Increasing Program Which Can Be Switched Off By Organism To Mobilize Additional Resources For Survival.

    PubMed

    Skulachev, Maxim V; Severin, Fedor F; Skulachev, Vladimir P

    2015-04-22

    During the last decade, several pieces of convincing evidence were published indicating that aging of living organisms is programmed, being a particular case of programmed death of organism (phenoptosis). Among them, the following observations can be mentioned. (1) Species were described that show negligible aging. In mammals, the naked mole rat is the most impressive example. This is a rodent of mouse size living at least 10-fold longer than a mouse and having fecundity higher than a mouse and no age-related diseases. (2) In some species with high aging rate, genes responsible for active organization of aging by poisoning of the organism with endogenous metabolites have been identified. (3) In women, standard deviations divided by the mean are the same for age of menarche (an event controlled by the ontogenetic program) and for age of menopause (an aging-related event). (4) Inhibitors of programmed cell death (apoptosis and necrosis) retard and in certain cases even reverse the development of age-dependent pathologies. (5) In aging species, the rate of aging is regulated by the individual which responds by changes in this rate to changes in the environmental conditions. In this review, we consider point (5) in detail. Data are summarized suggesting that inhibition of aging rate by moderate food restriction can be explained assuming that such restriction is perceived by the organism as a signal of future starvation. In response to this dramatic signal, the organism switches off such an optional program as aging, mobilizing in such a way additional reserves for survival. A similar explanation is postulated for geroprotective effects of heavy muscle work, a lowering or a rise in the external temperature, small amounts of metabolic poisons (hormesis), low doses of radiation, and other deleterious events. On the contrary, sometimes certain positive signals can prolong life by inhibiting the aging program in individuals who are useful for the community (e

  9. Vocational Instructor Teaching Skills Project. Evaluating Your Teaching Effectiveness. Resource Packet. [and] Computer-Based Education. Resource Packet.

    ERIC Educational Resources Information Center

    Portland Community Coll., OR.

    A project was conducted at Mt. Hood (Oregon) Community College to develop modules to upgrade the teaching skills of community college teachers. Two modules developed through this project, Evaluating Your Teaching Effectiveness and Computer-Based Education, are included in this document. The Evaluating Your Teaching Effectiveness packet consists of…

  10. Exaggerated force production in altered Gz-levels during parabolic flight: the role of computational resources allocation.

    PubMed

    Mierau, Andreas; Girgenrath, Michaela

    2010-02-01

    The purpose of the present experiment was to examine whether the previously observed exaggerated isometric force production in changed-Gz during parabolic flight (Mierau et al. 2008) can be explained by a higher computational demand and, thus, inadequate allocation of the brain's computational resources to the task. Subjects (n = 12) were tested during the micro-Gz, high-Gz and normal-Gz episodes of parabolic flight. They produced isometric forces of different magnitudes and directions, according to visually prescribed vectors, with their right, dominant hand and performed a choice reaction-time task with their left hand. Tasks were performed either separately (single-task) or simultaneously (dual-task). Dual-task interference was present for both tasks, indicating that each task was resource-demanding. However, this interference remained unaffected by the Gz-level. It was concluded that exaggerated force production in changed-Gz is probably not related to inadequate allocation of the brain's computational resources to the force production task. Statement of Relevance: The present study shows that deficient motor performance in changed-Gz environments (both micro-Gz and high-Gz) is not necessarily related to inadequate computational resource allocation, as was suggested in some previous studies. This finding is of great relevance not only for fundamental research, but also for the training and safety of humans operating in changed-Gz environments, such as astronauts and jet pilots.

  11. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    SciTech Connect

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; Siegel, Howard Jay; Maciejewski, Anthony A.; Koenig, Gregory A.; Groer, Christopher S.; Hilton, Marcia M.; Poole, Stephen W.; Okonski, G.; Rambharos, R.

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.
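
    As a toy illustration of utility-aware mapping with task dropping (not the paper's actual heuristics), the sketch below assigns each task to the earliest-available machine, computes a time-varying utility for its completion time, and drops tasks whose achievable utility falls below a threshold. All task and machine parameters are invented, and machine heterogeneity is ignored for brevity.

```python
import heapq

def utility(task, finish_time):
    """Time-varying utility: full value up to a soft deadline, then a
    linear decay to zero (an illustrative shape, not the paper's)."""
    if finish_time <= task["deadline"]:
        return task["value"]
    late = finish_time - task["deadline"]
    return max(0.0, task["value"] * (1.0 - late / task["decay"]))

def schedule(tasks, n_machines, drop_threshold=0.1):
    """Greedy mapping: earliest-available machine, drop low-utility tasks."""
    machines = [(0.0, m) for m in range(n_machines)]  # (free-at time, id)
    heapq.heapify(machines)
    earned = 0.0
    for task in sorted(tasks, key=lambda t: t["arrival"]):
        free_at, m = heapq.heappop(machines)
        finish = max(free_at, task["arrival"]) + task["runtime"]
        u = utility(task, finish)
        if u < drop_threshold * task["value"]:
            heapq.heappush(machines, (free_at, m))  # drop the task
            continue
        earned += u
        heapq.heappush(machines, (finish, m))
    return earned

tasks = [{"arrival": 0.5 * i, "runtime": 2.0, "deadline": 0.5 * i + 3.0,
          "decay": 4.0, "value": 1.0} for i in range(20)]
print(schedule(tasks, n_machines=2))
```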

  12. Disposal of waste computer hard disk drive: data destruction and resources recycling.

    PubMed

    Yan, Guoqing; Xue, Mianqiang; Xu, Zhenming

    2013-06-01

    An increasing quantity of discarded computers is accompanied by a sharp increase in the number of hard disk drives to be eliminated. A waste hard disk drive is a special form of waste electrical and electronic equipment because it holds large amounts of information that is closely connected with its user. Therefore, the treatment of waste hard disk drives is an urgent issue in terms of data security, environmental protection and sustainable development. In the present study, the degaussing method was adopted to destroy the residual data on the waste hard disk drives, and the housing of the disks was used as an example to explore the coating removal process, which is the most important pretreatment for aluminium alloy recycling. The key operating points determined for degaussing were: (1) keep the platter parallel to the magnetic field direction; and (2) increasing the magnetic field intensity B and the action time t leads to a significant improvement in the degaussing effect. The coating removal experiment indicated that heating the waste hard disk drive housing at a temperature of 400 °C for 24 min was the optimum condition. A novel integrated technique for the treatment of waste hard disk drives is proposed herein. This technique offers the possibility of destroying residual data, recycling the recovered resources and disposing of the disks in an environmentally friendly manner.

  13. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    DOE PAGES

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; ...

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.

  14. Computation of groundwater resources and recharge in Chithar River Basin, South India.

    PubMed

    Subramani, T; Babu, Savithri; Elango, L

    2013-01-01

    Groundwater recharge and available groundwater resources in Chithar River basin, Tamil Nadu, India, spread over an area of 1,722 km², have been estimated by considering various hydrological, geological, and hydrogeological parameters, such as rainfall infiltration, drainage, geomorphic units, land use, rock types, depth of weathered and fractured zones, nature of soil, water level fluctuation, saturated thickness of aquifer, and groundwater abstraction. The digital ground elevation models indicate that the regional slope of the basin is towards the east. The Proterozoic (Post-Archaean) basement of the study area consists of quartzite, calc-granulite, crystalline limestone, charnockite, and biotite gneiss with or without garnet. Three major soil types were identified, namely black cotton, deep red, and red sandy soils. The rainfall intensity gradually decreases from west to east. Groundwater occurs under water table conditions in the weathered zone and fluctuates between 0 and 25 m. The water table reaches its maximum in January, after the northeast monsoon, and its minimum in October. Groundwater abstraction for domestic/stock and irrigational needs in Chithar River basin has been estimated as 148.84 MCM (million m³). Groundwater recharge due to monsoon rainfall infiltration has been estimated as 170.05 MCM based on the water level rise during the monsoon period. It is also estimated as 173.9 MCM using the rainfall infiltration factor. An amount of 53.8 MCM of water is contributed to groundwater from surface water bodies. Recharge of groundwater due to return flow from irrigation has been computed as 147.6 MCM. The static groundwater reserve in Chithar River basin is estimated as 466.66 MCM and the dynamic reserve is about 187.7 MCM. In the present scenario, the aquifer is under safe condition for extraction of groundwater for domestic and irrigation purposes. If the existing water bodies are maintained properly, the extraction rate can be increased in the future by about 10% to 15%.
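
    The recharge estimate "based on the water level rise" is typically obtained with the water-table fluctuation method: recharge = specific yield x water-level rise x area. The sketch below only illustrates the unit handling (MCM = million cubic metres); the specific yield and average rise are hypothetical values, not figures taken from the study.

```python
# Water-table fluctuation method (illustrative values, not the basin data).
specific_yield = 0.02          # dimensionless, typical for weathered hard rock
water_level_rise_m = 5.0       # assumed average monsoon-season rise in metres
area_km2 = 1722.0              # basin area quoted in the abstract

area_m2 = area_km2 * 1e6
recharge_m3 = specific_yield * water_level_rise_m * area_m2
print(f"Recharge ~ {recharge_m3 / 1e6:.1f} MCM")  # -> about 172 MCM
```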

  15. Literacy effects on language and vision: emergent effects from an amodal shared resource (ASR) computational model.

    PubMed

    Smith, Alastair C; Monaghan, Padraic; Huettig, Falk

    2014-12-01

    Learning to read and write requires an individual to connect additional orthographic representations to pre-existing mappings between phonological and semantic representations of words. Past empirical results suggest that the process of learning to read and write (at least in alphabetic languages) elicits changes in the language processing system, by either increasing the cognitive efficiency of mapping between representations associated with a word, or by changing the granularity of phonological processing of spoken language, or through a combination of both. Behavioural effects of literacy have typically been assessed in offline explicit tasks that have addressed only phonological processing. However, a recent eye tracking study compared high and low literate participants on effects of phonology and semantics in processing measured implicitly using eye movements. High literates' eye movements were more affected by phonological overlap in online speech than low literates, with only subtle differences observed in semantics. We determined whether these effects were due to cognitive efficiency and/or granularity of speech processing in a multimodal model of speech processing - the amodal shared resource model (ASR, Smith, Monaghan, & Huettig, 2013a,b). We found that cognitive efficiency in the model had only a marginal effect on semantic processing and did not affect performance for phonological processing, whereas fine-grained versus coarse-grained phonological representations in the model simulated the high/low literacy effects on phonological processing, suggesting that literacy has a focused effect in changing the grain-size of phonological mappings.

  16. A computer software system for integration and analysis of grid-based remote sensing data with other natural resource data

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.

    1977-01-01

    This report describes a computer-based information system designed to assist in the integration of commonly available spatial data for regional planning and resource analysis. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. For this study, soil, land cover/use, topographic and geological maps were used as a data base to develop an eleven map portfolio. The major themes of the portfolio are land cover/use, nonpoint water pollution, waste disposal, and ground water recharge.

  17. Turbulence computations with 3-D small-scale additive turbulent decomposition and data-fitting using chaotic map combinations

    SciTech Connect

    Mukerji, Sudip

    1997-01-01

    Although the equations governing turbulent fluid flow, the Navier-Stokes (N.S.) equations, have been known for well over a century and there is a clear technological necessity in obtaining solutions to these equations, turbulence remains one of the principal unsolved problems in physics today. It is still not possible to make accurate quantitative predictions about turbulent flows without relying heavily on empirical data. In principle, it is possible to obtain turbulent solutions from a direct numerical simulation (DNS) of the N.-S. equations. The author first provides a brief introduction to the dynamics of turbulent flows. The N.-S. equations which govern fluid flow, are described thereafter. Then he gives a brief overview of DNS calculations and where they stand at present. He next introduces the two most popular approaches for doing turbulent computations currently in use, namely, the Reynolds averaging of the N.-S. equations (RANS) and large-eddy simulation (LES). Approximations, often ad hoc ones, are present in these methods because use is made of heuristic models for turbulence quantities (the Reynolds stresses) which are otherwise unknown. They then introduce a new computational method called additive turbulent decomposition (ATD), the small-scale version of which is the topic of this research. The rest of the thesis is organized as follows. In Chapter 2 he describes the ATD procedure in greater detail; how dependent variables are split and the decomposition into large- and small-scale sets of equations. In Chapter 3 the spectral projection of the small-scale momentum equations are derived in detail. In Chapter 4 results of the computations with the small-scale ATD equations are presented. In Chapter 5 he describes the data-fitting procedure which can be used to directly specify the parameters of a chaotic-map turbulence model.

  18. A Simple and Resource-efficient Setup for the Computer-aided Drug Design Laboratory.

    PubMed

    Moretti, Loris; Sartori, Luca

    2016-10-01

    Undertaking modelling investigations for Computer-Aided Drug Design (CADD) requires a proper environment. In principle, this could be done on a single computer, but the reality of a drug discovery program requires robustness and high-throughput computing (HTC) to efficiently support the research. Therefore, a more capable alternative is needed, but there is no widespread solution for its implementation. Here, the realization of such a computing facility is discussed; all aspects, from general layout to technical details, are covered.

  19. Innovative Computer-based Medical Knowledge Resources for Primary Care (SIG MED).

    ERIC Educational Resources Information Center

    Lei, Polin P.; McCain, Katherine

    2000-01-01

    Describes three innovative projects for a proposed session that address the information needs of primary care physicians: a standardized assessment method for information systems to integrate knowledge resources into clinical practice; a Web-based resource with clinical medical information from primary medical literature; and videotaped medical…

  20. Linking process, structure, property, and performance for metal-based additive manufacturing: computational approaches with experimental support

    NASA Astrophysics Data System (ADS)

    Smith, Jacob; Xiong, Wei; Yan, Wentao; Lin, Stephen; Cheng, Puikei; Kafka, Orion L.; Wagner, Gregory J.; Cao, Jian; Liu, Wing Kam

    2016-04-01

    Additive manufacturing (AM) methods for rapid prototyping of 3D materials (3D printing) have become increasingly popular with a particular recent emphasis on those methods used for metallic materials. These processes typically involve an accumulation of cyclic phase changes. The widespread interest in these methods is largely stimulated by their unique ability to create components of considerable complexity. However, modeling such processes is exceedingly difficult due to the highly localized and drastic material evolution that often occurs over the course of the manufacture time of each component. Final product characterization and validation are currently driven primarily by experimental means as a result of the lack of robust modeling procedures. In the present work, the authors discuss primary detrimental hurdles that have plagued effective modeling of AM methods for metallic materials while also providing logical speculation into preferable research directions for overcoming these hurdles. The primary focus of this work encompasses the specific areas of high-performance computing, multiscale modeling, materials characterization, process modeling, experimentation, and validation for final product performance of additively manufactured metallic components.

  1. Concrete resource analysis of the quantum linear-system algorithm used to compute the electromagnetic scattering cross section of a 2D target

    NASA Astrophysics Data System (ADS)

    Scherer, Artur; Valiron, Benoît; Mau, Siun-Chuon; Alexander, Scott; van den Berg, Eric; Chapuran, Thomas E.

    2017-03-01

    We provide a detailed estimate for the logical resource requirements of the quantum linear-system algorithm (Harrow et al. in Phys Rev Lett 103:150502, 2009) including the recently described elaborations and application to computing the electromagnetic scattering cross section of a metallic target (Clader et al. in Phys Rev Lett 110:250504, 2013). Our resource estimates are based on the standard quantum-circuit model of quantum computation; they comprise circuit width (related to parallelism), circuit depth (total number of steps), the number of qubits and ancilla qubits employed, and the overall number of elementary quantum gate operations as well as more specific gate counts for each elementary fault-tolerant gate from the standard set {X, Y, Z, H, S, T, CNOT}. In order to perform these estimates, we used an approach that combines manual analysis with automated estimates generated via the Quipper quantum programming language and compiler. Our estimates pertain to the explicit example problem size N = 332,020,680 beyond which, according to a crude big-O complexity comparison, the quantum linear-system algorithm is expected to run faster than the best known classical linear-system solving algorithm. For this problem size, a desired calculation accuracy ε = 0.01 requires an approximate circuit width of 340 and a circuit depth of order 10^25 if oracle costs are excluded, and a circuit width and circuit depth of order 10^8 and 10^29, respectively, if the resource requirements of oracles are included, indicating that the commonly ignored oracle resources are considerable. In addition to providing detailed logical resource estimates, it is also the purpose of this paper to demonstrate explicitly (using a fine-grained approach rather than relying on coarse big-O asymptotic approximations) how these impressively large numbers arise with an actual circuit implementation of a quantum algorithm. While our estimates may prove to be conservative as more efficient

  2. Accuracy Maximization Analysis for Sensory-Perceptual Tasks: Computational Improvements, Filter Robustness, and Coding Advantages for Scaled Additive Noise

    PubMed Central

    Burge, Johannes

    2017-01-01

    Accuracy Maximization Analysis (AMA) is a recently developed Bayesian ideal observer method for task-specific dimensionality reduction. Given a training set of proximal stimuli (e.g. retinal images), a response noise model, and a cost function, AMA returns the filters (i.e. receptive fields) that extract the most useful stimulus features for estimating a user-specified latent variable from those stimuli. Here, we first contribute two technical advances that significantly reduce AMA’s compute time: we derive gradients of cost functions for which two popular estimators are appropriate, and we implement a stochastic gradient descent (AMA-SGD) routine for filter learning. Next, we show how the method can be used to simultaneously probe the impact on neural encoding of natural stimulus variability, the prior over the latent variable, noise power, and the choice of cost function. Then, we examine the geometry of AMA’s unique combination of properties that distinguish it from better-known statistical methods. Using binocular disparity estimation as a concrete test case, we develop insights that have general implications for understanding neural encoding and decoding in a broad class of fundamental sensory-perceptual tasks connected to the energy model. Specifically, we find that non-orthogonal (partially redundant) filters with scaled additive noise tend to outperform orthogonal filters with constant additive noise; non-orthogonal filters and scaled additive noise can interact to sculpt noise-induced stimulus encoding uncertainty to match task-irrelevant stimulus variability. Thus, we show that some properties of neural response thought to be biophysical nuisances can confer coding advantages to neural systems. Finally, we speculate that, if repurposed for the problem of neural systems identification, AMA may be able to overcome a fundamental limitation of standard subunit model estimation. As natural stimuli become more widely used in the study of psychophysical and

  3. Projections of costs, financing, and additional resource requirements for low- and lower middle-income country immunization programs over the decade, 2011-2020.

    PubMed

    Gandhi, Gian; Lydon, Patrick; Cornejo, Santiago; Brenzel, Logan; Wrobel, Sandra; Chang, Hugh

    2013-04-18

    The Decade of Vaccines Global Vaccine Action Plan has outlined a set of ambitious goals to broaden the impact and reach of immunization across the globe. A projections exercise has been undertaken to assess the costs, financing availability, and additional resource requirements to achieve these goals through the delivery of vaccines against 19 diseases across 94 low- and middle-income countries for the period 2011-2020. The exercise draws upon data from existing published and unpublished global forecasts, country immunization plans, and costing studies. A combination of an ingredients-based approach and use of approximations based on past spending has been used to generate vaccine and non-vaccine delivery costs for routine programs, as well as supplementary immunization activities (SIAs). Financing projections focused primarily on support from governments and the GAVI Alliance. Cost and financing projections are presented in constant 2010 US dollars (US$). Cumulative total costs for the decade are projected to be US$57.5 billion, with 85% for routine programs and the remaining 15% for SIAs. Delivery costs account for 54% of total cumulative costs, and vaccine costs make up the remainder. A conservative estimate of total financing for immunization programs is projected to be $34.3 billion over the decade, with country governments financing 65%. These projections imply a cumulative funding gap of $23.2 billion. About 57% of the total resources required to close the funding gap are needed just to maintain existing programs and scale up other currently available vaccines (i.e., before adding in the additional costs of vaccines still in development). Efforts to mobilize additional resources, manage program costs, and establish mutual accountability between countries and development partners will all be necessary to ensure the goals of the Decade of Vaccines are achieved. Establishing or building on existing mechanisms to more comprehensively track resources and
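
    The headline figures quoted in the abstract are internally consistent, as a quick arithmetic check shows (all values in constant 2010 US$ billions, taken from the abstract itself):

```python
total_cost = 57.5             # cumulative 2011-2020 cost
routine = total_cost * 0.85   # ~48.9 for routine programs
sia = total_cost * 0.15       # ~8.6 for supplementary immunization activities
delivery = total_cost * 0.54  # ~31.1 for delivery; the remainder is vaccines
financing = 34.3              # projected financing over the decade
gap = total_cost - financing
print(routine, sia, delivery, gap)  # gap -> 23.2, matching the abstract
```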

  4. Aggregating data for computational toxicology applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System.

    PubMed

    Judson, Richard S; Martin, Matthew T; Egeghy, Peter; Gangwal, Sumit; Reif, David M; Kothiya, Parth; Wolf, Maritja; Cathey, Tommy; Transue, Thomas; Smith, Doris; Vail, James; Frame, Alicia; Mosher, Shad; Cohen Hubal, Elaine A; Richard, Ann M

    2012-01-01

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for predicting toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built using open source tools and is freely available to download. This review describes the organization of the data repository and provides selected examples of use cases.

  5. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    SciTech Connect

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004: wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefront in research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by the research investigators, working cooperatively in their respective areas of expertise, on such topics as sensors and sensor networks, wireless communications and systems, and computational Grids, particularly as relevant to petroleum applications.

  6. University Students and Ethics of Computer Technology Usage: Human Resource Development

    ERIC Educational Resources Information Center

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  7. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    ERIC Educational Resources Information Center

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  8. Staff Computer Literacy in the Community College: A Resource Inventory and Directory, 1984.

    ERIC Educational Resources Information Center

    Stewart, Anne

    In spring 1984, a survey was undertaken to determine what activities were being conducted by member institutions of the League for Innovation in the Community College to help staff members become more comfortable with and knowledgeable about computers. League members were asked to provide information on the current level of computer usage among…

  9. Application of computer graphics to generate coal resources of the Cache coal bed, Recluse geologic model area, Campbell County, Wyoming

    USGS Publications Warehouse

    Schneider, G.B.; Crowley, S.S.; Carey, M.A.

    1982-01-01

    Low-sulfur subbituminous coal resources have been calculated, using both manual and computer methods, for the Cache coal bed in the Recluse Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2 minute quadrangles, Campbell County, Wyoming. Approximately 275 coal thickness measurements obtained from drill hole data are evenly distributed throughout the area. The Cache coal and associated beds are in the Paleocene Tongue River Member of the Fort Union Formation. The depth from the surface to the Cache bed ranges from 269 to 1,257 feet. The thickness of the coal is as much as 31 feet, but in places the Cache coal bed is absent. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources calculated by computer show the bed to contain 2,316 million short tons or about 6.7 percent more than the hand-calculated figure of 2,160 million short tons.

  10. A brief description of an Earth Resources Technology Satellite (ERTS) computer data analysis and management program

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R., Jr.

    1973-01-01

    A data analysis and management procedure currently being used at Marshall Space Flight Center to analyze ERTS digital data is described. The objective is to acquaint potential users with the various computer programs that are available for analysis of multispectral digital imagery and to show how these programs are used in the overall data management plan. The report contains a brief description of each computer routine, and references are provided for obtaining more detailed information.

  11. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    PubMed

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option to reduce their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, where the recycling rate of specialty metals, such as rare earths, is negligible compared to their increasing use in modern products, such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system. It is illustrated by following the material flows of resources in a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that, of the 244 kg of HDDs treated, 212 kg, consisting mainly of aluminum and steel, can ultimately be recovered from the metallurgical process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products can be a preferable option to shredding. However, it remains a technological and logistic challenge for the existing system.

  12. Dynamic resource allocation engine for cloud-based real-time video transcoding in mobile cloud computing environments

    NASA Astrophysics Data System (ADS)

    Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos

    2015-02-01

    The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices and the most popular output formats derived from a comprehensive set of use cases, e.g., a mobile news reporter directly transmitting videos to a TV audience with various video format requirements, with minimal usage of resources both at the reporter's end and at the cloud infrastructure end for transcoding services.

  13. Computer-Mediated Communications for Distance Education and Training: Literature Review and International Resources

    DTIC Science & Technology

    1991-01-01

    Wells, Rosalie A. (Boise State... ...modernizing, even accelerating growth (P. Levinson, personal communication, Nov. 11, 1989). The rapid pace of technological developments is complemented by

  14. Trace ResourceBook: Assistive Technologies for Communication, Control & Computer Access. 1996-97 Edition.

    ERIC Educational Resources Information Center

    Borden, Peter A., Ed.; And Others

    This resource book lists approximately 1,500 products designed specifically for the needs of people with disabilities. Typically, each product is pictured; basic information is provided including manufacturer name, product cost, size, and weight; and the product is briefly described. The book's four sections each describe products designed for…

  15. Recommendations for protecting National Library of Medicine Computing and Networking Resources

    SciTech Connect

    Feingold, R.

    1994-11-01

    Protecting Information Technology (IT) involves a number of interrelated factors, including mission, available resources, technologies, existing policies and procedures, internal culture, contemporary threats, and strategic enterprise direction. In the face of this formidable list, a structured approach provides cost-effective actions that allow the organization to manage its risks. We face fundamental challenges that will persist for at least the next several years. It is difficult, if not impossible, to precisely quantify risk. IT threats and vulnerabilities change rapidly and continually. Limited organizational resources, combined with mission constraints such as availability and connectivity requirements, ensure that most systems will not be absolutely secure (if such security were even possible). In short, there is no technical (or administrative) "silver bullet." Protection consists of a stratified series of recommendations, matching protection levels against information sensitivities. Adaptive and flexible risk management is the key to effective protection of IT resources. The cost of protection must be kept below the expected loss, and one must take into account that an adversary will not expend more to attack a resource than the value of its compromise to that adversary. Notwithstanding the difficulty, if not impossibility, of precisely quantifying risk, this perspective allows us to avoid the trap of choosing a course of action simply because "it's safer" or ignoring an area because no one has explored its potential risk. The recommendations for protecting IT resources begin with a discussion of contemporary threats and vulnerabilities, and then proceed from general to specific preventive measures. From a risk management perspective, it is imperative to understand that today the vast majority of threats are against UNIX hosts connected to the Internet.
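
    The cost-benefit rule stated in the abstract above (protection spending should be kept below the expected loss) can be made concrete with a small annualized-loss calculation. The sketch below is purely illustrative; the asset value, exposure factor, incident rates, and control cost are invented numbers, not figures from the report.

        def annualized_loss_expectancy(asset_value, exposure_factor, incidents_per_year):
            # ALE = single-loss expectancy (asset value x exposure factor) x expected incidents per year
            return asset_value * exposure_factor * incidents_per_year

        # Hypothetical figures: a $200,000 system, a compromise that destroys 40% of its
        # value, and an incident expected once every two years without additional controls.
        ale_without_control = annualized_loss_expectancy(200_000, 0.40, 0.5)

        # A proposed control costing $15,000/year is assumed to halve the incident rate.
        control_cost = 15_000
        ale_with_control = annualized_loss_expectancy(200_000, 0.40, 0.25)

        net_benefit = (ale_without_control - ale_with_control) - control_cost
        print(f"ALE without control: ${ale_without_control:,.0f}")
        print(f"ALE with control:    ${ale_with_control:,.0f}")
        print(f"Net annual benefit:  ${net_benefit:,.0f} (adopt the control only if positive)")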

  16. The Effect of Emphasizing Mathematical Structure in the Acquisition of Whole Number Computation Skills (Addition and Subtraction) By Seven- and Eight-Year Olds: A Clinical Investigation.

    ERIC Educational Resources Information Center

    Uprichard, A. Edward; Collura, Carolyn

    This investigation sought to determine the effect of emphasizing mathematical structure in the acquisition of computational skills by seven- and eight-year-olds. The meaningful development-of-structure approach emphasized closure, commutativity, associativity, and the identity element of addition; the inverse relationship between addition and…

  17. Earth Resources

    ERIC Educational Resources Information Center

    Brewer, Tom

    1970-01-01

    Reviews some of the more concerted, large-scale efforts in the earth resources areas" in order to help the computer community obtain insights into the activities it can jointly participate in with the earth resources community." (Author)

  18. Aggregating Data for Computational Toxicology Applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System

    PubMed Central

    Judson, Richard S.; Martin, Matthew T.; Egeghy, Peter; Gangwal, Sumit; Reif, David M.; Kothiya, Parth; Wolf, Maritja; Cathey, Tommy; Transue, Thomas; Smith, Doris; Vail, James; Frame, Alicia; Mosher, Shad; Cohen Hubal, Elaine A.; Richard, Ann M.

    2012-01-01

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for predicting toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built using open source tools and is freely available to download. This review describes the organization of the data repository and provides selected examples of use cases. PMID:22408426

  19. Distributed Factorization Computation on Multiple Volunteered Mobile Resource to Break RSA Key

    NASA Astrophysics Data System (ADS)

    Jaya, I.; Hardi, S. M.; Tarigan, J. T.; Zamzami, E. M.; Sihombing, P.

    2017-01-01

    Like other asymmetric encryption schemes, RSA can be cracked using a series of mathematical calculations: the private key used to decrypt the message can be computed from the public key. However, finding the private key may require a massive amount of computation. In this paper, we propose a method for performing a distributed computation to recover an RSA private key. The proposed method uses multiple volunteered mobile devices that contribute during the calculation process. Our objective is to demonstrate how volunteer computing on mobile devices may be a feasible option for reducing the time required to break a weak RSA encryption, and to observe the behavior and running time of the application on mobile devices.
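
    The abstract does not describe the authors' algorithm in detail, but the basic idea of splitting a factoring search across volunteered devices can be sketched with simple trial division over disjoint ranges. The key size, worker count, and search strategy below are illustrative assumptions only; real RSA moduli are far too large for this approach.

        from math import isqrt

        def find_factor_in_range(n, start, stop):
            # One worker's share: test odd candidates in [start, stop) as divisors of n.
            c = start | 1
            while c < stop:
                if n % c == 0:
                    return c
                c += 2
            return None

        def split_ranges(n, workers):
            # Divide the search space [3, sqrt(n)] into contiguous chunks, one per worker.
            limit = isqrt(n) + 1
            step = max((limit - 3) // workers + 1, 1)
            return [(3 + i * step, min(3 + (i + 1) * step, limit)) for i in range(workers)]

        n = 65537 * 65539  # toy semiprime standing in for a weak RSA modulus
        for lo, hi in split_ranges(n, workers=8):
            factor = find_factor_in_range(n, lo, hi)  # each range would run on a different device
            if factor:
                print(f"factor {factor} found in range [{lo}, {hi}); cofactor {n // factor}")
                break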

  20. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    PubMed

    Baun, Christian

    2016-01-01

    Clusters usually consist of servers, workstations, or personal computers as nodes, but especially for academic purposes such as student or scientific projects, the cost of purchase and operation can be a challenge. Single-board computers cannot compete with the performance or energy efficiency of higher-value systems, but they are an option for building inexpensive cluster systems. Because of their compact design and modest energy consumption, clusters of single-board computers can be built so that they are mobile and easily transported by their users. This paper describes the construction of such a cluster, useful applications and the performance of the single nodes. Furthermore, the cluster's performance and energy efficiency are analyzed by executing the High Performance Linpack benchmark with different numbers of nodes and different proportions of the system's total main memory utilized.
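
    As a reminder of how High Performance Linpack figures are typically reduced, the sketch below converts a hypothetical HPL run into sustained GFLOPS and an energy-efficiency figure. The problem size, runtime, and power draw are invented values, not measurements from the paper.

        def hpl_gflops(problem_size_n, runtime_s):
            # Standard HPL operation count: (2/3)*N^3 + 2*N^2 floating-point operations.
            flops = (2.0 / 3.0) * problem_size_n ** 3 + 2.0 * problem_size_n ** 2
            return flops / runtime_s / 1e9

        # Hypothetical run on a small single-board-computer cluster.
        n, runtime_s, power_watts = 20_000, 1_800.0, 40.0
        gflops = hpl_gflops(n, runtime_s)
        print(f"Sustained performance: {gflops:.2f} GFLOPS")
        print(f"Energy efficiency:     {gflops / power_watts * 1000:.1f} MFLOPS/W")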

  1. Development of a Computational Framework for Stochastic Co-optimization of Water and Energy Resource Allocations under Climatic Uncertainty

    NASA Astrophysics Data System (ADS)

    Xuan, Y.; Mahinthakumar, K.; Arumugam, S.; DeCarolis, J.

    2015-12-01

    Owing to the lack of a consistent approach to assimilate probabilistic forecasts for water and energy systems, utilization of climate forecasts for conjunctive management of these two systems is very limited. Prognostic management of these two systems presents a stochastic co-optimization problem that seeks to determine reservoir releases and power allocation strategies while minimizing the expected operational costs subject to probabilistic climate forecast constraints. To address these issues, we propose a high performance computing (HPC) enabled computational framework for stochastic co-optimization of water and energy resource allocations under climate uncertainty. The computational framework embodies a new paradigm shift in which attributes of climate (e.g., precipitation, temperature) and its forecasted probability distribution are employed conjointly to inform seasonal water availability and electricity demand. The HPC enabled cyberinfrastructure framework is developed to perform detailed stochastic analyses, and to better quantify and reduce the uncertainties associated with water and power systems management by utilizing improved hydro-climatic forecasts. In this presentation, our stochastic multi-objective solver, extended from Optimus (Optimization Methods for Universal Simulators), is introduced. The solver uses a parallel cooperative multi-swarm method for the efficient solution of large-scale simulation-optimization problems on parallel supercomputers. The cyberinfrastructure harnesses HPC resources to perform intensive computations using ensemble forecast models of streamflow and power demand. The stochastic multi-objective particle swarm optimizer we developed is used to co-optimize water and power system models under constraints over a large number of ensembles. The framework sheds light on the application of climate forecasts and the cyber-innovation framework to improve management and promote the sustainability of water and energy systems.
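
    The abstract mentions a cooperative multi-swarm particle swarm optimizer but gives no algorithmic detail. The following is a minimal, single-swarm, single-objective PSO sketch on a stand-in cost function, intended only to illustrate the general technique; the swarm size, inertia and acceleration coefficients, and objective are illustrative assumptions, not the Optimus-based solver described above.

        import random

        def pso(cost, dim, bounds, particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
            # Minimal particle swarm optimization: returns the best position found.
            lo, hi = bounds
            pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
            vel = [[0.0] * dim for _ in range(particles)]
            pbest = [p[:] for p in pos]
            pbest_cost = [cost(p) for p in pos]
            gbest = min(zip(pbest_cost, pbest))[1][:]
            for _ in range(iters):
                for i in range(particles):
                    for d in range(dim):
                        r1, r2 = random.random(), random.random()
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * r1 * (pbest[i][d] - pos[i][d])
                                     + c2 * r2 * (gbest[d] - pos[i][d]))
                        pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
                    c = cost(pos[i])
                    if c < pbest_cost[i]:
                        pbest[i], pbest_cost[i] = pos[i][:], c
                        if c < cost(gbest):
                            gbest = pos[i][:]
            return gbest

        # Stand-in objective: a simple quadratic "operational cost" to minimize.
        best = pso(lambda x: sum(v * v for v in x), dim=4, bounds=(-10.0, 10.0))
        print("best solution:", [round(v, 3) for v in best])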

  2. Computer Resources: Asset or Liability for Institutional Research. SAIR Conference Paper.

    ERIC Educational Resources Information Center

    Sanford, Timothy R.

    The most advantageous relationship between computer technology and institutional research is considered. Three potential problem areas are discussed: those associated with a central data processing center, those germane to minicomputers or terminals within the institutional research office, and those nondiscriminating types which cover both…

  3. Integrating Computing Resources: A Shared Distributed Architecture for Academics and Administrators.

    ERIC Educational Resources Information Center

    Beltrametti, Monica; English, Will

    1994-01-01

    Development and implementation of a shared distributed computing architecture at the University of Alberta (Canada) are described. Aspects discussed include design of the architecture, users' views of the electronic environment, technical and managerial challenges, and the campuswide human infrastructures needed to manage such an integrated…

  4. ERDC MSRC Resource. High Performance Computing for the Warfighter. Fall 2006

    DTIC Science & Technology

    2006-01-01

    develop from wind blowing over the water. Waves can be very complicated, as demonstrated by Figure 1 (photograph courtesy of Dr. Fred Tracy) showing...University of Puerto Rico, Mayaguez, August 10 LTG Carl A. Strock (far left) walks through Joint Computing Facility as Dr. Jeffery P. Holland (second

  5. Computer and Video Games in Family Life: The Digital Divide as a Resource in Intergenerational Interactions

    ERIC Educational Resources Information Center

    Aarsand, Pal Andre

    2007-01-01

    In this ethnographic study of family life, intergenerational video and computer game activities were videotaped and analysed. Both children and adults invoked the notion of a digital divide, i.e. a generation gap between those who master and do not master digital technology. It is argued that the digital divide was exploited by the children to…

  6. Method and apparatus for offloading compute resources to a flash co-processing appliance

    DOEpatents

    Tzelnic, Percy; Faibish, Sorin; Gupta, Uday K.; Bent, John; Grider, Gary Alan; Chen, Hsing -bung

    2015-10-13

    Solid-State Drive (SSD) burst buffer nodes are interposed into a parallel supercomputing cluster to enable fast burst checkpoint of cluster memory to or from nearby interconnected solid-state storage with asynchronous migration between the burst buffer nodes and slower, more distant disk storage. The SSD nodes also perform tasks offloaded from the compute nodes or associated with the checkpoint data. For example, the data for the next job is preloaded into the SSD node and uploaded very quickly to the respective compute node just before the next job starts. During a job, the SSD nodes perform fast visualization and statistical analysis upon the checkpoint data. The SSD nodes can also perform data reduction and encryption of the checkpoint data.

  7. Resources and Costs for Microbial Sequence Analysis Evaluated Using Virtual Machines and Cloud Computing

    PubMed Central

    Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian

    2011-01-01

    Background The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S r

  8. Collaborative Human-Computer Decision Making for Command and Control Resource Allocation

    DTIC Science & Technology

    2007-08-01

    modifying other assignments at higher priority levels. In the experiment, six subjects participated in a cognitive walkthrough of the mission planning...students with extensive backgrounds in UAV operation and Human-Computer Interaction, two of them being USAF 2nd Lieutenants. A cognitive walkthrough ... evaluates how well a skilled user can perform novel or occasionally performed tasks. In this usability inspection method, ease of learning, ease of

  9. New resource for the computation of cartilage biphasic material properties with the interpolant response surface method.

    PubMed

    Keenan, Kathryn E; Kourtis, Lampros C; Besier, Thor F; Lindsey, Derek P; Gold, Garry E; Delp, Scott L; Beaupre, Gary S

    2009-08-01

    Cartilage material properties are important for understanding joint function and diseases, but can be challenging to obtain. Three biphasic material properties (aggregate modulus, Poisson's ratio and permeability) can be determined using an analytical or finite element model combined with optimisation to find the material properties values that best reproduce an experimental creep curve. The purpose of this study was to develop an easy-to-use resource to determine biphasic cartilage material properties. A Cartilage Interpolant Response Surface was generated from interpolation of finite element simulations of creep indentation tests. Creep indentation tests were performed on five sites across a tibial plateau. A least-squares residual search of the Cartilage Interpolant Response Surface resulted in a best-fit curve for each experimental condition with corresponding material properties. These sites provided a representative range of aggregate moduli (0.48-1.58 MPa), Poisson's ratio (0.00-0.05) and permeability (1.7 × 10^-15 to 5.4 × 10^-15 m^4/N s) values found in human cartilage. The resource is freely available from https://simtk.org/home/va-squish.
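
    The fitting idea described above can be illustrated with a small stand-in: precompute (simulated) creep curves on a grid of candidate material properties, then pick the combination whose curve has the smallest squared residual against an experimental curve. The closed-form curve, property grid, and noisy "experimental" data below are invented, and a discrete grid search stands in for the published interpolated response surface.

        import numpy as np

        t = np.linspace(0.0, 100.0, 50)

        def creep_curve(aggregate_modulus, permeability):
            # Stand-in closed-form creep response u(t; E, k); a real surface would come from FE runs.
            return (1.0 / aggregate_modulus) * (1.0 - np.exp(-permeability * t))

        E_grid = np.linspace(0.4, 1.6, 25)      # aggregate modulus grid, MPa
        k_grid = np.linspace(0.05, 0.5, 25)     # permeability grid, arbitrary units
        surface = {(E, k): creep_curve(E, k) for E in E_grid for k in k_grid}

        # Synthetic "experimental" curve from hidden true properties plus measurement noise.
        rng = np.random.default_rng(0)
        experimental = creep_curve(0.9, 0.2) + rng.normal(0.0, 0.005, t.size)

        # Least-squares residual search over the response surface.
        (E_best, k_best), _ = min(surface.items(), key=lambda kv: np.sum((kv[1] - experimental) ** 2))
        print(f"best-fit aggregate modulus ~ {E_best:.2f} MPa, permeability ~ {k_best:.3f}")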

  10. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics

    NASA Astrophysics Data System (ADS)

    John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D.

    2016-04-01

    We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems.

  11. Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A.; Alfultis, Michael

    1998-01-01

    Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.

  12. Computational resources to filter gravitational wave data with P-approximant templates

    NASA Astrophysics Data System (ADS)

    Porter, Edward K.

    2002-08-01

    The prior knowledge of the gravitational waveform from compact binary systems makes matched filtering an attractive detection strategy. This detection method involves the filtering of the detector output with a set of theoretical waveforms or templates. One of the most important factors in this strategy is knowing how many templates are needed in order to reduce the loss of possible signals. In this study, we calculate the number of templates and computational power needed for a one-step search for gravitational waves from inspiralling binary systems. We build on previous works by first expanding the post-Newtonian waveforms to 2.5-PN order and second, for the first time, calculating the number of templates needed when using P-approximant waveforms. The analysis is carried out for the four main first-generation interferometers, LIGO, GEO600, VIRGO and TAMA. As well as template number, we also calculate the computational cost of generating banks of templates for filtering GW data. We carry out the calculations for two initial conditions. In the first case we assume a minimum individual mass of 1 Msolar and in the second, we assume a minimum individual mass of 5 Msolar. We find that, in general, we need more P-approximant templates to carry out a search than if we use standard PN templates. This increase varies according to the order of PN-approximation, but can be as high as a factor of 3 and is explained by the smaller span of the P-approximant templates as we go to higher masses. The promising outcome is that for 2-PN templates, the increase is small and is outweighed by the known robustness of the 2-PN P-approximant templates.
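
    The matched-filtering strategy referred to above is simple to sketch: correlate the data stream against a unit-norm template and read off the peak correlation as a signal-to-noise estimate. The toy chirp template, injection amplitude, and white-noise model below are illustrative assumptions and bear no relation to the post-Newtonian or P-approximant waveforms analysed in the paper.

        import numpy as np

        def matched_filter_snr(data, template, noise_sigma):
            # With a unit-norm template and white noise of std sigma, the correlation noise
            # also has std sigma, so peak correlation / sigma is a simple SNR estimate.
            template = template / np.linalg.norm(template)
            correlation = np.correlate(data, template, mode="valid")
            return correlation.max() / noise_sigma

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 2048)
        template = np.sin(2 * np.pi * (20 * t + 40 * t ** 2)) * np.hanning(t.size)

        noise_sigma = 1.0
        data = rng.normal(0.0, noise_sigma, 8192)
        data[3000:3000 + t.size] += 0.3 * template   # inject a weak toy signal

        print(f"peak matched-filter SNR ~ {matched_filter_snr(data, template, noise_sigma):.1f}")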

  13. New resource for the computation of cartilage biphasic material properties with the interpolant response surface method

    PubMed Central

    Keenan, Kathryn E.; Kourtis, Lampros C.; Besier, Thor F.; Lindsey, Derek P.; Gold, Garry E.; Delp, Scott L.; Beaupre, Gary S.

    2009-01-01

    Cartilage material properties are important for understanding joint function and diseases, but can be challenging to obtain. Three biphasic material properties (aggregate modulus, Poisson's ratio and permeability) can be determined using an analytical or finite element model combined with optimisation to find the material properties values that best reproduce an experimental creep curve. The purpose of this study was to develop an easy-to-use resource to determine biphasic cartilage material properties. A Cartilage Interpolant Response Surface was generated from interpolation of finite element simulations of creep indentation tests. Creep indentation tests were performed on five sites across a tibial plateau. A least-squares residual search of the Cartilage Interpolant Response Surface resulted in a best-fit curve for each experimental condition with corresponding material properties. These sites provided a representative range of aggregate moduli (0.48-1.58 MPa), Poisson's ratio (0.00-0.05) and permeability (1.7 × 10^-15 to 5.4 × 10^-15 m^4/N s) values found in human cartilage. PMID:19675978

  14. Computational tools for exploring sequence databases as a resource for antimicrobial peptides.

    PubMed

    Porto, W F; Pires, A S; Franco, O L

    Data mining has been recognized by many researchers as a hot topic in different areas. In the post-genomic era, the growing number of sequences deposited in databases has been the reason why these databases have become a resource for novel biological information. In recent years, the identification of antimicrobial peptides (AMPs) in databases has gained attention. The identification of unannotated AMPs has shed some light on the distribution and evolution of AMPs and, in some cases, indicated suitable candidates for developing novel antimicrobial agents. The data mining process has been performed mainly by local alignments and/or regular expressions. Nevertheless, for the identification of distant homologous sequences, other techniques such as antimicrobial activity prediction and molecular modelling are required. In this context, this review addresses the tools and techniques, and also their limitations, for mining AMPs from databases. These methods could be helpful not only for the development of novel AMPs, but also for other kinds of proteins, at a higher level of structural genomics. Moreover, solving the problem of unannotated proteins could bring immeasurable benefits to society, especially in the case of AMPs, which could be helpful for developing novel antimicrobial agents and combating resistant bacteria.
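
    As a toy illustration of the regular-expression style of database mining mentioned above, the snippet below scans FASTA-formatted sequences for a loose cysteine-spacing pattern. The motif and the sequences are invented for illustration and are not a validated antimicrobial peptide signature.

        import re

        # Hypothetical motif: six cysteines separated by 2-6 arbitrary residues.
        MOTIF = re.compile(r"C.{2,6}C.{2,6}C.{2,6}C.{2,6}C.{2,6}C")

        def parse_fasta(text):
            # Yield (header, sequence) pairs from a FASTA-formatted string.
            header, chunks = None, []
            for raw in text.splitlines():
                line = raw.strip()
                if line.startswith(">"):
                    if header is not None:
                        yield header, "".join(chunks)
                    header, chunks = line[1:], []
                elif line:
                    chunks.append(line)
            if header is not None:
                yield header, "".join(chunks)

        fasta = (">seq1 hypothetical peptide\n"
                 "MKCLAAACRTICAAGCGGKCIRQCW\n"
                 ">seq2 hypothetical peptide\n"
                 "MAATKLLSGGAVAAGAAKKA\n")

        for name, sequence in parse_fasta(fasta):
            hit = MOTIF.search(sequence)
            print(f"{name}: {'candidate motif at position ' + str(hit.start()) if hit else 'no match'}")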

  15. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  16. Water resources climate change projections using supervised nonlinear and multivariate soft computing techniques

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Johnson, Fiona; Mehrotra, Raj; Sharma, Ashish

    2016-05-01

    Accurate projection of global warming on the probabilistic behavior of hydro-climate variables is one of the main challenges in climate change impact assessment studies. Due to the complexity of climate-associated processes, different sources of uncertainty influence the projected behavior of hydro-climate variables in regression-based statistical downscaling procedures. The current study presents a comprehensive methodology to improve the predictive power of the procedure to provide improved projections. It does this by minimizing the uncertainty sources arising from the high-dimensionality of atmospheric predictors, the complex and nonlinear relationships between hydro-climate predictands and atmospheric predictors, as well as the biases that exist in climate model simulations. To address the impact of the high dimensional feature spaces, a supervised nonlinear dimensionality reduction algorithm is presented that is able to capture the nonlinear variability among projectors through extracting a sequence of principal components that have maximal dependency with the target hydro-climate variables. Two soft-computing nonlinear machine-learning methods, Support Vector Regression (SVR) and Relevance Vector Machine (RVM), are engaged to capture the nonlinear relationships between predictand and atmospheric predictors. To correct the spatial and temporal biases over multiple time scales in the GCM predictands, the Multivariate Recursive Nesting Bias Correction (MRNBC) approach is used. The results demonstrate that this combined approach significantly improves the downscaling procedure in terms of precipitation projection.
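
    A bare-bones sketch in the spirit of the procedure described above (reduce a high-dimensional predictor field, then fit a nonlinear regression to a hydro-climate predictand) is shown below on synthetic data. The synthetic predictors, the plain PCA step, and the SVR hyperparameters are illustrative assumptions; they are not the supervised nonlinear dimensionality reduction, RVM, or MRNBC bias-correction components used in the study.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(42)

        # Synthetic "atmospheric predictors": 360 monthly samples of a 200-feature field
        # driven by 5 latent weather patterns, plus noise.
        latent = rng.normal(size=(360, 5))
        X = latent @ rng.normal(size=(5, 200)) + 0.1 * rng.normal(size=(360, 200))
        # Synthetic predictand (e.g., monthly precipitation) depending nonlinearly on the patterns.
        y = 2.0 * latent[:, 0] - 1.5 * latent[:, 1] ** 2 + rng.normal(scale=0.3, size=360)

        # Reduce the predictor space, then fit a nonlinear support vector regression.
        model = make_pipeline(StandardScaler(), PCA(n_components=10), SVR(kernel="rbf", C=10.0))
        model.fit(X[:300], y[:300])                 # calibrate on the first 25 "years"

        print(f"validation R^2 on held-out months: {model.score(X[300:], y[300:]):.2f}")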

  17. The EGI-Engage EPOS Competence Center - Interoperating heterogeneous AAI mechanisms and Orchestrating distributed computational resources

    NASA Astrophysics Data System (ADS)

    Bailo, Daniele; Scardaci, Diego; Spinuso, Alessandro; Sterzel, Mariusz; Schwichtenberg, Horst; Gemuend, Andre

    2016-04-01

    manage the use of the subsurface of the Earth. EPOS started its Implementation Phase in October 2015 and is now actively working in order to integrate multidisciplinary data into a single e-infrastructure. Multidisciplinary data are organized and governed by the Thematic Core Services (TCS) - European wide organizations and e-Infrastructure providing community specific data and data products - and are driven by various scientific communities encompassing a wide spectrum of Earth science disciplines. TCS data, data products and services will be integrated into the Integrated Core Services (ICS) system, that will ensure their interoperability and access to these services by the scientific community as well as other users within the society. The EPOS competence center (EPOS CC) goal is to tackle two of the main challenges that the ICS are going to face in the near future, by taking advantage of the technical solutions provided by EGI. In order to do this, we will present the two pilot use cases the EGI-EPOS CC is developing: 1) The AAI pilot, dealing with the provision of transparent and homogeneous access to the ICS infrastructure to users owning different kind of credentials (e.g. eduGain, OpenID Connect, X509 certificates etc.). Here the focus is on the mechanisms which allow the credential delegation. 2) The computational pilot, Improve the back-end services of an existing application in the field of Computational Seismology, developed in the context of the EC funded project VERCE. The application allows the processing and the comparison of data resulting from the simulation of seismic wave propagation following a real earthquake and real measurements recorded by seismographs. While the simulation data is produced directly by the users and stored in a Data Management System, the observations need to be pre-staged from institutional data-services, which are maintained by the community itself. This use case aims at exploiting the EGI FedCloud e-infrastructure for Data

  18. An Experimental and Computational Approach to Defining Structure/Reactivity Relationships for Intramolecular Addition Reactions to Bicyclic Epoxonium Ions

    PubMed Central

    Wan, Shuangyi; Gunaydin, Hakan; Houk, K. N.; Floreancig, Paul E.

    2008-01-01

    In this manuscript we report that oxidative cleavage reactions can be used to form oxocarbenium ions that react with pendent epoxides to form bicyclic epoxonium ions as an entry to the formation of cyclic oligoether compounds. Bicyclic epoxonium ion structure was shown to have a dramatic impact on the ratio of exo- to endo-cyclization reactions, with bicyclo[4.1.0] intermediates showing a strong preference for endo-closures and bicyclo[3.1.0] intermediates showing a preference for exo-closures. Computational studies on the structures and energetics of the transition states using the B3LYP/6-31G(d) method provide substantial insight into the origins of this selectivity. PMID:17547399

  19. The Need for Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  20. Optimisation of the usage of LHC and local computing resources in a multidisciplinary physics department hosting a WLCG Tier-2 centre

    NASA Astrophysics Data System (ADS)

    Barberis, Stefano; Carminati, Leonardo; Leveraro, Franco; Mazza, Simone Michele; Perini, Laura; Perlz, Francesco; Rebatto, David; Tura, Ruggero; Vaccarossa, Luca; Villaplana, Miguel

    2015-12-01

    We present the approach of the University of Milan Physics Department and the local unit of INFN to allow and encourage the sharing among different research areas of computing, storage and networking resources (the largest ones being those composing the Milan WLCG Tier-2 centre and tailored to the needs of the ATLAS experiment). Computing resources are organised as independent HTCondor pools, with a global master in charge of monitoring them and optimising their usage. The configuration has to provide satisfactory throughput for both serial and parallel (multicore, MPI) jobs. A combination of local, remote and cloud storage options are available. The experience of users from different research areas operating on this shared infrastructure is discussed. The promising direction of improving scientific computing throughput by federating access to distributed computing and storage also seems to fit very well with the objectives listed in the European Horizon 2020 framework for research and development.

  1. Computer Model of Biopolymer Crystal Growth and Aggregation by Addition of Macromolecular Units — a Comparative Study

    NASA Astrophysics Data System (ADS)

    Siódmiak, J.; Gadomski, A.

    We discuss the results of a computer simulation of biopolymer crystal growth and aggregation based on the 2D lattice Monte Carlo technique and the HP approximation of the biopolymers. As the modeled molecule (growth unit) we comparatively consider the previously studied non-mutant lysozyme protein, Protein Data Bank (PDB) ID: 193L, which forms tetragonal crystals under a certain set of thermodynamic-kinetic conditions, and an amyloidogenic variant of the lysozyme, PDB ID: 1LYY, which is known as a fibril-yielding and aggregation-prone agent. In our model, site-dependent attachment, detachment and migration processes are involved. The probabilities of growth unit motion, attachment and detachment to/from the crystal surface are assumed to be proportional to an orientational factor representing the anisotropy of the molecule. Working within a two-dimensional representation of the truly three-dimensional process, we also argue that the crystal grows in a spiral way, whereby one or more screw dislocations on the crystal surface give rise to a terrace. We interpret the obtained results in terms of known models of crystal growth and aggregation such as B-C-F (Burton-Cabrera-Frank) dislocation-driven growth and the M-S (Mullins-Sekerka) instability concept, with stochastic aspects supplementing the latter. We discuss the conditions under which crystals versus non-crystalline protein aggregates appear, and how the process depends upon differences in the chemical structure of the protein molecule, seen as the main building block of the elementary crystal cell.
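
    A minimal flavour of such a lattice Monte Carlo growth model can be sketched as follows: growth units attach to, or detach from, sites adjacent to an aggregate with probabilities weighted by the number of occupied neighbours. The lattice size, bond energy, and acceptance rule below are illustrative assumptions and do not reproduce the HP approximation or orientational factors of the paper.

        import math
        import random

        L, STEPS, BOND = 40, 20000, 1.2          # lattice size, MC steps, bond strength (kT units)
        lattice = [[0] * L for _ in range(L)]
        seed_site = (L // 2, L // 2)
        lattice[seed_site[0]][seed_site[1]] = 1  # fixed seed particle

        def occupied_neighbours(x, y):
            return sum(lattice[(x + dx) % L][(y + dy) % L]
                       for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)))

        random.seed(0)
        for _ in range(STEPS):
            x, y = random.randrange(L), random.randrange(L)
            n = occupied_neighbours(x, y)
            if lattice[x][y] == 0 and n > 0:
                # attachment: more occupied neighbours -> more likely to stick
                if random.random() < 1.0 - math.exp(-BOND * n):
                    lattice[x][y] = 1
            elif lattice[x][y] == 1 and (x, y) != seed_site:
                # detachment: weakly bound units (few neighbours) dissolve more easily
                if random.random() < math.exp(-BOND * n):
                    lattice[x][y] = 0

        print("aggregate size:", sum(map(sum, lattice)))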

  2. Effects of protonation and C5 methylation on the electrophilic addition reaction of cytosine: a computational study.

    PubMed

    Jin, Lingxia; Wang, Wenliang; Hu, Daodao; Min, Suotian

    2013-01-10

    The mechanism for the effects of protonation and C5 methylation on the electrophilic addition reaction of Cyt has been explored by means of CBS-QB3 and CBS-QB3/PCM methods. In the gas phase, three paths, two protonated paths (the N3- and O2-protonated paths B and C) as well as one neutral path (path A), were mainly discussed, and the calculated results indicate that the reaction of the HSO(3)(-) group with neutral Cyt is unlikely because of its high activation free energy, whereas the O2-protonated path (path C) is the most likely to occur. In the aqueous phase, path B is the most feasible mechanism, owing to the fact that its activation free energy decreases compared with the corresponding gas-phase path, whereas those of paths A and C increase. The most striking results are that the HSO(3)(-) group interacts directly with the C5═C6 bond rather than the N3═C4 bond, and that C5 methylation weakens the tendency toward addition: relative to Cyt, the methylated forms show lower values of the global electrophilicity index (i.e., weaker electrophilic power) and lower NPA charges on the C5 site of the intermediates. This is in agreement with the experimental observation that the rate of the 5-MeCyt reaction is approximately 2 orders of magnitude slower than that of Cyt in the presence of bisulfite. Apart from the cis and trans isomers, a rare third isomer in which both the CH(3) and SO(3) groups occupy axial positions has been found for the first time in the reactions of neutral and protonated 5-MeCyt with the HSO(3)(-) group. Furthermore, the transformation from the cis isomer to the third isomer can occur easily.

  3. Student use of computer tools designed to scaffold scientific problem-solving with hypermedia resources: A case study

    NASA Astrophysics Data System (ADS)

    Oliver, Kevin Matthew

    National science standards call for increasing student exposure to inquiry and real-world problem solving. Students can benefit from open-ended learning environments that stress the engagement of real problems and the development of thinking skills and processes. The Internet is an ideal resource for context-bound problems with its seemingly endless supply of resources. Problems may arise, however, since young students are cognitively ill-prepared to manage open-ended learning and may have difficulty processing hypermedia. Computer tools were used in a qualitative case study with 12 eighth graders to determine how such implements might support the process of solving open-ended problems. A preliminary study proposition suggested students would solve open-ended problems more appropriately if they used tools in a manner consistent with higher-order critical and creative thinking. Three research questions sought to identify: how students used tools, the nature of science learning in open-ended environments, and any personal or environmental barriers affecting problem solving. The findings were mixed. The participants did not typically use the tools and resources effectively. They successfully collected basic information, but infrequently organized, evaluated, generated, and justified their ideas. While the students understood how to use most tools procedurally, they lacked strategic understanding for why tool use was necessary. Students scored average to high on assessments of general content understanding, but developed artifacts suggesting their understanding of specific micro problems was naive and rife with misconceptions. Process understanding was also inconsistent, with some students describing basic problem solving processes, but most students unable to describe how tools could support open-ended inquiry. Barriers to effective problem solving were identified in the study. Personal barriers included naive epistemologies, while environmental barriers included a

  4. Impact of remote sensing upon the planning, management and development of water resources. Summary of computers and computer growth trends for hydrologic modeling and the input of ERTS image data processing load

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Loats, H. L., Jr.

    1975-01-01

    An analysis of current computer usage by major water resources users was made to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era. The analysis shows significant impact due to the utilization and processing of ERTS CCT data.

  5. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 4: Update to Books 1, 2, and 3.

    ERIC Educational Resources Information Center

    Borden, Peter A., Ed.; Vanderheiden, Gregg C., Ed.

    This update to the three-volume first edition of the "Rehab/Education ResourceBook Series" describes special software and products pertaining to communication, control, and computer access, designed specifically for the needs of disabled and elderly people. The 22 chapters cover: speech aids; pointing and typing aids; training and communication…

  6. Linear Equations and Rap Battles: How Students in a Wired Classroom Utilized the Computer as a Resource to Coordinate Personal and Mathematical Positional Identities in Hybrid Spaces

    ERIC Educational Resources Information Center

    Langer-Osuna, Jennifer

    2015-01-01

    This paper draws on the constructs of hybridity, figured worlds, and cultural capital to examine how a group of African-American students in a technology-driven, project-based algebra classroom utilized the computer as a resource to coordinate personal and mathematical positional identities during group work. Analyses of several vignettes of small…

  7. Technology in the Curriculum. Resource Guide 1988 Update. A Guide to the Instructional Use of Computers and Video in K-12.

    ERIC Educational Resources Information Center

    1988

    This resource guide lists 47 computer software and 29 instructional video programs recommended for use in grades K through 12 to help teachers achieve the learning objectives set forth by their school districts and the State of California. Programs are organized in six curriculum areas: foreign language, history-social science, language arts,…

  8. Computational Resources for GTL

    SciTech Connect

    Herbert M. Sauro

    2007-12-18

    This final report summarizes the work conducted under our three-year DOE GTL grant ($459,402). The work involved a number of areas, including standardization, the Systems Biology Workbench, visual editors, collaboration with other groups, and the development of new theory and algorithms. Our work has played a key part in helping to further develop SBML, the de facto standard for systems biology model exchange, and SBGN, the developing standard for visual representation of biochemical models. Our work has also made significant contributions to developing SBW, the Systems Biology Workbench, which is now very widely used in the community (roughly 30 downloads per day for the last three years, which equates to about 30,000 downloads in total). We have also used the DOE funding to collaborate extensively with nine different groups around the world. Finally, we have developed new methods to reduce model size which are now used by all the major simulation packages, including Matlab. All in all, we consider the last three years to be highly productive and influential in the systems biology community. The project resulted in 16 peer-reviewed publications.

  9. Resource-Efficient, Hierarchical Auto-Tuning of a Hybrid Lattice Boltzmann Computation on the Cray XT4

    SciTech Connect

    Computational Research Division, Lawrence Berkeley National Laboratory; NERSC, Lawrence Berkeley National Laboratory; Computer Science Department, University of California, Berkeley; Williams, Samuel; Carter, Jonathan; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2009-05-04

    We apply auto-tuning to a hybrid MPI-pthreads lattice Boltzmann computation running on the Cray XT4 at National Energy Research Scientific Computing Center (NERSC). Previous work showed that multicore-specific auto-tuning can improve the performance of lattice Boltzmann magnetohydrodynamics (LBMHD) by a factor of 4x when running on dual- and quad-core Opteron dual-socket SMPs. We extend these studies to the distributed memory arena via a hybrid MPI/pthreads implementation. In addition to conventional auto-tuning at the local SMP node, we tune at the message-passing level to determine the optimal aspect ratio as well as the correct balance between MPI tasks and threads per MPI task. Our study presents a detailed performance analysis when moving along an isocurve of constant hardware usage: fixed total memory, total cores, and total nodes. Overall, our work points to approaches for improving intra- and inter-node efficiency on large-scale multicore systems for demanding scientific applications.

  10. Comparison of computer-based and manual coal resource estimation methods for the Cache coal bed, Recluse Geologic Model Area, Wyoming

    USGS Publications Warehouse

    Schneider, Gary B.; Crowley, Sharon S.; Carey, Mary Alice

    1984-01-01

    Coal resources have been estimated, using both manual and computer methods, for the Cache coal bed in the Recluse Geologic Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2-minute quadrangles in Campbell County, Wyoming. Approximately 300 coal thickness measurements from drill-hole logs are distributed throughout the area. The Cache coal bed and associated strata are in the Paleocene Tongue River Member of the Fort Union Formation. The depth to the Cache coal bed ranges from 269 to 1,257 feet. The coal bed is as much as 31 feet thick but is absent in places. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources estimated by hand show the bed to contain 2,228 million short tons or about 2.6 percent more than the computer-calculated figure of 2,169 million short tons.

  11. Optimal distributed computing resources for mask synthesis and tape-out in production environment: an economic analysis

    NASA Astrophysics Data System (ADS)

    Cork, Chris; Chacko, Manoj; Levi, Shimon

    2005-11-01

    At deep-subwavelength process nodes, the use of aggressive optical proximity correction (OPC) and resolution enhancement techniques (RET) is fostering an exponential increase in output database size, causing the CPU time required for mask tape-out to increase significantly. This sets up challenging scenarios for integrated device manufacturers (IDMs) and foundries. For IDMs, this can impact the time-to-market for their products, where even a few days' delay could have a huge commercial impact and loss of market window opportunity. For foundries, a shorter turnaround time provides a competitive advantage in their demanding market: too slow could mean customers looking elsewhere for these services, while a fast turnaround may even command a higher price. With FAB turnaround for a CMOS process around 20-30 days, a delay of several days in mask tape-out would contribute a significant fraction to the total time to deliver prototypes. Unlike silicon processing, mask tape-out time can be decreased by applying a combination of extra computing resources and enhancements in the OPC tool, such as Fracture Friendly OPC (FFOPC). Mask tape-out groups are taking advantage of the ever-decreasing hardware cost and increasing power of commodity processors. The significant distributability inherent in some commercial mask synthesis software can be leveraged to address this critical business issue. Different implementations have different fractions of code that cannot be parallelized, and this affects the efficiency with which they scale, as described by Amdahl's law. Very few are efficient enough to allow the effective use of hundreds of processors, enabling run times to drop from days to only minutes. What follows is a cost-aware methodology to quantify the scalability of this class of software and thus act as a guide to estimating the optimal investment in terms of hardware and software licenses.
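
    Amdahl's law, cited in this abstract and the next, is easy to make concrete: the speedup on p processors is 1 / (s + (1 - s)/p), where s is the serial fraction of the code. The sketch below uses that relation to estimate where adding processors (and, hypothetically, per-processor licenses) stops paying off; the serial fraction, base runtime, and cost figures are invented for illustration.

        def amdahl_speedup(serial_fraction, processors):
            # Amdahl's law: speedup = 1 / (s + (1 - s) / p)
            return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

        # Hypothetical tape-out job: 48 hours on one CPU, 5% serial code, $2,000 per extra
        # CPU plus license, and $3,000 of business value per hour of turnaround saved.
        base_hours, serial_fraction = 48.0, 0.05
        cpu_cost, value_per_hour_saved = 2000.0, 3000.0

        best = None
        for p in (1, 2, 4, 8, 16, 32, 64, 128, 256):
            hours = base_hours / amdahl_speedup(serial_fraction, p)
            net = (base_hours - hours) * value_per_hour_saved - (p - 1) * cpu_cost
            print(f"{p:4d} CPUs: {hours:6.2f} h, net benefit ${net:12,.0f}")
            if best is None or net > best[1]:
                best = (p, net)
        print(f"most economical configuration under these assumptions: {best[0]} CPUs")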

  12. An economic analysis for optimal distributed computing resources for mask synthesis and tape-out in production environment

    NASA Astrophysics Data System (ADS)

    Cork, Chris; Lugg, Robert; Chacko, Manoj; Levi, Shimon

    2005-06-01

    With the exponential increase in output database size due to the aggressive optical proximity correction (OPC) and resolution enhancement techniques (RET) required for deep-subwavelength process nodes, the CPU time required for mask tape-out continues to increase significantly. For integrated device manufacturers (IDMs), this can impact the time-to-market for their products, where even a few days' delay could have a huge commercial impact and loss of market window opportunity. For foundries, a shorter turnaround time provides a competitive advantage in their demanding market: too slow could mean customers looking elsewhere for these services, while a fast turnaround may even command a higher price. With FAB turnaround of a mature, plain-vanilla CMOS process of around 20-30 days, a delay of several days in mask tape-out would contribute a significant fraction to the total time to deliver prototypes. Unlike silicon processing, mask tape-out time can be decreased by simply purchasing extra computing resources and software licenses. Mask tape-out groups are taking advantage of the ever-decreasing hardware cost and increasing power of commodity processors. The significant distributability inherent in some commercial mask synthesis software can be leveraged to address this critical business issue. Different implementations have different fractions of code that cannot be parallelized, and this affects the efficiency with which they scale, as described by Amdahl's law. Very few are efficient enough to allow the effective use of thousands of processors, enabling run times to drop from days to only minutes. What follows is a cost-aware methodology to quantify the scalability of this class of software and thus act as a guide to estimating the optimal investment in terms of hardware and software licenses.

  13. A computer software system for integration and analysis of grid-based remote sensing data with other natural resource data. Remote Sensing Project

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.

    1977-01-01

    A computer-based information system designed to assist in the integration of commonly available spatial data for regional planning and resource analysis is described. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single-factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. Soil, land cover/use, topographic, and geological maps were used as a data base to develop an eleven-map portfolio. The major themes of the portfolio are land cover/use, non-point water pollution, waste disposal, and ground water recharge.

  14. Assessment of potential additions to conventional oil and gas resources of the world (outside the United States) from reserve growth, 2012

    USGS Publications Warehouse

    Klett, Timothy R.; Cook, Troy A.; Charpentier, Ronald R.; Tennyson, Marilyn E.; Attanasi, E.D.; Freeman, Phil A.; Ryder, Robert T.; Gautier, Donald L.; Verma, Mahendra K.; Le, Phuong A.; Schenk, Christopher J.

    2012-01-01

    The U.S. Geological Survey estimated volumes of technically recoverable, conventional petroleum resources resulting from reserve growth for discovered fields outside the United States that have reported in-place oil and gas volumes of 500 million barrels of oil equivalent or greater. The mean volumes were estimated at 665 billion barrels of crude oil, 1,429 trillion cubic feet of natural gas, and 16 billion barrels of natural gas liquids. These volumes constitute a significant portion of the world's oil and gas resources.

  15. Assessment of potential additions to conventional oil and gas resources in discovered fields of the United States from reserve growth, 2012

    USGS Publications Warehouse

    ,

    2012-01-01

    The U.S. Geological Survey estimated volumes of technically recoverable, conventional petroleum resources that have the potential to be added to reserves from reserve growth in 70 discovered oil and gas accumulations of the United States, excluding Federal offshore areas. The mean estimated volumes are 32 billion barrels of crude oil, 291 trillion cubic feet of natural gas, and 10 billion barrels of natural gas liquids.

  16. Final report for "High performance computing for advanced national electric power grid modeling and integration of solar generation resources", LDRD Project No. 149016.

    SciTech Connect

    Reno, Matthew J.; Riehm, Andrew Charles; Hoekstra, Robert John; Munoz-Ramirez, Karina; Stamp, Jason Edwin; Phillips, Laurence R.; Adams, Brian M.; Russo, Thomas V.; Oldfield, Ron A.; McLendon, William Clarence, III; Nelson, Jeffrey Scott; Hansen, Clifford W.; Richardson, Bryan T.; Stein, Joshua S.; Schoenwald, David Alan; Wolfenbarger, Paul R.

    2011-02-01

    Design and operation of the electric power grid (EPG) relies heavily on computational models. High-fidelity, full-order models are used to study transient phenomena on only a small part of the network. Reduced-order dynamic and power flow models are used when analyses involving thousands of nodes are required, because of the computational demands of simulating large numbers of nodes. The level of complexity of the future EPG will dramatically increase due to large-scale deployment of variable renewable generation, active load and distributed generation resources, adaptive protection and control systems, and price-responsive demand. High-fidelity modeling of this future grid will require significant advances in coupled, multi-scale tools and their use on high performance computing (HPC) platforms. This LDRD report demonstrates SNL's capability to apply HPC resources to three tasks: (1) high-fidelity, large-scale modeling of power system dynamics; (2) statistical assessment of grid security via Monte Carlo simulations of cyber attacks; and (3) development of models to predict variability of solar resources at locations where few or no ground-based measurements are available.

  17. Aggregating Data for Computational Toxicology Applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System

    EPA Science Inventory

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for...

  18. Dataset of calcified plaque condition in the stenotic coronary artery lesion obtained using multidetector computed tomography to indicate the addition of rotational atherectomy during percutaneous coronary intervention.

    PubMed

    Akutsu, Yasushi; Hamazaki, Yuji; Sekimoto, Teruo; Kaneko, Kyouichi; Kodama, Yusuke; Li, Hui-Ling; Suyama, Jumpei; Gokan, Takehiko; Sakai, Koshiro; Kosaki, Ryota; Yokota, Hiroyuki; Tsujita, Hiroaki; Tsukamoto, Shigeto; Sakurai, Masayuki; Sambe, Takehiko; Oguchi, Katsuji; Uchida, Naoki; Kobayashi, Shinichi; Aoki, Atsushi; Kobayashi, Youichi

    2016-06-01

    Our data show the regional coronary artery calcium scores (lesion CAC) on multidetector computed tomography (MDCT) and the cross-sectional imaging on MDCT angiography (CTA) in the target lesion of patients with stable angina pectoris who were scheduled for percutaneous coronary intervention (PCI). CAC and CTA data were measured using a 128-slice scanner (Somatom Definition AS+; Siemens Medical Solutions, Forchheim, Germany) before PCI. CAC was measured in a non-contrast-enhanced scan and was quantified using the Calcium Score module of SYNAPSE VINCENT software (Fujifilm Co., Tokyo, Japan) and expressed in Agatston units. CTA was then performed with contrast-enhanced ECG gating to measure the severity of the calcified plaque condition. We present both CAC and CTA data as a benchmark for considering the addition of rotational atherectomy during PCI for severely calcified plaque lesions.

  19. Dataset of calcified plaque condition in the stenotic coronary artery lesion obtained using multidetector computed tomography to indicate the addition of rotational atherectomy during percutaneous coronary intervention

    PubMed Central

    Akutsu, Yasushi; Hamazaki, Yuji; Sekimoto, Teruo; Kaneko, Kyouichi; Kodama, Yusuke; Li, Hui-Ling; Suyama, Jumpei; Gokan, Takehiko; Sakai, Koshiro; Kosaki, Ryota; Yokota, Hiroyuki; Tsujita, Hiroaki; Tsukamoto, Shigeto; Sakurai, Masayuki; Sambe, Takehiko; Oguchi, Katsuji; Uchida, Naoki; Kobayashi, Shinichi; Aoki, Atsushi; Kobayashi, Youichi

    2016-01-01

    Our data show the regional coronary artery calcium scores (lesion CAC) on multidetector computed tomography (MDCT) and the cross-sectional imaging on MDCT angiography (CTA) in the target lesion of patients with stable angina pectoris who were scheduled for percutaneous coronary intervention (PCI). CAC and CTA data were measured using a 128-slice scanner (Somatom Definition AS+; Siemens Medical Solutions, Forchheim, Germany) before PCI. CAC was measured in a non-contrast-enhanced scan and was quantified using the Calcium Score module of SYNAPSE VINCENT software (Fujifilm Co., Tokyo, Japan) and expressed in Agatston units. CTA was then performed with contrast-enhanced ECG gating to measure the severity of the calcified plaque condition. We present both CAC and CTA data as a benchmark for considering the addition of rotational atherectomy during PCI for severely calcified plaque lesions. PMID:26977441

  20. Synthesis of Bridged Heterocycles via Sequential 1,4- and 1,2-Addition Reactions to α,β-Unsaturated N-Acyliminium Ions: Mechanistic and Computational Studies.

    PubMed

    Yazici, Arife; Wille, Uta; Pyne, Stephen G

    2016-02-19

    Novel tricyclic bridged heterocyclic systems can be readily prepared from sequential 1,4- and 1,2-addition reactions of allyl and 3-substituted allylsilanes to indolizidine and quinolizidine α,β-unsaturated N-acyliminium ions. These reactions involve a novel N-assisted, transannular 1,5-hydride shift. Such a mechanism was supported by examining the reaction of a dideuterated indolizidine, α,β-unsaturated N-acyliminium ion precursor, which provided specifically dideuterated tricyclic bridged heterocyclic products, and from computational studies. In contrast, the corresponding pyrrolo[1,2-a]azepine system did not provide the corresponding tricyclic bridged heterocyclic product and gave only a bis-allyl adduct, while more substituted versions gave novel furo[3,2-d]pyrrolo[1,2-a]azepine products. Such heterocyclic systems would be expected to be useful scaffolds for the preparation of libraries of novel compounds for new drug discovery programs.

  1. The addition of decision support into computerized physician order entry reduces red blood cell transfusion resource utilization in the intensive care unit.

    PubMed

    Fernández Pérez, Evans R; Winters, Jeffrey L; Gajic, Ognjen

    2007-07-01

    Computerized physician order entry (CPOE) has the potential for cost containment in critically ill patients through practice standardization and elimination of unnecessary interventions. A previous study demonstrated the beneficial short-term effect of adding decision support for red blood cell (RBC) transfusion into the hospital CPOE. We evaluated the effect of this intervention on RBC resource utilization during a two-year study period. From the institutional APACHE III database we identified 2,200 patients with anemia, but no active bleeding, on admission: 1,100 during the year before and 1,100 during the year after the intervention. The mean number of RBC transfusions per patient decreased from 1.5 +/- 1.9 units to 1.3 +/- 1.8 units after the intervention (P = 0.045). RBC transfusion cost decreased from $616,442 to $556,226 after the intervention. Hospital length of stay and adjusted hospital mortality did not differ before and after protocol implementation. In conclusion, the implementation of an evidence-based decision support system through a CPOE can decrease RBC transfusion resource utilization in critically ill patients.

  2. Development and Use of a Computer-Based Interactive Resource for Teaching and Learning Probability in Primary Classrooms

    ERIC Educational Resources Information Center

    Trigueros, Maria; Lozano, Maria Dolores; Lage, Ana Elisa

    2006-01-01

    "Enciclomedia" is a Mexican project for primary school teaching using computers in the classroom. Within this project, and following an enactivist theoretical perspective and methodology, we have designed a computer-based package called "Dados", which, together with teaching guides, is intended to support the teaching and…

  3. Resource Manual

    ERIC Educational Resources Information Center

    Human Development Institute, 2008

    2008-01-01

    This manual was designed primarily for use by individuals with developmental disabilities and related conditions. The main focus of this manual is to provide easy-to-read information concerning available resources, and to provide immediate contact information for the purpose of applying for resources and/or locating additional information. The…

  4. Noise Threshold and Resource Cost of Fault-Tolerant Quantum Computing with Majorana Fermions in Hybrid Systems.

    PubMed

    Li, Ying

    2016-09-16

    Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g., superconducting circuits or quantum dots, is studied in this Letter. Errors caused by topologically unprotected quantum systems need to be corrected with error-correction schemes, for instance, the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to a normal quantum computer unless the topological charge of Majorana fermions is insusceptible to noise. If errors changing the topological charge are rare, the fault-tolerance threshold is much higher than the threshold of a normal quantum computer and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about 1,000 normal qubits.

  5. Noise Threshold and Resource Cost of Fault-Tolerant Quantum Computing with Majorana Fermions in Hybrid Systems

    NASA Astrophysics Data System (ADS)

    Li, Ying

    2016-09-01

    Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g., superconducting circuits or quantum dots, is studied in this Letter. Errors caused by topologically unprotected quantum systems need to be corrected with error-correction schemes, for instance, the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to a normal quantum computer unless the topological charge of Majorana fermions is insusceptible to noise. If errors changing the topological charge are rare, the fault-tolerance threshold is much higher than the threshold of a normal quantum computer and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about 1,000 normal qubits.

  6. Management and Support of Commercial Off-the-Shelf (COTS) Computer Resources Used in Weapon System Applications

    DTIC Science & Technology

    1988-09-01

    relatively small portion of the commercial market, the Air Force does not have the purchasing leverage to keep these production lines open. Furthermore...advantages which can significantly reduce these risks (30:6): a. Current & Advancing Technology b. Market-Based Pricing c. Up-Front Product...computer and micro-electronics markets are rapidly decreasing hardware costs and increasing computing performance. Market-Based Pricing. Since the

  7. Proceedings of the Joint Logistics Commanders Joint Policy Coordinating Group on Computer Resource Management; Computer Software Management Software Workshop, 2-5 April 1979.

    DTIC Science & Technology

    1979-08-21

    ...organizations. This conclusion recognizes the current, sharply divided opinion relative to the advisability of a standard dealing with what until now...concept and relate the needs of the support organization in acquisition and development plans. For example, such items as the development of computer

  8. Proceedings of the Joint Logistics Commanders Joint Policy Coordinating Group on Computer Resource Management; Computer Software Management Subgroup. Second Software Workshop, 22-25 June 1981

    DTIC Science & Technology

    1981-11-01

    or array used in the computer program. Paragraph 3.3.2, Storage Allocation: This paragraph shall include a memory map which describes the...is for software. It was recognized that there "is a basic difference in the storage and modification of firmware and that the existing documents do...identifies each requirement specified in the System/Segment Specification and provides a mechanism for tracing the allocation of these requirements to

  9. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  10. Parallel high-performance grid computing: capabilities and opportunities of a novel demanding service and business class allowing highest resource efficiency.

    PubMed

    Kepper, Nick; Ettig, Ramona; Dickmann, Frank; Stehr, Rene; Grosveld, Frank G; Wedemann, Gero; Knoch, Tobias A

    2010-01-01

    Especially in the life-science and health-care sectors, the IT requirements are huge due to the large and complex systems to be analysed and simulated. Grid infrastructures play a rapidly increasing role here for research, diagnostics, and treatment, since they provide the necessary large-scale resources efficiently. Whereas grids were first used for huge number crunching of trivially parallelizable problems, parallel high-performance computing is increasingly required. Here, we show for the prime example of molecular dynamics simulations how the presence of large grid clusters including very fast network interconnects within grid infrastructures now allows efficient parallel high-performance grid computing, and thus combines the benefits of dedicated supercomputing centres and grid infrastructures. The demands of this service class are the highest, since the user group has very heterogeneous requirements: i) two to many thousands of CPUs, ii) different memory architectures, iii) huge storage capabilities, and iv) fast communication via network interconnects are all needed in different combinations and must be considered in a highly dedicated manner to reach the highest performance efficiency. Beyond this, advanced and dedicated i) interaction with users, ii) management of jobs, iii) accounting, and iv) billing not only combine classic with parallel high-performance grid usage, but more importantly can also increase the efficiency of IT resource providers. Consequently, the mere "yes-we-can" becomes a huge opportunity for, e.g., the life-science and health-care sectors as well as grid infrastructures, by reaching a higher level of resource efficiency.

  11. A computational study of the addition of ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone.

    PubMed

    Aniagyei, Albert; Tia, Richard; Adei, Evans

    2016-01-01

    The periselectivity and chemoselectivity of the addition of transition metal oxides of the type ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone have been explored at the M06 and B3LYP/LACVP* levels of theory. The activation barriers and reaction energies for the stepwise and concerted addition pathways involving multiple spin states have been computed. In the reaction of ReO3L (L = Cl(-), OCH3, CH3 and Cp) with ethenone, the concerted [2 + 2] addition of the metal oxide across the C=C and C=O double bonds to form either metalla-2-oxetane-3-one or metalla-2,4-dioxolane is kinetically favored over the formation of metalla-2,5-dioxolane-3-one from the direct [3 + 2] addition pathway. The trends in activation energies for the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are Cp < Cl(-) < OCH3 < CH3 and Cp < OCH3 < CH3 < Cl(-), and the trends in reaction energies are Cp < OCH3 < Cl(-) < CH3 and Cp < CH3 < OCH3 < Cl CH3. The concerted [3 + 2] addition of the metal oxide across the C=C double bond of the ethenone to form the metalla-2,5-dioxolane-3-one species is thermodynamically the most favored for the ligand L = Cp. The direct [2 + 2] addition pathways leading to the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are thermodynamically the most favored for the ligands L = OCH3 and Cl(-). The differences between the calculated [2 + 2] activation barriers for the addition of the metal oxide LReO3 across the C=C and C=O functionalities of ethenone are small, except for the cases of L = Cl(-) and OCH3. The rearrangements of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, even though feasible, are unfavorable due to the high activation energies of their rate-determining steps. For the rearrangement of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, the trend in activation barriers is found to follow the order OCH3 < Cl(-) < CH3 < Cp. The trends in the activation energies for

  12. Determination of Zinc-Based Additives in Lubricating Oils by Flow-Injection Analysis with Flame-AAS Detection Exploiting Injection with a Computer-Controlled Syringe.

    PubMed

    Pignalosa, Gustavo; Knochen, Moisés; Cabrera, Noel

    2005-01-01

    A flow-injection system is proposed for the determination of metal-based additives in lubricating oils. The system, operating under computer control, uses a motorised syringe for measuring and injecting the oil sample (200 μL) in a kerosene stream, where it is dispersed by means of a packed mixing reactor and carried to an atomic absorption spectrometer which is used as detector. Zinc was used as model analyte. Two different systems were evaluated, one for low concentrations (range 0-10 ppm) and the second capable of providing higher dilution rates for high concentrations (range 0.02%-0.2% w/w). The sampling frequency was about 30 samples/h. Calibration curves fitted a second-degree regression model (r(2) = 0.996). Commercial samples with high and low zinc levels were analysed by the proposed method and the results were compared with those obtained with the standard ASTM method. The t test for mean values showed no significant differences at the 95% confidence level. Precision (RSD%) was better than 5% (2% typical) for the high concentrations system. The carryover between successive injections was found to be negligible.
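
    Because the calibration curves are reported to fit a second-degree regression model (r(2) = 0.996), a short sketch of how such a calibration and its inverse use might look is given below. The standard concentrations and absorbance readings in the sketch are invented placeholders, not measurements from the paper, and the concentration() helper is a hypothetical convenience function.

      # Minimal sketch: second-degree calibration of an FIA-FAAS zinc signal, as the
      # abstract describes. Data points below are illustrative placeholders only.
      import numpy as np

      conc_ppm = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])      # standards (assumed)
      signal   = np.array([0.00, 0.11, 0.21, 0.30, 0.38, 0.45])  # absorbances (assumed)

      # Fit signal = a*c**2 + b*c + d, mirroring the reported second-degree model.
      coeffs = np.polyfit(conc_ppm, signal, deg=2)
      fit = np.poly1d(coeffs)

      # Coefficient of determination, the statistic quoted in the abstract.
      residuals = signal - fit(conc_ppm)
      r2 = 1.0 - residuals.var() / signal.var()

      def concentration(measured_signal):
          """Invert the calibration by solving the quadratic; keep the smallest
          non-negative root as the physically meaningful one."""
          a, b, d = coeffs
          roots = np.roots([a, b, d - measured_signal])
          real = roots[np.isreal(roots)].real
          return float(real[real >= 0].min())

      print("r^2 =", round(r2, 3), "| 0.25 abs ->", round(concentration(0.25), 2), "ppm Zn")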

  13. Determination of Zinc-Based Additives in Lubricating Oils by Flow-Injection Analysis with Flame-AAS Detection Exploiting Injection with a Computer-Controlled Syringe

    PubMed Central

    Pignalosa, Gustavo; Cabrera, Noel

    2005-01-01

    A flow-injection system is proposed for the determination of metal-based additives in lubricating oils. The system, operating under computer control uses a motorised syringe for measuring and injecting the oil sample (200 μL) in a kerosene stream, where it is dispersed by means of a packed mixing reactor and carried to an atomic absorption spectrometer which is used as detector. Zinc was used as model analyte. Two different systems were evaluated, one for low concentrations (range 0–10 ppm) and the second capable of providing higher dilution rates for high concentrations (range 0.02%–0.2% w/w). The sampling frequency was about 30 samples/h. Calibration curves fitted a second-degree regression model (r 2 = 0.996). Commercial samples with high and low zinc levels were analysed by the proposed method and the results were compared with those obtained with the standard ASTM method. The t test for mean values showed no significant differences at the 95% confidence level. Precision (RSD%) was better than 5% (2% typical) for the high concentrations system. The carryover between successive injections was found to be negligible. PMID:18924720

  14. Using Mosix for Wide-Area Computational Resources

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.

  15. Computer-aided analysis of Skylab scanner data for land use mapping, forestry and water resource applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1975-01-01

    Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.

  16. Linear equations and rap battles: how students in a wired classroom utilized the computer as a resource to coordinate personal and mathematical positional identities in hybrid spaces

    NASA Astrophysics Data System (ADS)

    Langer-Osuna, Jennifer

    2015-03-01

    This paper draws on the constructs of hybridity, figured worlds, and cultural capital to examine how a group of African-American students in a technology-driven, project-based algebra classroom utilized the computer as a resource to coordinate personal and mathematical positional identities during group work. Analyses of several vignettes of small group dynamics highlight how hybridity was established as the students engaged in multiple on-task and off-task computer-based activities, each of which drew on different lived experiences and forms of cultural capital. The paper ends with a discussion on how classrooms that make use of student-led collaborative work, and where students are afforded autonomy, have the potential to support the academic engagement of students from historically marginalized communities.

  17. Identification and Mapping of Soils, Vegetation, and Water Resources of Lynn County, Texas, by Computer Analysis of ERTS MSS Data

    NASA Technical Reports Server (NTRS)

    Baumgardner, M. F.; Kristof, S. J.; Henderson, J. A., Jr.

    1973-01-01

    Results of the analysis and interpretation of ERTS multispectral data obtained over Lynn County, Texas, are presented. The test site was chosen because it embodies a variety of problems associated with the development and management of agricultural resources in the Southern Great Plains. Lynn County is one of ten counties in a larger test site centering around Lubbock, Texas. The purpose of this study is to examine the utility of ERTS data in identifying, characterizing, and mapping soils, vegetation, and water resources in this semiarid region. Successful application of multispectral remote sensing and machine-processing techniques to arid and semiarid land-management problems will provide valuable new tools for the more than one-third of the world's lands lying in arid-semiarid regions.

  18. An In-House Prototype for the Implementation of Computer-Based Extensive Reading in a Limited-Resource School

    ERIC Educational Resources Information Center

    Mayora, Carlos A.; Nieves, Idami; Ojeda, Victor

    2014-01-01

    A variety of computer-based models of Extensive Reading have emerged in the last decade. Different Information and Communication Technologies online usually support these models. However, such innovations are not feasible in contexts where the digital breach limits the access to Internet. The purpose of this paper is to report a project in which…

  19. Winning the Popularity Contest: Researcher Preference When Selecting Resources for Civil Engineering, Computer Science, Mathematics and Physics Dissertations

    ERIC Educational Resources Information Center

    Dotson, Daniel S.; Franks, Tina P.

    2015-01-01

    More than 53,000 citations from 609 dissertations published at The Ohio State University between 1998-2012 representing four science disciplines--civil engineering, computer science, mathematics and physics--were examined to determine what, if any, preferences or trends exist. This case study seeks to identify whether or not researcher preferences…

  20. A Framework for Safe Composition of Heterogeneous SOA Services in a Pervasive Computing Environment with Resource Constraints

    ERIC Educational Resources Information Center

    Reyes Alamo, Jose M.

    2010-01-01

    The Service Oriented Computing (SOC) paradigm, defines services as software artifacts whose implementations are separated from their specifications. Application developers rely on services to simplify the design, reduce the development time and cost. Within the SOC paradigm, different Service Oriented Architectures (SOAs) have been developed.…

  1. Minimal-resource computer program for automatic generation of ocean wave ray or crest diagrams in shoaling waters

    NASA Technical Reports Server (NTRS)

    Poole, L. R.; Lecroy, S. R.; Morris, W. D.

    1977-01-01

    A computer program for studying linear ocean wave refraction is described. The program features random-access modular bathymetry data storage. Three bottom topography approximation techniques are available in the program which provide varying degrees of bathymetry data smoothing. Refraction diagrams are generated automatically and can be displayed graphically in three forms: Ray patterns with specified uniform deepwater ray density, ray patterns with controlled nearshore ray density, or crest patterns constructed by using a cubic polynomial to approximate crest segments between adjacent rays.

  2. Resource Destroying Maps.

    PubMed

    Liu, Zi-Wen; Hu, Xueyuan; Lloyd, Seth

    2017-02-10

    Resource theory is a widely applicable framework for analyzing the physical resources required for given tasks, such as computation, communication, and energy extraction. In this Letter, we propose a general scheme for analyzing resource theories based on resource destroying maps, which leave resource-free states unchanged but erase the resource stored in all other states. We introduce a group of general conditions that determine whether a quantum operation exhibits typical resource-free properties in relation to a given resource destroying map. Our theory reveals fundamental connections among basic elements of resource theories, in particular, free states, free operations, and resource measures. In particular, we define a class of simple resource measures that can be calculated without optimization, and that are monotone nonincreasing under operations that commute with the resource destroying map. We apply our theory to the resources of coherence and quantum correlations (e.g., discord), two prominent features of nonclassicality.
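
    The defining properties can be stated compactly. The fragment below is a schematic formalization consistent with the abstract rather than a verbatim statement from the Letter; the set F of free states, the map λ, the distance D, and the measure d_λ are notation introduced here, and the monotonicity line additionally assumes D is contractive under quantum operations.

      % Schematic formalization of a resource destroying map (assumption-level sketch).
      % \mathcal{F}: free states, \lambda: resource destroying map, \mathcal{E}: a quantum operation.
      \begin{align}
        &\lambda(\sigma) = \sigma \quad \text{for all } \sigma \in \mathcal{F}
            && \text{(free states are left unchanged)} \\
        &\lambda(\rho) \in \mathcal{F} \quad \text{for all states } \rho
            && \text{(the resource is erased everywhere else)} \\
        &d_\lambda(\rho) := D\bigl(\rho, \lambda(\rho)\bigr)
            && \text{(an optimization-free measure for a chosen distance } D\text{)} \\
        &d_\lambda\bigl(\mathcal{E}(\rho)\bigr) \le d_\lambda(\rho)
            \quad \text{if } \mathcal{E}\circ\lambda = \lambda\circ\mathcal{E}
            && \text{(monotone under operations commuting with } \lambda\text{)}
      \end{align}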

  3. Resource Destroying Maps

    NASA Astrophysics Data System (ADS)

    Liu, Zi-Wen; Hu, Xueyuan; Lloyd, Seth

    2017-02-01

    Resource theory is a widely applicable framework for analyzing the physical resources required for given tasks, such as computation, communication, and energy extraction. In this Letter, we propose a general scheme for analyzing resource theories based on resource destroying maps, which leave resource-free states unchanged but erase the resource stored in all other states. We introduce a group of general conditions that determine whether a quantum operation exhibits typical resource-free properties in relation to a given resource destroying map. Our theory reveals fundamental connections among basic elements of resource theories, in particular, free states, free operations, and resource measures. In particular, we define a class of simple resource measures that can be calculated without optimization, and that are monotone nonincreasing under operations that commute with the resource destroying map. We apply our theory to the resources of coherence and quantum correlations (e.g., discord), two prominent features of nonclassicality.

  4. Operating Systems Standards Working Group (OSSWG) Next Generation Computer Resources (NGCR) Program First Annual Report - October 1990

    DTIC Science & Technology

    1991-04-01

    Plaza Hotel 22-26 Jan 1990 Mobile, AL 9th meeting: NAVSWC, White Oak, MD 6-8 Mar 1990 10th meeting: SEI, Pittsburgh, PA 17-19 Apr 1990 11th meeting...meetings, and no cost to the Navy. Hotels are suitable, but a commitment from the hotel for meeting space may be difficult to get unless attendance can be...Workshop on Operating Systems For Mission Critical Computing, which will be held September 19-20 at the Marriott in Greenbelt, Maryland. Phil, Tricia

  5. Information technology resources assessment

    SciTech Connect

    Loken, S.C.

    1993-01-01

    The emphasis in Information Technology (IT) development has shifted from technology management to information management, and the tools of information management are increasingly at the disposal of end-users, people who deal with information. Moreover, the interactive capabilities of technologies such as hypertext, scientific visualization, virtual reality, video conferencing, and even database management systems have placed in the hands of users a significant amount of discretion over how these resources will be used. The emergence of high-performance networks, as well as network operating systems, improved interoperability, and platform independence of applications will eliminate technical barriers to the use of data, increase the power and range of resources that can be used cooperatively, and open up a wealth of possibilities for new applications. The very scope of these prospects for the immediate future is a problem for the IT planner or administrator. Technology procurement and implementation, integration of new technologies into the existing infrastructure, cost recovery and usage of networks and networked resources, training issues, and security concerns such as data protection and access to experiments are just some of the issues that need to be considered in the emerging IT environment. As managers we must use technology to improve competitiveness. When procuring new systems, we must take advantage of scalable resources. New resources such as distributed file systems can improve access to and efficiency of existing operating systems. In addition, we must assess opportunities to improve information worker productivity and information management through technologies such as distributed computational visualization and teleseminar applications.

  6. Additive Manufactured Product Integrity

    NASA Technical Reports Server (NTRS)

    Waller, Jess; Wells, Doug; James, Steve; Nichols, Charles

    2017-01-01

    NASA is providing key leadership in an international effort linking NASA and non-NASA resources to speed adoption of additive manufacturing (AM) to meet NASA's mission goals. Participants include industry, NASA's space partners, other government agencies, standards organizations and academia. Nondestructive Evaluation (NDE) is identified as a universal need for all aspects of additive manufacturing.

  7. Accessing opportunistic resources with Bosco

    NASA Astrophysics Data System (ADS)

    Weitzel, D.; Sfiligoi, I.; Bockelman, B.; Frey, J.; Wuerthwein, F.; Fraser, D.; Swanson, D.

    2014-06-01

    Bosco is a software project developed by the Open Science Grid to help scientists better utilize their on-campus computing resources. Instead of submitting jobs through a dedicated gatekeeper, as most remote submission mechanisms do, it uses the built-in SSH protocol to gain access to the cluster. By using a common access method, SSH, we are able to simplify the interaction with the cluster, making the submission process more user friendly. Additionally, it does not add any extra software to be installed on the cluster, making Bosco an attractive option for the cluster administrator. In this paper, we will describe Bosco, the personal supercomputing assistant, and how Bosco is used by researchers across the U.S. to manage their computing workflows. In addition, we will also talk about how researchers are using it, including a unique use of Bosco to submit CMS reconstruction jobs to an opportunistic XSEDE resource.

  8. Isothiourea-catalysed enantioselective pyrrolizine synthesis: synthetic and computational studies

    PubMed Central

    Stark, Daniel G.; Williamson, Patrick; Gayner, Emma R.; Musolino, Stefania F.; Kerr, Ryan W. F.; Taylor, James E.; Slawin, Alexandra M. Z.; O'Riordan, Timothy J. C.

    2016-01-01

    The catalytic enantioselective synthesis of a range of cis-pyrrolizine carboxylate derivatives with outstanding stereocontrol (14 examples, >95 : 5 dr, >98 : 2 er) through an isothiourea-catalyzed intramolecular Michael addition-lactonisation and ring-opening approach from the corresponding enone acid is reported. An optimised and straightforward three-step synthetic route to the enone acid starting materials from readily available pyrrole-2-carboxaldehydes is delineated, with benzotetramisole (5 mol%) proving the optimal catalyst for the enantioselective process. Ring-opening of the pyrrolizine dihydropyranone products with either MeOH or a range of amines leads to the desired products in excellent yield and enantioselectivity. Computation has been used to probe the factors leading to high stereocontrol, with the formation of the observed cis-stereoisomer predicted to be kinetically and thermodynamically favoured. PMID:27489030

  9. Creation of a full color geologic map by computer: A case history from the Port Moller project resource assessment, Alaska Peninsula: A section in Geologic studies in Alaska by the U.S. Geological Survey, 1988

    USGS Publications Warehouse

    Wilson, Frederic H.

    1989-01-01

    Graphics programs on computers can facilitate the compilation and production of geologic maps, including full color maps of publication quality. This paper describes the application of two different programs, GSMAP and ARC/INFO, to the production of a geologic map of the Port Moller and adjacent 1:250,000-scale quadrangles on the Alaska Peninsula. GSMAP was used at first because of easy digitizing on inexpensive computer hardware. Limitations in its editing capability led to transfer of the digital data to ARC/INFO, a Geographic Information System, which has better editing and also added data analysis capability. Although these improved capabilities are accompanied by increased complexity, the availability of ARC/INFO's data analysis capability provides unanticipated advantages. It allows digital map data to be processed as one of multiple data layers for mineral resource assessment. As a result of development of both software packages, it is now easier to apply both software packages to geologic map production. Both systems accelerate the drafting and revision of maps and enhance the compilation process. Additionally, ARC/INFO's analysis capability enhances the geologist's ability to develop answers to questions of interest that were previously difficult or impossible to obtain.

  10. Assessment Planning and Evaluation of Renewable Energy Resources: an Interactive Computer Assisted Procedure. [hydroelectricity, biomass, and windpower in the Pittsfield metropolitan region, Massachusetts

    NASA Technical Reports Server (NTRS)

    Aston, T. W.; Fabos, J. G.; Macdougall, E. B.

    1982-01-01

    Adaptation and derivation were used to develop a procedure for assessing the availability of renewable energy resources on the landscape while simultaneously accounting for the economic, legal, social, and environmental issues involved. Done in a step-by-step fashion, the procedure can be used interactively at computer terminals. Its application in determining the hydroelectricity, biomass, and windpower in a 40,000 acre study area of Western Massachusetts shows that: (1) three existing dam sites are physically capable of being retrofitted for hydropower; (2) each of three general areas has a mean annual windspeed exceeding 14 mph and is conducive to windpower; and (3) 20% of the total land area consists of prime agricultural biomass land while 30% of the area is prime forest biomass land.
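
    A toy version of the screening step described above is sketched below: landscape cells are filtered by the 14 mph mean-windspeed criterion and tallied by prime biomass class. The grid, the cell count, and the land-class proportions are random placeholders; only the 14 mph threshold comes from the abstract.

      # Minimal sketch of the resource-screening idea above; all data are synthetic
      # placeholders except the 14 mph windspeed criterion quoted in the abstract.
      import numpy as np

      rng = np.random.default_rng(0)
      n_cells = 40_000                                  # ~1 cell per acre (assumption)
      mean_windspeed_mph = rng.normal(12.0, 2.5, n_cells)
      land_class = rng.choice(["prime_ag", "prime_forest", "other"],
                              size=n_cells, p=[0.2, 0.3, 0.5])

      wind_suitable = mean_windspeed_mph > 14.0         # criterion from the abstract
      print("wind-suitable share of area:", round(wind_suitable.mean(), 3))
      print("prime agricultural biomass share:", round((land_class == "prime_ag").mean(), 3))
      print("prime forest biomass share:", round((land_class == "prime_forest").mean(), 3))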

  11. Distributed Computing.

    ERIC Educational Resources Information Center

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  12. Exploring Tradeoffs in Demand-side and Supply-side Management of Urban Water Resources using Agent-based Modeling and Evolutionary Computation

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Berglund, E. Z.

    2015-12-01

    Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS) framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn, influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger (1) increases in the volume of water pumped through inter-basin transfers from an external reservoir and (2) drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for the Arlington, Texas, water supply system to identify non-dominated strategies for a historic drought decade. Results demonstrate that the pumping costs associated with maximizing environmental reliability exceed the pumping costs associated with minimizing restrictions on consumer water use.
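
    A heavily simplified sketch of the coupling described above, with consumer agents that curtail outdoor use under drought stages and reservoir-storage thresholds that trigger inter-basin transfers, is given below. Agent behaviour, thresholds, inflows, and costs are all invented placeholders rather than values from the study; only the structure (storage-triggered transfers and drought stages, and cost versus inconvenience objectives) follows the abstract.

      # Minimal sketch of the CAS coupling described above: consumer agents adapt
      # outdoor use to drought stages; storage thresholds trigger transfers and
      # restrictions. All parameters are illustrative assumptions.
      import random

      class Consumer:
          def __init__(self, base_demand):
              self.base_demand = base_demand          # m^3/day, assumed
          def demand(self, drought_stage):
              outdoor_share = 0.3                     # assumed fraction of outdoor use
              restriction = {0: 0.0, 1: 0.5, 2: 1.0}[drought_stage]
              return self.base_demand * (1.0 - outdoor_share * restriction)

      def simulate(days=365, n_consumers=1000):
          consumers = [Consumer(random.uniform(0.8, 1.2)) for _ in range(n_consumers)]
          storage, capacity = 0.8, 1.0                         # normalized reservoir storage
          transfer_trigger, stage1, stage2 = 0.6, 0.5, 0.35    # decision variables (assumed)
          pumping_cost = restricted_days = 0
          for _ in range(days):
              stage = 2 if storage < stage2 else 1 if storage < stage1 else 0
              restricted_days += stage > 0                     # proxy for consumer inconvenience
              withdrawals = sum(c.demand(stage) for c in consumers) / 1e6  # normalized units
              inflow = random.uniform(0.0005, 0.0015)          # assumed natural inflow
              transfer = 0.002 if storage < transfer_trigger else 0.0
              pumping_cost += transfer * 50_000                # assumed $ per unit transferred
              storage = min(capacity, max(0.0, storage + inflow + transfer - withdrawals))
          return pumping_cost, restricted_days, storage

      if __name__ == "__main__":
          print(simulate())   # objectives: pumping cost, restricted days, final storage

    In the full methodology the decision variables would be tuned by the evolutionary multi-objective search rather than fixed as they are in this sketch.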

  13. The development and evaluation of a computer-based resource to assist pre-registration nursing students with their preparation for objective structured clinical examinations (OSCEs).

    PubMed

    Bloomfield, Jacqueline; Fordham-Clarke, Carol; Pegram, Anne; Cunningham, Brent

    2010-02-01

    This paper presents a narrative discussion of an innovative, computer-based resource developed, implemented and evaluated by a small project team at a school of nursing and midwifery in London. The interactive resource was designed to assist first and second year pre-registration nursing students with both their clinical skills revision and formative preparation for Objective Structured Clinical Examinations, and involved a small range of clinical skills. These included: skin assessment; hand hygiene; reading a drug prescription chart; weighing a baby; and assessment of an intravenous cannula site. The processes involved in the development of the tool are described and the key drivers informing its development are identified. Although a formal research approach was not adopted, a summary of feedback obtained from anonymous student evaluations is included. This provides important insights into the perceived usefulness of the tool and is discussed in light of the challenges and practicalities associated with the content development and technical issues. The paper concludes by identifying proposed future developments and wider applications of this innovative clinical skills education initiative within nursing and healthcare education.

  14. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  15. Segmentation of pulmonary nodules in computed tomography using a regression neural network approach and its application to the Lung Image Database Consortium and Image Database Resource Initiative dataset.

    PubMed

    Messay, Temesguen; Hardie, Russell C; Tuinstra, Timothy R

    2015-05-01

    We present new pulmonary nodule segmentation algorithms for computed tomography (CT). These include a fully-automated (FA) system, a semi-automated (SA) system, and a hybrid system. Like most traditional systems, the new FA system requires only a single user-supplied cue point. On the other hand, the SA system represents a new algorithm class requiring 8 user-supplied control points. This does increase the burden on the user, but we show that the resulting system is highly robust and can handle a variety of challenging cases. The proposed hybrid system starts with the FA system. If improved segmentation results are needed, the SA system is then deployed. The FA segmentation engine has 2 free parameters, and the SA system has 3. These parameters are adaptively determined for each nodule in a search process guided by a regression neural network (RNN). The RNN uses a number of features computed for each candidate segmentation. We train and test our systems using the new Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI) data. To the best of our knowledge, this is one of the first nodule-specific performance benchmarks using the new LIDC-IDRI dataset. We also compare the performance of the proposed methods with several previously reported results on the same data used by those other methods. Our results suggest that the proposed FA system improves upon the state-of-the-art, and the SA system offers a considerable boost over the FA system.
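
    The parameter-search idea, a regression model scoring candidate segmentations from their features so that the free parameters can be chosen to maximize the predicted score, can be sketched as follows. The segment() and features() functions, the training data, and the scikit-learn MLPRegressor standing in for the paper's regression neural network are all placeholder assumptions, not the authors' implementation.

      # Minimal sketch of the RNN-guided parameter search described above. Feature
      # extraction, the segmentation engine, and the training data are placeholders.
      import itertools
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      def segment(ct_volume, p1, p2):
          """Placeholder for a segmentation engine with two free parameters."""
          return ct_volume > (p1 + p2)        # stand-in thresholding (assumption)

      def features(mask):
          """Placeholder candidate-segmentation features (fraction segmented, spread, edge proxy)."""
          return [float(mask.mean()), float(mask.std()),
                  float(np.abs(np.diff(mask.astype(float), axis=0)).mean())]

      # Assumed training set: candidate-segmentation features and their overlap
      # scores against reference truth; random placeholders here.
      rng = np.random.default_rng(0)
      X_train, y_train = rng.random((200, 3)), rng.random(200)
      scorer = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                            random_state=0).fit(X_train, y_train)

      def best_parameters(ct_volume, grid=np.linspace(0.1, 0.9, 9)):
          """Grid-search the free parameters, keeping the candidate the scorer rates highest."""
          candidates = list(itertools.product(grid, grid))
          scores = [scorer.predict([features(segment(ct_volume, p1, p2))])[0]
                    for p1, p2 in candidates]
          return candidates[int(np.argmax(scores))]

      print(best_parameters(rng.random((32, 32, 32))))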

  16. The Genexpress IMAGE knowledge base of the human brain transcriptome: a prototype integrated resource for functional and computational genomics.

    PubMed

    Piétu, G; Mariage-Samson, R; Fayein, N A; Matingou, C; Eveno, E; Houlgatte, R; Decraene, C; Vandenbrouck, Y; Tahi, F; Devignes, M D; Wirkner, U; Ansorge, W; Cox, D; Nagase, T; Nomura, N; Auffray, C

    1999-02-01

    Expression profiles of 5058 human gene transcripts represented by an array of 7451 clones from the first IMAGE Consortium cDNA library from infant brain have been collected by semiquantitative hybridization of the array with complex probes derived by reverse transcription of mRNA from brain and five other human tissues. Twenty-one percent of the clones corresponded to transcripts that could be classified in general categories of low, moderate, or high abundance. These expression profiles were integrated with cDNA clone and sequence clustering and gene mapping information from an upgraded version of the Genexpress Index. For seven gene transcripts found to be transcribed preferentially or specifically in brain, the expression profiles were confirmed by Northern blot analyses of mRNA from eight adult and four fetal tissues, and 15 distinct regions of brain. In four instances, further documentation of the sites of expression was obtained by in situ hybridization of rat-brain tissue sections. A systematic effort was undertaken to further integrate available cytogenetic, genetic, physical, and genic map informations through radiation-hybrid mapping to provide a unique validated map location for each of these genes in relation to the disease map. The resulting Genexpress IMAGE Knowledge Base is illustrated by five examples presented in the printed article with additional data available on a dedicated Web site at the address http://idefix.upr420.vjf.cnrs.fr/EXPR++ +/ welcome.html.

  17. Computational studies on the interactions among redox couples, additives and TiO2: implications for dye-sensitized solar cells.

    PubMed

    Asaduzzaman, Abu Md; Schreckenbach, Georg

    2010-11-21

    One of the major and unique components of dye-sensitized solar cells (DSSC) is the iodide/triiodide redox couple. Periodic density-functional calculations have been carried out to study the interactions among three different components of the DSSC, i.e. the redox shuttle, the TiO(2) semiconductor surface, and nitrogen containing additives, with a focus on the implications for the performance of the DSSC. Iodide and bromide with alkali metal cations as counter ions are strongly adsorbed on the TiO(2) surface. Small additive molecules also strongly interact with TiO(2). Both interactions induce a negative shift of the Fermi energy of TiO(2). The negative shift of the Fermi energy is related to the performance of the cell by increasing the open voltage of the cell and retarding the injection dynamics (decreasing the short circuit current). Additive molecules, however, have relatively weaker interaction with iodide and triiodide.

  18. Computer Stimulation

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth

    1977-01-01

    Discusses computer simulation approach of Limits to Growth, in which interactions of five variables (population, pollution, resources, food per capita, and industrial output per capita) indicate status of the world. Reviews other books that predict future of the world. (CS)

  19. Characterization of pulmonary nodules on computer tomography (CT) scans: the effect of additive white noise on features selection and classification performance

    NASA Astrophysics Data System (ADS)

    Osicka, Teresa; Freedman, Matthew T.; Ahmed, Farid

    2007-03-01

    The goal of this project is to use computer analysis to classify small lung nodules, identified on CT, into likely benign and likely malignant categories. We compared discrete wavelet transform (DWT) based features and a modification of classical features used and reported by others. To determine the best combination of features for classification, several intensities of white noise were added to the original images to determine the effect of such noise on classification accuracy. Two different approaches were used to determine the effect of noise: in the first method the best features for classification of nodules on the original image were retained as noise was added. In the second approach, we recalculated the results to reselect the best classification features for each particular level of added noise. The CT images are from the National Lung Screening Trial (NLST) of the National Cancer Institute (NCI). For this study, nodules were extracted in window frames of three sizes. Malignant nodules were cytologically or histologically diagnosed, while benign nodules had two-year follow-up. A linear discriminant analysis with Fisher criterion (FLDA) approach was used for feature selection and classification, and a decision matrix for matched samples was used to compare classification accuracy. The initial feature mode revealed sensitivity to both the amount of noise and the size of the window frame. The recalculated feature mode proved more robust to noise, with no change in terms of classification accuracy. This indicates that the best features for computer classification of lung nodules will differ with noise, and therefore with exposure.
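
    The two evaluation modes described above, keeping the noise-free feature selection versus reselecting at each noise level, can be sketched compactly. The synthetic "benign" and "malignant" patches and the three placeholder features are assumptions; only the Fisher-criterion ranking and the added white Gaussian noise follow the abstract.

      # Minimal sketch of the two feature-selection modes described above: rank features
      # by the Fisher criterion on noise-free images, then either keep that selection or
      # reselect as white Gaussian noise is added. Images and features are synthetic
      # placeholders (assumptions), not NLST data.
      import numpy as np

      rng = np.random.default_rng(1)

      def extract_features(images):
          """Placeholder features per image: mean, spread, and a gradient-energy proxy."""
          flat = images.reshape(len(images), -1)
          return np.stack([flat.mean(1), flat.std(1),
                           np.abs(np.diff(flat, axis=1)).mean(1)], axis=1)

      def fisher_scores(X, y):
          """Fisher criterion per feature: between-class over within-class scatter."""
          m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
          v0, v1 = X[y == 0].var(0), X[y == 1].var(0)
          return (m0 - m1) ** 2 / (v0 + v1 + 1e-12)

      # Synthetic "benign" and "malignant" nodule patches (placeholders).
      images = np.concatenate([rng.normal(0.3, 0.05, (50, 16, 16)),
                               rng.normal(0.5, 0.10, (50, 16, 16))])
      labels = np.array([0] * 50 + [1] * 50)

      baseline_rank = np.argsort(fisher_scores(extract_features(images), labels))[::-1]

      for sigma in (0.0, 0.05, 0.1, 0.2):               # increasing white-noise levels
          noisy = images + rng.normal(0.0, sigma, images.shape)
          X = extract_features(noisy)
          kept = baseline_rank[:2]                                       # mode 1: keep noise-free choice
          reselected = np.argsort(fisher_scores(X, labels))[::-1][:2]    # mode 2: reselect per noise level
          print(sigma, kept, reselected)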

  20. Comprehensive cardiac assessment with multislice computed tomography: evaluation of left ventricular function and perfusion in addition to coronary anatomy in patients with previous myocardial infarction

    PubMed Central

    Henneman, M M; Schuijf, J D; Jukema, J W; Lamb, H J; de Roos, A; Dibbets, P; Stokkel, M P; van der Wall, E E; Bax, J J

    2006-01-01

    Objective To evaluate a comprehensive multislice computed tomography (MSCT) protocol in patients with previous infarction, including assessment of coronary artery stenoses, left ventricular (LV) function and perfusion. Patients and methods 16‐slice MSCT was performed in 21 patients with previous infarction; from the MSCT data, coronary artery stenoses, (regional and global) LV function and perfusion were assessed. Invasive coronary angiography and gated single‐photon emission computed tomography (SPECT) served as the reference standards for coronary artery stenoses and LV function/perfusion, respectively. Results 236 of 241 (98%) coronary artery segments were interpretable on MSCT. The sensitivity and specificity for detection of stenoses were 91% and 97%. Pearson's correlation showed excellent agreement for assessment of LV ejection fraction between MSCT and SPECT (49 (13)% v 53 (12)%, respectively, r = 0.85). Agreement for assessment of regional wall motion was excellent (92%, κ = 0.77). In 68 of 73 (93%) segments, MSCT correctly identified a perfusion defect as compared with SPECT, whereas the absence of perfusion defects was correctly detected in 277 of 284 (98%) segments. Conclusions MSCT permits accurate, non‐invasive assessment of coronary artery stenoses, LV function and perfusion in patients with previous infarction. All parameters can be assessed from a single dataset. PMID:16740917

  1. Food additives

    MedlinePlus

    ... or natural. Natural food additives include: Herbs or spices to add flavor to foods Vinegar for pickling ... Certain colors improve the appearance of foods. Many spices, as well as natural and man-made flavors, ...

  2. Computer modelling integrated with micro-CT and material testing provides additional insight to evaluate bone treatments: Application to a beta-glycan derived whey protein mice model.

    PubMed

    Sreenivasan, D; Tu, P T; Dickinson, M; Watson, M; Blais, A; Das, R; Cornish, J; Fernandez, J

    2016-01-01

    The primary aim of this study was to evaluate the influence of a whey protein diet on computationally predicted mechanical strength of murine bones in both trabecular and cortical regions of the femur. There was no significant influence on mechanical strength in cortical bone observed with increasing whey protein treatment, consistent with cortical tissue mineral density (TMD) and bone volume changes observed. Trabecular bone showed a significant decline in strength with increasing whey protein treatment when nanoindentation derived Young׳s moduli were used in the model. When microindentation, micro-CT phantom density or normalised Young׳s moduli were included in the model a non-significant decline in strength was exhibited. These results for trabecular bone were consistent with both trabecular bone mineral density (BMD) and micro-CT indices obtained independently. The secondary aim of this study was to characterise the influence of different sources of Young׳s moduli on computational prediction. This study aimed to quantify the predicted mechanical strength in 3D from these sources and evaluate if trends and conclusions remained consistent. For cortical bone, predicted mechanical strength behaviour was consistent across all sources of Young׳s moduli. There was no difference in treatment trend observed when Young׳s moduli were normalised. In contrast, trabecular strength due to whey protein treatment significantly reduced when material properties from nanoindentation were introduced. Other material property sources were not significant but emphasised the strength trend over normalised material properties. This shows strength at the trabecular level was attributed to both changes in bone architecture and material properties.

  3. Do We Really Need Additional Contrast-Enhanced Abdominal Computed Tomography for Differential Diagnosis in Triage of Middle-Aged Subjects With Suspected Biliary Pain?

    PubMed Central

    Hwang, In Kyeom; Lee, Yoon Suk; Kim, Jaihwan; Lee, Yoon Jin; Park, Ji Hoon; Hwang, Jin-Hyeok

    2015-01-01

    Abstract Enhanced computed tomography (CT) is widely used for evaluating acute biliary pain in the emergency department (ED). However, concern about radiation exposure from CT has also increased. We investigated the usefulness of pre-contrast CT for differential diagnosis in middle-aged subjects with suspected biliary pain. A total of 183 subjects, who visited the ED for suspected biliary pain from January 2011 to December 2012, were included. Retrospectively, pre-contrast phase and multiphase CT findings were reviewed and the detection rate of findings suggesting disease requiring significant treatment by noncontrast CT (NCCT) was compared with cases detected by multiphase CT. Approximately 70% of total subjects had a significant condition, including 1 case of gallbladder cancer and 126 (68.8%) cases requiring intervention (122 biliary stone-related diseases, 3 liver abscesses, and 1 liver hemangioma). The rate of overlooking malignancy without contrast enhancement was calculated to be 0% to 1.5%. Biliary stones and liver space-occupying lesions were found equally on NCCT and multiphase CT. Calculated probable rates of overlooking acute cholecystitis and biliary obstruction were maximally 6.8% and 4.2% respectively. Incidental significant finding unrelated with pain consisted of 1 case of adrenal incidentaloma, which was also observed in NCCT. NCCT might be sufficient to detect life-threatening or significant disease requiring early treatment in young adults with biliary pain. PMID:25700321

  4. Decreased length of stay after addition of healthcare provider in emergency department triage: a comparison between computer-simulated and real-world interventions

    PubMed Central

    Al-Roubaie, Abdul Rahim; Goldlust, Eric Jonathan

    2013-01-01

    Objective (1) To determine the effects of adding a provider in triage on average length of stay (LOS) and proportion of patients with >6 h LOS. (2) To assess the accuracy of computer simulation in predicting the magnitude of such effects on these metrics. Methods A group-level quasi-experimental trial comparing the St. Louis Veterans Affairs Medical Center emergency department (1) before intervention, (2) after institution of provider in triage, and discrete event simulation (DES) models of similar (3) ‘before’ and (4) ‘after’ conditions. The outcome measures were daily mean LOS and percentage of patients with LOS >6 h. Results The DES-modelled intervention predicted a decrease in the percentage of patients with LOS >6 h from 19.0% to 13.1%, and a drop in the daily mean LOS from 249 to 200 min (p<0.0001). Following (actual) intervention, the percentage of patients with LOS >6 h decreased from 19.9% to 14.3% (p<0.0001), with the daily mean LOS decreasing from 247 to 210 min (p<0.0001). Conclusion Physician and mid-level provider coverage at triage significantly reduced emergency department LOS in this setting. DES accurately predicted the magnitude of this effect. These results suggest further work in the generalisability of triage providers and in the utility of DES for predicting quantitative effects of process changes. PMID:22398851
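
    A discrete event simulation of this kind can be prototyped very compactly. The sketch below is a minimal single-queue model, not the authors' model: patients arrive at random, are seen by a pool of providers, and the mean length of stay is compared across staffing levels. The arrival rate, mean service time and provider counts are invented for illustration only.

      import heapq
      import random

      def simulate_ed(n_providers, n_patients=5000, arrival_rate=1 / 6.0,
                      service_rate=1 / 15.0, seed=1):
          """Minimal M/M/c-style queue; returns mean length of stay in minutes."""
          random.seed(seed)
          t, arrivals = 0.0, []
          for _ in range(n_patients):
              t += random.expovariate(arrival_rate)     # inter-arrival times
              arrivals.append(t)
          free_at = [0.0] * n_providers                  # when each provider frees up
          heapq.heapify(free_at)
          total_los = 0.0
          for arrive in arrivals:
              earliest = heapq.heappop(free_at)
              start = max(arrive, earliest)              # wait if everyone is busy
              finish = start + random.expovariate(service_rate)
              heapq.heappush(free_at, finish)
              total_los += finish - arrive
          return total_los / n_patients

      for c in (3, 4):   # e.g. the effect of adding one provider at triage
          print(f"{c} providers: mean LOS ~ {simulate_ed(c):.0f} min")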

  5. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  6. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  7. Regulatory use of computational toxicology tools and databases at the United States Food and Drug Administration's Office of Food Additive Safety.

    PubMed

    Arvidson, Kirk B; Chanderbhan, Ronald; Muldoon-Jacobs, Kristi; Mayer, Julie; Ogungbesan, Adejoke

    2010-07-01

    Over 10 years ago, the Office of Food Additive Safety (OFAS) in the FDA's Center for Food Safety and Applied Nutrition implemented the formal use of structure-activity relationship analysis and quantitative structure-activity relationship (QSAR) analysis in the premarket review of food-contact substances. More recently, OFAS has implemented the use of multiple QSAR software packages and has begun investigating the use of metabolism data and metabolism predictive models in our QSAR evaluations of food-contact substances. In this article, we provide an overview of the programs used in OFAS as well as a perspective on how to apply multiple QSAR tools in the review process of a new food-contact substance.
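
    A QSAR analysis of the kind described relates numerical molecular descriptors to an endpoint through a fitted model. The sketch below is a deliberately toy version, with invented descriptor values, an invented activity endpoint and a plain least-squares fit via numpy; it only illustrates the general shape of such a workflow and is not any tool used by OFAS.

      import numpy as np

      # Toy descriptor matrix: each row is a compound, columns are invented
      # descriptors (e.g. molecular weight, logP, polar surface area).
      X = np.array([
          [180.2, 1.2, 60.0],
          [250.3, 2.8, 45.0],
          [310.4, 3.5, 30.0],
          [150.1, 0.5, 75.0],
          [275.0, 2.1, 50.0],
      ])
      y = np.array([5.1, 6.3, 7.0, 4.6, 6.0])   # invented activity endpoint

      # Ordinary least squares with an intercept column.
      A = np.hstack([X, np.ones((X.shape[0], 1))])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)

      new_compound = np.array([220.0, 1.8, 55.0, 1.0])   # hypothetical query compound
      print("predicted activity:", new_compound @ coef)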

  8. Computer simulation for the growing probability of additional offspring with an advantageous reversal allele in the decoupled continuous-time mutation-selection model

    NASA Astrophysics Data System (ADS)

    Gill, Wonpyong

    2016-01-01

    This study calculated the growing probability of additional offspring with the advantageous reversal allele in an asymmetric sharply-peaked landscape using the decoupled continuous-time mutation-selection model. The growing probability was calculated for various population sizes, N, sequence lengths, L, selective advantages, s, fitness parameters, k and measuring parameters, C. The saturated growing probability in the stochastic region was approximately the effective selective advantage, s*, when C≫1/Ns* and s*≪1. The present study suggests that the growing probability in the stochastic region in the decoupled continuous-time mutation-selection model can be described using the theoretical formula for the growing probability in the Moran two-allele model. The selective advantage ratio, which represents the ratio of the effective selective advantage to the selective advantage, does not depend on the population size, selective advantage, measuring parameter and fitness parameter; instead the selective advantage ratio decreases with the increasing sequence length.
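
    For reference, the growing (fixation) probability of a single mutant with relative fitness r = 1 + s* in the textbook Moran two-allele model of population size N is commonly written, in LaTeX notation, as

      \rho = \frac{1 - r^{-1}}{1 - r^{-N}} \approx s^{*} \quad \text{for } N s^{*} \gg 1,\ s^{*} \ll 1,

    which is consistent with the saturated growing probability of approximately s* quoted above; the exact expression used in the paper may differ in detail.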

  9. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on the modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in the real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
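
    The underlying trade-off in computational offloading can be illustrated with a simple cost model: a component is worth sending to the cloud only when remote execution plus data transfer beats local execution in both time and energy. The sketch below uses invented device, cloud and network parameters and a naive decision rule; it is a schematic of the idea, not the framework proposed in the paper.

      # Minimal offloading decision sketch; every parameter value below is an
      # invented illustration, not a figure from the paper's framework.

      def local_cost(cycles, device_speed_hz, device_power_w):
          t = cycles / device_speed_hz
          return t, t * device_power_w                      # (time s, energy J)

      def offload_cost(cycles, data_bytes, cloud_speed_hz, bandwidth_bps, tx_power_w):
          t_tx = data_bytes * 8 / bandwidth_bps             # upload time
          t_cloud = cycles / cloud_speed_hz                 # remote execution time
          return t_tx + t_cloud, t_tx * tx_power_w          # device only spends energy transmitting

      cycles, data = 5e9, 2e6                               # 5 G cycles, 2 MB payload
      t_loc, e_loc = local_cost(cycles, 1.5e9, 2.0)
      t_off, e_off = offload_cost(cycles, data, 10e9, 20e6, 1.0)

      print(f"local:   {t_loc:.2f} s, {e_loc:.2f} J")
      print(f"offload: {t_off:.2f} s, {e_off:.2f} J")
      print("decision:", "offload" if t_off < t_loc and e_off < e_loc else "run locally")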

  10. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on the modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in the real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  11. Argonne's Laboratory computing center - 2007 annual report.

    SciTech Connect

    Bair, R.; Pieper, G. W.

    2008-05-28

    performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has framed a 'path forward' for additional computing resources.

  12. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    SciTech Connect

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers opportunity to scale-out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
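
    The blackbox characterisation described here needs only the workflow length (critical-path depth), width (parallel tasks) and data size. The sketch below shows one way such coarse inputs could feed a back-of-the-envelope runtime and cost comparison across platforms; every platform figure is an invented placeholder, not a number from the paper.

      # Back-of-the-envelope platform comparison driven only by blackbox
      # workflow traits (length, width, data size).  All figures are invented.

      platforms = {
          "desktop":       dict(cores=8,    queue_s=0,    usd_core_h=0.00, io_mb_s=100),
          "local_cluster": dict(cores=128,  queue_s=600,  usd_core_h=0.00, io_mb_s=300),
          "hpc_center":    dict(cores=4096, queue_s=3600, usd_core_h=0.00, io_mb_s=1000),
          "cloud":         dict(cores=512,  queue_s=60,   usd_core_h=0.05, io_mb_s=200),
      }

      def estimate(length, width, task_s, data_mb, p):
          used = min(width, p["cores"])
          waves = -(-width // used)                # ceiling division: batches per level
          runtime = p["queue_s"] + length * waves * task_s + data_mb / p["io_mb_s"]
          cost = p["usd_core_h"] * used * runtime / 3600
          return runtime, cost

      for name, p in platforms.items():
          rt, cost = estimate(length=10, width=200, task_s=120, data_mb=5000, p=p)
          print(f"{name:13s} runtime ~ {rt / 3600:.1f} h, cost ~ ${cost:.2f}")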

  13. Resource Economics

    NASA Astrophysics Data System (ADS)

    Conrad, Jon M.

    2000-01-01

    Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation and optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems; Conrad is co-author of a previous book for the Press on the subject for graduate students; and the approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues.
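
    Dynamic allocation problems of the type solved in the book with Excel's Solver can also be prototyped in a few lines of code. The sketch below does a crude grid search for the two-period extraction schedule of a nonrenewable resource that maximises the present value of net benefits; the benefit function and parameter values are illustrative, not a specific exercise from the text.

      # Two-period nonrenewable-resource allocation by brute-force grid search.
      # Net benefit B(q) = a*q - 0.5*b*q**2; all parameters are illustrative.

      a, b = 10.0, 1.0        # benefit-function parameters
      r = 0.10                # discount rate
      stock = 12.0            # total recoverable stock

      def benefit(q):
          return a * q - 0.5 * b * q * q

      best = None
      steps = 1000
      for i in range(steps + 1):
          q1 = stock * i / steps
          q2 = stock - q1
          pv = benefit(q1) + benefit(q2) / (1 + r)
          if best is None or pv > best[0]:
              best = (pv, q1, q2)

      pv, q1, q2 = best
      print(f"q1 = {q1:.2f}, q2 = {q2:.2f}, present value = {pv:.2f}")
      # At the optimum the equimarginal condition B'(q1) = B'(q2)/(1+r) holds,
      # i.e. marginal net benefit grows at the rate of discount, mirroring the
      # Hotelling-style logic developed in the text.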

  14. Scheduling Nonconsumable Resources

    NASA Technical Reports Server (NTRS)

    Porta, Harry J.

    1990-01-01

    The user's manual describes the computer program SWITCH, which schedules the use of resources by appliances that are switched on and off and that consume resources while they are on. It plans schedules according to predetermined goals and revises schedules when new goals are imposed. The program works by depth-first searching with strict chronological backtracking, evaluating alternatives as necessary and sometimes interacting with the user.
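
    Depth-first scheduling with strict chronological backtracking can be illustrated with a small toy: place each appliance in a start slot so that the total draw never exceeds a cap, undoing the most recent placement whenever a later one fails. The appliances, loads and capacity below are invented for illustration; this is a sketch of the search style, not the SWITCH program itself.

      # Toy depth-first scheduler with strict chronological backtracking.
      # Each appliance draws 'load' units for 'duration' consecutive slots and
      # the total draw per slot must stay within CAPACITY.  Numbers are invented.

      CAPACITY, HORIZON = 5, 10
      appliances = [("heater", 3, 4), ("washer", 2, 3), ("oven", 4, 2)]

      def fits(usage, start, load, duration):
          return all(usage[t] + load <= CAPACITY for t in range(start, start + duration))

      def place(i, usage, schedule):
          if i == len(appliances):
              return True                                   # every appliance placed
          name, load, duration = appliances[i]
          for start in range(HORIZON - duration + 1):       # try slots chronologically
              if fits(usage, start, load, duration):
                  for t in range(start, start + duration):
                      usage[t] += load
                  schedule[name] = start
                  if place(i + 1, usage, schedule):
                      return True
                  for t in range(start, start + duration):  # backtrack
                      usage[t] -= load
                  del schedule[name]
          return False

      schedule = {}
      if place(0, [0] * HORIZON, schedule):
          print(schedule)        # e.g. {'heater': 0, 'washer': 0, 'oven': 4}
      else:
          print("no feasible schedule")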

  15. Resources for Teaching Astronomy.

    ERIC Educational Resources Information Center

    Grafton, Teresa; Suggett, Martin

    1991-01-01

    Resources that are available for teachers presenting astronomy in the National Curriculum are listed. Included are societies and organizations, resource centers and places to visit, planetaria, telescopes and binoculars, planispheres, star charts, night sky diaries, equipment, audiovisual materials, computer software, books, and magazines. (KR)

  16. Algae Resources

    SciTech Connect

    2016-06-01

    Algae are highly efficient at producing biomass, and they can be found all over the planet. Many use sunlight and nutrients to create biomass, which contain key components—including lipids, proteins, and carbohydrates—that can be converted and upgraded to a variety of biofuels and products. A functional algal biofuels production system requires resources such as suitable land and climate, sustainable management of water resources, a supplemental carbon dioxide (CO2) supply, and other nutrients (e.g., nitrogen and phosphorus). Algae can be an attractive feedstock for many locations in the United States because their diversity allows for high-potential biomass yields in a variety of climates and environments. Depending on the strain, algae can grow by using fresh, saline, or brackish water from surface water sources, groundwater, or seawater. Additionally, they can grow in water from second-use sources such as treated industrial wastewater; municipal, agricultural, or aquaculture wastewater; or produced water generated from oil and gas drilling operations.

  17. The Computer Fraud and Abuse Act of 1986. Hearing before the Committee on the Judiciary, United States Senate, Ninety-Ninth Congress, Second Session on S.2281, a Bill To Amend Title 18, United States Code, To Provide Additional Penalties for Fraud and Related Activities in Connection with Access Devices and Computers, and for Other Purposes.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on the Judiciary.

    The proposed legislation--S. 2281--would amend federal laws to provide additional penalties for fraud and related activities in connection with access devices and computers. The complete text of the bill and proceedings of the hearing are included in this report. Statements and materials submitted by the following committee members and witnesses…

  18. Neutron Characterization for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Watkins, Thomas; Bilheux, Hassina; An, Ke; Payzant, Andrew; DeHoff, Ryan; Duty, Chad; Peter, William; Blue, Craig; Brice, Craig A.

    2013-01-01

    Oak Ridge National Laboratory (ORNL) is leveraging decades of experience in neutron characterization of advanced materials together with resources such as the Spallation Neutron Source (SNS) and the High Flux Isotope Reactor (HFIR) shown in Fig. 1 to solve challenging problems in additive manufacturing (AM). Additive manufacturing, or three-dimensional (3-D) printing, is a rapidly maturing technology wherein components are built by selectively adding feedstock material at locations specified by a computer model. The majority of these technologies use thermally driven phase change mechanisms to convert the feedstock into functioning material. As the molten material cools and solidifies, the component is subjected to significant thermal gradients, generating significant internal stresses throughout the part (Fig. 2). As layers are added, inherent residual stresses cause warping and distortions that lead to geometrical differences between the final part and the original computer generated design. This effect also limits geometries that can be fabricated using AM, such as thin-walled, high-aspect-ratio, and overhanging structures. Distortion may be minimized by intelligent toolpath planning or strategic placement of support structures, but these approaches are not well understood and often "Edisonian" in nature. Residual stresses can also impact component performance during operation. For example, in a thermally cycled environment such as a high-pressure turbine engine, residual stresses can cause components to distort unpredictably. Different thermal treatments on as-fabricated AM components have been used to minimize residual stress, but components still retain a nonhomogeneous stress state and/or demonstrate a relaxation-derived geometric distortion. Industry, federal laboratory, and university collaboration is needed to address these challenges and enable the U.S. to compete in the global market. Work is currently being conducted on AM technologies at the ORNL

  19. The Integrator Role in Academic Computing. AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Masland, Andrew T.

    The role of integrators, individuals who facilitate communication and understanding, in facilitating the use of computer technology was explored, based on five case studies. It is suggested that a major obstacle to successful academic computing is getting faculty and students to use available computer resources effectively. In addition to helping…

  20. Computer Graphics on the Complex Plane: An Introduction to Julia Sets and Fractals.

    ERIC Educational Resources Information Center

    Thomas, David A.

    1988-01-01

    Provides an introduction to the mathematics and art of the complex plane. Includes two computer programs written in GW BASIC to facilitate a reader's initial investigation. Identifies additional software resources. Stresses the appeal to future scientists, mathematicians, and artists. (CW)

  1. Distributive Computer Networking: Making It Work on a Regional Basis: Effective sharing through a network requires new management and resource distribution techniques.

    PubMed

    Cornew, R W; Morse, P M

    1975-08-15

    After 4 years of operation the NERComP network is now a self-supporting success. Some of the reasons for its success are that (i) the network started small and built up utilization; (ii) the members, through monthly trustee meetings, practiced "participatory management" from the outset; (iii) unlike some networks, NERComP appealed to individual academic and research users who were terminal-oriented and who controlled their own budgets; (iv) the compactness of the New England region made it an ideal laboratory for testing networking concepts; and (v) a dedicated staff was willing to work hard in the face of considerable uncertainty. While the major problems were "political, organizational and economic" (1) we have found that they can be solved if the network meets real needs. We have also found that it is difficult to proceed beyond a certain point without investing responsibility and authority in the networking organization. Conversely, there is a need to distribute some responsibilities such as marketing and user services back to the member institutions. By adopting a modest starting point and achieving limited goals the necessary trust and working relationships between institutions can be built. In our case the necessary planning has been facilitated by recognizing three distinct network functions: governance, user services, and technical operations. Separating out the three essential networking tasks and dealing with each individually through advisory committees, each with its own staff coordinator, has overcome a distracting tendency to address all issues at once. It has also provided an element of feedback between the end user and the supplier not usually present in networking activity. The success of NERComP demonstrates that a distributive-type network can work. Our experiences in New England-which, because of its numerous colleges and universities free from domination by any single institution, is a microcosm for academic computing in the United States

  2. Future computing needs for Fermilab

    SciTech Connect

    Not Available

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should

  3. Negotiating two electronic resources for nursing.

    PubMed

    Kirkpatrick, J; Kuipers, J

    1995-07-01

    The five primary knowledge resource databases available in the Virginia Henderson International Library are reviewed and other menu items are discussed including library services currently under development. This article guides nurse managers to access the library on their own computer. In addition, the opportunities and advantages of the new On-line Journal of Knowledge Synthesis for Nursing are related and requirements for accessing the journal are outlined.

  4. Computational aerodynamics and artificial intelligence

    NASA Technical Reports Server (NTRS)

    Mehta, U. B.; Kutler, P.

    1984-01-01

    The general principles of artificial intelligence are reviewed and speculations are made concerning how knowledge based systems can accelerate the process of acquiring new knowledge in aerodynamics, how computational fluid dynamics may use expert systems, and how expert systems may speed the design and development process. In addition, the anatomy of an idealized expert system called AERODYNAMICIST is discussed. Resource requirements for using artificial intelligence in computational fluid dynamics and aerodynamics are examined. Three main conclusions are presented. First, there are two related aspects of computational aerodynamics: reasoning and calculating. Second, a substantial portion of reasoning can be achieved with artificial intelligence. It offers the opportunity of using computers as reasoning machines to set the stage for efficient calculating. Third, expert systems are likely to be new assets of institutions involved in aeronautics for various tasks of computational aerodynamics.

  5. Polylactides in additive biomanufacturing.

    PubMed

    Poh, Patrina S P; Chhaya, Mohit P; Wunner, Felix M; De-Juan-Pardo, Elena M; Schilling, Arndt F; Schantz, Jan-Thorsten; van Griensven, Martijn; Hutmacher, Dietmar W

    2016-12-15

    New advanced manufacturing technologies under the alias of additive biomanufacturing allow the design and fabrication of a range of products from pre-operative models, cutting guides and medical devices to scaffolds. The process of printing in 3 dimensions of cells, extracellular matrix (ECM) and biomaterials (bioinks, powders, etc.) to generate in vitro and/or in vivo tissue analogue structures has been termed bioprinting. To further advance in additive biomanufacturing, there are many aspects that we can learn from the wider additive manufacturing (AM) industry, which has progressed tremendously since its introduction into the manufacturing sector. First, this review gives an overview of additive manufacturing and both industry and academia efforts in addressing specific challenges in the AM technologies to drive toward AM-enabled industrial revolution. After which, considerations of poly(lactides) as a biomaterial in additive biomanufacturing are discussed. Challenges in wider additive biomanufacturing field are discussed in terms of (a) biomaterials; (b) computer-aided design, engineering and manufacturing; (c) AM and additive biomanufacturing printers hardware; and (d) system integration. Finally, the outlook for additive biomanufacturing was discussed.

  6. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  7. The Merit Computer Network

    ERIC Educational Resources Information Center

    Aupperle, Eric M.; Davis, Donna L.

    1978-01-01

    The successful Merit Computer Network is examined in terms of both technology and operational management. The network is fully operational and has a significant and rapidly increasing usage, with three major institutions currently sharing computer resources. (Author/CMV)

  8. Herpes - resources

    MedlinePlus

    Genital herpes - resources; Resources - genital herpes ... The following organizations are good resources for information on genital herpes : March of Dimes -- www.marchofdimes.com/pregnancy/complications-herpes The American College of Obstetricians and Gynecologists -- ...

  9. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  10. Instructional Resources. Training and Adult Education Resources.

    ERIC Educational Resources Information Center

    Wurster, Susann L., Ed.

    1995-01-01

    Describes training and adult education resources available from ERIC: "Applications of an Adult Motivational Instructional Design Model"; "Visual and Digital Technologies for Adult Learning"; "Applications of Computer-Aided Instruction in Adult Education and Literacy"; and "The San Diego CWELL Project. Report of…

  11. An investigation on the effect of second-order additional thickness distributions to the upper surface of an NACA 64 sub 1-212 airfoil. [using flow equations and a CDC 7600 digital computer

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Merz, A. W.

    1975-01-01

    An investigation was conducted on a CDC 7600 digital computer to determine the effects of additional thickness distributions to the upper surface of an NACA 64 sub 1 - 212 airfoil. Additional thickness distributions employed were in the form of two second-order polynomial arcs which have a specified thickness at a given chordwise location. The forward arc disappears at the airfoil leading edge, the aft arc disappears at the airfoil trailing edge. At the juncture of the two arcs, continuity of slope is maintained. The effect of varying the maximum additional thickness and its chordwise location on airfoil lift coefficient, pitching moment, and pressure distribution was investigated. Results were obtained at a Mach number of 0.2 with an angle-of-attack of 6 degrees on the basic NACA 64 sub 1 - 212 airfoil, and all calculations employ the full potential flow equations for two dimensional flow. The relaxation method of Jameson was employed for solution of the potential flow equations.
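
    The additional thickness distribution described here, two second-order arcs meeting at the specified chordwise location, is easy to reproduce geometrically. The sketch below assumes the common slope at the juncture is zero, i.e. that the maximum added thickness occurs exactly where the arcs meet, and normalises the chord to 1; it reconstructs only the shape, not the potential-flow solution.

      # Two quadratic arcs adding a maximum thickness y_m at chordwise station x_m,
      # vanishing at the leading edge (x = 0) and trailing edge (x = 1), with a
      # common zero slope at the juncture.  The x_m and y_m values are illustrative.

      def additional_thickness(x, x_m=0.35, y_m=0.02):
          if x <= x_m:
              return y_m * (1.0 - (1.0 - x / x_m) ** 2)               # forward arc
          return y_m * (1.0 - ((x - x_m) / (1.0 - x_m)) ** 2)         # aft arc

      for i in range(11):
          x = i / 10
          print(f"x/c = {x:.1f}   dy/c = {additional_thickness(x):.4f}")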

  12. Theoretical effect of modifications to the upper surface of two NACA airfoils using smooth polynomial additional thickness distributions which emphasize leading edge profile and which vary quadratically at the trailing edge. [using flow equations and a CDC 7600 computer

    NASA Technical Reports Server (NTRS)

    Merz, A. W.; Hague, D. S.

    1975-01-01

    An investigation was conducted on a CDC 7600 digital computer to determine the effects of additional thickness distributions to the upper surface of the NACA 64-206 and 64 sub 1 - 212 airfoils. The additional thickness distribution had the form of a continuous mathematical function which disappears at both the leading edge and the trailing edge. The function behaves as a polynomial of order epsilon sub 1 at the leading edge, and a polynomial of order epsilon sub 2 at the trailing edge. Epsilon sub 2 is a constant and epsilon sub 1 is varied over a range of practical interest. The magnitude of the additional thickness, y, is a second input parameter, and the effect of varying epsilon sub 1 and y on the aerodynamic performance of the airfoil was investigated. Results were obtained at a Mach number of 0.2 with an angle-of-attack of 6 degrees on the basic airfoils, and all calculations employ the full potential flow equations for two dimensional flow. The relaxation method of Jameson was employed for solution of the potential flow equations.

  13. An investigation on the effect of second-order additional thickness distributions to the upper surface of an NACA 64-206 airfoil. [using flow equations and a CDC 7600 digital computer

    NASA Technical Reports Server (NTRS)

    Merz, A. W.; Hague, D. S.

    1975-01-01

    An investigation was conducted on a CDC 7600 digital computer to determine the effects of additional thickness distributions to the upper surface of an NACA 64-206 airfoil. Additional thickness distributions employed were in the form of two second-order polynomial arcs which have a specified thickness at a given chordwise location. The forward arc disappears at the airfoil leading edge, the aft arc disappears at the airfoil trailing edge. At the juncture of the two arcs, continuity of slope is maintained. The effect of varying the maximum additional thickness and its chordwise location on airfoil lift coefficient, pitching moment, and pressure distribution was investigated. Results were obtained at a Mach number of 0.2 with an angle-of-attack of 6 degrees on the basic NACA 64-206 airfoil, and all calculations employ the full potential flow equations for two dimensional flow. The relaxation method of Jameson was employed for solution of the potential flow equations.

  14. Cloud Computing

    DTIC Science & Technology

    2009-11-12

    Eucalyptus Systems provides an open-source application that can be used to implement a cloud computing environment on a datacenter, and is trying to establish an... Summary: Cloud Computing is in essence an economic model; it is a different way to acquire and manage IT resources. There are multiple cloud providers: ...edgeplatform.html; Amazon Elastic Compute Cloud (EC2): http://aws.amazon.com/ec2/; Amazon Simple Storage Solution (S3): http://aws.amazon.com/s3/; Eucalyptus

  15. Computers and small satellites: How FORTE is utilizing the WWW as a "paperless" information resource and the development of a unique resource management planning tool

    SciTech Connect

    Roussel-Dupre, D.; Carter, M.; Franz, R.

    1997-10-01

    The Fast-On-orbit Recording of Transient Events (FORTE) satellite is the second satellite to be developed and flown by Los Alamos National Laboratory and is scheduled to be launched August, 1997 by a Pegasus XL rocket. FORTE follows in the footsteps of the ALEXIS satellite in utilizing a very small operations crew for mission operations. Partially based upon the ALEXIS automation and World Wide Web (WWW) usage for data dissemination, FORTE began at an early stage of ground processing to use the web as a repository of information about all aspects of the satellite. Detailed descriptions of the various satellite and experiment components, cable diagrams, integration photographs as well as extensive test data have all been compiled into a single site as a means of archiving the data at a single location. In this manner, it is readily available during times of ground testing, ground station operation training as well as anomaly resolution. Small satellites usually require extensive effort to optimize operation under minimal resources. For the FORTE satellite, a unique planning tool has been developed over the past 2 years which balances the various resources of the satellite (power, memory, downlink, on board command buffer, etc.) to provide the maximum data acquisition. This paper will concentrate on a description of both the extensive web interface and the planning tool. 6 refs.

  16. Program Evaluation Resources

    EPA Pesticide Factsheets

    These resources list tools to help you conduct evaluations, find organizations outside of EPA that are useful to evaluators, and find additional guides on how to do evaluations from organizations outside of EPA.

  17. The Universal Protein Resource (UniProt)

    PubMed Central

    2008-01-01

    The Universal Protein Resource (UniProt) provides a stable, comprehensive, freely accessible, central resource on protein sequences and functional annotation. The UniProt Consortium is a collaboration between the European Bioinformatics Institute (EBI), the Protein Information Resource (PIR) and the Swiss Institute of Bioinformatics (SIB). The core activities include manual curation of protein sequences assisted by computational analysis, sequence archiving, development of a user-friendly UniProt website, and the provision of additional value-added information through cross-references to other databases. UniProt is comprised of four major components, each optimized for different uses: the UniProt Knowledgebase, the UniProt Reference Clusters, the UniProt Archive and the UniProt Metagenomic and Environmental Sequences database. UniProt is updated and distributed every three weeks, and can be accessed online for searches or download at http://www.uniprot.org. PMID:18045787

  18. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software

    PubMed Central

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2014-01-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely. PMID:23088273

  19. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software.

    PubMed

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2012-10-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely.

  20. Computer-Aided Drafting and Design Series. Educational Resources for the Machine Tool Industry, Course Syllabi, [and] Instructor's Handbook. Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 2-year vocational training program to prepare students for entry-level employment in computer-aided drafting and design in the machine tool industry. The program was developed through a modification of the DACUM (Developing a Curriculum)…

  1. Fermilab computing at the Intensity Frontier

    DOE PAGES

    Group, Craig; Fuess, S.; Gutsche, O.; ...

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. In addition, the experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  2. Fermilab computing at the Intensity Frontier

    SciTech Connect

    Group, Craig; Fuess, S.; Gutsche, O.; Kirby, M.; Kutschke, R.; Lyon, A.; Norman, A.; Perdue, G.; Sexton-Kennedy, E.

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. In addition, the experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  3. High-performance computing and communications

    SciTech Connect

    Stevens, R.

    1993-11-01

    This presentation has two parts. The first part discusses the US High-Performance Computing and Communications program -- its goals, funding, process, revisions, and research in high-performance computing systems, advanced software technology, and basic research and human resources. The second part of the presentation covers specific work conducted under this program at Argonne National Laboratory. Argonne's efforts focus on computational science research, software tool development, and evaluation of experimental computer architectures. In addition, the author describes collaborative activities at Argonne in high-performance computing, including an Argonne/IBM project to evaluate and test IBM's newest parallel computers and the Scalable I/O Initiative being spearheaded by the Concurrent Supercomputing Consortium.

  4. Reconciling resource utilization and resource selection functions

    USGS Publications Warehouse

    Hooten, Mevin B.; Hanks, Ephraim M.; Johnson, Devin S.; Alldredge, Mat W.

    2013-01-01

    Summary: 1. Analyses based on utilization distributions (UDs) have been ubiquitous in animal space use studies, largely because they are computationally straightforward and relatively easy to employ. Conventional applications of resource utilization functions (RUFs) suggest that estimates of UDs can be used as response variables in a regression involving spatial covariates of interest. 2. It has been claimed that contemporary implementations of RUFs can yield inference about resource selection, although to our knowledge, an explicit connection has not been described. 3. We explore the relationships between RUFs and resource selection functions from a heuristic and simulation perspective. We investigate several sources of potential bias in the estimation of resource selection coefficients using RUFs (e.g. the spatial covariance modelling that is often used in RUF analyses). 4. Our findings illustrate that RUFs can, in fact, serve as approximations to RSFs and are capable of providing inference about resource selection, but only with some modification and under specific circumstances. 5. Using real telemetry data as an example, we provide guidance on which methods for estimating resource selection may be more appropriate and in which situations. In general, if telemetry data are assumed to arise as a point process, then RSF methods may be preferable to RUFs; however, modified RUFs may provide less biased parameter estimates when the data are subject to location error.
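
    The distinction can be seen in miniature: an RSF is typically fit as a point-process (here, Poisson log-linear) model on the raw location counts, whereas an RUF first smooths the locations into a utilization surface and then regresses that surface on the covariate, which tends to attenuate the selection coefficient. The simulation below is only a schematic of that contrast under invented parameters; it is not the authors' analysis.

      import numpy as np

      rng = np.random.default_rng(0)

      # A 1-D strip of habitat cells with a single covariate z.
      n = 400
      z = rng.normal(size=n)
      beta_true = 1.0
      counts = rng.poisson(np.exp(0.2 + beta_true * z))     # simulated telemetry counts

      def poisson_slope(y, z):
          """Grid-search MLE for the slope of a Poisson log-linear model."""
          best = None
          for b in np.linspace(-2.0, 3.0, 501):
              a = np.log(y.sum() / np.exp(b * z).sum())     # profile intercept given b
              mu = np.exp(a + b * z)
              ll = np.sum(y * np.log(mu) - mu)
              if best is None or ll > best[0]:
                  best = (ll, b)
          return best[1]

      b_rsf = poisson_slope(counts, z)                      # RSF-style fit on raw counts

      # RUF-style fit: smooth counts into a utilization surface, then OLS on z.
      ud = np.convolve(counts, np.ones(15) / 15.0, mode="same")
      A = np.column_stack([np.ones(n), z])
      b_ruf = np.linalg.lstsq(A, np.log(ud + 1e-3), rcond=None)[0][1]

      print(f"true beta = {beta_true:.2f}, RSF-style = {b_rsf:.2f}, RUF-style = {b_ruf:.2f}")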

  5. Computation Directorate 2008 Annual Report

    SciTech Connect

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  6. ALS - resources

    MedlinePlus

    Resources - ALS ... The following organizations are good resources for information on amyotrophic lateral sclerosis : Muscular Dystrophy Association -- www.mda.org/disease/amyotrophic-lateral-sclerosis National Amyotrophic Lateral Sclerosis (ALS) ...

  7. Breastfeeding - resources

    MedlinePlus

    Resources - breastfeeding ... The following organizations are good resources for information on breastfeeding and breastfeeding problems : La Leche League International Inc. -- www.lalecheleague.org March of Dimes -- www.marchofdimes.com/ ...

  8. Alcoholism - resources

    MedlinePlus

    Resources - alcoholism ... The following organizations are good resources for information on alcoholism : Alcoholics Anonymous -- www.aa.org Al-Anon Family Groups www.al-anon.org National Institute on Alcohol ...

  9. Scoliosis - resources

    MedlinePlus

    Resources - scoliosis ... The following organizations are good resources for information on scoliosis : American Academy of Orthopedic Surgeons -- orthoinfo.aaos.org/topic.cfm?topic=A00626 National Institute of Arthritis and ...

  10. Migraine - resources

    MedlinePlus

    Resources - migraine ... The following organizations are good resources for information on migraines : American Migraine Foundation -- www.americanmigrainefoundation.org National Headache Foundation -- www.headaches.org National Institute of Neurological Disorders ...

  11. Incontinence - resources

    MedlinePlus

    Resources - incontinence ... The following organizations are good resources for information on incontinence. Fecal incontinence : The American Congress of Obstetricians and Gynecologists -- www.acog.org/~/media/for%20patients/faq139.ashx ...

  12. Blindness - resources

    MedlinePlus

    Resources - blindness ... The following organizations are good resources for information on blindness : American Foundation for the Blind -- www.afb.org Foundation Fighting Blindness -- www.blindness.org National Eye Institute -- ...

  13. Epilepsy - resources

    MedlinePlus

    Resources - epilepsy ... The following organizations are good resources for information on epilepsy : Epilepsy Foundation -- www.epilepsy.com National Institute of Neurological Disorders and Stroke -- www.ninds.nih.gov/disorders/ ...

  14. Infertility - resources

    MedlinePlus

    Resources - infertility ... The following organizations are good resources for information on infertility : Centers for Disease Control and Prevention -- www.cdc/gov/reproductivehealth/infertility March of Dimes -- www.marchofdimes.com/ ...

  15. Ostomy - resources

    MedlinePlus

    Resources - ostomy ... The following organizations are good resources for information on ostomies: American Society of Colon and Rectal Surgeons -- www.fascrs.org/patients/disease-condition/ostomy-expanded-version United ...

  16. Psoriasis - resources

    MedlinePlus

    Resources - psoriasis ... The following organizations are good resources for information about psoriasis : American Academy of Dermatology -- www.aad.org/skin-conditions/dermatology-a-to-z/psoriasis National Institute of ...

  17. Lupus - resources

    MedlinePlus

    Resources - lupus ... The following organizations are good resources for information on systemic lupus erythematosus : The Lupus Foundation of America -- www.lupus.org The National Institute of Arthritis and Musculoskeletal ...

  18. Scleroderma - resources

    MedlinePlus

    Resources - scleroderma ... The following organizations are good resources for information on scleroderma : American College of Rheumatology -- www.rheumatology.org/practice/clinical/patients/diseases_and_conditions/scleroderma.asp National Institute ...

  19. Alzheimer - resources

    MedlinePlus

    Resources - Alzheimer ... The following organizations are good resources for information on Alzheimer disease : Alzheimer's Association -- www.alz.org Alzheimer's Disease Education and Referral Center -- www.nia.nih.gov/alzheimers ...

  20. Cancer - resources

    MedlinePlus

    Resources - cancer ... The following organizations are good resources for information on cancer : American Cancer Society -- www.cancer.org Cancer Care -- www.cancercare.org Cancer.Net -- www.cancer.net/coping- ...

  1. SIDS - resources

    MedlinePlus

    Resources - SIDS ... The following organizations are good resources for information on SIDS (Sudden Infant Death Syndrome) : American SIDS Institute -- sids.org Centers for Disease Control and Prevention -- www.cdc. ...

  2. Janus/Ada Implementation of a Star Cluster Network of Personal Computers with Interface to an ETHERNET LAN Allowing Access to DDN Resources.

    DTIC Science & Technology

    1986-06-01

    Vendor and product listing (extracted fragment): ...Systems Corporation, St. Joseph, Michigan: Z-DOS Operating System, Z-100 Microcomputer; Microsoft Corporation, Bellevue, Washington: MS-DOS Operating System; Digital Research Incorporated, Pacific Grove, California: CP/M-86 Operating System, PL/I-86 Programming Language; Intel Corporation, Santa Clara, California: 86/12A Single Board Computer, MULTIBUS Architecture; Digital Equipment Corporation, Maynard, Massachusetts: VAX 11/780 Minicomputer, VMS Operating System

  3. Kentucky Department for Natural Resources and Environmental Protection permit application for air contaminant source: SRC-I demonstration plant, Newman, Kentucky. Supplement I. [Additional information on 38 items requested by KY/DNREP

    SciTech Connect

    Pearson, Jr., John F.

    1981-02-13

    In response to a letter from KY/DNREP, January 19, 1981, ICRC and DOE have prepared the enclosed supplement to the Kentucky Department for Natural Resources and Environmental Protection Permit Application for Air Contaminant Source for the SRC-I Demonstration Plant. Each of the 38 comments contained in the letter has been addressed in accordance with the discussions held in Frankfort on January 28, 1981, among representatives of KY/DNREP, EPA Region IV, US DOE, and ICRC. The questions raised involve requests for detailed information on the performance and reliability of proprietary equipment, back-up methods, monitoring plans for various pollutants, composition of wastes to flares, emissions estimates from particular operations, origin of baseline information, mathematical models, storage tanks, dusts, etc. (LTN)

  4. Distributed computing in bioinformatics.

    PubMed

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.
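
    On a single multi-core machine the divide-and-distribute idea described here is directly supported by Python's standard library, and the same pattern scales out to clusters with other tooling. The sketch below splits a toy sequence analysis (GC content of invented DNA fragments) across local worker processes; it is a schematic of workload division, not a production bioinformatics pipeline.

      from multiprocessing import Pool

      def gc_content(seq):
          """Fraction of G/C bases in one sequence (a toy unit of work)."""
          return (seq.count("G") + seq.count("C")) / len(seq)

      if __name__ == "__main__":
          # Invented fragments standing in for a large sequence collection.
          fragments = ["ATGCGC", "TTTTAA", "GGGCCC", "ATATAT", "CGCGAT"] * 1000

          with Pool(processes=4) as pool:          # divide the work across 4 workers
              results = pool.map(gc_content, fragments)

          print(f"analysed {len(results)} fragments, "
                f"mean GC = {sum(results) / len(results):.3f}")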

  5. Act for Better Child Care Services of 1988. Report from the Committee on Labor and Human Resources Together with Additional Views (To Accompany S. 1885). 100th Congress, 2nd Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on Labor and Human Resources.

    The Act for Better Child Care Services of 1988, additional views of members of the United States Senate, and related materials are reported. The purpose of the Act is to increase the availability, affordability, and quality of child care throughout the nation. The legislation provides direct financial assistance to low-income and working families…

  6. Concurrent negotiation and coordination for grid resource coallocation.

    PubMed

    Sim, Kwang Mong; Shi, Benyun

    2010-06-01

    Bolstering resource coallocation is essential for realizing the Grid vision, because computationally intensive applications often require multiple computing resources from different administrative domains. Given that resource providers and consumers may have different requirements, successfully obtaining commitments through concurrent negotiations with multiple resource providers to simultaneously access several resources is a very challenging task for consumers. The impetus of this paper is that it is one of the earliest works that consider a concurrent negotiation mechanism for Grid resource coallocation. The concurrent negotiation mechanism is designed for 1) managing (de)commitment of contracts through one-to-many negotiations and 2) coordination of multiple concurrent one-to-many negotiations between a consumer and multiple resource providers. The novel contributions of this paper are devising 1) a utility-oriented coordination (UOC) strategy, 2) three classes of commitment management strategies (CMSs) for concurrent negotiation, and 3) the negotiation protocols of consumers and providers. Implementing these ideas in a testbed, three series of experiments were carried out in a variety of settings to compare the following: 1) the CMSs in this paper with the work of others in a single one-to-many negotiation environment for one resource where decommitment is allowed for both provider and consumer agents; 2) the performance of the three classes of CMSs in different resource market types; and 3) the UOC strategy with the work of others [e.g., the patient coordination strategy (PCS)] for coordinating multiple concurrent negotiations. Empirical results show the following: 1) the UOC strategy achieved higher utility, faster negotiation speed, and higher success rates than PCS for different resource market types; and 2) the CMS in this paper achieved higher final utility than the CMS in other works. Additionally, the properties of the three classes of CMSs in

  7. Agriculture, forestry, range resources

    NASA Technical Reports Server (NTRS)

    Macdonald, R. B.

    1974-01-01

    The necessary elements to perform global inventories of agriculture, forestry, and range resources are being brought together through the use of satellites, sensors, computers, mathematics, and phenomenology. Results of ERTS-1 applications in these areas, as well as soil mapping, are described.

  8. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  9. BNL ATLAS Grid Computing

    ScienceCinema

    Michael Ernst

    2016-07-12

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide, Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  10. Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Insolia, Gerard

    This document contains course outlines in computer-aided manufacturing developed for a business-industry technology resource center for firms in eastern Pennsylvania by Northampton Community College. The four units of the course cover the following: (1) introduction to computer-assisted design (CAD)/computer-assisted manufacturing (CAM); (2) CAM…

  11. BNL ATLAS Grid Computing

    SciTech Connect

    Michael Ernst

    2008-10-02

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide, Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  12. SLA-aware Resource Management

    NASA Astrophysics Data System (ADS)

    Sun, Yih Leong; Perrott, Ron; Harmer, Terence J.; Cunningham, Christina; Wright, Peter; Kennedy, John; Edmonds, Andy; Bayon, Victor; Maza, Jacek; Berginc, Gregor; Hadalin, Primož

    The management of infrastructure resources in a large-scale environment such as Grid Computing is a challenging task and places significant demands on resource discovery, scheduling and the underlying communication channels. The fulfillment of the business goals and service quality in such an environment requires an infrastructure to cope with changes in demand and infrastructure performance. In this paper, we propose an abstract service-oriented framework for SLA-aware dynamic resource management. The framework provides self-management, self-configuration, and self-healing strategies in order to support autonomic and ambient service management. We study an SLA negotiation process at the infrastructure resource layer, live migration for resource re-provisioning, a multi-layer architecture framework to monitor infrastructure resources and a harmonized interface to access arbitrary sources of infrastructure resources based on SLA requirements. Resource usage will be optimized according to the provider's policies and SLA requirements.

  13. Mediagraphy: Print and Nonprint Resources.

    ERIC Educational Resources Information Center

    Educational Media and Technology Yearbook, 1998

    1998-01-01

    Lists educational media-related journals, books, ERIC documents, journal articles, and nonprint resources classified by Artificial Intelligence, Robotics, Electronic Performance Support Systems; Computer-Assisted Instruction; Distance Education; Educational Research; Educational Technology; Electronic Publishing; Information Science and…

  14. Uniform Additivity in Classical and Quantum Information

    NASA Astrophysics Data System (ADS)

    Cross, Andrew; Li, Ke; Smith, Graeme

    2017-01-01

    Information theory quantifies the optimal rates of resource interconversions, usually in terms of entropies. However, nonadditivity often makes evaluating entropic formulas intractable. In a few auspicious cases, additivity allows a full characterization of optimal rates. We study uniform additivity of formulas, which is easily evaluated and captures all known additive quantum formulas. Our complete characterization of uniform additivity exposes an intriguing new additive quantity and identifies a remarkable coincidence—the classical and quantum uniformly additive functions with one auxiliary variable are identical.

  15. Contextuality supplies the 'magic' for quantum computation.

    PubMed

    Howard, Mark; Wallman, Joel; Veitch, Victor; Emerson, Joseph

    2014-06-19

    Quantum computers promise dramatic advantages over their classical counterparts, but the source of the power in quantum computing has remained elusive. Here we prove a remarkable equivalence between the onset of contextuality and the possibility of universal quantum computation via 'magic state' distillation, which is the leading model for experimentally realizing a fault-tolerant quantum computer. This is a conceptually satisfying link, because contextuality, which precludes a simple 'hidden variable' model of quantum mechanics, provides one of the fundamental characterizations of uniquely quantum phenomena. Furthermore, this connection suggests a unifying paradigm for the resources of quantum information: the non-locality of quantum theory is a particular kind of contextuality, and non-locality is already known to be a critical resource for achieving advantages with quantum communication. In addition to clarifying these fundamental issues, this work advances the resource framework for quantum computation, which has a number of practical applications, such as characterizing the efficiency and trade-offs between distinct theoretical and experimental schemes for achieving robust quantum computation, and putting bounds on the overhead cost for the classical simulation of quantum algorithms.

  16. Teardrop bladder: additional considerations

    SciTech Connect

    Wechsler, R.J.; Brennan, R.E.

    1982-07-01

    Nine cases of teardrop bladder (TDB) seen at excretory urography are presented. In some of these patients, the iliopsoas muscles were at the upper limit of normal in size, and additional evaluation of the perivesical structures with computed tomography (CT) was necessary. CT demonstrated only hypertrophied muscles with or without perivesical fat. The psoas muscles and pelvic width were measured in 8 patients and compared with the measurements of a control group of males without TDB. Patients with TDB had large iliopsoas muscles and narrow pelves compared with the control group. The psoas muscle width/pelvic width ratio was significantly greater (p < 0.0005) in patients with TDB than in the control group, with values of 1.04 ± 0.05 and 0.82 ± 0.09, respectively. It is concluded that TDB is not an uncommon normal variant in black males. Both iliopsoas muscle hypertrophy and a narrow pelvis are factors that predispose a patient to TDB.

  17. Hydropower and Environmental Resource Assessment (HERA): a computational tool for the assessment of the hydropower potential of watersheds considering engineering and socio-environmental aspects.

    NASA Astrophysics Data System (ADS)

    Martins, T. M.; Kelman, R.; Metello, M.; Ciarlini, A.; Granville, A. C.; Hespanhol, P.; Castro, T. L.; Gottin, V. M.; Pereira, M. V. F.

    2015-12-01

    The hydroelectric potential of a river is proportional to its head and water flows. Selecting the best development alternative for greenfield watershed projects is a difficult task, since it must balance demands for infrastructure, especially in the developing world where a large potential remains unexplored, with environmental conservation. Discussions usually diverge into antagonistic views, as in recent projects in the Amazon forest, for example. This motivates the construction of a computational tool that will support a more qualified debate regarding development/conservation options. HERA provides the optimal head-division partition of a river considering technical, economic and environmental aspects. HERA has three main components: (i) GIS pre-processing of topographic and hydrologic data; (ii) automatic engineering and equipment design and budget estimation for candidate projects; (iii) translation of the head-division partition problem into a mathematical programming model. By integrating automatic calculation with geoprocessing tools, cloud computation and optimization techniques, HERA makes it possible for countless head-partition alternatives to be compared intrinsically - a great advantage with respect to traditional field surveys followed by engineering design methods. Based on optimization techniques, HERA determines which hydro plants should be built, including location, technical data (e.g., water head, reservoir area and volume), engineering design (dam, spillways, etc.), and costs. The results can be visualized in the HERA interface, exported to GIS software, Google Earth or CAD systems. HERA has a global scope of application since the main input data are a Digital Terrain Model and water inflows at gauging stations. The objective is to contribute to an increased rationality of decisions by presenting to the stakeholders a clear and quantitative view of the alternatives, their opportunities and threats.
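
    The head-division partition problem can be illustrated with a deliberately simplified sketch (my own toy formulation, not HERA's mathematical programming model): candidate sites along a river each offer a net benefit, and a dynamic program selects the subset that maximizes total benefit subject to a minimum spacing between plants. The site positions, benefits, and spacing rule below are all hypothetical.

        # Toy head-division partition: choose dam sites along a river to maximize
        # total benefit with a minimum spacing constraint (illustrative only; not HERA).
        def best_partition(sites, min_gap):
            """sites: list of (km, benefit) sorted by km; returns (value, chosen kms)."""
            n = len(sites)
            best = [(0.0, [])] * (n + 1)               # best[i]: optimum using sites[i:]
            for i in range(n - 1, -1, -1):
                km, benefit = sites[i]
                skip = best[i + 1]                     # do not build at sites[i]
                j = i + 1                              # next site far enough downstream
                while j < n and sites[j][0] - km < min_gap:
                    j += 1
                take = (best[j][0] + benefit, [km] + best[j][1])
                best[i] = max(skip, take)
            return best[0]

        sites = [(10, 4.0), (18, 2.5), (30, 6.0), (36, 3.0), (55, 5.5)]   # (km, benefit)
        print(best_partition(sites, min_gap=15))       # -> (15.5, [10, 30, 55])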

  18. Environmental Education CBRU Resource Manual.

    ERIC Educational Resources Information Center

    New Jersey State Dept. of Education, Trenton. Div. of Curriculum and Instruction.

    This environmental education resource manual deals with the computer based resource unit (CBRU) program available to New Jersey school teachers. The program is designed to unburden the teacher in planning classroom and out-of-school learning experiences based upon specific learner objectives for many grade levels and learning variables. Each…

  19. ERDC MSRC Resource. Fall 2007

    DTIC Science & Technology

    2007-01-01

    high performance computing (HPC). As you read through the articles in this edition of the Resource, you will see evidence of this Center’s continual... attempt to stay focused on this important goal. The Resource articles in this issue help to validate that we here at ERDC, through the DoD High... article), our folks affiliated with universities for technology transfer (see CaseMan article), and our visualization resources that aid DoD

  20. Controlling user access to electronic resources without password

    DOEpatents

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource-proximal environmental information. In at least some embodiments, the process further includes comparing a user-supplied biometric measure with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
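
    A minimal sketch of the decision logic summarized above, assuming simple placeholder representations: environmental readings are compared field by field against the stored association for the resource, and a biometric match is also required. The similarity test, tolerance, and data structures are illustrative choices, not the patented implementation.

        # Illustrative access decision only; not the patented implementation.
        def similar(user_env, resource_env, tol):
            """Readings are 'sufficiently similar' if every field agrees within tol."""
            return all(abs(user_env[k] - resource_env[k]) <= tol for k in resource_env)

        def grant_access(user_env, resource_env, user_biometric, enrolled_biometrics, tol=0.1):
            env_ok = similar(user_env, resource_env, tol)      # environmental comparison
            bio_ok = user_biometric in enrolled_biometrics     # placeholder biometric match
            return env_ok and bio_ok

        resource_env = {"wifi_rssi": -48.0, "ambient_noise": 0.32}   # stored association
        user_env = {"wifi_rssi": -47.95, "ambient_noise": 0.30}      # reported by the user
        print(grant_access(user_env, resource_env,
                           "hash-of-fingerprint", {"hash-of-fingerprint"}))   # -> True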

  1. Municipal Solid Waste Resources

    SciTech Connect

    2016-06-01

    Municipal solid waste (MSW) is a source of biomass material that can be utilized for bioenergy production with minimal additional inputs. MSW resources include mixed commercial and residential garbage such as yard trimmings, paper and paperboard, plastics, rubber, leather, textiles, and food wastes. Waste resources such as landfill gas, mill residues, and waste grease are already being utilized for cost-effective renewable energy generation. MSW for bioenergy also represents an opportunity to divert greater volumes of residential and commercial waste from landfills.

  2. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    PubMed

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.
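
    The create-compute-retrieve-delete lifecycle and pay-as-you-go billing described above can be mimicked with a toy model; the CloudEnvironment class, node count, and hourly rate below are hypothetical stand-ins for a real provider's provisioning interface.

        # Toy model of the create -> compute -> retrieve -> delete lifecycle;
        # the class, node count, and billing rate are hypothetical.
        import time

        class CloudEnvironment:
            def __init__(self, n_nodes, hourly_rate=0.25):
                self.n_nodes, self.hourly_rate = n_nodes, hourly_rate
                self.start = time.time()
                print(f"provisioned {n_nodes} virtual nodes")

            def run(self, tasks):
                # stand-in for dispatching sequence-analysis jobs to the nodes
                return [f"result-of-{t}" for t in tasks]

            def delete(self):
                hours = (time.time() - self.start) / 3600.0
                charge = hours * self.n_nodes * self.hourly_rate   # pay-as-you-go billing
                print(f"environment deleted; billed ${charge:.6f}")

        env = CloudEnvironment(n_nodes=8)
        results = env.run(["align-batch-1", "align-batch-2"])
        env.delete()
        print(results)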

  3. Computer-aided analysis of Skylab multispectral scanner data in mountainous terrain for land use, forestry, water resource, and geologic applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. One of the most significant results of this Skylab research involved the geometric correction and overlay of the Skylab multispectral scanner data with the LANDSAT multispectral scanner data, and also with a set of topographic data, including elevation, slope, and aspect. The Skylab S192 multispectral scanner data had distinct differences in noise level of the data in the various wavelength bands. Results of the temporal evaluation of the SL-2 and SL-3 photography were found to be particularly important for proper interpretation of the computer-aided analysis of the SL-2 and SL-3 multispectral scanner data. There was a quality problem involving the ringing effect introduced by digital filtering. The modified clustering technique was found valuable when working with multispectral scanner data involving many wavelength bands and covering large geographic areas. Analysis of the SL-2 scanner data involved classification of major cover types and also forest cover types. Comparison of the results obtained with Skylab MSS data and LANDSAT MSS data indicated that the improved spectral resolution of the Skylab scanner system enabled a higher classification accuracy to be obtained for forest cover types, although the classification performance for major cover types was not significantly different.

  4. Technology Changes and VA Mental Health Computer Applications

    PubMed Central

    Gottfredson, Douglas; Finkelstein, Allan; Christensen, Phillip; Weaver, Richard; Sells, Jeffery; Miller, David; Anderson, Ronald

    1993-01-01

    Since 1972, the Department of Veterans Affairs has had mental health computer applications for clinicians, managers, and researchers, operating on mainframe and minicomputers. The advent of personal computers has provided the opportunity to further enhance mental health automation. With Congressional support, VA's Mental Health and Behavioral Sciences Service placed microcomputers in 168 VA Medical Centers and developed additional mental health applications. Using wide area networking procedures, a National Mental Health Database System (NMHDS) was established. In addition, a Computer-assisted Assessment, Psychotherapy, Education, and Research system (CAPER), a Treatment Planner, a Suicide and Assaultive Behavior Monitoring system, and a national registry of VA mental health treatment resources were developed. Each of these computer applications is demonstrated and discussed.

  5. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
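
    The speed-versus-economy trade-off mentioned in the abstract can be made concrete with a back-of-the-envelope calculation; the instance counts, runtime, and hourly price below are hypothetical, and linear scaling is assumed for simplicity.

        # Hypothetical cost/speed comparison for a parallelized model-building job.
        hourly_price = 0.40      # assumed price per instance-hour
        serial_hours = 96.0      # assumed single-instance runtime

        for n_instances in (1, 8, 32):
            wall_hours = serial_hours / n_instances            # idealized linear scaling
            cost = n_instances * wall_hours * hourly_price
            print(f"{n_instances:3d} instances: {wall_hours:5.1f} h wall time, ${cost:.2f}")

    Under the idealized linear-scaling assumption the total bill stays constant while turnaround shrinks; in practice scaling losses make larger clusters cost more, which is the kind of speed-versus-economy choice the abstract refers to.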

  6. Extractable resources

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The use of information from space systems in the operation of extractive industries, particularly in exploration for mineral and fuel resources was reviewed. Conclusions and recommendations reported are based on the fundamental premise that survival of modern industrial society requires a continuing secure flow of resources for energy, construction and manufacturing, and for use as plant foods.

  7. 18 CFR 5.21 - Additional information.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Additional information. 5.21 Section 5.21 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT INTEGRATED LICENSE APPLICATION PROCESS §...

  8. 18 CFR 5.21 - Additional information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Additional information. 5.21 Section 5.21 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT INTEGRATED LICENSE APPLICATION PROCESS §...

  9. 18 CFR 5.21 - Additional information.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Additional information. 5.21 Section 5.21 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT INTEGRATED LICENSE APPLICATION PROCESS §...

  10. 18 CFR 5.21 - Additional information.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Additional information. 5.21 Section 5.21 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT INTEGRATED LICENSE APPLICATION PROCESS §...

  11. 18 CFR 5.21 - Additional information.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Additional information. 5.21 Section 5.21 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT INTEGRATED LICENSE APPLICATION PROCESS §...

  12. 30 CFR 256.53 - Additional bonds.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false Additional bonds. 256.53 Section 256.53 Mineral Resources BUREAU OF OCEAN ENERGY MANAGEMENT, REGULATION, AND ENFORCEMENT, DEPARTMENT OF THE INTERIOR... the Government and the estimated costs of lease abandonment and cleanup are less than the...

  13. 18 CFR 33.10 - Additional information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Additional information. 33.10 Section 33.10 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT APPLICATIONS UNDER FEDERAL POWER ACT SECTION...

  14. UT-CT: A National Resource for Applications of High-Resolution X-ray Computed Tomography in the Geological Sciences

    NASA Astrophysics Data System (ADS)

    Carlson, W. D.; Ketcham, R. A.; Rowe, T. B.

    2002-12-01

    An NSF-sponsored (EAR-IF) shared multi-user facility dedicated to research applications of high-resolution X-ray computed tomography (CT) in the geological sciences has been in operation since 1997 at the University of Texas at Austin. The centerpiece of the facility is an industrial CT scanner custom-designed for geological applications. Because the instrument can optimize trade-offs among penetrating ability, spatial resolution, density discrimination, imaging modes, and scan times, it can image a very broad range of geological specimens and materials, and thus offers significant advantages over medical scanners and desktop microtomographs. Two tungsten-target X-ray sources (200-kV microfocal and 420-kV) and three X-ray detectors (image-intensifier, high-sensitivity cadmium tungstate linear array, and high-resolution gadolinium-oxysulfide radiographic line scanner) can be used in various combinations to meet specific imaging goals. Further flexibility is provided by multiple imaging modes: second-generation (translate-rotate), third-generation (rotate-only; centered and variably offset), and cone-beam (volume CT). The instrument can accommodate specimens as small as about 1 mm on a side, and as large as 0.5 m in diameter and 1.5 m tall. Applications in petrology and structural geology include measuring crystal sizes and locations to identify mechanisms governing the kinetics of metamorphic reactions; visualizing relationships between alteration zones and abundant macrodiamonds in Siberian eclogites to elucidate metasomatic processes in the mantle; characterizing morphologies of spiral inclusion trails in garnet to test hypotheses of porphyroblast rotation during growth; measuring vesicle size distributions in basaltic flows for determination of elevation at the time of eruption to constrain timing and rates of continental uplift; analysis of the geometry, connectivity, and tortuosity of migmatite leucosomes to define the topology of melt flow paths, for numerical

  15. Graphic engine resource management

    NASA Astrophysics Data System (ADS)

    Bautin, Mikhail; Dwarakinath, Ashok; Chiueh, Tzi-cker

    2008-01-01

    Modern consumer-grade 3D graphic cards boast a computation/memory resource that can easily rival or even exceed that of standard desktop PCs. Although these cards are mainly designed for 3D gaming applications, their enormous computational power has attracted developers to port an increasing number of scientific computation programs to these cards, including matrix computation, collision detection, cryptography, database sorting, etc. As more and more applications run on 3D graphic cards, there is a need to allocate the computation/memory resource on these cards among the sharing applications more fairly and efficiently. In this paper, we describe the design, implementation and evaluation of a Graphic Processing Unit (GPU) scheduler based on Deficit Round Robin scheduling that successfully allocates to every process an equal share of the GPU time regardless of their demand. This scheduler, called GERM, estimates the execution time of each GPU command group based on dynamically collected statistics, and controls each process's GPU command production rate through its CPU scheduling priority. Measurements on the first GERM prototype show that this approach can keep the maximal GPU time consumption difference among concurrent GPU processes consistently below 5% for a variety of application mixes.
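
    Deficit Round Robin itself is easy to sketch; the toy scheduler below walks per-process queues of estimated command-group costs and gives each process an equal quantum per round, which is the fairness mechanism the abstract describes. The quantum, costs, and process names are hypothetical, and this is not the GERM implementation.

        # Minimal deficit-round-robin sketch over per-process GPU command queues
        # (illustrative of the scheduling idea only; not the GERM implementation).
        from collections import deque

        def drr_schedule(queues, quantum, rounds):
            """queues: {pid: deque of estimated command-group costs, in ms}."""
            deficit = {pid: 0.0 for pid in queues}
            served = {pid: 0.0 for pid in queues}
            for _ in range(rounds):
                for pid, q in queues.items():
                    if not q:
                        continue
                    deficit[pid] += quantum                 # equal share added each round
                    while q and q[0] <= deficit[pid]:
                        cost = q.popleft()
                        deficit[pid] -= cost
                        served[pid] += cost                 # GPU time actually consumed
            return served

        queues = {"A": deque([2.0] * 50), "B": deque([5.0] * 50)}   # hypothetical costs
        print(drr_schedule(queues, quantum=6.0, rounds=10))

    With these example numbers both processes end up with the same served GPU time even though their command groups differ in size, mirroring the equal-share behaviour reported for GERM.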

  16. Resources Management System

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Delta Data Systems, Inc. was originally formed by NASA and industry engineers to produce a line of products that evolved from ELAS, a NASA-developed computer program. The company has built on that experience, using ELAS as the basis for other remote sensing products. One of these is AGIS, a computer package for geographic and land information systems. AGIS simultaneously processes remotely sensed and map data. The software is designed to operate on a low cost microcomputer, putting resource management tools within reach of small operators.

  17. A resource management architecture for metacomputing systems.

    SciTech Connect

    Czajkowski, K.; Foster, I.; Karonis, N.; Kesselman, C.; Martin, S.; Smith, W.; Tuecke, S.

    1999-08-24

    Metacomputing systems are intended to support remote and/or concurrent use of geographically distributed computational resources. Resource management in such systems is complicated by five concerns that do not typically arise in other situations: site autonomy and heterogeneous substrates at the resources, and application requirements for policy extensibility, co-allocation, and online control. We describe a resource management architecture that addresses these concerns. This architecture distributes the resource management problem among distinct local manager, resource broker, and resource co-allocator components and defines an extensible resource specification language to exchange information about requirements. We describe how these techniques have been implemented in the context of the Globus metacomputing toolkit and used to implement a variety of different resource management strategies. We report on our experiences applying our techniques in a large testbed, GUSTO, incorporating 15 sites, 330 computers, and 3600 processors.
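
    As a toy illustration of the broker's role in this division of labour (not the Globus toolkit and not its actual resource specification language), the sketch below matches a requirement specification against what each site's local manager advertises; the sites, fields, and spec format are invented.

        # Toy resource broker: match a requirement spec against advertised resources.
        # Illustrative of the broker/co-allocator split only; not the Globus toolkit.
        sites = [
            {"name": "siteA", "cpus": 64,  "mem_gb": 256,  "policy": "batch"},
            {"name": "siteB", "cpus": 512, "mem_gb": 2048, "policy": "batch"},
            {"name": "siteC", "cpus": 16,  "mem_gb": 64,   "policy": "interactive"},
        ]

        def broker(spec, sites):
            """Return names of sites meeting every numeric minimum and the policy."""
            return [s["name"] for s in sites
                    if s["cpus"] >= spec["min_cpus"]
                    and s["mem_gb"] >= spec["min_mem_gb"]
                    and s["policy"] == spec["policy"]]

        spec = {"min_cpus": 128, "min_mem_gb": 512, "policy": "batch"}
        print(broker(spec, sites))    # -> ['siteB']; a co-allocator could combine sites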

  18. Hemophilia - resources

    MedlinePlus

    Resources - hemophilia ... The following organizations provide further information on hemophilia: Centers for Disease Control and Prevention -- www.cdc.gov/ncbddd/hemophilia/index.html National Heart, Lung, and Blood Institute -- www.nhlbi.nih.gov/ ...

  19. Arthritis - resources

    MedlinePlus

    Resources - arthritis ... The following organizations provide more information on arthritis: American Academy of Orthopaedic Surgeons -- orthoinfo.aaos.org/menus/arthritis.cfm Arthritis Foundation -- www.arthritis.org Centers for Disease Control and Prevention -- www. ...

  20. Diabetes - resources

    MedlinePlus

    Resources - diabetes ... The following sites provide further information on diabetes: American Diabetes Association -- www.diabetes.org Juvenile Diabetes Research Foundation International -- www.jdrf.org National Center for Chronic Disease Prevention and Health Promotion -- ...

  1. Depression - resources

    MedlinePlus

    Resources - depression ... Depression is a medical condition. If you think you may be depressed, see a health care provider. ... following organizations are good sources of information on depression: American Psychological Association -- www.apa.org/topics/depress/ ...

  2. Mars resources

    NASA Technical Reports Server (NTRS)

    Duke, Michael B.

    1986-01-01

    The most important resources of Mars for the early exploration phase will be oxygen and water, derived from the Martian atmosphere and regolith, which will be used for propellant and life support. Rocks and soils may be used in unprocessed form as shielding materials for habitats, or in minimally processed form to expand habitable living and work space. Resources necessary to conduct manufacturing and agricultural projects are potentially available, but will await advanced stages of Mars habitation before they are utilized.

  3. Forest Resources

    SciTech Connect

    2016-06-01

    Forest biomass is an abundant biomass feedstock that complements the conventional forest use of wood for paper and wood materials. It may be utilized for bioenergy production, such as heat and electricity, as well as for biofuels and a variety of bioproducts, such as industrial chemicals, textiles, and other renewable materials. The resources within the 2016 Billion-Ton Report include primary forest resources, which are taken directly from timberland-only forests, removed from the land, and taken to the roadside.

  4. Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing

    PubMed Central

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640

  5. Cloud computing basics for librarians.

    PubMed

    Hoy, Matthew B

    2012-01-01

    "Cloud computing" is the name for the recent trend of moving software and computing resources to an online, shared-service model. This article briefly defines cloud computing, discusses different models, explores the advantages and disadvantages, and describes some of the ways cloud computing can be used in libraries. Examples of cloud services are included at the end of the article.

  6. Resources for flow and image cytometry

    SciTech Connect

    Cassidy, M.

    1990-01-01

    This paper describes resources available to the flow and image cytometry community. I have been asked to limit the discussion to resources available in the United States, so reference to resources exclusively available in Japan, Europe, or Australia is not included. It is not the intention of this paper to include each and every resource available, but rather to describe the types available and give some examples. Included in this manuscript are listings of some of the examples of resources which readers may find useful. Addresses of commercial companies are not included in the interest of space. Most of the examples listed advertise on a regular basis in journals publishing in cytometry fields. The resources to be described are divided into five categories: instrument resources, computer and software resources, standards, physical or "user" resources, and instructional resources. Each of these resources will be discussed separately. 4 tabs.

  7. COMPUTATIONAL TOXICOLOGY-WHERE IS THE DATA? ...

    EPA Pesticide Factsheets

    This talk will briefly describe the state of the data world for computational toxicology and one approach to improve the situation, called ACToR (Aggregated Computational Toxicology Resource).

  8. High Performance Computing and Storage Requirements for Biological and Environmental Research Target 2017

    SciTech Connect

    Gerber, Richard; Wasserman, Harvey

    2013-05-01

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,500 users working on some 650 projects that involve nearly 600 codes in a wide variety of scientific disciplines. In addition to large-scale computing and storage resources, NERSC provides support and expertise that help scientists make efficient use of its systems. The latest review revealed several key requirements, in addition to achieving its goal of characterizing BER computing and storage needs.

  9. Aviation & Space Education: A Teacher's Resource Guide.

    ERIC Educational Resources Information Center

    Texas State Dept. of Aviation, Austin.

    This resource guide contains information on curriculum guides, resources for teachers, computer software and computer related programs, audio/visual presentations, model aircraft and demonstration aids, training seminars and career education, and an aerospace bibliography for primary grades. Each entry includes all or some of the following items:…

  10. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 3 2014-07-01 2014-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  11. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 3 2013-07-01 2013-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  12. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 3 2012-07-01 2012-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  13. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  14. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  15. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  16. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 3 2010-07-01 2010-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  17. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  18. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  19. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 3 2011-07-01 2011-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  20. Community-Authored Resources for Education

    ERIC Educational Resources Information Center

    Tinker, Robert; Linn, Marcia; Gerard, Libby; Staudt, Carolyn

    2010-01-01

    Textbooks are resources for learning that provide the structure, content, assessments, and teacher guidance for an entire course. Technology can provide a far better resource that provides the same functions, but takes full advantage of computers. This resource would be much more than text on a screen. It would use the best technology; it would be…

  1. An iron–oxygen intermediate formed during the catalytic cycle of cysteine dioxygenase (Electronic supplementary information (ESI) available: Experimental and computational details. See DOI: 10.1039/c6cc03904a)

    PubMed Central

    Tchesnokov, E. P.; Faponle, A. S.; Davies, C. G.; Quesne, M. G.; Turner, R.; Fellner, M.; Souness, R. J.; Wilbanks, S. M.

    2016-01-01

    Cysteine dioxygenase is a key enzyme in the breakdown of cysteine, but its mechanism remains controversial. A combination of spectroscopic and computational studies provides the first evidence of a short-lived intermediate in the catalytic cycle. The intermediate decays within 20 ms and has absorption maxima at 500 and 640 nm. PMID:27297454

  2. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  3. Electronic neural network for dynamic resource allocation

    NASA Technical Reports Server (NTRS)

    Thakoor, A. P.; Eberhardt, S. P.; Daud, T.

    1991-01-01

    A VLSI implementable neural network architecture for dynamic assignment is presented. The resource allocation problems involve assigning members of one set (e.g. resources) to those of another (e.g. consumers) such that the global 'cost' of the associations is minimized. The network consists of a matrix of sigmoidal processing elements (neurons), where the rows of the matrix represent resources and columns represent consumers. Unlike previous neural implementations, however, association costs are applied directly to the neurons, reducing connectivity of the network to VLSI-compatible O(number of neurons). Each row (and column) has an additional neuron associated with it to independently oversee activations of all the neurons in each row (and each column), providing a programmable 'k-winner-take-all' function. This function simultaneously enforces blocking (excitatory/inhibitory) constraints during convergence to control the number of active elements in each row and column within desired boundary conditions. Simulations show that the network, when implemented in fully parallel VLSI hardware, offers optimal (or near-optimal) solutions within only a fraction of a millisecond, for problems up to 128 resources and 128 consumers, orders of magnitude faster than conventional computing or heuristic search methods.
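
    For a sense of the underlying problem, the same global-cost-minimizing assignment can be computed with a conventional solver; the sketch below uses SciPy's Hungarian-method routine on a random 128x128 cost matrix as a software baseline and is not the analog VLSI network described in the abstract.

        # Conventional software baseline for the resource-to-consumer assignment problem
        # (for comparison only; not the neural-network hardware described above).
        import numpy as np
        from scipy.optimize import linear_sum_assignment

        rng = np.random.default_rng(0)
        cost = rng.uniform(0.0, 1.0, size=(128, 128))   # cost of resource i serving consumer j

        rows, cols = linear_sum_assignment(cost)         # minimizes the total assignment cost
        print("total cost:", cost[rows, cols].sum())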

  4. Advances in water resources technology

    NASA Astrophysics Data System (ADS)

    The presentation of technological advances in the field of water resources will be the focus of Advances in Water Resources Technology, a conference to be held in Athens, Greece, March 20-23, 1991. Organized by the European Committee for Water Resources Management, in cooperation with the National Technical University of Athens, the conference will feature state-of-the-art papers, contributed original research papers, and poster papers. Session subjects will include surface water, groundwater, water resources conservation, water quality and reuse, computer modeling and simulation, real-time control of water resources systems, and institutions and methods for technology. The official language of the conference will be English. Special meetings and discussions will be held for investigating methods of effective technology transfer among European countries. For this purpose, a wide representation of research institutions, universities and companies involved in water resources technology will be attempted.

  5. Gasoline and Diesel Fuel Test Methods Additional Resources

    EPA Pesticide Factsheets

    Supporting documents on the Direct Final Rule that allows refiners and laboratories to use more current and improved fuel testing procedures for twelve American Society for Testing and Materials analytical test methods.

  6. Modifications to the Renewable Fuel Standard Program Additional Resources

    EPA Pesticide Factsheets

    EPA withdraws the heating oil definition and transmix amendments addressed in EPA's direct final rule on October 9, 2012. You will find October's final rule and withdrawal notice of this final rule on this page.

  7. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of the hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  8. Computed Tomography (CT) -- Sinuses

    MedlinePlus

    Computed tomography (CT) of the sinuses uses special x-ray equipment to evaluate the paranasal sinus cavities – hollow, air-filled spaces within the bones of the face surrounding the ...

  9. Computer Software Reviews.

    ERIC Educational Resources Information Center

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  10. Metacomputing on Commodity Computers

    DTIC Science & Technology

    1999-05-01

    Journal on Future Generation Computer Systems, 1999. [17] L. Beca, G. Cheng, G. Fox, T. Jurga, K. Olszewski, M. Podgorny, P. Sokolowski, and K...Katramatos, J. Karpovich, and A. Grimshaw. Resource management in Legion. International Journal on Future Generation Computer Systems (to appear

  11. Profiling Computing Coordinators.

    ERIC Educational Resources Information Center

    Edwards, Sigrid; Morton, Allan

    The people responsible for managing school computing resources in Australia have become known as Computing Coordinators. To date there has been no large systematic study of the role, responsibilities and characteristics of this position. This paper represents a first attempt to provide information on the functions and attributes of the Computing…

  12. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

    Watershed-scale hydrologic models are used for a variety of applications from flood prediction, to drought analysis, to water quality assessments. A particular challenge in applying these models is calibration of the model parameters, many of which are difficult to measure at the watershed-scale. A primary goal of this dissertation is to contribute new computational methods and tools for calibration of watershed-scale hydrologic models and the Soil and Water Assessment Tool (SWAT) model, in particular. SWAT is a physically-based, watershed-scale hydrologic model developed to predict the impact of land management practices on water quality and quantity. The dissertation follows a manuscript format meaning it is comprised of three separate but interrelated research studies. The first two research studies focus on SWAT model calibration, and the third research study presents an application of the new calibration methods and tools to study climate change impacts on water resources in the Upper Neuse Watershed of North Carolina using SWAT. The objective of the first two studies is to overcome computational challenges associated with calibration of SWAT models. The first study evaluates a parallel SWAT calibration tool built using the Windows Azure cloud environment and a parallel version of the Dynamically Dimensioned Search (DDS) calibration method modified to run in Azure. The calibration tool was tested for six model scenarios constructed using three watersheds of increasing size (the Eno, Upper Neuse, and Neuse) for both a 2 year and 10 year simulation duration. Leveraging the cloud as an on demand computing resource allowed for a significantly reduced calibration time such that calibration of the Neuse watershed went from taking 207 hours on a personal computer to only 3.4 hours using 256 cores in the Azure cloud. The second study aims at increasing SWAT model calibration efficiency by creating an open source, multi-objective calibration tool using the Non
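
    The Dynamically Dimensioned Search algorithm at the heart of the calibration tool is compact enough to sketch; the serial version below (with bound clamping rather than reflection, and a stand-in objective) only illustrates the idea and is not the parallel, Azure-hosted tool described in the dissertation. The bounds, budget, and objective are hypothetical.

        # Minimal serial Dynamically Dimensioned Search (DDS) sketch; the objective,
        # bounds, and evaluation budget are placeholders, not the SWAT problem.
        import math
        import random

        def dds(objective, lower, upper, max_evals, r=0.2, seed=1):
            random.seed(seed)
            n = len(lower)
            best = [(lo + up) / 2.0 for lo, up in zip(lower, upper)]
            best_val = objective(best)
            for i in range(1, max_evals):
                p = 1.0 - math.log(i) / math.log(max_evals)   # perturb fewer dims over time
                dims = [j for j in range(n) if random.random() < p] or [random.randrange(n)]
                cand = best[:]
                for j in dims:
                    cand[j] += r * (upper[j] - lower[j]) * random.gauss(0.0, 1.0)
                    cand[j] = min(max(cand[j], lower[j]), upper[j])   # clamp to bounds
                val = objective(cand)
                if val <= best_val:                            # greedy acceptance
                    best, best_val = cand, val
            return best, best_val

        sphere = lambda x: sum(v * v for v in x)               # stand-in objective
        print(dds(sphere, lower=[-5.0] * 4, upper=[5.0] * 4, max_evals=500))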

  13. Lunar Resources

    NASA Technical Reports Server (NTRS)

    Edmunson, Jennifer

    2010-01-01

    This slide presentation reviews the lunar resources that are known to be available for human use during exploration of the Moon. Some of the lunar resources available for use are minerals, sunlight, solar wind, water and water ice, rocks, and regolith. The locations and temperatures of some of these resources are reviewed, as are the Lunar CRater Observation and Sensing Satellite (LCROSS) mission and its findings. There is also discussion of water retention in permanently shadowed regions of the Moon, of the rock types on the lunar surface, and of the lunar regolith, its types, and its possible uses.

  14. Genetic toxicology: web resources.

    PubMed

    Young, Robert R

    2002-04-25

    available online in the field of genetic toxicology. As molecular biology and computational tools improve, new areas within genetic toxicology such as structural activity relationship analysis, mutational spectra databases and toxicogenomics, now have resources online as well.

  15. Adapting Computer Resources to School Communication Needs.

    ERIC Educational Resources Information Center

    Cleveland, Clem

    The benefits that the microcomputer offers the school district--and the district's public relations office in particular--are significant and should be taken advantage of. Although microcomputers increase the already large amount of information available to administrators, they also enable the user to select and coordinate information efficiently.…

  16. Computer image processing in marine resource exploration

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

    1976-01-01

    Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.

  17. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  18. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  19. The Evolution of Cloud Computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Berghaus, Frank; Brasolin, Franco; Domingues Cordeiro, Cristovao Jose; Desmarais, Ron; Field, Laurence; Gable, Ian; Giordano, Domenico; Di Girolamo, Alessandro; Hover, John; LeBlanc, Matthew; Love, Peter; Paterson, Michael; Sobie, Randall; Zaytsev, Alexandr

    2015-12-01

    The ATLAS experiment at the LHC has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This paper describes the overall evolution of cloud computing in ATLAS. The current status of the virtual machine (VM) management systems used for harnessing Infrastructure as a Service resources is discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for managing VM images across multiple clouds, a system for dynamic location-based discovery of caching proxy servers, and the usage of a data federation to unify the worldwide grid of storage elements into a single namespace and access point. The usage of the experiment's high level trigger farm for Monte Carlo production, in a specialized cloud environment, is presented. Finally, we evaluate and compare the performance of commercial clouds using several benchmarks.

  20. The evolution of self-replicating computer organisms

    NASA Astrophysics Data System (ADS)

    Pargellis, A. N.

    A computer model is described that explores some of the possible behavior of biological life during the early stages of evolution. The simulation starts with a primordial soup composed of randomly generated sequences of computer operations selected from a basis set of 16 opcodes. With a probability of about 10^-4, these sequences spontaneously generate large and inefficient self-replicating “organisms”. Driven by mutations, these protobiotic ancestors generate offspring more efficiently by initially eliminating unnecessary code. Later they increase their complexity by adding subroutines as they compete for the system's two limited resources, computer memory and CPU time. The ensuing biology includes replicating hosts, parasites and colonies.
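
    As an illustration of the setup described above, the sketch below seeds a primordial soup of random opcode sequences drawn from a 16-element basis set. The opcode names and soup parameters are invented for the example; the record does not specify the actual instruction set.

      import random

      # Hypothetical 16-opcode basis set (illustrative only).
      OPCODES = [
          "nop", "inc", "dec", "add", "sub", "push", "pop", "copy",
          "jump", "jumpb", "call", "ret", "ifz", "search", "alloc", "divide",
      ]

      def random_sequence(length, rng):
          # One randomly generated candidate "organism".
          return [rng.choice(OPCODES) for _ in range(length)]

      def seed_soup(n_sequences=10000, length=80, seed=0):
          # Fill the soup; roughly 1 in 10^4 such sequences would be expected
          # to self-replicate once executed on the model's virtual CPU.
          rng = random.Random(seed)
          return [random_sequence(length, rng) for _ in range(n_sequences)]

      soup = seed_soup()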

  1. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    SciTech Connect

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  2. Cluster State Quantum Computation

    DTIC Science & Technology

    2014-02-01

    ... nearest neighbor cluster state has been shown to be a universal resource for MBQC, thus we can say our quantum computer is universal. We note that ...

  3. 15 CFR 990.66 - Additional considerations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Additional considerations. 990.66 Section 990.66 Commerce and Foreign Trade Regulations Relating to Commerce and Foreign Trade (Continued... NATURAL RESOURCE DAMAGE ASSESSMENTS Restoration Implementation Phase § 990.66 Additional...

  4. CISNET: Resources

    Cancer.gov

    The Publications pages provide lists of all CISNET publications since the inception of CISNET. Publications are listed by Cancer Site or by Research Topic. The Publication Support and Modeling Resources pages provide access to technical modeling information, raw data, and publication extensions stemming from the work of the CISNET consortium.

  5. Urban Resources.

    ERIC Educational Resources Information Center

    Novak, Kathy

    Designed as a resource for urban adult basic education (ABE) program planners, this guidebook describes model linkage strategies between ABE and job placement as well as ABE and job training services that are targeted to urban Americans. The following topics are covered in the guide: linkage strategies (the meaning of the term linkages, community…

  6. Cleavage of ether, ester, and tosylate C(sp3)-O bonds by an iridium complex, initiated by oxidative addition of C-H bonds. Experimental and computational studies.

    PubMed

    Kundu, Sabuj; Choi, Jongwook; Wang, David Y; Choliy, Yuriy; Emge, Thomas J; Krogh-Jespersen, Karsten; Goldman, Alan S

    2013-04-03

    A pincer-ligated iridium complex, (PCP)Ir (PCP = κ(3)-C6H3-2,6-[CH2P(t-Bu)2]2), is found to undergo oxidative addition of C(sp(3))-O bonds of methyl esters (CH3-O2CR'), methyl tosylate (CH3-OTs), and certain electron-poor methyl aryl ethers (CH3-OAr). DFT calculations and mechanistic studies indicate that the reactions proceed via oxidative addition of C-H bonds followed by oxygenate migration, rather than by direct C-O addition. Thus, methyl aryl ethers react via addition of the methoxy C-H bond, followed by α-aryloxide migration to give cis-(PCP)Ir(H)(CH2)(OAr), followed by iridium-to-methylidene hydride migration to give (PCP)Ir(CH3)(OAr). Methyl acetate undergoes C-H bond addition at the carbomethoxy group to give (PCP)Ir(H)[κ(2)-CH2OC(O)Me] which then affords (PCP-CH2)Ir(H)(κ(2)-O2CMe) (6-Me) in which the methoxy C-O bond has been cleaved, and the methylene derived from the methoxy group has migrated into the PCP Cipso-Ir bond. Thermolysis of 6-Me ultimately gives (PCP)Ir(CH3)(κ(2)-O2CR), the net product of methoxy group C-O oxidative addition. Reaction of (PCP)Ir with species of the type ROAr, RO2CMe or ROTs, where R possesses β-C-H bonds (e.g., R = ethyl or isopropyl), results in formation of (PCP)Ir(H)(OAr), (PCP)Ir(H)(O2CMe), or (PCP)Ir(H)(OTs), respectively, along with the corresponding olefin or (PCP)Ir(olefin) complex. Like the C-O bond oxidative additions, these reactions also proceed via initial activation of a C-H bond; in this case, C-H addition at the β-position is followed by β-migration of the aryloxide, carboxylate, or tosylate group. Calculations indicate that the β-migration of the carboxylate group proceeds via an unusual six-membered cyclic transition state in which the alkoxy C-O bond is cleaved with no direct participation by the iridium center.

  7. Space Resources

    NASA Technical Reports Server (NTRS)

    McKay, Mary Fae (Editor); McKay, David S. (Editor); Duke, Michael S. (Editor)

    1992-01-01

    Space resources must be used to support life on the Moon and exploration of Mars. Just as the pioneers applied the tools they brought with them to resources they found along the way rather than trying to haul all their needs over a long supply line, so too must space travelers apply their high technology tools to local resources. The pioneers refilled their water barrels at each river they forded; moonbase inhabitants may use chemical reactors to combine hydrogen brought from Earth with oxygen found in lunar soil to make their water. The pioneers sought temporary shelter under trees or in the lee of a cliff and built sod houses as their first homes on the new land; settlers of the Moon may seek out lava tubes for their shelter or cover space station modules with lunar regolith for radiation protection. The pioneers moved further west from their first settlements, using wagons they had built from local wood and pack animals they had raised; space explorers may use propellant made at a lunar base to take them on to Mars. The concept for this report was developed at a NASA-sponsored summer study in 1984. The program was held on the Scripps campus of the University of California at San Diego (UCSD), under the auspices of the American Society for Engineering Education (ASEE). It was jointly managed under the California Space Inst. and the NASA Johnson Space Center, under the direction of the Office of Aeronautics and Space Technology (OAST) at NASA Headquarters. The study participants (listed in the addendum) included a group of 18 university teachers and researchers (faculty fellows) who were present for the entire 10-week period and a larger group of attendees from universities, Government, and industry who came for a series of four 1-week workshops. The organization of this report follows that of the summer study. Space Resources consists of a brief overview and four detailed technical volumes: (1) Scenarios; (2) Energy, Power, and Transport; (3) Materials; (4

  8. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
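
    As a flavor of the introductory projects described above, the sketch below simulates a two-qubit register with NumPy and runs a single Grover iteration to find a marked basis state. It is an independent illustration in the spirit of the article, not code taken from it.

      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
      H2 = np.kron(H, H)                                # Hadamard on both qubits

      def grover_2qubit(marked_index):
          # One Grover iteration on a 2-qubit register (N = 4 basis states).
          state = np.zeros(4)
          state[0] = 1.0                                # start in |00>
          state = H2 @ state                            # uniform superposition
          oracle = np.eye(4)
          oracle[marked_index, marked_index] = -1.0     # flip sign of marked state
          s = np.full(4, 0.5)                           # the uniform state |s>
          diffusion = 2.0 * np.outer(s, s) - np.eye(4)  # 2|s><s| - I
          state = diffusion @ (oracle @ state)
          return np.abs(state) ** 2                     # measurement probabilities

      print(grover_2qubit(marked_index=3))              # ~[0, 0, 0, 1]

    For N = 4 a single iteration already concentrates essentially all probability on the marked state, which makes this a convenient sanity check before scaling up the number of qubits.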

  9. Resource Balancing Control Allocation

    NASA Technical Reports Server (NTRS)

    Frost, Susan A.; Bodson, Marc

    2010-01-01

    Next generation aircraft with a large number of actuators will require advanced control allocation methods to compute the actuator commands needed to follow desired trajectories while respecting system constraints. Previously, algorithms were proposed to minimize the l1 or l2 norms of the tracking error and of the control effort. The paper discusses the alternative choice of using the l1 norm for minimization of the tracking error and a normalized l(infinity) norm, or sup norm, for minimization of the control effort. The algorithm computes the norm of the actuator deflections scaled by the actuator limits. Minimization of the control effort then translates into the minimization of the maximum actuator deflection as a percentage of its range of motion. The paper shows how the problem can be solved effectively by converting it into a linear program and solving it using a simplex algorithm. Properties of the algorithm are investigated through examples. In particular, the min-max criterion results in a type of resource balancing, where the resources are the control surfaces and the algorithm balances these resources to achieve the desired command. A study of the sensitivity of the algorithms to the data is presented, which shows that the normalized l(infinity) algorithm has the lowest sensitivity, although high sensitivities are observed whenever the limits of performance are reached.
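
    A simplified version of the min-max allocation described above can be posed directly as a linear program: minimize a scalar t subject to B u = d and |u_i| <= t * limit_i. The sketch below solves that reduced problem with scipy.optimize.linprog; it omits the l1 tracking-error term of the full algorithm, and the effectiveness matrix, desired moments, and limits are made-up example values.

      import numpy as np
      from scipy.optimize import linprog

      def min_max_allocation(B, d, limits):
          # Variables are [u_1..u_n, t]; minimize t, the largest deflection
          # expressed as a fraction of each actuator's limit.
          m, n = B.shape
          c = np.zeros(n + 1)
          c[-1] = 1.0
          # Exact moment matching: B u = d.
          A_eq = np.hstack([B, np.zeros((m, 1))])
          # u_i - limit_i * t <= 0  and  -u_i - limit_i * t <= 0.
          A_ub = np.vstack([
              np.hstack([np.eye(n), -limits.reshape(-1, 1)]),
              np.hstack([-np.eye(n), -limits.reshape(-1, 1)]),
          ])
          b_ub = np.zeros(2 * n)
          bounds = [(None, None)] * n + [(0, None)]
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=d, bounds=bounds)
          return res.x[:n], res.x[-1]   # actuator commands, achieved max fraction

      # Example: two moments produced by three redundant control surfaces.
      B = np.array([[1.0, 0.5, -0.5],
                    [0.0, 1.0,  1.0]])
      d = np.array([0.4, 0.2])
      u, t = min_max_allocation(B, d, limits=np.array([0.5, 0.3, 0.3]))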

  10. Manganese resources of the Cuyuna range, east-central Minnesota

    SciTech Connect

    Beltrame, R.J.; Holtzman, R.C.; Wahl, T.E.

    1981-01-01

    The Cuyuna range, located in east-central Minnesota, consists of a sequence of argillite, siltstone, iron-formation, graywacke, slate, and quartzite of early Proterozoic age. Manganese-bearing materials occur within the iron-rich strata of the Trommald Formation and the Rabbit Lake Formation. Computer-assisted resource estimates, based on exploration drill hole information, indicate that the Cuyuna range contains a minimum of 176 million metric tons (MMT) of marginally economic manganiferous rock with an average grade of 10.46 weight percent manganese. The calculated 18.5 MMT of manganese on the Cuyuna range could supply this country's needs for this important and strategic metal for nearly 14 years. An additional resource of 6.9 MMT of manganese metal is available in the lower grade deposits. The vast majority of these calculated resources are extractable by current surface mining techniques.
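
    The contained-metal figure quoted above follows directly from tonnage times grade; a quick check using the stated numbers:

      resource_mmt = 176.0        # million metric tons of manganiferous rock
      grade = 0.1046              # average grade, weight fraction of manganese
      contained_mn_mmt = resource_mmt * grade
      print(round(contained_mn_mmt, 1))   # ~18.4 MMT, consistent with the ~18.5 MMT cited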

  11. The Cactus Worm: experiments with dynamic resource discovery and allocation in a grid environment.

    SciTech Connect

    Allen, G.; Angulo, D.; Foster, I.; Lanfermann, G.; Liu, C.; Radke, T.; Seidel, E.; Shalf, J.; Mathematics and Computer Science; Albert-Einstein-Inst.; Univ. of Chicago; LBNL

    2001-01-01

    The ability to harness heterogeneous, dynamically available grid resources is attractive to typically resource-starved computational scientists and engineers, as in principle it can increase, by significant factors, the number of cycles that can be delivered to applications. However, new adaptive application structures and dynamic runtime system mechanisms are required if we are to operate effectively in grid environments. To explore some of these issues in a practical setting, the authors are developing an experimental framework, called Cactus, that incorporates both adaptive application structures for dealing with changing resource characteristics and adaptive resource selection mechanisms that allow applications to change their resource allocations (e.g., via migration) when performance falls outside specified limits. The authors describe the adaptive resource selection mechanisms and describe how they are used to achieve automatic application migration to 'better' resources following performance degradation. The results provide insights into the architectural structures required to support adaptive resource selection. In addition, the authors suggest that the Cactus Worm affords many opportunities for grid computing.
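
    The migrate-on-degradation behavior described above can be summarized as a simple control loop. The sketch below is schematic pseudocode in Python form: every callable on the app and resource objects is a hypothetical placeholder, not an actual Cactus or grid-middleware API.

      import time

      def run_with_migration(app, discover_resources, performance_floor,
                             check_interval_s=60.0):
          # Start on the first discovered resource, then watch performance.
          resource = discover_resources()[0]
          app.start(resource)
          while not app.finished():
              time.sleep(check_interval_s)
              if app.current_performance() >= performance_floor:
                  continue                              # still within limits
              # Performance fell outside the specified limits: look for a
              # better resource and migrate via checkpoint/restart.
              candidates = discover_resources()
              better = max(candidates, key=lambda r: r.estimated_performance)
              if better.estimated_performance > app.current_performance():
                  app.checkpoint()
                  app.stop()
                  resource = better
                  app.restart_from_checkpoint(resource)
          return app.result()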

  12. Space Resources

    NASA Technical Reports Server (NTRS)

    McKay, Mary Fae (Editor); McKay, David S. (Editor); Duke, Michael S. (Editor)

    1992-01-01

    Space resources must be used to support life on the Moon and exploration of Mars. Just as the pioneers applied the tools they brought with them to resources they found along the way rather than trying to haul all their needs over a long supply line, so too must space travelers apply their high technology tools to local resources. The pioneers refilled their water barrels at each river they forded; moonbase inhabitants may use chemical reactors to combine hydrogen brought from Earth with oxygen found in lunar soil to make their water. The pioneers sought temporary shelter under trees or in the lee of a cliff and built sod houses as their first homes on the new land; settlers of the Moon may seek out lava tubes for their shelter or cover space station modules with lunar regolith for radiation protection. The pioneers moved further west from their first settlements, using wagons they had built from local wood and pack animals they had raised; space explorers may use propellant made at a lunar base to take them on to Mars. The concept for this report was developed at a NASA-sponsored summer study in 1984. The program was held on the Scripps campus of the University of California at San Diego (UCSD), under the auspices of the American Society for Engineering Education (ASEE). It was jointly managed under the California Space Inst. and the NASA Johnson Space Center, under the direction of the Office of Aeronautics and Space Technology (OAST) at NASA Headquarters. The study participants (listed in the addendum) included a group of 18 university teachers and researchers (faculty fellows) who were present for the entire 10-week period and a larger group of attendees from universities, Government, and industry who came for a series of four 1-week workshops. The organization of this report follows that of the summer study. Space Resources consists of a brief overview and four detailed technical volumes: (1) Scenarios; (2) Energy, Power, and Transport; (3) Materials; (4

  13. Space Resource Roundtable Rationale

    NASA Astrophysics Data System (ADS)

    Duke, Michael

    1999-01-01

    Recent progress in the U.S. Space Program has renewed interest in space resource issues. The Lunar Prospector mission conducted in NASA's Discovery Program has yielded interesting new insights into lunar resource issues, particularly the possibility that water is concentrated in cold traps at the lunar poles. This finding has not yet triggered a new program of lunar exploration or development; however, it opens the possibility that new Discovery Missions might be viable. Several asteroid missions are underway or under development and a mission to return samples from the Mars satellite, Phobos, is being developed. These exploration missions are oriented toward scientific analysis, not resource development and utilization, but can provide additional insight into the possibilities for mining asteroids. The Mars Surveyor program now includes experiments on the 2001 lander that are directly applicable to developing propellants from the atmosphere of Mars, and the program has solicited proposals for the 2003/2005 missions in the area of resource utilization. These are aimed at the eventual human exploration of Mars. The beginning of construction of the International Space Station has awakened interest in follow-on programs of human exploration, and NASA is once more studying the human exploration of Moon, Mars and asteroids. Resource utilization will be included as an objective by some of these human exploration programs. At the same time, research and technology development programs in NASA such as the Microgravity Materials Science Program and the Cross-Enterprise Technology Development Program are including resource utilization as a valid area for study. Several major development areas that could utilize space resources, such as space tourism and solar power satellite programs, are actively under study. NASA's interests in space resource development are largely associated with NASA missions rather than the economic development of resources for industrial processes. That

  14. Water resources

    NASA Technical Reports Server (NTRS)

    Salomonson, V. V.; Rango, A.

    1973-01-01

    The application of ERTS-1 imagery to the conservation and control of water resources is discussed. The effects of existing geology and land use in the watershed area on the hydrologic cycle and the general characteristics of runoff are described. The effects of floods, snowcover, and glaciers are analyzed. The use of ERTS-1 imagery to map surface water and wetland areas to provide rapid inventorying over large regions of water bodies is reported.

  15. Modeling renewable energy resources in integrated resource planning

    SciTech Connect

    Logan, D.; Neil, C.; Taylor, A.

    1994-06-01

    Including renewable energy resources in integrated resource planning (IRP) requires that utility planning models properly consider the relevant attributes of the different renewable resources in addition to conventional supply-side and demand-side options. Otherwise, a utility's resource plan is unlikely to have an appropriate balance of the various resource options. The current trend toward regulatory set-asides for renewable resources is motivated in part by the perception that the capabilities of current utility planning models are inadequate with regard to renewable resources. Adequate modeling capabilities and utility planning practices are a necessary prerequisite to the long-term penetration of renewable resources into the electric utility industry's resource mix. This report presents a review of utility planning models conducted for the National Renewable Energy Laboratory (NREL). The review examines the capabilities of utility planning models to address key issues in the choice between renewable resources and other options. The purpose of this review is to provide a basis for identifying high priority areas for advancing the state of the art.

  16. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  17. Lunar Water Resource Demonstration

    NASA Technical Reports Server (NTRS)

    Muscatello, Anthony C.

    2008-01-01

    In cooperation with the Canadian Space Agency, the Northern Centre for Advanced Technology, Inc., Carnegie-Mellon University, JPL, and NEPTEC, NASA has undertaken the In-Situ Resource Utilization (ISRU) project called RESOLVE. This project is a ground demonstration of a system that would be sent to explore permanently shadowed polar lunar craters, drill into the regolith, determine what volatiles are present, and quantify them in addition to recovering oxygen by hydrogen reduction. The Lunar Prospector has determined these craters contain enhanced hydrogen concentrations averaging about 0.1%. If the hydrogen is in the form of water, the water concentration would be around 1%, which would translate into billions of tons of water on the Moon, a tremendous resource. The Lunar Water Resource Demonstration (LWRD) is a part of RESOLVE designed to capture lunar water and hydrogen and quantify them as a backup to gas chromatography analysis. This presentation will briefly review the design of LWRD and some of the results of testing the subsystem. RESOLVE is to be integrated with the Scarab rover from CMU and the whole system demonstrated on Mauna Kea in Hawaii in November 2008. The implications of lunar water for Mars exploration are two-fold: 1) RESOLVE and LWRD could be used in a similar fashion on Mars to locate and quantify water resources, and 2) electrolysis of lunar water could provide large amounts of liquid oxygen in LEO, leading to lower costs for travel to Mars, in addition to being very useful at lunar outposts.
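
    The 0.1% hydrogen to roughly 1% water conversion quoted above follows from the mass fraction of hydrogen in H2O; a quick check under the assumption that all of the detected hydrogen is bound in water:

      hydrogen_wt_fraction = 0.001            # ~0.1 wt% hydrogen (Lunar Prospector)
      water_per_hydrogen = 18.015 / 2.016     # M(H2O) / M(2 H) ~ 8.94
      water_wt_fraction = hydrogen_wt_fraction * water_per_hydrogen
      print(f"{water_wt_fraction:.2%}")       # ~0.89%, i.e. about 1%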

  18. Additive Similarity Trees

    ERIC Educational Resources Information Center

    Sattath, Shmuel; Tversky, Amos

    1977-01-01

    Tree representations of similarity data are investigated. Hierarchical clustering is critically examined, and a more general procedure, called the additive tree, is presented. The additive tree representation is then compared to multidimensional scaling. (Author/JKS)

  19. Networked Resources.

    ERIC Educational Resources Information Center

    Nickerson, Gord

    1991-01-01

    Explains File Transfer Protocol (FTP), an application software program that allows a user to transfer files from one computer to another. The benefit of the rapid operating speed of FTP is discussed, the use of FTP on microcomputers, minicomputers, and workstations is described, and FTP problems are considered. (four references) (LRW)

  20. NASA Water Resources Program

    NASA Technical Reports Server (NTRS)

    Toll, David L.

    2011-01-01

    With increasing population pressure and water usage coupled with climate variability and change, water issues are being reported by numerous groups as the most critical environmental problems facing us in the 21st century. Competitive uses and the prevalence of river basins and aquifers that extend across boundaries engender political tensions between communities, stakeholders and countries. In addition to the numerous water availability issues, water quality related problems are seriously affecting human health and our environment. The potential crises and conflicts especially arise when water is in competition among multiple uses. For example, urban areas, environmental and recreational uses, agriculture, and energy production compete for scarce resources, not only in the Western U.S. but throughout much of the U.S. and also in numerous parts of the world. Mitigating these conflicts and meeting water demands and needs requires using existing water resources more efficiently. The NASA Water Resources Program Element works to use NASA products and technology to address these critical water issues. The primary goal of the Water Resources Program is to facilitate application of NASA Earth science products as a routine use in integrated water resources management for the sustainable use of water. This also includes the extreme events of drought and floods and the adaptation to the impacts from climate change. NASA satellite and Earth system observations of water and related data provide a huge volume of valuable data, both in near-real-time and extending back nearly 50 years, about the Earth's land surface conditions such as precipitation, snow, soil moisture, water levels, land cover type, vegetation type, and health. The NASA Water Resources Program works closely with other U.S. government agencies, universities, and non-profit and private sector organizations, both domestically and internationally, to use NASA and Earth science data. The NASA Water Resources Program organizes its