Science.gov

Sample records for additional computational resources

  1. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  2. African Studies Computer Resources.

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    African studies computer resources that are readily available in the United States with linkages to Africa are described, highlighting those most directly corresponding to African content. Africanists can use the following four fundamental computer systems: (1) Internet/Bitnet; (2) Fidonet; (3) Usenet; and (4) dial-up bulletin board services. The…

  3. Additional Resources on Asian Americans.

    ERIC Educational Resources Information Center

    Kodama, Corinne Maekawa; Lee, Sunny; Liang, Christopher T. H.; Alvarez, Alvin N.; McEwen, Marylu K.

    2002-01-01

    The authors identify Asian American associations and organizations, academic journals, periodicals, and media resources. Selected annotated resources on Asian American activism and politics, counseling and psychology, educational issues, gender and sexual orientation, history, policy reports, and racial and ethnic identity are also included.…

  4. Additional Financial Resources for Education.

    ERIC Educational Resources Information Center

    Hubbard, Ben C.

    This paper discusses the continuing need for additional educational funds and suggests that the only way to gain these funds is through concerted and persistent political efforts by supporters of education at both the federal and state levels. The author first points out that for many reasons declining enrollment may not decrease operating costs…

  5. Enabling opportunistic resources for CMS Computing Operations

    SciTech Connect

    Hufnagel, Dick

    2015-11-19

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize “opportunistic” resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.
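
    As a rough illustration of how a pilot-style job could be pushed into an HTCondor pool, the sketch below uses the HTCondor Python bindings; the wrapper script, arguments, and resource requests are hypothetical placeholders, not the actual CMS glideinWMS configuration.

        # Illustrative sketch only: submits placeholder pilot jobs through the
        # HTCondor Python bindings; a real glideinWMS/Bosco setup is far more
        # involved. Requires the 'htcondor' package.
        import htcondor

        submit = htcondor.Submit({
            "executable": "pilot_wrapper.sh",  # hypothetical wrapper (e.g. Bosco/Parrot setup)
            "arguments": "--site opportunistic",
            "universe": "vanilla",
            "request_cpus": "1",
            "request_memory": "2000",          # MB
            "output": "pilot.out",
            "error": "pilot.err",
            "log": "pilot.log",
        })

        schedd = htcondor.Schedd()                 # local scheduler daemon
        result = schedd.submit(submit, count=10)   # launch 10 pilots into the pool
        print("submitted cluster", result.cluster())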

  6. Enabling opportunistic resources for CMS Computing Operations

    NASA Astrophysics Data System (ADS)

    Hufnagel, D.; CMS Collaboration

    2015-12-01

    With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources (resources not owned by, or a priori configured for, CMS) to meet peak demands. In addition to our dedicated resources we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and Parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Here we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.

  7. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result, a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build students' intuition and enhance their learning. PMID:21451741

  8. Administration of Computer Resources.

    ERIC Educational Resources Information Center

    Franklin, Gene F.

    Computing at Stanford University has, until recently, been performed at one of five facilities. The Stanford hospital operates an IBM 370/135 mainly for administrative use. The university business office has an IBM 370/145 for its administrative needs and support of the medical clinic. Under the supervision of the Stanford Computation Center are…

  9. Learning Resources and Academic Computing.

    ERIC Educational Resources Information Center

    Terwilliger, Gloria

    1987-01-01

    Describes the phases of the integration of academic computing into the operations of the Learning Resource Center (LRC) at the Alexandria campus of Northern Virginia Community College. Reviews planning and operational stages and discusses future directions. (DMM)

  10. Framework Resources Multiply Computing Power

    NASA Technical Reports Server (NTRS)

    2010-01-01

    As an early proponent of grid computing, Ames Research Center awarded Small Business Innovation Research (SBIR) funding to 3DGeo Development Inc., of Santa Clara, California, (now FusionGeo Inc., of The Woodlands, Texas) to demonstrate a virtual computer environment that linked geographically dispersed computer systems over the Internet to help solve large computational problems. By adding to an existing product, FusionGeo enabled access to resources for calculation- or data-intensive applications whenever and wherever they were needed. Commercially available as Accelerated Imaging and Modeling, the product is used by oil companies and seismic service companies, which require large processing and data storage capacities.

  11. Parallel computation using limited resources

    SciTech Connect

    Sugla, B.

    1985-01-01

    This thesis addresses the task of designing and analyzing parallel algorithms when the resources of processors, communication, and time are limited. The two parts of the thesis deal with multiprocessor systems and VLSI, the two important parallel processing environments that are prevalent today. In the first part, a time-processor-communication tradeoff analysis is conducted for two kinds of problems: N-input, 1-output and N-input, N-output computations. Within the second class, prefix computation is studied in detail, an important problem because of the number of naturally occurring computations it can model. Finally, a general methodology is given for designing parallel algorithms that can be optimized over a wide set of architectural variations. The second part of the thesis considers the design of parallel algorithms for the VLSI model of computation when the resource of time is severely restricted.
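
    The prefix computation singled out above has a classic logarithmic-depth parallel solution. The sketch below simulates it sequentially: each pass of the loop stands in for one synchronous step executed by N processors in parallel, so the whole scan takes O(log N) parallel time (an illustration of the technique, not the thesis's own algorithms).

        # Logarithmic-depth parallel prefix (inclusive scan), simulated
        # sequentially; each while-iteration is one parallel round.
        def prefix_sum(values):
            result = list(values)
            step = 1
            while step < len(result):
                # one round: element i adds the value 'step' positions back
                result = [
                    result[i] + (result[i - step] if i >= step else 0)
                    for i in range(len(result))
                ]
                step *= 2
            return result

        print(prefix_sum([3, 1, 4, 1, 5]))  # [3, 4, 8, 9, 14]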

  12. Using Computers in the Interrelated Resource Room

    ERIC Educational Resources Information Center

    Bowles, Steffanie

    2006-01-01

    This article explores ways computers can be used in special education resource rooms. Discusses management of computer access, integration of the computer into the curriculum, software selection, and negative computer practices. Models of computer use, including use of the computer as an individual learning center and as a cognitive "tool kit"…

  13. Computer Maintenance Operations Center (CMOC), additional computer support equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA

  14. Computational models and resource allocation for supercomputers

    NASA Technical Reports Server (NTRS)

    Mauney, Jon; Agrawal, Dharma P.; Harcourt, Edwin A.; Choe, Young K.; Kim, Sukil

    1989-01-01

    There are several different architectures used in supercomputers, with differing computational models. These different models present a variety of resource allocation problems that must be solved. The computational needs of a program must be cast in terms of the computational model supported by the supercomputer, and this must be done in a way that makes effective use of the machine's resources. This is the resource allocation problem. The computational models of available supercomputers and the associated resource allocation techniques are surveyed. It is shown that many problems and solutions appear repeatedly in very different computing environments. Some case studies are presented, showing concrete computational models and the allocation strategies used.

  15. COMPUTATIONAL RESOURCES FOR BIOFUEL FEEDSTOCK SPECIES

    SciTech Connect

    Buell, Carol Robin; Childs, Kevin L

    2013-05-07

    While current production of ethanol as a biofuel relies on starch and sugar inputs, it is anticipated that sustainable production of ethanol for biofuel use will utilize lignocellulosic feedstocks. Candidate plant species to be used for lignocellulosic ethanol production include a large number of species within the Grass, Pine and Birch plant families. For these biofuel feedstock species, there are variable amounts of genome sequence resources available, ranging from complete genome sequences (e.g. sorghum, poplar) to transcriptome data sets (e.g. switchgrass, pine). These data sets are not only dispersed in location but also disparate in content. It will be essential to leverage and improve these genomic data sets for the improvement of biofuel feedstock production. The objectives of this project were to provide computational tools and resources for data-mining genome sequence/annotation and large-scale functional genomic datasets available for biofuel feedstock species. We have created a Bioenergy Feedstock Genomics Resource that provides a web-based portal or clearing house for genomic data for plant species relevant to biofuel feedstock production. Sequence data from a total of 54 plant species are included in the Bioenergy Feedstock Genomics Resource, including model plant species that permit leveraging of knowledge across taxa to biofuel feedstock species. We have generated additional computational analyses of these data, including uniform annotation, to facilitate genomic approaches to improved biofuel feedstock production. These data have been centralized in the publicly available Bioenergy Feedstock Genomics Resource (http://bfgr.plantbiology.msu.edu/).

  16. Statistics Online Computational Resource for Education

    PubMed Central

    Dinov, Ivo D.; Christou, Nicolas

    2011-01-01

    Summary The Statistics Online Computational Resource (www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials – instructional resources and computational libraries. PMID:21297884

  17. Statistics Online Computational Resource for Education

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Christou, Nicolas

    2009-01-01

    The Statistics Online Computational Resource (http://www.SOCR.ucla.edu) provides one of the largest collections of free Internet-based resources for probability and statistics education. SOCR develops, validates and disseminates two core types of materials--instructional resources and computational libraries. (Contains 2 figures.)

  18. Computational Process Modeling for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2014-01-01

    Computational process and material modeling of powder-bed additive manufacturing of IN 718. Goals: optimize material build parameters with reduced time and cost through modeling; increase understanding of build properties; increase reliability of builds; decrease time to adoption of the process for critical hardware; potentially decrease post-build heat treatments. Approach: conduct single-track and coupon builds at various build parameters; record build-parameter information and QM meltpool data; refine the Applied Optimization powder-bed AM process model using these data; report thermal modeling results; conduct metallography of build samples; calibrate STK models using the metallography findings; run STK models using AO thermal profiles and report the results; validate the modeling with an additional build. Findings to date: photodiode intensity measurements are highly linear with power input; melt pool intensity is highly correlated with melt pool size; melt pool size and intensity increase with power. Applied Optimization will use the data to develop a powder-bed additive manufacturing process model.

  19. A model nursing computer resource center.

    PubMed

    Mueller, Sheryl S; Pullen, Richard L; McGee, K Sue

    2002-01-01

    Nursing graduates are required to demonstrate computer technology skills and critical reflective thinking skills in the workplace. The authors discuss a model computer resource center that enhances the acquisition of these requisite skills by students in both an associate degree and vocational nursing program. The computer resource center maximizes student learning and promotes faculty effectiveness and efficiency by a "full-service" approach to computerized testing, information technology instruction, online research, and interactive computer program practice. PMID:12023644

  20. Computer Technology Resources for Literacy Projects.

    ERIC Educational Resources Information Center

    Florida State Council on Aging, Tallahassee.

    This resource booklet was prepared to assist literacy projects and community adult education programs in determining the technology they need to serve more older persons. Section 1 contains the following reprinted articles: "The Human Touch in the Computer Age: Seniors Learn Computer Skills from Schoolkids" (Suzanne Kashuba); "Computer Instruction…

  1. Production scheduling with discrete and renewable additional resources

    NASA Astrophysics Data System (ADS)

    Kalinowski, K.; Grabowik, C.; Paprocka, I.; Kempa, W.

    2015-11-01

    In this paper an approach to planning additional resources when scheduling operations is discussed. The considered resources are assumed to be discrete and renewable. In most research in the scheduling domain, the basic and often the only type of resource regarded is a workstation, which can be understood as a machine, a device, or even a separated space on the shop floor. In many cases, the detailed scheduling of an operation reveals the need for more than one resource for its implementation. Resource requirements for an operation may relate to different resources or to resources of the same type. Additional resources most often means those human resources, tools, or equipment whose limited availability in the manufacturing system may influence the execution dates of some operations. The paper presents the concept of the division into basic and additional resources and a method for planning them. A situation in which the sets of basic and additional resources are not separable (the same additional resource may be a basic resource for another operation) is also considered. Scheduling of operations involving a greater number of resources can cause many difficulties, depending on whether the resource is involved for the entire duration of the operation, only in selected parts of it (e.g. as auxiliary staff at setup time), or cyclically, e.g. when an operator supports more than one machine or supervises the execution of several operations. For this reason, the dates and working times of a resource's participation in an operation can differ. These issues are crucial when modelling the production scheduling environment and designing structures for scheduling software development.

  2. Computed tomography characterisation of additive manufacturing materials.

    PubMed

    Bibb, Richard; Thompson, Darren; Winder, John

    2011-06-01

    Additive manufacturing, covering processes frequently referred to as rapid prototyping and rapid manufacturing, provides new opportunities in the manufacture of highly complex and custom-fitting medical devices and products. Whilst many medical applications of AM have been explored and the physical properties of the resulting parts have been studied, the characterisation of AM materials in computed tomography has not been explored. The aim of this study was to determine the CT number of commonly used AM materials. There are many potential applications of the information resulting from this study in the design and manufacture of wearable medical devices, implants, prostheses and medical imaging test phantoms. A selection of 19 AM material samples were CT scanned and the resultant images analysed to ascertain the materials' CT number and appearance in the images. It was found that some AM materials have CT numbers very similar to those of human tissues, that FDM, SLA and SLS produce samples that appear uniform on CT images, and that 3D-printed materials show a variation in internal structure.
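
    For reference, the CT number reported in such studies is conventionally expressed in Hounsfield units (HU), on which water is 0 HU and air is -1000 HU. This is the standard definition, not a formula taken from the paper:

        % Standard Hounsfield-unit definition (general background)
        \text{CT number (HU)} \;=\; 1000 \times
        \frac{\mu_{\text{material}} - \mu_{\text{water}}}{\mu_{\text{water}} - \mu_{\text{air}}}

    where \mu is the linear attenuation coefficient measured by the scanner.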

  3. Computational Process Modeling for Additive Manufacturing (OSU)

    NASA Technical Reports Server (NTRS)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, which would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  4. Resources for Computational Geophysics Courses

    NASA Astrophysics Data System (ADS)

    Keers, Henk; Rondenay, Stéphane; Harlap, Yaël; Nordmo, Ivar

    2014-09-01

    An important skill that students in solid Earth physics need to acquire is the ability to write computer programs that can be used for the processing, analysis, and modeling of geophysical data and phenomena. Therefore, this skill (which we call "computational geophysics") is a core part of any undergraduate geophysics curriculum. In this Forum, we share our personal experience in teaching such a course.

  5. BIONET: national computer resource for molecular biology.

    PubMed Central

    Smith, D H; Brutlag, D; Friedland, P; Kedes, L H

    1986-01-01

    This paper describes briefly the BIONET National Computer Resource for Molecular Biology. This presentation is intended as information for scientists in molecular biology and related disciplines who require access to computational methods for sequence analysis. We describe the goals, and the service and research opportunities offered to the community by BIONET, the relationship of BIONET to other national and regional resources, our recent efforts toward distribution of the resource to BIONET Satellites, and procedures for investigators to gain access to the Resource. PMID:3945548

  6. Methods and systems for providing reconfigurable and recoverable computing resources

    NASA Technical Reports Server (NTRS)

    Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)

    2010-01-01

    A method for optimizing the use of digital computing resources to achieve reliability and availability of the computing resources is disclosed. The method comprises providing one or more processors with a recovery mechanism, the one or more processors executing one or more applications. A determination is made whether the one or more processors needs to be reconfigured. A rapid recovery is employed to reconfigure the one or more processors when needed. A computing system that provides reconfigurable and recoverable computing resources is also disclosed. The system comprises one or more processors with a recovery mechanism, with the one or more processors configured to execute a first application, and an additional processor configured to execute a second application different than the first application. The additional processor is reconfigurable with rapid recovery such that the additional processor can execute the first application when one of the one or more processors fails.
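
    A toy sketch of the failover idea in the abstract, with all class and application names invented for illustration; it is not the patented implementation.

        # Toy failover sketch: a spare processor is reconfigured via "rapid
        # recovery" to run the application of a failed primary processor.
        class Processor:
            def __init__(self, name, application):
                self.name = name
                self.application = application  # application currently executing
                self.healthy = True

            def rapid_recovery(self, application):
                # stand-in for the recovery mechanism: reconfigure this
                # processor to execute a different application
                self.application = application
                self.healthy = True

        def monitor(primaries, spare):
            for proc in primaries:
                if not proc.healthy:
                    spare.rapid_recovery(proc.application)
                    print(f"{spare.name} took over {proc.application} from {proc.name}")

        p1 = Processor("P1", "flight-control")
        spare = Processor("SPARE", "telemetry")
        p1.healthy = False
        monitor([p1], spare)  # SPARE took over flight-control from P1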

  7. Software and resources for computational medicinal chemistry

    PubMed Central

    Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C

    2011-01-01

    Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404

  8. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  9. Computer Network Resources for Physical Geography Instruction.

    ERIC Educational Resources Information Center

    Bishop, Michael P.; And Others

    1993-01-01

    Asserts that the use of computer networks provides an important and effective resource for geography instruction. Describes the use of the Internet network in physical geography instruction. Provides an example of the use of Internet resources in a climatology/meteorology course. (CFR)

  10. Resources for Teaching with Computers in History.

    ERIC Educational Resources Information Center

    Seiter, David M.

    1988-01-01

    Highlights seven resources for teaching history with computers. Titles include "Technology and the Social Studies: A Vision"; "Using Databases in History Teaching"; "Computers and Database Management in the History Curriculum"; "Social Studies. Microsoft Courseware Evaluations"; "Teaching Comparative Local History: Upper Mississippi River Towns";…

  11. Managing Urban School System Resources: New Procedures, Additional Actors.

    ERIC Educational Resources Information Center

    Hentschke, Guilbert C.

    In recent years urban school systems have had to face unusually severe economic constraints. In the process of adjusting to these constraints, urban systems will likely seek new ways to reallocate existing resources and will undertake more cooperative ventures with other organizational entities to gain access to additional resources. Four…

  12. Modern meteorological computing resources - The Maryland experience

    NASA Technical Reports Server (NTRS)

    Huffman, George J.

    1988-01-01

    The Department of Meteorology at the University of Maryland is developing one of the first computer systems in meteorology to take advantage of the new networked computer architecture that has been made possible by recent advances in computer and communication technology. Elements of the department's system include scientific workstations, local mainframe computers, remote mainframe computers, local-area networks, 'long-haul' computer-to-computer communications, and 'receive-only' communications. Some background is provided, together with highlights of some lessons that were learned in carrying out the design. In agreement with work in the Unidata Project, this work shows that the networked computer architecture discussed here presents a new style of resources for solving problems that arise in meteorological research and education.

  13. Argonne's Laboratory computing resource center : 2006 annual report.

    SciTech Connect

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    …computing facilities, and improving the scientific reach and performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has framed a 'path forward' for additional computing resources.

  14. Argonne Laboratory Computing Resource Center - FY2004 Report.

    SciTech Connect

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  15. Argonne's Laboratory Computing Resource Center 2009 annual report.

    SciTech Connect

    Bair, R. B.

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research workhorse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, Jazz has given researchers a system they can count on to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  16. Exploiting volatile opportunistic computing resources with Lobster

    NASA Astrophysics Data System (ADS)

    Woodard, Anna; Wolf, Matthias; Mueller, Charles; Tovar, Ben; Donnelly, Patrick; Hurtado Anampa, Kenyi; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2015-12-01

    Analysis of high energy physics experiments using the Compact Muon Solenoid (CMS) at the Large Hadron Collider (LHC) can be limited by availability of computing resources. As a joint effort involving computer scientists and CMS physicists at Notre Dame, we have developed an opportunistic workflow management tool, Lobster, to harvest available cycles from university campus computing pools. Lobster consists of a management server, file server, and worker processes which can be submitted to any available computing resource without requiring root access. Lobster makes use of the Work Queue system to perform task management, while the CMS specific software environment is provided via CVMFS and Parrot. Data is handled via Chirp and Hadoop for local data storage and XrootD for access to the CMS wide-area data federation. An extensive set of monitoring and diagnostic tools has been developed to facilitate system optimisation. We have tested Lobster using the 20,000-core cluster at Notre Dame, achieving approximately 8-10k tasks running simultaneously, sustaining approximately 9 Gbit/s of input data and 340 Mbit/s of output data.
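
    Lobster builds on the CCTools Work Queue system; the sketch below shows the generic Work Queue master/worker pattern in its Python bindings, with a placeholder command and files rather than Lobster's real workflow (API details vary across CCTools versions).

        # Generic Work Queue master: queue a task, then collect results as
        # opportunistic workers attach. Requires the CCTools 'work_queue'
        # Python bindings; file names here are placeholders.
        import work_queue as wq

        q = wq.WorkQueue(port=9123)  # master listens for workers on this port

        t = wq.Task("./analyze.sh events.root > hist.out")
        t.specify_input_file("analyze.sh")
        t.specify_input_file("events.root")
        t.specify_output_file("hist.out")
        q.submit(t)

        while not q.empty():
            done = q.wait(5)  # block up to 5 s for a finished task
            if done:
                print("task", done.id, "returned", done.return_status)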

  17. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    SciTech Connect

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    …comprehensive scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has begun developing a 'path forward' plan for additional computing resources.

  18. ACToR - Aggregated Computational Toxicology Resource

    SciTech Connect

    Judson, Richard; Richard, Ann; Dix, David; Houck, Keith; Elloumi, Fathi; Martin, Matthew; Cathey, Tommy; Transue, Thomas R.; Spencer, Richard; Wolf, Maritja

    2008-11-15

    ACToR (Aggregated Computational Toxicology Resource) is a database and set of software applications that bring into one central location many types and sources of data on environmental chemicals. Currently, the ACToR chemical database contains information on chemical structure, in vitro bioassays and in vivo toxicology assays derived from more than 150 sources including the U.S. Environmental Protection Agency (EPA), Centers for Disease Control (CDC), U.S. Food and Drug Administration (FDA), National Institutes of Health (NIH), state agencies, corresponding government agencies in Canada, Europe and Japan, universities, the World Health Organization (WHO) and non-governmental organizations (NGOs). At the EPA National Center for Computational Toxicology, ACToR helps manage large data sets being used in a high-throughput environmental chemical screening and prioritization program called ToxCast™.

  19. Addition of multiple limiting resources reduces grassland diversity

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Niche dimensionality is the most general theoretical explanation for biodiversity: more niches allow for more ecological tradeoffs between species and thus greater opportunities for coexistence. Resource competition theory predicts that removing resource limitations, by increasing resource availabil...

  20. Additional development of the XTRAN3S computer program

    NASA Technical Reports Server (NTRS)

    Borland, C. J.

    1989-01-01

    Additional developments and enhancements to the XTRAN3S computer program, a code for calculation of steady and unsteady aerodynamics, and associated aeroelastic solutions, for 3-D wings in the transonic flow regime are described. Algorithm improvements for the XTRAN3S program were provided including an implicit finite difference scheme to enhance the allowable time step and vectorization for improved computational efficiency. The code was modified to treat configurations with a fuselage, multiple stores/nacelles/pylons, and winglets. Computer program changes (updates) for error corrections and updates for version control are provided.

  1. Scalable resource management in high performance computers.

    SciTech Connect

    Frachtenberg, E.; Petrini, F.; Fernandez Peinador, J.; Coll, S.

    2002-01-01

    Clusters of workstations have emerged as an important platform for building cost-effective, scalable and highly-available computers. Although many hardware solutions are available today, the largest challenge in making large-scale clusters usable lies in the system software. In this paper we present STORM, a resource management tool designed to provide scalability, low overhead and the flexibility necessary to efficiently support and analyze a wide range of job scheduling algorithms. STORM achieves these feats by closely integrating the management daemons with the low-level features that are common in state-of-the-art high-performance system area networks. The architecture of STORM is based on three main technical innovations. First, a sizable part of the scheduler runs in the thread processor located on the network interface. Second, we use hardware collectives that are highly scalable both for implementing control heartbeats and to distribute the binary of a parallel job in near-constant time, irrespective of job and machine sizes. Third, we use an I/O bypass protocol that allows fast data movements from the file system to the communication buffers in the network interface and vice versa. The experimental results show that STORM can launch a job with a binary of 12 MB on a 64-processor/32-node cluster in less than 0.25 sec on an empty network, in less than 0.45 sec when all the processors are busy computing other jobs, and in less than 0.65 sec when the network is flooded with background traffic. This paper provides experimental and analytical evidence that these results scale to a much larger number of nodes. To the best of our knowledge, STORM is at least two orders of magnitude faster than existing production schedulers in launching jobs, performing resource management tasks and gang scheduling.

  2. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  3. Cost-Benefit Analysis of Computer Resources for Machine Learning

    USGS Publications Warehouse

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.
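
    A minimal sketch of the stratified-sampling idea: allocate more calibration points to strata where the current model fits poorly. The coordinate, strata, and error measure are invented stand-ins, not the report's PNN experiments.

        # Stratified sampling sketch: split the domain into strata, then
        # spend a fixed calibration budget in proportion to each stratum's
        # mean model error.
        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 1, 10_000)     # stand-in spatial coordinate
        residual = np.abs(np.sin(6 * x))  # stand-in per-point model error

        strata = np.digitize(x, bins=np.linspace(0, 1, 11)) - 1  # 10 strata
        weights = np.array([residual[strata == s].mean() for s in range(10)])
        weights /= weights.sum()

        budget = (500 * weights).round().astype(int)  # points per stratum
        sample_idx = np.concatenate([
            rng.choice(np.where(strata == s)[0], size=n, replace=False)
            for s, n in enumerate(budget) if n > 0
        ])
        print("points per stratum:", budget)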

  4. NASA Center for Computational Sciences: History and Resources

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  5. Resource estimation in high performance medical image computing.

    PubMed

    Banalagay, Rueben; Covington, Kelsie Jade; Wilkes, D M; Landman, Bennett A

    2014-10-01

    Medical imaging analysis processes often involve the concatenation of many steps (e.g., multi-stage scripts) to integrate and realize advancements from image acquisition, image processing, and computational analysis. With the dramatic increase in data size for medical imaging studies (e.g., improved resolution, higher throughput acquisition, shared databases), interesting study designs are becoming intractable or impractical on individual workstations and servers. Modern pipeline environments provide control structures to distribute computational load in high performance computing (HPC) environments. However, high performance computing environments are often shared resources, and scheduling computation across these resources necessitates higher level modeling of resource utilization. Submission of 'jobs' requires an estimate of the CPU runtime and memory usage. The resource requirements for medical image processing algorithms are difficult to predict since the requirements can vary greatly between different machines, different execution instances, and different data inputs. Poor resource estimates can lead to wasted resources in high performance environments due to incomplete executions and extended queue wait times. Hence, resource estimation is becoming a major hurdle for medical image processing algorithms to efficiently leverage high performance computing environments. Herein, we present our implementation of a resource estimation system to overcome these difficulties and ultimately provide users with the ability to more efficiently utilize high performance computing resources. PMID:24906466
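
    A hedged sketch of the general idea (not the authors' system): fit a simple model to past runs, then pad the prediction when requesting memory for a new job so an underestimate does not kill it.

        # Regression-based resource estimation sketch with invented history.
        import numpy as np

        # (input size in millions of voxels, observed peak memory in GB)
        runs = np.array([[8, 1.9], [16, 3.8], [32, 7.4], [64, 15.1]])
        coeffs = np.polyfit(runs[:, 0], runs[:, 1], deg=1)  # linear fit

        def estimate_memory_gb(voxels_millions, safety=1.25):
            # pad the prediction to absorb run-to-run variability
            return float(np.polyval(coeffs, voxels_millions)) * safety

        print(f"request {estimate_memory_gb(48):.1f} GB for a 48M-voxel input")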

  6. Diversity in computing technologies and strategies for dynamic resource allocation

    SciTech Connect

    Garzoglio, G.; Gutsche, O.

    2015-01-01

    High Energy Physics (HEP) is a very data-intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer details, requiring ever-increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP has provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.

  7. Computing the Envelope for Stepwise-Constant Resource Allocations

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Computing tight resource-level bounds is a fundamental problem in the construction of flexible plans with resource utilization. In this paper we describe an efficient algorithm that builds a resource envelope, the tightest possible such bound. The algorithm is based on transforming the temporal network of resource consuming and producing events into a flow network with nodes equal to the events and edges equal to the necessary predecessor links between events. A staged maximum flow problem on the network is then used to compute the time of occurrence and the height of each step of the resource envelope profile. Each stage has the same computational complexity as solving a maximum flow problem on the entire flow network. This makes the method computationally feasible and promising for use in the inner loop of flexible-time scheduling algorithms.
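
    For intuition, the sketch below computes only the naive upper bound on the resource level for events with flexible times, ignoring the predecessor links between events; the staged max-flow construction described above is what tightens this kind of bound into the true envelope. Event data are invented.

        # Naive stepwise upper bound: a producer counts as soon as it *could*
        # have fired (lower bound passed), a consumer only once it *must*
        # have fired (upper bound passed). Precedence links are ignored.
        events = [  # (earliest time, latest time, resource delta)
            (0, 4, +2),  # producer
            (1, 3, -1),  # consumer
            (2, 6, +1),  # producer
        ]

        def naive_upper_bound(t):
            level = 0
            for lb, ub, delta in events:
                if delta > 0 and lb <= t:
                    level += delta
                if delta < 0 and ub <= t:
                    level += delta
            return level

        print([naive_upper_bound(t) for t in range(7)])  # [2, 2, 3, 2, 2, 2, 2]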

  8. Resource Provisioning in SLA-Based Cluster Computing

    NASA Astrophysics Data System (ADS)

    Xiong, Kaiqi; Suh, Sang

    Cluster computing is excellent for parallel computation. It has become increasingly popular. In cluster computing, a service level agreement (SLA) is a set of quality of services (QoS) and a fee agreed between a customer and an application service provider. It plays an important role in an e-business application. An application service provider uses a set of cluster computing resources to support e-business applications subject to an SLA. In this paper, the QoS includes percentile response time and cluster utilization. We present an approach for resource provisioning in such an environment that minimizes the total cost of cluster computing resources used by an application service provider for an e-business application that often requires parallel computation for high service performance, availability, and reliability while satisfying a QoS and a fee negotiated between a customer and the application service provider. Simulation experiments demonstrate the applicability of the approach.
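
    A generic formulation of this kind of provisioning problem, paraphrased under assumed notation rather than taken from the paper:

        % Sketch: choose resource counts n_i of each type i to minimize cost
        \min_{n}\; \sum_{i} c_i\, n_i
        \quad \text{subject to} \quad
        \Pr\!\big[\, T(n) \le \tau \,\big] \ge p,
        \qquad U(n) \le u_{\max}

    where c_i is the unit fee, T(n) the response time of the provisioned cluster, p the agreed percentile at target \tau, and U(n) the cluster utilization.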

  9. Computing the Envelope for Stepwise Constant Resource Allocations

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Estimating tight resource-level bounds is a fundamental problem in the construction of flexible plans with resource utilization. In this paper we describe an efficient algorithm that builds a resource envelope, the tightest possible such bound. The algorithm is based on transforming the temporal network of resource consuming and producing events into a flow network with nodes equal to the events and edges equal to the necessary predecessor links between events. The incremental solution of a staged maximum flow problem on the network is then used to compute the time of occurrence and the height of each step of the resource envelope profile. The staged algorithm has the same computational complexity as solving a maximum flow problem on the entire flow network. This makes the method computationally feasible for use in the inner loop of search-based scheduling algorithms.

  10. ACToR - Aggregated Computational Toxicology Resource

    EPA Science Inventory

    We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.

  11. Resource requirements for digital computations on electrooptical systems.

    PubMed

    Eshaghian, M M; Panda, D K; Kumar, V K

    1991-03-10

    In this paper we study the resource requirements of electrooptical organizations in performing digital computing tasks. We define a generic model of parallel computation using optical interconnects, called the optical model of computation (OMC). In this model, computation is performed in digital electronics and communication is performed using free space optics. Using this model we derive relationships between information transfer and computational resources in solving a given problem. To illustrate our results, we concentrate on a computationally intensive operation, 2-D digital image convolution. Irrespective of the input/output scheme and the order of computation, we show a lower bound of Ω(nw) on the optical volume required for convolving a w × w kernel with an n × n image, if the input bits are given to the system only once.
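
    For reference, the operation whose optical volume is bounded is the standard discrete 2-D convolution of an n × n image I with a w × w kernel K (zero padding assumed outside the image); this is the textbook definition, not notation taken from the paper:

        % Standard discrete 2-D convolution (general background)
        (I * K)[x, y] \;=\; \sum_{i=0}^{w-1} \sum_{j=0}^{w-1} K[i, j]\, I[x - i,\, y - j],
        \qquad 0 \le x, y \le n - 1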

  12. Trading Classical and Quantum Computational Resources

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey; Smith, Graeme; Smolin, John A.

    2016-04-01

    We propose examples of a hybrid quantum-classical simulation where a classical computer assisted by a small quantum processor can efficiently simulate a larger quantum system. First, we consider sparse quantum circuits such that each qubit participates in O(1) two-qubit gates. It is shown that any sparse circuit on n+k qubits can be simulated by sparse circuits on n qubits and a classical processing that takes time 2^{O(k)} poly(n). Second, we study Pauli-based computation (PBC), where allowed operations are nondestructive eigenvalue measurements of n-qubit Pauli operators. The computation begins by initializing each qubit in the so-called magic state. This model is known to be equivalent to the universal quantum computer. We show that any PBC on n+k qubits can be simulated by PBCs on n qubits and a classical processing that takes time 2^{O(k)} poly(n). Finally, we propose a purely classical algorithm that can simulate a PBC on n qubits in a time 2^{αn} poly(n), where α ≈ 0.94. This improves upon the brute-force simulation method, which takes time 2^n poly(n). Our algorithm exploits the fact that n-fold tensor products of magic states admit a low-rank decomposition into n-qubit stabilizer states.

  13. Computer Cartography. Resource Paper No. 17.

    ERIC Educational Resources Information Center

    Peucker, Thomas K.

    The theory of computer cartography is emphasized in an attempt to bring some related ideas together within a single framework. The paper is part of a series designed to fill the gap between significant research in geography and accessible materials. Part I introduces information theory and cartography, the features of numeric cartography and their…

  14. The CREATE Network (Computer Resource Educational Access in Tennessee Education).

    ERIC Educational Resources Information Center

    Moon, Fletcher F.

    The CREATE Network (Computer Resource Educational Access in Tennessee Education) brought together library professionals from Tennessee's seven historically black colleges and universities (HBCUs) for purposes of training and implementation of library applications of computer-based information technology. Annual training seminars were held at…

  15. Development of Computer-Based Resources for Textile Education.

    ERIC Educational Resources Information Center

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  16. The Computer Explosion: Implications for Educational Equity. Resource Notebook.

    ERIC Educational Resources Information Center

    Denbo, Sheryl, Comp.

    This notebook was prepared to provide resources for educators interested in using computers to increase opportunities for all students. The notebook contains specially prepared materials and selected newspaper and journal articles. The first section reviews the issues related to computer equity (equal access, tracking through different…

  17. Performance Evaluation of Resource Management in Cloud Computing Environments.

    PubMed

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price. PMID:26555730
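
    A toy sketch of the on-the-fly scaling decision described above; the SLA threshold, step size, and per-unit price are invented illustrations, not the paper's parameters.

        # Threshold-based rescaling sketch: add capacity when the SLA is
        # violated, release it (and cost) when comfortably under the SLA.
        def rescale(vcpus, response_time_s, sla_s=0.5, price_per_vcpu=0.05):
            if response_time_s > sla_s:                  # SLA violated: scale up
                vcpus += 1
            elif response_time_s < 0.5 * sla_s and vcpus > 1:
                vcpus -= 1                               # over-provisioned: scale down
            return vcpus, vcpus * price_per_vcpu

        vcpus = 2
        for rt in [0.7, 0.6, 0.4, 0.2, 0.2]:             # observed response times
            vcpus, cost = rescale(vcpus, rt)
            print(f"rt={rt:.1f}s -> {vcpus} vCPUs at ${cost:.2f}/h")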

  18. Performance Evaluation of Resource Management in Cloud Computing Environments

    PubMed Central

    Batista, Bruno Guazzelli; Estrella, Julio Cezar; Ferreira, Carlos Henrique Gomes; Filho, Dionisio Machado Leite; Nakamura, Luis Hideo Vasconcelos; Reiff-Marganiec, Stephan; Santana, Marcos José; Santana, Regina Helena Carlucci

    2015-01-01

    Cloud computing is a computational model in which resource providers can offer on-demand services to clients in a transparent way. However, to be able to guarantee quality of service without limiting the number of accepted requests, providers must be able to dynamically manage the available resources so that they can be optimized. This dynamic resource management is not a trivial task, since it involves meeting several challenges related to workload modeling, virtualization, performance modeling, deployment and monitoring of applications on virtualized resources. This paper carries out a performance evaluation of a module for resource management in a cloud environment that includes handling available resources during execution time and ensuring the quality of service defined in the service level agreement. An analysis was conducted of different resource configurations to define which dimension of resource scaling has a real influence on client requests. The results were used to model and implement a simulated cloud system, in which the allocated resource can be changed on-the-fly, with a corresponding change in price. In this way, the proposed module seeks to satisfy both the client by ensuring quality of service, and the provider by ensuring the best use of resources at a fair price. PMID:26555730

  1. A Novel College Network Resource Management Method using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Chen

    At present, information construction in colleges mainly comprises the construction of college networks and management information systems, and many problems arise during this process. Cloud computing is a development of distributed processing, parallel processing and grid computing: data are stored in the cloud, software and services are placed in the cloud and built on top of various standards and protocols, and they can be accessed through all kinds of equipment. This article introduces cloud computing and its functions, analyzes the existing problems of college network resource management, and applies cloud computing technology and methods to the construction of a college information-sharing platform.

  2. X-ray computed tomography for additive manufacturing: a review

    NASA Astrophysics Data System (ADS)

    Thompson, A.; Maskery, I.; Leach, R. K.

    2016-07-01

    In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation that resolution imposes on the improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM.

  3. Additional support for the TDK/MABL computer program

    NASA Technical Reports Server (NTRS)

    Nickerson, G. R.; Dunn, Stuart S.

    1993-01-01

    An advanced version of the Two-Dimensional Kinetics (TDK) computer program was developed under contract and released to the propulsion community in early 1989. Exposure of the code to this community indicated a need for improvements in certain areas. In particular, the TDK code needed to be adapted to the special requirements imposed by the Space Transportation Main Engine (STME) development program. This engine utilizes injection of the gas generator exhaust into the primary nozzle by means of a set of slots. The subsequent mixing of this secondary stream with the primary stream with finite rate chemical reaction can have a major impact on the engine performance and the thermal protection of the nozzle wall. In attempting to calculate this reacting boundary layer problem, the Mass Addition Boundary Layer (MABL) module of TDK was found to be deficient in several respects. For example, when finite rate chemistry was used to determine gas properties, (MABL-K option) the program run times became excessive because extremely small step sizes were required to maintain numerical stability. A robust solution algorithm was required so that the MABL-K option could be viable as a rocket propulsion industry design tool. Solving this problem was a primary goal of the phase 1 work effort.

  4. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of super-computers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation, supporting the development of strategies that improve aviation safety and identify precursors to component failure.

  5. Shared resource control between human and computer

    NASA Technical Reports Server (NTRS)

    Hendler, James; Wilson, Reid

    1989-01-01

    The advantages of an AI system that actively monitors human control of a shared resource (such as a telerobotic manipulator) are presented. A system is described in which a simple AI planning program gains efficiency by monitoring human actions and recognizing when the actions cause a change in the system's assumed state of the world. This enables the planner to recognize when an interaction occurs between human actions and system goals and to maintain up-to-date knowledge of the state of the world; the system can thus inform the operator when a human action would undo a goal achieved by the system or render a system goal unachievable, and can efficiently replan the establishment of goals after human intervention.

  6. Incorporating computational resources in a cancer research program

    PubMed Central

    Woods, Nicholas T.; Jhuraney, Ankita; Monteiro, Alvaro N.A.

    2015-01-01

    Recent technological advances have transformed cancer genetics research. These advances have served as the basis for the generation of a number of richly annotated datasets relevant to the cancer geneticist. In addition, many of these technologies are now within reach of smaller laboratories to answer specific biological questions. Thus, one of the most pressing issues facing an experimental cancer biology research program in genetics is incorporating data from multiple sources to annotate, visualize, and analyze the system under study. Fortunately, there are several computational resources to aid in this process. However, a significant effort is required to adapt a molecular biology-based research program to take advantage of these datasets. Here, we discuss the lessons learned in our laboratory and share several recommendations to make this transition effectively. This article is not meant to be a comprehensive evaluation of all the available resources, but rather highlight those that we have incorporated into our laboratory and how to choose the most appropriate ones for your research program. PMID:25324189

  7. Integration of Cloud resources in the LHCb Distributed Computing

    NASA Astrophysics Data System (ADS)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  8. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
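    The calibration idea behind SCOPE can be illustrated with a toy model: fit per-machine, per-code coefficients once from benchmark runs, then reuse them to predict resource use cheaply. The linear cost model, feature set and numbers below are invented for illustration and are not SCOPE's actual formulation.

        # Hypothetical calibration sketch: predict CPU seconds from
        # problem-size features measured in a handful of benchmark runs.
        import numpy as np

        # Benchmark runs: [elements, degrees of freedom, matrix bandwidth]
        X = np.array([[100,  600, 40],
                      [200, 1200, 55],
                      [400, 2400, 70],
                      [800, 4800, 95]], dtype=float)
        cpu_seconds = np.array([2.1, 4.6, 9.8, 21.5])

        # Fit a linear cost model with an intercept (the "calibration").
        A = np.hstack([X, np.ones((X.shape[0], 1))])
        coef, *_ = np.linalg.lstsq(A, cpu_seconds, rcond=None)

        def predict_cpu(elements, dof, bandwidth):
            """Cheap prediction, reusable for any problem on this machine."""
            return np.array([elements, dof, bandwidth, 1.0]) @ coef

        print(f"predicted CPU for a 600-element model: {predict_cpu(600, 3600, 80):.1f} s")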

  9. The Job Demands-Resources Model: An Analysis of Additive and Joint Effects of Demands and Resources

    ERIC Educational Resources Information Center

    Hu, Qiao; Schaufeli, Wilmar B.; Taris, Toon W.

    2011-01-01

    The present study investigated the additive, synergistic, and moderating effects of job demands and job resources on well-being (burnout and work engagement) and organizational outcomes, as specified by the Job Demands-Resources (JD-R) model. A survey was conducted among two Chinese samples: 625 blue collar workers and 761 health professionals. A…

  10. The Gain of Resource Delegation in Distributed Computing Environments

    NASA Astrophysics Data System (ADS)

    Fölling, Alexander; Grimme, Christian; Lepping, Joachim; Papaspyrou, Alexander

    In this paper, we address job scheduling in Distributed Computing Infrastructures, that is, loosely coupled networks of autonomously acting High Performance Computing systems. In contrast to the common approach of mutual workload exchange, we consider the more intuitive operator's viewpoint of load-dependent resource reconfiguration. In the case of a site's over-utilization, the scheduling system is able to lease resources from other sites to keep up service quality for its local user community. Conversely, granting idle resources to other sites can increase utilization in times of low local workload and thus ensure higher efficiency. The evaluation considers real workload data and is done with respect to common service quality indicators. For two simple resource exchange policies and three basic setups we show the possible gain of this approach and analyze the dynamics in workload-adaptive reconfiguration behavior.
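    The abstract does not spell out the two exchange policies; the sketch below shows one plausible threshold rule for load-dependent reconfiguration, with all names, thresholds and units invented.

        # Hypothetical lease/grant rule for one site in the infrastructure.
        def reconfigure(local_load, owned_nodes, leased_nodes, high=0.9, low=0.4):
            """Return (lease_request, grant_offer) in nodes for this period."""
            capacity = owned_nodes + leased_nodes
            if local_load > high * capacity:
                return 1, 0    # over-utilized: lease a node from a partner site
            if local_load < low * capacity and owned_nodes > 0:
                return 0, 1    # under-utilized: offer an idle node to partners
            return 0, 0        # keep the current configuration

        print(reconfigure(local_load=95, owned_nodes=80, leased_nodes=20))  # (1, 0)
        print(reconfigure(local_load=30, owned_nodes=100, leased_nodes=0))  # (0, 1)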

  11. Dynamic computing resource allocation in online flood monitoring and prediction

    NASA Astrophysics Data System (ADS)

    Kuchar, S.; Podhoranyi, M.; Vavrik, R.; Portero, A.

    2016-08-01

    This paper presents tools and methodologies for dynamic allocation of high performance computing resources during operation of the Floreon+ online flood monitoring and prediction system. The resource allocation is done throughout the execution of supported simulations to meet the required service quality levels for system operation. It also ensures flexible reactions to changing weather and flood situations, as it is not economically feasible to operate online flood monitoring systems in the full performance mode during non-flood seasons. Different service quality levels are therefore described for different flooding scenarios, and the runtime manager controls them by allocating only minimal resources currently expected to meet the deadlines. Finally, an experiment covering all presented aspects of computing resource allocation in rainfall-runoff and Monte Carlo uncertainty simulation is performed for the area of the Moravian-Silesian region in the Czech Republic.
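    The deadline-driven allocation can be made concrete with a sketch. The service levels, ensemble sizes and per-member cost below are invented; the Floreon+ runtime manager is of course more elaborate.

        # Allocate only the cores expected to meet the current deadline.
        SERVICE_LEVELS = {                    # hypothetical QoS levels
            "dry-season": {"deadline_s": 3600, "members": 10},
            "rain-alert": {"deadline_s": 900,  "members": 50},
            "flood":      {"deadline_s": 300,  "members": 200},
        }

        def cores_needed(members, deadline_s, secs_per_member_core=120.0):
            """Assume Monte Carlo members parallelize ideally across cores."""
            total_work = int(members * secs_per_member_core)
            return max(1, -(-total_work // deadline_s))   # ceiling division

        for level, q in SERVICE_LEVELS.items():
            print(level, "->", cores_needed(q["members"], q["deadline_s"]), "cores")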

  12. Economic models for management of resources in peer-to-peer and grid computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Stockinger, Heinz; Giddy, Jonathan; Abramson, David

    2001-07-01

    The accelerated development in Peer-to-Peer (P2P) and Grid computing has positioned them as promising next generation computing platforms. They enable the creation of Virtual Enterprises (VE) for sharing resources distributed across the world. However, resource management, application development and usage modelling in these environments are complex undertakings. This is due to the geographic distribution of resources that are owned by different organizations or peers. The owners of these resources have different usage or access policies and cost models, and varying loads and availability. In order to address complex resource management issues, we have proposed a computational economy framework for resource allocation and for regulating supply and demand in Grid computing environments. The framework provides mechanisms for optimizing resource provider and consumer objective functions through trading and brokering services. In a real world market, there exist various economic models for setting the price of goods based on supply and demand and their value to the user. These include commodity markets, posted prices, tenders and auctions. In this paper, we discuss the use of these models for interaction between Grid components in deciding resource value and the infrastructure necessary to realize them. In addition to the normal services offered by Grid computing systems, we need an infrastructure to support interaction protocols, allocation mechanisms, currency, secure banking, and enforcement services. Furthermore, we demonstrate the usage of some of these economic models in resource brokering through Nimrod/G deadline- and cost-based scheduling for two different optimization strategies on the World Wide Grid (WWG) testbed, which contains peer-to-peer resources located on five continents: Asia, Australia, Europe, North America, and South America.
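    Two of the pricing models named above, posted prices and tenders, reduce to very small decision rules. The toy broker below is illustrative only; the provider names and prices are invented, and it is not the Nimrod/G scheduler.

        # Posted price: pick the cheapest provider within the consumer's budget.
        def posted_price(offers, budget):
            affordable = {p: c for p, c in offers.items() if c <= budget}
            return min(affordable, key=affordable.get) if affordable else None

        # Sealed-bid tender: award the job to the lowest bid received.
        def lowest_tender(bids):
            return min(bids, key=bids.get)

        offers = {"siteA": 0.12, "siteB": 0.09, "siteC": 0.15}   # $/CPU-hour
        print(posted_price(offers, budget=0.10))                 # siteB
        print(lowest_tender({"siteA": 0.11, "siteC": 0.08}))     # siteC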

  13. Aggregation of Cricket Activity in Response to Resource Addition Increases Local Diversity

    PubMed Central

    Szinwelski, Neucir; Rosa, Cassiano Sousa; Solar, Ricardo Ribeiro de Castro; Sperber, Carlos Frankl

    2015-01-01

    Crickets are often found feeding on fallen fruits among forest litter. Fruits and other sugar-rich resources are not homogeneously distributed, nor are they always available. We therefore expect that crickets dwelling in forest litter have a limited supply of sugar-rich resource, and will perceive this and displace towards resource-supplemented sites. Here we evaluate how sugar availability affects cricket species richness and abundance in old-growth Atlantic forest by spraying sugarcane syrup on leaf litter, simulating increasing availability, and collecting crickets via pitfall trapping. We found an asymptotic positive association between resource addition and species richness, and an interaction between resource addition and species identity on cricket abundance, which indicates differential effects of resource addition among cricket species. Our results indicate that 12 of the 13 cricket species present in forest litter are maintained at low densities by resource scarcity; this highlights sugar-rich resource as a short-term driver of litter cricket community structure in tropical forests. When resource was experimentally increased, species richness increased due to behavioral displacement. We present evidence that the density of many species is limited by resource scarcity and, when resources are added, behavioral displacement promotes increased species packing and alters species composition. Further, our findings have technical applicability for increasing sampling efficiency of local cricket diversity in studies aiming to estimate species richness, but with no regard to local environmental drivers or species-abundance characteristics. PMID:26436669

  14. Computer-Mediated Communication as a Learning Resource.

    ERIC Educational Resources Information Center

    McAteer, Erica; Tolmie, A.; Duffy, C.; Corbett, J.

    1997-01-01

    Examines computer-mediated communication (CMC) in higher education, particularly the evaluation of network resources for course-related student communication. Presents case studies of e-mail seminars in Scottish language studies and music, demonstrates the significance of emphasizing process over outcome, and discusses factors affecting CMC usage:…

  15. Computer Technology in Curriculum and Instruction Handbook: Resources.

    ERIC Educational Resources Information Center

    Collins, Sue; And Others

    This annotated bibliography and resource list is the fourth guide in a series that constitutes a handbook on educational computing for teachers and administrators. Citations include the title, author, publisher, copyright date, current price, and publisher's address. Research and journal articles are listed under the following topic areas:…

  16. ACToR: Aggregated Computational Toxicology Resource (T)

    EPA Science Inventory

    The EPA Aggregated Computational Toxicology Resource (ACToR) is a set of databases compiling information on chemicals in the environment from a large number of public and in-house EPA sources. ACToR has 3 main goals: (1) To serve as a repository of public toxicology information ...

  17. Resource Manual on the Use of Computers in Schooling.

    ERIC Educational Resources Information Center

    New York State Education Dept., Albany. Bureau of Technology Applications.

    This resource manual is designed to provide educators with timely information on the use of computers and related technology in schools. Section one includes a review of the new Bureau of Technology Applications' goal, functions, and major programs and activities; a description of the Model Schools Program, which has been conceptually derived from…

  18. Computing Bounds on Resource Levels for Flexible Plans

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Rijsman, David

    2009-01-01

    A new algorithm efficiently computes the tightest exact bound on the levels of resources induced by a flexible activity plan. Tightness of bounds is extremely important for computations involved in planning because tight bounds can save potentially exponential amounts of search (through early backtracking and detection of solutions), relative to looser bounds. The bound computed by the new algorithm, denoted the resource-level envelope, constitutes the measure of maximum and minimum consumption of resources at any time for all fixed-time schedules in the flexible plan. At each time, the envelope guarantees that there are two fixed-time instantiations: one that produces the minimum level and one that produces the maximum level. Therefore, the resource-level envelope is the tightest possible resource-level bound for a flexible plan, because any tighter bound would exclude the contribution of at least one fixed-time schedule. If the resource-level envelope can be computed efficiently, one could substitute the looser bounds currently used in the inner cores of constraint-posting scheduling algorithms, with the potential for great improvements in performance. What is needed to reduce the cost of computation is an algorithm whose measure of complexity is no greater than a low-degree polynomial in N (where N is the number of activities). The new algorithm satisfies this need. In this algorithm, the computation of resource-level envelopes is based on a novel combination of (1) the theory of shortest paths in the temporal-constraint network for the flexible plan and (2) the theory of maximum flows for a flow network derived from the temporal and resource constraints. The asymptotic complexity of the algorithm is O(N · maxflow(N)), where maxflow(N) is the complexity (and thus the cost) of a maximum-flow computation on the derived network.
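    The envelope algorithm itself is too involved for a short sketch, but its maximum-flow building block is standard. The fragment below solves a tiny flow network with networkx (an assumed dependency; the graph and capacities are invented, not derived from a plan).

        import networkx as nx

        G = nx.DiGraph()
        # Edges carry a 'capacity' attribute, which nx.maximum_flow expects.
        G.add_edge("source", "a", capacity=3)
        G.add_edge("source", "b", capacity=2)
        G.add_edge("a", "b", capacity=1)
        G.add_edge("a", "sink", capacity=2)
        G.add_edge("b", "sink", capacity=3)

        flow_value, flow_dict = nx.maximum_flow(G, "source", "sink")
        print(flow_value)        # 5
        print(flow_dict["a"])    # per-edge flow out of node "a"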

  19. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    NASA Astrophysics Data System (ADS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie; Atlas Collaboration

    2014-06-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources for the HEP community become available. The new cloud technologies also come with new challenges, one of which is the contextualization of computing resources with regard to the requirements of the user and his experiment. In particular, on Google's new cloud platform, Google Compute Engine (GCE), upload of users' virtual machine images is not possible. This precludes application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  20. Additional extensions to the NASCAP computer code, volume 3

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Cooke, D. L.

    1981-01-01

    The ION computer code is designed to calculate charge exchange ion densities, electric potentials, plasma temperatures, and current densities external to a neutralized ion engine in R-Z geometry. The present version assumes the beam ion current and density to be known and specified, and the neutralizing electrons to originate from a hot-wire ring surrounding the beam orifice. The plasma is treated as being resistive, with an electron relaxation time comparable to the plasma frequency. Together with the thermal and electrical boundary conditions described below and other straightforward engine parameters, these assumptions suffice to determine the required quantities. The ION code, written in ASCII FORTRAN for UNIVAC 1100 series computers, is designed to be run interactively, although it can also be run in batch mode. The input is free-format, and the output is mainly graphical, using the machine-independent graphics developed for the NASCAP code. The executive routine calls the code's major subroutines in user-specified order, and the code allows great latitude for restart and parameter change.

  1. 15 CFR 270.204 - Provision of additional resources and services needed by a Team.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... services needed by a Team. 270.204 Section 270.204 Commerce and Foreign Trade Regulations Relating to... CONSTRUCTION SAFETY TEAMS NATIONAL CONSTRUCTION SAFETY TEAMS Investigations § 270.204 Provision of additional resources and services needed by a Team. The Director will determine the appropriate resources that a...

  2. Additional extensions to the NASCAP computer code, volume 1

    NASA Technical Reports Server (NTRS)

    Mandell, M. J.; Katz, I.; Stannard, P. R.

    1981-01-01

    Extensions and revisions to a computer code that comprehensively analyzes problems of spacecraft charging (NASCAP) are documented. Using a fully three dimensional approach, it can accurately predict spacecraft potentials under a variety of conditions. Among the extensions are a multiple electron/ion gun test tank capability, and the ability to model anisotropic and time dependent space environments. Also documented are a greatly extended MATCHG program and the preliminary version of NASCAP/LEO. The interactive MATCHG code was developed into an extremely powerful tool for the study of material-environment interactions. The NASCAP/LEO, a three dimensional code to study current collection under conditions of high voltages and short Debye lengths, was distributed for preliminary testing.

  3. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using the InVesalius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, and implant design and simulation, showing the potential of FDM technology in the medical field. It will also improve communication between medical staff and patients. The current results indicate that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models.
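    The DICOM-to-STL step can be sketched with common open-source libraries: pydicom, scikit-image and numpy-stl (assumed dependencies; the paper used the InVesalius program instead, and the bone threshold below is a rough illustrative value).

        import glob
        import numpy as np
        import pydicom
        from skimage import measure
        from stl import mesh

        # Load a CT series and stack the slices into a 3D volume.
        slices = [pydicom.dcmread(f) for f in glob.glob("ct_series/*.dcm")]
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
        volume = np.stack([s.pixel_array for s in slices]).astype(np.int16)

        # Extract a bone isosurface (illustrative Hounsfield-like threshold).
        verts, faces, _, _ = measure.marching_cubes(volume, level=300)

        # Write the triangle mesh as STL for the printer's slicing software.
        skull = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
        for i, f in enumerate(faces):
            skull.vectors[i] = verts[f]
        skull.save("skull.stl")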

  4. Computational calculation of equilibrium constants: addition to carbonyl compounds.

    PubMed

    Gómez-Bombarelli, Rafael; González-Pérez, Marina; Pérez-Prior, María Teresa; Calle, Emilio; Casado, Julio

    2009-10-22

    Hydration reactions are relevant for understanding many organic mechanisms. Since the experimental determination of hydration and hemiacetalization equilibrium constants is fairly complex, computational calculations now offer a useful alternative to experimental measurements. In this work, carbonyl hydration and hemiacetalization constants were calculated from the free energy differences between compounds in solution, using absolute and relative approaches. The following conclusions can be drawn: (i) The use of a relative approach in the calculation of hydration and hemiacetalization constants allows compensation of systematic errors in the solvation energies. (ii) On average, the methodology proposed here can predict hydration constants within +/- 0.5 log K(hyd) units for aldehydes. (iii) Hydration constants can be calculated for ketones and carboxylic acid derivatives within less than +/- 1.0 log K(hyd), on average, at the CBS-Q level of theory. (iv) The proposed methodology can predict hemiacetal formation constants accurately at the MP2 6-31++G(d,p) level using a common reference. If group references are used, the results obtained using the much cheaper DFT-B3LYP 6-31++G(d,p) level are almost as accurate. (v) In general, the best results are obtained if a common reference for all compounds is used. The use of group references improves the results at the lower levels of theory, but at higher levels, this becomes unnecessary. PMID:19761202

  5. Computational Calculation of Equilibrium Constants: Addition to Carbonyl Compounds

    NASA Astrophysics Data System (ADS)

    Gómez-Bombarelli, Rafael; González-Pérez, Marina; Pérez-Prior, María Teresa; Calle, Emilio; Casado, Julio

    2009-09-01

    Hydration reactions are relevant for understanding many organic mechanisms. Since the experimental determination of hydration and hemiacetalization equilibrium constants is fairly complex, computational calculations now offer a useful alternative to experimental measurements. In this work, carbonyl hydration and hemiacetalization constants were calculated from the free energy differences between compounds in solution, using absolute and relative approaches. The following conclusions can be drawn: (i) The use of a relative approach in the calculation of hydration and hemiacetalization constants allows compensation of systematic errors in the solvation energies. (ii) On average, the methodology proposed here can predict hydration constants within ± 0.5 log Khyd units for aldehydes. (iii) Hydration constants can be calculated for ketones and carboxylic acid derivatives within less than ± 1.0 log Khyd, on average, at the CBS-Q level of theory. (iv) The proposed methodology can predict hemiacetal formation constants accurately at the MP2 6-31++G(d,p) level using a common reference. If group references are used, the results obtained using the much cheaper DFT-B3LYP 6-31++G(d,p) level are almost as accurate. (v) In general, the best results are obtained if a common reference for all compounds is used. The use of group references improves the results at the lower levels of theory, but at higher levels, this becomes unnecessary.
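    The bridge from computed free-energy differences to the reported log K values is the standard relation K = exp(-ΔG/RT). The ΔG value below is invented purely to show the arithmetic and the size of the reported error bands.

        import math

        R = 8.314462618e-3   # gas constant, kJ/(mol*K)
        T = 298.15           # K

        def log10_K(delta_G_kJ_per_mol):
            """log10 of the equilibrium constant from a solution-phase dG."""
            return -delta_G_kJ_per_mol / (R * T * math.log(10))

        # An invented hydration with dG = -5 kJ/mol gives log K_hyd ~ 0.88;
        # the +/-0.5 log-unit band corresponds to roughly +/-2.9 kJ/mol in dG.
        print(f"{log10_K(-5.0):.2f}")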

  6. Additive Manufacturing of Anatomical Models from Computed Tomography Scan Data.

    PubMed

    Gür, Y

    2014-12-01

    The purpose of the study presented here was to investigate the manufacturability of human anatomical models from Computed Tomography (CT) scan data via a 3D desktop printer which uses fused deposition modelling (FDM) technology. First, Digital Imaging and Communications in Medicine (DICOM) CT scan data were converted to 3D Standard Triangle Language (STL) format by using In Vaselius digital imaging program. Once this STL file is obtained, a 3D physical version of the anatomical model can be fabricated by a desktop 3D FDM printer. As a case study, a patient's skull CT scan data was considered, and a tangible version of the skull was manufactured by a 3D FDM desktop printer. During the 3D printing process, the skull was built using acrylonitrile-butadiene-styrene (ABS) co-polymer plastic. The printed model showed that the 3D FDM printing technology is able to fabricate anatomical models with high accuracy. As a result, the skull model can be used for preoperative surgical planning, medical training activities, implant design and simulation to show the potential of the FDM technology in medical field. It will also improve communication between medical stuff and patients. Current result indicates that a 3D desktop printer which uses FDM technology can be used to obtain accurate anatomical models. PMID:26336695

  7. Common Accounting System for Monitoring the ATLAS Distributed Computing Resources

    NASA Astrophysics Data System (ADS)

    Karavakis, E.; Andreeva, J.; Campana, S.; Gayazov, S.; Jezequel, S.; Saiz, P.; Sargsyan, L.; Schovancova, J.; Ueda, I.; Atlas Collaboration

    2014-06-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources, either generic or ATLAS-specific. This set of tools provides high-quality, scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  8. Emotor control: computations underlying bodily resource allocation, emotions, and confidence

    PubMed Central

    Kepecs, Adam; Mensh, Brett D.

    2015-01-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience—approaching subjective behavior as the result of mental computations instantiated in the brain—to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This “emotor” control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on “confidence.” Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior. PMID:26869840

  9. Emotor control: computations underlying bodily resource allocation, emotions, and confidence.

    PubMed

    Kepecs, Adam; Mensh, Brett D

    2015-12-01

    Emotional processes are central to behavior, yet their deeply subjective nature has been a challenge for neuroscientific study as well as for psychiatric diagnosis. Here we explore the relationships between subjective feelings and their underlying brain circuits from a computational perspective. We apply recent insights from systems neuroscience, approaching subjective behavior as the result of mental computations instantiated in the brain, to the study of emotions. We develop the hypothesis that emotions are the product of neural computations whose motor role is to reallocate bodily resources mostly gated by smooth muscles. This "emotor" control system is analogous to the more familiar motor control computations that coordinate skeletal muscle movements. To illustrate this framework, we review recent research on "confidence." Although familiar as a feeling, confidence is also an objective statistical quantity: an estimate of the probability that a hypothesis is correct. This model-based approach helped reveal the neural basis of decision confidence in mammals and provides a bridge to the subjective feeling of confidence in humans. These results have important implications for psychiatry, since disorders of confidence computations appear to contribute to a number of psychopathologies. More broadly, this computational approach to emotions resonates with the emerging view that psychiatric nosology may be best parameterized in terms of disorders of the cognitive computations underlying complex behavior.
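    The statistical reading of confidence used here, the probability that a hypothesis is correct given the evidence, fits in a few lines. The sketch below is a generic Bayesian calculation with invented numbers, not the authors' model.

        import math

        def confidence(log_likelihood_ratio, prior_h1=0.5):
            """P(H1 | evidence) from log[P(evidence|H1) / P(evidence|H0)]."""
            prior_odds = prior_h1 / (1.0 - prior_h1)
            posterior_odds = prior_odds * math.exp(log_likelihood_ratio)
            return posterior_odds / (1.0 + posterior_odds)

        for llr in (0.0, 1.0, 3.0):      # weak to strong evidence for H1
            print(f"LLR={llr:+.1f} -> confidence {confidence(llr):.2f}")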

  10. Pricing the Computing Resources: Reading Between the Lines and Beyond

    NASA Technical Reports Server (NTRS)

    Nakai, Junko; Veronico, Nick (Editor); Thigpen, William W. (Technical Monitor)

    2001-01-01

    Distributed computing systems have the potential to increase the usefulness of existing facilities for computation without adding anything physical, but that is realized only when the necessary administrative features are in place. In a distributed environment, the best match is sought between a computing job to be run and a computer to run the job (global scheduling), a function that has not been required by conventional systems. Viewing the computers as 'suppliers' and the users as 'consumers' of computing services, markets for computing services/resources have been examined as one of the most promising mechanisms for global scheduling. We first establish why economics can contribute to scheduling. We further define the criterion for a scheme to qualify as an application of economics. Many studies to date have claimed to have applied economics to scheduling. If their scheduling mechanisms do not utilize economics, contrary to their claims, their favorable results do not support the assertion that markets provide the best framework for global scheduling. We examine the well-known scheduling schemes that concern pricing and markets, using our criterion for what constitutes an application of economics. Our conclusion is that none of the schemes examined makes full use of economics.

  11. Dynamic provisioning of local and remote compute resources with OpenStack

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
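    Programmatic provisioning against an OpenStack cloud of the kind described can be sketched with the openstacksdk client (an assumed dependency; the cloud name and the image, flavor and network IDs are placeholders, not values from the paper).

        import openstack

        conn = openstack.connect(cloud="ekp-cloud")  # credentials from clouds.yaml

        def boot_worker(name):
            """Start one virtual worker node for simulation or analysis jobs."""
            server = conn.compute.create_server(
                name=name,
                image_id="IMAGE-UUID-PLACEHOLDER",
                flavor_id="FLAVOR-UUID-PLACEHOLDER",
                networks=[{"uuid": "NETWORK-UUID-PLACEHOLDER"}],
            )
            return conn.compute.wait_for_server(server)

        worker = boot_worker("mc-worker-01")
        # ... submit jobs, then release the resources when the queue drains:
        conn.compute.delete_server(worker)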

  12. The Center for Computational Biology: resources, achievements, and challenges.

    PubMed

    Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2012-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.

  13. The Use of Passwords for Controlled Access to Computer Resources. Computer Science & Technology.

    ERIC Educational Resources Information Center

    Wood, Helen M.

    This paper considers the generation of passwords and their effective application to the problem of controlling access to computer resources. After describing the need for and uses of passwords, password schemes are categorized according to selection technique, lifetime, physical characteristics, and information content. Password protection, both…

  14. Enabling Grid Computing resources within the KM3NeT computing model

    NASA Astrophysics Data System (ADS)

    Filippidis, Christos

    2016-04-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that, located at the bottom of the Mediterranean Sea, will open a new window on the universe and answer fundamental questions in both particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  15. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  16. Geology and mineral and energy resources, Roswell Resource Area, New Mexico; an interactive computer presentation

    USGS Publications Warehouse

    Tidball, Ronald R.; Bartsch-Winkler, S. B.

    1995-01-01

    This Compact Disc-Read Only Memory (CD-ROM) contains a program illustrating the geology and mineral and energy resources of the Roswell Resource Area, an administrative unit of the U.S. Bureau of Land Management in east-central New Mexico. The program enables the user to access information on the geology, geochemistry, geophysics, mining history, metallic and industrial mineral commodities, hydrocarbons, and assessments of the area. The program was created with the display software, SuperCard, version 1.5, by Aldus. The program will run only on a Macintosh personal computer. This CD-ROM was produced in accordance with Macintosh HFS standards. The program was developed on a Macintosh II-series computer with system 7.0.1. The program is a compiled, executable form that is nonproprietary and does not require the presence of the SuperCard software.

  17. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    NASA Astrophysics Data System (ADS)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request requires, in particular, the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require an efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools for evaluating different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.

  18. Performance modeling of parallel computations in resource-constrained systems

    SciTech Connect

    Ghodsi, M.

    1989-01-01

    In this research, the author studies several problems related to the modeling and performance analysis of parallel computations. First, the author investigates the properties of a new task graph model, called generalized task graphs, which can represent the nondeterminism involved in parallel search algorithms. In parallel search, several possible solutions to a problem are pursued concurrently, with the intention of having only one successful result. Generalized task graphs are prone to problems such as deadlock, unboundedness, and unsafeness. The author defines the notion of well-formedness of task graphs by viewing them as high-level Petri nets and provides necessary and sufficient conditions under which a generalized task graph is well formed. Next, the author proposes a new approximate iterative algorithm to predict the performance of a resource-constrained queueing network running a number of statistically identical jobs with internal concurrency. The jobs are assumed to be instances of an arbitrary task graph. The queueing network includes a limited number of identical passive resources. A task must acquire one unit of the passive resources before receiving service. Detailed experimental results are presented which show that the algorithm converges quite fast and is reasonably accurate.

  19. Quantum Computing: Selected Internet Resources for Librarians, Researchers, and the Casually Curious

    ERIC Educational Resources Information Center

    Cirasella, Jill

    2009-01-01

    This article presents an annotated selection of the most important and informative Internet resources for learning about quantum computing, finding quantum computing literature, and tracking quantum computing news. All of the quantum computing resources described in this article are freely available, English-language web sites that fall into one…

  20. SInCRe-structural interactome computational resource for Mycobacterium tuberculosis.

    PubMed

    Metri, Rahul; Hariharaputran, Sridhar; Ramakrishnan, Gayatri; Anand, Praveen; Raghavender, Upadhyayula S; Ochoa-Montaño, Bernardo; Higueruelo, Alicia P; Sowdhamini, Ramanathan; Chandra, Nagasuma R; Blundell, Tom L; Srinivasan, Narayanaswamy

    2015-01-01

    We have developed an integrated database for Mycobacterium tuberculosis H37Rv (Mtb) that collates information on protein sequences, domain assignments, functional annotation and 3D structural information along with protein-protein and protein-small molecule interactions. SInCRe (Structural Interactome Computational Resource) is developed out of the CamBan (Cambridge and Bangalore) collaboration. The motivation for development of this database is to provide an integrated platform that allows easy access to and interpretation of the data and results obtained by all the groups in CamBan in the field of Mtb informatics. In-house algorithms and databases developed independently by various academic groups in CamBan are used to generate Mtb-specific datasets, which are integrated in this database to provide a structural dimension to studies on tuberculosis. The SInCRe database readily provides information on identification of functional domains, genome-scale modelling of structures of Mtb proteins and characterization of the small-molecule binding sites within Mtb. The resource also provides structure-based function annotation, information on small-molecule binders including FDA (Food and Drug Administration)-approved drugs, protein-protein interactions (PPIs) and natural compounds that potentially bind to pathogen proteins and result in weakening or elimination of host-pathogen protein-protein interactions. Together they provide prerequisites for identification of off-target binding. PMID:26130660

  1. SInCRe—structural interactome computational resource for Mycobacterium tuberculosis

    PubMed Central

    Metri, Rahul; Hariharaputran, Sridhar; Ramakrishnan, Gayatri; Anand, Praveen; Raghavender, Upadhyayula S.; Ochoa-Montaño, Bernardo; Higueruelo, Alicia P.; Sowdhamini, Ramanathan; Chandra, Nagasuma R.; Blundell, Tom L.; Srinivasan, Narayanaswamy

    2015-01-01

    We have developed an integrated database for Mycobacterium tuberculosis H37Rv (Mtb) that collates information on protein sequences, domain assignments, functional annotation and 3D structural information along with protein–protein and protein–small molecule interactions. SInCRe (Structural Interactome Computational Resource) is developed out of the CamBan (Cambridge and Bangalore) collaboration. The motivation for development of this database is to provide an integrated platform that allows easy access to and interpretation of the data and results obtained by all the groups in CamBan in the field of Mtb informatics. In-house algorithms and databases developed independently by various academic groups in CamBan are used to generate Mtb-specific datasets, which are integrated in this database to provide a structural dimension to studies on tuberculosis. The SInCRe database readily provides information on identification of functional domains, genome-scale modelling of structures of Mtb proteins and characterization of the small-molecule binding sites within Mtb. The resource also provides structure-based function annotation, information on small-molecule binders including FDA (Food and Drug Administration)-approved drugs, protein–protein interactions (PPIs) and natural compounds that potentially bind to pathogen proteins and result in weakening or elimination of host–pathogen protein–protein interactions. Together they provide prerequisites for identification of off-target binding. Database URL: http://proline.biochem.iisc.ernet.in/sincre PMID:26130660

  2. A Review of Computer Science Resources for Learning and Teaching with K-12 Computing Curricula: An Australian Case Study

    ERIC Educational Resources Information Center

    Falkner, Katrina; Vivian, Rebecca

    2015-01-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age…

  3. Dynamic resource allocation scheme for distributed heterogeneous computer systems

    NASA Technical Reports Server (NTRS)

    Liu, Howard T. (Inventor); Silvester, John A. (Inventor)

    1991-01-01

    This invention relates to resource allocation in computer systems, and more particularly, to a method and associated apparatus for shortening response time and improving the efficiency of a heterogeneous distributed networked computer system by reallocating jobs queued up for busy nodes to idle, or less busy, nodes. In accordance with the algorithm (SIDA for short), load-sharing is initiated by the server device in a manner such that extra overhead is not imposed on the system during heavily loaded conditions. The algorithm employed in the present invention uses a dual-mode, server-initiated approach. Jobs are transferred from heavily burdened nodes (i.e., those over a high threshold limit) to lightly burdened nodes at the initiation of the receiving node when: (1) a job finishes at a node which is burdened below a pre-established threshold level, or (2) a node is idle for a period of time as established by a wakeup timer at the node. The invention uses a combination of the local queue length and the local service rate ratio at each node as the workload indicator.
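    A stripped-down version of the receiver-initiated transfer rule reads as follows; the threshold, queue lengths and node names are invented, and the patented algorithm additionally weighs local service rates, not just queue length.

        THRESHOLD = 3   # queue length below which a node volunteers to help

        def pull_job(nodes, receiver):
            """Move one queued job from the busiest node to `receiver`."""
            if nodes[receiver] >= THRESHOLD:
                return None          # receiver is itself too busy to help
            donor = max(nodes, key=nodes.get)
            if nodes[donor] <= THRESHOLD:
                return None          # no node is burdened enough to shed work
            nodes[donor] -= 1
            nodes[receiver] += 1
            return donor

        nodes = {"n1": 9, "n2": 2, "n3": 0}
        print(pull_job(nodes, "n3"), nodes)   # n1 {'n1': 8, 'n2': 2, 'n3': 1}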

  4. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method for helping resource re-allocation in healthcare linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources because aged people require more specialised medical care, notably due to cancer. We propose a method useful for monitoring changes in cancer incidence in space and time, taking into account two age categories according to the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, following the methodology presented in Wood, Generalised Additive Models: An Introduction with R (Chapman and Hall/CRC, 2006). Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase in the incidence of cancer over time and an obvious spatial pattern for people over 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for old people's cancer care in this sub-region. PMID:23242683

  5. Resources allocation in healthcare for cancer: a case study using generalised additive mixed models.

    PubMed

    Musio, Monica; Sauleau, Erik A; Augustin, Nicole H

    2012-11-01

    Our aim is to develop a method for helping resource re-allocation in healthcare linked to cancer, in order to replan the allocation of providers. Ageing of the population has a considerable impact on the use of health resources because aged people require more specialised medical care, notably due to cancer. We propose a method useful for monitoring changes in cancer incidence in space and time, taking into account two age categories according to the general organisation of healthcare. We use generalised additive mixed models with a Poisson response, following the methodology presented in Wood, Generalised Additive Models: An Introduction with R (Chapman and Hall/CRC, 2006). Besides one-dimensional smooth functions accounting for non-linear effects of covariates, the space-time interaction can be modelled using scale-invariant smoothers. Incidence data collected by a general cancer registry between 1992 and 2007 in a specific area of France are studied. Our best model exhibits a strong increase in the incidence of cancer over time and an obvious spatial pattern for people over 70 years, with a higher incidence in the central band of the region. This is a strong argument for re-allocating resources for old people's cancer care in this sub-region.
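    The paper's models are fitted with mgcv in R; the sketch below reproduces the flavour of such a fit in Python with pygam (an assumed stand-in, on simulated rather than registry data): a smooth time trend plus a two-dimensional spatial smoother, to which the paper further adds the space-time interaction.

        import numpy as np
        from pygam import PoissonGAM, s, te

        rng = np.random.default_rng(0)
        n = 500
        year = rng.uniform(1992, 2007, n)
        x, y = rng.uniform(0, 1, n), rng.uniform(0, 1, n)  # spatial coordinates
        rate = np.exp(0.05 * (year - 1992) + 0.5 * np.sin(np.pi * x))
        counts = rng.poisson(rate)                         # simulated case counts

        # Poisson response, smooth time effect s(0), tensor-product spatial
        # smoother te(1, 2); the paper additionally interacts space with time.
        X = np.column_stack([year, x, y])
        gam = PoissonGAM(s(0) + te(1, 2)).fit(X, counts)
        gam.summary()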

  6. A review of Computer Science resources for learning and teaching with K-12 computing curricula: an Australian case study

    NASA Astrophysics Data System (ADS)

    Falkner, Katrina; Vivian, Rebecca

    2015-10-01

    To support teachers to implement Computer Science curricula into classrooms from the very first year of school, teachers, schools and organisations seek quality curriculum resources to support implementation and teacher professional development. Until now, many Computer Science resources and outreach initiatives have targeted K-12 school-age children, with the intention of engaging children and increasing interest, rather than formally teaching concepts and skills. What is the educational quality of existing Computer Science resources, and to what extent are they suitable for classroom learning and teaching? In this paper, an assessment framework is presented to evaluate the quality of online Computer Science resources. Further, a semi-systematic review of available online Computer Science resources was conducted to evaluate resources available for classroom learning and teaching and to identify gaps in resource availability, using the Australian curriculum as a case study analysis. The findings reveal a predominance of quality resources; however, a number of critical gaps were identified. This paper provides recommendations and guidance for the development of new and supplementary resources and future research.

  7. The Relative Effectiveness of Computer-Based and Traditional Resources for Education in Anatomy

    ERIC Educational Resources Information Center

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R.; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning with traditional resources. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), (3) a plastic…

  8. A Web of Resources for Introductory Computer Science.

    ERIC Educational Resources Information Center

    Rebelsky, Samuel A.

    As the field of Computer Science has grown, the syllabus of the introductory Computer Science course has changed significantly. No longer is it a simple introduction to programming or a tutorial on computer concepts and applications. Rather, it has become a survey of the field of Computer Science, touching on a wide variety of topics from digital…

  9. Interactive computer methods for generating mineral-resource maps

    USGS Publications Warehouse

    Calkins, James Alfred; Crosby, A.S.; Huffman, T.E.; Clark, A.L.; Mason, G.T.; Bascle, R.J.

    1980-01-01

    Inasmuch as maps are a basic tool of geologists, the U.S. Geological Survey's CRIB (Computerized Resources Information Bank) was constructed so that the data it contains can be used to generate mineral-resource maps. However, by the standard methods used (batch processing and off-line plotting), the production of a finished map commonly takes 2-3 weeks. To produce computer-generated maps more rapidly, cheaply, and easily, and also to provide an effective demonstration tool, we have devised two related methods for plotting maps as alternatives to conventional batch methods. These methods are: 1. Quick-Plot, an interactive program whose output appears on a CRT (cathode-ray-tube) device, and 2. The Interactive CAM (Cartographic Automatic Mapping system), which combines batch and interactive runs. The output of the Interactive CAM system is final compilation (not camera-ready) paper copy. Both methods are designed to use data from the CRIB file in conjunction with a map-plotting program. Quick-Plot retrieves a user-selected subset of data from the CRIB file, immediately produces an image of the desired area on a CRT device, and plots data points according to a limited set of user-selected symbols. This method is useful for immediate evaluation of the map and for demonstrating how trial maps can be made quickly. The Interactive CAM system links the output of an interactive CRIB retrieval to a modified version of the CAM program, which runs in the batch mode and stores plotting instructions on a disk, rather than on a tape. The disk can be accessed by a CRT, and, thus, the user can view and evaluate the map output on a CRT immediately after a batch run, without waiting 1-3 days for an off-line plot. The user can, therefore, do most of the layout and design work in a relatively short time by use of the CRT, before generating a plot tape and having the map plotted on an off-line plotter.
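
    Although CRIB and CAM are long superseded, the Quick-Plot workflow (retrieve a subset, plot points with user-selected symbols, inspect on screen) translates directly into a few lines of modern Python; the records, commodities, and symbols below are invented for illustration:

        # A rough modern analogue of the Quick-Plot workflow: pull a
        # user-selected subset of mineral-occurrence records and plot them
        # on screen with per-commodity symbols. The record list is made up.
        import matplotlib.pyplot as plt

        records = [  # (longitude, latitude, commodity) -- illustrative only
            (-150.1, 61.2, "Hg"), (-151.3, 60.8, "Hg"), (-164.5, 65.0, "Sn"),
            (-165.2, 64.7, "Sn"), (-149.8, 61.5, "Au"),
        ]
        symbols = {"Hg": "^", "Sn": "s", "Au": "o"}  # user-selected symbol set

        wanted = {"Hg", "Sn"}  # the retrieval step: keep only requested commodities
        for lon, lat, com in (r for r in records if r[2] in wanted):
            plt.plot(lon, lat, symbols[com], label=com)

        # de-duplicate legend entries before showing the map
        handles, labels = plt.gca().get_legend_handles_labels()
        by_label = dict(zip(labels, handles))
        plt.legend(by_label.values(), by_label.keys())
        plt.xlabel("Longitude"); plt.ylabel("Latitude")
        plt.show()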

  10. Incrementality and additionality: A new dimension to North-South resource transfers

    SciTech Connect

    Jordan, A. (School of Environmental Sciences); Werksman, J. (Foundation for International Environmental Law and Development)

    1994-06-01

    In the last four years, "incrementality" and "additionality" have emerged as new terms in the evolving lexicon of international environmental diplomacy. As Parties to the Conventions on Climate Change, Biodiversity and the Ozone Layer, industrialized states undertake to provide sufficient additional resources (the principle of additionality) to meet the incremental cost (the concept of incrementality) of measures undertaken by the developing countries to tackle global environmental problems. Issues of incrementality and additionality go to the heart of a much deeper and highly contentious debate on who should pay the costs of responding to global environmental problems; on how the payment should be made; on which agency or agencies should manage the transfers; and on which parties should be compensated. Every sign is that if the overall North-to-South transfer breaks down or is retarded, the process of implementing the aforementioned agreements may be jeopardized. This paper reviews the emergence of the two terms in international environmental politics; it pinpoints the theoretical and practical difficulties of defining and implementing them; and it assesses whether these difficulties and conflicts of opinion may, in some manner, be resolved.

  11. CHARMM additive and polarizable force fields for biophysics and computer-aided drug design

    PubMed Central

    Vanommeslaeghe, K.

    2014-01-01

    Background Molecular Mechanics (MM) is the method of choice for computational studies of biomolecular systems owing to its modest computational cost, which makes it possible to routinely perform molecular dynamics (MD) simulations on chemical systems of biophysical and biomedical relevance. Scope of Review As one of the main factors limiting the accuracy of MD results is the empirical force field used, the present paper offers a review of recent developments in the CHARMM additive force field, one of the most popular biomolecular force fields. Additionally, we present a detailed discussion of the CHARMM Drude polarizable force field, anticipating a growth in the importance and utilization of polarizable force fields in the near future. Throughout the discussion emphasis is placed on the force fields' parametrization philosophy and methodology. Major Conclusions Recent improvements in the CHARMM additive force field are mostly related to newly found weaknesses in the previous generation of additive force fields. Beyond the additive approximation is the newly available CHARMM Drude polarizable force field, which allows for MD simulations of up to 1 microsecond on proteins, DNA, lipids and carbohydrates. General Significance Addressing the limitations ensures the reliability of the new CHARMM36 additive force field for the types of calculations that are presently coming into routine computational reach, while the availability of the Drude polarizable force field offers an inherently more accurate model of the underlying physical forces driving macromolecular structures and dynamics. PMID:25149274
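
    For orientation, "additive" refers to a fixed-charge potential energy function of the general class I form below; this is a simplified textbook version, and the full CHARMM expression also carries Urey-Bradley, improper-dihedral and CMAP correction terms:

        E = \sum_{\text{bonds}} k_b (b - b_0)^2
          + \sum_{\text{angles}} k_\theta (\theta - \theta_0)^2
          + \sum_{\text{dihedrals}} k_\phi \bigl[ 1 + \cos(n\phi - \delta) \bigr]
          + \sum_{i<j} \left\{ \varepsilon_{ij} \left[ \left( \frac{R_{\min,ij}}{r_{ij}} \right)^{12}
            - 2 \left( \frac{R_{\min,ij}}{r_{ij}} \right)^{6} \right]
            + \frac{q_i q_j}{4 \pi \varepsilon_0 r_{ij}} \right\}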

  12. A Review of Resources for Evaluating K-12 Computer Science Education Programs

    ERIC Educational Resources Information Center

    Randolph, Justus J.; Hartikainen, Elina

    2004-01-01

    Since computer science education is a key to preparing students for a technologically-oriented future, it makes sense to have high quality resources for conducting summative and formative evaluation of those programs. This paper describes the results of a critical analysis of the resources for evaluating K-12 computer science education projects.…

  13. Effective Computer Resource Management: Keeping the Tail from Wagging the Dog.

    ERIC Educational Resources Information Center

    Sampson, James P., Jr.

    1982-01-01

    Predicts that student services will be increasingly influenced by computer technology. Suggests this resource be managed effectively to minimize potential problems and prevent a mechanistic and impersonal environment. Urges student personnel workers to assume active responsibility for planning, evaluating, and operating computer resources. (JAC)

  14. Utilizing a Collaborative Cross Number Puzzle Game to Develop the Computing Ability of Addition and Subtraction

    ERIC Educational Resources Information Center

    Chen, Yen-Hua; Looi, Chee-Kit; Lin, Chiu-Pin; Shao, Yin-Juan; Chan, Tak-Wai

    2012-01-01

    While addition and subtraction is a key mathematical skill for young children, a typical activity for them in classrooms involves doing repetitive arithmetic calculation exercises. In this study, we explore a collaborative way for students to learn these skills in a technology-enabled way with wireless computers. Two classes, comprising a total of…

  15. Professional Computer Education Organizations--A Resource for Administrators.

    ERIC Educational Resources Information Center

    Ricketts, Dick

    Professional computer education organizations serve a valuable function by generating, collecting, and disseminating information concerning the role of the computer in education. This report touches briefly on the reasons for the rapid and successful development of professional computer education organizations. A number of attributes of effective…

  16. An interactive computer approach to performing resource analysis for a multi-resource/multi-project problem. [Spacelab inventory procurement planning

    NASA Technical Reports Server (NTRS)

    Schlagheck, R. A.

    1977-01-01

    New planning techniques and supporting computer tools are needed for the optimization of resources and costs for space transportation and payload systems. Heavy emphasis on cost effective utilization of resources has caused NASA program planners to look at the impact of various independent variables that affect procurement buying. A description is presented of a category of resource planning which deals with Spacelab inventory procurement analysis. Spacelab is a joint payload project between NASA and the European Space Agency and will be flown aboard the Space Shuttle starting in 1980. In order to respond rapidly to the various procurement planning exercises, a system was built that could perform resource analysis in a quick and efficient manner. This system is known as the Interactive Resource Utilization Program (IRUP). Attention is given to aspects of problem definition, an IRUP system description, questions of data base entry, the approach used for project scheduling, and problems of resource allocation.

  17. Resources

    MedlinePlus

    ... Breastfeeding - resources Bulimia - resources Burns - resources Cancer - resources Cerebral palsy - resources Celiac disease - resources Child abuse - resources Chronic fatigue syndrome - resources Chronic pain - ...

  18. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  19. Sensor and computing resource management for a small satellite

    NASA Astrophysics Data System (ADS)

    Bhatia, Abhilasha; Goehner, Kyle; Sand, John; Straub, Jeremy; Mohammad, Atif; Korvald, Christoffer; Nervold, Anders Kose

    A small satellite in a low-Earth orbit (e.g., approximately a 300 to 400 km altitude) has an orbital velocity of approximately 8.5 km/s and completes an orbit approximately every 90 minutes. For a satellite with minimal attitude control, this presents a significant challenge in obtaining multiple images of a target region. Presuming an inclination in the range of 50 to 65 degrees, a limited number of opportunities to image a given target or communicate with a given ground station are available over the course of a 24-hour period. For imaging needs (where solar illumination is required), the number of opportunities is further reduced. Given these short windows of opportunity for imaging, data transfer, and sending commands, scheduling must be optimized. In addition to the high-level scheduling performed for spacecraft operations, payload-level scheduling is also required. The mission requires that images be post-processed to maximize spatial resolution and minimize data transfer (through removing overlapping regions). The payload unit includes GPS and inertial measurement unit (IMU) hardware to aid in image alignment for these tasks. The payload scheduler must, thus, split its energy and computing-cycle budgets between determining an imaging sequence (required to capture the highly-overlapping data required for super-resolution and adjacent areas required for mosaicking), processing the imagery (to perform the super-resolution and mosaicking) and preparing the data for transmission (compressing it, etc.). This paper presents an approach for satellite control, scheduling and operations that allows the cameras, GPS and IMU to be used in conjunction to acquire higher-resolution imagery of a target region.
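
    The budget-splitting problem the abstract describes can be made concrete with a toy greedy heuristic; the tasks, energy costs, and science values below are invented, and this is not the paper's scheduling approach:

        # Illustrative-only sketch of the kind of budget-constrained choice the
        # payload scheduler faces: pick tasks that fit an energy budget,
        # preferring higher science value per joule (simple greedy heuristic).
        tasks = [  # (name, energy cost in joules, relative science value)
            ("image_pass_1", 120.0, 9.0),
            ("image_pass_2", 150.0, 7.5),
            ("super_resolve", 200.0, 8.0),
            ("mosaic",        90.0, 5.0),
            ("compress_tx",   60.0, 6.0),
        ]
        budget = 400.0  # energy available this orbit (hypothetical)

        chosen, used = [], 0.0
        for name, cost, value in sorted(tasks, key=lambda t: t[2] / t[1], reverse=True):
            if used + cost <= budget:
                chosen.append(name)
                used += cost

        print(chosen, f"{used:.0f} J of {budget:.0f} J")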

  20. Science and Technology Resources on the Internet: Computer Security.

    ERIC Educational Resources Information Center

    Kinkus, Jane F.

    2002-01-01

    Discusses issues related to computer security, including confidentiality, integrity, and authentication or availability; and presents a selected list of Web sites that cover the basic issues of computer security under subject headings that include ethics, privacy, kids, antivirus, policies, cryptography, operating system security, and biometrics.…

  1. Computer Crime: Criminal Justice Resource Manual (Second Edition).

    ERIC Educational Resources Information Center

    Parker, Donn B.

    This advanced training and reference manual is designed to aid investigators and prosecutors in dealing with white collar computer crime. The first five sections follow the typical order of events for prosecutors handling a criminal case: classifying the crime, computer abuse methods and detection, experts and suspects using information systems,…

  2. Computer-Based Fluency Training: A Resource for Higher Education.

    ERIC Educational Resources Information Center

    Yaber, Guillermo E.; Malott, Richard W.

    1993-01-01

    A behavioral systems analysis approach to higher education was used to design, implement, evaluate, and recycle a computer-based fluency training component of a behavioral instructional system. This component helped 29 students achieve behavior analysis literacy. The computer-based training provided structure, immediate performance feedback, and…

  3. National Resource for Computation in Chemistry (NRCC). Attached scientific processors for chemical computations: a report to the chemistry community

    SciTech Connect

    Ostlund, N.S.

    1980-01-01

    The demands of chemists for computational resources are well known and have been amply documented. The best and most cost-effective means of providing these resources is still open to discussion, however. This report surveys the field of attached scientific processors (array processors) and attempts to indicate their present and possible future use in computational chemistry. Array processors have the possibility of providing very cost-effective computation. This report attempts to provide information that will assist chemists who might be considering the use of an array processor for their computations. It describes the general ideas and concepts involved in using array processors, the commercial products that are available, and the experiences reported by those currently using them. In surveying the field of array processors, the author makes certain recommendations regarding their use in computational chemistry. 5 figures, 1 table (RWR)

  4. Computer Simulation and Digital Resources for Plastic Surgery Psychomotor Education.

    PubMed

    Diaz-Siso, J Rodrigo; Plana, Natalie M; Stranix, John T; Cutting, Court B; McCarthy, Joseph G; Flores, Roberto L

    2016-10-01

    Contemporary plastic surgery residents are increasingly challenged to learn a greater number of complex surgical techniques within a limited period. Surgical simulation and digital education resources have the potential to address some limitations of the traditional training model, and have been shown to accelerate knowledge and skills acquisition. Although animal, cadaver, and bench models are widely used for skills and procedure-specific training, digital simulation has not been fully embraced within plastic surgery. Digital educational resources may play a future role in a multistage strategy for skills and procedures training. The authors present two virtual surgical simulators addressing procedural cognition for cleft repair and craniofacial surgery. Furthermore, the authors describe how partnerships among surgical educators, industry, and philanthropy can be a successful strategy for the development and maintenance of digital simulators and educational resources relevant to plastic surgery training. It is our responsibility as surgical educators not only to create these resources, but to demonstrate their utility for enhanced trainee knowledge and technical skills development. Currently available digital resources should be evaluated in partnership with plastic surgery educational societies to guide trainees and practitioners toward effective digital content. PMID:27673543

  5. Computer Simulation and Digital Resources for Plastic Surgery Psychomotor Education.

    PubMed

    Diaz-Siso, J Rodrigo; Plana, Natalie M; Stranix, John T; Cutting, Court B; McCarthy, Joseph G; Flores, Roberto L

    2016-10-01

    Contemporary plastic surgery residents are increasingly challenged to learn a greater number of complex surgical techniques within a limited period. Surgical simulation and digital education resources have the potential to address some limitations of the traditional training model, and have been shown to accelerate knowledge and skills acquisition. Although animal, cadaver, and bench models are widely used for skills and procedure-specific training, digital simulation has not been fully embraced within plastic surgery. Digital educational resources may play a future role in a multistage strategy for skills and procedures training. The authors present two virtual surgical simulators addressing procedural cognition for cleft repair and craniofacial surgery. Furthermore, the authors describe how partnerships among surgical educators, industry, and philanthropy can be a successful strategy for the development and maintenance of digital simulators and educational resources relevant to plastic surgery training. It is our responsibility as surgical educators not only to create these resources, but to demonstrate their utility for enhanced trainee knowledge and technical skills development. Currently available digital resources should be evaluated in partnership with plastic surgery educational societies to guide trainees and practitioners toward effective digital content.

  6. Quantum computing with incoherent resources and quantum jumps.

    PubMed

    Santos, M F; Cunha, M Terra; Chaves, R; Carvalho, A R R

    2012-04-27

    Spontaneous emission and the inelastic scattering of photons are two natural processes usually associated with decoherence and the reduction in the capacity to process quantum information. Here we show that, when suitably detected, these photons are sufficient to build all the fundamental blocks needed to perform quantum computation in the emitting qubits while protecting them from deleterious dissipative effects. We exemplify this by showing how to efficiently prepare graph states for the implementation of measurement-based quantum computation.

  7. iTools: a framework for classification, categorization and integration of computational biology resources.

    PubMed

    Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W

    2008-05-28

    The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management

  8. Assessing attitudes toward computers and the use of Internet resources among undergraduate microbiology students

    NASA Astrophysics Data System (ADS)

    Anderson, Delia Marie Castro

    Computer literacy and use have become commonplace in our colleges and universities. In an environment that demands the use of technology, educators should be knowledgeable of the components that make up the overall computer attitude of students and be willing to investigate the processes and techniques of effective teaching and learning that can take place with computer technology. The purpose of this study is twofold. First, it investigates the relationship between computer attitudes and gender, ethnicity, and computer experience. Second, it addresses the question of whether, and to what extent, students' attitudes toward computers change over a 16-week period in an undergraduate microbiology course that supplements the traditional lecture with computer-driven assignments. Multiple regression analyses, using data from the Computer Attitudes Scale (Loyd & Loyd, 1985), showed that, in the experimental group, no significant relationships were found between computer anxiety and gender or ethnicity or between computer confidence and gender or ethnicity. However, students who used computers the longest (p = .001) and who were self-taught (p = .046) had the lowest computer anxiety levels. Likewise, students who used computers the longest (p = .001) and who were self-taught (p = .041) had the highest confidence levels. No significant relationships between computer liking, usefulness, or the use of Internet resources and gender, ethnicity, or computer experience were found. Dependent t-tests were performed to determine whether computer attitude scores (pretest and posttest) increased over a 16-week period for students who had been exposed to computer-driven assignments and other Internet resources. Results showed that students in the experimental group were less anxious about working with computers and considered computers to be more useful. In the control group, no significant changes in computer anxiety, confidence, liking, or usefulness were noted. Overall, students in
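
    The "dependent t-test" on pre/post attitude scores is the standard paired t-test; a minimal sketch with invented scores (not the study's data) looks like this:

        # Paired t-test on pretest/posttest scores for the same students.
        from scipy import stats

        pre  = [62, 70, 55, 68, 74, 60, 66, 71]   # pretest attitude scores
        post = [66, 72, 60, 69, 80, 63, 70, 75]   # posttest scores, same students

        t, p = stats.ttest_rel(post, pre)
        print(f"t = {t:.2f}, p = {p:.3f}")  # p < .05 would indicate a reliable change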

  9. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    PubMed Central

    Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing. It extends PC transparent computing to mobile terminals. Since the resources contain different kinds of operating systems and user data that are stored in a remote server, how to manage the network resources is essential. In this paper, we apply the technologies of quick emulator (QEMU) virtualization and mobile agents for mobile transparent computing (MTC) to devise a method of shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experimental results show that the strategy is effective and stable. PMID:24883353

  10. Web application for simplifying access to computer center resources and information.

    2013-05-01

    Lorenz is a product of the ASC Scientific Data Management effort. Lorenz is a web-based application designed to help computer centers make information and resources more easily available to their users.

  11. Justification of Filter Selection for Robot Balancing in Conditions of Limited Computational Resources

    NASA Astrophysics Data System (ADS)

    Momot, M. V.; Politsinskaia, E. V.; Sushko, A. V.; Semerenko, I. A.

    2016-08-01

    The paper considers the problem of selecting a mathematical filter for balancing a wheeled robot under limited computational resources. A solution based on a complementary filter is proposed.
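
    The appeal of the complementary filter under a tight computational budget is that each update is a couple of multiply-adds, versus the matrix algebra of a Kalman filter. A minimal sketch, with the blend factor and sensor readings assumed for illustration:

        # Complementary filter for tilt estimation: fuse gyro integration
        # (low drift short-term) with the accelerometer tilt (noisy but
        # drift-free long-term). Values below are assumptions.
        import math

        def tilt_step(angle, gyro_rate, ax, az, dt, alpha=0.98):
            accel_angle = math.atan2(ax, az)       # tilt implied by gravity
            gyro_angle = angle + gyro_rate * dt    # dead-reckoned tilt
            return alpha * gyro_angle + (1.0 - alpha) * accel_angle

        angle = 0.0
        for gyro_rate, ax, az in [(0.10, 0.02, 0.99), (0.12, 0.03, 0.99)]:
            angle = tilt_step(angle, gyro_rate, ax, az, dt=0.01)
        print(angle)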

  12. Dynamic remapping of parallel computations with varying resource demands

    NASA Technical Reports Server (NTRS)

    Nicol, D. M.; Saltz, J. H.

    1986-01-01

    A large class of computational problems is characterized by frequent synchronization and computational requirements which change as a function of time. When such a problem must be solved on a message-passing multiprocessor machine, the combination of these characteristics leads to system performance which decreases over time. Performance can be improved with periodic redistribution of computational load; however, redistribution can exact a sometimes large delay cost. We study the issue of deciding when to invoke a global load remapping mechanism. Such a decision policy must effectively weigh the costs of remapping against the performance benefits. We treat this problem by constructing two analytic models which exhibit stochastically decreasing performance. One model is quite tractable; we are able to describe the optimal remapping algorithm, and the optimal decision policy governing when to invoke that algorithm. However, computational complexity prohibits the use of the optimal remapping decision policy. We then study the performance of a general remapping policy on both analytic models. This policy attempts to minimize a statistic W(n) which measures the system degradation (including the cost of remapping) per computation step over a period of n steps. We show that as a function of time, the expected value of W(n) has at most one minimum, and that when this minimum exists it defines the optimal fixed-interval remapping policy. Our decision policy appeals to this result by remapping when it estimates that W(n) is minimized. Our performance data suggests that this policy effectively finds the natural frequency of remapping. We also use the analytic models to express the relationship between performance and remapping cost, number of processors, and the computation's stochastic activity.
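
    The W(n) policy lends itself to a compact illustration. The sketch below (invented numbers, not the paper's models) tracks cumulative degradation since the last remap, charges the remap cost C against the elapsed steps, and triggers a remap at the first sign that W(n) has passed its single minimum:

        # W(n) = average cost per step of the last n steps, counting both the
        # measured per-step degradation and the amortized remap cost C.
        C = 50.0                    # one-time cost of remapping (assumed units)
        degradation = [1, 1, 2, 3, 5, 8, 12, 17, 23]  # per-step slowdown since last remap

        cum, best, best_n = 0.0, float("inf"), None
        for n, d in enumerate(degradation, start=1):
            cum += d
            w = (cum + C) / n       # degradation per step if we remapped now
            if w < best:
                best, best_n = w, n
            else:                   # W(n) has started rising: past the minimum
                print(f"remap after step {best_n}, W = {best:.2f}")
                break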

  13. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems

    PubMed Central

    Ehsan, Shoaib; Clark, Adrian F.; ur Rehman, Naveed; McDonald-Maier, Klaus D.

    2015-01-01

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow a substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow a substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems. PMID:26184211
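
    For reference, the serial recursion that the row-parallel decomposition starts from, together with the constant-time box sum it enables, can be sketched as follows (baseline only; the paper's hardware algorithms are not reproduced here):

        # ii(x, y) = i(x, y) + ii(x-1, y) + ii(x, y-1) - ii(x-1, y-1),
        # then any rectangle sum costs four lookups regardless of its size.
        import numpy as np

        def integral_image(img):
            ii = np.zeros_like(img, dtype=np.int64)
            h, w = img.shape
            for y in range(h):
                for x in range(w):
                    ii[y, x] = (img[y, x]
                                + (ii[y - 1, x] if y else 0)
                                + (ii[y, x - 1] if x else 0)
                                - (ii[y - 1, x - 1] if y and x else 0))
            return ii

        def box_sum(ii, y0, x0, y1, x1):
            """Sum of img[y0:y1+1, x0:x1+1] in four lookups."""
            total = ii[y1, x1]
            if y0: total -= ii[y0 - 1, x1]
            if x0: total -= ii[y1, x0 - 1]
            if y0 and x0: total += ii[y0 - 1, x0 - 1]
            return total

        img = np.arange(16).reshape(4, 4)
        ii = integral_image(img)
        assert box_sum(ii, 1, 1, 2, 2) == img[1:3, 1:3].sum()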

  14. Integral Images: Efficient Algorithms for Their Computation and Storage in Resource-Constrained Embedded Vision Systems.

    PubMed

    Ehsan, Shoaib; Clark, Adrian F; Naveed ur Rehman; McDonald-Maier, Klaus D

    2015-01-01

    The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of the integral image present several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow a substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of the integral image in embedded vision systems, the paper presents two algorithms which allow a substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems. PMID:26184211

  15. iTools: A Framework for Classification, Categorization and Integration of Computational Biology Resources

    PubMed Central

    Dinov, Ivo D.; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H. V.; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D. Stott; Toga, Arthur W.

    2008-01-01

    The advancement of the computational biology field hinges on progress in three fundamental directions: the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources: data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource

  16. Computer Innovations in Education, I. The Use of Computers in Mathematics Education Resource Series.

    ERIC Educational Resources Information Center

    Molnar, Andrew R.

    This is the first of a set of four papers on the use of computers in mathematics education. It includes a discussion of the impact of the computer on society, types of computer systems and languages; instructional applications, administrative uses, libraries and data bases, the design of computer-oriented curricula, and cost effectiveness. For the…

  17. Multi-Programmatic and Institutional Computing Capacity Resource Attachment 2 Statement of Work

    SciTech Connect

    Seager, M

    2002-04-15

    Lawrence Livermore National Laboratory (LLNL) has identified high-performance computing as a critical competency necessary to meet the goals of LLNL's scientific and engineering programs. Leadership in scientific computing demands the availability of a stable, powerful, well-balanced computational infrastructure, and it requires research directed at advanced architectures, enabling numerical methods and computer science. To encourage all programs to benefit from the huge investment being made by the Advanced Simulation and Computing Program (ASCI) at LLNL, and to provide a mechanism to facilitate multi-programmatic leveraging of resources and access to high-performance equipment by researchers, M&IC was created. The Livermore Computing (LC) Center, a part of the Computations Directorate Integrated Computing and Communications (ICC) Department, can be viewed as composed of two facilities, one open and one secure. This acquisition is focused on the M&IC resources in the Open Computing Facility (OCF). For the M&IC program, recent efforts and expenditures have focused on enhancing capacity and stabilizing the TeraCluster 2000 (TC2K) resource. Capacity is a measure of the ability to process a varied workload from many scientists simultaneously. Capability represents the ability to deliver a very large system to run scientific calculations at large scale. In this procurement action, we intend to significantly increase the capability of the M&IC resource to address multiple teraFLOP/s problems, as well as increasing the capacity to do many 100 gigaFLOP/s calculations.

  18. Evaluation of a Computer Assisted Instruction Resource in Nursing Education.

    ERIC Educational Resources Information Center

    Herriot, Anne M.; Bishop, Jacki A.; Kelly, Mary; Murphy, Margaret; Truby, Helen

    2003-01-01

    Nine second-year and six final-year nursing students completed a questionnaire and participated in focus groups about STEP-DIET, a computer-based dietetics instructional tool. Students liked the design and content, perceived increased nutritional knowledge and understanding of dietitians' role. However, they were reluctant to accept…

  19. Computer Modelling of Biological Molecules: Free Resources on the Internet.

    ERIC Educational Resources Information Center

    Millar, Neil

    1996-01-01

    Describes a three-dimensional computer modeling system for biological molecules which is suitable for sixth-form teaching. Consists of the modeling program "RasMol" together with structure files of proteins, DNA, and small biological molecules. Describes how the whole system can be downloaded from various sites on the Internet. (Author/JRH)

  20. Computers and Resource-Based History Teaching: A UK Perspective.

    ERIC Educational Resources Information Center

    Spaeth, Donald A.; Cameron, Sonja

    2000-01-01

    Presents an overview of developments in computer-aided history teaching for higher education in the United Kingdom and the United States. Explains that these developments have focused on providing students with access to primary sources to enhance their understanding of historical methods and content. (CMK)

  1. Addition of flexible body option to the TOLA computer program, part 1

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    This report describes a flexible body option that was developed and added to the Takeoff and Landing Analysis (TOLA) computer program. The addition of the flexible body option to TOLA allows it to be used to study essentially any conventional type of airplane in the ground operating environment. It provides the capability to predict the total motion of selected points on the airplane. The analytical methods incorporated in the program and the operating instructions for the option are described. A program listing is included along with several example problems to aid in interpretation of the operating instructions and to illustrate program usage.

  2. MCPLOTS: a particle physics resource based on volunteer computing

    NASA Astrophysics Data System (ADS)

    Karneyeu, A.; Mijovic, L.; Prestel, S.; Skands, P. Z.

    2014-02-01

    The mcplots.cern.ch web site (mcplots) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the hepdata online database of experimental results and on the rivet Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the lhc@home 2.0 platform.

  3. Distributed Computation Resources for Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Doutriaux, C.; Williams, D. N.

    2014-12-01

    The Intergovernmental Panel on Climate Change (IPCC), prompted by the United Nations General Assembly, published a series of papers in 2013 as part of its Fifth Assessment Report (AR5) on the processes, impacts, and mitigation of climate change. The science used in these reports was generated by an international group of domain experts, who studied various scenarios of climate change through the use of highly complex computer models to simulate the Earth's climate over long periods of time. The resulting total data of approximately five petabytes are stored in a distributed data grid known as the Earth System Grid Federation (ESGF). Through the ESGF, consumers of the data can find and download data with limited capabilities for server-side processing. The Sixth Assessment Report (AR6) is already in the planning stages and is estimated to create as much as two orders of magnitude more data than the AR5 distributed archive. It is clear that the data analysis capabilities currently in use will be inadequate to allow the necessary science to be done with AR6 data—the data will just be too big. A major paradigm shift must evolve, from downloading data to local systems toward moving the analysis routines to the data and performing these computations on distributed platforms. In preparation for this need, the ESGF has started a Compute Working Team (CWT) to create solutions that allow users to perform distributed, high-performance data analytics on the AR6 data. The team will be designing and developing a general Application Programming Interface (API) to enable highly parallel, server-side processing throughout the ESGF data grid. This API will be integrated with multiple analysis and visualization tools, such as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), netCDF Operator (NCO), and others. This presentation will provide an update on the ESGF CWT's overall approach toward enabling the necessary storage proximal computational

  4. Orchestrating the XO Computer with Digital and Conventional Resources to Teach Mathematics

    ERIC Educational Resources Information Center

    Díaz, A.; Nussbaum, M.; Varela, I.

    2015-01-01

    Recent research has suggested that simply providing each child with a computer does not lead to an improvement in learning. Given that dozens of countries across the world are purchasing computers for their students, we ask which elements are necessary to improve learning when introducing digital resources into the classroom. Understood the…

  5. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    SciTech Connect

    Michael Pernice

    2010-09-01

    INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  6. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines.

    PubMed

    Gansäuer, Andreas; Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca; Grimme, Stefan

    2013-01-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results, with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1, and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction, where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically.
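
    The link between these computed barriers and the benchmarked rate constants is the Eyring equation; as a worked check, the quoted 0.5 kcal mol−1 deviation in ΔG‡ corresponds to roughly a factor of 2.3 in k at room temperature (RT ≈ 0.59 kcal mol−1 at 298 K):

        k = \frac{k_\mathrm{B} T}{h} \exp\!\left( -\frac{\Delta G^{\ddagger}}{RT} \right),
        \qquad
        \frac{k_1}{k_2} = \exp\!\left( \frac{\Delta\Delta G^{\ddagger}}{RT} \right)
                        \approx e^{0.5/0.59} \approx 2.3
        \quad \text{at } T = 298\ \mathrm{K}.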

  7. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines.

    PubMed

    Gansäuer, Andreas; Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca; Grimme, Stefan

    2013-01-01

    The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results, with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1, and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction, where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  8. Computational study of the rate constants and free energies of intramolecular radical addition to substituted anilines

    PubMed Central

    Seddiqzai, Meriam; Dahmen, Tobias; Sure, Rebecca

    2013-01-01

    Summary The intramolecular radical addition to aniline derivatives was investigated by DFT calculations. The computational methods were benchmarked by comparing the calculated values of the rate constant for the 5-exo cyclization of the hexenyl radical with the experimental values. The dispersion-corrected PW6B95-D3 functional provided very good results, with deviations for the free activation barrier compared to the experimental values of only about 0.5 kcal mol−1, and was therefore employed in further calculations. Corrections for intramolecular London dispersion and solvation effects in the quantum chemical treatment are essential to obtain consistent and accurate theoretical data. For the investigated radical addition reaction it turned out that the polarity of the molecules is important and that a combination of electrophilic radicals with preferably nucleophilic arenes results in the highest rate constants. This is opposite to the Minisci reaction, where the radical acts as nucleophile and the arene as electrophile. The substitution at the N-atom of the aniline is crucial. Methyl substitution leads to slower addition than phenyl substitution. Carbamates as substituents are suitable only when the radical center is not too electrophilic. No correlations between free reaction barriers and energies (ΔG‡ and ΔGR) are found. Addition reactions leading to indanes or dihydrobenzofurans are too slow to be useful synthetically. PMID:24062821

  9. Computer Monte Carlo simulation in quantitative resource estimation

    USGS Publications Warehouse

    Root, D.H.; Menzie, W.D.; Scott, W.A.

    1992-01-01

    The method of making quantitative assessments of mineral resources sufficiently detailed for economic analysis is outlined in three steps. The steps are (1) determination of types of deposits that may be present in an area, (2) estimation of the numbers of deposits of the permissible deposit types, and (3) combination by Monte Carlo simulation of the estimated numbers of deposits with the historical grades and tonnages of these deposits to produce a probability distribution of the quantities of contained metal. Two examples of the estimation of the number of deposits (step 2) are given. The first example is for mercury deposits in southwestern Alaska and the second is for lode tin deposits in the Seward Peninsula. The flow of the Monte Carlo simulation program is presented with particular attention to the dependencies between grades and tonnages of deposits and between grades of different metals in the same deposit. © 1992 Oxford University Press.
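
    Step (3) can be sketched in a few lines. The distribution parameters below are invented, and unlike the program described above, this toy version ignores the grade-tonnage dependencies the authors model explicitly:

        # Toy Monte Carlo endowment estimate: sample a number of deposits,
        # then a tonnage and grade for each, and accumulate contained metal.
        import numpy as np

        rng = np.random.default_rng(42)
        trials = 10_000
        metal = np.empty(trials)

        for i in range(trials):
            n_deposits = rng.poisson(3.0)  # step (2) estimate, assumed mean
            tonnes = rng.lognormal(mean=13.0, sigma=1.2, size=n_deposits)  # ore tonnage
            grade = rng.lognormal(mean=-5.5, sigma=0.6, size=n_deposits)   # metal fraction
            metal[i] = np.sum(tonnes * grade)  # contained metal, tonnes

        print("P10/P50/P90 contained metal (t):",
              np.percentile(metal, [10, 50, 90]).round(0))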

  10. PDBparam: Online Resource for Computing Structural Parameters of Proteins

    PubMed Central

    Nagarajan, R.; Archana, A.; Thangakani, A. Mary; Jemimah, S.; Velmurugan, D.; Gromiha, M. Michael

    2016-01-01

    Understanding the structure–function relationship in proteins is a longstanding goal in molecular and computational biology. The development of structure-based parameters has helped to relate the structure with the function of a protein. Although several structural features have been reported in the literature, no single server can calculate a wide-ranging set of structure-based features from protein three-dimensional structures. In this work, we have developed a web-based tool, PDBparam, for computing more than 50 structure-based features for any given protein structure. These features are classified into four major categories: (i) interresidue interactions, which include short-, medium-, and long-range interactions, contact order, long-range order, total contact distance, contact number, and multiple contact index, (ii) secondary structure propensities such as α-helical propensity, β-sheet propensity, and propensity of amino acids to exist at various positions of α-helix and amino acid compositions in high B-value regions, (iii) physicochemical properties containing ionic interactions, hydrogen bond interactions, hydrophobic interactions, disulfide interactions, aromatic interactions, surrounding hydrophobicity, and buriedness, and (iv) identification of binding site residues in protein–protein, protein–nucleic acid, and protein–ligand complexes. The server can be freely accessed at http://www.iitm.ac.in/bioinfo/pdbparam/. We suggest the use of PDBparam as an effective tool for analyzing protein structures. PMID:27330281
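
    As a flavor of category (i), the sketch below computes a per-residue contact number from C-alpha coordinates, using an assumed 8 Å cutoff; PDBparam's exact definitions and cutoffs may differ:

        # Contact number per residue: how many other C-alpha atoms lie
        # within the cutoff distance (coordinates here are fake).
        import numpy as np

        def contact_numbers(ca_xyz, cutoff=8.0):
            d = np.linalg.norm(ca_xyz[:, None, :] - ca_xyz[None, :, :], axis=-1)
            contacts = (d < cutoff) & ~np.eye(len(ca_xyz), dtype=bool)
            return contacts.sum(axis=1)

        ca = np.random.default_rng(1).uniform(0, 30, size=(50, 3))  # fake coordinates
        print(contact_numbers(ca)[:10])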

  11. Analysis of the effects of section 29 tax credits on reserve additions and production of gas from unconventional resources

    SciTech Connect

    Not Available

    1990-09-01

    Federal tax credits for production of natural gas from unconventional resources can stimulate drilling and reserve additions at a relatively low cost to the Treasury. This report presents the results of an analysis of the effects of a proposed extension of the Section 29 alternative fuels production credit specifically for unconventional gas. ICF Resources estimated the net effect of the extension of the credit (the difference between the development activity expected with the extension of the credit and that expected if the credit expires in December 1990 as scheduled). The analysis addressed the effect of tax credits on project economics and capital formation, drilling and reserve additions, production, impact on the US and regional economies, and the net public sector costs and incremental revenues. The analysis was based on explicit modeling of the three dominant unconventional gas resources: tight sands, coalbed methane, and Devonian shales. It incorporated the most current data on resource size, typical well recoveries and economics, and anticipated activity of the major producers. Each resource was further disaggregated for analysis based on distinct resource characteristics, development practices, regional economics, and historical development patterns.

  12. Education for Homeless Adults: Strategies for Implementation. Volume II - Resources and Additional Lessons.

    ERIC Educational Resources Information Center

    Hudson River Center for Program Development, Glenmont, NY.

    This document, the second in a series of guidebooks that were developed for educators of homeless adults in New York, offers strategies and plans for sample lessons in which a holistic approach is used to help homeless adults and families improve their lives through education. The guidebook begins with lists of print and nonprint resources,…

  13. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    NASA Astrophysics Data System (ADS)

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-01

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  14. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  15. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    DOE PAGES

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.

    2015-12-29

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  16. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examine their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing the cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  17. Laser powder bed fusion additive manufacturing of metals; physics, computational, and materials challenges

    SciTech Connect

    King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Khairallah, S. A.; Kamath, C.; Rubenchik, A. M.

    2015-12-15

    The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.

  18. Categorization of Computing Education Resources into the ACM Computing Classification System

    SciTech Connect

    Chen, Yinlin; Bogen, Paul Logasa; Fox, Dr. Edward A.; Hsieh, Dr. Haowei; Cassel, Dr. Lillian N.

    2012-01-01

    The Ensemble Portal harvests resources from multiple heterogeneous federated collections. Managing these dynamically increasing collections requires an automatic mechanism to categorize records into corresponding topics. We propose an approach that uses existing ACM DL metadata to build classifiers for harvested resources in the Ensemble project. We also present our experience utilizing the Amazon Mechanical Turk platform to build ground-truth training data sets from Ensemble collections.

  19. T-ReX: interactive global illumination of massive models on heterogeneous computing resources.

    PubMed

    Kim, Tae-Joon; Sun, Xin; Yoon, Sung-Eui

    2014-03-01

    We propose several interactive global illumination techniques for a diverse set of massive models. We integrate these techniques within a progressive rendering framework that aims to achieve both a high rendering throughput and an interactive responsiveness. To achieve a high rendering throughput, we utilize heterogeneous computing resources consisting of CPU and GPU. To reduce expensive data transmission costs between CPU and GPU, we propose to use separate, decoupled data representations dedicated to each of the CPU and GPU. Our representations consist of geometric and volumetric parts, provide different levels of resolution, and support progressive global illumination for massive models. We also propose a novel, augmented volumetric representation that provides additional geometric resolutions within our volumetric representation. In addition, we employ tile-based rendering and propose a tile ordering technique that considers visual perception. We have tested our approach with a diverse set of large-scale models, including CAD, scanned, and simulation models, that consist of more than 300 million triangles. By using our methods, we are able to achieve ray processing performances of 3M to 20M rays per second, while limiting response time to users within 15 to 67 ms. We also allow dynamic modifications of light and interactive setting of materials, while efficiently supporting novel view rendering.
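
    As a concrete illustration of the tile-ordering step, the toy Python sketch below renders the tiles nearest the screen centre first, on the assumption that the viewer's fixation is most likely there; this centre-first criterion is only a plausible stand-in for the paper's actual perception-based ordering.

        # Toy perception-aware tile ordering: tiles closest to the screen
        # centre (the likeliest fixation point) are rendered first. The
        # centre-first criterion is an assumption, not the paper's exact rule.
        def tile_order(cols, rows):
            cx, cy = (cols - 1) / 2.0, (rows - 1) / 2.0
            tiles = [(c, r) for r in range(rows) for c in range(cols)]
            return sorted(tiles, key=lambda t: (t[0] - cx) ** 2 + (t[1] - cy) ** 2)

        print(tile_order(4, 3)[:4])   # the most central tiles come first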

  20. T-ReX: Interactive Global Illumination of Massive Models on Heterogeneous Computing Resources.

    PubMed

    Kim, Tae-Joon; Sun, Xin; Yoon, Sung-Eui

    2013-08-01

    We propose several interactive global illumination techniques for a diverse set of massive models. We integrate these techniques within a progressive rendering framework that aims to achieve both a high rendering throughput and an interactive responsiveness. In order to achieve a high rendering throughput, we utilize heterogeneous computing resources consisting of CPU and GPU. To reduce expensive data transmission costs between CPU and GPU, we propose to use separate, decoupled data representations dedicated to each of the CPU and GPU. Our representations consist of geometric and volumetric parts, provide different levels of resolution, and support progressive global illumination for massive models. We also propose a novel, augmented volumetric representation that provides additional geometric resolutions within our volumetric representation. In addition, we employ tile-based rendering and propose a tile ordering technique that considers visual perception. We have tested our approach with a diverse set of large-scale models, including CAD, scanned, and simulation models, that consist of more than 300 million triangles. By using our methods, we are able to achieve ray processing performances of 3M to 20M rays per second, while limiting response time to users within 15 ms to 67 ms. We also allow dynamic modifications of light and interactive setting of materials, while efficiently supporting novel view rendering.

  1. OBS/GMPLS Interworking Network with Scalable Resource Discovery for Global Grid Computing

    NASA Astrophysics Data System (ADS)

    Wu, J.; Liu, L.; Hong, X. B.; Lin, J. T.

    In recent years, Grid computing has become more common in industry and the research community, and it will open to the consumer market in the future. The final objective is the achievement of global Grid computing, in which computing and networks are flexibly integrated across the world and a scalable resource discovery scheme is implemented. In this paper, a promising architecture, i.e., an optical burst switching (OBS)/generalized multi-protocol label switching (GMPLS) interworking network with a Peer-to-Peer (P2P)-based scheme for resource discovery, is investigated to realize a highly scalable and flexible platform for Grids. Experimental results show that this architecture is suitable and efficient for future global Grid computing.

  2. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    PubMed

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis."

  3. A survey on resource allocation in high performance distributed computing systems

    SciTech Connect

    Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul; Khan, Samee Ullah; Bickler, Gage; Min-Allah, Nasro; Qureshi, Muhammad Bilal; Zhang, Limin; Yongji, Wang; Ghani, Nasir; Kolodziej, Joanna; Zomaya, Albert Y.; Xu, Cheng-Zhong; Balaji, Pavan; Vishnu, Abhinav; Pinel, Fredric; Pecero, Johnatan E.; Kliazovich, Dzmitry; Bouvry, Pascal; Li, Hongxiang; Wang, Lizhe; Chen, Dan; Rayes, Ammar

    2013-11-01

    An efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects are dedicated to large-scale distributed computing systems that have designed and developed resource allocation mechanisms with a variety of architectures and services. This study reports a comprehensive survey describing resource allocation in various HPC systems. The aim of the work is to aggregate the existing HPC solutions under a joint framework and to provide a thorough analysis of the characteristics of resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all HPC classes. Therefore, a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified HPC systems into three broad categories, namely: (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are cataloged into pure software and hybrid/hardware solutions. The system classification is used to identify approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.

  4. Towards Optimal Allocation of Computer Resources: Trade-offs between Uncertainty, Discretization and Model Reduction

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Leube, P.; Zinkhahn, M.; de Barros, F.; Rajagopal, R.

    2012-12-01

    In recent years, there has been an increase in the computational complexity of hydro(geo)logical models. This has been driven by new problems addressing large-scale relationships like global warming, reactive transport on the catchment scale or CO2 sequestration. Computational model complexity becomes even more drastic when facing the ubiquitous need for uncertainty quantification and risk assessment in the environmental sciences. Computational complexity can be broken down into contributions from spatial, temporal and stochastic resolution, e.g., spatial grid resolution, time step size and the number of repeated simulations dedicated to quantifying uncertainty. Controlling these resolutions allows keeping the computational cost at a tractable level whilst guaranteeing accurate and robust predictions. Having this possibility at hand triggers our overall driving question: What is the optimal resolution for independent variables (i.e. time and space) to achieve reliable prediction in the presence of uncertainty? Can we determine an overall optimum combination of the number of realizations and the spatial and temporal resolutions needed for overall statistical/physical convergence of model predictions? If so, how can we find it? In other words, how can we optimally allocate available computational resources in order to achieve the highest accuracy associated with a given prediction goal? In this work, we present an approach that allows one to determine the compromise among different model dimensions (space, time, probability) when allocating computational resources. The overall goal is to maximize the prediction accuracy given limited computational resources. Our analysis is based on the idea of jointly considering the discretization errors and computational costs of all individual model dimensions. This yields a cost-to-error surface which serves to aid modelers in finding an optimal allocation of the computational resources. As a pragmatic way to proceed, we propose running small
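
    A minimal Python sketch of the cost-to-error idea follows, assuming illustrative error scalings (spatial error ~ h^p, temporal error ~ dt^q, statistical error ~ 1/sqrt(N)) and a cost proportional to cells x time steps x realizations; all constants, exponents, and the budget are assumptions, not the paper's calibrated values.

        import itertools
        import math

        # Toy cost-to-error trade-off: statistical error falls as 1/sqrt(N),
        # spatial error as h^p, temporal error as dt^q (constants assumed).
        def error(h, dt, n, p=2.0, q=1.0):
            return 1.0 * h**p + 0.5 * dt**q + 2.0 / math.sqrt(n)

        def cost(h, dt, n, domain=100.0, horizon=10.0):
            return n * (domain / h)**3 * (horizon / dt)  # cells x steps x runs

        budget = 1e9
        grid = itertools.product([0.5, 1, 2, 4], [0.01, 0.1, 0.5, 1], [10, 100, 1000])
        feasible = [(error(h, dt, n), h, dt, n)
                    for h, dt, n in grid if cost(h, dt, n) <= budget]
        best = min(feasible)   # lowest-error corner of the cost-to-error surface
        print("best error %.3f at h=%s, dt=%s, realizations=%s" % best)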

  5. Teaching English as an Additional Language 5-11: A Whole School Resource File

    ERIC Educational Resources Information Center

    Scott, Caroline

    2011-01-01

    There are increasing numbers of children with little or no English entering English speaking mainstream lessons. This often leaves them with unique frustrations due to limited English language proficiency and disorientation. Teachers often feel unable to cater sufficiently for these new arrivals. "Teaching English as an Additional Language Ages…

  6. Using additive manufacturing in accuracy evaluation of reconstructions from computed tomography.

    PubMed

    Smith, Erin J; Anstey, Joseph A; Venne, Gabriel; Ellis, Randy E

    2013-05-01

    Bone models derived from patient imaging and fabricated using additive manufacturing technology have many potential uses including surgical planning, training, and research. This study evaluated the accuracy of bone surface reconstruction of two diarthrodial joints, the hip and shoulder, from computed tomography. Image segmentation of the tomographic series was used to develop a three-dimensional virtual model, which was fabricated using fused deposition modelling. Laser scanning was used to compare cadaver bones, printed models, and intermediate segmentations. The overall bone reconstruction process had a reproducibility of 0.3 ± 0.4 mm. Production of the model had an accuracy of 0.1 ± 0.1 mm, while the segmentation had an accuracy of 0.3 ± 0.4 mm, indicating that segmentation accuracy was the key factor in reconstruction. Generally, the shape of the articular surfaces was reproduced accurately, with poorer accuracy near the periphery of the articular surfaces, particularly in regions with periosteum covering and where osteophytes were apparent.

  7. Adolescents, Health Education, and Computers: The Body Awareness Resource Network (BARN).

    ERIC Educational Resources Information Center

    Bosworth, Kris; And Others

    1983-01-01

    The Body Awareness Resource Network (BARN) is a computer-based system designed as a confidential, nonjudgmental source of health information for adolescents. Topics include alcohol and other drugs, diet and activity, family communication, human sexuality, smoking, and stress management; programs are available for high school and middle school…

  8. Computer-Mediated Communications for Distance Education and Training: Literature Review and International Resources.

    ERIC Educational Resources Information Center

    Wells, Rosalie A.

    This report is intended to provide instructors and designers concerned with training for the National Guard and Army Reserve (the Reserve Component--RC) with a practical review of research findings on the successful implementation of computer-mediated communication (CMC) for distance education and a reference guide to international resources and…

  9. Computer-Assisted Instruction in Political Science. Instructional Resource Monograph No. 4.

    ERIC Educational Resources Information Center

    Pool, Jonathan, Ed.

    This six-author study highlights the most significant attributes of Computer Assisted Instruction (CAI) and explains the techniques of authoring CAI lessons in political science. Fourth in a series of Instructional Resource Monographs, the volume aims to inform political science teachers and students about what CAI has to offer on a…

  10. Computer User's Guide to the Protection of Information Resources. NIST Special Publication 500-171.

    ERIC Educational Resources Information Center

    Helsing, Cheryl; And Others

    Computers have changed the way information resources are handled. Large amounts of information are stored in one central place and can be accessed from remote locations. Users have a personal responsibility for the security of the system and the data stored in it. This document outlines the user's responsibilities and provides security and control…

  11. The Ever-Present Demand for Public Computing Resources. CDS Spotlight

    ERIC Educational Resources Information Center

    Pirani, Judith A.

    2014-01-01

    This Core Data Service (CDS) Spotlight focuses on public computing resources, including lab/cluster workstations in buildings, virtual lab/cluster workstations, kiosks, laptop and tablet checkout programs, and workstation access in unscheduled classrooms. The findings are derived from 758 CDS 2012 participating institutions. A dataset of 529…

  12. City University of New York--Availability of Student Computer Resources. Report.

    ERIC Educational Resources Information Center

    McCall, H. Carl

    This audit reports on the availability of computer resources at the City University of New York's (CUNY) senior colleges. CUNY is the largest urban and the third largest public university system in the United States. Of the 19 CUNY campuses located throughout the five boroughs, 11 are senior colleges offering four-year degrees. For the fall 2001…

  13. Realizing the Potential of Information Resources: Information, Technology, and Services. Track 8: Academic Computing and Libraries.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    Eight papers are presented from the 1995 CAUSE conference track on academic computing and library issues faced by managers of information technology at colleges and universities. The papers include: (1) "Where's the Beef?: Implementation of Discipline-Specific Training on Internet Resources" (Priscilla Hancock and others); (2) "'What I Really Want…

  14. Developing Online Learning Resources: Big Data, Social Networks, and Cloud Computing to Support Pervasive Knowledge

    ERIC Educational Resources Information Center

    Anshari, Muhammad; Alas, Yabit; Guan, Lim Sei

    2016-01-01

    Utilizing online learning resources (OLR) from multiple channels in learning activities promises extended benefits, moving from a traditional learning-centred approach to a collaborative learning-centred approach that emphasises pervasive learning anywhere and anytime. While compiling big data, cloud computing, and the semantic web into OLR offers a broader spectrum of…

  15. 'Seed' Has Germinated: Staff Resources on Fieldwork, Lab Work, Computer-aided Learning and Employer Links.

    ERIC Educational Resources Information Center

    Chalkley, Brian; Elmes, Andrew

    1999-01-01

    Addresses the Science Education Enhancement and Development (SEED) project, providing a brief summary of the resources produced within the four main areas of SEED: (1) fieldwork; (2) laboratory work; (3) computer-aided learning and assessment; and (4) employer links. (CMK)

  16. Towards optimal allocation of computer resources: trade-offs between uncertainty, discretization and model reduction

    NASA Astrophysics Data System (ADS)

    Zinkhahn, M.; Leube, P.; Nowak, W.; de Barros, F. P. J.

    2012-04-01

    In recent years, there has been an increase in the computational complexity of hydro(geo)logical models. This has been driven by new problems addressing large-scale relationships like global warming, reactive transport on the catchment scale or CO2 sequestration. Computational model complexity becomes even more drastic when facing the ubiquitous need for uncertainty quantification and risk assessment in the environmental sciences. Complexity can be broken down into contributions from spatial, temporal and stochastic resolution, e.g., spatial grid resolution, time step size and the number of repeated simulations dedicated to quantifying uncertainty. Controlling these resolutions allows keeping the computational cost at a tractable level whilst guaranteeing accurate and robust predictions. Having this possibility at hand triggers our overall driving question: What is the optimal resolution for independent variables (i.e. time and space) to achieve reliable prediction in the presence of uncertainty? Can we determine an overall optimum combination of the number of realizations and the spatial and temporal resolutions needed for overall statistical/physical convergence of model predictions? If so, how can we find it? In other words, how can we optimally allocate available computational resources in order to achieve the highest accuracy associated with a given prediction goal? In this work, we present an approach that allows one to determine the compromise among different model dimensions (space, time, probability) when allocating computational resources. The overall goal is to maximize the prediction accuracy given limited computational resources. Our analysis is based on the idea of jointly considering the discretization errors and computational costs of all individual model dimensions. This yields a cost-to-error surface which serves to aid modelers in finding an optimal allocation of the computational resources. As a pragmatic way to proceed, we propose running small cost

  17. Production control system: A resource management and load leveling system for mainframe computers

    SciTech Connect

    Wood, R.R.

    1994-03-01

    The Livermore Computing Production Control System, commonly called the PCS, is described in this paper. The intended audiences for this document are system administrators and resource managers of computer systems. In 1990, the Livermore Computing Center committed itself to convert its supercomputer operations from the New Livermore TimeSharing System (NLTSS) to UNIX based systems. This was a radical change for Livermore Computing in that over thirty years had elapsed during which the only operating environments used on production platforms were LTSS and then later NLTSS. Elaborate, and sometimes obscure, paradigms (to others) were developed to help the lab's scientists productively use the machines and to accurately account for their use to government oversight agencies. Resource management is difficult in typical UNIX and related systems. The weakness involves the allocation, control and delivery of the usage of computer resources. This is a result of the origins and subsequent development of UNIX which started as a system for small platforms in a collegial or departmental atmosphere without stringent or complex budgetary requirements. Accounting also is weak, being generally limited to reporting that a user used an amount of time on a process. A process accounting record is made only after the completion of a process and then only if the process does not terminate as a result of a system "panic."

  18. Avoiding Split Attention in Computer-Based Testing: Is Neglecting Additional Information Facilitative?

    ERIC Educational Resources Information Center

    Jarodzka, Halszka; Janssen, Noortje; Kirschner, Paul A.; Erkens, Gijsbert

    2015-01-01

    This study investigated whether design guidelines for computer-based learning can be applied to computer-based testing (CBT). Twenty-two students completed a CBT exam with half of the questions presented in a split-screen format that was analogous to the original paper-and-pencil version and half in an integrated format. Results show that students…

  19. Perspectives on the utilization of aquaculture coproduct in Europe and Asia: prospects for value addition and improved resource efficiency.

    PubMed

    Newton, Richard; Telfer, Trevor; Little, Dave

    2014-01-01

    Aquaculture has often been criticized for its environmental impacts, especially the efficiency with which global fisheries resources are used in aquafeeds, among other concerns. However, little attention has been paid to the contribution of coproducts from aquaculture, which can represent between 40% and 70% of production. These have often been underutilized and could be redirected to maximize the efficient use of resource inputs, including reducing the burden on fisheries resources. In this review, we identify strategies to enhance the overall value of the harvested yield, including noneffluent processing coproducts, for three of the most important global aquaculture species, and discuss the current and prospective utilization of these resources for value addition and environmental impact reduction. The review concludes that in Europe coproducts are often underutilized because of logistical reasons, such as the disconnected nature of the value chain and perceived legislative barriers. In Asia, by contrast, most coproducts are used, often innovatively, but not to their full economic potential and sometimes with possible human health and biosecurity risks. These include the possible spread of diseased material and low traceability in some circumstances. Full economic and environmental appraisal is long overdue for the current and potential strategies available for coproduct utilization.

  20. A sea urchin genome project: sequence scan, virtual map, and additional resources.

    PubMed

    Cameron, R A; Mahairas, G; Rast, J P; Martinez, P; Biondi, T R; Swartzell, S; Wallace, J C; Poustka, A J; Livingston, B T; Wray, G A; Ettensohn, C A; Lehrach, H; Britten, R J; Davidson, E H; Hood, L

    2000-08-15

    Results of a first-stage Sea Urchin Genome Project are summarized here. The species chosen was Strongylocentrotus purpuratus, a research model of major importance in developmental and molecular biology. A virtual map of the genome was constructed by sequencing the ends of 76,020 bacterial artificial chromosome (BAC) recombinants (average length, 125 kb). The BAC-end sequence tag connectors (STCs) occur an average of 10 kb apart, and, together with restriction digest patterns recorded for the same BAC clones, they provide immediate access to contigs of several hundred kilobases surrounding any gene of interest. The STCs survey >5% of the genome and provide the estimate that this genome contains approximately 27,350 protein-coding genes. The frequency distribution and canonical sequences of all middle and highly repetitive sequence families in the genome were obtained from the STCs as well. The 500-kb Hox gene complex of this species is being sequenced in its entirety. In addition, arrayed cDNA libraries of >10^5 clones each were constructed from every major stage of embryogenesis, several individual cell types, and adult tissues and are available to the community. The accumulated STC data and an expanding expressed sequence tag database (at present including >12,000 sequences) have been reported to GenBank and are accessible on public web sites.

  1. Current status and prospects of computational resources for natural product dereplication: a review.

    PubMed

    Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi

    2016-03-01

    Research in natural products has always enhanced drug discovery by providing new and unique chemical compounds. However, recently, drug discovery from natural products is slowed down by the increasing chance of re-isolating known compounds. Rapid identification of previously isolated compounds in an automated manner, called dereplication, steers researchers toward novel findings, thereby reducing the time and effort for identifying new drug leads. Dereplication identifies compounds by comparing processed experimental data with those of known compounds, and so, diverse computational resources such as databases and tools to process and compare compound data are necessary. Automating the dereplication process through the integration of computational resources has always been an aspired goal of natural product researchers. To increase the utilization of current computational resources for natural products, we first provide an overview of the dereplication process, and then list useful resources, categorizing into databases, methods and software tools and further explaining them from a dereplication perspective. Finally, we discuss the current challenges to automating dereplication and proposed solutions.
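
    To make the comparison step concrete, here is a minimal Python sketch of library matching by cosine similarity over binned m/z peaks; the binning scheme, the threshold, and the tiny in-memory "library" are illustrative assumptions, since real dereplication pipelines rely on curated databases and much richer scoring.

        import numpy as np

        def bin_spectrum(peaks, max_mz=1000, bin_width=1.0):
            # Convert (mz, intensity) pairs into a fixed-length, unit-norm vector.
            vec = np.zeros(int(max_mz / bin_width))
            for mz, inten in peaks:
                idx = int(mz / bin_width)
                if idx < len(vec):
                    vec[idx] += inten
            n = np.linalg.norm(vec)
            return vec / n if n > 0 else vec

        def dereplicate(query_peaks, library, threshold=0.8):
            # Return library entries whose cosine similarity exceeds threshold.
            q = bin_spectrum(query_peaks)
            hits = [(name, float(q @ bin_spectrum(peaks)))
                    for name, peaks in library.items()]
            return sorted([h for h in hits if h[1] >= threshold],
                          key=lambda h: -h[1])

        library = {"quercetin": [(303.05, 100.0), (153.02, 40.0)],
                   "caffeine": [(195.09, 100.0), (138.07, 30.0)]}
        print(dereplicate([(303.05, 90.0), (153.02, 35.0)], library))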

  2. An Extensible Scientific Computing Resources Integration Framework Based on Grid Service

    NASA Astrophysics Data System (ADS)

    Cui, Binge; Chen, Xin; Song, Pingjian; Liu, Rongjie

    Scientific computing resources (e.g., components, dynamically linkable libraries, etc.) are very valuable assets for scientific research. However, due to historical reasons, most computing resources can’t be shared by other people. The emergence of Grid computing provides a turning point for solving this problem. Legacy applications can be abstracted and encapsulated into Grid services, and they may be found and invoked on the Web using SOAP messages. The Grid service is loosely coupled with the external JAR or DLL, which builds a bridge from users to computing resources. We defined an XML schema to describe the functions and interfaces of the applications. This information can be acquired by users by invoking the “getCapabilities” operation of the Grid service. We also proposed the concept of a class pool to eliminate memory leaks when invoking external JARs using reflection. The experiment shows that the class pool not only avoids PermGen space waste and Tomcat server exceptions, but also significantly improves application speed. The integration framework has been implemented successfully in a real project.
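
    The class pool idea can be evoked with a short Python analogue (the paper's implementation is Java reflection over external JARs): load each external class once and reuse it, rather than re-loading it on every service invocation. The file path and class name in the commented usage line are hypothetical.

        import importlib.util

        class ClassPool:
            # Cache dynamically loaded classes so repeated service calls reuse
            # one loaded module instead of re-loading it on every invocation
            # (a Python analogue of the paper's Java class pool).
            def __init__(self):
                self._cache = {}

            def get(self, path, class_name):
                key = (path, class_name)
                if key not in self._cache:
                    spec = importlib.util.spec_from_file_location("ext_mod", path)
                    mod = importlib.util.module_from_spec(spec)
                    spec.loader.exec_module(mod)
                    self._cache[key] = getattr(mod, class_name)
                return self._cache[key]

        pool = ClassPool()
        # Solver = pool.get("/opt/solvers/fft.py", "FFTSolver")  # hypothetical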

  3. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  4. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    SciTech Connect

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; Schuster, Heiko; Ternette, Nicola; Alpízar, Adán; Schittenhelm, Ralf B.; Ramarathinam, Sri H.; Lindestam Arlehamn, Cecilia S.; Chiek Koh, Ching; Gillet, Ludovic C.; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David S.; Deutsch, Eric W.; Moritz, Robert L.; Purcell, Anthony W.; Rammensee, Hans -Georg; Stevanovic, Stefan; Aebersold, Ruedi

    2015-07-08

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies.

  5. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    SciTech Connect

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; Schuster, Heiko; Ternette, Nicola; Alpizar, Adan; Schittenhelm, Ralf B.; Ramarathinam, Sri Harsha; Lindestam-Arlehamn, Cecilia S.; Koh, Ching Chiek; Gillet, Ludovic; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David; Deutsch, Eric W.; Moritz, Robert L.; Purcell, Anthony; Rammensee, Hans-Georg; Stevanovic, Stevan; Aebersold, Ruedi

    2015-07-08

    We present a novel proteomics-based workflow and an open source data and computational resource for reproducibly identifying and quantifying HLA-associated peptides at high-throughput. The provided resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra and the analysis of quantitative digital maps of HLA peptidomes generated by SWATH mass spectrometry (MS). This is the first community-based study towards the development of a robust platform for the reproducible and quantitative measurement of HLA peptidomes, an essential step towards the design of efficient immunotherapies.

  6. Addition of flexible body option to the TOLA computer program. Part 2: User and programmer documentation

    NASA Technical Reports Server (NTRS)

    Dick, J. W.; Benda, B. J.

    1975-01-01

    User and programmer oriented documentation for the flexible body option of the Takeoff and Landing Analysis (TOLA) computer program are provided. The user information provides sufficient knowledge of the development and use of the option to enable the engineering user to successfully operate the modified program and understand the results. The programmer's information describes the option structure and logic enabling a programmer to make major revisions to this part of the TOLA computer program.

  7. The relative effectiveness of computer-based and traditional resources for education in anatomy.

    PubMed

    Khot, Zaid; Quinlan, Kaitlyn; Norman, Geoffrey R; Wainman, Bruce

    2013-01-01

    There is increasing use of computer-based resources to teach anatomy, although no study has compared computer-based learning to traditional methods. In this study, we examine the effectiveness of three formats of anatomy learning: (1) a virtual reality (VR) computer-based module, (2) a static computer-based module providing Key Views (KV), and (3) a plastic model. We conducted a controlled trial in which 60 undergraduate students had ten minutes to study the names of 20 different pelvic structures. The outcome measure was a 25-item short answer test consisting of 15 nominal and 10 functional questions, based on a cadaveric pelvis. All subjects also took a brief mental rotations test (MRT) as a measure of spatial ability, used as a covariate in the analysis. Data were analyzed with repeated measures ANOVA. The group learning from the model performed significantly better than the other two groups on the nominal questions (Model 67%; KV 40%; VR 41%; effect sizes 1.19 and 1.29, respectively). There was no difference between the KV and VR groups. There was no difference between the groups on the functional questions (Model 28%; KV 23%; VR 25%). Computer-based learning resources appear to have significant disadvantages compared to traditional specimens in learning nominal anatomy. Consistent with previous research, virtual reality shows no advantage over static presentation of key views.

  8. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    NASA Astrophysics Data System (ADS)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo procedure, we explore the global latency for optimal to suboptimal resource assignments at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing define a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
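
    A stripped-down sketch of the Metropolis Monte Carlo exploration follows, assuming a toy cost function (total node overload) in place of the paper's latency model; the node count, capacities, task loads, and temperatures are all illustrative.

        import math, random

        random.seed(0)
        nodes = list(range(8))
        capacity = {n: 5 for n in nodes}
        tasks = [random.randint(1, 3) for _ in range(15)]   # task loads
        assign = [random.choice(nodes) for _ in tasks]      # initial placement

        def overload(assign):
            # Toy cost: total load in excess of capacity, summed over nodes.
            load = {n: 0 for n in nodes}
            for t, n in zip(tasks, assign):
                load[n] += t
            return sum(max(0, load[n] - capacity[n]) for n in nodes)

        def metropolis(assign, T, steps=5000):
            cost = overload(assign)
            for _ in range(steps):
                i, new = random.randrange(len(tasks)), random.choice(nodes)
                old = assign[i]; assign[i] = new
                delta = overload(assign) - cost
                if delta <= 0 or random.random() < math.exp(-delta / T):
                    cost += delta                           # accept move
                else:
                    assign[i] = old                         # reject move
            return cost

        for T in (0.1, 1.0, 10.0):   # low T -> near-optimal, high T -> suboptimal
            print("T=%.1f residual overload=%d" % (T, metropolis(list(assign), T)))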

  9. GANGA: A tool for computational-task management and easy access to Grid resources

    NASA Astrophysics Data System (ADS)

    Mościcki, J. T.; Brochu, F.; Ebke, J.; Egede, U.; Elmsheuser, J.; Harrison, K.; Jones, R. W. L.; Lee, H. C.; Liko, D.; Maier, A.; Muraru, A.; Patrick, G. N.; Pajchel, K.; Reece, W.; Samset, B. H.; Slater, M. W.; Soroko, A.; Tan, C. L.; van der Ster, D. C.; Williams, M.

    2009-11-01

    In this paper, we present the computational task-management tool GANGA, which allows for the specification, submission, bookkeeping and post-processing of computational tasks on a wide set of distributed resources. GANGA has been developed to solve a problem increasingly common in scientific projects, which is that researchers must regularly switch between different processing systems, each with its own command set, to complete their computational tasks. GANGA provides a homogeneous environment for processing data on heterogeneous resources. We give examples from High Energy Physics, demonstrating how an analysis can be developed on a local system and then transparently moved to a Grid system for processing of all available data. GANGA has an API that can be used via an interactive interface, in scripts, or through a GUI. Specific knowledge about types of tasks or computational resources is provided at run-time through a plugin system, making new developments easy to integrate. We give an overview of the GANGA architecture, give examples of current use, and demonstrate how GANGA can be used in many different areas of science. Catalogue identifier: AEEN_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEN_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 224 590 No. of bytes in distributed program, including test data, etc.: 14 365 315 Distribution format: tar.gz Programming language: Python Computer: personal computers, laptops Operating system: Linux/Unix RAM: 1 MB Classification: 6.2, 6.5 Nature of problem: Management of computational tasks for scientific applications on heterogeneous distributed systems, including local, batch farms, opportunistic clusters and
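
    The run-time plugin idea can be evoked with a generic registry pattern in Python; the class and method names below are hypothetical stand-ins, not GANGA's actual API.

        BACKENDS = {}

        def register_backend(name):
            # Decorator that registers a backend plugin at import time.
            def wrap(cls):
                BACKENDS[name] = cls
                return cls
            return wrap

        @register_backend("local")
        class LocalBackend:
            def submit(self, job):
                print("running %s locally" % job)

        @register_backend("grid")
        class GridBackend:
            def submit(self, job):
                print("submitting %s to the grid" % job)

        # The same task description runs on either resource:
        for target in ("local", "grid"):
            BACKENDS[target]().submit("analysis.py")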

  10. Can Computer-Assisted Discovery Learning Foster First Graders' Fluency with the Most Basic Addition Combinations?

    ERIC Educational Resources Information Center

    Baroody, Arthur J.; Eiland, Michael D.; Purpura, David J.; Reid, Erin E.

    2013-01-01

    In a 9-month training experiment, 64 first graders with a risk factor were randomly assigned to computer-assisted structured discovery of the add-1 rule (e.g., the sum of 7 + 1 is the number after "seven" when we count), unstructured discovery learning of this regularity, or an active-control group. Planned contrasts revealed that the add-1…

  11. Planning health education: Internet and computer resources in southwestern Nigeria. 2000-2001.

    PubMed

    Oyadoke, Adebola A; Salami, Kabiru K; Brieger, William R

    The use of the Internet as a health education tool and as a resource in health education planning is widely accepted as the norm in industrialized countries. Unfortunately, access to computers and the Internet is quite limited in developing countries. Not all licensed service providers operate, many users are actually foreign nationals, telephone connections are unreliable, and electricity supplies are intermittent. In this context, computer, e-mail, Internet, and CD-ROM use by health and health education program officers in five states in southwestern Nigeria was assessed to document their present access and use. Eight of the 30 organizations visited were government health ministry departments, while the remainder were non-governmental organizations (NGOs). Six NGOs and four State Ministry of Health (MOH) departments had no computers, but nearly two-thirds of both types of agency had e-mail, less than one-third had Web browsing facilities, and six had CD-ROMs, all of which were NGOs. Only 25 of the 48 individual respondents had computer use skills. Narrative responses from individual employees showed a qualitative difference in computer and Internet access and use by type of agency. NGO staff in organizations with computers indicated having relatively free access to a computer and the Internet and used these for both program planning and administrative purposes. In government offices it appeared that computers were more likely to be located in administrative or statistics offices and used for management tasks like salaries and correspondence, limiting the access of individual health staff. These two different organizational cultures must be considered when plans are made for increasing computer availability and skills for health education planning.

  12. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    NASA Astrophysics Data System (ADS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-07-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi-Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources.
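
    A toy numpy sketch of the underlying idea: for a Hamiltonian that conserves particle number, the occupation-number basis splits into blocks, and simulation can proceed in one block of smaller dimension. The dense-matrix construction here is illustrative only; the paper performs the reduction in operator space.

        import numpy as np

        def fixed_particle_block(H, n_qubits, n_particles):
            # Indices of basis states with the given particle number; for a
            # number-conserving H, all elements outside these blocks vanish.
            idx = [s for s in range(2 ** n_qubits)
                   if bin(s).count("1") == n_particles]
            return H[np.ix_(idx, idx)], idx

        # Stand-in 16 x 16 Hermitian matrix on 4 qubits (entries are arbitrary):
        rng = np.random.default_rng(1)
        H = rng.normal(size=(16, 16)); H = (H + H.T) / 2
        block, states = fixed_particle_block(H, n_qubits=4, n_particles=2)
        print(block.shape)   # (6, 6): representable with 3 qubits instead of 4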

  13. Optimizing qubit resources for quantum chemistry simulations in second quantization on a quantum computer

    NASA Astrophysics Data System (ADS)

    Moll, Nikolaj; Fuhrer, Andreas; Staar, Peter; Tavernelli, Ivano

    2016-07-01

    Quantum chemistry simulations on a quantum computer suffer from the overhead needed for encoding the Fermionic problem in a system of qubits. By exploiting the block diagonality of a Fermionic Hamiltonian, we show that the number of required qubits can be reduced while the number of terms in the Hamiltonian will increase. All operations for this reduction can be performed in operator space. The scheme is conceived as a pre-computational step that would be performed prior to the actual quantum simulation. We apply this scheme to reduce the number of qubits necessary to simulate both the Hamiltonian of the two-site Fermi–Hubbard model and the hydrogen molecule. Both quantum systems can then be simulated with a two-qubit quantum computer. Despite the increase in the number of Hamiltonian terms, the scheme still remains a useful tool to reduce the dimensionality of specific quantum systems for quantum simulators with a limited number of resources.

  14. The Effects of Computer-Assisted Instruction on Student Achievement in Addition and Subtraction at First Grade Level.

    ERIC Educational Resources Information Center

    Spivey, Patsy M.

    This study was conducted to determine whether the traditional classroom approach to instruction involving the addition and subtraction of number facts (digits 0-6) is more or less effective than the traditional classroom approach plus a commercially-prepared computer game. A pretest-posttest control group design was used with two groups of first…

  15. Identification of Students' Intuitive Mental Computational Strategies for 1, 2 and 3 Digits Addition and Subtraction: Pedagogical and Curricular Implications

    ERIC Educational Resources Information Center

    Ghazali, Munirah; Alias, Rohana; Ariffin, Noor Asrul Anuar; Ayub, Ayminsyadora

    2010-01-01

    This paper reports on a study to examine mental computation strategies used by Year 1, Year 2, and Year 3 students to solve addition and subtraction problems. The participants in this study were twenty five 7 to 9 year-old students identified as excellent, good and satisfactory in their mathematics performance from a school in Penang, Malaysia.…

  16. Impact of Changing Computer Technology on Hydrologic and Water Resource Modeling (Paper 7R0049)

    NASA Astrophysics Data System (ADS)

    Loucks, Daniel P.; Fedra, Kurt

    1987-03-01

    The increasing availability of substantial computer power at relatively low costs and the increasing ease of using computer graphics, of communicating with other computers and data bases, and of programming using high-level problem-oriented computer languages, is providing new opportunities and challenges for those developing and using hydrologic and water resources models. This paper reviews some of the progress made towards the development and application of computer support systems designed to aid those involved in analyzing hydrologic data and in operating, managing, or planning water resource facilities. Such systems of hardware and software are being designed to allow direct and easy access to a broad and heterogeneous group of users. These systems often combine data-base management; simulation and optimization techniques; symbolic colored displays; heuristic, qualitative approaches; and possibly artificial intelligence methods in an interactive, user-controlled, easily accessible interface. Individuals involved in the use of such systems are not only those with technical training, but also those representing different interest groups and having non-technical backgrounds. The essential difference between what is happening now and the more traditional off-line, non-interactive approaches is that instead of generating solutions to specific problems, model developers are now beginning to deliver, in a much more useful and user-friendly form, computer-based turnkey systems for exploring, analyzing and synthesizing plans or policies. Such tools permit the user to evaluate alternative solutions based on his or her own objectives and subjective judgments in an interactive learning and decision-making process.

  17. Application of Computer-Assisted Dispute Resolution Methods to Water Resources Management

    NASA Astrophysics Data System (ADS)

    Cardwell, H. E.

    2006-12-01

    Although technical information has been used for centuries to support water resources decision making, the increased technical capabilities due to ever-more powerful computational resources have allowed a larger number of groups to participate in the technical analysis for water decisions. And the public increasingly demands more involvement by affected stakeholders in these decisions about public resources. Yet water management challenges involve complex technical and scientific issues. Stakeholders, public representatives, decision-makers and experts frequently have different levels of knowledge and comfort with technical aspects of a problem. When left to experts alone, computer models may be seen as "black boxes" and may not be trusted by stakeholders and decision-makers. An alternative is to involve stakeholders and decision-makers in the development and design of decision support models from the beginning. This kind of approach can be referred to as collaborative modeling. Here we present a collaborative modeling approach which has been developed by the Corps over the last fifteen years called Shared Vision Planning. Shared Vision Planning is a combination of three common practices that have a long history in water resources management: 1) traditional planning; 2) technical systems modeling; and 3) structured collaboration with a broad range of stakeholders. Aside from the intensive and continuous collaboration, the thing that most sets SVP apart is the use of collaboratively developed, transparent decision-support models that then serve as primary tools for plan formulation and evaluation. SVP models are typically integrated tools in that they include hydrologic and hydraulic simulations, along with economic, environmental, and other performance measures. We present past and current cases where collaborative modeling approaches have been used by the Corps to involve stakeholders early and often in the technical analysis that supports the planning process. We will

  18. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing and storage power, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid deployment model of public-private cloud running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon web services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times, accessible from everywhere, scalable, and works in a distributed computing environment; it creates a real-time multiuser collaboration platform, its programming language code and components are interoperable, and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services are running on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  19. Subsonic flutter analysis addition to NASTRAN. [for use with CDC 6000 series digital computers

    NASA Technical Reports Server (NTRS)

    Doggett, R. V., Jr.; Harder, R. L.

    1973-01-01

    A subsonic flutter analysis capability has been developed for NASTRAN, and a developmental version of the program has been installed on the CDC 6000 series digital computers at the Langley Research Center. The flutter analysis is of the modal type, uses doublet lattice unsteady aerodynamic forces, and solves the flutter equations by using the k-method. Surface and one-dimensional spline functions are used to transform from the aerodynamic degrees of freedom to the structural degrees of freedom. Some preliminary applications of the method to a beamlike wing, a platelike wing, and a platelike wing with a folded tip are compared with existing experimental and analytical results.

  20. Restructuring the introductory physics lab with the addition of computer-based laboratories.

    PubMed

    Pierri-Galvao, Monica

    2011-07-01

    Nowadays, data acquisition software and sensors are widely used in introductory physics laboratories. This allows the student to spend more time exploring the data collected by the computer, hence focusing more on the physical concept. Very often, a faculty member is faced with the challenge of updating or introducing a microcomputer-based laboratory (MBL) at his or her institution. This article provides a list of experiments and equipment needed to convert about half of the traditional labs in a 1-year introductory physics course into MBLs.

  1. Addition of higher order plate and shell elements into NASTRAN computer program

    NASA Technical Reports Server (NTRS)

    Narayanaswami, R.; Goglia, G. L.

    1976-01-01

    Two higher order plate elements, the linear strain triangular membrane element and the quintic bending element, along with a shallow shell element, suitable for inclusion into the NASTRAN (NASA Structural Analysis) program are described. Additions to the NASTRAN Theoretical Manual, Users' Manual, Programmers' Manual and the NASTRAN Demonstration Problem Manual, for inclusion of these elements into the NASTRAN program are also presented.

  2. Time Utility Functions for Modeling and Evaluating Resource Allocations in a Heterogeneous Computing System

    SciTech Connect

    Briceno, Luis Diego; Khemka, Bhavesh; Siegel, Howard Jay; Maciejewski, Anthony A; Groer, Christopher S; Koenig, Gregory A; Okonski, Gene D; Poole, Stephen W

    2011-01-01

    This study considers a heterogeneous computing system and corresponding workload being investigated by the Extreme Scale Systems Center (ESSC) at Oak Ridge National Laboratory (ORNL). The ESSC is part of a collaborative effort between the Department of Energy (DOE) and the Department of Defense (DoD) to deliver research, tools, software, and technologies that can be integrated, deployed, and used in both DOE and DoD environments. The heterogeneous system and workload described here are representative of a prototypical computing environment being studied as part of this collaboration. Each task can exhibit a time-varying importance or utility to the overall enterprise. In this system, an arriving task has an associated priority and precedence. The priority is used to describe the importance of a task, and precedence is used to describe how soon the task must be executed. These two metrics are combined to create a utility function curve that indicates how valuable it is for the system to complete a task at any given moment. This research focuses on using time-utility functions to generate a metric that can be used to compare the performance of different resource schedulers in a heterogeneous computing system. The contributions of this paper are: (a) a mathematical model of a heterogeneous computing system where tasks arrive dynamically and need to be assigned based on their priority, precedence, utility characteristic class, and task execution type, (b) the use of priority and precedence to generate time-utility functions that describe the value a task has at any given time, (c) the derivation of a metric based on the total utility gained from completing tasks to measure the performance of the computing environment, and (d) a comparison of the performance of resource allocation heuristics in this environment.
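
    The time-utility idea can be sketched in a few lines of Python; the piecewise-linear decay shape and the example numbers are assumptions, standing in for the paper's priority/precedence-derived curves.

        def utility(t, priority, deadline, soft_window):
            # Time-utility function: full value before the deadline, linear
            # decay over a soft window, zero afterwards (shape is assumed).
            if t <= deadline:
                return priority
            if t <= deadline + soft_window:
                return priority * (1 - (t - deadline) / soft_window)
            return 0.0

        def total_utility(completions):
            # Metric: sum of utilities accrued at each task's completion time.
            return sum(utility(t, p, d, w) for t, p, d, w in completions)

        # (completion_time, priority, deadline, soft_window) per task,
        # as produced by two hypothetical schedulers:
        schedule_a = [(3, 10, 5, 5), (8, 4, 6, 4), (12, 8, 10, 5)]
        schedule_b = [(6, 10, 5, 5), (5, 4, 6, 4), (9, 8, 10, 5)]
        print(total_utility(schedule_a), total_utility(schedule_b))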

  3. Applications of Computer Conferencing to Teacher Education and Human Resource Development. Proceedings from an International Symposium on Computer Conferencing (Columbus, Ohio, June 13-15, 1991).

    ERIC Educational Resources Information Center

    Miller, Aaron J., Ed.

    This document contains the texts of seven invited presentations and six juried papers from a symposium on the uses of computer conferencing in teacher education and human resource development. The invited presentations include the following: "Computer Conferencing in the Context of Theory and Practice of Distance Education" (Michael G. Moore); "An…

  4. Large Advanced Space Systems (LASS) computer-aided design program additions

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.

    1982-01-01

    The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that permits integrating and interfacing of required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid-body controls module was modified to include solar pressure effects. The new model generator modules and appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, the antenna primary beam, and attitude control requirements.

  5. Efficient method for computing the maximum-likelihood quantum state from measurements with additive Gaussian noise.

    PubMed

    Smolin, John A; Gambetta, Jay M; Smith, Graeme

    2012-02-17

    We provide an efficient method for computing the maximum-likelihood mixed quantum state (with density matrix ρ) given a set of measurement outcomes in a complete orthonormal operator basis subject to Gaussian noise. Our method works by first changing basis, yielding a candidate density matrix μ which may have nonphysical (negative) eigenvalues, and then finding the nearest physical state under the 2-norm. Our algorithm takes at worst O(d⁴) for the basis change plus O(d³) for finding ρ, where d is the dimension of the quantum state. In the special case where the measurement basis is strings of Pauli operators, the basis change takes only O(d³) as well. The workhorse of the algorithm is a new linear-time method for finding the closest probability distribution (in Euclidean distance) to a set of real numbers summing to one.
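
    The two-step structure of the algorithm (diagonalise the candidate matrix, then move its eigenvalues to the nearest valid probability distribution) can be sketched in a few lines of NumPy. The sketch below uses the standard sort-based Euclidean projection onto the probability simplex rather than the paper's linear-time variant; the resulting state is the same.

    import numpy as np

    def project_to_simplex(y):
        # Nearest (in Euclidean distance) probability vector to y.
        d = len(y)
        u = np.sort(y)[::-1]
        css = np.cumsum(u)
        rho = np.nonzero(u + (1.0 - css) / np.arange(1, d + 1) > 0)[0][-1]
        theta = (1.0 - css[rho]) / (rho + 1)
        return np.maximum(y + theta, 0.0)

    def nearest_physical_state(mu):
        # mu: Hermitian, trace-one candidate that may have negative eigenvalues.
        evals, evecs = np.linalg.eigh(mu)
        lam = project_to_simplex(evals)          # fix the spectrum only
        return (evecs * lam) @ evecs.conj().T    # keep the eigenvectors

    mu = np.array([[1.2, 0.0], [0.0, -0.2]])     # toy nonphysical candidate
    print(np.linalg.eigvalsh(nearest_physical_state(mu)))  # -> [0., 1.]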

  6. Addition of visual noise boosts evoked potential-based brain-computer interface.

    PubMed

    Xie, Jun; Xu, Guanghua; Wang, Jing; Zhang, Sicong; Zhang, Feng; Li, Yeping; Han, Chengcheng; Li, Lili

    2014-05-14

    Although noise has a proven beneficial role in brain functions, the stochastic resonance effect has not previously been applied in neural engineering, especially in research on brain-computer interfaces (BCIs). In our study, a steady-state motion visual evoked potential (SSMVEP)-based BCI with periodic visual stimulation plus moderate spatiotemporal noise achieved better offline and online performance, owing to the enhancement of periodic components in brain responses accompanied by suppression of high harmonics. Offline results exhibited a bell-shaped, resonance-like dependence on noise, and online performance improvements of 7-36% were achieved when identical visual noise was adopted for different stimulation frequencies. Using neural encoding modeling, these phenomena can be explained as noise-induced input-output synchronization in human sensory systems, which commonly possess a low-pass property. Our work demonstrates that noise can boost BCIs in addressing human needs.

  7. Theory and computer simulation for the equation of state of additive hard-disk fluid mixtures

    NASA Astrophysics Data System (ADS)

    Barrio, C.; Solana, J. R.

    2001-01-01

    A procedure previously developed by the authors to obtain the equation of state for a mixture of additive hard spheres on the basis of a pure fluid equation of state is applied here to a binary mixture of additive hard disks in two dimensions. The equation of state depends on two parameters which are determined from the second and third virial coefficients for the mixture, which are known exactly. Results are compared with Monte Carlo calculations which are also reported. The agreement between theory and simulation is very good. For the fourth and fifth virial coefficients of the mixture, the equation of state gives results which are also in close agreement with exact numerical values reported in the literature.

  8. ELAS - A geobased information system that is transferable to several computers. [Earth resources Laboratory Applications Software

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.; Pearson, R. W.; Seyfarth, B. R.; Graham, M. H.

    1981-01-01

    In the early years of remote sensing, emphasis was placed on the processing and analysis of data from a single multispectral sensor, such as the Landsat Multispectral Scanner System (MSS). However, in connection with attempts to use the data for resource management, it was realized that many deficiencies existed in single data sets. A need was established to geographically reference the MSS data and to register with it data from disparate sources. Technological transfer activities have required systems concepts that can be easily transferred to computers of different types in other organizations. ELAS (Earth Resources Laboratory Applications Software), a geographically based information system, was developed to meet the considered needs. ELAS accepts data from a variety of sources. It contains programs to geographically reference the data to the Universal Transverse Mercator grid. One of the primary functions of ELAS is to produce a surface cover map.

  9. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    PubMed Central

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J; Schuster, Heiko; Ternette, Nicola; Alpízar, Adán; Schittenhelm, Ralf B; Ramarathinam, Sri H; Lindestam Arlehamn, Cecilia S; Chiek Koh, Ching; Gillet, Ludovic C; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David S; Deutsch, Eric W; Moritz, Robert L; Purcell, Anthony W; Rammensee, Hans-Georg; Stevanovic, Stefan; Aebersold, Ruedi

    2015-01-01

    We present a novel mass spectrometry-based high-throughput workflow and an open-source computational and data resource to reproducibly identify and quantify HLA-associated peptides. Collectively, the resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra, and the analysis of quantitative digital maps of HLA peptidomes generated from a range of biological sources by SWATH mass spectrometry (MS). This study represents the first community-based effort to develop a robust platform for the reproducible and quantitative measurement of the entire repertoire of peptides presented by HLA molecules, an essential step towards the design of efficient immunotherapies. DOI: http://dx.doi.org/10.7554/eLife.07661.001 PMID:26154972

  10. Coherent-state linear optical quantum computing gates using simplified diagonal superposition resource states

    SciTech Connect

    Lund, A.P.; Ralph, T.C.

    2005-03-01

    In this paper we explore the possibility of fundamental tests for coherent-state optical quantum computing gates [T. C. Ralph et al., Phys. Rev. A 68, 042319 (2003)] using sophisticated but not unrealistic quantum states. The major resource required in these gates is a state diagonal to the basis states. We use the recent observation that a squeezed single-photon state S(r)|1> approximates well an odd superposition of coherent states (|α> - |-α>) to address the diagonal resource problem. The approximation only holds for relatively small α, and hence these gates cannot be used in a scalable scheme. We explore the effects on fidelities and probabilities in teleportation and a rotated Hadamard gate.

  11. Spacecraft computer resource margin management. [of Project Galileo Orbiter in-flight reprogramming task

    NASA Technical Reports Server (NTRS)

    Larman, B. T.

    1981-01-01

    The Project Galileo Orbiter, with 18 microcomputers and the equivalent of 360K 8-bit bytes of memory contained within two major engineering subsystems and eight science instruments, requires that its key onboard computer system resources be managed in a very rigorous manner. Attention is given to the rationale behind the project policy, the development stage, the preliminary design stage, the design/implementation stage, and the optimization or 'scrubbing' stage. The implementation of the policy is discussed, taking into account the development of the Attitude and Articulation Control Subsystem (AACS) and the Command and Data Subsystem (CDS), the reporting of margin status, and the response to allocation oversubscription.

  12. Resource quality of a symmetry-protected topologically ordered phase for quantum computation.

    PubMed

    Miller, Jacob; Miyake, Akimasa

    2015-03-27

    We investigate entanglement naturally present in the 1D topologically ordered phase protected with the on-site symmetry group of an octahedron as a potential resource for teleportation-based quantum computation. We show that, as long as certain characteristic lengths are finite, all its ground states have the capability to implement any unit-fidelity one-qubit gate operation asymptotically as a key computational building block. This feature is intrinsic to the entire phase, in that perfect gate fidelity coincides with perfect string order parameters under a state-insensitive renormalization procedure. Our approach may pave the way toward a novel program to classify quantum many-body systems based on their operational use for quantum information processing. PMID:25860730

  13. Exploiting short-term memory in soft body dynamics as a computational resource.

    PubMed

    Nakajima, K; Li, T; Hauser, H; Pfeifer, R

    2014-11-01

    Soft materials are not only highly deformable, but they also possess rich and diverse body dynamics. Soft body dynamics exhibit a variety of properties, including nonlinearity, elasticity and potentially infinitely many degrees of freedom. Here, we demonstrate that such soft body dynamics can be employed to conduct certain types of computation. Using body dynamics generated from a soft silicone arm, we show that they can be exploited to emulate functions that require memory and to embed robust closed-loop control into the arm. Our results suggest that soft body dynamics have a short-term memory and can serve as a computational resource. This finding paves the way towards exploiting passive body dynamics for control of a large class of underactuated systems.
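
    The "body as computational resource" idea follows the reservoir computing recipe: feed an input into a fixed nonlinear dynamical system and train only a linear readout on its states. The sketch below substitutes a random echo-state network for the physical silicone arm (the paper reads states from real bending sensors) and tests short-term memory by recalling the input from a few steps earlier.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T, delay = 100, 2000, 5
    W = rng.normal(0, 1, (N, N))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # contractive dynamics
    w_in = rng.normal(0, 1, N)

    u = rng.uniform(-1, 1, T)                   # input stream
    x, states = np.zeros(N), np.zeros((T, N))
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])        # "body" state evolves
        states[t] = x

    # Linear readout trained to recall u[t - delay] from the current state:
    # success means the dynamics themselves store short-term memory.
    X, y = states[delay:], u[:-delay]
    w_out = np.linalg.lstsq(X, y, rcond=None)[0]
    print("recall correlation:", np.corrcoef(X @ w_out, y)[0, 1])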

  14. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate change, the combination of Earth observation data and global climate model projections is crucial not only to scientists but also to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyberinfrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (ARs). The current generation of the ESGF federated grid allows consumers of the data to find and download data, with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high-performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capability. This presentation will highlight the WPS API, describe its capabilities, provide implementation details, and discuss future developments.
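
    For orientation, a server-side operation is invoked through a WPS endpoint roughly as in the sketch below, which uses the generic OGC WPS 1.0.0 key-value-pair Execute request. The host name, process identifier, and input syntax are placeholders, not the actual ESGF CWT API; consult the ESGF documentation for the real parameter names.

    from urllib.parse import urlencode
    from urllib.request import urlopen

    endpoint = "https://esgf-node.example.org/wps"     # hypothetical data node
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": "average",                       # e.g. a temporal mean
        "datainputs": "variable=tas;domain=global",    # illustrative inputs
    }
    with urlopen(endpoint + "?" + urlencode(params)) as resp:
        print(resp.read()[:200])                       # XML status/result document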

  15. Enantioselective conjugate addition of nitro compounds to α,β-unsaturated ketones: an experimental and computational study.

    PubMed

    Manzano, Rubén; Andrés, José M; Álvarez, Rosana; Muruzábal, María D; de Lera, Ángel R; Pedrosa, Rafael

    2011-05-16

    A series of chiral thioureas derived from easily available diamines, prepared from α-amino acids, have been tested as catalysts in the enantioselective Michael additions of nitroalkanes to α,β-unsaturated ketones. The best results are obtained with the bifunctional catalyst prepared from L-valine. This thiourea promotes the reaction with high enantioselectivities and chemical yields for aryl/vinyl ketones, but the enantiomeric ratio for alkyl/vinyl derivatives is very modest. The addition of substituted nitromethanes led to the corresponding adducts with excellent enantioselectivity but very poor diastereoselectivity. Evidence for the isomerization of the addition products has been obtained from the reaction of chalcone with [D3]nitromethane, which shows that the final addition products epimerize under the reaction conditions. The epimerization explains the low diastereoselectivity observed in the formation of adducts with two adjacent tertiary stereocenters. Density functional studies of the transition structures corresponding to two alternative activation modes of the nitroalkanes and α,β-unsaturated ketones by the bifunctional organocatalyst have been carried out at the B3LYP/3-21G* level. The computations are consistent with a reaction model involving the Michael addition of the thiourea-activated nitronate to the ketone activated by the protonated amine of the organocatalyst. The enantioselectivities predicted by the computations are consistent with the experimental values obtained for aryl- and alkyl-substituted α,β-unsaturated ketones.

  16. Resource efficient hardware architecture for fast computation of running max/min filters.

    PubMed

    Torres-Huitzil, Cesar

    2013-01-01

    Running max/min filters on rectangular kernels are widely used in many digital signal and image processing applications. Filtering with a k × k kernel requires k² - 1 comparisons per sample for a direct implementation; thus, performance scales expensively with the kernel size k. Faster computation can be achieved by kernel decomposition and by using constant-time one-dimensional algorithms on custom hardware. This paper presents a hardware architecture for real-time computation of running max/min filters based on the van Herk/Gil-Werman (HGW) algorithm. The proposed architecture uses less computation and memory resources than previously reported architectures when targeted to Field Programmable Gate Array (FPGA) devices. Implementation results show that the architecture is able to compute max/min filters on 1024 × 1024 images with up to 255 × 255 kernels in around 8.4 milliseconds (120 frames per second) at a clock frequency of 250 MHz. The implementation is highly scalable in the kernel size, with a good performance/area tradeoff suitable for embedded applications. The applicability of the architecture is shown for local adaptive image thresholding. PMID:24288456
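
    The core of the HGW algorithm is a 1D running max in roughly three comparisons per sample, independent of the kernel size: split the signal into blocks of length k, take prefix and suffix running maxima within each block, and merge. A minimal NumPy sketch of that idea (software, not the paper's FPGA design):

    import numpy as np

    def hgw_max_filter(a, k):
        # out[i] = max(a[i : i + k]), via per-block prefix/suffix maxima.
        n = len(a)
        a = np.concatenate([a, np.full((-n) % k, -np.inf)])  # pad to block multiple
        blocks = a.reshape(-1, k)
        prefix = np.maximum.accumulate(blocks, axis=1).ravel()
        suffix = np.maximum.accumulate(blocks[:, ::-1], axis=1)[:, ::-1].ravel()
        return np.maximum(suffix[:n - k + 1], prefix[k - 1:n])

    print(hgw_max_filter(np.array([3., 1, 4, 1, 5, 9, 2, 6]), 3))
    # [4. 4. 5. 9. 9. 9.]; a min filter is the same algorithm on the negated signal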

  17. Lawrence Livermore National Laboratories Perspective on Code Development and High Performance Computing Resources in Support of the National HED/ICF Effort

    SciTech Connect

    Clouse, C. J.; Edwards, M. J.; McCoy, M. G.; Marinak, M. M.; Verdon, C. P.

    2015-07-07

    Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world-leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition, the ASC effort provides the high-performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help ensure that numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.

  18. Improving hospital bed occupancy and resource utilization through queuing modeling and evolutionary computation.

    PubMed

    Belciug, Smaranda; Gorunescu, Florin

    2015-02-01

    Scarce healthcare resources require carefully made policies ensuring optimal bed allocation, quality healthcare service, and adequate financial support. This paper proposes a complex analysis of resource allocation in a hospital department by integrating in the same framework a queuing system, a compartmental model, and an evolutionary-based optimization. The queuing system shapes the flow of patients through the hospital, the compartmental model offers a feasible structure of the hospital department in accordance with the queuing characteristics, and the evolutionary paradigm provides the means to optimize the bed-occupancy management and the resource utilization using a genetic algorithm approach. The paper also focuses on a "what-if analysis", providing a flexible tool to explore the effects on the outcomes of the queuing system and resource utilization through systematic changes in the input parameters. The methodology was illustrated using a simulation based on real data collected from a geriatric department of a hospital in London, UK. In addition, the paper explores the possibility of adapting the methodology to different medical departments (surgery, stroke, and mental illness). Moreover, the paper also focuses on the practical use of the model from the healthcare point of view, by presenting a simulated application.
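
    The queuing component can be pictured with a textbook M/M/c model: given arrival and discharge rates, the Erlang C formula yields the probability that an arriving patient finds all beds busy, which is the kind of objective an evolutionary search over bed allocations would optimize. A minimal stand-in sketch (the paper's model is fitted to real length-of-stay data, not these illustrative rates):

    from math import factorial

    def erlang_c(lam, mu, c):
        # P(arriving patient must wait) in an M/M/c queue.
        a = lam / mu                  # offered load (expected busy beds)
        if a >= c:
            return 1.0                # unstable: queue grows without bound
        top = (a ** c / factorial(c)) * (c / (c - a))
        return top / (sum(a ** n / factorial(n) for n in range(c)) + top)

    lam, mu = 4.0, 0.5                # admissions/day, discharges per bed/day
    for beds in (9, 10, 12, 14):
        print(beds, "beds:", f"occupancy {lam / mu / beds:.0%},",
              f"P(wait) {erlang_c(lam, mu, beds):.2f}")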

  19. SLA-Driven Adaptive Resource Management for Web Applications on a Heterogeneous Compute Cloud

    NASA Astrophysics Data System (ADS)

    Iqbal, Waheed; Dailey, Matthew; Carrera, David

    Current service-level agreements (SLAs) offered by cloud providers make guarantees about quality attributes such as availability. However, although one of the most important quality attributes from the perspective of the users of a cloud-based Web application is its response time, current SLAs do not guarantee response time. Satisfying a maximum average response time guarantee for Web applications is difficult due to unpredictable traffic patterns, but in this paper we show how it can be accomplished through dynamic resource allocation in a virtual Web farm. We present the design and implementation of a working prototype built on a EUCALYPTUS-based heterogeneous compute cloud that actively monitors the response time of each virtual machine assigned to the farm and adaptively scales up the application to satisfy an SLA promising a specific average response time. We demonstrate the feasibility of the approach in an experimental evaluation with a testbed cloud and a synthetic workload. Adaptive resource management has the potential to increase the usability of Web applications while maximizing resource utilization.
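
    The control loop behind such a prototype can be summarized in a few lines: watch the farm's average response time and provision another virtual machine whenever the SLA target is breached. In the sketch below, measure_response_time and provision_vm are hypothetical hooks standing in for the prototype's monitoring and EUCALYPTUS actuation; the thresholds are illustrative.

    import time

    SLA_TARGET = 0.5      # promised average response time, seconds
    COOLDOWN = 60         # let a new VM boot and warm up before re-checking

    def autoscale(measure_response_time, provision_vm, max_vms=16):
        vms = 1
        while True:
            if measure_response_time() > SLA_TARGET and vms < max_vms:
                provision_vm()            # scale the virtual farm up
                vms += 1
                time.sleep(COOLDOWN)
            else:
                time.sleep(5)             # normal monitoring interval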

  20. Review: Computer-based models for managing the water-resource problems of irrigated agriculture

    NASA Astrophysics Data System (ADS)

    Singh, Ajay

    2015-09-01

    Irrigation is essential for achieving food security for a burgeoning global population, but unplanned and injudicious expansion of irrigated areas causes waterlogging and salinization problems. Against this backdrop, groundwater resources management is a critical issue for fulfilling the increasing water demand for agricultural, industrial, and domestic uses. Various simulation and optimization approaches have been used to solve groundwater management problems. This paper presents a review of the individual and combined applications of simulation and optimization modeling for the management of groundwater-resource problems associated with irrigated agriculture. The study revealed that the combined use of simulation-optimization modeling is very suitable for achieving an optimal solution for groundwater-resource problems, even with a large number of variables. Independent model tools have been used to solve the problems of uncertainty analysis and parameter estimation in groundwater modeling studies, and artificial neural networks have been used to reduce computational complexity. The incorporation of socioeconomic aspects into groundwater management modeling would be an important development in future studies.

  1. Computations on the primary photoreaction of Br2 with CO2: stepwise vs concerted addition of Br atoms.

    PubMed

    Xu, Kewei; Korter, Timothy M; Braiman, Mark S

    2015-04-01

    It was proposed previously that Br2-sensitized photolysis of liquid CO2 proceeds through a metastable primary photoproduct, CO2Br2. Possible mechanisms for such a photoreaction are explored here computationally. First, it is shown that the CO2Br radical is not stable in any geometry. This rules out a free-radical mechanism, for example, photochemical splitting of Br2 followed by stepwise addition of Br atoms to CO2, which in turn accounts for the lack of previously observed Br2 + CO2 photochemistry in the gas phase. A possible alternative mechanism in the liquid phase is formation of a weakly bound CO2:Br2 complex, followed by concerted photoaddition of Br2. This hypothesis is suggested by the previously published spectroscopic detection of a binary CO2:Br2 complex in the supersonically cooled gas phase. We compute a global binding-energy minimum of -6.2 kJ mol⁻¹ for such complexes, in a linear geometry. Two additional local minima were computed for perpendicular (C2v) and nearly parallel asymmetric planar geometries, both with binding energies near -5.4 kJ mol⁻¹. In these two latter geometries, C-Br and O-Br bond distances are simultaneously in the range of 3.5-3.8 Å, that is, perhaps suitable for a concerted photoaddition under the temperature and pressure conditions where Br2 + CO2 photochemistry has been observed.

  2. Computational methods and resources for the interpretation of genomic variants in cancer

    PubMed Central

    2015-01-01

    The recent improvement of high-throughput sequencing technologies is having a strong impact on the detection of genetic variations associated with cancer. Several institutions worldwide have been sequencing the whole exomes and/or genomes of cancer patients by the thousands, thereby providing an invaluable collection of new somatic mutations in different cancer types. These initiatives have promoted the development of methods and tools for the analysis of cancer genomes that are aimed at studying the relationship between genotype and phenotype in cancer. In this article we review the online resources and computational tools for the analysis of the cancer genome. First, we describe the available repositories of cancer genome data. Next, we provide an overview of the methods for the detection of genetic variation and the computational tools for the prioritization of cancer-related genes and causative somatic variations. Finally, we discuss future perspectives in cancer genomics, focusing on the impact of computational methods and quantitative approaches in defining personalized strategies to improve the diagnosis and treatment of cancer. PMID:26111056

  3. Computational methods and resources for the interpretation of genomic variants in cancer.

    PubMed

    Tian, Rui; Basu, Malay K; Capriotti, Emidio

    2015-01-01

    The recent improvement of high-throughput sequencing technologies is having a strong impact on the detection of genetic variations associated with cancer. Several institutions worldwide have been sequencing the whole exomes and/or genomes of cancer patients by the thousands, thereby providing an invaluable collection of new somatic mutations in different cancer types. These initiatives have promoted the development of methods and tools for the analysis of cancer genomes that are aimed at studying the relationship between genotype and phenotype in cancer. In this article we review the online resources and computational tools for the analysis of the cancer genome. First, we describe the available repositories of cancer genome data. Next, we provide an overview of the methods for the detection of genetic variation and the computational tools for the prioritization of cancer-related genes and causative somatic variations. Finally, we discuss future perspectives in cancer genomics, focusing on the impact of computational methods and quantitative approaches in defining personalized strategies to improve the diagnosis and treatment of cancer.

  4. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 3: Software and Hardware. Rehab/Education Technology ResourceBook Series.

    ERIC Educational Resources Information Center

    Brandenburg, Sara A., Ed.; Vanderheiden, Gregg C., Ed.

    One of a series of three resource guides concerned with communication, control, and computer access for the disabled or the elderly, the book focuses on hardware and software. The guide's 13 chapters each cover products with the same primary function. Cross reference indexes allow access to listings of products by function, input/output…

  5. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 2: Switches and Environmental Controls. Rehab/Education Technology ResourceBook Series.

    ERIC Educational Resources Information Center

    Brandenburg, Sara A., Ed.; Vanderheiden, Gregg C., Ed.

    One of a series of three resource guides concerned with communication, control, and computer access for disabled and elderly individuals, the directory focuses on switches and environmental controls. The book's three chapters each cover products with the same primary function. Cross reference indexes allow access to listings of products by…

  6. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 1: Communication Aids. Rehab/Education Technology ResourceBook Series.

    ERIC Educational Resources Information Center

    Brandenburg, Sara A., Ed.; Vanderheiden, Gregg C., Ed.

    One of a series of three resource guides concerned with communication, control, and computer access for disabled and elderly individuals, the directory focuses on communication aids. The book's six chapters each cover products with the same primary function. Cross reference indexes allow access to listings of products by function, input/output…

  7. Origins of stereoselectivity in the Diels-Alder addition of chiral hydroxyalkyl vinyl ketones to cyclopentadiene: a quantitative computational study.

    PubMed

    Bakalova, Snezhana M; Kaneti, Jose

    2008-12-18

    Modest basis set level MP2/6-31G(d,p) calculations on the Diels-Alder addition of S-1-alkyl-1-hydroxy-but-3-en-2-ones (1-hydroxy-1-alkyl methyl vinyl ketones) to cyclopentadiene correctly reproduce the trends in known experimental endo/exo and diastereoface selectivity. B3LYP results at the same or significantly higher basis set levels, on the other hand, do not satisfactorily model the observed endo/exo selectivities and are thus unsuitable for quantitative studies. The same holds for subtle effects originating from, for example, conformational distributions of the reactants. These shortcomings are not alleviated by the fact that the observed diastereoface selectivities are well reproduced by DFT calculations. Quantitative computational studies of large cycloaddition systems would require higher basis sets and a better account of electron correlation than MP2, such as CCSD. Presently, however, with 30 or more non-hydrogen atoms, such computations are hardly feasible. We present quantitatively correct stereochemical predictions using a hybrid layered ONIOM computational approach, including the chiral carbon atom and the intramolecular hydrogen bond in a higher-level MP2/6-311G(d,p) or CCSD/6-311G(d,p) layer. Significant computational economy is achieved by treating the surrounding bulky (alkyl) residues at the 6-31G(d) level in a low-level HF layer. We conclude that theoretical calculations based on explicit correlated MO treatment of the reaction site are sufficiently reliable for the prediction of both endo/exo and diastereoface selectivity of Diels-Alder addition reactions. This is in line with the understanding that endo/exo selectivity originates from dynamic electron correlation effects of interacting pi fragments and diastereofacial selectivity originates from steric interactions of fragments outside of the Diels-Alder reaction site. PMID:18637663

  8. Node Resource Manager: A Distributed Computing Software Framework Used for Solving Geophysical Problems

    NASA Astrophysics Data System (ADS)

    Lawry, B. J.; Encarnacao, A.; Hipp, J. R.; Chang, M.; Young, C. J.

    2011-12-01

    With the rapid growth of multi-core computing hardware, it is now possible for scientific researchers to run complex, computationally intensive software on affordable, in-house commodity hardware. Multi-core CPUs (Central Processing Units) and GPUs (Graphics Processing Units) are now commonplace in desktops and servers. Developers today have access to extremely powerful hardware that enables the execution of software that could previously only be run on expensive, massively parallel systems. It is no longer cost-prohibitive for an institution to build a parallel computing cluster consisting of commodity multi-core servers. In recent years, our research team has developed a distributed, multi-core computing system and used it to construct global 3D earth models using seismic tomography. Traditionally, computational limitations forced certain assumptions and shortcuts in the calculation of tomographic models; however, with the recent rapid growth in computational hardware, including faster CPUs, increased RAM, and the development of multi-core computers, we are now able to perform seismic tomography, 3D ray tracing, and seismic event location using distributed parallel algorithms running on commodity hardware, thereby eliminating the need for many of these shortcuts. We describe Node Resource Manager (NRM), a system we developed that leverages the capabilities of a parallel computing cluster. NRM is a software-based parallel computing management framework that works in tandem with the Java Parallel Processing Framework (JPPF, http://www.jppf.org/), a third-party library that provides a flexible and innovative way to take advantage of modern multi-core hardware. NRM enables multiple applications to use and share a common set of networked computers, regardless of their hardware platform or operating system. Using NRM, algorithms can be parallelized to run on multiple processing cores of a distributed computing cluster of servers and desktops, which results in a dramatic

  9. ADHydro: A Large-scale High Resolution Multi-Physics Distributed Water Resources Model for Water Resource Simulations in a Parallel Computing Environment

    NASA Astrophysics Data System (ADS)

    lai, W.; Steinke, R. C.; Ogden, F. L.

    2013-12-01

    Physics-based watershed models are useful tools for hydrologic studies, water resources management, and economic analyses in the contexts of climate, land-use, and water-use changes. This poster presents the development of a physics-based, high-resolution, distributed water resources model suitable for simulating large watersheds in a massively parallel computing environment. Developing this model is one of the objectives of the NSF EPSCoR RII Track II CI-WATER project, which is joint between Wyoming and Utah. The model, which we call ADHydro, is aimed at simulating processes important in the Rocky Mountain West, including rainfall and infiltration, snowfall and snowmelt in complex terrain, vegetation and evapotranspiration, soil heat flux and freezing, overland flow, channel flow, groundwater flow, and water management. ADHydro uses the explicit finite volume method to solve the PDEs for 2D overland flow and 2D saturated groundwater flow coupled to 1D channel flow. The model has a quasi-3D formulation that couples 2D overland flow and 2D saturated groundwater flow using the 1D Talbot-Ogden finite water-content infiltration and redistribution model. This eliminates the difficulties of solving the highly nonlinear 3D Richards equation, while the finite volume Talbot-Ogden infiltration solution is computationally efficient, guaranteed to conserve mass, and allows simulation of the effect of near-surface groundwater tables on runoff generation. The process-level components of the model are being individually tested and validated. The model as a whole will be tested on the Green River basin in Wyoming and ultimately applied to the entire Upper Colorado River basin. ADHydro development has necessitated the development of tools for large-scale watershed modeling, including open-source workflow steps to extract hydromorphological information from GIS data, integrate hydrometeorological and water management forcing input, and post-process and visualize large output data

  10. Development of a Computer-Based Resource for Inclusion Science Classrooms

    NASA Astrophysics Data System (ADS)

    Olsen, J. K.; Slater, T.

    2005-12-01

    Current instructional issues necessitate that educators start with the curriculum and determine how educational technology can assist students in achieving positive learning goals, functionally supplementing classroom instruction. Technology projects incorporating principles of situated learning have been shown to provide an effective framework for learning, and computer technology has been shown to facilitate learning among special needs students. Students with learning disabilities may benefit from assistive technology, but these resources are not always utilized during classroom instruction: technology is only effective if teachers view it as an integral part of the learning process. The materials currently under development are in the domain of earth and space science, part of the Arizona 5-8 Science Content Standards. The concern of this study is to determine a means of assisting inclusive education that is both feasible and effective in ensuring successful science learning outcomes for all students, whether regular education or special needs.

  11. EGI-EUDAT integration activity - Pair data and high-throughput computing resources together

    NASA Astrophysics Data System (ADS)

    Scardaci, Diego; Viljoen, Matthew; Vitlacil, Dejan; Fiameni, Giuseppe; Chen, Yin; sipos, Gergely; Ferrari, Tiziana

    2016-04-01

    EGI (www.egi.eu) is a publicly funded e-infrastructure that gives scientists access to more than 530,000 logical CPUs, 200 PB of disk capacity, and 300 PB of tape storage to drive research and innovation in Europe. The infrastructure provides both high-throughput computing and cloud compute/storage capabilities. Resources are provided by about 350 resource centres distributed across 56 countries in Europe, the Asia-Pacific region, Canada, and Latin America. EUDAT (www.eudat.eu) is a collaborative pan-European infrastructure providing research data services, training, and consultancy for researchers, research communities, research infrastructures, and data centres. EUDAT's vision is to enable European researchers and practitioners from any research discipline to preserve, find, access, and process data in a trusted environment, as part of a Collaborative Data Infrastructure (CDI) conceived as a network of collaborating, cooperating centres, combining the richness of numerous community-specific data repositories with the permanence and persistence of some of Europe's largest scientific data centres. In the context of their flagship projects, EGI-Engage and EUDAT2020, EGI and EUDAT started a collaboration in March 2015 to harmonise the two infrastructures, covering technical interoperability, authentication, authorisation and identity management, policy, and operations. The main objective of this work is to provide end users with seamless access to an integrated infrastructure offering both EGI and EUDAT services, thereby pairing data and high-throughput computing resources together. To define the roadmap for this collaboration, EGI and EUDAT selected a set of relevant user communities, already collaborating with both infrastructures, which could bring requirements and help to assign the right priorities to each of them. In this way, from the beginning, this activity has been genuinely driven by the end users. The identified user communities are

  12. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction

    PubMed Central

    Nezarat, Amin; Dastghaibifard, GH

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game theory mechanism and holding a repeated game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bids for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, produces the lowest service-level-agreement violations, and provides the most utility to the provider. PMID:26431035

  13. Efficient Nash Equilibrium Resource Allocation Based on Game Theory Mechanism in Cloud Computing by Using Auction.

    PubMed

    Nezarat, Amin; Dastghaibifard, G H

    2015-01-01

    One of the most complex issues in the cloud computing environment is the problem of resource allocation: on one hand, the cloud provider expects the most profitability and, on the other hand, users expect to have the best resources at their disposal given their budget and time constraints. In most previous work, heuristic and evolutionary approaches have been used to solve this problem. Nevertheless, since the nature of this environment is economic, using economic methods can decrease response time and reduce the complexity of the problem. In this paper, an auction-based method is proposed which determines the auction winner by applying a game theory mechanism and holding a repeated game with incomplete information in a non-cooperative environment. In this method, users calculate a suitable price bid with their objective function over several rounds and repetitions and send it to the auctioneer, and the auctioneer chooses the winning player based on the suggested utility function. In the proposed method, the end point of the game is the Nash equilibrium point, where players are no longer inclined to alter their bids for the resource and the final bid also satisfies the auctioneer's utility function. To prove the convexity of the response space, the Lagrange method is used; the proposed model is simulated in CloudSim and the results are compared with previous work. It is concluded that this method converges to a response in a shorter time, produces the lowest service-level-agreement violations, and provides the most utility to the provider.
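
    The repeated-bidding dynamics can be caricatured in a few lines: each player keeps outbidding the current winner while the price stays below both its valuation and its budget, and the process stops at a fixed point where no player wants to deviate, i.e. a Nash equilibrium. All valuations, budgets, and the bid increment below are illustrative assumptions, not the paper's utility functions.

    def repeated_auction(valuations, budgets, step=0.05, rounds=1000):
        bids = [0.0] * len(valuations)
        for _ in range(rounds):
            winner = max(range(len(bids)), key=lambda i: bids[i])
            changed = False
            for i, (v, b) in enumerate(zip(valuations, budgets)):
                nxt = bids[winner] + step              # outbid the leader
                if i != winner and nxt <= min(v, b):   # worth it and affordable
                    bids[i], changed = nxt, True
            if not changed:                            # Nash equilibrium reached
                return winner, bids
        return winner, bids

    print(repeated_auction(valuations=[1.0, 0.8, 0.6], budgets=[0.9, 1.0, 1.0]))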

  14. Modeling working memory: a computational implementation of the Time-Based Resource-Sharing theory.

    PubMed

    Oberauer, Klaus; Lewandowsky, Stephan

    2011-02-01

    Working memory is a core concept in cognition, predicting about 50% of the variance in IQ and reasoning tasks. A popular test of working memory is the complex span task, in which encoding of memoranda alternates with processing of distractors. A recent model of complex span performance, the Time-Based Resource-Sharing (TBRS) model of Barrouillet and colleagues, has seemingly accounted for several crucial findings, in particular the intricate trade-off between deterioration and restoration of memory in the complex span task. According to the TBRS, memory traces decay during processing of the distractors, and they are restored by attentional refreshing during brief pauses in between processing steps. However, to date, the theory has been formulated only at a verbal level, which renders it difficult to test and to be certain of its intuited predictions. We present a computational instantiation of the TBRS and show that it can handle most of the findings on which the verbal model was based. We also show that there are potential challenges to the model that await future resolution. This instantiated model, TBRS*, is the first comprehensive computational model of performance in the complex span paradigm. The Matlab model code is available as supplementary material to this article. PMID:21327362
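
    The decay/refresh trade-off at the heart of TBRS can be sketched in a few lines (the published TBRS* implementation is in Matlab; this simplified Python rendering refreshes all traces at once rather than serially, and its parameters are illustrative):

    import numpy as np

    def complex_span_trace(n_items=4, process_t=0.8, pause_t=0.4,
                           decay=0.5, refresh=0.5):
        strengths = []
        for _ in range(n_items):
            strengths.append(1.0)                     # encode a new memorandum
            # distractor processing occupies attention: all traces decay
            strengths = [s * np.exp(-decay * process_t) for s in strengths]
            # free pause: attentional refreshing partially restores traces
            strengths = [min(1.0, s * np.exp(refresh * pause_t)) for s in strengths]
        return strengths

    print(np.round(complex_span_trace(), 3))   # earlier items end up weaker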

  15. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office, which manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as those performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  16. What Can Computer Technology Offer Special Education? Research & Resources: Special Education Information for Policymakers.

    ERIC Educational Resources Information Center

    National Conference of State Legislatures, Washington, DC.

    Intended for policymakers, this brief addresses issues related to computer technology and its contributions to special education. Trends are noted and three types of applications are considered: computer assisted instruction, computer managed instruction, and computer support activities. Descriptions of several computer applications in local and…

  17. How to Make the Best Use of Limited Computer Resources in French Primary Schools.

    ERIC Educational Resources Information Center

    Parmentier, Christophe

    1988-01-01

    Discusses computer science developments in French primary schools and describes strategies for using computers in the classroom most efficiently. Highlights include the use of computer networks; software; artificial intelligence and expert systems; computer-assisted learning (CAL) and intelligent CAL; computer peripherals; simulation; and teaching…

  18. Linking process, structure, property, and performance for metal-based additive manufacturing: computational approaches with experimental support

    NASA Astrophysics Data System (ADS)

    Smith, Jacob; Xiong, Wei; Yan, Wentao; Lin, Stephen; Cheng, Puikei; Kafka, Orion L.; Wagner, Gregory J.; Cao, Jian; Liu, Wing Kam

    2016-04-01

    Additive manufacturing (AM) methods for rapid prototyping of 3D materials (3D printing) have become increasingly popular with a particular recent emphasis on those methods used for metallic materials. These processes typically involve an accumulation of cyclic phase changes. The widespread interest in these methods is largely stimulated by their unique ability to create components of considerable complexity. However, modeling such processes is exceedingly difficult due to the highly localized and drastic material evolution that often occurs over the course of the manufacture time of each component. Final product characterization and validation are currently driven primarily by experimental means as a result of the lack of robust modeling procedures. In the present work, the authors discuss primary detrimental hurdles that have plagued effective modeling of AM methods for metallic materials while also providing logical speculation into preferable research directions for overcoming these hurdles. The primary focus of this work encompasses the specific areas of high-performance computing, multiscale modeling, materials characterization, process modeling, experimentation, and validation for final product performance of additively manufactured metallic components.

  19. Utilizing Research and Development Products in Strategic Planning and Human Resource Development in the Computer Literate, High Technology, Information Society.

    ERIC Educational Resources Information Center

    Groff, Warren H.

    Stressing the importance of the relationship of vocational and technical education to the economy, this paper discusses how existing educational research and development (R&D) resources can assist in preparing for the computer literate, high technology, information society. After emphasizing the magnitude of the education and training industry,…

  20. A Practitioner Model of the Use of Computer-Based Tools and Resources to Support Mathematics Teaching and Learning.

    ERIC Educational Resources Information Center

    Ruthven, Kenneth; Hennessy, Sara

    2002-01-01

    Analyzes the pedagogical ideas underpinning teachers' accounts of the successful use of computer-based tools and resources to support the teaching and learning of mathematics. Organizes central themes to form a pedagogical model capable of informing the use of such technologies in classroom teaching and generating theoretical conjectures for…

  1. Computational resources and strategies to construct single-molecule metabolic models of microbial cells.

    PubMed

    Gameiro, Denise; Pérez-Pérez, Martín; Pérez-Rodríguez, Gael; Monteiro, Gonçalo; Azevedo, Nuno F; Lourenço, Anália

    2016-09-01

    Recent computational methodologies, such as individual-based modelling, pave the way for the search for explanatory insight into the collective behaviour of molecules. Many reviews offer an up-to-date perspective on such methodologies, but little is discussed about the practical information requirements involved. The biological information used as input should be easily and routinely determined in the laboratory, publicly available and, preferably, organized in programmatically accessible databases. This review is the first to provide a systematic and comprehensive overview of available resources for the modelling of metabolic events at the molecular scale. The glycolysis pathway of Escherichia coli, one of the most studied pathways in microbiology, serves as a case study. This curation addressed structural information about E. coli (i.e. defining the simulation environment), the reactions forming the glycolysis pathway including the enzymes and the metabolites (i.e. the molecules to be represented), the kinetics of each reaction (i.e. the behavioural logic of the molecules) and diffusion parameters for all enzymes and metabolites (i.e. molecule movement in the environment). Furthermore, the interpretation of relevant biological features, such as molecular diffusion and enzyme kinetics, and the connection between experimental determination and simulation validation are detailed. Notably, the information from classical theories, such as enzymatic rates and diffusion coefficients, is translated into simulation parameters, such as collision efficiency and particle velocity. PMID:26515531
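
    As an example of translating tabulated parameters into simulation quantities, the sketch below derives a diffusion coefficient from the Stokes-Einstein relation and converts it into the per-step displacement of a Brownian walker. The viscosity, radius, and timestep are illustrative stand-ins, not the curated E. coli values discussed in the review.

    import numpy as np

    kB, T = 1.380649e-23, 310.0        # Boltzmann constant (J/K), temperature (K)
    eta = 1e-2                         # viscosity, Pa*s (cytoplasm-like, ~10x water)
    r = 3e-9                           # hydrodynamic radius of an enzyme, m
    D = kB * T / (6 * np.pi * eta * r) # Stokes-Einstein diffusion coefficient, m^2/s

    dt = 1e-6                          # simulation timestep, s
    sigma = np.sqrt(2 * D * dt)        # rms displacement per axis per step

    rng = np.random.default_rng(1)
    pos = np.zeros(3)
    for _ in range(1000):              # one molecule taking 1000 Brownian steps
        pos += rng.normal(0.0, sigma, 3)
    print(f"D = {D:.2e} m^2/s, net displacement = {np.linalg.norm(pos) * 1e9:.1f} nm")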

  2. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    DOE PAGES

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; Siegel, Howard Jay; Maciejewski, Anthony A.; Koenig, Gregory A.; Groer, Christopher S.; Hilton, Marcia M.; Poole, Stephen W.; Okonski, G.; et al

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.

  3. Disposal of waste computer hard disk drive: data destruction and resources recycling.

    PubMed

    Yan, Guoqing; Xue, Mianqiang; Xu, Zhenming

    2013-06-01

    An increasing quantity of discarded computers is accompanied by a sharp increase in the number of hard disk drives to be eliminated. A waste hard disk drive is a special form of waste electrical and electronic equipment because it holds large amounts of information closely connected with its user. Therefore, the treatment of waste hard disk drives is an urgent issue in terms of data security, environmental protection, and sustainable development. In the present study, the degaussing method was adopted to destroy the residual data on waste hard disk drives, and the disk housing was used as an example to explore the coating removal process, which is the most important pretreatment for aluminium alloy recycling. The key operating points determined for degaussing were: (1) keep the platter parallel with the magnetic field direction; and (2) increasing the magnetic field intensity B and the action time t significantly improves the degaussing effect. The coating removal experiments indicated that heating the waste hard disk drive housings at a temperature of 400 °C for 24 min was the optimum condition. A novel integrated technique for the treatment of waste hard disk drives is proposed herein. This technique offers the possibility of destroying residual data, recycling the recovered resources, and disposing of the disks in an environmentally friendly manner.

  4. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    SciTech Connect

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; Siegel, Howard Jay; Maciejewski, Anthony A.; Koenig, Gregory A.; Groer, Christopher S.; Hilton, Marcia M.; Poole, Stephen W.; Okonski, G.; Rambharos, R.

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.

  5. Computer Literacy Act of 1984. Report together with Minority Views [and] Computer Literacy Act of 1983. Report together with Additional Views. To accompany H.R. 3750.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Education and Labor.

    These two reports contain supporting material intended to accompany the Computer Literacy Acts of 1983 and 1984 (H. R. 3750). This bill was designed to promote the use of computer technologies in elementary and secondary schools by authorizing: (1) grants to local school districts, particularly in poor areas, to purchase computer hardware; (2)…

  6. Computation of groundwater resources and recharge in Chithar River Basin, South India.

    PubMed

    Subramani, T; Babu, Savithri; Elango, L

    2013-01-01

    Groundwater recharge and available groundwater resources in the Chithar River basin, Tamil Nadu, India, spread over an area of 1,722 km², have been estimated by considering various hydrological, geological, and hydrogeological parameters, such as rainfall infiltration, drainage, geomorphic units, land use, rock types, depth of weathered and fractured zones, nature of soil, water level fluctuation, saturated thickness of aquifer, and groundwater abstraction. The digital ground elevation models indicate that the regional slope of the basin is towards the east. The Proterozoic (Post-Archaean) basement of the study area consists of quartzite, calc-granulite, crystalline limestone, charnockite, and biotite gneiss with or without garnet. Three major soil types were identified, namely black cotton, deep red, and red sandy soils. The rainfall intensity gradually decreases from west to east. Groundwater occurs under water table conditions in the weathered zone and fluctuates between 0 and 25 m. The water table reaches its maximum in January, after the northeast monsoon, and its minimum in October. Groundwater abstraction for domestic/stock and irrigational needs in the Chithar River basin has been estimated as 148.84 MCM (million m³). Groundwater recharge due to monsoon rainfall infiltration has been estimated as 170.05 MCM based on the water level rise during the monsoon period; it is estimated as 173.9 MCM using the rainfall infiltration factor. An amount of 53.8 MCM of water is contributed to groundwater from surface water bodies. Recharge of groundwater due to return flow from irrigation has been computed as 147.6 MCM. The static groundwater reserve in the Chithar River basin is estimated as 466.66 MCM and the dynamic reserve as about 187.7 MCM. In the present scenario, the aquifer is under safe conditions for extraction of groundwater for domestic and irrigation purposes. If the existing water bodies are maintained properly, the extraction rate can be increased in the future by about 10% to 15%.
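    The water-balance bookkeeping behind these figures is straightforward to reproduce. The values below are the abstract's own; the recharge-minus-draft relation is a sketch of the standard accounting, not necessarily the authors' exact procedure.

    ```python
    # Water balance for the basin (values in million cubic metres, MCM),
    # using the recharge components and abstraction reported in the abstract.
    recharge = {
        "monsoon rainfall (water level rise method)": 170.05,
        "surface water bodies": 53.8,
        "irrigation return flow": 147.6,
    }
    abstraction = 148.84  # domestic/stock and irrigation draft

    total_recharge = sum(recharge.values())
    net_balance = total_recharge - abstraction
    print(f"total recharge: {total_recharge:.2f} MCM")
    print(f"abstraction:    {abstraction:.2f} MCM")
    print(f"net balance:    {net_balance:.2f} MCM")  # positive => safe extraction
    ```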

  7. Literacy effects on language and vision: emergent effects from an amodal shared resource (ASR) computational model.

    PubMed

    Smith, Alastair C; Monaghan, Padraic; Huettig, Falk

    2014-12-01

    Learning to read and write requires an individual to connect additional orthographic representations to pre-existing mappings between phonological and semantic representations of words. Past empirical results suggest that the process of learning to read and write (at least in alphabetic languages) elicits changes in the language processing system, by increasing the cognitive efficiency of mapping between representations associated with a word, by changing the granularity of phonological processing of spoken language, or through a combination of both. Behavioural effects of literacy have typically been assessed in offline explicit tasks that have addressed only phonological processing. However, a recent eye-tracking study compared high and low literate participants on effects of phonology and semantics in processing measured implicitly using eye movements. High literates' eye movements were more affected by phonological overlap in online speech than low literates', with only subtle differences observed for semantics. We determined whether these effects were due to cognitive efficiency and/or the granularity of speech processing in a multimodal model of speech processing - the amodal shared resource model (ASR; Smith, Monaghan, & Huettig, 2013a,b). We found that cognitive efficiency in the model had only a marginal effect on semantic processing and did not affect performance for phonological processing, whereas fine-grained versus coarse-grained phonological representations in the model simulated the high/low literacy effects on phonological processing, suggesting that literacy has a focused effect in changing the grain size of phonological mappings. PMID:25171049

  8. A computer software system for integration and analysis of grid-based remote sensing data with other natural resource data

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.

    1977-01-01

    This report describes a computer-based information system designed to assist in the integration of commonly available spatial data for regional planning and resource analysis. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. For this study, soil, land cover/use, topographic and geological maps were used as a data base to develop an eleven map portfolio. The major themes of the portfolio are land cover/use, nonpoint water pollution, waste disposal, and ground water recharge.
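    The core RAP-style operation, a multi-factor overlay of coded grid cells, can be shown with a small raster example. The layers, codes, and weights below are invented for illustration; they are not RAP's actual data or weighting scheme.

    ```python
    import numpy as np

    # Two coded input rasters (e.g. digitized from soil and land cover maps).
    soil      = np.array([[1, 1, 2], [2, 3, 3]])   # soil type codes
    landcover = np.array([[1, 2, 2], [1, 1, 3]])   # land cover/use codes

    # Hypothetical factor weights per code (e.g. nonpoint-pollution potential).
    soil_weight      = {1: 0.2, 2: 0.5, 3: 0.9}
    landcover_weight = {1: 0.1, 2: 0.6, 3: 0.8}

    def reclass(layer, weights):
        """Replace raster codes with their factor weights."""
        return np.vectorize(weights.get)(layer)

    # Multi-factor composite: higher cells indicate higher combined potential.
    composite = reclass(soil, soil_weight) + reclass(landcover, landcover_weight)
    print(composite)
    ```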

  9. Editorial: Special issue on resources for the computer security and information assurance curriculum: Issue 1. Curriculum Editorial Comments, Volume 1 and Volume 2

    SciTech Connect

    Frincke, Deb; Ouderkirk, Steven J.; Popovsky, Barbara

    2006-12-28

    This is a pair of articles to be used as the cover editorials for a special edition of the Journal of Educational Resources in Computing (JERIC) Special Edition on Resources for the Computer Security and Information Assurance Curriculum, volumes 1 and 2.

  10. The Effect of Emphasizing Mathematical Structure in the Acquisition of Whole Number Computation Skills (Addition and Subtraction) By Seven- and Eight-Year Olds: A Clinical Investigation.

    ERIC Educational Resources Information Center

    Uprichard, A. Edward; Collura, Carolyn

    This investigation sought to determine the effect of emphasizing mathematical structure in the acquisition of computational skills by seven- and eight-year-olds. The meaningful development-of-structure approach emphasized closure, commutativity, associativity, and the identity element of addition; the inverse relationship between addition and…

  11. Aggregating data for computational toxicology applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System.

    PubMed

    Judson, Richard S; Martin, Matthew T; Egeghy, Peter; Gangwal, Sumit; Reif, David M; Kothiya, Parth; Wolf, Maritja; Cathey, Tommy; Transue, Thomas; Smith, Doris; Vail, James; Frame, Alicia; Mosher, Shad; Cohen Hubal, Elaine A; Richard, Ann M

    2012-01-01

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of understanding the underlying mechanistic causes of chemical toxicity and predicting the toxicity of new chemicals and products. A key feature of such approaches is their reliance on knowledge extracted from large collections of data and data sets in computable formats. The U.S. Environmental Protection Agency (EPA) has developed a large data resource called ACToR (Aggregated Computational Toxicology Resource) to support these data-intensive efforts. ACToR comprises four main repositories: core ACToR (chemical identifiers and structures, and summary data on hazard, exposure, use, and other domains), ToxRefDB (Toxicity Reference Database, a compilation of detailed in vivo toxicity data from guideline studies), ExpoCastDB (detailed human exposure data from observational studies of selected chemicals), and ToxCastDB (data from high-throughput screening programs, including links to underlying biological information related to genes and pathways). The EPA DSSTox (Distributed Structure-Searchable Toxicity) program provides expert-reviewed chemical structures and associated information for these and other high-interest public inventories. Overall, the ACToR system contains information on about 400,000 chemicals from 1100 different sources. The entire system is built using open source tools and is freely available to download. This review describes the organization of the data repository and provides selected examples of use cases.

  12. Projections of costs, financing, and additional resource requirements for low- and lower middle-income country immunization programs over the decade, 2011-2020.

    PubMed

    Gandhi, Gian; Lydon, Patrick; Cornejo, Santiago; Brenzel, Logan; Wrobel, Sandra; Chang, Hugh

    2013-04-18

    The Decade of Vaccines Global Vaccine Action Plan has outlined a set of ambitious goals to broaden the impact and reach of immunization across the globe. A projections exercise has been undertaken to assess the costs, financing availability, and additional resource requirements to achieve these goals through the delivery of vaccines against 19 diseases across 94 low- and middle-income countries for the period 2011-2020. The exercise draws upon data from existing published and unpublished global forecasts, country immunization plans, and costing studies. A combination of an ingredients-based approach and use of approximations based on past spending has been used to generate vaccine and non-vaccine delivery costs for routine programs, as well as supplementary immunization activities (SIAs). Financing projections focused primarily on support from governments and the GAVI Alliance. Cost and financing projections are presented in constant 2010 US dollars (US$). Cumulative total costs for the decade are projected to be US$57.5 billion, with 85% for routine programs and the remaining 15% for SIAs. Delivery costs account for 54% of total cumulative costs, and vaccine costs make up the remainder. A conservative estimate of total financing for immunization programs is projected to be $34.3 billion over the decade, with country governments financing 65%. These projections imply a cumulative funding gap of $23.2 billion. About 57% of the total resources required to close the funding gap are needed just to maintain existing programs and scale up other currently available vaccines (i.e., before adding in the additional costs of vaccines still in development). Efforts to mobilize additional resources, manage program costs, and establish mutual accountability between countries and development partners will all be necessary to ensure the goals of the Decade of Vaccines are achieved. Establishing or building on existing mechanisms to more comprehensively track resources and…
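    The headline figures follow from simple arithmetic on the reported totals (constant 2010 US$, in billions); the sketch below just reproduces that bookkeeping.

    ```python
    # Reproducing the abstract's arithmetic (constant 2010 US$, billions).
    total_cost = 57.5
    routine_share, sia_share = 0.85, 0.15
    delivery_share = 0.54

    financing = 34.3          # conservative projected financing
    country_share = 0.65      # share financed by country governments

    print(f"routine programs:  {total_cost * routine_share:.1f} bn")
    print(f"SIAs:              {total_cost * sia_share:.1f} bn")
    print(f"delivery costs:    {total_cost * delivery_share:.1f} bn")
    print(f"country financing: {financing * country_share:.1f} bn")
    print(f"funding gap:       {total_cost - financing:.1f} bn")  # 23.2
    ```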

  13. Project Final Report: Ubiquitous Computing and Monitoring System (UCoMS) for Discovery and Management of Energy Resources

    SciTech Connect

    Tzeng, Nian-Feng; White, Christopher D.; Moreman, Douglas

    2012-07-14

    The UCoMS research cluster has spearheaded three research areas since August 2004: wireless and sensor networks, Grid computing, and petroleum applications. The primary goals of UCoMS research are three-fold: (1) creating new knowledge to push forward the technology forefront in pertinent research on the computing and monitoring aspects of energy resource management, (2) developing and disseminating software codes and toolkits for the research community and the public, and (3) establishing system prototypes and testbeds for evaluating innovative techniques and methods. Substantial progress and diverse accomplishments have been made by the research investigators in their respective areas of expertise, working cooperatively on such topics as sensors and sensor networks, wireless communication and systems, and computational Grids, particularly as relevant to petroleum applications.

  14. Staff Computer Literacy in the Community College: A Resource Inventory and Directory, 1984.

    ERIC Educational Resources Information Center

    Stewart, Anne

    In spring 1984, a survey was undertaken to determine what activities were being conducted by member institutions of the League for Innovation in the Community College to help staff members become more comfortable with and knowledgeable about computers. League members were asked to provide information on the current level of computer usage among…

  15. University Students and Ethics of Computer Technology Usage: Human Resource Development

    ERIC Educational Resources Information Center

    Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer

    2012-01-01

    The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…

  16. Coordinating Technological Resources in a Non-Technical Profession: The Administrative Computer User Group.

    ERIC Educational Resources Information Center

    Rollo, J. Michael; Marmarchev, Helen L.

    1999-01-01

    The explosion of computer applications in the modern workplace has required student affairs professionals to keep pace with technological advances for office productivity. This article recommends establishing administrative computer user groups, utilizing coordinated web site development, and enhancing working relationships as ways of dealing…

  17. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    ERIC Educational Resources Information Center

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  18. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics.

    PubMed

    John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D

    2016-04-01

    We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems. PMID:27176426
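    The contraction transform itself is compact enough to show in isolation: the ring polymer's bead coordinates are taken to normal modes, the high-frequency modes are discarded, and the result is transformed back onto a smaller ring on which an expensive potential would be evaluated. The sketch below performs only this coordinate step for a single one-dimensional particle; it is one common form of the transform, not the authors' full DFT-coupled method.

    ```python
    import numpy as np

    def contract(beads, P_new):
        """Contract a P-bead ring polymer to P_new beads by discarding the
        highest-frequency Fourier (normal) modes."""
        P = len(beads)
        modes = np.fft.rfft(beads)        # ring-polymer normal modes
        keep = P_new // 2 + 1             # low-frequency modes to retain
        # Inverse transform on the smaller ring; the P_new/P factor keeps the
        # centroid (zero-frequency mode) unchanged.
        return np.fft.irfft(modes[:keep], n=P_new) * (P_new / P)

    beads = np.random.default_rng(0).normal(0.0, 0.1, size=32)  # 32-bead polymer
    small = contract(beads, 4)                                  # 4-bead polymer
    print(np.mean(beads), np.mean(small))  # centroids agree
    ```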

  19. Quantum ring-polymer contraction method: Including nuclear quantum effects at no additional computational cost in comparison to ab initio molecular dynamics

    NASA Astrophysics Data System (ADS)

    John, Christopher; Spura, Thomas; Habershon, Scott; Kühne, Thomas D.

    2016-04-01

    We present a simple and accurate computational method which facilitates ab initio path-integral molecular dynamics simulations, where the quantum-mechanical nature of the nuclei is explicitly taken into account, at essentially no additional computational cost in comparison to the corresponding calculation using classical nuclei. The predictive power of the proposed quantum ring-polymer contraction method is demonstrated by computing various static and dynamic properties of liquid water at ambient conditions using density functional theory. This development will enable routine inclusion of nuclear quantum effects in ab initio molecular dynamics simulations of condensed-phase systems.

  20. Food Resources of Stream Macroinvertebrates Determined by Natural-Abundance Stable C and N Isotopes and a 15N Tracer Addition

    SciTech Connect

    Mulholland, P. J.

    2000-01-01

    Trophic relationships were examined using natural-abundance 13C and 15N analyses and a 15N-tracer addition experiment in Walker Branch, a 1st-order forested stream in eastern Tennessee. In the 15N-tracer addition experiment, we added 15NH4 to stream water over a 6-wk period in early spring, and measured 15N:14N ratios in different taxa and biomass compartments over distance and time. Samples collected from a station upstream from the 15N addition provided data on natural-abundance 13C:12C and 15N:14N ratios. The natural-abundance 15N analysis proved to be of limited value in identifying food resources of macroinvertebrates because 15N values were not greatly different among food resources. In general, the natural-abundance stable isotope approach was most useful for determining whether epilithon or detritus were important food resources for organisms that may use both (e.g., the snail Elimia clavaeformis), and to provide corroborative evidence of food resources of taxa for which the 15N tracer results were not definitive. The 15N tracer results showed that the mayflies Stenonema spp. and Baetis spp. assimilated primarily epilithon, although Baetis appeared to assimilate a portion of the epilithon (e.g., algal cells) with more rapid N turnover than the bulk pool sampled. Although Elimia did not reach isotopic equilibrium during the tracer experiment, application of a N-turnover model to the field data suggested that it assimilated a combination of epilithon and detritus. The amphipod Gammarus minus appeared to depend mostly on fine benthic organic matter (FBOM), and the coleopteran Anchytarsus bicolor on epixylon. The caddisfly Diplectrona modesta appeared to assimilate primarily a fast N-turnover portion of the FBOM pool, and Simuliidae a fast N-turnover component of the suspended particulate organic matter pool rather than the bulk pool sampled. Together, the…
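    Natural-abundance source apportionment of this kind typically rests on a two-source mixing model. The sketch below shows that calculation with invented delta values; these are not measurements from Walker Branch.

    ```python
    # Two-source stable-isotope mixing model (delta values are hypothetical).
    def two_source_fraction(d_consumer, d_source_a, d_source_b, trophic_shift=0.0):
        """Fraction of the diet from source A, after correcting the consumer
        value for per-trophic-level isotopic enrichment."""
        d = d_consumer - trophic_shift
        return (d - d_source_b) / (d_source_a - d_source_b)

    # e.g. d15N of epilithon (A), detritus (B), and a grazer, in per mil:
    f_epilithon = two_source_fraction(d_consumer=2.8, d_source_a=3.0,
                                      d_source_b=0.5, trophic_shift=1.0)
    print(f"fraction of diet from epilithon: {f_epilithon:.2f}")
    ```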

  1. Mineral resources of the Sheep Mountain Wilderness study area and the Cucamonga Wilderness and additions, Los Angeles and San Bernardino counties, California

    USGS Publications Warehouse

    Evans, James G.; Pankraatz, Leroy; Ridenour, James; Schmauch, Steven W.; Zilka, Nicholas T.

    1977-01-01

    A mineral survey of the Sheep Mountain Wilderness study area and Cucamonga Wilderness area and additions by the U.S. Geological Survey and Bureau of Mines in 1975 covered about 66,500 acres (26,500 ha) of the San Bernardino and Angeles National Forests in southern California. The two study areas are separated by San Antonio Canyon. The mineral resource potential was evaluated through geological, geochemical, and geophysical studies by the Geological Survey and through evaluation of mines and prospects by the Bureau of Mines.

  2. Application of computer graphics to generate coal resources of the Cache coal bed, Recluse geologic model area, Campbell County, Wyoming

    USGS Publications Warehouse

    Schneider, G.B.; Crowley, S.S.; Carey, M.A.

    1982-01-01

    Low-sulfur subbituminous coal resources have been calculated, using both manual and computer methods, for the Cache coal bed in the Recluse Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2 minute quadrangles, Campbell County, Wyoming. Approximately 275 coal thickness measurements obtained from drill hole data are evenly distributed throughout the area. The Cache coal and associated beds are in the Paleocene Tongue River Member of the Fort Union Formation. The depth from the surface to the Cache bed ranges from 269 to 1,257 feet. The thickness of the coal is as much as 31 feet, but in places the Cache coal bed is absent. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources calculated by computer show the bed to contain 2,316 million short tons or about 6.7 percent more than the hand-calculated figure of 2,160 million short tons.
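    The computer calculation amounts to integrating an isopach (thickness) grid: summing cell area times coal thickness and converting the volume to tonnage. The grid values below are invented, and the conversion factor of roughly 1,770 short tons per acre-foot, a figure commonly used for subbituminous coal, is an assumption for illustration.

    ```python
    import numpy as np

    # Gridded resource estimate from an isopach (coal thickness) grid.
    thickness_ft = np.array([[0.0, 12.0, 18.0],
                             [10.0, 25.0, 31.0]])  # thickness per cell; 0 = absent
    cell_area_acres = 160.0                        # hypothetical cell size
    TONS_PER_ACRE_FT = 1770.0                      # typical for subbituminous coal

    tons = np.sum(thickness_ft) * cell_area_acres * TONS_PER_ACRE_FT
    print(f"resource: {tons / 1e6:.1f} million short tons")
    ```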

  3. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Return Method Rate of return for a period may be calculated by computing the net performance divided by the beginning net asset value for each trading day in the period and compounding each daily rate of... commodity pool operator or commodity trading advisor may present to the Commission proposals regarding...

  4. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Return Method Rate of return for a period may be calculated by computing the net performance divided by the beginning net asset value for each trading day in the period and compounding each daily rate of... commodity pool operator or commodity trading advisor may present to the Commission proposals regarding...

  5. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Return Method Rate of return for a period may be calculated by computing the net performance divided by the beginning net asset value for each trading day in the period and compounding each daily rate of... commodity pool operator or commodity trading advisor may present to the Commission proposals regarding...

  6. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Return Method Rate of return for a period may be calculated by computing the net performance divided by the beginning net asset value for each trading day in the period and compounding each daily rate of... commodity pool operator or commodity trading advisor may present to the Commission proposals regarding...

  7. 17 CFR Appendix B to Part 4 - Adjustments for Additions and Withdrawals in the Computation of Rate of Return

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Return Method Rate of return for a period may be calculated by computing the net performance divided by the beginning net asset value for each trading day in the period and compounding each daily rate of... commodity pool operator or commodity trading advisor may present to the Commission proposals regarding...
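    The daily-compounding method these appendix excerpts describe (each day's rate is net performance divided by beginning net asset value, and the period's rate compounds the daily rates) can be stated directly in code; the daily figures below are hypothetical.

    ```python
    # Daily compounding of rate of return, per the appendix's method.
    days = [
        # (beginning net asset value, net performance for the day)
        (1_000_000.0,  5_000.0),
        (1_005_000.0, -2_500.0),
        (1_002_500.0,  7_000.0),
    ]

    period_factor = 1.0
    for bnav, net_perf in days:
        period_factor *= 1.0 + net_perf / bnav  # compound each daily rate

    print(f"period rate of return: {period_factor - 1.0:.4%}")
    ```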

  8. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    PubMed

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option for reducing their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, and the recycling rate of specialty metals such as rare earths is negligible compared to their increasing use in modern products such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system, illustrated by following the material flows of resources through a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that, of the 244 kg of HDDs treated, 212 kg, consisting mainly of aluminum and steel, can finally be recovered through the metallurgical process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products may be preferable to shredding. However, this remains a technological and logistic challenge for the existing system.

  9. Dynamic resource allocation engine for cloud-based real-time video transcoding in mobile cloud computing environments

    NASA Astrophysics Data System (ADS)

    Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos

    2015-02-01

    The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor the battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices, and the most popular output formats derived from a comprehensive set of use cases, e.g. a mobile news reporter transmitting video directly to TV audiences with varying format requirements, with minimal use of resources both at the reporter's end and at the cloud infrastructure end of the transcoding service.
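    One way to read the benchmarking idea is as an offline profile table keyed by input/output format pairs, consulted at request time to size a cloud allocation. The format keys and profile numbers below are hypothetical, not the paper's measurements.

    ```python
    # Benchmark table: (input format, output format) -> (cpu cores, memory GB)
    # needed to transcode one stream in real time. Values are hypothetical.
    BENCHMARKS = {
        (("H.264", "1080p30"), ("H.264", "720p30")): (2.0, 1.5),
        (("H.264", "1080p30"), ("VP8",   "480p30")): (3.0, 2.0),
    }

    def allocate(input_fmt, output_fmts):
        """Sum benchmarked requirements for one source fanned out to many outputs."""
        cores = sum(BENCHMARKS[(input_fmt, o)][0] for o in output_fmts)
        mem   = sum(BENCHMARKS[(input_fmt, o)][1] for o in output_fmts)
        return cores, mem

    cores, mem = allocate(("H.264", "1080p30"),
                          [("H.264", "720p30"), ("VP8", "480p30")])
    print(f"reserve {cores} cores and {mem} GB")  # reserve 5.0 cores and 3.5 GB
    ```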

  10. Tracking the Flow of Resources in Electronic Waste - The Case of End-of-Life Computer Hard Disk Drives.

    PubMed

    Habib, Komal; Parajuly, Keshav; Wenzel, Henrik

    2015-10-20

    Recovery of resources, in particular metals, from waste flows is widely seen as a prioritized option for reducing their potential supply constraints in the future. The current waste electrical and electronic equipment (WEEE) treatment system is more focused on bulk metals, and the recycling rate of specialty metals such as rare earths is negligible compared to their increasing use in modern products such as electronics. This study investigates the challenges in recovering these resources in the existing WEEE treatment system, illustrated by following the material flows of resources through a conventional WEEE treatment plant in Denmark. Computer hard disk drives (HDDs) containing neodymium-iron-boron (NdFeB) magnets were selected as the case product for this experiment. The resulting output fractions were tracked until their final treatment in order to estimate the recovery potential of rare earth elements (REEs) and other resources contained in HDDs. The results show that, of the 244 kg of HDDs treated, 212 kg, consisting mainly of aluminum and steel, can finally be recovered through the metallurgical process. The results further demonstrate the complete loss of REEs in the existing shredding-based WEEE treatment processes. Dismantling and separate processing of NdFeB magnets from their end-use products may be preferable to shredding. However, this remains a technological and logistic challenge for the existing system. PMID:26351732

  11. Distance Education Resource Guide. Computer-Based Distance Education in Community Colleges: A Joint Project.

    ERIC Educational Resources Information Center

    Lever, Judy C.

    This resource guide provides information on distance education options and offers a directory of programs at various institutions in the United States and Canada. Following a brief preface on the development of the guide, "New Models for Higher Education in Changing Times," by Don Doucette, discusses the nature of distance learning, the…

  12. Recommended Computer End-User Skills for Business Students by Fortune 500 Human Resource Executives.

    ERIC Educational Resources Information Center

    Zhao, Jensen J.

    1996-01-01

    Human resources executives (83 responses from 380) strongly recommended 11 and recommended 46 end-user skills for business graduates. Core skills included use of keyboard, mouse, microcomputer, and printer; Windows; Excel; and telecommunications functions (electronic mail, Internet, local area networks, downloading). Knowing one application of…

  13. Trace ResourceBook: Assistive Technologies for Communication, Control & Computer Access. 1996-97 Edition.

    ERIC Educational Resources Information Center

    Borden, Peter A., Ed.; And Others

    This resource book lists approximately 1,500 products designed specifically for the needs of people with disabilities. Typically, each product is pictured; basic information is provided including manufacturer name, product cost, size, and weight; and the product is briefly described. The book's four sections each describe products designed for…

  14. VisPortal: Increasing Scientific Productivity by Simplifying Access to and Use of Remote Computational Resources

    SciTech Connect

    Siegerist, Cristina; Shalf, John; Bethel, E. Wes

    2004-01-01

    Our goal is to simplify and streamline the process of using remotely located visual data analysis software tools. This discussion presents an example of an easy-to-use interface that mediates access to and use of diverse and powerful visual data analysis resources. The interface is presented via a standard web browser, which is ubiquitous and a part of every researcher's work environment. Through the web interface, a few mouse clicks are all that is needed to take advantage of powerful, remotely located software resources. The VisPortal project provides diverse services to remotely located users through their web browser. Using standard Globus grid middleware and off-the-shelf web automation, VisPortal hides the underlying complexity of resource selection and distributed application management. The portal automates complex workflows that would otherwise require a substantial amount of manual effort on the part of the researcher. With a few mouse clicks, a researcher can quickly perform complex tasks like creating MPEG movies, scheduling file transfers, launching components of a distributed application, and accessing specialized resources.

  15. Recommendations for protecting National Library of Medicine Computing and Networking Resources

    SciTech Connect

    Feingold, R.

    1994-11-01

    Protecting Information Technology (IT) involves a number of interrelated factors. These include mission, available resources, technologies, existing policies and procedures, internal culture, contemporary threats, and strategic enterprise direction. In the face of this formidable list, a structured approach provides cost-effective actions that allow the organization to manage its risks. We face fundamental challenges that will persist for at least the next several years. It is difficult if not impossible to precisely quantify risk. IT threats and vulnerabilities change rapidly and continually. Limited organizational resources combined with mission constraints - such as availability and connectivity requirements - will ensure that most systems will not be absolutely secure (if such security were even possible). In short, there is no technical (or administrative) "silver bullet." Protection means employing a stratified series of recommendations, matching protection levels against information sensitivities. Adaptive and flexible risk management is the key to effective protection of IT resources. The cost of the protection must be kept less than the expected loss, and one must take into account that an adversary will not expend more to attack a resource than the value of its compromise to that adversary. Notwithstanding the difficulty, if not impossibility, of precisely quantifying risk, the aforementioned allows us to avoid the trap of choosing a course of action simply because "it's safer" or ignoring an area because no one had explored its potential risk. The recommendations for protecting IT resources begin with a discussion of contemporary threats and vulnerabilities, and then proceed from general to specific preventive measures. From a risk management perspective, it is imperative to understand that today the vast majority of threats are against UNIX hosts connected to the Internet.

  16. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    PubMed

    Baun, Christian

    2016-01-01

    Clusters usually consist of servers, workstations or personal computers as nodes. But especially for academic purposes like student projects or scientific projects, the cost of purchase and operation can be a challenge. Single board computers cannot compete with the performance or energy-efficiency of higher-value systems, but they are an option for building inexpensive cluster systems. Because of their compact design and modest energy consumption, it is possible to build clusters of single board computers that are mobile and can easily be transported by their users. This paper describes the construction of such a cluster, useful applications, and the performance of the single nodes. Furthermore, the cluster's performance and energy-efficiency are analyzed by executing the High Performance Linpack benchmark with different numbers of nodes and different proportions of the system's total main memory utilized. PMID:27064532
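    Reported HPL performance follows from the benchmark's known operation count for an N x N solve, and comparing the measured rate with the cluster's theoretical peak gives the efficiency figure such studies quote. The hardware numbers below are placeholders, not measurements from the paper.

    ```python
    # HPL solves a dense N x N system; its operation count is 2/3*N^3 + 2*N^2.
    def hpl_gflops(n, seconds):
        flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
        return flops / seconds / 1e9

    nodes = 8
    peak_gflops_per_node = 6.0     # placeholder for a small single-board computer
    measured = hpl_gflops(n=20_000, seconds=200.0)
    efficiency = measured / (nodes * peak_gflops_per_node)
    print(f"measured: {measured:.1f} GFLOPS, efficiency: {efficiency:.0%}")
    ```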

  17. Mobile clusters of single board computers: an option for providing resources to student projects and researchers.

    PubMed

    Baun, Christian

    2016-01-01

    Clusters usually consist of servers, workstations or personal computers as nodes. But especially for academic purposes like student projects or scientific projects, the cost of purchase and operation can be a challenge. Single board computers cannot compete with the performance or energy-efficiency of higher-value systems, but they are an option for building inexpensive cluster systems. Because of their compact design and modest energy consumption, it is possible to build clusters of single board computers that are mobile and can easily be transported by their users. This paper describes the construction of such a cluster, useful applications, and the performance of the single nodes. Furthermore, the cluster's performance and energy-efficiency are analyzed by executing the High Performance Linpack benchmark with different numbers of nodes and different proportions of the system's total main memory utilized.

  18. A mathematical model for a distributed attack on targeted resources in a computer network

    NASA Astrophysics Data System (ADS)

    Haldar, Kaushik; Mishra, Bimal Kumar

    2014-09-01

    A mathematical model has been developed to analyze the spread of a distributed attack on critical targeted resources in a network. The model provides an epidemic framework with two sub-frameworks to capture the difference between the overall behavior of the attacking hosts and that of the targeted resources. The analysis focuses on obtaining threshold conditions that determine the success or failure of such attacks. Considering the criticality of the systems and the strength of the defence mechanisms involved, a measure has been suggested that highlights the level of success achieved by the attacker. To understand the long-run dynamics of the system, its equilibrium points have been obtained, their stability analyzed, and conditions for their stability outlined.
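    An SIR-style system conveys the two-sub-framework structure: one compartment pair for the attacking hosts and one for the targeted resources, with a reproduction-number threshold deciding whether the attack takes off. The equations and parameters below are illustrative, not the paper's actual model.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    def model(y, t, beta_a, gamma_a, beta_r, gamma_r):
        S_a, I_a, S_r, C_r = y
        dS_a = -beta_a * S_a * I_a                 # susceptible attacking hosts
        dI_a = beta_a * S_a * I_a - gamma_a * I_a  # infectious attacking hosts
        dS_r = -beta_r * S_r * I_a                 # uncompromised targets
        dC_r = beta_r * S_r * I_a - gamma_r * C_r  # compromised targets, cleaned at rate gamma_r
        return [dS_a, dI_a, dS_r, dC_r]

    t = np.linspace(0.0, 50.0, 500)
    sol = odeint(model, [0.99, 0.01, 1.0, 0.0], t, args=(0.5, 0.1, 0.3, 0.2))
    R0 = 0.5 / 0.1   # threshold with S_a(0) ~ 1: attack spreads when R0 > 1
    print(f"R0 = {R0}, peak compromised fraction = {sol[:, 3].max():.2f}")
    ```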

  19. Development of a Computational Framework for Stochastic Co-optimization of Water and Energy Resource Allocations under Climatic Uncertainty

    NASA Astrophysics Data System (ADS)

    Xuan, Y.; Mahinthakumar, K.; Arumugam, S.; DeCarolis, J.

    2015-12-01

    Owing to the lack of a consistent approach to assimilating probabilistic forecasts for water and energy systems, utilization of climate forecasts for conjunctive management of these two systems is very limited. Prognostic management of these two systems presents a stochastic co-optimization problem that seeks to determine reservoir releases and power allocation strategies while minimizing the expected operational costs subject to probabilistic climate forecast constraints. To address these issues, we propose a high performance computing (HPC) enabled computational framework for stochastic co-optimization of water and energy resource allocations under climate uncertainty. The computational framework embodies a new paradigm in which attributes of climate (e.g., precipitation, temperature) and their forecasted probability distributions are employed conjointly to inform seasonal water availability and electricity demand. The HPC enabled cyberinfrastructure framework is developed to perform detailed stochastic analyses, and to better quantify and reduce the uncertainties associated with water and power systems management by utilizing improved hydro-climatic forecasts. In this presentation, our stochastic multi-objective solver, extended from Optimus (Optimization Methods for Universal Simulators), is introduced. The solver uses a parallel cooperative multi-swarm method to efficiently solve large-scale simulation-optimization problems on parallel supercomputers. The cyberinfrastructure harnesses HPC resources to perform intensive computations using ensemble forecast models of streamflow and power demand. The stochastic multi-objective particle swarm optimizer we developed is used to co-optimize water and power system models under constraints over a large number of ensembles. The framework sheds light on the application of climate forecasts and cyber-innovation frameworks to improve management and promote the sustainability of water and energy systems.
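    A single-objective, single-swarm particle swarm optimizer shows the basic mechanism such solvers build on; the cited solver is multi-objective, multi-swarm, and parallel, so the sketch below is only the core update loop.

    ```python
    import numpy as np

    def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimize f over R^dim with a basic particle swarm."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(-5, 5, (n_particles, dim))   # positions
        v = np.zeros_like(x)                         # velocities
        pbest = x.copy()                             # personal bests
        pbest_val = np.apply_along_axis(f, 1, x)
        g = pbest[pbest_val.argmin()].copy()         # global best
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = x + v
            vals = np.apply_along_axis(f, 1, x)
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            g = pbest[pbest_val.argmin()].copy()
        return g, pbest_val.min()

    # Toy stand-in for an expected-cost objective over forecast ensembles.
    best, cost = pso(lambda z: float(np.sum(z**2)), dim=4)
    print(best.round(3), f"{cost:.2e}")
    ```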

  20. Planning and Development of the Computer Resource at Baylor College of Medicine.

    ERIC Educational Resources Information Center

    Ogilvie, W. Buckner, Jr.; And Others

    1979-01-01

    Describes the development and implementation of a plan at Baylor College of Medicine for providing computer support for both the administrative and scientific/research needs of the Baylor community. The cost-effectiveness of this plan is also examined. (Author/CMV)

  1. The Portability of Computer-Related Educational Resources: An Overview of Issues and Directions.

    ERIC Educational Resources Information Center

    Collis, Betty A.; De Diana, Italo

    1990-01-01

    Provides an overview of the articles in this special issue, which deals with the portability, or transferability, of educational computer software. Motivations for portable software relating to cost, personnel, and time are discussed, and factors affecting portability are described, including technical factors, educational factors, social/cultural…

  2. Computer and Video Games in Family Life: The Digital Divide as a Resource in Intergenerational Interactions

    ERIC Educational Resources Information Center

    Aarsand, Pal Andre

    2007-01-01

    In this ethnographic study of family life, intergenerational video and computer game activities were videotaped and analysed. Both children and adults invoked the notion of a digital divide, i.e. a generation gap between those who master and do not master digital technology. It is argued that the digital divide was exploited by the children to…

  3. Allocation of Resources to Computer Support in Two-Year Colleges.

    ERIC Educational Resources Information Center

    Arth, Maurice P.

    A nationwide survey of 999 two-year colleges was conducted to collect data concerning computer-related expenditures at the colleges and to facilitate interinstitutional comparisons of these data. The survey collected data on current annual operating expenditures, Fall 1979 credit student headcount, and current annual outlays for: (1) "normal"…

  4. Computers for All Students: A Strategy for Universal Access to Information Resources.

    ERIC Educational Resources Information Center

    Resmer, Mark; And Others

    This report proposes a strategy of putting networked computing devices into the hands of all students at institutions of higher education. It outlines the rationale for such a strategy, the options for financing, the required institutional support structure needed, and various implementation approaches. The report concludes that the resultant…

  5. Method and apparatus for offloading compute resources to a flash co-processing appliance

    SciTech Connect

    Tzelnic, Percy; Faibish, Sorin; Gupta, Uday K.; Bent, John; Grider, Gary Alan; Chen, Hsing -bung

    2015-10-13

    Solid-State Drive (SSD) burst buffer nodes are interposed into a parallel supercomputing cluster to enable fast burst checkpointing of cluster memory to or from nearby interconnected solid-state storage, with asynchronous migration between the burst buffer nodes and slower, more distant disk storage. The SSD nodes also perform tasks offloaded from the compute nodes or associated with the checkpoint data. For example, the data for the next job is preloaded into the SSD node and uploaded very quickly to the respective compute node just before the next job starts. During a job, the SSD nodes perform fast visualization and statistical analysis upon the checkpoint data. The SSD nodes can also perform data reduction and encryption of the checkpoint data.

  6. Computational Prediction of Protein–Protein Interaction Networks: Algorithms and Resources

    PubMed Central

    Zahiri, Javad; Bozorgmehr, Joseph Hannon; Masoudi-Nejad, Ali

    2013-01-01

    Protein interactions play an important role in the discovery of protein functions and pathways in biological processes. This is especially true in the case of diseases caused by the loss of specific protein-protein interactions in the organism. The accuracy of experimental methods for detecting protein-protein interactions, however, is rather limited, and high-throughput experiments have yielded high rates of both false positive and false negative interactions. Computational methods have attracted tremendous attention among biologists because of their ability to predict protein-protein interactions and to validate experimental results. In this study, we review several computational methods for protein-protein interaction prediction, describe the major databases that store both predicted and experimentally detected protein-protein interactions, and survey the tools used for analyzing protein interaction networks and improving the reliability of protein-protein interaction data. PMID:24396273

  7. Integrating Clinical Trial Imaging Data Resources Using Service-Oriented Architecture and Grid Computing

    PubMed Central

    Cladé, Thierry; Snyder, Joshua C.

    2010-01-01

    Clinical trials which use imaging typically require data management and workflow integration across several parties. We identify opportunities for all parties involved to realize benefits with a modular interoperability model based on service-oriented architecture and grid computing principles. We discuss middleware products for implementation of this model, and propose caGrid as an ideal candidate due to its healthcare focus; free, open source license; and mature developer tools and support. PMID:20449775

  8. Resource Utilization and Costs during the Initial Years of Lung Cancer Screening with Computed Tomography in Canada

    PubMed Central

    Lam, Stephen; Tammemagi, Martin C.; Evans, William K.; Leighl, Natasha B.; Regier, Dean A.; Bolbocean, Corneliu; Shepherd, Frances A.; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R.; Mayo, John R.; McWilliams, Annette; Couture, Christian; English, John C.; Goffin, John; Hwang, David M.; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J.; Goss, Glenwood D.; Nicholas, Garth; Seely, Jean M.; Sekhon, Harmanjatinder S.; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N.; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D.; Tan, Wan C.; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J.

    2014-01-01

    Background: It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Methods: Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer's perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. Results: The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400–$505) for the initial 18 months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553–$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone ($47,792; 95% CI, $43,254–$52,200; p = 0.061). Conclusion: In the Pan-Canadian study, the average cost to screen individuals at high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer, which infrequently results in a cure. PMID:25105438

  9. Improved operating scenarios of the DIII-D tokamak as a result of the addition of UNIX computer systems

    SciTech Connect

    Henline, P.A.

    1995-10-01

    The increased use of UNIX based computer systems for machine control, data handling and analysis has greatly enhanced the operating scenarios and operating efficiency of the DIII-D tokamak. This paper describes some of these UNIX systems and their specific uses. These include the plasma control system, the electron cyclotron heating control system, the analysis of electron temperature and density measurements, and the general data acquisition system (which is collecting over 130 Mbytes of data). The speed and total capability of these systems has dramatically affected the ability to operate DIII-D. The improved operating scenarios include better plasma shape control, due to the more thorough MHD calculations done between shots, and the new ability to see the time dependence of profile data as it relates across different spatial locations in the tokamak. Other analyses that engender improved operating abilities are also described.

  10. Mechanistic and computational studies of the atom transfer radical addition of CCl4 to styrene catalyzed by copper homoscorpionate complexes.

    PubMed

    Muñoz-Molina, José María; Sameera, W M C; Álvarez, Eleuterio; Maseras, Feliu; Belderrain, Tomás R; Pérez, Pedro J

    2011-03-21

    Experimental as well as theoretical studies have been carried out with the aim of elucidating the mechanism of the atom transfer radical addition (ATRA) of styrene and carbon tetrachloride with a Tp(x)Cu(NCMe) complex as the catalyst precursor (Tp(x) = hydrotrispyrazolylborate ligand). The studies shown herein demonstrate the effect of different variables on the kinetic behavior. A mechanistic proposal consistent with the theoretical and experimental data is presented.

  11. Earth Resources

    ERIC Educational Resources Information Center

    Brewer, Tom

    1970-01-01

    Reviews some of the more concerted, large-scale efforts in the earth resources area in order to help the computer community obtain insights into the activities it can jointly participate in with the earth resources community. (Author)

  12. Studying the Earth's Environment from Space: Computer Laboratory Exercises and Instructor Resources

    NASA Technical Reports Server (NTRS)

    Smith, Elizabeth A.; Alfultis, Michael

    1998-01-01

    Studying the Earth's Environment From Space is a two-year project to develop a suite of CD-ROMs containing Earth System Science curriculum modules for introductory undergraduate science classes. Lecture notes, slides, and computer laboratory exercises, including actual satellite data and software, are being developed in close collaboration with Carla Evans of the NASA GSFC Earth Sciences Directorate Scientific and Educational Endeavors (SEE) project. Smith and Alfultis are responsible for the Oceanography and Sea Ice Processes Modules. The GSFC SEE project is responsible for the Ozone and Land Vegetation Modules. This document constitutes a report on the first year of activities of Smith and Alfultis' project.

  13. Computer image analysis: an additional tool for the identification of processed poultry and mammal protein containing bones.

    PubMed

    Pinotti, L; Fearn, T; Gulalp, S; Campagnoli, A; Ottoboni, M; Baldi, A; Cheli, F; Savoini, G; Dell'Orto, V

    2013-01-01

    The aims of this study were (1) to evaluate the potential of image analysis measurements, in combination with the official analytical methods for the detection of constituents of animal origin in feedstuffs, to distinguish between poultry and mammals; and (2) to identify possible markers that can be used in routine analysis. For this purpose, 14 mammal and seven poultry samples, comprising a total of 1081 bone fragment lacunae, were analysed by combining the microscopic methods with computer image analysis. The distributions of 30 different measured size and shape variables of the bone lacunae were studied both within and between the two zoological classes. In all cases a considerable overlap between classes meant that classification of individual lacunae was problematic, though a clear separation in the means did allow successful classification of samples on the basis of averages. The variables most useful for classification were those related to size, such as lacuna area. The approach shows considerable promise but will need further study using a larger number of samples with a wider range.
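    The sample-level strategy can be imitated with invented measurements: individual lacunae overlap between the classes, but per-sample means separate cleanly. The distributions and threshold below are hypothetical, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Lacuna areas (um^2) for one sample of each class; values are invented.
    poultry_sample = rng.normal(38.0, 12.0, 60)
    mammal_sample  = rng.normal(55.0, 14.0, 60)

    THRESHOLD = 46.0  # hypothetical decision boundary on the sample mean

    def classify(sample_areas):
        """Classify a whole sample by its average lacuna size."""
        return "poultry" if sample_areas.mean() < THRESHOLD else "mammal"

    print(classify(poultry_sample), classify(mammal_sample))
    ```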

  14. Additional value of computer assisted semen analysis (CASA) compared to conventional motility assessments in pig artificial insemination.

    PubMed

    Broekhuijse, M L W J; Soštarić, E; Feitsma, H; Gadella, B M

    2011-11-01

    In order to obtain a more standardised semen motility evaluation, Varkens KI Nederland has introduced a computer assisted semen analysis (CASA) system in all its pig AI laboratories. The repeatability of CASA was enhanced by standardising: 1) an optimal sample temperature (39 °C); 2) an optimal dilution factor; 3) optimal mixing of semen and dilution buffer by using mechanical mixing; 4) the slide chamber depth; and, together with the previous points, 5) the training of technicians working with the CASA system; and 6) the use of a standard operating procedure (SOP). Once laboratory technicians were trained in using this SOP, they achieved a coefficient of variation of < 5%, superior to the variation found when the SOP was not strictly followed. Microscopic semen motility assessments by eye were subjective and not comparable to the data obtained by standardised CASA. CASA results are preferable because accurate continuous motility data are generated, rather than the 10% increments used in visual motility estimation by laboratory technicians. The higher variability of sperm motility found with CASA and the continuous motility values allow better analysis of the relationship between semen motility characteristics and fertilising capacity. The benefits of standardised CASA for AI are discussed, both with respect to estimating the correct dilution factor of the ejaculate for the production of artificial insemination (AI) doses (critical for reducing the number of sperm per AI dose) and with respect to obtaining more reliable fertility data from these doses.

  15. Computational mechanics for geosystems management to support the energy and natural resources mission.

    SciTech Connect

    Stone, Charles Michael

    2010-07-01

    U.S. energy needs - minimizing climate change, mining and extraction technologies, safe waste disposal - require the ability to simulate, model, and predict the behavior of subsurface systems. The authors propose development of a coupled thermal, hydrological, mechanical, chemical (THMC) modeling capability for massively parallel applications that can address these critical needs. The goal and expected outcome of this research is a state-of-the-art, extensible simulation capability, based upon SIERRA Mechanics, to address multiphase, multicomponent reactive transport coupled to nonlinear geomechanics in heterogeneous (geologic) porous materials. The THMC code provides a platform for integrating research in numerical mathematics and algorithms for chemically reactive multiphase systems with computer science research in adaptive coupled solution control and framework architecture.

  16. SuperB R&D computing program: HTTP direct access to distributed resources

    NASA Astrophysics Data System (ADS)

    Fella, A.; Bianchi, F.; Ciaschini, V.; Corvo, M.; Delprete, D.; Diacono, D.; Di Simone, A.; Franchini, P.; Donvito, G.; Giacomini, F.; Gianoli, A.; Longo, S.; Luitz, S.; Luppi, E.; Manzali, M.; Pardi, S.; Perez, A.; Rama, M.; Russo, G.; Santeramo, B.; Stroili, R.; Tomassetti, L.

    2012-12-01

    The SuperB asymmetric energy e+e- collider and detector to be built at the newly founded Nicola Cabibbo Lab will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab^-1 and a luminosity target of 10^36 cm^-2 s^-1. Increasing network performance, including in the Wide Area Network (WAN) environment, and the capability to read data remotely with good efficiency are providing new possibilities and opening new scenarios in the data access field. Subjects like data access and data availability in a distributed environment are key points in the definition of the computing model for an HEP experiment like SuperB. R&D efforts in this field have been carried out during the last year in preparation for the release of the Computing Technical Design Report in 2013. WAN direct access to data has been identified as one of the more interesting viable options; robust and reliable protocols such as HTTP/WebDAV and xrootd are the subjects of a specific R&D line in a mid-term scenario. In this work we present the R&D results obtained in the study of new data access technologies for typical HEP use cases, focusing on protocols such as HTTP and WebDAV in Wide Area Network scenarios. Efficiency, performance, and reliability tests performed in a data analysis context are reported. Future R&D plans include comparison tests of the HTTP and xrootd protocols in terms of performance, efficiency, security, and available features.
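    At its simplest, the WAN direct-access pattern under study reduces to range reads against a remote file in place of a full local copy. The sketch below uses a placeholder URL and the generic HTTP Range mechanism, not any SuperB-specific endpoint.

    ```python
    import requests

    url = "https://example.org/data/events.root"  # placeholder remote file

    def read_range(url, start, length):
        """Fetch bytes [start, start+length) via an HTTP Range request."""
        headers = {"Range": f"bytes={start}-{start + length - 1}"}
        resp = requests.get(url, headers=headers, timeout=30)
        resp.raise_for_status()  # expect 206 Partial Content
        return resp.content

    chunk = read_range(url, start=0, length=4096)
    print(len(chunk), "bytes read remotely")
    ```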

  17. The Need for Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Bernier, David

    2011-01-01

    Broadening computer science learning to include more students is a crucial item on the United States' education agenda, these authors say. Although policymakers advocate more computer science expertise, computer science offerings in high schools are few--and actually shrinking. In addition, poorly resourced schools with a high percentage of…

  18. Evolutionary Computation with Spatial Receding Horizon Control to Minimize Network Coding Resources

    PubMed Central

    Leeson, Mark S.

    2014-01-01

    The minimization of network coding resources, such as coding nodes and links, is a challenging task, not only because it is an NP-hard problem, but also because the problem scale is huge; networks in the real world may have thousands or even millions of nodes and links. Genetic algorithms (GAs) have good potential for resolving NP-hard problems like the network coding problem (NCP), but, as population-based algorithms, they often confront serious scalability and applicability problems when applied to large- or huge-scale systems. Inspired by temporal receding horizon control in control engineering, this paper proposes a novel spatial receding horizon control (SRHC) strategy as a network partitioning technology, and then designs an efficient GA to tackle the NCP. Traditional network partitioning methods can be viewed as a special case of the proposed SRHC, that is, one-step-wide SRHC, whilst the method in this paper is a generalized N-step-wide SRHC, which can make better use of global information about network topologies. Besides the SRHC strategy, some other useful designs are also reported in this paper. The advantages of the proposed SRHC and GA for the NCP are illustrated by extensive experiments, and they have good potential to be extended to other large-scale complex problems. PMID:24883371

  19. [Use of computer applications to support clinical processes. An electronic letter of discharge as resource for DRG-relevant coding].

    PubMed

    Reng, Carl-Michael; Tege, Birgit; Reicherzer, Hans-Georg; Nass, Gertrud; Schacherer, Doris; Boerner, Wolfgang; Schölmerich, Jürgen

    2004-09-15

    Not long ago, forensic considerations were the major driving force behind complete and correct medical documentation. Now that diagnosis-related groups form the basis for billing hospital treatment in Germany, the need for complex administrative documentation has grown enormously, overruling all medical documentation needs. Because personnel resources in hospitals are unchanged and comfortable tools supporting these documentation needs are lacking, medical personnel in most hospitals are confronted with an annoying effort to fulfill them. As a result, innumerable hours of working time have to be shifted from patient-focused to administrative work. This paper introduces a computer system, based on a hospital-wide documentation concept and clinical workflow, that allows administrative data to be derived directly from medical documents, so that redundant documentation and discrepancies between medical and administrative documentation can be avoided.

  20. Computing the compliance of physician drug orders with guidelines using an OWL2 reasoner and standard drug resources.

    PubMed

    Noussa Yao, Joseph; Séroussi, Brigitte; Bouaud, Jacques

    2011-01-01

    Assessing the conformity of a physician's prescription to a given recommended prescription is not obvious since both prescriptions are expressed at different levels of abstraction and may concern only a subpart of the whole order. Recent formalisms (OWL2) and tools (reasoners) from the semantic web technologies are becoming available to represent defined concepts and to handle classification services. We propose a generic framework based on such technologies, using available standardized drug resources, to compute the compliance of a given drug order to a recommended prescription, such that the subsumption relationship yields the conformity relationship between the order and the recommendation. The ATC drug classification has been used as a local ontology. The method has been successfully implemented for arterial hypertension management for which we had a sample of antihypertensive orders. However, supplemental standardized drug knowledge is needed to correctly compare drug orders to recommended orders.
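
    Because ATC codes are hierarchical by prefix, the subsumption test at the heart of this method can be mimicked without an OWL2 reasoner. The toy check below is only a stand-in for the paper's approach: it treats prefix containment as subsumption and ignores dose, route and combination products, which is exactly the kind of supplemental drug knowledge the authors note is still needed.

        # ATC codes are hierarchical by prefix: C09 (renin-angiotensin agents)
        # subsumes C09A (ACE inhibitors, plain), which subsumes C09AA05
        # (ramipril). An order conforms if its code is subsumed by the
        # recommended class.

        def subsumes(recommended_class, order_code):
            """True if `order_code` equals or descends from
            `recommended_class` in the ATC hierarchy."""
            return order_code.startswith(recommended_class)

        def compliant(order_codes, recommended_classes):
            """Every recommended class must be covered by at least one order."""
            return all(any(subsumes(rec, code) for code in order_codes)
                       for rec in recommended_classes)

        # Guideline: an ACE inhibitor (C09A) plus a thiazide diuretic (C03A).
        order = ["C09AA05", "C03AA03"]             # ramipril + hydrochlorothiazide
        print(compliant(order, ["C09A", "C03A"]))  # True
        print(compliant(["C07AB02"], ["C09A"]))    # False: metoprolol is a beta blocker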

  1. Water resources climate change projections using supervised nonlinear and multivariate soft computing techniques

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Johnson, Fiona; Mehrotra, Raj; Sharma, Ashish

    2016-05-01

    Accurate projection of global warming on the probabilistic behavior of hydro-climate variables is one of the main challenges in climate change impact assessment studies. Due to the complexity of climate-associated processes, different sources of uncertainty influence the projected behavior of hydro-climate variables in regression-based statistical downscaling procedures. The current study presents a comprehensive methodology to improve the predictive power of the procedure to provide improved projections. It does this by minimizing the uncertainty sources arising from the high dimensionality of atmospheric predictors, the complex and nonlinear relationships between hydro-climate predictands and atmospheric predictors, as well as the biases that exist in climate model simulations. To address the impact of the high-dimensional feature spaces, a supervised nonlinear dimensionality reduction algorithm is presented that is able to capture the nonlinear variability among predictors by extracting a sequence of principal components that have maximal dependency with the target hydro-climate variables. Two soft-computing nonlinear machine-learning methods, Support Vector Regression (SVR) and Relevance Vector Machine (RVM), are engaged to capture the nonlinear relationships between predictand and atmospheric predictors. To correct the spatial and temporal biases over multiple time scales in the GCM predictands, the Multivariate Recursive Nesting Bias Correction (MRNBC) approach is used. The results demonstrate that this combined approach significantly improves the downscaling procedure in terms of precipitation projection.
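
    A minimal sketch of the regression step is given below, assuming scikit-learn and synthetic data. Plain PCA stands in for the paper's supervised nonlinear dimensionality reduction (which maximizes dependency with the predictand), and no bias correction is applied; only the predictor-reduction-plus-SVR pattern is illustrated.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        # Synthetic stand-ins: 500 days x 60 gridded atmospheric predictors,
        # and one station-scale predictand (e.g., daily precipitation).
        X = rng.normal(size=(500, 60))
        y = np.maximum(0.0, X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=500))

        # PCA here is only an unsupervised placeholder for the paper's
        # supervised nonlinear reduction step.
        model = make_pipeline(StandardScaler(), PCA(n_components=10),
                              SVR(kernel="rbf", C=10.0, epsilon=0.1))
        model.fit(X[:400], y[:400])
        print("held-out R^2: %.2f" % model.score(X[400:], y[400:]))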

  2. 1,4-Addition of bis(iodozincio)methane to α,β-unsaturated ketones: chemical and theoretical/computational studies.

    PubMed

    Sada, Mutsumi; Furuyama, Taniyuki; Komagawa, Shinsuke; Uchiyama, Masanobu; Matsubara, Seijiro

    2010-09-10

    1,4-Addition of bis(iodozincio)methane to simple α,β-unsaturated ketones does not proceed well; the reaction is slightly endothermic according to DFT calculations. In the presence of chlorotrimethylsilane, the reaction proceeded efficiently to afford a silyl enol ether of β-zinciomethyl ketone. The C-Zn bond of the silyl enol ether could be used in a cross-coupling reaction to form another C-C bond in a one-pot reaction. In contrast, 1,4-addition of the dizinc reagent to enones carrying an acyloxy group proceeded very efficiently without any additive. In this case, the product was a 1,3-diketone, which was generated in a novel tandem reaction. A theoretical/computational study indicates that the whole reaction pathway is exothermic, and that two zinc atoms of bis(iodozincio)methane accelerate each step cooperatively as effective Lewis acids. PMID:20645344

  3. The EGI-Engage EPOS Competence Center - Interoperating heterogeneous AAI mechanisms and Orchestrating distributed computational resources

    NASA Astrophysics Data System (ADS)

    Bailo, Daniele; Scardaci, Diego; Spinuso, Alessandro; Sterzel, Mariusz; Schwichtenberg, Horst; Gemuend, Andre

    2016-04-01

    manage the use of the subsurface of the Earth. EPOS started its Implementation Phase in October 2015 and is now actively working to integrate multidisciplinary data into a single e-infrastructure. Multidisciplinary data are organized and governed by the Thematic Core Services (TCS) - European-wide organizations and e-infrastructures providing community-specific data and data products - and are driven by various scientific communities encompassing a wide spectrum of Earth science disciplines. TCS data, data products and services will be integrated into the Integrated Core Services (ICS) system, which will ensure their interoperability and access to these services by the scientific community as well as other users within society. The EPOS Competence Center (EPOS CC) goal is to tackle two of the main challenges that the ICS are going to face in the near future, by taking advantage of the technical solutions provided by EGI. To do this, we will present the two pilot use cases the EGI-EPOS CC is developing: 1) The AAI pilot, dealing with the provision of transparent and homogeneous access to the ICS infrastructure for users owning different kinds of credentials (e.g. eduGAIN, OpenID Connect, X.509 certificates etc.). Here the focus is on the mechanisms which allow credential delegation. 2) The computational pilot, improving the back-end services of an existing application in the field of computational seismology, developed in the context of the EC-funded project VERCE. The application allows the processing and comparison of data resulting from the simulation of seismic wave propagation following a real earthquake and real measurements recorded by seismographs. While the simulation data is produced directly by the users and stored in a Data Management System, the observations need to be pre-staged from institutional data services, which are maintained by the community itself. This use case aims at exploiting the EGI FedCloud e-infrastructure for Data

  5. Enhanced computational prediction of polyethylene wear in hip joints by incorporating cross-shear and contact pressure in addition to load and sliding distance: effect of head diameter.

    PubMed

    Kang, Lu; Galvin, Alison L; Fisher, John; Jin, Zhongmin

    2009-05-11

    A new definition of the experimental wear factor was established and reported as a function of cross-shear motion and contact pressure, using a multi-directional pin-on-plate wear testing machine for conventional polyethylene in the present study. An independent computational wear model was developed by incorporating the cross-shear motion and contact pressure-dependent wear factor into Archard's law, in addition to load and sliding distance. The computational prediction of wear volume was directly compared with simulator testing of a polyethylene hip joint with a 28 mm diameter head. The effect of increasing the femoral head size was subsequently considered and was shown to increase wear, as a result of increased sliding distance and reduced contact pressure. PMID:19261286
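
    The underlying bookkeeping is Archard's law, V = k F s (load F, sliding distance s), with the constant k replaced by a wear factor k(CS, p) depending on cross-shear ratio CS and contact pressure p. The sketch below shows that bookkeeping with an invented functional form for k; the real wear factor is tabulated from the pin-on-plate experiments.

        # Placeholder k(CS, p) in mm^3/(N m): grows with cross-shear, falls
        # with contact pressure, matching the qualitative trends reported.
        def wear_factor(cross_shear, pressure_mpa):
            return 1.0e-6 * cross_shear / (1.0 + 0.1 * pressure_mpa)

        def wear_volume(elements, cycles):
            """Sum k(CS,p) * F * s over contact elements, scaled by gait cycles."""
            return cycles * sum(wear_factor(cs, p) * load * sliding
                                for cs, p, load, sliding in elements)

        # Three illustrative contact patch elements for one gait cycle:
        # (cross-shear ratio, pressure [MPa], load [N], sliding distance [m])
        patch = [(0.2, 8.0, 800.0, 0.02),
                 (0.05, 12.0, 1200.0, 0.015),
                 (0.4, 5.0, 400.0, 0.025)]
        print("wear after 1e6 cycles: %.2f mm^3" % wear_volume(patch, 1e6))

    A larger head in this picture increases the per-cycle sliding distance and lowers the contact pressure, both of which raise the predicted wear, consistent with the abstract's conclusion.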

  6. The role of additional computed tomography in the decision-making process on the secondary prevention in patients after systemic cerebral thrombolysis

    PubMed Central

    Sobolewski, Piotr; Kozera, Grzegorz; Szczuchniak, Wiktor; Nyka, Walenty M

    2016-01-01

    Introduction Patients with ischemic stroke undergoing intravenous (iv)-thrombolysis are routinely controlled with computed tomography on the second day to assess stroke evolution and hemorrhagic transformation (HT). However, the benefits of an additional computed tomography (aCT) performed over the next days after iv-thrombolysis have not been determined. Methods We retrospectively screened 287 Caucasian patients with ischemic stroke who were consecutively treated with iv-thrombolysis from 2008 to 2012. The results of computed tomography performed on the second (control computed tomography) and seventh (aCT) day after iv-thrombolysis were compared in 274 patients (95.5%); 13 subjects (4.5%), who died before the seventh day from admission, were excluded from the analysis. Results aCTs revealed a higher incidence of HT than control computed tomographies (14.2% vs 6.6%; P=0.003). Patients with HT on aCT showed a higher median National Institutes of Health Stroke Scale score on admission than those without HT (13.0 vs 10.0; P=0.01) and a higher prevalence of ischemic changes >1/3 of the middle cerebral artery territory (66.7% vs 35.2%; P<0.01). The presence of HT on aCT correlated with the National Institutes of Health Stroke Scale score on admission (rpbi 0.15; P<0.01) and with ischemic changes >1/3 of the middle cerebral artery territory (phi=0.03), and was associated with 3-month mortality (phi=0.03). Conclusion aCT after iv-thrombolysis enables higher detection of HT, which is related to higher 3-month mortality. Thus, patients with severe middle cerebral artery infarction may benefit from aCT in the decision-making process on secondary prophylaxis. PMID:26730196

  7. Optimisation of the usage of LHC and local computing resources in a multidisciplinary physics department hosting a WLCG Tier-2 centre

    NASA Astrophysics Data System (ADS)

    Barberis, Stefano; Carminati, Leonardo; Leveraro, Franco; Mazza, Simone Michele; Perini, Laura; Perlz, Francesco; Rebatto, David; Tura, Ruggero; Vaccarossa, Luca; Villaplana, Miguel

    2015-12-01

    We present the approach of the University of Milan Physics Department and the local unit of INFN to allow and encourage the sharing among different research areas of computing, storage and networking resources (the largest ones being those composing the Milan WLCG Tier-2 centre and tailored to the needs of the ATLAS experiment). Computing resources are organised as independent HTCondor pools, with a global master in charge of monitoring them and optimising their usage. The configuration has to provide satisfactory throughput for both serial and parallel (multicore, MPI) jobs. A combination of local, remote and cloud storage options are available. The experience of users from different research areas operating on this shared infrastructure is discussed. The promising direction of improving scientific computing throughput by federating access to distributed computing and storage also seems to fit very well with the objectives listed in the European Horizon 2020 framework for research and development.

  8. Impact of remote sensing upon the planning, management and development of water resources. Summary of computers and computer growth trends for hydrologic modeling and the input of ERTS image data processing load

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Loats, H. L., Jr.

    1975-01-01

    An analysis of current computer usage by major water resources users was made to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era. The analysis shows a significant impact due to the utilization and processing of ERTS CCT data.

  9. Student use of computer tools designed to scaffold scientific problem-solving with hypermedia resources: A case study

    NASA Astrophysics Data System (ADS)

    Oliver, Kevin Matthew

    National science standards call for increasing student exposure to inquiry and real-world problem solving. Students can benefit from open-ended learning environments that stress the engagement of real problems and the development of thinking skills and processes. The Internet is an ideal resource for context-bound problems with its seemingly endless supply of resources. Problems may arise, however, since young students are cognitively ill-prepared to manage open-ended learning and may have difficulty processing hypermedia. Computer tools were used in a qualitative case study with 12 eighth graders to determine how such tools might support the process of solving open-ended problems. A preliminary study proposition suggested students would solve open-ended problems more appropriately if they used tools in a manner consistent with higher-order critical and creative thinking. Three research questions sought to identify: how students used tools, the nature of science learning in open-ended environments, and any personal or environmental barriers affecting problem solving. The findings were mixed. The participants did not typically use the tools and resources effectively. They successfully collected basic information, but infrequently organized, evaluated, generated, and justified their ideas. While the students understood how to use most tools procedurally, they lacked strategic understanding for why tool use was necessary. Students scored average to high on assessments of general content understanding, but developed artifacts suggesting their understanding of specific micro problems was naive and rife with misconceptions. Process understanding was also inconsistent, with some students describing basic problem solving processes, but most students unable to describe how tools could support open-ended inquiry. Barriers to effective problem solving were identified in the study. Personal barriers included naive epistemologies, while environmental barriers included a

  10. Linear Equations and Rap Battles: How Students in a Wired Classroom Utilized the Computer as a Resource to Coordinate Personal and Mathematical Positional Identities in Hybrid Spaces

    ERIC Educational Resources Information Center

    Langer-Osuna, Jennifer

    2015-01-01

    This paper draws on the constructs of hybridity, figured worlds, and cultural capital to examine how a group of African-American students in a technology-driven, project-based algebra classroom utilized the computer as a resource to coordinate personal and mathematical positional identities during group work. Analyses of several vignettes of small…

  11. Communication, Control, and Computer Access for Disabled and Elderly Individuals. ResourceBook 4: Update to Books 1, 2, and 3.

    ERIC Educational Resources Information Center

    Borden, Peter A., Ed.; Vanderheiden, Gregg C., Ed.

    This update to the three-volume first edition of the "Rehab/Education ResourceBook Series" describes special software and products pertaining to communication, control, and computer access, designed specifically for the needs of disabled and elderly people. The 22 chapters cover: speech aids; pointing and typing aids; training and communication…

  12. Technology in the Curriculum. Resource Guide 1988 Update. A Guide to the Instructional Use of Computers and Video in K-12.

    ERIC Educational Resources Information Center

    1988

    This resource guide lists 47 computer software and 29 instructional video programs recommended for use in grades K through 12 to help teachers achieve the learning objectives set forth by their school districts and the State of California. Programs are organized in six curriculum areas: foreign language, history-social science, language arts,…

  13. Computational Resources for GTL

    SciTech Connect

    Herbert M. Sauro

    2007-12-18

    This final report summarizes the work conducted under our three-year DOE GTL grant ($459,402). The work involved a number of areas, including standardization, the Systems Biology Workbench, visual editors, collaboration with other groups, and the development of new theory and algorithms. Our work has played a key part in helping to further develop SBML, the de facto standard for systems biology model exchange, and SBGN, the developing standard for the visual representation of biochemical models. Our work has also made significant contributions to developing SBW, the Systems Biology Workbench, which is now very widely used in the community (roughly 30 downloads per day for the last three years, which equates to about 30,000 downloads in total). We have also used the DOE funding to collaborate extensively with nine different groups around the world. Finally, we have developed new methods to reduce model size which are now used by all the major simulation packages, including Matlab. All in all, we consider the last three years to be highly productive and influential in the systems biology community. The project resulted in 16 peer-reviewed publications.

  14. Resource-Efficient, Hierarchical Auto-Tuning of a Hybrid Lattice Boltzmann Computation on the Cray XT4

    SciTech Connect

    Computational Research Division, Lawrence Berkeley National Laboratory; NERSC, Lawrence Berkeley National Laboratory; Computer Science Department, University of California, Berkeley; Williams, Samuel; Carter, Jonathan; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2009-05-04

    We apply auto-tuning to a hybrid MPI-pthreads lattice Boltzmann computation running on the Cray XT4 at National Energy Research Scientific Computing Center (NERSC). Previous work showed that multicore-specific auto-tuning can improve the performance of lattice Boltzmann magnetohydrodynamics (LBMHD) by a factor of 4x when running on dual- and quad-core Opteron dual-socket SMPs. We extend these studies to the distributed memory arena via a hybrid MPI/pthreads implementation. In addition to conventional auto-tuning at the local SMP node, we tune at the message-passing level to determine the optimal aspect ratio as well as the correct balance between MPI tasks and threads per MPI task. Our study presents a detailed performance analysis when moving along an isocurve of constant hardware usage: fixed total memory, total cores, and total nodes. Overall, our work points to approaches for improving intra- and inter-node efficiency on large-scale multicore systems for demanding scientific applications.
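
    The message-passing-level tuning amounts to searching the (MPI tasks) x (threads per task) splits of a fixed core count and keeping the fastest. A minimal sketch, with a fake timing model standing in for real LBMHD benchmark runs and a hypothetical launcher command:

        CORES_PER_NODE, NODES = 8, 16      # fixed hardware budget (assumed)
        TOTAL = CORES_PER_NODE * NODES

        def run_benchmark(mpi_tasks, threads):
            """Stand-in for one timed run. A real harness would launch e.g.
            ['mpirun', '-np', str(mpi_tasks), './lbmhd', '--threads',
            str(threads)] and parse the reported wall-clock time; here a fake
            timing model keeps the search loop self-contained."""
            return mpi_tasks ** 0.5 + 100.0 / threads

        # Enumerate every (tasks, threads) split of the same core count and
        # keep the fastest -- the message-passing-level half of the tuning.
        candidates = [(TOTAL // t, t) for t in range(1, CORES_PER_NODE + 1)
                      if TOTAL % t == 0]
        best = min(candidates, key=lambda c: run_benchmark(*c))
        print("best split: %d MPI tasks x %d threads/task" % best)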

  15. Adaptive Computing.

    ERIC Educational Resources Information Center

    Harrell, William

    1999-01-01

    Provides information on various adaptive technology resources available to people with disabilities. (Contains 19 references, an annotated list of 129 websites, and 12 additional print resources.) (JOW)

  16. Comparison of computer-based and manual coal resource estimation methods for the Cache coal bed, Recluse Geologic Model Area, Wyoming

    USGS Publications Warehouse

    Schneider, Gary B.; Crowley, Sharon S.; Carey, Mary Alice

    1984-01-01

    Coal resources have been estimated, using both manual and computer methods, for the Cache coal bed in the Recluse Geologic Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7.5-minute quadrangles in Campbell County, Wyoming. Approximately 300 coal thickness measurements from drill-hole logs are distributed throughout the area. The Cache coal bed and associated strata are in the Paleocene Tongue River Member of the Fort Union Formation. The depth to the Cache coal bed ranges from 269 to 1,257 feet. The coal bed is as much as 31 feet thick but is absent in places. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources estimated by hand show the bed to contain 2,228 million short tons, or about 2.6 percent more than the computer-calculated figure of 2,169 million short tons.
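
    Both estimates rest on the same tonnage arithmetic: short tons = area (acres) x average bed thickness (feet) x a rank-dependent tons-per-acre-foot factor (about 1,770 for subbituminous coal in USGS practice). The sketch below applies it to an invented isopach grid; the grid values and cell size are illustrative, not the report's data.

        CELL_ACRES = 40.0            # one grid cell in a computer model (assumed)
        TONS_PER_ACRE_FT = 1770.0    # subbituminous conversion factor

        def resources_short_tons(thickness_grid_ft):
            """Sum cell tonnage over an isopach grid; zero = coal bed absent."""
            return sum(t * CELL_ACRES * TONS_PER_ACRE_FT
                       for row in thickness_grid_ft for t in row if t > 0)

        grid = [[0.0, 12.5, 18.0],
                [9.0, 31.0, 22.5],   # 31 ft is the report's maximum thickness
                [0.0, 14.0, 25.0]]
        tons = resources_short_tons(grid)
        print("estimate: %.1f million short tons" % (tons / 1e6))
        # The report's manual vs computer totals, 2,228 vs 2,169 million
        # short tons, differ by the quoted ~2.6 percent.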

  17. Synthesis of Bridged Heterocycles via Sequential 1,4- and 1,2-Addition Reactions to α,β-Unsaturated N-Acyliminium Ions: Mechanistic and Computational Studies.

    PubMed

    Yazici, Arife; Wille, Uta; Pyne, Stephen G

    2016-02-19

    Novel tricyclic bridged heterocyclic systems can be readily prepared from sequential 1,4- and 1,2-addition reactions of allyl and 3-substituted allylsilanes to indolizidine and quinolizidine α,β-unsaturated N-acyliminium ions. These reactions involve a novel N-assisted, transannular 1,5-hydride shift. Such a mechanism was supported by examining the reaction of a dideuterated indolizidine, α,β-unsaturated N-acyliminium ion precursor, which provided specifically dideuterated tricyclic bridged heterocyclic products, and from computational studies. In contrast, the corresponding pyrrolo[1,2-a]azepine system did not provide the corresponding tricyclic bridged heterocyclic product and gave only a bis-allyl adduct, while more substituted versions gave novel furo[3,2-d]pyrrolo[1,2-a]azepine products. Such heterocyclic systems would be expected to be useful scaffolds for the preparation of libraries of novel compounds for new drug discovery programs. PMID:26816207

  19. An economic analysis for optimal distributed computing resources for mask synthesis and tape-out in production environment

    NASA Astrophysics Data System (ADS)

    Cork, Chris; Lugg, Robert; Chacko, Manoj; Levi, Shimon

    2005-06-01

    With the exponential increase in output database size due to the aggressive optical proximity correction (OPC) and resolution enhancement techniques (RET) required for deep sub-wavelength process nodes, the CPU time required for mask tape-out continues to increase significantly. For integrated device manufacturers (IDMs), this can impact the time-to-market for their products, where even a few days' delay could have a huge commercial impact and loss of market window opportunity. For foundries, a shorter turnaround time provides a competitive advantage in their demanding market: too slow could mean customers looking elsewhere for these services, while a fast turnaround may even command a higher price. With FAB turnaround of a mature, plain-vanilla CMOS process at around 20-30 days, a delay of several days in mask tape-out would contribute a significant fraction of the total time to deliver prototypes. Unlike silicon processing, mask tape-out time can be decreased by simply purchasing extra computing resources and software licenses. Mask tape-out groups are taking advantage of the ever-decreasing cost and increasing power of commodity processors. The significant distributability inherent in some commercial mask synthesis software can be leveraged to address this critical business issue. Different implementations have different fractions of code that cannot be parallelized, and this affects the efficiency with which they scale, as described by Amdahl's law. Very few are efficient enough to allow the effective use of thousands of processors, enabling run times to drop from days to only minutes. What follows is a cost-aware methodology to quantify the scalability of this class of software and thus act as a guide to estimating the optimal investment in terms of hardware and software licenses.
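
    The scalability argument can be made concrete with Amdahl's law: with parallel fraction p, the speedup on n CPUs is S(n) = 1 / ((1 - p) + p / n), and a cost-aware optimum trades turnaround against hardware and license spend. A sketch with invented prices and parameters:

        def speedup(n, p):
            return 1.0 / ((1.0 - p) + p / n)

        def turnaround_days(n, p, serial_days):
            return serial_days / speedup(n, p)

        P = 0.999              # fraction of the OPC/RET flow that parallelizes
        SERIAL_DAYS = 10.0     # single-CPU tape-out time (assumed)
        COST_PER_CPU = 40.0    # amortized hardware + license, $/CPU-day (assumed)
        DELAY_COST = 50000.0   # business cost of one day of delay (assumed)

        def total_cost(n):
            days = turnaround_days(n, P, SERIAL_DAYS)
            return days * DELAY_COST + n * days * COST_PER_CPU

        best = min(range(1, 5001), key=total_cost)
        print("optimal CPUs: %d -> %.2f days, $%.0f"
              % (best, turnaround_days(best, P, SERIAL_DAYS), total_cost(best)))

    Because the serial fraction (1 - p) bounds the speedup at 1/(1 - p), spending beyond the point where the delay savings no longer cover the added CPU-days is wasted, which is the quantitative core of the methodology described.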

  20. Assessment of potential additions to conventional oil and gas resources of the world (outside the United States) from reserve growth, 2012

    USGS Publications Warehouse

    Klett, Timothy R.; Cook, Troy A.; Charpentier, Ronald R.; Tennyson, Marilyn E.; Attanasi, E.D.; Freeman, Phil A.; Ryder, Robert T.; Gautier, Donald L.; Verma, Mahendra K.; Le, Phuong A.; Schenk, Christopher J.

    2012-01-01

    The U.S. Geological Survey estimated volumes of technically recoverable, conventional petroleum resources resulting from reserve growth for discovered fields outside the United States that have reported in-place oil and gas volumes of 500 million barrels of oil equivalent or greater. The mean volumes were estimated at 665 billion barrels of crude oil, 1,429 trillion cubic feet of natural gas, and 16 billion barrels of natural gas liquids. These volumes constitute a significant portion of the world's oil and gas resources.

  1. A computer software system for integration and analysis of grid-based remote sensing data with other natural resource data. Remote Sensing Project

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.

    1977-01-01

    A computer-based information system is described that is designed to assist in the integration of commonly available spatial data for regional planning and resource analysis. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single-factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. Soil, land cover/use, topographic, and geological maps were used as a data base to develop an eleven-map portfolio. The major themes of the portfolio are land cover/use, non-point water pollution, waste disposal, and ground water recharge.
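
    Multi-factor analyses of this kind are, at bottom, weighted overlays of reclassified grids. A minimal sketch with invented layers, class scores and weights:

        import numpy as np

        # Each input grid is reclassified to a common suitability scale,
        # weighted, and summed; all values below are illustrative only.
        soil = np.array([[1, 2], [3, 2]])            # soil drainage class per cell
        slope = np.array([[2.0, 9.0], [14.0, 4.0]])  # percent slope per cell

        soil_score = np.choose(soil, [0, 3, 2, 1])   # class -> suitability 0..3
        slope_score = np.where(slope < 6, 3, np.where(slope < 12, 2, 1))

        # Weighted composite, e.g. for a waste-disposal suitability theme.
        composite = 0.6 * soil_score + 0.4 * slope_score
        print(composite)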

  2. Assessment of potential additions to conventional oil and gas resources in discovered fields of the United States from reserve growth, 2012

    USGS Publications Warehouse

    ,

    2012-01-01

    The U.S. Geological Survey estimated volumes of technically recoverable, conventional petroleum resources that have the potential to be added to reserves from reserve growth in 70 discovered oil and gas accumulations of the United States, excluding Federal offshore areas. The mean estimated volumes are 32 billion barrels of crude oil, 291 trillion cubic feet of natural gas, and 10 billion barrels of natural gas liquids.

  3. Aggregating Data for Computational Toxicology Applications: The U.S. Environmental Protection Agency (EPA) Aggregated Computational Toxicology Resource (ACToR) System

    EPA Science Inventory

    Computational toxicology combines data from high-throughput test methods, chemical structure analyses and other biological domains (e.g., genes, proteins, cells, tissues) with the goals of predicting and understanding the underlying mechanistic causes of chemical toxicity and for...

  4. Final report for "High performance computing for advanced national electric power grid modeling and integration of solar generation resources", LDRD Project No. 149016.

    SciTech Connect

    Reno, Matthew J.; Riehm, Andrew Charles; Hoekstra, Robert John; Munoz-Ramirez, Karina; Stamp, Jason Edwin; Phillips, Laurence R.; Adams, Brian M.; Russo, Thomas V.; Oldfield, Ron A.; McLendon, William Clarence, III; Nelson, Jeffrey Scott; Hansen, Clifford W.; Richardson, Bryan T.; Stein, Joshua S.; Schoenwald, David Alan; Wolfenbarger, Paul R.

    2011-02-01

    Design and operation of the electric power grid (EPG) relies heavily on computational models. High-fidelity, full-order models are used to study transient phenomena on only a small part of the network. Reduced-order dynamic and power flow models are used when analyses involving thousands of nodes are required, due to the computational demands of simulating large numbers of nodes. The level of complexity of the future EPG will dramatically increase due to large-scale deployment of variable renewable generation, active load and distributed generation resources, adaptive protection and control systems, and price-responsive demand. High-fidelity modeling of this future grid will require significant advances in coupled, multi-scale tools and their use on high performance computing (HPC) platforms. This LDRD report demonstrates SNL's capability to apply HPC resources to these 3 tasks: (1) high-fidelity, large-scale modeling of power system dynamics; (2) statistical assessment of grid security via Monte Carlo simulations of cyber attacks; and (3) development of models to predict variability of solar resources at locations where little or no ground-based measurements are available.

  5. A computational study of the addition of ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone.

    PubMed

    Aniagyei, Albert; Tia, Richard; Adei, Evans

    2016-01-01

    The periselectivity and chemoselectivity of the addition of transition metal oxides of the type ReO3L (L = Cl(-), CH3, OCH3 and Cp) to ethenone have been explored at the M06 and B3LYP/LACVP* levels of theory. The activation barriers and reaction energies for the stepwise and concerted addition pathways involving multiple spin states have been computed. In the reaction of ReO3L (L = Cl(-), OCH3, CH3 and Cp) with ethenone, the concerted [2 + 2] addition of the metal oxide across the C=C and C=O double bonds to form either metalla-2-oxetane-3-one or metalla-2,4-dioxolane is kinetically favored over the formation of metalla-2,5-dioxolane-3-one from the direct [3 + 2] addition pathway. The trends in activation energies for the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are Cp < Cl(-) < OCH3 < CH3 and Cp < OCH3 < CH3 < Cl(-), and the corresponding trends in reaction energies are Cp < OCH3 < Cl(-) < CH3 and Cp < CH3 < OCH3 < Cl(-). The concerted [3 + 2] addition of the metal oxide across the C=C double bond of the ethenone to form metalla-2,5-dioxolane-3-one is thermodynamically the most favored for the ligand L = Cp. The direct [2 + 2] addition pathways leading to the formation of metalla-2-oxetane-3-one and metalla-2,4-dioxolane are thermodynamically the most favored for the ligands L = OCH3 and Cl(-). The differences between the calculated [2 + 2] activation barriers for the addition of the metal oxide LReO3 across the C=C and C=O functionalities of ethenone are small except for the cases of L = Cl(-) and OCH3. The rearrangements of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, even though feasible, are unfavorable due to the high activation energies of their rate-determining steps. For the rearrangement of metalla-2-oxetane-3-one to metalla-2,5-dioxolane-3-one, the trend in activation barriers is found to follow the order OCH3 < Cl(-) < CH3 < Cp. The trends in the activation energies for

  6. Additive Manufacturing of Single-Crystal Superalloy CMSX-4 Through Scanning Laser Epitaxy: Computational Modeling, Experimental Process Development, and Process Parameter Optimization

    NASA Astrophysics Data System (ADS)

    Basak, Amrita; Acharya, Ranadip; Das, Suman

    2016-08-01

    This paper focuses on additive manufacturing (AM) of single-crystal (SX) nickel-based superalloy CMSX-4 through scanning laser epitaxy (SLE). SLE, a powder bed fusion-based AM process was explored for the purpose of producing crack-free, dense deposits of CMSX-4 on top of similar chemistry investment-cast substrates. Optical microscopy and scanning electron microscopy (SEM) investigations revealed the presence of dendritic microstructures that consisted of fine γ' precipitates within the γ matrix in the deposit region. Computational fluid dynamics (CFD)-based process modeling, statistical design of experiments (DoE), and microstructural characterization techniques were combined to produce metallurgically bonded single-crystal deposits of more than 500 μm height in a single pass along the entire length of the substrate. A customized quantitative metallography based image analysis technique was employed for automatic extraction of various deposit quality metrics from the digital cross-sectional micrographs. The processing parameters were varied, and optimal processing windows were identified to obtain good quality deposits. The results reported here represent one of the few successes obtained in producing single-crystal epitaxial deposits through a powder bed fusion-based metal AM process and thus demonstrate the potential of SLE to repair and manufacture single-crystal hot section components of gas turbine systems from nickel-based superalloy powders.

  7. ECG-Based Detection of Early Myocardial Ischemia in a Computational Model: Impact of Additional Electrodes, Optimal Placement, and a New Feature for ST Deviation

    PubMed Central

    Loewe, Axel; Schulze, Walther H. W.; Jiang, Yuan; Wilhelms, Mathias; Luik, Armin; Dössel, Olaf; Seemann, Gunnar

    2015-01-01

    In case of chest pain, immediate diagnosis of myocardial ischemia is required to respond with an appropriate treatment. The diagnostic capability of the electrocardiogram (ECG), however, is strongly limited for ischemic events that do not lead to ST elevation. This computational study investigates the potential of different electrode setups in detecting early ischemia at 10 minutes after onset: standard 3-channel and 12-lead ECG as well as body surface potential maps (BSPMs). Further, it was assessed if an additional ECG electrode with optimized position or the right-sided Wilson leads can improve sensitivity of the standard 12-lead ECG. To this end, a simulation study was performed for 765 different locations and sizes of ischemia in the left ventricle. Improvements by adding a single, subject specifically optimized electrode were similar to those of the BSPM: 2–11% increased detection rate depending on the desired specificity. Adding right-sided Wilson leads had negligible effect. Absence of ST deviation could not be related to specific locations of the ischemic region or its transmurality. As alternative to the ST time integral as a feature of ST deviation, the K point deviation was introduced: the baseline deviation at the minimum of the ST-segment envelope signal, which increased 12-lead detection rate by 7% for a reasonable threshold. PMID:26587538

  8. ECG-Based Detection of Early Myocardial Ischemia in a Computational Model: Impact of Additional Electrodes, Optimal Placement, and a New Feature for ST Deviation.

    PubMed

    Loewe, Axel; Schulze, Walther H W; Jiang, Yuan; Wilhelms, Mathias; Luik, Armin; Dössel, Olaf; Seemann, Gunnar

    2015-01-01

    In case of chest pain, immediate diagnosis of myocardial ischemia is required to respond with an appropriate treatment. The diagnostic capability of the electrocardiogram (ECG), however, is strongly limited for ischemic events that do not lead to ST elevation. This computational study investigates the potential of different electrode setups in detecting early ischemia at 10 minutes after onset: standard 3-channel and 12-lead ECG as well as body surface potential maps (BSPMs). Further, it was assessed if an additional ECG electrode with optimized position or the right-sided Wilson leads can improve sensitivity of the standard 12-lead ECG. To this end, a simulation study was performed for 765 different locations and sizes of ischemia in the left ventricle. Improvements by adding a single, subject specifically optimized electrode were similar to those of the BSPM: 2-11% increased detection rate depending on the desired specificity. Adding right-sided Wilson leads had negligible effect. Absence of ST deviation could not be related to specific locations of the ischemic region or its transmurality. As alternative to the ST time integral as a feature of ST deviation, the K point deviation was introduced: the baseline deviation at the minimum of the ST-segment envelope signal, which increased 12-lead detection rate by 7% for a reasonable threshold.
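
    The K point feature described in the two records above can be stated compactly in code. The sketch below follows one plausible reading of the definition: baseline-correct each lead, take the lead-wise lower envelope over the ST segment, and report the deviation at its minimum. The synthetic ECG and window indices are invented.

        import numpy as np

        def k_point_deviation(ecg, baseline, st_window):
            """`ecg` is (leads, samples), `baseline` the per-lead baseline
            level (e.g., PQ-segment mean), `st_window` a (start, stop) sample
            range covering the ST segment. Returns (deviation, K point index)."""
            dev = ecg - baseline[:, None]          # baseline deviation signals
            lo, hi = st_window
            envelope = dev[:, lo:hi].min(axis=0)   # lower envelope across leads
            k = lo + int(np.argmin(envelope))      # K point sample index
            return envelope.min(), k

        # Tiny synthetic example: 3 leads, 400 samples, ST segment in 200:300,
        # with lead 1 carrying a -0.12 mV ST depression.
        rng = np.random.default_rng(1)
        ecg = rng.normal(0, 0.01, size=(3, 400))
        ecg[1, 200:300] -= 0.12
        kdev, k = k_point_deviation(ecg, baseline=np.zeros(3), st_window=(200, 300))
        print("K point at sample %d, deviation %.3f mV" % (k, kdev))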

  9. Tandem β-elimination/hetero-michael addition rearrangement of an N-alkylated pyridinium oxime to an O-alkylated pyridine oxime ether: an experimental and computational study.

    PubMed

    Picek, Igor; Vianello, Robert; Šket, Primož; Plavec, Janez; Foretić, Blaženka

    2015-02-20

    A novel OH(-)-promoted tandem reaction involving C(β)-N(+)(pyridinium) cleavage and ether C(β)-O(oxime) bond formation in aqueous media has been presented. The study fully elucidates the fascinating reaction behavior of N-benzoylethylpyridinium-4-oxime chloride in aqueous media under mild reaction conditions. The reaction journey begins with the exclusive β-elimination and formation of pyridine-4-oxime and phenyl vinyl ketone and ends with the formation of O-alkylated pyridine oxime ether. A combination of experimental and computational studies enabled the introduction of a new type of rearrangement process that involves a unique tandem reaction sequence. We showed that (E)-O-benzoylethylpyridine-4-oxime is formed in aqueous solution by a base-induced tandem β-elimination/hetero-Michael addition rearrangement of (E)-N-benzoylethylpyridinium-4-oximate, the novel synthetic route to this engaging target class of compounds. The complete mechanistic picture of this rearrangement process was presented and discussed in terms of the E1cb reaction scheme within the rate-limiting β-elimination step.

  10. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  11. Techniques for computer-aided analysis of ERTS-1 data, useful in geologic, forest and water resource surveys. [Colorado Rocky Mountains

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1974-01-01

    Forestry, geology, and water resource applications were the focus of this study, which involved the use of computer-implemented pattern-recognition techniques to analyze ERTS-1 data. The results have proven the value of computer-aided analysis techniques, even in areas of mountainous terrain. Several analysis capabilities were developed during these ERTS-1 investigations. A procedure to rotate, deskew, and geometrically scale the MSS data results in 1:24,000-scale printouts that can be directly overlaid on 7 1/2-minute U.S.G.S. topographic maps. Several scales of computer-enhanced "false color-infrared" composites of MSS data can be obtained from a digital display unit, and emphasize the tremendous detail present in the ERTS-1 data. A grid can also be superimposed on the displayed data to aid in specifying areas of interest.

  12. Development and Use of a Computer-Based Interactive Resource for Teaching and Learning Probability in Primary Classrooms

    ERIC Educational Resources Information Center

    Trigueros, Maria; Lozano, Maria Dolores; Lage, Ana Elisa

    2006-01-01

    "Enciclomedia" is a Mexican project for primary school teaching using computers in the classroom. Within this project, and following an enactivist theoretical perspective and methodology, we have designed a computer-based package called "Dados", which, together with teaching guides, is intended to support the teaching and learning of probability.…

  13. S.T.A.R.T.T. plus: addition of prehospital personnel to a national multidisciplinary crisis resource management trauma team training course

    PubMed Central

    Gillman, Lawrence M.; Martin, Doug; Engels, Paul T.; Brindley, Peter; Widder, Sandy; French, Cheryl

    2016-01-01

    Summary The Simulated Trauma and Resuscitation Team Training (S.T.A.R.T.T.) course is a unique multidisciplinary trauma team training course deliberately designed to address the common crisis resource management (CRM) skills of trauma team members. Moreover, the curriculum has been updated to also target the specific learning needs of individual participating professionals: physicians, nurses and respiratory therapists. This commentary outlines further modifications to the course curriculum in order to address the needs of a relatively undertargeted group: prehospital personnel (i.e., emergency medical services). Maintenance of high participant satisfaction, regardless of profession, suggests that the S.T.A.R.T.T. course can be readily modified to incorporate prehospital personnel without losing its utility or popularity. PMID:26574706

  14. Graduate Enrollment Increases in Science and Engineering Fields, Especially in Engineering and Computer Sciences. InfoBrief: Science Resources Statistics.

    ERIC Educational Resources Information Center

    Burrelli, Joan S.

    This brief describes graduate enrollment increases in the science and engineering fields, especially in engineering and computer sciences. Graduate student enrollment is summarized by enrollment status, citizenship, race/ethnicity, and fields. (KHR)

  15. Noise Threshold and Resource Cost of Fault-Tolerant Quantum Computing with Majorana Fermions in Hybrid Systems

    NASA Astrophysics Data System (ADS)

    Li, Ying

    2016-09-01

    Fault-tolerant quantum computing in systems composed of both Majorana fermions and topologically unprotected quantum systems, e.g. superconducting circuits or quantum dots, is studied in this paper. Errors caused by topologically unprotected quantum systems need to be corrected with error correction schemes, for instance, the surface code. We find that the error-correction performance of such a hybrid topological quantum computer is not superior to a normal quantum computer unless the topological charge of Majorana fermions is insusceptible to noise. If errors changing the topological charge are rare, the fault-tolerance threshold is much higher than the threshold of a normal quantum computer, and a surface-code logical qubit could be encoded in only tens of topological qubits instead of about a thousand normal qubits.

  16. Parallel high-performance grid computing: capabilities and opportunities of a novel demanding service and business class allowing highest resource efficiency.

    PubMed

    Kepper, Nick; Ettig, Ramona; Dickmann, Frank; Stehr, Rene; Grosveld, Frank G; Wedemann, Gero; Knoch, Tobias A

    2010-01-01

    Especially in the life-science and health-care sectors, huge IT requirements are imminent due to the large and complex systems to be analysed and simulated. Grid infrastructures play a rapidly increasing role for research, diagnostics, and treatment, since they provide the necessary large-scale resources efficiently. Whereas grids were first used for huge number crunching of trivially parallelizable problems, parallel high-performance computing is increasingly required. Here, we show for the prime example of molecular dynamics simulations how the presence of large grid clusters with very fast network interconnects within grid infrastructures now allows efficient parallel high-performance grid computing, and thus combines the benefits of dedicated supercomputing centres and grid infrastructures. The demands of this service class are the highest, since the user group has very heterogeneous requirements: i) two to many thousands of CPUs, ii) different memory architectures, iii) huge storage capabilities, and iv) fast communication via network interconnects are all needed in different combinations and must be considered in a highly dedicated manner to reach the highest performance efficiency. Beyond that, advanced and dedicated i) interaction with users, ii) management of jobs, iii) accounting, and iv) billing not only combine classic with parallel high-performance grid usage, but, more importantly, can also increase the efficiency of IT resource providers. Consequently, the mere "yes-we-can" becomes a huge opportunity for sectors such as life science and health care, as well as for grid infrastructures, by reaching a higher level of resource efficiency.

  17. Computer-aided analysis of Skylab scanner data for land use mapping, forestry and water resource applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1975-01-01

    Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.

  18. Linear equations and rap battles: how students in a wired classroom utilized the computer as a resource to coordinate personal and mathematical positional identities in hybrid spaces

    NASA Astrophysics Data System (ADS)

    Langer-Osuna, Jennifer

    2015-03-01

    This paper draws on the constructs of hybridity, figured worlds, and cultural capital to examine how a group of African-American students in a technology-driven, project-based algebra classroom utilized the computer as a resource to coordinate personal and mathematical positional identities during group work. Analyses of several vignettes of small group dynamics highlight how hybridity was established as the students engaged in multiple on-task and off-task computer-based activities, each of which drew on different lived experiences and forms of cultural capital. The paper ends with a discussion on how classrooms that make use of student-led collaborative work, and where students are afforded autonomy, have the potential to support the academic engagement of students from historically marginalized communities.

  19. Identification and Mapping of Soils, Vegetation, and Water Resources of Lynn County, Texas, by Computer Analysis of ERTS MSS Data

    NASA Technical Reports Server (NTRS)

    Baumgardner, M. F.; Kristof, S. J.; Henderson, J. A., Jr.

    1973-01-01

    Results of the analysis and interpretation of ERTS multispectral data obtained over Lynn County, Texas, are presented. The test site was chosen because it embodies a variety of problems associated with the development and management of agricultural resources in the Southern Great Plains. Lynn County is one of ten counties in a larger test site centering around Lubbock, Texas. The purpose of this study is to examine the utility of ERTS data in identifying, characterizing, and mapping soils, vegetation, and water resources in this semiarid region. Successful application of multispectral remote sensing and machine-processing techniques to arid and semiarid land-management problems will provide valuable new tools for the more than one-third of the world's lands lying in arid-semiarid regions.

  20. A Framework for Safe Composition of Heterogeneous SOA Services in a Pervasive Computing Environment with Resource Constraints

    ERIC Educational Resources Information Center

    Reyes Alamo, Jose M.

    2010-01-01

    The Service Oriented Computing (SOC) paradigm, defines services as software artifacts whose implementations are separated from their specifications. Application developers rely on services to simplify the design, reduce the development time and cost. Within the SOC paradigm, different Service Oriented Architectures (SOAs) have been developed.…

  1. Winning the Popularity Contest: Researcher Preference When Selecting Resources for Civil Engineering, Computer Science, Mathematics and Physics Dissertations

    ERIC Educational Resources Information Center

    Dotson, Daniel S.; Franks, Tina P.

    2015-01-01

    More than 53,000 citations from 609 dissertations published at The Ohio State University between 1998-2012 representing four science disciplines--civil engineering, computer science, mathematics and physics--were examined to determine what, if any, preferences or trends exist. This case study seeks to identify whether or not researcher preferences…

  2. An In-House Prototype for the Implementation of Computer-Based Extensive Reading in a Limited-Resource School

    ERIC Educational Resources Information Center

    Mayora, Carlos A.; Nieves, Idami; Ojeda, Victor

    2014-01-01

    A variety of computer-based models of Extensive Reading have emerged in the last decade. Different Information and Communication Technologies online usually support these models. However, such innovations are not feasible in contexts where the digital breach limits the access to Internet. The purpose of this paper is to report a project in which…

  3. Resource Manual

    ERIC Educational Resources Information Center

    Human Development Institute, 2008

    2008-01-01

    This manual was designed primarily for use by individuals with developmental disabilities and related conditions. The main focus of this manual is to provide easy-to-read information concerning available resources, and to provide immediate contact information for the purpose of applying for resources and/or locating additional information. The…

  4. Using Mosix for Wide-Area Computational Resources

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.

  5. Is Coronary Computed Tomography Angiography a Resource Sparing Strategy in the Risk Stratification and Evaluation of Acute Chest Pain? Results of a Randomized Controlled Trial

    PubMed Central

    Miller, Adam H.; Pepe, Paul E.; Peshock, Ron; Bhore, Rafia; Yancy, Clyde C.; Xuan, Lei; Miller, Margarita M.; Huet, Gisselle R.; Trimmer, Clayton; Davis, Rene; Chason, Rebecca; Kashner, Micheal T.

    2011-01-01

    Objectives Annually, almost 6 million U.S. citizens are evaluated for acute chest pain syndromes (ACPSs), and billions of dollars in resources are utilized. A large part of the resource utilization results from precautionary hospitalizations that occur because care providers are unable to exclude the presence of coronary artery disease (CAD) as the underlying cause of ACPSs. The purpose of this study was to examine whether the addition of coronary computerized tomography angiography (CCTA) to the concurrent standard care (SC) during an index emergency department (ED) visit could lower resource utilization when evaluating for the presence of CAD. Methods Sixty participants were assigned randomly to SC or SC + CCTA groups. Participants were interviewed at the index ED visit and at 90 days. Data collected included demographics, perceptions of the value of accessing health care, and clinical outcomes. Resource utilization included services received from both the primary in-network and the primary out-of-network providers. The prospectively defined primary endpoint was the total amount of resources utilized over a 90-day follow-up period when adding CCTA to the SC risk stratification in ACPSs. Results The mean (± standard deviation [SD]) for total resources utilized at 90 days for in-network plus out-of-network services was less for the participants in the SC + CCTA group ($10,134; SD ± $14,239) versus the SC-only group ($16,579; SD ± $19,148; p = 0.144), as was the median for the SC + CCTA ($4,288) versus SC only ($12,148; p = 0.652; median difference = −$1,291; 95% confidence interval [CI] = −$12,219 to $1,100; p = 0.652). Among the 60 total study patients, only 19 had an established diagnosis of CAD at 90 days. However, 18 (95%) of these diagnosed participants were in the SC + CCTA group. In addition, there were fewer hospital readmissions in the SC + CCTA group (6 of 30 [20%] vs. 16 of 30 [53%]; difference in proportions = −33%; 95% CI = −56% to −10%; p

  6. Minimal-resource computer program for automatic generation of ocean wave ray or crest diagrams in shoaling waters

    NASA Technical Reports Server (NTRS)

    Poole, L. R.; Lecroy, S. R.; Morris, W. D.

    1977-01-01

    A computer program for studying linear ocean wave refraction is described. The program features random-access modular bathymetry data storage. Three bottom topography approximation techniques are available in the program which provide varying degrees of bathymetry data smoothing. Refraction diagrams are generated automatically and can be displayed graphically in three forms: Ray patterns with specified uniform deepwater ray density, ray patterns with controlled nearshore ray density, or crest patterns constructed by using a cubic polynomial to approximate crest segments between adjacent rays.
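
    As a worked illustration of the refraction step such a program repeats along each ray, the sketch below applies Snell's law with the shallow-water phase speed c = sqrt(g*h). This is a simplification for illustration only; the actual program uses linear wave theory over smoothed bathymetry, and the depths and starting angle here are hypothetical.

      from math import asin, degrees, radians, sin, sqrt

      G = 9.81  # gravitational acceleration, m/s^2

      def refract(theta1_deg, h1, h2):
          """Ray angle (degrees from shore-normal) after moving from depth
          h1 to depth h2, via Snell's law: sin(t2)/c2 = sin(t1)/c1."""
          c1, c2 = sqrt(G * h1), sqrt(G * h2)
          return degrees(asin(sin(radians(theta1_deg)) * c2 / c1))

      # A ray at 30 degrees in 50 m of water bends toward the shore-normal:
      for h in (50, 20, 10, 5):
          print(f"depth {h:2d} m -> angle {refract(30.0, 50, h):5.1f} deg")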

  7. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  8. GeoComputation 2009

    SciTech Connect

    Xue, Yong; Hoffman, Forrest M; Liu, Dingsheng

    2009-01-01

    The tremendous computing requirements of today's algorithms and the high costs of high-performance supercomputers drive us to share computing resources. The emerging computational Grid technologies are expected to make feasible the creation of a computational environment handling many PetaBytes of distributed data, tens of thousands of heterogeneous computing resources, and thousands of simultaneous users from multiple research institutions.

  9. Finding Hidden Geothermal Resources in the Basin and Range Using Electrical Survey Techniques: A Computational Feasibility Study

    SciTech Connect

    J. W. Pritchett

    2004-12-01

    For many years, there has been speculation about "hidden" or "blind" geothermal systems—reservoirs that lack an obvious overlying surface fluid outlet. At present, it is simply not known whether "hidden" geothermal reservoirs are rare or common. An approach to identifying promising drilling targets using methods that are cheaper than drilling is needed. These methods should be regarded as reconnaissance tools, whose primary purpose is to locate high-probability targets for subsequent deep confirmation drilling. The purpose of this study was to appraise the feasibility of finding "hidden" geothermal reservoirs in the Basin and Range using electrical survey techniques, and of adequately locating promising targets for deep exploratory drilling based on the survey results. The approach was purely theoretical. A geothermal reservoir simulator was used to carry out a lengthy calculation of the evolution of a synthetic but generic Great Basin-type geothermal reservoir to a quasi-steady "natural state". Postprocessors were used to try to estimate what a suite of geophysical surveys of the prospect would see. Based on these results, the different survey techniques were compared and evaluated in terms of their ability to identify suitable drilling targets. This process was completed for eight different "reservoir models". Of the eight cases considered, four were "hidden" systems, so that the survey techniques could be appraised in terms of their ability to detect and characterize such resources and to distinguish them from more conventionally situated geothermal reservoirs. It is concluded that the best way to find "hidden" basin and range geothermal resources of this general type is to carry out simultaneous SP and low-frequency MT surveys, and then to combine the results of both surveys with other pertinent information using mathematical "inversion" techniques to characterize the subsurface quantitatively. Many such surveys and accompanying analyses can be carried out

  10. CD-ROM & Computer Software.

    ERIC Educational Resources Information Center

    Mews, Alison

    1997-01-01

    Reviews six CD-ROMs and computer software packages that can serve as resource materials for education. In addition to costs, hardware requirements, and thematic links, includes features such as interface and screen design, ease of access, installation, educational uses, animation and sound, educational challenges and benefits, and student…

  11. Additivity of Factor Effects in Reading Tasks Is Still a Challenge for Computational Models: Reply to Ziegler, Perry, and Zorzi (2009)

    ERIC Educational Resources Information Center

    Besner, Derek; O'Malley, Shannon

    2009-01-01

    J. C. Ziegler, C. Perry, and M. Zorzi (2009) have claimed that their connectionist dual process model (CDP+) can simulate the data reported by S. O'Malley and D. Besner. Most centrally, they have claimed that the model simulates additive effects of stimulus quality and word frequency on the time to read aloud when words and nonwords are randomly…

  12. Acute coronary artery deficiency treatment process. Managing scarce resources and maintaining a level of service with the help of computer simulation.

    PubMed

    Proctor, T

    1997-01-01

    Surgical treatment can be planned well in advance or arrive in a near emergency. If careful control is not exercised there is always the possibility that patients may suffer as a result. Discrete event simulation is an effective tool which can be used to find the most efficient ways of managing such a service's timetable. Examines the application of a desktop computer simulation package to model the work of a cardiac diagnosis and treatment system. The simulation package suggests how efficiency might be improved by moderating times taken to complete activities and staff and other resources required to perform the activities. Suggests the principles outlined can be readily expanded into a more complex model. PMID:10165852

  13. miR-Synth: a computational resource for the design of multi-site multi-target synthetic miRNAs

    PubMed Central

    Laganà, Alessandro; Acunzo, Mario; Romano, Giulia; Pulvirenti, Alfredo; Veneziano, Dario; Cascione, Luciano; Giugno, Rosalba; Gasparini, Pierluigi; Shasha, Dennis; Ferro, Alfredo; Croce, Carlo Maria

    2014-01-01

    RNAi is a powerful tool for the regulation of gene expression. It is widely and successfully employed in functional studies and is now emerging as a promising therapeutic approach. Several RNAi-based clinical trials suggest encouraging results in the treatment of a variety of diseases, including cancer. Here we present miR-Synth, a computational resource for the design of synthetic microRNAs able to target multiple genes in multiple sites. The proposed strategy constitutes a valid alternative to the use of siRNA, allowing the use of fewer molecules to inhibit multiple targets. This may represent a great advantage in designing therapies for diseases caused by crucial cellular pathways altered by multiple dysregulated genes. The system has been successfully validated on two of the most prominent genes associated with lung cancer, c-MET and Epidermal Growth Factor Receptor (EGFR). (See http://microrna.osumc.edu/mir-synth). PMID:24627222
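
    As a rough illustration of the underlying idea (not the miR-Synth algorithm itself), the sketch below scans target sequences for sites complementary to a miRNA seed region (nucleotides 2-8); the sequences are invented for the example.

      COMP = str.maketrans("ACGU", "UGCA")

      def seed_sites(mirna, utr):
          """Positions in `utr` matching the reverse complement of the
          miRNA seed (positions 2-8, 1-based)."""
          seed = mirna[1:8]
          site = seed.translate(COMP)[::-1]      # reverse complement
          return [i for i in range(len(utr) - 6) if utr[i:i + 7] == site]

      mirna = "UGAGGUAGUAGGUUGUAUAGUU"           # let-7-like, for illustration
      utr_a = "AAACUACCUCAGGGAAGCUACCUCA"        # carries two seed sites
      utr_b = "GGGCUACCUCAUUU"                   # carries one seed site
      print(seed_sites(mirna, utr_a), seed_sites(mirna, utr_b))
      # -> [3, 17] [3]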

  14. Assessment Planning and Evaluation of Renewable Energy Resources: an Interactive Computer Assisted Procedure. [hydroelectricity, biomass, and windpower in the Pittsfield metropolitan region, Massachusetts

    NASA Technical Reports Server (NTRS)

    Aston, T. W.; Fabos, J. G.; Macdougall, E. B.

    1982-01-01

    Adaptation and derivation were used to develop a procedure for assessing the availability of renewable energy resources on the landscape while simultaneously accounting for the economic, legal, social, and environmental issues required. Done in a step-by-step fashion, the procedure can be used interactively at computer terminals. Its application in determining the hydroelectricity, biomass, and windpower in a 40,000 acre study area of Western Massachusetts shows that: (1) three existing dam sites are physically capable of being retrofitted for hydropower; (2) each of three general areas has a mean annual windspeed exceeding 14 mph and is conducive to windpower; and (3) 20% of the total land area consists of prime agricultural biomass while 30% of the area is prime forest biomass land.

  15. The teaching of nursing by computer-assisted learning: maximisation of existing resources in an established school of nursing.

    PubMed

    Jones, P G; Wainwright, P

    1998-01-01

    Computer Assisted Learning (CAL) has made limited inroads into the teaching of nursing. While existing CAL materials are often excellent for supporting conventional teaching programs, they cannot in any way be considered as steps towards providing a free-standing CAL course. The recently published Dearing Report on the state of higher education in the UK recommends a major shift towards CAL, proposes Open and Distance Learning (ODL) as a means of effective courseware delivery, and encourages British Higher Education Institutions to enter the competitive global marketplace for ODL students; the Dearing recommendations reflect the trend within both the European Union and North America towards CAL and ODL. This report documents one potential solution to the problem of raising educational standards through capitalising on existing expertise for the development of CAL using hypertext mark-up language, while working within shrinking budgets. It represents a model which is capable of adoption by other Nursing Schools, and is therefore presented with a view to stimulating discussion. PMID:10179639

  16. Computer modelling integrated with micro-CT and material testing provides additional insight to evaluate bone treatments: Application to a beta-glycan derived whey protein mice model.

    PubMed

    Sreenivasan, D; Tu, P T; Dickinson, M; Watson, M; Blais, A; Das, R; Cornish, J; Fernandez, J

    2016-01-01

    The primary aim of this study was to evaluate the influence of a whey protein diet on computationally predicted mechanical strength of murine bones in both trabecular and cortical regions of the femur. There was no significant influence on mechanical strength in cortical bone observed with increasing whey protein treatment, consistent with cortical tissue mineral density (TMD) and bone volume changes observed. Trabecular bone showed a significant decline in strength with increasing whey protein treatment when nanoindentation derived Young's moduli were used in the model. When microindentation, micro-CT phantom density or normalised Young's moduli were included in the model a non-significant decline in strength was exhibited. These results for trabecular bone were consistent with both trabecular bone mineral density (BMD) and micro-CT indices obtained independently. The secondary aim of this study was to characterise the influence of different sources of Young's moduli on computational prediction. This study aimed to quantify the predicted mechanical strength in 3D from these sources and evaluate if trends and conclusions remained consistent. For cortical bone, predicted mechanical strength behaviour was consistent across all sources of Young's moduli. There was no difference in treatment trend observed when Young's moduli were normalised. In contrast, trabecular strength due to whey protein treatment significantly reduced when material properties from nanoindentation were introduced. Other material property sources were not significant but emphasised the strength trend over normalised material properties. This shows strength at the trabecular level was attributed to both changes in bone architecture and material properties.

  17. Exploring Tradeoffs in Demand-side and Supply-side Management of Urban Water Resources using Agent-based Modeling and Evolutionary Computation

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Berglund, E. Z.

    2015-12-01

    Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS) framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn, influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger (1) increases in the volume of water pumped through inter-basin transfers from an external reservoir and (2) drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for the Arlington, Texas, water supply system to identify non-dominated strategies for an historic drought decade. Results demonstrate that pumping costs associated with maximizing environmental reliability exceed pumping costs associated with minimizing restrictions on consumer water use.
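
    A minimal sketch of the kind of trigger policy being optimized: reservoir storage fractions below which inter-basin pumping increases and successive drought stages restrict outdoor use. The trigger values below are hypothetical, not the non-dominated solutions reported for Arlington.

      def policy(storage_frac, pump_trigger=0.6, stage_triggers=(0.5, 0.35, 0.2)):
          """Map current storage (fraction of capacity) to operational actions."""
          pump = storage_frac < pump_trigger                      # raise transfers
          stage = sum(storage_frac < t for t in stage_triggers)   # drought stage 0..3
          return {"extra_pumping": pump, "drought_stage": stage}

      for s in (0.8, 0.55, 0.3):
          print(s, policy(s))
      # 0.8  -> no extra pumping, stage 0
      # 0.55 -> extra pumping,    stage 0
      # 0.3  -> extra pumping,    stage 2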

  18. Information technology resources assessment

    SciTech Connect

    Loken, S.C.

    1993-01-01

    The emphasis in Information Technology (IT) development has shifted from technology management to information management, and the tools of information management are increasingly at the disposal of end-users, people who deal with information. Moreover, the interactive capabilities of technologies such as hypertext, scientific visualization, virtual reality, video conferencing, and even database management systems have placed in the hands of users a significant amount of discretion over how these resources will be used. The emergence of high-performance networks, as well as network operating systems, improved interoperability, and platform independence of applications will eliminate technical barriers to the use of data, increase the power and range of resources that can be used cooperatively, and open up a wealth of possibilities for new applications. The very scope of these prospects for the immediate future is a problem for the IT planner or administrator. Technology procurement and implementation, integration of new technologies into the existing infrastructure, cost recovery and usage of networks and networked resources, training issues, and security concerns such as data protection and access to experiments are just some of the issues that need to be considered in the emerging IT environment. As managers we must use technology to improve competitiveness. When procuring new systems, we must take advantage of scalable resources. New resources such as distributed file systems can improve access to and efficiency of existing operating systems. In addition, we must assess opportunities to improve information worker productivity and information management through technologies such as distributed computational visualization and teleseminar applications.

  19. Resource Economics

    NASA Astrophysics Data System (ADS)

    Conrad, Jon M.

    1999-10-01

    Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. Through these examples and additional exercises at the end of each chapter, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation of optimization of resource and environmental systems.

  20. Do We Really Need Additional Contrast-Enhanced Abdominal Computed Tomography for Differential Diagnosis in Triage of Middle-Aged Subjects With Suspected Biliary Pain

    PubMed Central

    Hwang, In Kyeom; Lee, Yoon Suk; Kim, Jaihwan; Lee, Yoon Jin; Park, Ji Hoon; Hwang, Jin-Hyeok

    2015-01-01

    Enhanced computed tomography (CT) is widely used for evaluating acute biliary pain in the emergency department (ED). However, concern about radiation exposure from CT has also increased. We investigated the usefulness of pre-contrast CT for differential diagnosis in middle-aged subjects with suspected biliary pain. A total of 183 subjects, who visited the ED for suspected biliary pain from January 2011 to December 2012, were included. Retrospectively, pre-contrast phase and multiphase CT findings were reviewed and the detection rate of findings suggesting disease requiring significant treatment by noncontrast CT (NCCT) was compared with cases detected by multiphase CT. Approximately 70% of total subjects had a significant condition, including 1 case of gallbladder cancer and 126 (68.8%) cases requiring intervention (122 biliary stone-related diseases, 3 liver abscesses, and 1 liver hemangioma). The rate of overlooking malignancy without contrast enhancement was calculated to be 0% to 1.5%. Biliary stones and liver space-occupying lesions were found equally on NCCT and multiphase CT. Calculated probable rates of overlooking acute cholecystitis and biliary obstruction were maximally 6.8% and 4.2%, respectively. Incidental significant findings unrelated to pain consisted of 1 case of adrenal incidentaloma, which was also observed in NCCT. NCCT might be sufficient to detect life-threatening or significant disease requiring early treatment in young adults with biliary pain. PMID:25700321

  1. Computational models can predict response to HIV therapy without a genotype and may reduce treatment failure in different resource-limited settings

    PubMed Central

    Revell, A. D.; Wang, D.; Wood, R.; Morrow, C.; Tempelman, H.; Hamers, R. L.; Alvarez-Uria, G.; Streinu-Cercel, A.; Ene, L.; Wensing, A. M. J.; DeWolf, F.; Nelson, M.; Montaner, J. S.; Lane, H. C.; Larder, B. A.

    2013-01-01

    Objectives Genotypic HIV drug-resistance testing is typically 60%–65% predictive of response to combination antiretroviral therapy (ART) and is valuable for guiding treatment changes. Genotyping is unavailable in many resource-limited settings (RLSs). We aimed to develop models that can predict response to ART without a genotype and evaluated their potential as a treatment support tool in RLSs. Methods Random forest models were trained to predict the probability of response to ART (≤400 copies HIV RNA/mL) using the following data from 14 891 treatment change episodes (TCEs) after virological failure, from well-resourced countries: viral load and CD4 count prior to treatment change, treatment history, drugs in the new regimen, time to follow-up and follow-up viral load. Models were assessed by cross-validation during development, with an independent set of 800 cases from well-resourced countries, plus 231 cases from Southern Africa, 206 from India and 375 from Romania. The area under the receiver operating characteristic curve (AUC) was the main outcome measure. Results The models achieved an AUC of 0.74–0.81 during cross-validation and 0.76–0.77 with the 800 test TCEs. They achieved AUCs of 0.58–0.65 (Southern Africa), 0.63 (India) and 0.70 (Romania). Models were more accurate for data from the well-resourced countries than for cases from Southern Africa and India (P < 0.001), but not Romania. The models identified alternative, available drug regimens predicted to result in virological response for 94% of virological failures in Southern Africa, 99% of those in India and 93% of those in Romania. Conclusions We developed computational models that predict virological response to ART without a genotype with comparable accuracy to genotyping with rule-based interpretation. These models have the potential to help optimize antiretroviral therapy for patients in RLSs where genotyping is not generally available. PMID:23485767
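
    A minimal scikit-learn sketch of the modelling setup described: a random forest trained on treatment-change episodes and assessed by AUC. The features and labels below are synthetic stand-ins, not the study's TCE data.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(2000, 20))        # stand-in for TCE features
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                                random_state=0)
      model = RandomForestClassifier(n_estimators=500,
                                     random_state=0).fit(X_tr, y_tr)
      auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
      print(f"AUC = {auc:.2f}")   # the paper reports 0.74-0.81 in cross-validation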

  2. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques. (YP)

  3. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  4. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $ 0.023 per pound of aluminum produced is projected for a 200 kA pot.

  5. Computer simulation for the growing probability of additional offspring with an advantageous reversal allele in the decoupled continuous-time mutation-selection model

    NASA Astrophysics Data System (ADS)

    Gill, Wonpyong

    2016-01-01

    This study calculated the growing probability of additional offspring with the advantageous reversal allele in an asymmetric sharply-peaked landscape using the decoupled continuous-time mutation-selection model. The growing probability was calculated for various population sizes, N, sequence lengths, L, selective advantages, s, fitness parameters, k, and measuring parameters, C. The saturated growing probability in the stochastic region was approximately the effective selective advantage, s*, when C≫1/Ns* and s*≪1. The present study suggests that the growing probability in the stochastic region in the decoupled continuous-time mutation-selection model can be described using the theoretical formula for the growing probability in the Moran two-allele model. The selective advantage ratio, which represents the ratio of the effective selective advantage to the selective advantage, does not depend on the population size, selective advantage, measuring parameter and fitness parameter; instead, the selective advantage ratio decreases with the increasing sequence length.
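
    For reference, the standard Moran-model result referred to above gives the fixation probability of a single mutant with selective advantage s as (1 - 1/r)/(1 - r^-N), with r = 1 + s; the short check below shows it approaching roughly s for large N and small s.

      def moran_fixation(s, N):
          """Fixation probability of one advantageous mutant in the Moran model."""
          r = 1.0 + s
          return (1 - 1 / r) / (1 - r ** (-N))

      for N in (100, 1000, 10000):
          print(N, moran_fixation(0.01, N))
      # tends toward ~0.0099 (i.e. approximately s) as N grows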

  6. Selecting Appropriate Computing Tools.

    ERIC Educational Resources Information Center

    Tetlow, William L.

    1990-01-01

    Selecting computer tools requires analyzing information requirements and audiences, assessing existing institutional research and computing capacities, creating or improving a planning database, using computer experts, determining software needs, obtaining sufficient resources for independent operations, acquiring quality, and insisting on…

  7. Resource data bases-Resource assessment

    USGS Publications Warehouse

    Clark, A.L.

    1976-01-01

    The U.S. Geological Survey's Office of Resource Analysis is developing computer methods for the handling of mineral-resources data in order to provide improved means for addressing and manipulating data. These methods include: computerized data files and predictive resource models. Data files contain the raw or disaggregated information on mineral deposits and commodities. One operational data file is CRIB (Computerized Resources Information Bank), which is a general purpose inventory and reference file on metallic and nonmetallic mineral deposits. A computer file on resources should contain detailed information on the following main categories: record identification, name and location, description of deposit, analytical data, and production/reserves. A resource model employs postulates and inferences in conjunction with the data to make predictions about resources as key variables concerning a mineral commodity are changed. The objective is to estimate the availability of minerals including: geological availability (occurrence models), technological availability (exploration and beneficiation models), and economic availability (economics models). © 1976.
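
    As a sketch of the record structure those main categories imply, here is a hypothetical Python data class; CRIB's actual field layout is not reproduced here, and all field names and values are illustrative.

      from dataclasses import dataclass, field

      @dataclass
      class DepositRecord:
          record_id: str                        # record identification
          name: str                             # name and location
          latitude: float
          longitude: float
          description: str = ""                 # description of deposit
          analyses: dict = field(default_factory=dict)    # analytical data
          production: dict = field(default_factory=dict)  # production/reserves

      rec = DepositRecord("CRIB-0001", "Example Mine", 39.5, -117.1,
                          description="vein-hosted Au-Ag",
                          analyses={"Au_g_per_t": 4.2},
                          production={"reserves_t": 120000})
      print(rec.record_id, rec.analyses)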

  8. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    PubMed Central

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on the modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in the real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
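
    A hedged sketch of the basic trade-off any such offloading framework weighs: execute a component on the SMD, or transmit its input and run it in the cloud, which pays off when the cloud speedup outweighs transmission time. The simple linear model and all numbers are hypothetical, not the paper's framework.

      def should_offload(cycles, local_mips, cloud_mips, data_mb, bandwidth_mbps):
          """Compare local execution time against cloud execution plus upload."""
          t_local = cycles / local_mips                      # seconds on the SMD
          t_cloud = cycles / cloud_mips + (data_mb * 8) / bandwidth_mbps
          return t_cloud < t_local, t_local, t_cloud

      ok, tl, tc = should_offload(cycles=5e9, local_mips=1e9, cloud_mips=2e10,
                                  data_mb=4, bandwidth_mbps=20)
      print(f"offload={ok}  local={tl:.1f}s  cloud={tc:.1f}s")
      # offload=True  local=5.0s  cloud=1.9s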

  9. A lightweight distributed framework for computational offloading in mobile cloud computing.

    PubMed

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on the modern Smartphones. However, such applications are still constrained by limitations in processing potentials, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in the real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  10. Argonne's Laboratory computing center - 2007 annual report.

    SciTech Connect

    Bair, R.; Pieper, G. W.

    2008-05-28

    performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has framed a 'path forward' for additional computing resources.

  11. Neutron Characterization for Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Watkins, Thomas; Bilheux, Hassina; An, Ke; Payzant, Andrew; DeHoff, Ryan; Duty, Chad; Peter, William; Blue, Craig; Brice, Craig A.

    2013-01-01

    Oak Ridge National Laboratory (ORNL) is leveraging decades of experience in neutron characterization of advanced materials together with resources such as the Spallation Neutron Source (SNS) and the High Flux Isotope Reactor (HFIR) shown in Fig. 1 to solve challenging problems in additive manufacturing (AM). Additive manufacturing, or three-dimensional (3-D) printing, is a rapidly maturing technology wherein components are built by selectively adding feedstock material at locations specified by a computer model. The majority of these technologies use thermally driven phase change mechanisms to convert the feedstock into functioning material. As the molten material cools and solidifies, the component is subjected to significant thermal gradients, generating significant internal stresses throughout the part (Fig. 2). As layers are added, inherent residual stresses cause warping and distortions that lead to geometrical differences between the final part and the original computer generated design. This effect also limits geometries that can be fabricated using AM, such as thin-walled, high-aspect-ratio, and overhanging structures. Distortion may be minimized by intelligent toolpath planning or strategic placement of support structures, but these approaches are not well understood and often "Edisonian" in nature. Residual stresses can also impact component performance during operation. For example, in a thermally cycled environment such as a high-pressure turbine engine, residual stresses can cause components to distort unpredictably. Different thermal treatments on as-fabricated AM components have been used to minimize residual stress, but components still retain a nonhomogeneous stress state and/or demonstrate a relaxation-derived geometric distortion. Industry, federal laboratory, and university collaboration is needed to address these challenges and enable the U.S. to compete in the global market. Work is currently being conducted on AM technologies at the ORNL

  12. Future computing needs for Fermilab

    SciTech Connect

    Not Available

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-mini computers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for mini computers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should

  13. Computational aerodynamics and artificial intelligence

    NASA Technical Reports Server (NTRS)

    Mehta, U. B.; Kutler, P.

    1984-01-01

    The general principles of artificial intelligence are reviewed and speculations are made concerning how knowledge based systems can accelerate the process of acquiring new knowledge in aerodynamics, how computational fluid dynamics may use expert systems, and how expert systems may speed the design and development process. In addition, the anatomy of an idealized expert system called AERODYNAMICIST is discussed. Resource requirements for using artificial intelligence in computational fluid dynamics and aerodynamics are examined. Three main conclusions are presented. First, there are two related aspects of computational aerodynamics: reasoning and calculating. Second, a substantial portion of reasoning can be achieved with artificial intelligence. It offers the opportunity of using computers as reasoning machines to set the stage for efficient calculating. Third, expert systems are likely to be new assets of institutions involved in aeronautics for various tasks of computational aerodynamics.

  14. Climate Modeling using High-Performance Computing

    SciTech Connect

    Mirin, A A

    2007-02-05

    The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E and E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well.

  15. Theoretical effect of modifications to the upper surface of two NACA airfoils using smooth polynomial additional thickness distributions which emphasize leading edge profile and which vary quadratically at the trailing edge. [using flow equations and a CDC 7600 computer

    NASA Technical Reports Server (NTRS)

    Merz, A. W.; Hague, D. S.

    1975-01-01

    An investigation was conducted on a CDC 7600 digital computer to determine the effects of additional thickness distributions to the upper surface of the NACA 64-206 and 64₁-212 airfoils. The additional thickness distribution had the form of a continuous mathematical function which disappears at both the leading edge and the trailing edge. The function behaves as a polynomial of order ε₁ at the leading edge, and a polynomial of order ε₂ at the trailing edge. ε₂ is a constant and ε₁ is varied over a range of practical interest. The magnitude of the additional thickness, y, is a second input parameter, and the effect of varying ε₁ and y on the aerodynamic performance of the airfoil was investigated. Results were obtained at a Mach number of 0.2 with an angle-of-attack of 6 degrees on the basic airfoils, and all calculations employ the full potential flow equations for two dimensional flow. The relaxation method of Jameson was employed for solution of the potential flow equations.
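
    Below is a sketch of one additional-thickness function with the stated properties: zero at both edges, behaving like x^ε₁ near the leading edge and quadratically, as (1 - x)^2 with ε₂ = 2, at the trailing edge. The exact NASA polynomial is not reproduced; this particular form is an assumption for illustration.

      def extra_thickness(x, y_max, eps1):
          """Additional upper-surface thickness at chord fraction x in [0, 1]."""
          shape = x ** eps1 * (1.0 - x) ** 2     # eps2 = 2: quadratic at the TE
          x_peak = eps1 / (eps1 + 2.0)           # where this shape function peaks
          peak = x_peak ** eps1 * (1.0 - x_peak) ** 2
          return y_max * shape / peak            # scale so the maximum is y_max

      for x in (0.0, 0.1, 0.3, 0.7, 1.0):
          print(x, round(extra_thickness(x, y_max=0.02, eps1=0.5), 4))
      # vanishes at x = 0 and x = 1, peaking near the leading edge for small eps1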

  16. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    SciTech Connect

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
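
    A toy version of such a blackbox estimate, using only workflow length, width, and data size; the platform parameters (core counts, queue waits, prices, bandwidth) are hypothetical, not the paper's measurements.

      def estimate(length_h, width, data_gb, cores, queue_h, usd_core_h, gbps):
          """Rough time and cost using only length, width, and data size."""
          runtime = length_h * max(1, width // cores)    # serialize excess width
          transfer = data_gb * 8 / gbps / 3600           # hours to stage data in
          hours = queue_h + runtime + transfer
          cost = runtime * min(width, cores) * usd_core_h
          return hours, cost

      platforms = {
          "local cluster": dict(cores=64,  queue_h=4.0, usd_core_h=0.0,  gbps=10.0),
          "cloud":         dict(cores=256, queue_h=0.1, usd_core_h=0.05, gbps=1.0),
      }
      for name, p in platforms.items():
          hours, cost = estimate(length_h=2.0, width=128, data_gb=50, **p)
          print(f"{name:13s} ~{hours:5.2f} h  ~${cost:6.2f}")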

  17. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  18. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  19. Distributive Computer Networking: Making It Work on a Regional Basis: Effective sharing through a network requires new management and resource distribution techniques.

    PubMed

    Cornew, R W; Morse, P M

    1975-08-15

    After 4 years of operation the NERComP network is now a self-supporting success. Some of the reasons for its success are that (i) the network started small and built up utilization; (ii) the members, through monthly trustee meetings, practiced "participatory management" from the outset; (iii) unlike some networks, NERComP appealed to individual academic and research users who were terminal-oriented and who controlled their own budgets; (iv) the compactness of the New England region made it an ideal laboratory for testing networking concepts; and (v) a dedicated staff was willing to work hard in the face of considerable uncertainty. While the major problems were "political, organizational and economic" (1) we have found that they can be solved if the network meets real needs. We have also found that it is difficult to proceed beyond a certain point without investing responsibility and authority in the networking organization. Conversely, there is a need to distribute some responsibilities such as marketing and user services back to the member institutions. By adopting a modest starting point and achieving limited goals the necessary trust and working relationships between institutions can be built. In our case the necessary planning has been facilitated by recognizing three distinct network functions: governance, user services, and technical operations. Separating out the three essential networking tasks and dealing with each individually through advisory committees, each with its own staff coordinator, has overcome a distracting tendency to address all issues at once. It has also provided an element of feedback between the end user and the supplier not usually present in networking activity. The success of NERComP demonstrates that a distributive-type network can work. Our experiences in New England, which, because of its numerous colleges and universities free from domination by any single institution, is a microcosm for academic computing in the United States

  20. Computer-Based Learning.

    ERIC Educational Resources Information Center

    Brown, Peggy, Ed.

    1981-01-01

    Three essays on the ways in which colleges and universities use the computer as a teaching tool are presented, along with descriptions of 10 school programs that reflect the diversity of computer applications across the United States. In "A Place for Computing in Liberal Education," Karl L. Zinn likens the computer to personal resource tools, such…

  1. Resource Economics

    NASA Astrophysics Data System (ADS)

    Conrad, Jon M.

    2000-01-01

    Resource Economics is a text for students with a background in calculus, intermediate microeconomics, and a familiarity with the spreadsheet software Excel. The book covers basic concepts, shows how to set up spreadsheets to solve dynamic allocation problems, and presents economic models for fisheries, forestry, nonrenewable resources, stock pollutants, option value, and sustainable development. Within the text, numerical examples are posed and solved using Excel's Solver. These problems help make concepts operational, develop economic intuition, and serve as a bridge to the study of real-world problems of resource management. Through these examples and additional exercises at the end of Chapters 1 to 8, students can make dynamic models operational, develop their economic intuition, and learn how to set up spreadsheets for the simulation of optimization of resource and environmental systems. The book is unique in its use of spreadsheet software (Excel) to solve dynamic allocation problems. Conrad is co-author of a previous book for the Press on the subject for graduate students. The approach is extremely student-friendly, giving students the tools to apply research results to actual environmental issues.

  2. Family History Resources.

    ERIC Educational Resources Information Center

    Bookmark, 1991

    1991-01-01

    The 12 articles in this issue focus on the theme of family history resources: (1) "Introduction: Family History Resources" (Joseph F. Shubert); (2) "Work, Credentials, and Expectations of a Professional Genealogist" (Coreen P. Hallenbeck and Lewis W. Hallenbeck); (3) "Computers and Genealogy" (Theresa C. Strasser); (4) "Finding Historical Records…

  3. Resources for Teaching Astronomy.

    ERIC Educational Resources Information Center

    Grafton, Teresa; Suggett, Martin

    1991-01-01

    Resources that are available for teachers presenting astronomy in the National Curriculum are listed. Included are societies and organizations, resource centers and places to visit, planetaria, telescopes and binoculars, planispheres, star charts, night sky diaries, equipment, audiovisual materials, computer software, books, and magazines. (KR)

  4. Developable resources

    SciTech Connect

    Hunt, R.T.; Hunt, J.M.

    1995-01-01

    In the United States, it has become the conventional wisdom that all developable conventional hydropower resources have been exhausted. Studies by the US Department of Energy (DOE) and other agencies find differently. The root of disagreement may lie in the definition of what is developable. Environmental special interest groups now define developable hydropower sites as those having zero effect on the environment. As a result they conclude there are no additional developable hydropower sites. By contrast, the definition used by DOE and others is broader as it balances economic, technical, and environmental factors in accordance with the Federal Power Act.

  5. Computer-assisted assignment of functional domains in the nonstructural polyprotein of hepatitis E virus: delineation of an additional group of positive-strand RNA plant and animal viruses.

    PubMed

    Koonin, E V; Gorbalenya, A E; Purdy, M A; Rozanov, M N; Reyes, G R; Bradley, D W

    1992-09-01

    Computer-assisted comparison of the nonstructural polyprotein of hepatitis E virus (HEV) with proteins of other positive-strand RNA viruses allowed the identification of the following putative functional domains: (i) RNA-dependent RNA polymerase, (ii) RNA helicase, (iii) methyltransferase, (iv) a domain of unknown function ("X" domain) flanking the papain-like protease domains in the polyproteins of animal positive-strand RNA viruses, and (v) papain-like cysteine protease domain distantly related to the putative papain-like protease of rubella virus (RubV). Comparative analysis of the polymerase and helicase sequences of positive-strand RNA viruses belonging to the so-called "alpha-like" supergroup revealed grouping between HEV, RubV, and beet necrotic yellow vein virus (BNYVV), a plant furovirus. Two additional domains have been identified: one showed significant conservation between HEV, RubV, and BNYVV, and the other showed conservation specifically between HEV and RubV. The large nonstructural proteins of HEV, RubV, and BNYVV retained similar domain organization, with the exceptions of relocation of the putative protease domain in HEV as compared to RubV and the absence of the protease and X domains in BNYVV. These observations show that HEV, RubV, and BNYVV encompass partially conserved arrays of distinctive putative functional domains, suggesting that these viruses constitute a distinct monophyletic group within the alpha-like supergroup of positive-strand RNA viruses. PMID:1518855

  6. Fermilab computing at the Intensity Frontier

    SciTech Connect

    Group, Craig; Fuess, S.; Gutsche, O.; Kirby, M.; Kutschke, R.; Lyon, A.; Norman, A.; Perdue, G.; Sexton-Kennedy, E.

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. In addition, the experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  7. Fermilab computing at the Intensity Frontier

    DOE PAGES

    Group, Craig; Fuess, S.; Gutsche, O.; Kirby, M.; Kutschke, R.; Lyon, A.; Norman, A.; Perdue, G.; Sexton-Kennedy, E.

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. In addition, the experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  8. 30 CFR 256.53 - Additional bonds.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 30, Mineral Resources (2010-07-01), MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR, OFFSHORE LEASING OF SULPHUR OR OIL AND GAS IN THE OUTER CONTINENTAL SHELF, Bonding. § 256.53 Additional bonds. (a) This paragraph explains...

  9. Computation Directorate 2008 Annual Report

    SciTech Connect

    Crawford, D L

    2009-03-25

    Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.

  10. Expanding Computer Service with Personal Computers.

    ERIC Educational Resources Information Center

    Bomzer, Herbert

    1983-01-01

    A planning technique, the mission justification document, and the evaluation procedures developed at Central Michigan University to ensure the orderly growth of computer-dependent resources within the constraints of tight budgets are described. (Author/MLW)

  11. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  12. Computers and small satellites: How FORTE is utilizing the WWW as a "paperless" information resource and the development of a unique resource management planning tool

    SciTech Connect

    Roussel-Dupre, D.; Carter, M.; Franz, R.

    1997-10-01

    The Fast-On-orbit Recording of Transient Events (FORTE) satellite is the second satellite to be developed and flown by Los Alamos National Laboratory and is scheduled to be launched August, 1997 by a Pegasus XL rocket. FORTE follows in the footsteps of the ALEXIS satellite in utilizing a very small operations crew for mission operations. Partially based upon the ALEXIS automation and World Wide Web (WWW) usage for data dissemination, FORTE began at an early stage of ground processing to use the web as a repository of information about all aspects of the satellite. Detailed descriptions of the various satellite and experiment components, cable diagrams, integration photographs as well as extensive test data have all been compiled into a single site as a means of archiving the data at a single location. In this manner, it is readily available during times of ground testing, ground station operation training as well as anomaly resolution. Small satellites usually require extensive effort to optimize operation under minimal resources. For the FORTE satellite, a unique planning tool has been developed over the past 2 years which balances the various resources of the satellite (power, memory, downlink, on board command buffer, etc.) to provide the maximum data acquisition. This paper will concentrate on a description of both the extensive web interface and the planning tool. 6 refs.

  13. Computational tools and resources for metabolism-related property predictions. 1. Overview of publicly available (free and commercial) databases and software

    PubMed Central

    Peach, Megan L; Zakharov, Alexey V; Liu, Ruifeng; Pugliese, Angelo; Tawa, Gregory; Wallqvist, Anders; Nicklaus, Marc C

    2014-01-01

    Metabolism has been identified as a defining factor in drug development success or failure because of its impact on many aspects of drug pharmacology, including bioavailability, half-life and toxicity. In this article, we provide an outline and descriptions of the resources for metabolism-related property predictions that are currently either freely or commercially available to the public. These resources include databases with data on, and software for prediction of, several end points: metabolite formation, sites of metabolic transformation, binding to metabolizing enzymes and metabolic stability. We attempt to place each tool in historical context and describe, wherever possible, the data it was based on. For predictions of interactions with metabolizing enzymes, we show a typical set of results for a small test set of compounds. Our aim is to give a clear overview of the areas and aspects of metabolism prediction in which the currently available resources are useful and accurate, and the areas in which they are inadequate or missing entirely. PMID:23088273

  14. Mobile Virtual Environments in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Lazem, Shaimaa; Abdel-Hamid, Ayman; Gračanin, Denis; Adams, Kevin P.

    Recently, human-computer interaction has shifted from traditional desktop computing to the pervasive computing paradigm, in which users engage with computing devices anywhere and at any time. Mobile virtual environments (MVEs) are an emerging research area that studies the deployment of virtual reality applications on mobile devices. MVEs present additional challenges to application developers because of the restricted resources of mobile devices, in addition to issues specific to wireless computing, such as limited bandwidth, high error rates and handoff intervals. Moreover, adaptive resource allocation is a key issue in MVEs, where user interactions affect system resources, which, in turn, affect the user’s experience. Such interplay between the user and the system can be modelled using game theory. This chapter presents MVEs as a real-time interactive distributed system, and investigates the challenges in designing and developing a remote rendering prefetching application for mobile devices. Furthermore, we introduce game theory as a tool for modelling decision-making in MVEs by describing a game between the remote rendering server and the mobile client.
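
    As a concrete illustration of the kind of server-client game alluded to above, consider a toy two-player matrix game in which the rendering server chooses an image quality and the mobile client chooses a prefetching policy. The payoffs below are invented for illustration; the chapter's actual model is not reproduced here.

      # Toy server/client game: find pure-strategy Nash equilibria by enumeration.
      # Payoff values are illustrative placeholders.

      # payoffs[(server_action, client_action)] = (server_utility, client_utility)
      payoffs = {
          ("high_quality", "aggressive_prefetch"): (2, 3),
          ("high_quality", "lazy_prefetch"):       (1, 1),
          ("low_quality",  "aggressive_prefetch"): (3, 1),
          ("low_quality",  "lazy_prefetch"):       (2, 2),
      }

      server_actions = ["high_quality", "low_quality"]
      client_actions = ["aggressive_prefetch", "lazy_prefetch"]

      def nash_equilibria(payoffs):
          """A profile is a Nash equilibrium if neither player gains by deviating alone."""
          eq = []
          for s in server_actions:
              for c in client_actions:
                  su, cu = payoffs[(s, c)]
                  best_s = all(payoffs[(s2, c)][0] <= su for s2 in server_actions)
                  best_c = all(payoffs[(s, c2)][1] <= cu for c2 in client_actions)
                  if best_s and best_c:
                      eq.append((s, c))
          return eq

      print(nash_equilibria(payoffs))   # [('low_quality', 'lazy_prefetch')]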

  15. The Island of Terra: A Computer-Aided Simulation of World Resources Problems Designed for Lower-Achieving Secondary School Pupils.

    ERIC Educational Resources Information Center

    Watson, Robert

    1983-01-01

    Describes the computer-assisted environmental-educational game TERRA, which is designed to encourage lower-ability high school students to think at their own level about the future of their world. Based on "The Limits to Growth" (Meadows et al., 1972), the game comprises four programs: Terra Firma, Terradactyl, Inflation, and Doubles. (EAO)

  16. Computer-Aided Drafting and Design Series. Educational Resources for the Machine Tool Industry, Course Syllabi, [and] Instructor's Handbook. Student Laboratory Manual.

    ERIC Educational Resources Information Center

    Texas State Technical Coll. System, Waco.

    This package consists of course syllabi, an instructor's handbook, and a student laboratory manual for a 2-year vocational training program to prepare students for entry-level employment in computer-aided drafting and design in the machine tool industry. The program was developed through a modification of the DACUM (Developing a Curriculum)…

  17. Herpes - resources

    MedlinePlus

    Genital herpes - resources; Resources - genital herpes ... The following organizations are good resources for information on genital herpes : March of Dimes -- www.marchofdimes.com/pregnancy/complications-herpes The American Congress of Obstetricians and Gynecologists -- ...

  18. BNL ATLAS Grid Computing

    ScienceCinema

    Michael Ernst

    2016-07-12

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide, Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  19. BNL ATLAS Grid Computing

    SciTech Connect

    Michael Ernst

    2008-10-02

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide, Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  20. Computed Tomography (CT) - Spine

    MedlinePlus

    Computed tomography (CT) of the spine is a diagnostic imaging ... What is CT Scanning of the Spine? Computed tomography, more commonly known as a CT or CAT ...

  1. West Virginia US Department of Energy experimental program to stimulate competitive research. Section 2: Human resource development; Section 3: Carbon-based structural materials research cluster; Section 3: Data parallel algorithms for scientific computing

    SciTech Connect

    Not Available

    1994-02-02

    This report consists of three separate but related reports: (1) Human Resource Development, (2) Carbon-based Structural Materials Research Cluster, and (3) Data Parallel Algorithms for Scientific Computing. To meet the objectives of the Human Resource Development plan, the plan includes K--12 enrichment activities, undergraduate research opportunities for students at the state's two Historically Black Colleges and Universities, graduate research through cluster assistantships and through a traineeship program targeted specifically to minorities, women and the disabled, and faculty development through participation in research clusters. One research cluster is the chemistry and physics of carbon-based materials. The objective of this cluster is to develop a self-sustaining group of researchers in carbon-based materials research within the institutions of higher education in the state of West Virginia. The projects will involve analysis of cokes, graphites and other carbons in order to understand the properties that provide desirable structural characteristics, including resistance to oxidation, levels of anisotropy and structural characteristics of the carbons themselves. In the proposed cluster on parallel algorithms, the research projects by four WVU faculty and three state liberal arts college faculty are: (1) modeling of self-organized critical systems by cellular automata; (2) multiprefix algorithms and fat-tree embeddings; (3) offline and online partitioning of data computation; and (4) manipulating and rendering three-dimensional objects. This cluster furthers the state Experimental Program to Stimulate Competitive Research plan by building on existing strengths at WVU in parallel algorithms.

  2. The Sandia Airborne Computer (SANDAC)

    SciTech Connect

    Nava, E.J.

    1992-06-01

    The Sandia Airborne Computer (SANDAC) is a small, modular, high-performance, multiprocessor computer originally designed for aerospace applications. It can use a combination of Motorola 68020- and 68040-based processor modules along with AT&T DSP32C-based signal processing modules. The system is designed to use up to 15 processors in almost any combination, and a complete system can include up to 20 modules. Depending on the mix of processors, total computational throughput can range from 2.5 to greater than 225 Million Instructions Per Second (MIPS). The system is designed so that processors can access all resources in the machine, and the inter-processor communication details are completely transparent to the software. In addition to processors, the system includes input/output, memory, and special function modules. Because of its ease of use, small size, durability, and configuration flexibility, SANDAC has been used in applications ranging from missile navigation, guidance, and control systems to medical imaging systems.

  3. Kentucky Department for Natural Resources and Environmental Protection permit application for air contaminant source: SRC-I demonstration plant, Newman, Kentucky. Supplement I. [Additional information on 38 items requested by KY/DNREP

    SciTech Connect

    Pearson, Jr., John F.

    1981-02-13

    In response to a letter from KY/DNREP, January 19, 1981, ICRC and DOE have prepared the enclosed supplement to the Kentucky Department for Natural Resources and Environmental Protection Permit Application for Air Contaminant Source for the SRC-I Demonstration Plant. Each of the 38 comments contained in the letter has been addressed in accordance with the discussions held in Frankfort on January 28, 1981, among representatives of KY/DNREP, EPA Region IV, US DOE, and ICRC. The questions raised involve requests for detailed information on the performance and reliability of proprietary equipment, back-up methods, monitoring plans for various pollutants, composition of wastes to flares, emissions estimates from particular operations, origin of baseline information, mathematical models, storage tanks, dusts, etc. (LTN)

  4. Contextuality supplies the 'magic' for quantum computation.

    PubMed

    Howard, Mark; Wallman, Joel; Veitch, Victor; Emerson, Joseph

    2014-06-19

    Quantum computers promise dramatic advantages over their classical counterparts, but the source of the power in quantum computing has remained elusive. Here we prove a remarkable equivalence between the onset of contextuality and the possibility of universal quantum computation via 'magic state' distillation, which is the leading model for experimentally realizing a fault-tolerant quantum computer. This is a conceptually satisfying link, because contextuality, which precludes a simple 'hidden variable' model of quantum mechanics, provides one of the fundamental characterizations of uniquely quantum phenomena. Furthermore, this connection suggests a unifying paradigm for the resources of quantum information: the non-locality of quantum theory is a particular kind of contextuality, and non-locality is already known to be a critical resource for achieving advantages with quantum communication. In addition to clarifying these fundamental issues, this work advances the resource framework for quantum computation, which has a number of practical applications, such as characterizing the efficiency and trade-offs between distinct theoretical and experimental schemes for achieving robust quantum computation, and putting bounds on the overhead cost for the classical simulation of quantum algorithms. PMID:24919152

  5. Contextuality supplies the 'magic' for quantum computation.

    PubMed

    Howard, Mark; Wallman, Joel; Veitch, Victor; Emerson, Joseph

    2014-06-19

    Quantum computers promise dramatic advantages over their classical counterparts, but the source of the power in quantum computing has remained elusive. Here we prove a remarkable equivalence between the onset of contextuality and the possibility of universal quantum computation via 'magic state' distillation, which is the leading model for experimentally realizing a fault-tolerant quantum computer. This is a conceptually satisfying link, because contextuality, which precludes a simple 'hidden variable' model of quantum mechanics, provides one of the fundamental characterizations of uniquely quantum phenomena. Furthermore, this connection suggests a unifying paradigm for the resources of quantum information: the non-locality of quantum theory is a particular kind of contextuality, and non-locality is already known to be a critical resource for achieving advantages with quantum communication. In addition to clarifying these fundamental issues, this work advances the resource framework for quantum computation, which has a number of practical applications, such as characterizing the efficiency and trade-offs between distinct theoretical and experimental schemes for achieving robust quantum computation, and putting bounds on the overhead cost for the classical simulation of quantum algorithms.

  6. Contextuality supplies the `magic' for quantum computation

    NASA Astrophysics Data System (ADS)

    Howard, Mark; Wallman, Joel; Veitch, Victor; Emerson, Joseph

    2014-06-01

    Quantum computers promise dramatic advantages over their classical counterparts, but the source of the power in quantum computing has remained elusive. Here we prove a remarkable equivalence between the onset of contextuality and the possibility of universal quantum computation via `magic state' distillation, which is the leading model for experimentally realizing a fault-tolerant quantum computer. This is a conceptually satisfying link, because contextuality, which precludes a simple `hidden variable' model of quantum mechanics, provides one of the fundamental characterizations of uniquely quantum phenomena. Furthermore, this connection suggests a unifying paradigm for the resources of quantum information: the non-locality of quantum theory is a particular kind of contextuality, and non-locality is already known to be a critical resource for achieving advantages with quantum communication. In addition to clarifying these fundamental issues, this work advances the resource framework for quantum computation, which has a number of practical applications, such as characterizing the efficiency and trade-offs between distinct theoretical and experimental schemes for achieving robust quantum computation, and putting bounds on the overhead cost for the classical simulation of quantum algorithms.

  7. Lensfree computational microscopy tools for cell and tissue imaging at the point-of-care and in low-resource settings.

    PubMed

    Isikman, Serhan O; Greenbaum, Alon; Lee, Myungjun; Bishara, Waheb; Mudanyali, Onur; Su, Ting-Wei; Ozcan, Aydogan

    2012-01-01

    The recent revolution in digital technologies and information processing methods presents important opportunities to transform the way optical imaging is performed, particularly toward improving the throughput of microscopes while at the same time reducing their relative cost and complexity. Lensfree computational microscopy is rapidly emerging toward this end; by discarding lenses and other bulky optical components of conventional imaging systems and relying on digital computation instead, it can achieve both reflection- and transmission-mode microscopy over a large field-of-view within compact, cost-effective and mechanically robust architectures. Such high-throughput and miniaturized imaging devices can provide a complementary toolset for telemedicine applications and point-of-care diagnostics by facilitating complex and critical tasks such as cytometry and the microscopic analysis of, e.g., blood smears, Pap tests and tissue samples. In this article, the basics of these lensfree microscopy modalities are reviewed, and their clinically relevant applications are discussed.

  8. Act for Better Child Care Services of 1988. Report from the Committee on Labor and Human Resources Together with Additional Views (To Accompany S. 1885). 100th Congress, 2nd Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on Labor and Human Resources.

    The Act for Better Child Care Services of 1988, additional views of members of the United States Senate, and related materials are reported. The purpose of the Act is to increase the availability, affordability, and quality of child care throughout the nation. The legislation provides direct financial assistance to low-income and working families…

  9. SIPSEY WILDERNESS AND ADDITIONS, ALABAMA.

    USGS Publications Warehouse

    Schweinfurth, Stanley P.; Mory, Peter C.

    1984-01-01

    On the basis of geologic, geochemical, and mineral surveys the Sipsey Wilderness and additions are deemed to have little promise for the occurrence of metallic mineral resources. Although limestone, shale, and sandstone resources that occur in the area are physically suitable for a variety of uses, similar materials are available outside the area closer to transportation routes and potential markets. A small amount of coal has been identified in the area, occurring as nonpersistent beds less than 28 in. thick. Oil and (or) natural gas resources may be present if suitable structural traps exist in the subsurface. Therefore, the area has a probable oil and gas potential. Small amounts of asphaltic sandstone and limestone, commonly referred to as tar sands, may also occur in the subsurface. 5 refs.

  10. Computational physics with PetaFlops computers

    NASA Astrophysics Data System (ADS)

    Attig, Norbert

    2009-04-01

    Driven by technology, Scientific Computing is rapidly entering the PetaFlops era. The Jülich Supercomputing Centre (JSC), one of three German national supercomputing centres, is focusing on the IBM Blue Gene architecture to provide computer resources of this class to its users, the majority of whom are computational physicists. Details of the system will be discussed and applications will be described which significantly benefit from this new architecture.

  11. Integration of High-Performance Computing into Cloud Computing Services

    NASA Astrophysics Data System (ADS)

    Vouk, Mladen A.; Sills, Eric; Dreher, Patrick

    High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations, ranging from peta-flop supercomputers and high-end tera-flop facilities running a variety of operating systems and applications to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging. What they all have in common is that they operate as stand-alone systems rather than as a scalable, shared, user-reconfigurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called the Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-hours were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data showing that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).

  12. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and the increasing time needed to analyze them are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources, where computations are parallelized and run on the Amazon Elastic Compute Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared them with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investment makes cloud computing an attractive alternative for scientists, especially those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
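
    The speed-versus-economy choice the authors mention can be made explicit with a little arithmetic. The sketch below uses an Amdahl-style scaling estimate; the hourly price, serial runtime, and parallel fraction are invented placeholders, not figures from the paper.

      # Toy cost/speed trade-off for parallelized model building.
      # Prices and timings are illustrative assumptions.

      HOURLY_PRICE = 0.50      # $ per instance-hour (hypothetical)
      SERIAL_HOURS = 64.0      # wall-clock time on one instance (hypothetical)

      def wall_clock(n_instances, parallel_fraction=0.95):
          """Amdahl-style estimate of wall-clock hours on n instances."""
          serial = SERIAL_HOURS * (1 - parallel_fraction)
          parallel = SERIAL_HOURS * parallel_fraction / n_instances
          return serial + parallel

      for n in (1, 8, 32, 128):
          hours = wall_clock(n)
          cost = hours * n * HOURLY_PRICE     # total instance-hours billed
          print(f"{n:4d} instances: {hours:6.1f} h, ${cost:8.2f}")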

  13. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    PubMed

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.
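
    The create-run-fetch-delete lifecycle described above maps naturally onto a handful of calls. The sketch below uses a hypothetical CloudClient class rather than any real provider's SDK; the method names, resource sizes, and the aligner command are all invented for illustration.

      # Sketch of the pay-as-you-go workflow the abstract describes:
      # provision a virtual environment, run the analysis, fetch results, tear down.
      # CloudClient and its methods are hypothetical stand-ins, not a real SDK.

      class CloudClient:
          def provision(self, cpus, ram_gb, disk_gb): ...
          def upload(self, env, local_path): ...
          def run(self, env, command): ...
          def download(self, env, remote_path, local_path): ...
          def destroy(self, env): ...

      def align_reads(client: CloudClient):
          env = client.provision(cpus=32, ram_gb=128, disk_gb=500)
          try:
              client.upload(env, "reads.fastq")
              client.run(env, "aligner --input reads.fastq --output hits.sam")
              client.download(env, "hits.sam", "hits.sam")
          finally:
              client.destroy(env)   # billing stops once the environment is deleted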

  14. 18 CFR 1314.10 - Additional provisions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 2 2010-04-01 2010-04-01 false Additional provisions. 1314.10 Section 1314.10 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY BOOK-ENTRY... attachment for TVA Power Securities in Book-entry System. The interest of a debtor in a Security...

  15. 18 CFR 1314.10 - Additional provisions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 2 2011-04-01 2011-04-01 false Additional provisions. 1314.10 Section 1314.10 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY BOOK-ENTRY... attachment for TVA Power Securities in Book-entry System. The interest of a debtor in a Security...

  16. 18 CFR 1314.10 - Additional provisions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 2 2014-04-01 2014-04-01 false Additional provisions. 1314.10 Section 1314.10 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY BOOK-ENTRY... attachment for TVA Power Securities in Book-entry System. The interest of a debtor in a Security...

  17. 18 CFR 1314.10 - Additional provisions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 2 2012-04-01 2012-04-01 false Additional provisions. 1314.10 Section 1314.10 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY BOOK-ENTRY... attachment for TVA Power Securities in Book-entry System. The interest of a debtor in a Security...

  18. 18 CFR 1314.10 - Additional provisions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 2 2013-04-01 2012-04-01 true Additional provisions. 1314.10 Section 1314.10 Conservation of Power and Water Resources TENNESSEE VALLEY AUTHORITY BOOK-ENTRY... attachment for TVA Power Securities in Book-entry System. The interest of a debtor in a Security...

  19. Hydropower and Environmental Resource Assessment (HERA): a computational tool for the assessment of the hydropower potential of watersheds considering engineering and socio-environmental aspects.

    NASA Astrophysics Data System (ADS)

    Martins, T. M.; Kelman, R.; Metello, M.; Ciarlini, A.; Granville, A. C.; Hespanhol, P.; Castro, T. L.; Gottin, V. M.; Pereira, M. V. F.

    2015-12-01

    The hydroelectric potential of a river is proportional to its head and water flows. Selecting the best development alternative for greenfield watersheds is a difficult task, since it must balance demands for infrastructure, especially in the developing world where a large potential remains unexplored, with environmental conservation. Discussions usually diverge into antagonistic views, as in recent projects in the Amazon forest, for example. This motivates the construction of a computational tool that will support a more qualified debate regarding development/conservation options. HERA provides the optimal head-division partition of a river considering technical, economic and environmental aspects. HERA has three main components: (i) GIS pre-processing of topographic and hydrologic data; (ii) automatic engineering and equipment design and budget estimation for candidate projects; and (iii) translation of the division-partition problem into a mathematical programming model. By integrating automatic calculation with geoprocessing tools, cloud computation and optimization techniques, HERA makes it possible for countless head-partition alternatives to be compared systematically - a great advantage with respect to traditional field surveys followed by engineering design methods. Based on optimization techniques, HERA determines which hydro plants should be built, including location, technical data (e.g., water head, reservoir area and volume), engineering design (dam, spillways, etc.) and costs. The results can be visualized in the HERA interface, or exported to GIS software, Google Earth or CAD systems. HERA has a global scope of application since the main input data are a Digital Terrain Model and water inflows at gauging stations. The objective is to contribute to increased rationality of decisions by presenting to the stakeholders a clear and quantitative view of the alternatives, their opportunities and threats.
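
    At its core, the head-division problem is an optimal partition of a river profile into dam sites, which admits a classic dynamic-programming formulation. The sketch below shows only the shape of that DP with an invented benefit function and made-up elevations; HERA's actual model also carries engineering, economic and environmental terms.

      # Toy optimal river partition by dynamic programming.
      # value(i, j) is an invented net benefit of a plant spanning sites i..j.

      def value(i, j, elevation):
          head = elevation[i] - elevation[j]      # usable head over the reach
          penalty = 0.5 * (j - i)                 # stand-in for flooded area / cost
          return max(head - penalty, 0.0)

      def best_partition(elevation):
          n = len(elevation)
          best = [0.0] * n      # best[j]: max benefit using sites 0..j
          cut = [0] * n
          for j in range(1, n):
              for i in range(j):
                  cand = best[i] + value(i, j, elevation)
                  if cand > best[j]:
                      best[j], cut[j] = cand, i
          # Recover the chosen dam sites by walking the cut pointers back.
          dams, j = [], n - 1
          while j > 0:
              dams.append(j)
              j = cut[j]
          return best[n - 1], sorted(dams)

      elevation = [100.0, 92.0, 85.0, 70.0, 64.0, 50.0]   # upstream first (invented)
      print(best_partition(elevation))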

  20. Cost and Resource Utilization Associated with Use of Computed Tomography to Evaluate Chest Pain in the Emergency Department: The ROMICAT Study

    PubMed Central

    Hulten, Edward; Goehler, Alexander; Bittencourt, Marcio; Bamberg, Fabian; Schlett, Christopher L.; Truong, Quynh A.; Nichols, John; Nasir, Khurram; Rogers, Ian S.; Gazelle, Scott G.; Nagurney, John T.; Hoffmann, Udo; Blankstein, Ron

    2013-01-01

    Background Coronary computed tomography angiography (cCTA) allows for rapid non-invasive exclusion of obstructive coronary artery disease (CAD). However, concern exists whether implementation of cCTA in the assessment of patients presenting to the emergency room with acute chest pain will lead to increased downstream testing and costs compared to alternative strategies. Our aim was to compare observed actual costs of usual care (UC) with projected costs of a strategy including early cCTA in the evaluation of patients with acute chest pain in the Rule Out Myocardial Infarction Using Computed Tomography (ROMICAT I) study. Methods and Results We compared cost and hospital length of stay of UC observed among 368 patients enrolled in the ROMICAT I trial with projected costs of management based on cCTA. Costs of UC were determined by an electronic cost accounting system. Notably, UC was not influenced by cCTA results, as patients and caregivers were blinded to the cCTA results. Costs after early implementation of cCTA were estimated assuming changes in management based on cCTA findings of the presence and severity of CAD. Sensitivity analysis was used to test the influence of key variables on both outcomes and costs. We determined that, in comparison to UC, cCTA-guided triage, whereby patients with no CAD are discharged, could reduce total hospital costs by 23%, p < 0.001. However, when the prevalence of obstructive CAD increases, index hospitalization cost increases such that when the prevalence of ≥50% stenosis is greater than 28-33%, the use of cCTA becomes more costly than UC. Conclusion cCTA may be a cost-saving tool in acute chest pain populations that have a prevalence of potentially obstructive CAD lower than 30%. However, increased cost would be anticipated in populations with higher prevalence of disease. PMID:24021693

  1. Entanglement and adiabatic quantum computation

    NASA Astrophysics Data System (ADS)

    Ahrensmeier, D.

    2006-06-01

    Adiabatic quantum computation provides an alternative approach to quantum computation using a time-dependent Hamiltonian. The time evolution of entanglement during the adiabatic quantum search algorithm is studied, and its relevance as a resource is discussed.
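
    For orientation, the time-dependent Hamiltonian in adiabatic quantum computation is conventionally a linear interpolation between an easily prepared initial Hamiltonian and one whose ground state encodes the problem; a standard textbook form (not taken from this record) is

        H(t) = \left(1 - \frac{t}{T}\right) H_0 + \frac{t}{T} H_P, \qquad 0 \le t \le T,

    where the system is prepared in the ground state of H_0 and, provided the total runtime T is large compared with the inverse square of the minimum spectral gap of H(t), it finishes near the ground state of H_P. The entanglement studied in this record is that of the instantaneous state as it evolves along this interpolation.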

  2. Concurrent negotiation and coordination for grid resource coallocation.

    PubMed

    Sim, Kwang Mong; Shi, Benyun

    2010-06-01

    Bolstering resource coallocation is essential for realizing the Grid vision, because computationally intensive applications often require multiple computing resources from different administrative domains. Given that resource providers and consumers may have different requirements, successfully obtaining commitments through concurrent negotiations with multiple resource providers to simultaneously access several resources is a very challenging task for consumers. This paper is among the earliest works to consider a concurrent negotiation mechanism for Grid resource coallocation. The concurrent negotiation mechanism is designed for 1) managing (de)commitment of contracts through one-to-many negotiations and 2) coordination of multiple concurrent one-to-many negotiations between a consumer and multiple resource providers. The novel contributions of this paper are devising 1) a utility-oriented coordination (UOC) strategy, 2) three classes of commitment management strategies (CMSs) for concurrent negotiation, and 3) the negotiation protocols of consumers and providers. These ideas were implemented in a testbed, and three series of experiments were carried out in a variety of settings to compare the following: 1) the CMSs in this paper with the work of others in a single one-to-many negotiation environment for one resource where decommitment is allowed for both provider and consumer agents; 2) the performance of the three classes of CMSs in different resource market types; and 3) the UOC strategy with the work of others [e.g., the patient coordination strategy (PCS)] for coordinating multiple concurrent negotiations. Empirical results show the following: 1) the UOC strategy achieved higher utility, faster negotiation speed, and higher success rates than PCS for different resource market types; and 2) the CMS in this paper achieved higher final utility than the CMS in other works. Additionally, the properties of the three classes of CMSs in
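
    A minimal sketch of the commitment-management idea, with an invented penalty scheme (the paper's three CMS classes are not reproduced here): an agent holding a tentative contract decommits only when a new offer beats the old one by more than the decommitment penalty.

      # Toy decommitment rule for concurrent one-to-many negotiation.
      # Utilities and the penalty value are illustrative, not the paper's CMS definitions.

      def should_decommit(current_utility, new_offer_utility, penalty):
          """Swap contracts only if the gain outweighs the decommitment penalty."""
          return new_offer_utility - penalty > current_utility

      held = {"provider": "A", "utility": 0.62}
      offer = {"provider": "B", "utility": 0.75}

      if should_decommit(held["utility"], offer["utility"], penalty=0.10):
          held = offer          # pay the penalty to A, commit to B
      print(held)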

  3. Reconciling resource utilization and resource selection functions

    USGS Publications Warehouse

    Hooten, Mevin B.; Hanks, Ephraim M.; Johnson, Devin S.; Alldredge, Mat W.

    2013-01-01

    Summary: 1. Analyses based on utilization distributions (UDs) have been ubiquitous in animal space use studies, largely because they are computationally straightforward and relatively easy to employ. Conventional applications of resource utilization functions (RUFs) suggest that estimates of UDs can be used as response variables in a regression involving spatial covariates of interest. 2. It has been claimed that contemporary implementations of RUFs can yield inference about resource selection, although to our knowledge, an explicit connection has not been described. 3. We explore the relationships between RUFs and resource selection functions from a heuristic and simulation perspective. We investigate several sources of potential bias in the estimation of resource selection coefficients using RUFs (e.g. the spatial covariance modelling that is often used in RUF analyses). 4. Our findings illustrate that RUFs can, in fact, serve as approximations to RSFs and are capable of providing inference about resource selection, but only with some modification and under specific circumstances. 5. Using real telemetry data as an example, we provide guidance on which methods for estimating resource selection may be more appropriate and in which situations. In general, if telemetry data are assumed to arise as a point process, then RSF methods may be preferable to RUFs; however, modified RUFs may provide less biased parameter estimates when the data are subject to location error.
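
    The core of a conventional RUF analysis is a regression of an estimated utilization distribution on spatial covariates. A bare-bones version of that step is sketched below with simulated data; real analyses would add the spatial covariance structure the authors discuss, and the covariates and coefficients here are invented.

      # Minimal RUF-style regression: regress a (simulated) utilization surface
      # on spatial covariates by ordinary least squares.
      import numpy as np

      rng = np.random.default_rng(1)
      n_cells = 500
      elevation = rng.normal(size=n_cells)          # covariate 1 (simulated)
      forest = rng.uniform(size=n_cells)            # covariate 2 (simulated)
      true_beta = np.array([0.2, -0.8, 1.5])        # intercept, elevation, forest
      X = np.column_stack([np.ones(n_cells), elevation, forest])
      ud = X @ true_beta + rng.normal(scale=0.3, size=n_cells)   # stand-in for UD heights

      beta_hat, *_ = np.linalg.lstsq(X, ud, rcond=None)
      print("estimated coefficients:", beta_hat)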

  4. Reconciling resource utilization and resource selection functions.

    PubMed

    Hooten, Mevin B; Hanks, Ephraim M; Johnson, Devin S; Alldredge, Mat W

    2013-11-01

    1. Analyses based on utilization distributions (UDs) have been ubiquitous in animal space use studies, largely because they are computationally straightforward and relatively easy to employ. Conventional applications of resource utilization functions (RUFs) suggest that estimates of UDs can be used as response variables in a regression involving spatial covariates of interest. 2. It has been claimed that contemporary implementations of RUFs can yield inference about resource selection, although to our knowledge, an explicit connection has not been described. 3. We explore the relationships between RUFs and resource selection functions from a heuristic and simulation perspective. We investigate several sources of potential bias in the estimation of resource selection coefficients using RUFs (e.g. the spatial covariance modelling that is often used in RUF analyses). 4. Our findings illustrate that RUFs can, in fact, serve as approximations to RSFs and are capable of providing inference about resource selection, but only with some modification and under specific circumstances. 5. Using real telemetry data as an example, we provide guidance on which methods for estimating resource selection may be more appropriate and in which situations. In general, if telemetry data are assumed to arise as a point process, then RSF methods may be preferable to RUFs; however, modified RUFs may provide less biased parameter estimates when the data are subject to location error.

  5. Resource Planning Guide

    SciTech Connect

    1994-06-30

    RPG1.06 is an integrated resource planning (IRP) tool developed so that small public utilities would have a tool to complete an IRP in compliance with the Energy Policy Act of 1992. RPG1.06 is divided into three levels: Fast Track, Intermediate, and Detailed. Each level is designed to be completed independently. The Fast Track level is designed for users who want to complete a quick, simple planning study to identify applicable resource options. The Intermediate level is designed for users who want to add their own data and apply weighting factors to the results after completing a quick, simple planning study. The Detailed level is designed for users who want to identify applicable resource options and spend a fair amount of time collecting the data, computing the calculations, and compiling the results. The Detailed level contains a production costing module and optimization algorithms. The software contains the worksheets that appear in the workbook version. As with the workbooks, the software worksheets are designed to be completed in numerical order. The software automatically saves the data entered into a worksheet and carries the data from worksheet to worksheet, completing all calculations. RPG1.06 also contains three additional volumes. Getting Started is an introduction to RPG1.06 that helps the user understand the differences between the three workbooks and choose the workbook that best fits the user's needs. Reference Data contains supply, demand, end-use, weather, and survey data that can be used in lieu of a utility's own data. Reference Data is particularly helpful for a utility that does not have the time, staff, or money to gather all its own data. The Sample Load Forecasting Methodologies details the steps necessary to complete simple trend, end-use, and economic load forecasts.

  6. Resource Planning Guide

    1994-06-30

    RPG1.06 is an integrated resource planning (IRP) tool developed so that small public utilities would have a tool to complete an IRP in compliance with the Energy Policy Act of 1992. RPG1.06 is divided into three levels: Fast Track, Intermediate, and Detailed. Each level is designed to be completed independently. The Fast Track level is designed for users who want to complete a quick, simple planning study to identify applicable resource options. The Intermediate level is designed for users who want to add their own data and apply weighting factors to the results after completing a quick, simple planning study. The Detailed level is designed for users who want to identify applicable resource options and spend a fair amount of time collecting the data, computing the calculations, and compiling the results. The Detailed level contains a production costing module and optimization algorithms. The software contains the worksheets that appear in the workbook version. As with the workbooks, the software worksheets are designed to be completed in numerical order. The software automatically saves the data entered into a worksheet and carries the data from worksheet to worksheet, completing all calculations. RPG1.06 also contains three additional volumes. Getting Started is an introduction to RPG1.06 that helps the user understand the differences between the three workbooks and choose the workbook that best fits the user's needs. Reference Data contains supply, demand, end-use, weather, and survey data that can be used in lieu of a utility's own data. Reference Data is particularly helpful for a utility that does not have the time, staff, or money to gather all its own data. The Sample Load Forecasting Methodologies details the steps necessary to complete simple trend, end-use, and economic load forecasts.

  7. SETI@home, BOINC, and Volunteer Distributed Computing

    NASA Astrophysics Data System (ADS)

    Korpela, Eric J.

    2012-05-01

    Volunteer computing, also known as public-resource computing, is a form of distributed computing that relies on members of the public donating the processing power, Internet connection, and storage capabilities of their home computers. Projects that utilize this mode of distributed computation can potentially access millions of Internet-attached central processing units (CPUs) that provide PFLOPS (thousands of trillions of floating-point operations per second) of processing power. In addition, these projects can access the talents of the volunteers themselves. Projects span a wide variety of domains including astronomy, biochemistry, climatology, physics, and mathematics. This review provides an introduction to volunteer computing and some of the difficulties involved in its implementation. I describe the dominant infrastructure for volunteer computing in some depth and provide descriptions of a small number of projects as an illustration of the variety of projects that can be undertaken.
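
    The basic server-side pattern behind projects of this kind is to issue redundant copies of each work unit and accept a result once a quorum of returned answers agrees. The sketch below is a schematic of that general pattern, not BOINC's actual implementation; the quorum size and work-unit names are invented.

      # Schematic volunteer-computing validation loop: accept a work unit's
      # result once enough independent volunteer answers agree.
      from collections import defaultdict

      QUORUM = 2          # matching results required to validate (illustrative)

      results = defaultdict(list)          # work_unit_id -> list of returned answers

      def record_result(work_unit_id, answer):
          """Store a volunteer's answer; return the validated value once quorum is met."""
          results[work_unit_id].append(answer)
          for candidate in set(results[work_unit_id]):
              if results[work_unit_id].count(candidate) >= QUORUM:
                  return candidate         # canonical, validated result
          return None                      # still waiting for agreement

      print(record_result("wu_42", 3.14))  # None: only one answer so far
      print(record_result("wu_42", 3.14))  # 3.14: two answers agree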

  8. Parallel Analysis and Visualization on Cray Compute Node Linux

    SciTech Connect

    Pugmire, Dave; Ahern, Sean

    2008-01-01

    Capability computer systems are deployed to give researchers the computational power required to investigate and solve key challenges facing the scientific community. As the power of these computer systems increases, the computational problem domain typically increases in size, complexity and scope. These increases strain the ability of commodity analysis and visualization clusters to effectively perform post-processing tasks and provide critical insight and understanding of the computed results. An alternative to purchasing increasingly larger, separate analysis and visualization commodity clusters is to use the computational system itself to perform post-processing tasks. In this paper, the recent successful port of VisIt, a parallel, open source analysis and visualization tool, to Compute Node Linux running on the Cray is detailed. Additionally, the unprecedented analysis and visualization capability of this resource is discussed and the results obtained are reported.

  9. An iron–oxygen intermediate formed during the catalytic cycle of cysteine dioxygenase

    PubMed Central

    Tchesnokov, E. P.; Faponle, A. S.; Davies, C. G.; Quesne, M. G.; Turner, R.; Fellner, M.; Souness, R. J.; Wilbanks, S. M.

    2016-01-01

    Cysteine dioxygenase is a key enzyme in the breakdown of cysteine, but its mechanism remains controversial. A combination of spectroscopic and computational studies provides the first evidence of a short-lived intermediate in the catalytic cycle. The intermediate decays within 20 ms and has absorption maxima at 500 and 640 nm. PMID:27297454

  10. Incontinence - resources

    MedlinePlus

    Resources - incontinence ... The following organizations are good resources for information on incontinence. Fecal incontinence : The American Congress of Obstetricians and Gynecologists -- www.acog.org/~/media/for%20patients/faq139.ashx ...

  11. Scleroderma - resources

    MedlinePlus

    Resources - scleroderma ... The following organizations are good resources for information on scleroderma : American College of Rheumatology -- www.rheumatology.org/practice/clinical/patients/diseases_and_conditions/scleroderma.asp National Institute ...

  12. Epilepsy - resources

    MedlinePlus

    Resources - epilepsy ... The following organizations are good resources for information on epilepsy : Epilepsy Foundation -- www.efa.org National Institute of Neurologic Disorders and Stroke -- www.ninds.nih.gov/disorders/ ...

  13. Cancer - resources

    MedlinePlus

    Resources - cancer ... The following organizations are good resources for information on cancer : American Cancer Society -- www.cancer.org Cancer Care -- www.cancercare.org National Cancer Institute -- www.cancer.gov

  14. Breastfeeding - resources

    MedlinePlus

    Resources - breastfeeding ... The following organizations are good resources for information on breastfeeding and breastfeeding problems : La Leche League International Inc. -- www.lalecheleague.org March of Dimes -- www.marchofdimes.com/ ...

  15. Alzheimer - resources

    MedlinePlus

    Resources - Alzheimer ... The following organizations are good resources for information on Alzheimer disease : Alzheimer's Association -- www.alz.org Alzheimer's Disease Education and Referral Center -- www.nia.nih.gov/alzheimers ...

  16. Alcoholism - resources

    MedlinePlus

    Resources - alcoholism ... The following organizations are good resources for information on alcoholism : Alcoholics Anonymous -- www.aa.org Al-Anon/Alateen -- www.al-anon.org/home National Institute on Alcohol ...

  17. Scoliosis - resources

    MedlinePlus

    Resources - scoliosis ... The following organizations are good resources for information on scoliosis : American Academy of Orthopedic Surgeons -- orthoinfo.aaos.org/topic.cfm?topic=A00626 National Institute of Arthritis and ...

  18. Lupus - resources

    MedlinePlus

    Resources - lupus ... The following organizations are good resources for information on systemic lupus erythematosus : The Lupus Foundation of America -- www.lupus.org The National Institute of Arthritis and Musculoskeletal ...

  19. SIDS - resources

    MedlinePlus

    Resources - SIDS ... The following organizations are good resources for information on SIDS : American SIDS Institute -- www.sids.org Centers for Disease Control and Prevention -- www.cdc.gov/sids National ...

  20. Migraine - resources

    MedlinePlus

    Resources - migraine ... The following organizations are good resources for information on migraines : American Migraine Foundation -- www.americanmigrainefoundation.org National Headache Foundation -- www.headaches.org National Institute of Neurological Disorders ...

  1. Blindness - resources

    MedlinePlus

    Resources - blindness ... The following organizations are good resources for information on blindness : American Foundation for the Blind -- www.afb.org Foundation Fighting Blindness -- www.blindness.org National Eye Institute -- ...

  2. Psoriasis - resources

    MedlinePlus

    Resources - psoriasis ... The following organizations are good resources for information about psoriasis : American Academy of Dermatology -- www.aad.org/skin-conditions/dermatology-a-to-z/psoriasis National Institute of ...

  3. Infertility - resources

    MedlinePlus

    Resources - infertility ... The following organizations are good resources for information on infertility : Centers for Disease Control and Prevention -- www.cdc.gov/reproductivehealth/infertility March of Dimes -- www.marchofdimes.com/ ...

  4. ALS - resources

    MedlinePlus

    Resources - ALS ... The following organizations are good resources for information on amyotrophic lateral sclerosis : Muscular Dystrophy Association -- mda.org/disease/amyotrophic-lateral-sclerosis National Amyotrophic Lateral Sclerosis (ALS) Registry -- ...

  5. Ostomy - resources

    MedlinePlus

    Resources - ostomy ... The following organizations are good resources for information on ostomies: American Society of Colon and Rectal Surgeons -- www.fascrs.org/patients/treatments-screening and www.fascrs.org/ ...

  6. Human Resource Information System

    ERIC Educational Resources Information Center

    Swinford, Paul

    1978-01-01

    A computer at Valley View Schools, Illinois, is used to collect, store, maintain, and retrieve information about a school district's human resources. The system was designed to increase the efficiency and thoroughness of personnel and payroll record keeping, and reporting functions. (Author/MLF)

  7. Agriculture, forestry, range resources

    NASA Technical Reports Server (NTRS)

    Macdonald, R. B.

    1974-01-01

    The necessary elements to perform global inventories of agriculture, forestry, and range resources are being brought together through the use of satellites, sensors, computers, mathematics, and phenomenology. Results of ERTS-1 applications in these areas, as well as soil mapping, are described.

  8. Computer-aided analysis of Skylab multispectral scanner data in mountainous terrain for land use, forestry, water resource, and geologic applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. One of the most significant results of this Skylab research involved the geometric correction and overlay of the Skylab multispectral scanner data with the LANDSAT multispectral scanner data, and also with a set of topographic data including elevation, slope, and aspect. The Skylab S192 multispectral scanner data had distinct differences in noise level among the various wavelength bands. Results of the temporal evaluation of the SL-2 and SL-3 photography were found to be particularly important for proper interpretation of the computer-aided analysis of the SL-2 and SL-3 multispectral scanner data. There was a data quality problem involving the ringing effect introduced by digital filtering. The modified clustering technique was found valuable when working with multispectral scanner data involving many wavelength bands and covering large geographic areas. Analysis of the SL-2 scanner data involved classification of major cover types and also forest cover types. Comparison of the results obtained with Skylab MSS data and LANDSAT MSS data indicated that the improved spectral resolution of the Skylab scanner system enabled a higher classification accuracy to be obtained for forest cover types, although the classification performance for major cover types was not significantly different.

  9. Cloud computing basics for librarians.

    PubMed

    Hoy, Matthew B

    2012-01-01

    "Cloud computing" is the name for the recent trend of moving software and computing resources to an online, shared-service model. This article briefly defines cloud computing, discusses different models, explores the advantages and disadvantages, and describes some of the ways cloud computing can be used in libraries. Examples of cloud services are included at the end of the article.

  10. Cloud computing basics for librarians.

    PubMed

    Hoy, Matthew B

    2012-01-01

    "Cloud computing" is the name for the recent trend of moving software and computing resources to an online, shared-service model. This article briefly defines cloud computing, discusses different models, explores the advantages and disadvantages, and describes some of the ways cloud computing can be used in libraries. Examples of cloud services are included at the end of the article. PMID:22289098

  11. An expert fitness diagnosis system based on elastic cloud computing.

    PubMed

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
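
    The elastic allocation step rests on a standard exponential moving average: the predicted demand for the next interval is a weighted blend of the newest observation and the previous estimate. A minimal sketch follows; the smoothing weight, the per-instance capacity, and the request counts are invented, and the paper's Poisson machinery is not reproduced.

      # Exponential moving average of request rates, used to scale resources.
      # ALPHA and CAPACITY are illustrative values, not from the paper.
      import math

      ALPHA = 0.3                     # weight on the newest observation
      CAPACITY = 50                   # requests one instance can serve per interval

      def update_ema(prev_ema, observed):
          return ALPHA * observed + (1 - ALPHA) * prev_ema

      ema = 100.0                     # initial estimate of requests per interval
      for observed in [120, 150, 90, 200, 180]:
          ema = update_ema(ema, observed)
          instances = math.ceil(ema / CAPACITY)
          print(f"observed={observed:4d}  ema={ema:7.1f}  allocate {instances} instances")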

  12. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    PubMed Central

    Tseng, Kevin C.; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to capture tightly the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service. PMID:24723842

  13. Mediagraphy: Print and Nonprint Resources.

    ERIC Educational Resources Information Center

    Educational Media and Technology Yearbook, 1998

    1998-01-01

    Lists educational media-related journals, books, ERIC documents, journal articles, and nonprint resources classified by Artificial Intelligence, Robotics, Electronic Performance Support Systems; Computer-Assisted Instruction; Distance Education; Educational Research; Educational Technology; Electronic Publishing; Information Science and…

  14. Controlling user access to electronic resources without password

    SciTech Connect

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource proximal environmental information. In at least some embodiments, the process further includes comparing user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
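
    The grant decision described above reduces to a similarity test between two environmental readings. One toy way to realize "sufficiently similar" is a thresholded, normalized distance, sketched here with invented feature names, scales, and threshold; the patent does not specify this particular metric.

      # Toy access check: grant access when the user's environmental reading is
      # close enough to the reading pre-associated with the protected resource.
      import math

      RESOURCE_ENV = {"wifi_rssi": -42.0, "ambient_db": 55.0, "temp_c": 21.5}
      THRESHOLD = 1.0                          # illustrative cutoff

      def similar_enough(user_env, resource_env, threshold=THRESHOLD):
          """Normalized Euclidean distance over shared environmental features."""
          scale = {"wifi_rssi": 10.0, "ambient_db": 10.0, "temp_c": 2.0}
          d = math.sqrt(sum(((user_env[k] - resource_env[k]) / scale[k]) ** 2
                            for k in resource_env))
          return d <= threshold

      user_env = {"wifi_rssi": -45.0, "ambient_db": 58.0, "temp_c": 21.0}
      print("grant" if similar_enough(user_env, RESOURCE_ENV) else "deny")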

  15. Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing

    PubMed Central

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640

  16. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension: the accountability of programmers.

  17. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  18. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  19. Implementation of CCNUGrid-based Computational Environment for Molecular Modeling

    NASA Astrophysics Data System (ADS)

    Liu, Kai; Luo, Changhua; Ren, Yanliang; Wan, Jian; Xu, Xin

    2007-12-01

    Grid computing technology has been regarded as one of the most promising solutions to the tremendous demand for computing resources in the field of molecular modeling. In contrast to building an ever more powerful supercomputer with novel hardware on a local network, grid technology enables us, in principle, to integrate various past and present computing resources located at different sites into a single computing platform. As a case demonstration, we report herein that a campus grid entitled CCNUGrid was implemented with grid middleware, consisting of four local computing networks distributed across the College of Chemistry, the College of Physics, the Center for Network, and the Center for Education Information Technology and Engineering at Central China Normal University. Visualization functions for monitoring the computing machines in each local network, the job processing flow, and the computational results were realized in this campus grid-based computational environment, in addition to the conventional components of grid architecture: universal portal, task management, computing nodes, and security. In the last section of this paper, a molecular docking-based virtual screening study performed on the CCNUGrid is presented as one example of its applications.
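
    A monitoring function of the kind described, watching job flow across the four local networks, might look like the following sketch; the status endpoints and JSON fields are hypothetical stand-ins for the CCNUGrid portal's actual interfaces.

```python
# Sketch of job-flow monitoring across a grid's local networks: poll
# each network's status endpoint and summarize. The URLs and JSON
# fields are hypothetical, for illustration only.
import time
import requests

NODES = {
    "chemistry": "http://chem.grid.example/status",
    "physics": "http://phys.grid.example/status",
}

def poll_once() -> None:
    for name, url in NODES.items():
        try:
            status = requests.get(url, timeout=5).json()
            print(f"{name}: {status['running']} running, {status['queued']} queued")
        except requests.RequestException:
            print(f"{name}: unreachable")

for _ in range(10):   # refresh the monitoring view ten times
    poll_once()
    time.sleep(30)
```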

  20. Cloudbus Toolkit for Market-Oriented Cloud Computing

    NASA Astrophysics Data System (ADS)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines an architecture for creating market-oriented Clouds by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing an SDK (Software Development Kit) for the construction of Cloud applications and their deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for the dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of third-party Cloud brokering services for building content delivery networks and e-Science applications, and their deployment on the capabilities of IaaS providers such as Amazon, along with Grid mashups; (iv) CloudSim, supporting modelling and simulation of Clouds for performance studies; (v) energy-efficient resource allocation mechanisms and techniques for the creation and management of Green Clouds; and (vi) pathways for future research.
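
    As a rough sketch of the SLA-oriented, market-based allocation in item (3), the broker below picks the cheapest provider that still meets a job's deadline; the providers, prices, and runtimes are invented for illustration.

```python
# Sketch of market-based, SLA-oriented allocation: among providers that
# can meet the job's deadline, choose the one with the lowest total cost.
# Provider figures are hypothetical.
from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    price_per_hour: float
    est_hours: float    # estimated completion time on this provider

def choose(offers: list[Offer], deadline_hours: float) -> Offer:
    feasible = [o for o in offers if o.est_hours <= deadline_hours]
    if not feasible:
        raise ValueError("no provider can meet the SLA deadline")
    # Among SLA-feasible offers, minimize total cost.
    return min(feasible, key=lambda o: o.price_per_hour * o.est_hours)

offers = [Offer("A", 1.20, 10), Offer("B", 0.40, 30), Offer("C", 0.60, 18)]
print(choose(offers, deadline_hours=20).provider)   # -> "C"
```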

  1. Profiling Computing Coordinators.

    ERIC Educational Resources Information Center

    Edwards, Sigrid; Morton, Allan

    The people responsible for managing school computing resources in Australia have become known as Computing Coordinators. To date there has been no large systematic study of the role, responsibilities and characteristics of this position. This paper represents a first attempt to provide information on the functions and attributes of the Computing…

  2. Computers in Fundamental Mathematics.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Computer Information Services.

    This manual provides a resource for mathematics teachers who wish to take advantage of the extensive library of computer software materials available to expand and strengthen classroom instruction. Classroom management procedures, software duplication guidelines, copying procedures of diskettes for MS DOS and Apple II computers, and methods for…

  3. UT-CT: A National Resource for Applications of High-Resolution X-ray Computed Tomography in the Geological Sciences

    NASA Astrophysics Data System (ADS)

    Carlson, W. D.; Ketcham, R. A.; Rowe, T. B.

    2002-12-01

    An NSF-sponsored (EAR-IF) shared multi-user facility dedicated to research applications of high-resolution X-ray computed tomography (CT) in the geological sciences has been in operation since 1997 at the University of Texas at Austin. The centerpiece of the facility is an industrial CT scanner custom-designed for geological applications. Because the instrument can optimize trade-offs among penetrating ability, spatial resolution, density discrimination, imaging modes, and scan times, it can image a very broad range of geological specimens and materials, and thus offers significant advantages over medical scanners and desktop microtomographs. Two tungsten-target X-ray sources (200-kV microfocal and 420-kV) and three X-ray detectors (image-intensifier, high-sensitivity cadmium tungstate linear array, and high-resolution gadolinium-oxysulfide radiographic line scanner) can be used in various combinations to meet specific imaging goals. Further flexibility is provided by multiple imaging modes: second-generation (translate-rotate), third-generation (rotate-only; centered and variably offset), and cone-beam (volume CT). The instrument can accommodate specimens as small as about 1 mm on a side, and as large as 0.5 m in diameter and 1.5 m tall. Applications in petrology and structural geology include measuring crystal sizes and locations to identify mechanisms governing the kinetics of metamorphic reactions; visualizing relationships between alteration zones and abundant macrodiamonds in Siberian eclogites to elucidate metasomatic processes in the mantle; characterizing morphologies of spiral inclusion trails in garnet to test hypotheses of porphyroblast rotation during growth; measuring vesicle size distributions in basaltic flows for determination of elevation at the time of eruption to constrain timing and rates of continental uplift; analysis of the geometry, connectivity, and tortuosity of migmatite leucosomes to define the topology of melt flow paths, for numerical

  4. The Evolution of Cloud Computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Berghaus, Frank; Brasolin, Franco; Domingues Cordeiro, Cristovao Jose; Desmarais, Ron; Field, Laurence; Gable, Ian; Giordano, Domenico; Di Girolamo, Alessandro; Hover, John; LeBlanc, Matthew; Love, Peter; Paterson, Michael; Sobie, Randall; Zaytsev, Alexandr

    2015-12-01

    The ATLAS experiment at the LHC has successfully incorporated cloud computing technology and cloud resources into its primarily grid-based model of distributed computing. Cloud R&D activities continue to mature and transition into stable production systems, while ongoing evolutionary changes are still needed to adapt and refine the approaches used, in response to changes in prevailing cloud technology. In addition, completely new developments are needed to handle emerging requirements. This paper describes the overall evolution of cloud computing in ATLAS. The current status of the virtual machine (VM) management systems used for harnessing Infrastructure as a Service resources is discussed. Monitoring and accounting systems tailored for clouds are needed to complete the integration of cloud resources within ATLAS' distributed computing framework. We are developing and deploying new solutions to address the challenge of operation in a geographically distributed multi-cloud scenario, including a system for managing VM images across multiple clouds, a system for dynamic location-based discovery of caching proxy servers, and the usage of a data federation to unify the worldwide grid of storage elements into a single namespace and access point. The usage of the experiment's high level trigger farm for Monte Carlo production, in a specialized cloud environment, is presented. Finally, we evaluate and compare the performance of commercial clouds using several benchmarks.

  5. On the Cutting Edge of Change: The Community College Library/Learning Resource Center.

    ERIC Educational Resources Information Center

    Giles, Louise

    Community college libraries tend to be the pace setters for change in libraries. As the new media have been introduced into libraries, ways have been found to integrate them into the library to create multimedia resource collections. In addition to traditional library services, instructional design, materials production, computer operation,…

  6. Symphony Time Accounting Resource (STAR)

    SciTech Connect

    Newfield, S.E.; Booth, J.W.; Redman, D.L.

    1986-05-01

    The Symphony Time Accounting Resource, a new time accounting system that can be run on personal computers instead of mainframes, is described. The new system is useful for organizations that do work under several job order numbers and/or accounting codes, and it could also be adapted for use by organizations on the recharge system. 1 fig., 2 tabs.

  7. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    SciTech Connect

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  8. Problem Contexts for Thinking about Equality: An Additional Resource

    ERIC Educational Resources Information Center

    Barlow, Angela T.; Harmon, Shannon E.

    2012-01-01

    It has been well-documented that many students do not understand the meaning of the equal sign. Thus, researchers have called for instruction that specifically addresses misconceptions about the equal sign, and have indicated that such work must start at the elementary level. In response to these recommendations, some state curriculums require…

  9. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  10. Extractable resources

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The use of information from space systems in the operation of extractive industries, particularly in exploration for mineral and fuel resources was reviewed. Conclusions and recommendations reported are based on the fundamental premise that survival of modern industrial society requires a continuing secure flow of resources for energy, construction and manufacturing, and for use as plant foods.

  11. Impact of Classroom Computer Use on Computer Anxiety.

    ERIC Educational Resources Information Center

    Lambert, Matthew E.; And Others

    Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…

  12. Advancing computational methods for calibration of the Soil and Water Assessment Tool (SWAT): Application for modeling climate change impacts on water resources in the Upper Neuse Watershed of North Carolina

    NASA Astrophysics Data System (ADS)

    Ercan, Mehmet Bulent

    Watershed-scale hydrologic models are used for a variety of applications from flood prediction, to drought analysis, to water quality assessments. A particular challenge in applying these models is calibration of the model parameters, many of which are difficult to measure at the watershed-scale. A primary goal of this dissertation is to contribute new computational methods and tools for calibration of watershed-scale hydrologic models and the Soil and Water Assessment Tool (SWAT) model, in particular. SWAT is a physically-based, watershed-scale hydrologic model developed to predict the impact of land management practices on water quality and quantity. The dissertation follows a manuscript format, meaning it comprises three separate but interrelated research studies. The first two research studies focus on SWAT model calibration, and the third research study presents an application of the new calibration methods and tools to study climate change impacts on water resources in the Upper Neuse Watershed of North Carolina using SWAT. The objective of the first two studies is to overcome computational challenges associated with calibration of SWAT models. The first study evaluates a parallel SWAT calibration tool built using the Windows Azure cloud environment and a parallel version of the Dynamically Dimensioned Search (DDS) calibration method modified to run in Azure. The calibration tool was tested for six model scenarios constructed using three watersheds of increasing size (the Eno, Upper Neuse, and Neuse) for both a 2 year and 10 year simulation duration. Leveraging the cloud as an on-demand computing resource allowed for a significantly reduced calibration time such that calibration of the Neuse watershed went from taking 207 hours on a personal computer to only 3.4 hours using 256 cores in the Azure cloud. The second study aims at increasing SWAT model calibration efficiency by creating an open source, multi-objective calibration tool using the Non
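
    The Dynamically Dimensioned Search method used in the first study is simple enough to sketch. The code below follows the published DDS recipe (a greedy, single-solution search that perturbs fewer decision variables as iterations progress); the quadratic objective is a stand-in for an expensive SWAT run, not the dissertation's actual setup.

```python
# Sketch of Dynamically Dimensioned Search (DDS): a greedy single-solution
# search that perturbs fewer decision variables as iterations progress.
# The quadratic objective below stands in for an expensive SWAT run.
import math
import random

def dds(objective, lo, hi, max_iter=1000, r=0.2, seed=0):
    rng = random.Random(seed)
    n = len(lo)
    best = [rng.uniform(lo[i], hi[i]) for i in range(n)]
    best_val = objective(best)
    for i in range(1, max_iter):
        # Probability of perturbing each dimension decays with iteration.
        p = 1.0 - math.log(i) / math.log(max_iter)
        dims = [d for d in range(n) if rng.random() < p] or [rng.randrange(n)]
        cand = best[:]
        for d in dims:
            cand[d] += rng.gauss(0.0, 1.0) * r * (hi[d] - lo[d])
            # Reflect at the bounds, then clamp, as in the DDS scheme.
            if cand[d] < lo[d]:
                cand[d] = lo[d] + (lo[d] - cand[d])
            if cand[d] > hi[d]:
                cand[d] = hi[d] - (cand[d] - hi[d])
            cand[d] = min(max(cand[d], lo[d]), hi[d])
        val = objective(cand)
        if val < best_val:   # greedy acceptance
            best, best_val = cand, val
    return best, best_val

# Stand-in objective: minimize distance to a known optimum at x = 3.
best, val = dds(lambda x: sum((xi - 3.0) ** 2 for xi in x),
                lo=[0.0] * 5, hi=[10.0] * 5)
```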

  13. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects.

  14. Graphic engine resource management

    NASA Astrophysics Data System (ADS)

    Bautin, Mikhail; Dwarakinath, Ashok; Chiueh, Tzi-cker

    2008-01-01

    Modern consumer-grade 3D graphic cards boast computation/memory resources that can easily rival or even exceed those of standard desktop PCs. Although these cards are mainly designed for 3D gaming applications, their enormous computational power has attracted developers to port an increasing number of scientific computation programs to these cards, including matrix computation, collision detection, cryptography, database sorting, etc. As more and more applications run on 3D graphic cards, there is a need to allocate the computation/memory resource on these cards among the sharing applications more fairly and efficiently. In this paper, we describe the design, implementation and evaluation of a Graphic Processing Unit (GPU) scheduler based on Deficit Round Robin scheduling that successfully allocates to every process an equal share of the GPU time regardless of its demand. This scheduler, called GERM, estimates the execution time of each GPU command group based on dynamically collected statistics, and controls each process's GPU command production rate through its CPU scheduling priority. Measurements on the first GERM prototype show that this approach can keep the maximal GPU time consumption difference among concurrent GPU processes consistently below 5% for a variety of application mixes.
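
    The Deficit Round Robin idea behind GERM can be sketched briefly: each process's queue earns a fixed quantum of estimated GPU time per round and may dispatch command groups only while its accumulated deficit covers their estimated cost. The quantum, costs, and queue contents below are illustrative, not GERM's actual parameters.

```python
# Sketch of Deficit Round Robin as applied to GPU command groups: each
# process earns a quantum of estimated GPU time per round and dispatches
# command groups only while its deficit covers their estimated cost.
from collections import deque

def drr_schedule(queues, quantum=10.0, rounds=3):
    """queues: {pid: deque of estimated command-group costs (ms)}."""
    deficit = {pid: 0.0 for pid in queues}
    for _ in range(rounds):
        for pid, q in queues.items():
            if not q:
                deficit[pid] = 0.0   # empty queues accrue no credit
                continue
            deficit[pid] += quantum
            while q and q[0] <= deficit[pid]:
                cost = q.popleft()
                deficit[pid] -= cost
                print(f"dispatch {cost:.1f} ms from process {pid}")

drr_schedule({1: deque([4.0, 4.0, 9.0]), 2: deque([12.0, 3.0])})
```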

  15. Resources Management System

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Delta Data Systems, Inc. was originally formed by NASA and industry engineers to produce a line of products that evolved from ELAS, a NASA-developed computer program. The company has built on that experience, using ELAS as the basis for other remote sensing products. One of these is AGIS, a computer package for geographic and land information systems. AGIS simultaneously processes remotely sensed and map data. The software is designed to operate on a low cost microcomputer, putting resource management tools within reach of small operators.

  16. A resource management architecture for metacomputing systems.

    SciTech Connect

    Czajkowski, K.; Foster, I.; Karonis, N.; Kesselman, C.; Martin, S.; Smith, W.; Tuecke, S.

    1999-08-24

    Metacomputing systems are intended to support remote and/or concurrent use of geographically distributed computational resources. Resource management in such systems is complicated by five concerns that do not typically arise in other situations: site autonomy and heterogeneous substrates at the resources, and application requirements for policy extensibility, co-allocation, and online control. We describe a resource management architecture that addresses these concerns. This architecture distributes the resource management problem among distinct local manager, resource broker, and resource co-allocator components and defines an extensible resource specification language to exchange information about requirements. We describe how these techniques have been implemented in the context of the Globus metacomputing toolkit and used to implement a variety of different resource management strategies. We report on our experiences applying our techniques in a large testbed, GUSTO, incorporating 15 sites, 330 computers, and 3600 processors.
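
    The broker's role in this architecture can be illustrated with a minimal matching sketch; the specification fields and site data below are invented for illustration and are not the paper's actual resource specification language.

```python
# Sketch of a resource broker matching an application's specification
# against what local managers report. Fields and sites are illustrative.
resources = [
    {"site": "anl", "cpus": 256, "memory_gb": 512, "queue": "batch"},
    {"site": "usc", "cpus": 64,  "memory_gb": 128, "queue": "interactive"},
]

def broker(spec, resources):
    """Return sites satisfying every numeric minimum in the spec."""
    return [r["site"] for r in resources
            if all(r.get(k, 0) >= v for k, v in spec.items())]

# A request for 128 CPUs and 256 GB of memory matches only one site here;
# a co-allocator would combine several such matches for one computation.
print(broker({"cpus": 128, "memory_gb": 256}, resources))   # -> ['anl']
```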

  17. Aviation & Space Education: A Teacher's Resource Guide.

    ERIC Educational Resources Information Center

    Texas State Dept. of Aviation, Austin.

    This resource guide contains information on curriculum guides, resources for teachers, computer software and computer related programs, audio/visual presentations, model aircraft and demonstration aids, training seminars and career education, and an aerospace bibliography for primary grades. Each entry includes all or some of the following items:…

  18. Computer image processing in marine resource exploration

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

    1976-01-01

    Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
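
    A minimal example of the noise removal described above is a median filter, which suppresses speckle while preserving edges. This sketch uses scipy.ndimage on a synthetic array; it illustrates the technique, not the authors' original software.

```python
# Sketch of noise removal on a deep-sea photograph: a median filter
# suppresses sensor noise while preserving edges. The image is synthetic.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
image = np.full((64, 64), 100.0)
image[20:40, 20:40] = 180.0                         # a bright "feature"
noisy = image + rng.normal(0.0, 25.0, image.shape)  # additive sensor noise

cleaned = median_filter(noisy, size=3)   # 3x3 neighborhood median
```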

  19. Resources in Environmental Chemistry: 1. An Annotated Bibliography of the Chemistry of Pollution and Resources

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth A.

    1976-01-01

    References are divided into three areas: general; air; and resources, wastes, and recycling. Includes books, periodicals, abstracts, government documents, experiments, computer models, simulations, and films. (MLH)

  20. Resources for flow and image cytometry

    SciTech Connect

    Cassidy, M.

    1990-01-01

    This paper describes resources available to the flow and image cytometry community. I have been asked to limit the discussion to resources available in the United States, so reference to resources exclusively available in Japan, Europe, or Australia are not included. It is not the intention of this paper to include each and every resource available; rather, it describes the types available and gives some examples. Included in this manuscript are listings of some of the examples of resources which readers may find useful. Addresses of commercial companies are not included in the interest of space. Most of the examples listed advertise on a regular basis in journals publishing in cytometry fields. The resources to be described are divided into five categories: instrument resources, computer and software resources, standards, physical or "user" resources, and instructional resources. Each of these resources will be discussed separately. 4 tabs.

  1. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  2. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 3 2013-07-01 2013-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  3. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  4. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 3 2012-07-01 2012-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  5. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 3 2014-07-01 2014-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  6. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  7. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  8. 40 CFR 51.280 - Resources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Resources. 51.280 Section 51.280... Resources. Each plan must include a description of the resources available to the State and local agencies at the date of submission of the plan and of any additional resources needed to carry out the...

  9. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 3 2011-07-01 2011-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  10. 40 CFR 52.135 - Resources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 3 2010-07-01 2010-07-01 false Resources. 52.135 Section 52.135... PROMULGATION OF IMPLEMENTATION PLANS Arizona § 52.135 Resources. (a) The requirements of § 51.280 of this... resources available to the State and local agencies and of additional resources needed to carry out the...

  11. Automatic Computer Mapping of Terrain

    NASA Technical Reports Server (NTRS)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, although they were greatly reduced by preprocessing techniques.
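
    The supervised mapping described above can be illustrated with a minimum-distance-to-means classifier, one of the standard supervised rules of that era; the band values and class means below are invented for illustration.

```python
# Sketch of supervised multispectral classification: assign each pixel
# to the terrain class with the nearest mean spectral signature.
# Band values and class means are illustrative, not the study's data.
import numpy as np

# Mean signatures per class over (here) three bands, from training areas.
class_means = {
    "water":  np.array([10.0, 15.0, 5.0]),
    "forest": np.array([30.0, 80.0, 40.0]),
    "talus":  np.array([90.0, 85.0, 70.0]),
}

def classify(pixel: np.ndarray) -> str:
    # Minimum-distance-to-means decision rule.
    return min(class_means, key=lambda c: np.linalg.norm(pixel - class_means[c]))

print(classify(np.array([28.0, 75.0, 42.0])))   # -> "forest"
```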

  12. Gateways among Academic Computer Networks.

    ERIC Educational Resources Information Center

    McCredie, John W.

    National intercampus computer networks are discussed, along with six illustrative networks. Attention is focused on computer networks with significant academic usage through which special software is available to manage resources in the network. It is noted that computer networks have widespread use among academics for communication in the form of…

  13. Computer Viruses: Pathology and Detection.

    ERIC Educational Resources Information Center

    Maxwell, John R.; Lamon, William E.

    1992-01-01

    Explains how computer viruses were originally created, how a computer can become infected by a virus, how viruses operate, symptoms that indicate a computer is infected, how to detect and remove viruses, and how to prevent a reinfection. A sidebar lists eight antivirus resources. (four references) (LRW)

  14. Experimental verification of quantum computation

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Fitzsimons, Joseph F.; Kashefi, Elham; Walther, Philip

    2013-11-01

    Quantum computers are expected to offer substantial speed-ups over their classical counterparts and to solve problems intractable for classical computers. Beyond such practical significance, the concept of quantum computation opens up fundamental questions, among them the issue of whether quantum computations can be certified by entities that are inherently unable to compute the results themselves. Here we present the first experimental verification of quantum computation. We show, in theory and experiment, how a verifier with minimal quantum resources can test a significantly more powerful quantum computer. The new verification protocol introduced here uses the framework of blind quantum computing and is independent of the experimental quantum-computation platform used. In our scheme, the verifier is required only to generate single qubits and transmit them to the quantum computer. We experimentally demonstrate this protocol using four photonic qubits and show how the verifier can test the computer's ability to perform quantum computation.

  15. Community-Authored Resources for Education

    ERIC Educational Resources Information Center

    Tinker, Robert; Linn, Marcia; Gerard, Libby; Staudt, Carolyn

    2010-01-01

    Textbooks are resources for learning that provide the structure, content, assessments, and teacher guidance for an entire course. Technology can provide a far better resource that provides the same functions, but takes full advantage of computers. This resource would be much more than text on a screen. It would use the best technology; it would be…

  16. Hemophilia - resources

    MedlinePlus

    Resources - hemophilia ... The following organizations provide further information on hemophilia : Centers for Disease Control and Prevention -- www.cdc.gov/ncbddd/hemophilia/index.html National Heart, Lung, and Blood Institute -- www.nhlbi.nih.gov/ ...

  17. Diabetes - resources

    MedlinePlus

    Resources - diabetes ... The following sites provide further information on diabetes: American Diabetes Association -- www.diabetes.org Juvenile Diabetes Research Foundation International -- www.jdrf.org National Center for Chronic Disease Prevention and Health Promotion -- ...

  18. Depression - resources

    MedlinePlus

    Resources - depression ... Depression is a medical condition. If you think you may be depressed, see a health care provider. ... following organizations are good sources of information on depression : American Psychological Association -- www.apa.org/topics/depress/ ...

  19. Arthritis - resources

    MedlinePlus

    Resources - arthritis ... The following organizations provide more information on arthritis : American Academy of Orthopaedic Surgeons -- orthoinfo.aaos.org/menus/arthritis.cfm Arthritis Foundation -- www.arthritis.org Centers for Disease Control and Prevention -- www. ...

  20. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, this emerging technology has become one of the hottest in the industry. From the publication of Google's core papers starting in 2003, to the commercialization of Amazon EC2 in 2006, and on to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecom operators. This paper introduces the concept, history, and pros and cons of cloud computing, as well as the value chain and standardization efforts.