Science.gov

Sample records for science grid experimental

  1. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    SciTech Connect

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  2. The open science grid

    SciTech Connect

    Pordes, R.; /Fermilab

    2004-12-01

    The U.S. LHC Tier-1 and Tier-2 laboratories and universities are developing production Grids to support LHC applications running across a worldwide Grid computing system. Together with partners in computer science, physics grid projects and active experiments, we will build a common national production grid infrastructure which is open in its architecture, implementation and use. The Open Science Grid (OSG) model builds upon the successful approach of last year's joint Grid2003 project. The Grid3 shared infrastructure has for over eight months provided significant computational resources and throughput to a range of applications, including ATLAS and CMS data challenges, SDSS, LIGO, and biology analyses, and computer science demonstrators and experiments. To move towards LHC-scale data management, access and analysis capabilities, we must increase the scale, services, and sustainability of the current infrastructure by an order of magnitude or more. Thus, we must achieve a significant upgrade in its functionalities and technologies. The initial OSG partners will build upon a fully usable, sustainable and robust grid. Initial partners include the US LHC collaborations, DOE & NSF Laboratories and Universities & Trillium Grid projects. The approach is to federate with other application communities in the U.S. to build a shared infrastructure open to other sciences and capable of being modified and improved to respond to needs of other applications, including CDF, D0, BaBar, and RHIC experiments. We describe the application-driven, engineered services of the OSG, short term plans and status, and the roadmap for a consortium, its partnerships and national focus.

  3. Reliable multicast for the Grid: a case study in experimental computer science.

    PubMed

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael

    2005-08-15

    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.

  4. The Open Science Grid

    SciTech Connect

    Pordes, Ruth; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Wurthwein, Frank; Gardner, Rob; Wilde, Mike; /Chicago U. /Indiana U.

    2007-06-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  5. New Science on the Open Science Grid

    SciTech Connect

    Pordes, Ruth; Altunay, Mine; Avery, Paul; Bejan, Alina; Blackburn, Kent; Blatecky, Alan; Gardner, Rob; Kramer, Bill; Livny, Miron; McGee, John; Potekhin, Maxim; /Fermilab /Florida U. /Chicago U. /Caltech /LBL, Berkeley /Wisconsin U., Madison /Indiana U. /Brookhaven /UC, San Diego

    2008-06-01

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. There are frequently significant sociological and organizational changes required in transformation from the existing to the new. OSG leverages its deliverables to the large-scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement and the distributed facility. As a companion to the poster and tutorial at SciDAC 2008, this paper gives both a brief general description and some specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  6. New science on the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Pordes, R.; Altunay, M.; Avery, P.; Bejan, A.; Blackburn, K.; Blatecky, A.; Gardner, R.; Kramer, B.; Livny, M.; McGee, J.; Potekhin, M.; Quick, R.; Olson, D.; Roy, A.; Sehgal, C.; Wenaus, T.; Wilde, M.; Würthwein, F.

    2008-07-01

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. There are frequently significant sociological and organizational changes required in transformation from the existing to the new. OSG leverages its deliverables to the large-scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement, and the distributed facility. This paper gives both a brief general description and specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  7. Space-based Science Operations Grid Prototype

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Welch, Clara L.; Redman, Sandra

    2004-01-01

    science experimenters. There is an international aspect to the Grid involving the America's Pathway (AMPath) network, the Chilean REUNA Research and Education Network and the University of Chile in Santiago that will further demonstrate how extensively these services can be used. From the user's perspective, the Prototype will provide a single interface and logon to these varied services without the complexity of knowing the where's and how's of each service. There is a separate and deliberate emphasis on security. Security will be addressed by specifically outlining the different approaches and tools used. Grid technology, unlike the Internet, is being designed with security in mind. In addition we will show the locations, configurations and network paths associated with each service and virtual organization. We will discuss the separate virtual organizations that we define for the varied user communities. These will include certain, as yet undetermined, space-based science functions and/or processes and will include specific virtual organizations required for public and educational outreach and science and engineering collaboration. We will also discuss the Grid Prototype performance and the potential for further Grid applications in both space-based and ground-based projects and processes. In this paper and presentation we will detail each service and how they are integrated using Grid technologies.

  8. Enabling Campus Grids with Open Science Grid Technology

    NASA Astrophysics Data System (ADS)

    Weitzel, Derek; Bockelman, Brian; Fraser, Dan; Pordes, Ruth; Swanson, David

    2011-12-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.
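
    As an illustration of the job-submission side of such a campus grid, the sketch below shows how a single job might be handed to a Condor-managed pool from Python. The payload script, file names and resource request are hypothetical, and the bridging daemon for non-Condor clusters described in the paper is not shown; this is only a minimal sketch of a vanilla-universe submission.

```python
# Minimal sketch: handing a job to a Condor-managed campus pool. The payload
# script, file names and resource request below are hypothetical; the bridging
# daemon for non-Condor clusters described in the paper is not shown.
import subprocess
import tempfile

SUBMIT_DESCRIPTION = """\
universe     = vanilla
executable   = /home/user/analysis.sh
arguments    = --input run042.dat
log          = job.log
output       = job.out
error        = job.err
request_cpus = 1
queue
"""

def submit_to_campus_pool() -> None:
    """Write a Condor submit description and hand it to condor_submit."""
    with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
        f.write(SUBMIT_DESCRIPTION)
        submit_file = f.name
    # condor_submit is the standard Condor submission tool on the campus pool.
    subprocess.run(["condor_submit", submit_file], check=True)

if __name__ == "__main__":
    submit_to_campus_pool()
```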

  9. Grid for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Petitdidier, Monique; Schwichtenberg, Horst

    2013-04-01

    The civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change and new energies. The main critical point is that, on the one hand, civil society and the public ask for certainties, i.e. precise values with a small error range, concerning predictions at short, medium and long term in all domains; on the other hand, Science can mainly answer only in terms of probability of occurrence. To improve the answers and/or decrease the uncertainties, (1) new observational networks have been deployed in order to obtain better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites. Following the OECD recommendations on the openness of research and public sector data, more and more data are available to academic organisations and SMEs; (2) new algorithms and methodologies have been developed to cope with the huge volume of data processing and assimilation into simulations, using new technologies and compute resources. Finally, our total knowledge about the complex Earth system is contained in models and measurements; how we put them together has to be managed cleverly. The technical challenge is to put together databases and computing resources to answer the ES challenges. However, all of these applications are computationally very intensive. Different compute solutions are available and depend on the characteristics of the applications. One of them is the Grid, which is especially efficient for independent or embarrassingly parallel jobs related to statistical and parametric studies. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity have been deployed successfully on the Grid. In order to fulfill the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. The Grid has permitted via a huge number of runs to

  10. Neutron Science TeraGrid Gateway

    NASA Astrophysics Data System (ADS)

    Lynch, Vickie; Chen, Meili; Cobb, John; Kohl, Jim; Miller, Steve; Speirs, David; Vazhkudai, Sudharshan

    2010-11-01

    The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as their principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  11. Trends in life science grid: from computing grid to knowledge grid

    PubMed Central

    Konagaya, Akihiko

    2006-01-01

    Background Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large-scale data handling that exceed the computing capacity of a single institution. Results This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput, real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation so that tacit knowledge can be shared within a community. Conclusion Extending the concept of the grid from computing grid to knowledge grid, it is possible to use a grid not only as a set of sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:17254294

  12. TeraGrid Gateways for Earth Science

    NASA Astrophysics Data System (ADS)

    Wilkins-Diehr, Nancy

    2010-05-01

    The increasingly digital component of science today poses exciting challenges and opportunities for researchers. Whether it's streaming data from sensors to computations, tagging video in the study of language patterns or the use of geographic information systems to anticipate the spread of disease, the challenges are enormous and continue to grow. The existence of advanced cyberinfrastructure (CI) tools or science gateways can significantly increase the productivity of researchers facing the most difficult challenges - in some cases making the impossible possible. The TeraGrid Science Gateways program works to incorporate high end resources through these community-designed interfaces. This talk will present an overview of TeraGrid's gateway program and highlight several gateways in atmospheric science, earth sciences and geography and regional science, geophysics, global atmospheric research, materials research and seismology.

  13. Parallel Grid Manipulations in Earth Science Calculations

    NASA Technical Reports Server (NTRS)

    Sawyer, W.; Lucchesi, R.; daSilva, A.; Takacs, L. L.

    1999-01-01

    sparse interpolation with little data locality between the physical lat-lon grid and a pole rotated computational grid- can be solved efficiently and at the GFlop/s rates needed to solve tomorrow's high resolution earth science models. In the subsequent presentation we will discuss the design and implementation of PILGRIM as well as a number of the problems it is required to solve. Some conclusions will be drawn about the potential performance of the overall earth science models on the supercomputer platforms foreseen for these problems.

  14. Grid Computing for Earth Science

    NASA Astrophysics Data System (ADS)

    Renard, Philippe; Badoux, Vincent; Petitdidier, Monique; Cossu, Roberto

    2009-04-01

    The fundamental challenges facing humankind at the beginning of the 21st century require an effective response to the massive changes that are putting increasing pressure on the environment and society. The worldwide Earth science community, with its mosaic of disciplines and players (academia, industry, national surveys, international organizations, and so forth), provides a scientific basis for addressing issues such as the development of new energy resources; a secure water supply; safe storage of nuclear waste; the analysis, modeling, and mitigation of climate changes; and the assessment of natural and industrial risks. In addition, the Earth science community provides short- and medium-term prediction of weather and natural hazards in real time, and model simulations of a host of phenomena relating to the Earth and its space environment. These capabilities require that the Earth science community utilize, both in real and remote time, massive amounts of data, which are usually distributed among many different organizations and data centers.

  15. Grids for Dummies: Featuring Earth Science Data Mining Application

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2002-01-01

    This viewgraph presentation discusses the concept and advantages of linking computers together into data grids, an emerging technology for managing information across institutions, and potential users of data grids. The logistics of access to a grid, including the use of the World Wide Web to access grids, and security concerns are also discussed. The potential usefulness of data grids to the earth science community is also discussed, as well as the Global Grid Forum, and other efforts to establish standards for data grids.

  16. Technology for a NASA Space-Based Science Operations Grid

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.

    2003-01-01

    This viewgraph presentation gives an overview of a proposal to develop a space-based operations grid in support of space-based science experiments. The development of such a grid would provide a dynamic, secure and scalable architecture based on standards and next-generation reusable software and would enable greater science collaboration and productivity through the use of shared resources and distributed computing. The authors propose developing this concept for use on payload experiments carried aboard the International Space Station. Topics covered include: grid definitions, portals, grid development and coordination, grid technology and potential uses of such a grid.

  17. The Open Science Grid status and architecture

    SciTech Connect

    Pordes, Ruth; Petravick, Don; Kramer, Bill; Olsen, James D.; Livny, Miron; Roy, Gordon A.; Avery, Paul Ralph; Blackburn, Kent; Wenaus, Torre J.; Wuerthwein, Frank K.; Foster, Ian; /Chicago U. /Indiana U.

    2007-09-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the Worldwide LHC Computing Grid on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES) and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated, community-specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  18. European grid services for global earth science

    NASA Astrophysics Data System (ADS)

    Brewer, S.; Sipos, G.

    2012-04-01

    This presentation will provide an overview of the distributed computing services that the European Grid Infrastructure (EGI) offers to the Earth Sciences community and also explain the processes whereby Earth Science users can engage with the infrastructure. One of the main overarching goals for EGI over the coming year is to diversify its user-base. EGI therefore - through the National Grid Initiatives (NGIs) that provide the bulk of resources that make up the infrastructure - offers a number of routes whereby users, either individually or as communities, can make use of its services. At one level there are two approaches to working with EGI: either users can make use of existing resources and contribute to their evolution and configuration; or alternatively they can work with EGI, and hence the NGIs, to incorporate their own resources into the infrastructure to take advantage of EGI's monitoring, networking and managing services. Adopting this approach does not imply a loss of ownership of the resources. Both of these approaches are entirely applicable to the Earth Sciences community. The former because researchers within this field have been involved with EGI (and previously EGEE) as a Heavy User Community and the latter because they have very specific needs, such as incorporating HPC services into their workflows, and these will require multi-skilled interventions to fully provide such services. In addition to the technical support services that EGI has been offering for the last year or so - the applications database, the training marketplace and the Virtual Organisation services - there now exists a dynamic short-term project framework that can be utilised to establish and operate services for Earth Science users. During this talk we will present a summary of various on-going projects that will be of interest to Earth Science users with the intention that suggestions for future projects will emerge from the subsequent discussions: • The Federated Cloud Task

  19. Service engineering for grid services in medicine and life science.

    PubMed

    Weisbecker, Anette; Falkner, Jürgen

    2009-01-01

    Clearly defined services with appropriate business models are necessary in order to exploit the benefit of grid computing for industrial and academic users in medicine and life sciences. In the project Services@MediGRID the service engineering approach is used to develop those clearly defined grid services and to provide sustainable business models for their usage.

  20. AstroGrid-D: Grid technology for astronomical science

    NASA Astrophysics Data System (ADS)

    Enke, Harry; Steinmetz, Matthias; Adorf, Hans-Martin; Beck-Ratzka, Alexander; Breitling, Frank; Brüsemeister, Thomas; Carlson, Arthur; Ensslin, Torsten; Högqvist, Mikael; Nickelt, Iliya; Radke, Thomas; Reinefeld, Alexander; Reiser, Angelika; Scholl, Tobias; Spurzem, Rainer; Steinacker, Jürgen; Voges, Wolfgang; Wambsganß, Joachim; White, Steve

    2011-02-01

    We present status and results of AstroGrid-D, a joint effort of astrophysicists and computer scientists to employ grid technology for scientific applications. AstroGrid-D provides access to a network of distributed machines with a set of commands as well as software interfaces. It allows simple use of computer and storage facilities and makes it possible to schedule or monitor compute tasks and data management. It is based on the Globus Toolkit middleware (GT4). Chapter 1 describes the context that led to the demand for advanced software solutions in Astrophysics, and we state the goals of the project. We then present characteristic astrophysical applications that have been implemented on AstroGrid-D in Chapter 2. We describe simulations of different complexity, compute-intensive calculations running on multiple sites (Section 2.1), and advanced applications for specific scientific purposes (Section 2.2), such as a connection to robotic telescopes (Section 2.2.3). These examples show how grid execution improves, for example, the scientific workflow. Chapter 3 explains the software tools and services that we adapted or newly developed. Section 3.1 focuses on the administrative aspects of the infrastructure, to manage users and monitor activity. Section 3.2 characterises the central components of our architecture: the AstroGrid-D information service to collect and store metadata, a file management system, the data management system, and a job manager for automatic submission of compute tasks. We summarise the successfully established infrastructure in Chapter 4, concluding with our future plans to establish AstroGrid-D as a platform of modern e-Astronomy.

  1. Virtual Experiments on the Neutron Science TeraGrid Gateway

    NASA Astrophysics Data System (ADS)

    Lynch, V. E.; Cobb, J. W.; Farhi, E.; Miller, S. D.; Taylor, M.

    The TeraGrid's outreach effort to the neutron science community is creating an environment that encourages advanced cyberinfrastructure to be incorporated into facility operations in a way that multiplies the scientific output of the facility's users, including many NSF-supported scientists across many disciplines. The Neutron Science TeraGrid Gateway serves as an exploratory incubator for several TeraGrid projects. Virtual neutron scattering experiments from one exploratory project will be highlighted.

  2. Public storage for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Levshina, T.; Guru, A.

    2014-06-01

    The Open Science Grid infrastructure does not provide efficient means to manage the public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators and VO support personnel is required to allocate or rescind storage space. One of the main requirements for the Public Storage implementation is that it should use SRM or GridFTP protocols to access the Storage Elements provided by the OSG sites and not put any additional burden on sites. By policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data management workflow includes pre-staging common data on sites before a job's execution, then storing the output data produced by a job on a worker node for subsequent download to the local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System (iRODS), developed at RENCI, as a front-end service to the OSG SEs. The current architecture, state of deployment and performance test results will be discussed. We will also provide examples of current usage of the system by beta-users.
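
    The pre-stage/stage-out pattern described above can be sketched roughly as follows, driving the standard GridFTP command-line client (globus-url-copy) from Python. The storage-element host, directory layout and file names are hypothetical, and the iRODS front-end service under investigation is not shown.

```python
# Sketch of the stage-in / stage-out pattern described above, using the GridFTP
# command-line client (globus-url-copy). The storage-element host, directory
# layout and file names are hypothetical; the iRODS-based front end proposed in
# the paper is not shown.
import subprocess

SE = "gsiftp://se.example-osg-site.org"   # hypothetical OSG storage element

def gridftp_copy(src: str, dst: str) -> None:
    """Copy a single file between GridFTP and local URLs."""
    subprocess.run(["globus-url-copy", src, dst], check=True)

def stage_in(lfn: str, workdir: str) -> None:
    """Pre-stage a common input file onto the worker node before the job runs."""
    gridftp_copy(f"{SE}/public/myvo/inputs/{lfn}", f"file://{workdir}/{lfn}")

def stage_out(local_path: str, lfn: str) -> str:
    """Upload a job output to the SE and return its URL so it can be tracked."""
    dst = f"{SE}/public/myvo/outputs/{lfn}"
    gridftp_copy(f"file://{local_path}", dst)
    return dst

if __name__ == "__main__":
    stage_in("common_calibration.dat", "/tmp/job")
    print(stage_out("/tmp/job/result.root", "result.root"))
```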

  3. Migrating Open Science Grid to RPMs*

    NASA Astrophysics Data System (ADS)

    Roy, Alain

    2012-12-01

    We recently completed a significant transition in the Open Science Grid (OSG) in which we moved our software distribution mechanism from the useful but niche system called Pacman to a community-standard native package system, RPM. In this paper we explore some of the lessons learned during this transition as well as our earlier work, lessons that we believe are valuable not only for software distribution and packaging, but also for software engineering in a distributed computing environment where reliability is critical. We discuss the benefits found in moving to a community standard, including the abilities to reuse existing packaging, to donate existing packaging back to the community, and to leverage existing skills in the community. We describe our approach to testing in which we test our software against multiple versions of the OS, including pre-releases of the OS, in order to find surprises before our users do. Finally, we discuss our large-scale evaluation testing and community testing, which are essential for both quality and community acceptance.

  4. Pilot job accounting and auditing in Open Science Grid

    SciTech Connect

    Sfiligoi, Igor; Green, Chris; Quinn, Greg; Thain, Greg; /Wisconsin U., Madison

    2008-06-01

    The Grid accounting and auditing mechanisms were designed under the assumption that users would submit their jobs directly to the Grid gatekeepers. However, many groups are starting to use pilot-based systems, where users submit jobs to a centralized queue from which the jobs are subsequently transferred to the Grid resources by the pilot infrastructure. While this approach greatly improves the user experience, it does disrupt the established accounting and auditing procedures. Open Science Grid deploys gLExec on the worker nodes to keep the pilot-related accounting and auditing information and centralizes the accounting collection with GRATIA.
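
    A toy sketch of the pilot model referred to above is given below: a single pilot, submitted by the VO, repeatedly pulls user payloads from a central queue and executes them on the worker node. The queue URL and payload format are hypothetical; the sketch only illustrates that every payload runs under the pilot's identity, which is why gLExec is deployed on the worker nodes to restore per-user accounting.

```python
# Toy illustration of the pilot model: one pilot job, submitted by the VO,
# repeatedly fetches user payloads from a central queue and runs them on the
# worker node. The queue URL and payload format are hypothetical. Without
# gLExec every payload runs under the pilot's Unix account, so site accounting
# sees only the pilot identity rather than the real user.
import json
import subprocess
import urllib.request

TASK_QUEUE = "https://vo-queue.example.org/next-task"   # hypothetical VO queue

def fetch_payload() -> dict | None:
    """Ask the central VO queue for the next user payload, if any."""
    with urllib.request.urlopen(TASK_QUEUE) as resp:
        task = json.load(resp)
    return task or None

def run_pilot() -> None:
    """Run user payloads until the queue is empty (or the batch slot expires)."""
    while (task := fetch_payload()) is not None:
        # This is the point where gLExec would switch to the payload owner's
        # identity; without it, the command runs as the pilot's account.
        subprocess.run(task["command"], shell=True, check=False)

if __name__ == "__main__":
    run_pilot()
```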

  5. A Dynamic Bridge for Data Sharing on e-Science Grid Implementing Web 2.0 Service

    NASA Astrophysics Data System (ADS)

    Jung, Im Y.; Yeom, Heon Y.

    This paper proposes a dynamic bridge for an e-Science Grid, implementing a Web 2.0 service in order to share experimental data effectively. An e-Science Grid has been established as a cyber laboratory for users with a special research purpose in science. As an open space, an e-Science Grid is expected to stimulate collaborative and cross-domain research. These research trends call for a more efficient and convenient data service that satisfies science researchers. A dynamic bridge, designed on the basis of the HVEM DataGrid, effectively satisfies the users' requirements for data sharing on an e-Science Grid. It supports a data tagging service so that the HVEM DataGrid can be utilized more extensively without any modification of the existing Grid architecture or services. Moreover, it can be adopted and removed easily without any effect on the legacy Grid. With the legacy interface to access data in the e-Science Grid, the data tags endow the Grid with flexibility for data access. This paper evaluates the usefulness of the dynamic bridge by analyzing its overhead and performance.

  6. Grid Technology as a Cyber Infrastructure for Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    This paper describes how grids and grid service technologies can be used to develop an infrastructure for the Earth Science community. This cyberinfrastructure would be populated with a hierarchy of services, including discipline specific services such those needed by the Earth Science community as well as a set of core services that are needed by most applications. This core would include data-oriented services used for accessing and moving data as well as computer-oriented services used to broker access to resources and control the execution of tasks on the grid. The availability of such an Earth Science cyberinfrastructure would ease the development of Earth Science applications. With such a cyberinfrastructure, application work flows could be created to extract data from one or more of the Earth Science archives and then process it by passing it through various persistent services that are part of the persistent cyberinfrastructure, such as services to perform subsetting, reformatting, data mining and map projections.

  7. Nuclear test experimental science

    SciTech Connect

    Struble, G.L.; Middleton, C.; Bucciarelli, G.; Carter, J.; Cherniak, J.; Donohue, M.L.; Kirvel, R.D.; MacGregor, P.; Reid, S.

    1989-01-01

    This report discusses research being conducted at Lawrence Livermore Laboratory under the following topics: prompt diagnostics; experimental modeling, design, and analysis; detector development; streak-camera data systems; weapons supporting research.

  8. Massive Science with VO and Grids

    NASA Astrophysics Data System (ADS)

    Nichol, R.; Smith, G.; Miller, C.; Freeman, P.; Genovese, C.; Wasserman, L.; Bryan, B.; Gray, A.; Schneider, J.; Moore, A.

    2006-07-01

    There is a growing need for massive computational resources for the analysis of new astronomical datasets. To tackle this problem, we present here our first steps towards marrying two new and emerging technologies: the Virtual Observatory (e.g., AstroGrid) and the computational grid (e.g., TeraGrid, COSMOS, etc.). We discuss the construction of VOTechBroker, which is a modular software tool designed to abstract the tasks of submission and management of a large number of computational jobs to a distributed computer system. The broker will also interact with the AstroGrid workflow and MySpace environments. We discuss our planned usages of the VOTechBroker in computing a huge number of n-point correlation functions from the SDSS data and massive model-fitting of millions of CMBfast models to WMAP data. We also discuss other applications including the determination of the XMM Cluster Survey selection function and the construction of new WMAP maps.

  9. ISS Space-Based Science Operations Grid for the Ground Systems Architecture Workshop (GSAW)

    NASA Technical Reports Server (NTRS)

    Welch, Clara; Bradford, Bob

    2003-01-01

    Contents include the following: What is grid? Benefits of a grid to space-based science operations. Our approach. Scope of prototype grid. The security question. Short-term objectives. Long-term objectives. Space-based services required for operations. The prototype. Scope of prototype grid. Prototype service layout. Space-based science grid service components.

  10. Unlocking the potential of smart grid technologies with behavioral science

    DOE PAGES

    Sintov, Nicole D.; Schultz, P. Wesley

    2015-04-09

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this study, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings.

  11. Unlocking the potential of smart grid technologies with behavioral science.

    PubMed

    Sintov, Nicole D; Schultz, P Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings.

  12. Unlocking the potential of smart grid technologies with behavioral science

    PubMed Central

    Sintov, Nicole D.; Schultz, P. Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings. PMID:25914666

  14. Unlocking the potential of smart grid technologies with behavioral science

    SciTech Connect

    Sintov, Nicole D.; Schultz, P. Wesley

    2015-04-09

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this study, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings.

  15. Optimal response to attacks on the open science grids.

    SciTech Connect

    Altunay, M.; Leyffer, S.; Linderoth, J. T.; Xie, Z.

    2011-01-01

    Cybersecurity is a growing concern, especially in open grids, where attack propagation is easy because of prevalent collaborations among thousands of users and hundreds of institutions. The collaboration rules that typically govern large science experiments as well as social networks of scientists span across the institutional security boundaries. A common concern is that the increased openness may allow malicious attackers to spread more readily around the grid. We consider how to optimally respond to attacks in open grid environments. To show how and why attacks spread more readily around the grid, we first discuss how collaborations manifest themselves in the grids and form the collaboration network graph, and how this collaboration network graph affects the security threat levels of grid participants. We present two mixed-integer programming (MIP) models to find the optimal response to attacks in open grid environments, and also to calculate the threat level associated with each grid participant. Given an attack scenario, our optimal response model aims to minimize the threat levels at unaffected participants while maximizing the uninterrupted scientific production (continuing collaborations). By adapting some of the collaboration rules (e.g., suspending a collaboration or shutting down a site), the model finds the optimal response to subvert an attack scenario.
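
    To make the trade-off concrete, the sketch below sets up a deliberately tiny optimization of the same flavour with the PuLP modelling library: binary variables decide which collaborations to suspend, and the objective balances residual threat against lost scientific production. The graph, weights and objective are illustrative assumptions, not the authors' actual MIP formulation.

```python
# A deliberately tiny, hypothetical version of the trade-off described above,
# written with the PuLP modelling library: suspend as few collaborations as
# possible while keeping the residual threat to unaffected sites low. This is
# only a sketch of the idea, not the authors' actual MIP formulation.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

sites = ["A", "B", "C", "D"]
compromised = {"A"}                       # hypothetical attack scenario
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]   # collaborations
science_value = {e: 1.0 for e in edges}   # value of keeping each collaboration

prob = LpProblem("attack_response", LpMinimize)
# suspend[e] = 1 means collaboration e is cut in response to the attack.
suspend = {e: LpVariable(f"suspend_{e[0]}_{e[1]}", cat=LpBinary) for e in edges}

# Threat reaching an unaffected site = collaborations it still keeps with a
# compromised site; lost science = value of every collaboration we suspend.
threat = lpSum((1 - suspend[e]) for e in edges
               if (e[0] in compromised) != (e[1] in compromised))
lost_science = lpSum(science_value[e] * suspend[e] for e in edges)

prob += 10 * threat + lost_science        # the weights are arbitrary here
prob.solve(PULP_CBC_CMD(msg=False))

for e, var in suspend.items():
    print(e, "suspended" if var.value() == 1 else "kept")
```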

  16. An Experimental Framework for Executing Applications in Dynamic Grid Environments

    NASA Technical Reports Server (NTRS)

    Huedo, Eduardo; Montero, Ruben S.; Llorente, Ignacio M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Grid opens up opportunities for resource-starved scientists and engineers to harness highly distributed computing resources. A number of Grid middleware projects are currently available to support the simultaneous exploitation of heterogeneous resources distributed in different administrative domains. However, efficient job submission and management continue to be far from accessible to ordinary scientists and engineers due to the dynamic and complex nature of the Grid. This report describes a new Globus framework that allows easier and more efficient execution of jobs in a 'submit and forget' fashion. Adaptation to dynamic Grid conditions is achieved by supporting automatic application migration following performance degradation, 'better' resource discovery, requirement change, owner decision or remote resource failure. The report also includes experimental results of the behavior of our framework on the TRGP testbed.
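
    The 'submit and forget' behaviour can be sketched schematically as follows. The discovery, submission and monitoring calls are hypothetical placeholders (the real framework builds on the Globus middleware); the sketch only shows the control loop that resubmits a job when performance degrades or the remote resource fails.

```python
# Schematic "submit and forget" loop in the spirit of the framework described
# above: the user hands over a job once, and the framework resubmits (migrates)
# it whenever performance degrades or the remote resource fails.
# discover_resources(), submit(), and monitor() are hypothetical placeholders,
# not part of any real Globus API.
import time

def discover_resources() -> list[str]:
    """Hypothetical resource discovery (e.g. via a Grid information service)."""
    return ["cluster-a.example.org", "cluster-b.example.org"]

def submit(job: dict, resource: str) -> str:
    """Hypothetical submission; returns a handle for the running job."""
    print(f"submitting {job['name']} to {resource}")
    return f"{resource}/{job['name']}"

def monitor(handle: str) -> str:
    """Hypothetical status check: 'ok', 'degraded', 'failed', or 'done'."""
    return "done"

def submit_and_forget(job: dict) -> None:
    resource = discover_resources()[0]
    handle = submit(job, resource)
    while True:
        status = monitor(handle)
        if status == "done":
            return
        if status in ("degraded", "failed"):
            # Migration: pick the currently best resource and resubmit.
            resource = discover_resources()[0]
            handle = submit(job, resource)
        time.sleep(60)

if __name__ == "__main__":
    submit_and_forget({"name": "cfd_run_01"})
```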

  17. Data Grid tools: enabling science on big distributed data

    NASA Astrophysics Data System (ADS)

    Allcock, Bill; Chervenak, Ann; Foster, Ian; Kesselman, Carl; Livny, Miron

    2005-01-01

    A particularly demanding and important challenge that we face as we attempt to construct the distributed computing machinery required to support SciDAC goals is the efficient, high-performance, reliable, secure, and policy-aware management of large-scale data movement. This problem is fundamental to diverse application domains including experimental physics (high energy physics, nuclear physics, light sources), simulation science (climate, computational chemistry, fusion, astrophysics), and large-scale collaboration. In each case, highly distributed user communities require high-speed access to valuable data, whether for visualization or analysis. The quantities of data involved (terabytes to petabytes), the scale of the demand (hundreds or thousands of users, data-intensive analyses, real-time constraints), and the complexity of the infrastructure that must be managed (networks, tertiary storage systems, network caches, computers, visualization systems) make the problem extremely challenging. Data management tools developed under the auspices of the SciDAC Data Grid Middleware project have become the de facto standard for data management in projects worldwide. Day in and day out, these tools provide the "plumbing" that allows scientists to do more science on an unprecedented scale in production environments.

  18. Combining the GRID with Cloud for Earth Science Computing

    NASA Astrophysics Data System (ADS)

    Mishin, Dmitry; Levchenko, Oleg; Groudnev, Andrei; Zhizhin, Mikhail

    2010-05-01

    Cloud computing is a new economic model for using large cluster computing resources that were earlier managed by the GRID. Reusing the existing GRID infrastructure gives an opportunity to combine the Cloud and GRID technologies on the same hardware and to provide GRID users with functionality for running high-performance computing tasks inside virtual machines. In this case the Cloud works "above" the GRID, sharing computing power and utilizing unused processor time. We manage virtual machines with the Eucalyptus elastic cloud and use the Torque system from the gLite infrastructure to spread Cloud jobs across GRID computing nodes, scaling the parallel computing tasks on virtual machines created by the elastic cloud. For this purpose we have added new types of tasks to the standard GRID task list: to run a virtual node and to run a job on a virtual node. This makes it possible to seamlessly upscale the Cloud with the new tasks when needed and to shrink it when the tasks are completed. Using GRID components for managing the size of a virtual cloud simplifies building the billing system to charge the Cloud users for the processor time, disk space and external traffic consumed. Earth Science computing problems that can be solved using the elastic Cloud include repetitive tasks of downloading, converting and storing large arrays of data in a database (e.g. weather forecasts); creating a pyramid of lower-resolution images from a very large one for fast distributed browsing; and processing and analyzing large distributed amounts of data by running Earth Science numerical models.

  19. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential for revolutionizing the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide a large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing for less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  20. DZero data-intensive computing on the Open Science Grid

    SciTech Connect

    Abbott, B.; Baranovski, A.; Diesburg, M.; Garzoglio, G.; Kurca, T.; Mhashilkar, P.; /Fermilab

    2007-09-01

    High energy physics experiments periodically reprocess data, in order to take advantage of improved understanding of the detector and the data processing code. Between February and May 2007, the DZero experiment reprocessed a substantial fraction of its dataset. This consists of half a billion events, corresponding to about 100 TB of data, organized in 300,000 files. The activity utilized resources from sites around the world, including a dozen sites participating in the Open Science Grid consortium (OSG). About 1,500 jobs were run every day across the OSG, consuming and producing hundreds of gigabytes of data. Access to OSG computing and storage resources was coordinated by the SAM-Grid system. This system organized job access to a complex topology of data queues and job scheduling to clusters, using a SAM-Grid to OSG job forwarding infrastructure. For the first time in the lifetime of the experiment, a data-intensive production activity was managed on a general-purpose grid, such as the OSG. This paper describes the implications of using OSG, where all resources are granted following an opportunistic model, the challenges of operating a data-intensive activity over such a large computing infrastructure, and the lessons learned throughout the project.

  1. DZero data-intensive computing on the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Abbott, B.; Baranovski, A.; Diesburg, M.; Garzoglio, G.; Kurca, T.; Mhashilkar, P.

    2008-07-01

    High energy physics experiments periodically reprocess data, in order to take advantage of improved understanding of the detector and the data processing code. Between February and May 2007, the DZero experiment reprocessed a substantial fraction of its dataset. This consists of half a billion events, corresponding to about 100 TB of data, organized in 300,000 files. The activity utilized resources from sites around the world, including a dozen sites participating in the Open Science Grid consortium (OSG). About 1,500 jobs were run every day across the OSG, consuming and producing hundreds of gigabytes of data. Access to OSG computing and storage resources was coordinated by the SAM-Grid system. This system organized job access to a complex topology of data queues and job scheduling to clusters, using a SAM-Grid to OSG job forwarding infrastructure. For the first time in the lifetime of the experiment, a data-intensive production activity was managed on a general-purpose grid, such as the OSG. This paper describes the implications of using OSG, where all resources are granted following an opportunistic model, the challenges of operating a data-intensive activity over such a large computing infrastructure, and the lessons learned throughout the project.
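
    A quick back-of-envelope pass over the figures quoted in the abstract gives a feel for the average scale per file and per job; the 90-day duration and the assumption that each file is handled once are rough simplifications, not numbers from the paper.

```python
# Rough averages implied by the figures quoted above; the 90-day window and the
# one-pass-per-file assumption are simplifications, not numbers from the paper.
events = 500e6          # "half a billion events"
data_tb = 100           # "about 100 TB of data"
files = 300_000         # "organized in 300,000 files"
days = 90               # roughly February to May 2007
jobs_per_day = 1_500    # "about 1,500 jobs were run every day"

print(f"average file size : {data_tb * 1e6 / files:,.0f} MB")      # ~333 MB
print(f"events per file   : {events / files:,.0f}")                # ~1,667
print(f"files per job     : {files / (days * jobs_per_day):.1f}")  # ~2.2
```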

  2. Open Science Grid: Linking Universities and Laboratories In National Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Avery, Paul

    2011-10-01

    Open Science Grid is a consortium of researchers from universities and national laboratories that operates a national computing infrastructure serving large-scale scientific and engineering research. While OSG's scale has been primarily driven by the demands of the LHC experiments, it currently serves particle and nuclear physics, gravitational wave searches, digital astronomy, genomic science, weather forecasting, molecular modeling, structural biology and nanoscience. The OSG distributed computing facility links campus and regional computing resources and is a major component of the Worldwide LHC Computing Grid (WLCG) that handles the massive computing and storage needs of experiments at the Large Hadron Collider. This collaborative work has provided a wealth of results, including powerful new software tools and services; a uniform packaging scheme (the Virtual Data Toolkit) that simplifies software deployment across many sites in the US and Europe; integration of complex tools and services in large science applications; multiple education and outreach projects; and new approaches to integrating advanced network infrastructure in scientific computing applications. More importantly, OSG has provided unique collaborative opportunities between researchers in a variety of research disciplines.

  3. Using Computing and Data Grids for Large-Scale Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  4. e-Science, caGrid, and Translational Biomedical Research

    PubMed Central

    Saltz, Joel; Kurc, Tahsin; Hastings, Shannon; Langella, Stephen; Oster, Scott; Ervin, David; Sharma, Ashish; Pan, Tony; Gurcan, Metin; Permar, Justin; Ferreira, Renato; Payne, Philip; Catalyurek, Umit; Caserta, Enrico; Leone, Gustavo; Ostrowski, Michael C.; Madduri, Ravi; Foster, Ian; Madhavan, Subhashree; Buetow, Kenneth H.; Shanbhag, Krishnakant; Siegel, Eliot

    2011-01-01

    Translational research projects target a wide variety of diseases, test many different kinds of biomedical hypotheses, and employ a large assortment of experimental methodologies. Diverse data, complex execution environments, and demanding security and reliability requirements make the implementation of these projects extremely challenging and require novel e-Science technologies. PMID:21311723

  5. Physical Science Laboratory Manual, Experimental Version.

    ERIC Educational Resources Information Center

    Cooperative General Science Project, Atlanta, GA.

    Provided are physical science laboratory experiments which have been developed and used as a part of an experimental one year undergraduate course in general science for non-science majors. The experiments cover a limited number of topics representative of the scientific enterprise. Some of the topics are pressure and buoyancy, heat, motion,…

  6. SCEC Earthworks: A TeraGrid Science Gateway

    NASA Astrophysics Data System (ADS)

    Francoeur, H.; Muench, J.; Okaya, D.; Maechling, P.; Deelman, E.; Mehta, G.

    2006-12-01

    SCEC Earthworks is a scientific gateway designed to provide community-wide access to the TeraGrid. Earthworks provides its users with a portal-based interface for easily running anelastic wave propagation (AWM) simulations. Using Gridsphere and several portlets developed as a collaborative effort with IRIS, Earthworks enables users to run simulations without any knowledge of the underlying workflow technology needed to utilize the TeraGrid. The workflow technology behind Earthworks has been developed as a collaborative effort between SCEC and the Information Sciences Institute (ISI). Earthworks uses a complex software stack to translate abstract workflows defined by the user into a series of jobs that run on a number of computational resources. These computational resources include a combination of servers provided by SCEC, the USC High Performance Computing Center and NSF TeraGrid supercomputer facilities. Workflows are constructed after input from the user is passed via a Java-based interface to the Earthworks backend, where a DAX (directed acyclic graph in XML) is generated. This DAX describes each step of the workflow including its inputs, outputs, and arguments, as well as the parent-child relationships between each process. The DAX is then handed off to the Virtual Data System (VDS) and Pegasus provided by ISI, which translate it from an abstract workflow to a concrete workflow by filling in logical file and application names with their physical path and location. This newly created DAG (directed acyclic graph) is handed off to the Condor scheduler. The bottom part of the software stack is a Globus installation at each site that provides local transfer and resource management capabilities. Resources across different sites are transparently managed and tracked by VDS, which allows greater flexibility in running the workflows. After a workflow is completed, products and metadata are registered with integrated data management tools. This allows for metadata querying
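
    The first step of that workflow, turning user input into an abstract workflow description, can be sketched as follows: a small generator emits a DAX-like XML file listing each job, its input and output files, and the parent-child ordering. The element names only mimic the Pegasus DAX idea and the job and file names are invented; this is a schematic, not a schema-validated DAX.

```python
# Schematic of the portal back end's first step: turning user input into an
# abstract workflow description (a DAX-like XML file) that lists each job, its
# input/output files, and parent-child ordering. Element and attribute names
# only mimic the Pegasus DAX idea; this is not a schema-validated DAX.
import xml.etree.ElementTree as ET

def build_dax(velocity_model: str, source: str) -> ET.ElementTree:
    adag = ET.Element("adag", name="earthworks-awm")

    prep = ET.SubElement(adag, "job", id="ID1", name="mesh_prep")
    ET.SubElement(prep, "uses", file=velocity_model, link="input")
    ET.SubElement(prep, "uses", file="mesh.bin", link="output")

    awm = ET.SubElement(adag, "job", id="ID2", name="awm_simulation")
    ET.SubElement(awm, "uses", file="mesh.bin", link="input")
    ET.SubElement(awm, "uses", file=source, link="input")
    ET.SubElement(awm, "uses", file="seismograms.out", link="output")

    # Parent-child relationship: the simulation depends on the mesh preparation.
    child = ET.SubElement(adag, "child", ref="ID2")
    ET.SubElement(child, "parent", ref="ID1")
    return ET.ElementTree(adag)

if __name__ == "__main__":
    build_dax("cvm4_velocity_model.bin", "source_description.srf").write(
        "earthworks.dax", xml_declaration=True, encoding="utf-8")
```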

  7. Who Needs Plants? Science (Experimental).

    ERIC Educational Resources Information Center

    Ropeik, Bernard H.; Kleinman, David Z.

    The basic elective course in introductory botany is designed for secondary students who probably will not continue study in plant science. The objectives of the course are to help the student 1) identify, compare and differentiate types of plants; 2) identify plant cell structures; 3) distinguish between helpful and harmful plants; 4) predict…

  8. Progress in Earth and Space Science Infrastructure: Grids, Frameworks and Semantics

    NASA Astrophysics Data System (ADS)

    Middleton, D.; McGuinness, D.; Fox, P.

    2006-12-01

    We report on the very substantial progress made in several medium-size cyberinfrastructure projects that involve multiple institutions and diverse provider and user communities. Of particular note is the degree to which we have implemented robust production application frameworks built upon what until a few years ago were considered experimental technologies in the fields of Earth and space sciences. These include Grid technologies, advanced software frameworks and semantic web technologies. We discuss our methods and successes across projects such as the Earth System Grid, the Virtual Solar-Terrestrial Observatory, and others. One trend that is emerging is a growing community of researchers who are recognizing a need for significant foundational resources such as geo-science ontologies, ontology languages and related environments for generation, evolution, maintenance, diagnosis, explanation, and verification, and cross-institutional services for single sign-on, resource and service integration and interconnection. While there is a long way to go to get to the point of a set of standard reference ontologies and supporting infrastructure, significant progress has been made. These projects are supported by NCAR, DoE, NASA, and NSF.

  9. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  10. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests; in particular, tests that resemble, to every extent possible, the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.
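
    As a purely illustrative sketch of such a command-line submitter (hypothetical fields and workflow names; this is not the actual OSG or PanDA toolset), synthetic "VO-like" jobs might be generated as follows:

        # Hypothetical sketch of a command-line tool that injects synthetic validation
        # jobs into an integration testbed site; not the real PanDA/OSG toolset.
        import argparse
        import json
        import uuid

        WORKFLOWS = {
            "cpu-burn":    {"walltime_min": 30, "uses_storage": False},
            "stage-inout": {"walltime_min": 10, "uses_storage": True},
        }

        def make_job(site, workflow):
            """Build one synthetic job description for a testbed site."""
            return {
                "job_id": str(uuid.uuid4()),
                "site": site,
                "workflow": workflow,
                **WORKFLOWS[workflow],
            }

        def main():
            parser = argparse.ArgumentParser(description="Submit synthetic testbed jobs")
            parser.add_argument("--site", required=True, help="testbed compute element")
            parser.add_argument("--workflow", choices=WORKFLOWS, default="cpu-burn")
            parser.add_argument("--count", type=int, default=10)
            args = parser.parse_args()

            jobs = [make_job(args.site, args.workflow) for _ in range(args.count)]
            # In a real system these would be handed to the pilot-based job manager;
            # here we just write them out for inspection.
            print(json.dumps(jobs, indent=2))

        if __name__ == "__main__":
            main()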

  11. Teaching Experimental Science: Enzymes and the Laboratory.

    ERIC Educational Resources Information Center

    Foster, John M.

    1989-01-01

    Hampshire College needed to create opportunities for advanced undergraduates to have extensive laboratory or field experience in experimental sciences. A general biochemistry course, taught almost entirely in the laboratory, is described. The focus of the course is enzymes as catalysts and as proteins. (MLW)

  12. An overview of Grid portal technologies for the development of HMR science gateways

    NASA Astrophysics Data System (ADS)

    D'Agostino, D.

    2012-04-01

    Grid portals and related technologies represent an easy and transparent way for scientists to interact with Distributed Computing Infrastructures (DCIs) such as the Grid and the Cloud. Many toolkits and frameworks are available, both commercial and open source, but there is a lack of best practices, customization methodologies and dedicated high-level service repositories that would allow fast development of specialized scientific gateways in Europe. Starting from the US TeraGrid-XSEDE experience, in this contribution the most interesting portal toolkits and related European projects are analyzed with the perspective of developing a science gateway for the HMR community within the Distributed Research Infrastructure for Hydrometeorology (DRIHM) project.

  13. The Open Science Grid - Support for Multi-Disciplinary Team Science - the Adolescent Years

    NASA Astrophysics Data System (ADS)

    Bauerdick, Lothar; Ernst, Michael; Fraser, Dan; Livny, Miron; Pordes, Ruth; Sehgal, Chander; Würthwein, Frank; Open Science Grid

    2012-12-01

    As it enters adolescence the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools and deep experience. We discuss examples of these including: the pilot-job overlay concepts and technologies now in use throughout OSG and delivering 1.4 million CPU hours/day; the role of campus infrastructures, built out from concepts of sharing across multiple local faculty clusters (already put to good use by many of the HEP Tier-2 sites in the US); the work towards the use of clouds and access to high throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communities with general tools derived from the experience of the parochial tools in HEP (integration of Globus Online, prototyping with iRODS, investigations into wide-area Lustre). We will also review our activities and experiences as an HTC Service Provider to the recently awarded NSF XD XSEDE project, the evolution of the US NSF TeraGrid project, and how we are extending the reach of HTC through this activity to the increasingly broad national cyberinfrastructure. We believe that a coordinated view of the HPC and HTC resources in the US will further expand their impact on scientific discovery.

  14. Experimental Demonstration of a Self-organized Architecture for Emerging Grid Computing Applications on OBS Testbed

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Hong, Xiaobin; Wu, Jian; Lin, Jintong

    As Grid computing continues to gain popularity in industry and the research community, it also attracts more attention at the consumer level. The large number of users and the high frequency of job requests in the consumer market make this challenging. Clearly, the current Client/Server (C/S)-based architectures will become unfeasible for supporting large-scale Grid applications due to their poor scalability and poor fault-tolerance. In this paper, based on our previous works [1, 2], a novel self-organized architecture to realize a highly scalable and flexible platform for Grids is proposed. Experimental results show that this architecture is suitable and efficient for consumer-oriented Grids.

  15. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  16. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translator. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.
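
    As a small example of the self-describing data access that NetCDF provides (shown with the netCDF4 Python bindings rather than the original library interfaces; file and variable names are illustrative):

        # Create a small self-describing NetCDF file and read it back.
        # Uses the netCDF4 Python bindings; names are illustrative.
        from netCDF4 import Dataset

        with Dataset("profile.nc", "w", format="NETCDF4") as ds:
            ds.title = "Example temperature profile"
            ds.createDimension("level", 3)
            temp = ds.createVariable("temperature", "f4", ("level",))
            temp.units = "K"
            temp[:] = [288.0, 275.5, 250.1]

        with Dataset("profile.nc") as ds:
            t = ds.variables["temperature"]
            print(t.units, t[:])   # the metadata travels with the data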

  17. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  18. Analysis of the Current Use, Benefit, and Value of the Open Science Grid

    SciTech Connect

    Pordes, R.; /Fermilab

    2009-04-01

    The Open Science Grid usage has ramped up more than 25% in the past twelve months due to both the increase in throughput of the core stakeholders - US LHC, LIGO and Run II - and an increase in usage by non-physics communities. It is important to understand the value collaborative projects, such as the OSG, contribute to the scientific community. This needs to be cognizant of the environment of commercial cloud offerings, the evolving and maturing middleware for grid-based distributed computing, and the evolution in science and research dependence on computation. We present a first categorization of OSG value and an analysis across several different aspects of the Consortium's goals and activities. Lastly, we present some of the upcoming challenges of the LHC data analysis ramp-up and our ongoing contributions to the Worldwide LHC Computing Grid.

  19. Experimenter's Laboratory for Visualized Interactive Science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Rodier, Daniel R.; Klemp, Marjorie K.

    1994-01-01

    ELVIS (Experimenter's Laboratory for Visualized Interactive Science) is an interactive visualization environment that enables scientists, students, and educators to visualize and analyze large, complex, and diverse sets of scientific data. It accomplishes this by presenting the data sets as 2-D, 3-D, color, stereo, and graphic images with movable and multiple light sources combined with displays of solid-surface, contours, wire-frame, and transparency. By simultaneously rendering diverse data sets acquired from multiple sources, formats, and resolutions and by interacting with the data through an intuitive, direct-manipulation interface, ELVIS provides an interactive and responsive environment for exploratory data analysis.

  20. Remote Job Testing for the Neutron Science TeraGrid Gateway

    SciTech Connect

    Lynch, Vickie E; Cobb, John W; Miller, Stephen D; Reuter, Michael A; Smith, Bradford C

    2009-01-01

    Remote job execution gives neutron science facilities access to high performance computing such as the TeraGrid. A scientific community can use community software with a community certificate and account through a common interface of a portal. Results show this approach is successful, but with more testing and problem solving, we expect remote job executions to become more reliable.

  1. The Grid Analysis and Display System (GRADS): A practical tool for Earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III

    1992-01-01

    We propose to develop and enhance a workstation based grid analysis and display software system for Earth science data set browsing, sampling and manipulation. The system will be coupled to a supercomputer in a distributed computing environment for near real-time interaction between scientists and computational results.

  2. The Grid Analysis and Display System (GRADS): A practical tool for Earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III

    1993-01-01

    We propose to develop and enhance a workstation based grid analysis and display software system for Earth science dataset browsing, sampling and manipulation. The system will be coupled to a supercomputer in a distributed computing environment for near real-time interaction between scientists and computational results.

  3. Experimental Evaluation of Electric Power Grid Visualization Tools in the EIOC

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin; Dalton, Angela C.

    2009-12-01

    The present study follows an initial human factors evaluation of four electric power grid visualization tools and reports on an empirical evaluation of two of the four tools: Graphical Contingency Analysis, and Phasor State Estimator. The evaluation was conducted within specific experimental studies designed to measure the impact on decision making performance.

  4. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    SciTech Connect

    Livny, Miron; Shank, James; Ernst, Michael; Blackburn, Kent; Goasguen, Sebastien; Tuts, Michael; Gibbons, Lawrence; Pordes, Ruth; Sliz, Piotr; Deelman, Ewa; Barnett, William; Olson, Doug; McGee, John; Cowles, Robert; Wuerthwein, Frank; Gardner, Robert; Avery, Paul; Wang, Shaowen; Lincoln, David Swanson

    2015-02-11

    Under this SciDAC-2 grant the project's goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  5. Architectural Aspects of Grid Computing and its Global Prospects for E-Science Community

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    2008-05-01

    The paper reviews the imminent architectural aspects of Grid Computing for the e-Science community for scientific research and business/commercial collaboration beyond physical boundaries. Grid Computing provides all the needed facilities: hardware, software, communication interfaces, high-speed internet, safe authentication and a secure environment for collaboration on research projects around the globe. It provides a highly fast compute engine for those scientific and engineering research projects and business/commercial applications which are heavily compute intensive and/or require humongous amounts of data. It also makes possible the use of very advanced methodologies, simulation models, expert systems and the treasure of knowledge available around the globe under the umbrella of knowledge sharing. Thus it makes possible one of the dreams of a global village for the benefit of the e-Science community across the globe.

  6. Fully Automated Single-Zone Elliptic Grid Generation for Mars Science Laboratory (MSL) Aeroshell and Canopy Geometries

    NASA Technical Reports Server (NTRS)

    Kaul, Upender K.

    2008-01-01

    A procedure for generating smooth uniformly clustered single-zone grids using enhanced elliptic grid generation has been demonstrated here for the Mars Science Laboratory (MSL) geometries such as aeroshell and canopy. The procedure obviates the need for generating multizone grids for such geometries, as reported in the literature. This has been possible because the enhanced elliptic grid generator automatically generates clustered grids without manual prescription of decay parameters needed with the conventional approach. In fact, these decay parameters are calculated as decay functions as part of the solution, and they are not constant over a given boundary. Since these decay functions vary over a given boundary, orthogonal grids near any arbitrary boundary can be clustered automatically without having to break up the boundaries and the corresponding interior domains into various zones for grid generation.
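
    For context, elliptic grid generation is typically based on a Poisson system with control functions; a generic form (not the author's specific enhanced formulation, which computes the decay functions as part of the solution) is:

        \xi_{xx} + \xi_{yy} = P(\xi,\eta), \qquad \eta_{xx} + \eta_{yy} = Q(\xi,\eta),
        \quad\text{with, e.g.,}\quad
        Q(\xi,\eta) = -a(\xi)\,\operatorname{sgn}(\eta-\eta_b)\,e^{-c(\xi)\,|\eta-\eta_b|},

    where a and c play the role of the clustering (decay) parameters that control grid spacing near the boundary \eta = \eta_b; in the conventional approach they are prescribed manually, whereas here they are obtained as functions along the boundary.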

  7. GENESIS SciFlo: Enabling Multi-Instrument Atmospheric Science Using Grid Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Tang, B.; Manipon, G.; Yunck, T.; Fetzer, E.; Braverman, A.; Dobinson, E.

    2004-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of web services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations will include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we are developing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and the Grid Computing standards (Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable web services and executable operators into a distributed computing flow (operator tree). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out
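
    As an illustrative sketch of the dataflow idea (toy operators and a toy tree executor in Python; these are not the actual SciFlo engine or its operator names), an operator tree might be composed and executed like this:

        # Toy dataflow: each operator is a function; a flow is a tree of operators
        # whose leaves are data inputs. Hypothetical, not the SciFlo engine.
        def subset(granule, region):
            return {"data": granule, "region": region}

        def regrid(subsetted, resolution_deg):
            return {"data": subsetted, "res": resolution_deg}

        def compare(a, b):
            return {"pair": (a, b)}

        def run(node):
            """Execute an operator tree given as (func, [child_or_value, ...])."""
            func, args = node
            resolved = [run(a) if isinstance(a, tuple) else a for a in args]
            return func(*resolved)

        flow = (compare, [
            (regrid, [(subset, ["AIRS_granule", "tropics"]), 1.0]),
            (regrid, [(subset, ["MODIS_granule", "tropics"]), 1.0]),
        ])
        print(run(flow))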

  8. Ultrasonic Technique for Experimental Investigation of Statistical Characteristics of Grid Generated Turbulence.

    NASA Astrophysics Data System (ADS)

    Andreeva, Tatiana; Durgin, William

    2001-11-01

    This paper focuses on ultrasonic measurements of a grid-generated turbulent flow using the travel-time technique. In the present work an attempt to describe a turbulent flow by means of statistics of ultrasound wave propagation time is undertaken in combination with the Kolmogorov (2/3)-power law. There are two objectives in the current work. The first is to demonstrate an application of the travel-time ultrasonic technique for data acquisition in the grid-generated turbulence produced in a wind tunnel. The second is to use the experimental data to verify or refute the analytically obtained expression for travel-time dispersion as a function of velocity fluctuation metrics. The theoretical analysis and derivation of that formula are based on Kolmogorov theory. The series of experiments was conducted at different values of wind speed and distance from the grid, giving rise to different values of the dimensional turbulence characteristic coefficient K. Theoretical analysis based on the experimental data reveals a strong dependence of the turbulence characteristic K on the mean wind velocity. Tabulated values of the turbulence characteristic coefficient may be used for further understanding of the effect of turbulence on sound propagation.
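
    For reference, the Kolmogorov (2/3)-power law invoked here states that, within the inertial subrange, the second-order velocity structure function scales as (a standard result; the paper's travel-time dispersion formula builds on it but is not reproduced here):

        \langle [u(x+r) - u(x)]^2 \rangle = C\,\varepsilon^{2/3}\,r^{2/3},

    where \varepsilon is the mean energy dissipation rate, C a universal constant, and r a separation distance within the inertial subrange.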

  9. Grid Technology as a Cyberinfrastructure for Delivering High-End Services to the Earth and Space Science Community

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    Grid technology consists of middleware that permits distributed computations, data and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and then can utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises that are under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means for implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Service Description Language, which provides a description of the interface and how other applications can access it. The emerging Semantic Grid work seeks to associate sufficient semantic information with each grid service such that applications will be able to automatically select, compose and, if necessary, substitute available equivalent services in order to assemble collections of services that are most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform those applications that have community-wide applicability into persistent grid services that are made widely available to their respective communities. In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid

  10. Thermoplastic Composites Reinforced with Textile Grids: Development of a Manufacturing Chain and Experimental Characterisation

    NASA Astrophysics Data System (ADS)

    Böhm, R.; Hufnagl, E.; Kupfer, R.; Engler, T.; Hausding, J.; Cherif, C.; Hufenbach, W.

    2013-12-01

    A significant improvement in the properties of plastic components can be achieved by introducing flexible multiaxial textile grids as reinforcement. This reinforcing concept is based on the layerwise bonding of biaxially or multiaxially oriented, completely stretched filaments of high-performance fibers, e.g. glass or carbon, and thermoplastic components, using modified warp knitting techniques. Such pre-consolidated grid-like textiles are particularly suitable for use in injection moulding, since the grid geometry is very robust with respect to flow pressure and temperature on the one hand and possesses an adjustable spacing to enable a complete filling of the mould cavity on the other hand. The development of pre-consolidated textile grids and their further processing into composites form the basis for providing tailored parts with a large number of additional integrated functions like fibrous sensors or electroconductive fibres. Composites reinforced in that way allow new product groups for promising lightweight structures to be opened up in future. The article describes the manufacturing process of this new composite class and their variability regarding reinforcement and function integration. An experimentally based study of the mechanical properties is performed. For this purpose, quasi-static and highly dynamic tensile tests have been carried out as well as impact penetration experiments. The reinforcing potential of the multiaxial grids is demonstrated by means of evaluating drop tower experiments on automotive components. It has been shown that the load-adapted reinforcement enables a significant local or global improvement of the properties of plastic components depending on industrial requirements.

  11. ReSS: A Resource Selection Service for the Open Science Grid

    SciTech Connect

    Garzoglio, Gabriele; Levshina, Tanya; Mhashilkar, Parag; Timm, Steve; /Fermilab

    2008-01-01

    The Open Science Grid offers access to hundreds of computing and storage resources via standard Grid interfaces. Before the deployment of an automated resource selection system, users had to submit jobs directly to these resources. They would manually select a resource and specify all relevant attributes in the job description prior to submitting the job. The necessity of human intervention in resource selection and attribute specification hinders automated job management components from accessing OSG resources and is inconvenient for users. The Resource Selection Service (ReSS) project addresses these shortcomings. The system integrates Condor technology, for the core matchmaking service, with the gLite CEMon component, for gathering and publishing resource information in the Glue Schema format. Each of these components communicates over secure protocols via web services interfaces. The system is currently used in production on OSG by the DZero Experiment, the Engagement Virtual Organization, and the Dark Energy. It is also the resource selection service for the Fermilab Campus Grid, FermiGrid. ReSS is considered a lightweight solution to push-based workload management. This paper describes the architecture, performance, and typical usage of the system.
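
    As a toy illustration of the matchmaking idea (a hedged Python sketch with invented attribute names; this is not Condor ClassAd matchmaking or the ReSS implementation):

        # Toy resource matchmaking: select resources whose advertised attributes
        # satisfy a job's requirements. Illustrative only; not Condor/ReSS.
        resources = [
            {"name": "site_a", "free_cpus": 120, "memory_mb": 2048, "vo": ["dzero", "engage"]},
            {"name": "site_b", "free_cpus": 4,   "memory_mb": 8192, "vo": ["cdf"]},
        ]

        job = {"min_cpus": 8, "min_memory_mb": 1024, "vo": "dzero"}

        def matches(resource, job):
            return (resource["free_cpus"] >= job["min_cpus"]
                    and resource["memory_mb"] >= job["min_memory_mb"]
                    and job["vo"] in resource["vo"])

        # Rank candidate resources, e.g. by free CPUs, and pick the best match.
        candidates = sorted((r for r in resources if matches(r, job)),
                            key=lambda r: r["free_cpus"], reverse=True)
        print(candidates[0]["name"] if candidates else "no match")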

  12. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results

    PubMed Central

    Humada, Ali M.; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M.; Ahmed, Mushtaq N.

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent, IL; the reverse diode saturation current, Io; and the diode ideality factor, n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with high accuracy compared to the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems in fluctuating climatic conditions. PMID:27035575
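
    For context, the three named parameters correspond to the standard ideal single-diode PV model (the abstract does not give the authors' exact formulation, which may also include series and shunt resistance terms):

        I = I_L - I_o\left[\exp\!\left(\frac{q\,V}{n\,k\,T}\right) - 1\right],

    where q is the elementary charge, k the Boltzmann constant, and T the cell temperature.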

  14. Montage: a grid-enabled engine for delivering custom science-grade mosaics on demand

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Deelman, Ewa; Good, John C.; Jacob, Joseph C.; Katz, Daniel S.; Kesselman, Carl; Laity, Anastasia C.; Prince, Thomas A.; Singh, Gurmeet; Su, Mei-Hui

    2004-09-01

    This paper describes the design of a grid-enabled version of Montage, an astronomical image mosaic service, suitable for large-scale processing of the sky. All the re-projection jobs can be added to a pool of tasks and performed by as many processors as are available, exploiting the parallelization inherent in the Montage architecture. We show how we can describe the Montage application in terms of an abstract workflow so that a planning tool such as Pegasus can derive an executable workflow that can be run in the Grid environment. The execution of the workflow is performed by the workflow manager DAGMan and the associated Condor-G. The grid processing will support tiling of images to a manageable size when the input images can no longer be held in memory. Montage will ultimately run operationally on the TeraGrid. We describe science applications of Montage, including its application to science product generation by Spitzer Legacy Program teams and large-scale, all-sky image processing projects.
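
    As an illustrative sketch of the task-pool parallelism described above (hypothetical function and file names; this is not the Montage code or its Pegasus/DAGMan workflow):

        # Toy version of farming independent re-projection tasks out to a pool of
        # worker processes. Hypothetical; not the Montage implementation.
        from multiprocessing import Pool

        def reproject(tile_id):
            """Stand-in for re-projecting one input image tile onto the mosaic frame."""
            return tile_id, f"reprojected_{tile_id}.fits"

        if __name__ == "__main__":
            tiles = range(16)                  # one task per input image tile
            with Pool(processes=4) as pool:    # as many workers as are available
                for tile_id, output in pool.map(reproject, tiles):
                    print(tile_id, output)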

  15. Formal specification is an experimental science

    SciTech Connect

    Bjorner, D.

    1992-09-01

    Traditionally, abstract models of large, complex systems have been given in free-form mathematics, combining - often in ad-hoc, not formally supported ways - notions from the disciplines of partial differential equations, functional analysis, mathematical statistics, etc. Such models have been very useful for assimilation of information, analysis (investigation), and prediction (simulation). These models have, however, usually not been helpful in deriving computer representations of the modelled systems - for the purposes of computerized monitoring and control. Computing science, concerned with how to construct objects that can exist within the computer, offers ways of complementing, and in some cases replacing or combining, traditional mathematical models. Formal, model- as well as property-oriented, specifications in the styles of denotational (respectively, algebraic) semantics represent major approaches to such modelling. In this expository, discursive paper we illustrate what we mean by model-oriented specifications of large, complex technological computing systems. The three modelling examples cover the introvert programming-methodological subject of SDEs (software development environments), the distributed computing system subject of WFSs ((transaction) work flow systems), and the extrovert subject of robots (robotics!). The thesis is, just as for mathematical modelling, that we can derive much understanding, etc., from experimentally creating such formally specified models - on paper - and that we gain little in additionally building ad-hoc prototypes. Our models are expressed in a model-oriented style using the VDM specification language Meta-IV. In this paper the models only reflect the "data modelling" aspects. We observe that such data models are more easily captured in the model-oriented style than in the algebraic-semantics property-oriented style, which originally was built on the abstraction of operations. 101 refs., 4 figs.

  16. Space science experimentation automation and support

    NASA Technical Reports Server (NTRS)

    Frainier, Richard J.; Groleau, Nicolas; Shapiro, Jeff C.

    1994-01-01

    This paper outlines recent work done at the NASA Ames Artificial Intelligence Research Laboratory on automation and support of science experiments on the US Space Shuttle in low earth orbit. Three approaches to increasing the science return of these experiments using emerging automation technologies are described: remote control (telescience), science advisors for astronaut operators, and fully autonomous experiments. The capabilities and limitations of these approaches are reviewed.

  17. The Montage architecture for grid-enabled science processing of large, distributed datasets

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui

    2004-01-01

    Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available computational resources. Therefore, state of the art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.

  18. Experimental Observation of a Periodically Oscillating Plasma Sphere in a Gridded Inertial Electrostatic Confinement Device

    SciTech Connect

    Park, J.; Nebel, R.A.; Stange, S.; Murali, S. Krupakar

    2005-07-01

    The periodically oscillating plasma sphere (POPS) [D. C. Barnes and R. A. Nebel, Phys. Plasmas 5, 2498 (1998).] oscillation has been observed in a gridded inertial electrostatic confinement device. In these experiments, ions in the virtual cathode exhibit resonant behavior when driven at the POPS frequency. Excellent agreement between the observed POPS resonance frequency and theoretical predictions has been observed for a wide range of potential well depths and for three different ion species. The results provide the first experimental validation of the POPS concept proposed by Barnes and Nebel [R. A. Nebel and D. C. Barnes, Fusion Technol. 34, 28 (1998).].

  19. Experimental optimization of the FireFly 600 photovoltaic off-grid system.

    SciTech Connect

    Boyson, William Earl; Orozco, Ron; Ralph, Mark E.; Brown, Marlene Laura; King, David L.; Hund, Thomas D.

    2003-10-01

    A comprehensive evaluation and experimental optimization of the FireFly{trademark} 600 off-grid photovoltaic system manufactured by Energia Total, Ltd. was conducted at Sandia National Laboratories in May and June of 2001. This evaluation was conducted at the request of the manufacturer and addressed performance of individual system components, overall system functionality and performance, safety concerns, and compliance with applicable codes and standards. A primary goal of the effort was to identify areas for improvement in performance, reliability, and safety. New system test procedures were developed during the effort.

  20. Sealife: a semantic grid browser for the life sciences applied to the study of infectious diseases.

    PubMed

    Schroeder, Michael; Burger, Albert; Kostkova, Patty; Stevens, Robert; Habermann, Bianca; Dieng-Kuntz, Rose

    2006-01-01

    The objective of Sealife is the conception and realisation of a semantic Grid browser for the life sciences, which will link the existing Web to the currently emerging eScience infrastructure. The SeaLife Browser will allow users to automatically link a host of Web servers and Web/Grid services to the Web content they are visiting. This will be accomplished using eScience's growing number of Web/Grid Services and its XML-based standards and ontologies. The browser will identify terms in the pages being browsed through the background knowledge held in ontologies. Through the use of Semantic Hyperlinks, which link identified ontology terms to servers and services, the SeaLife Browser will offer a new dimension of context-based information integration. In this paper, we give an overview of the different components of the browser and their interplay. This SeaLife Browser will be demonstrated within three application scenarios in evidence-based medicine, literature & patent mining, and molecular biology, all relating to the study of infectious diseases. The three applications vertically integrate the molecule/cell, the tissue/organ and the patient/population level by covering the analysis of high-throughput screening data for endocytosis (the molecular entry pathway into the cell), the expression of proteins in the spatial context of tissue and organs, and a high-level library on infectious diseases designed for clinicians and their patients. For more information see http://www.biote.ctu-dresden.de/sealife.
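
    As a toy illustration of the term-to-service linking idea (invented ontology terms, identifiers and URL; this is not the SeaLife implementation):

        # Toy semantic-hyperlink pass: wrap known ontology terms found in page text
        # with links to a lookup service. Terms and URL are invented for illustration.
        import re

        ONTOLOGY_TERMS = {
            "endocytosis": "GO:0006897",
            "influenza":   "DOID:8469",
        }

        def add_semantic_links(text, service_url="https://example.org/lookup"):
            def link(match):
                term = match.group(0)
                term_id = ONTOLOGY_TERMS[term.lower()]
                return f'<a href="{service_url}?id={term_id}">{term}</a>'
            pattern = re.compile("|".join(re.escape(t) for t in ONTOLOGY_TERMS),
                                 re.IGNORECASE)
            return pattern.sub(link, text)

        print(add_semantic_links("Endocytosis is the entry pathway into the cell."))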

  1. Materials Science and Materials Chemistry for Large Scale Electrochemical Energy Storage: From Transportation to Electrical Grid

    SciTech Connect

    Liu, Jun; Zhang, Jiguang; Yang, Zhenguo; Lemmon, John P.; Imhoff, Carl H.; Graff, Gordon L.; Li, Liyu; Hu, Jian Z.; Wang, Chong M.; Xiao, Jie; Xia, Guanguang; Viswanathan, Vilayanur V.; Baskaran, Suresh; Sprenkle, Vincent L.; Li, Xiaolin; Shao, Yuyan; Schwenzer, Birgit

    2013-02-15

    Large-scale electrical energy storage has become more important than ever for reducing fossil energy consumption in transportation and for the widespread deployment of intermittent renewable energy in electric grid. However, significant challenges exist for its applications. Here, the status and challenges are reviewed from the perspective of materials science and materials chemistry in electrochemical energy storage technologies, such as Li-ion batteries, sodium (sulfur and metal halide) batteries, Pb-acid battery, redox flow batteries, and supercapacitors. Perspectives and approaches are introduced for emerging battery designs and new chemistry combinations to reduce the cost of energy storage devices.

  2. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data

  3. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus despite the cost of solar PV decreasing, the cost of integrating solar power will increase as penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive and markets are still being designed to leverage their full potential and mitigate their limitation (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output) thereby providing the grid advance warning to schedule ancillary generation more accurately, or curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by the grid operators we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first issue required atmospheric science and engineering research, while the second required detailed knowledge of energy markets, and power engineering. Motivated by this background we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market. (b) Development of a sky imager to provide short term

  4. Experimental control requirements for life sciences

    NASA Technical Reports Server (NTRS)

    Berry, W. E.; Sharp, J. C.

    1978-01-01

    The Life Sciences dedicated Spacelab will enable scientists to test hypotheses in various disciplines. Building upon experience gained in mission simulations, orbital flight test experiments, and the first three Spacelab missions, NASA will be able to progressively develop the engineering and management capabilities necessary for the first Life Sciences Spacelab. Development of experiments for these missions will require implementation of life-support systems not previously flown in space. Plant growth chambers, animal holding facilities, aquatic specimen life-support systems, and centrifuge-mounted specimen holding units are examples of systems currently being designed and fabricated for flight.

  5. Visual monitoring of autonomous life sciences experimentation

    NASA Technical Reports Server (NTRS)

    Blank, G. E.; Martin, W. N.

    1987-01-01

    The design and implementation of a computerized visual monitoring system to aid in the monitoring and control of life sciences experiments on board a space station were investigated. A likely multiprocessor design was chosen, a plausible life science experiment with which to work was defined, the theoretical issues involved in programming a visual monitoring system for the experiment on the multiprocessor were considered, a system for monitoring the experiment was designed, and simulations of such a system were implemented on a network of Apollo workstations.

  6. Geomorphology, Science (Experimental): 5343.09.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    Performance objectives are stated for this secondary school instructional unit concerned with aspects of earth science with emphases on the internal and external forces that bring about changes in the earth's crust. Lists of films and state-adopted and other texts are presented. Included are a course outline summarizing the unit content; numerous…

  7. RAON experimental facilities for nuclear science

    SciTech Connect

    Kwon, Y. K.; Kim, Y. K.; Komatsubara, T.; Moon, J. Y.; Park, J. S.; Shin, T. S.; Kim, Y. J.

    2014-05-02

    The Rare Isotope Science Project (RISP) was established in December 2011 and has put considerable effort into the design and construction of the accelerator complex facility named “RAON”. RAON is a rare isotope (RI) beam facility that aims to provide various RI beams of proton- and neutron-rich nuclei as well as a variety of stable ion beams over a wide range of energies up to a few hundred MeV/nucleon for research in basic science and applications. Proposed research programs for nuclear physics and nuclear astrophysics at RAON include studies of the properties of exotic nuclei, the equation of state of nuclear matter, the origin of the universe, the process of nucleosynthesis, superheavy elements, etc. Various high-performance magnetic spectrometers for nuclear science have been designed: KOBRA (KOrea Broad acceptance Recoil spectrometer and Apparatus), LAMPS (Large Acceptance Multi-Purpose Spectrometer), and ZDS (Zero Degree Spectrometer). The status of these spectrometers for nuclear science will be presented with a brief report on RAON.

  8. Environmental Science, Grade 9. Experimental Curriculum Bulletin.

    ERIC Educational Resources Information Center

    Bernstein, Leonard, Ed.

    This is the teacher's guide for the required, interdisciplinary, ninth-year environmental science course for the New York City Schools. One hundred twenty lesson plans, divided into nine units, are presented. Areas of study include the living and non-living environment, ecosystems, population, urban ecology, energy and technology, pollution, and…

  9. Nuclear Test-Experimental Science: Annual report, fiscal year 1988

    SciTech Connect

    Struble, G.L.; Donohue, M.L.; Bucciarelli, G.; Hymer, J.D.; Kirvel, R.D.; Middleton, C.; Prono, J.; Reid, S.; Strack, B.

    1988-01-01

    Fiscal year 1988 has been a significant, rewarding, and exciting period for Lawrence Livermore National Laboratory's nuclear testing program. It was significant in that the Laboratory's new director chose to focus strongly on the program's activities and to commit to a revitalized emphasis on testing and the experimental science that underlies it. It was rewarding in that revolutionary new measurement techniques were fielded on recent important and highly complicated underground nuclear tests with truly incredible results. And it was exciting in that the sophisticated and fundamental problems of weapons science that are now being addressed experimentally are yielding new challenges and understanding in ways that stimulate and reward the brightest and best of scientists. During FY88 the program was reorganized to emphasize our commitment to experimental science. The name of the program was changed to reflect this commitment, becoming the Nuclear Test-Experimental Science (NTES) Program.

  10. Experimental study of the effect of spacer grid on the flow structure in fuel assemblies of the AES 2006 reactor

    NASA Astrophysics Data System (ADS)

    Kashinskii, O. N.; Lobanov, P. D.; Pribaturin, N. A.; Kurdyumov, A. S.; Volkov, S. E.

    2013-01-01

    Results from an experimental study of the local hydrodynamic structure of liquid flow in a 37-cell model simulating a fuel assembly used in the AES-2006 reactor are presented. Special attention is paid to the effect of spacer grid on flow hydrodynamics. Data on variations of the local and integral values of the liquid axial velocity and friction stress on the fuel rod simulator's wall with distance from the grid are given.

  11. Grid-enabled measures: using Science 2.0 to standardize measures and share data.

    PubMed

    Moser, Richard P; Hesse, Bradford W; Shaikh, Abdul R; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry Y; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-05-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment--a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute (NCI) with two overarching goals: (1) promote the use of standardized measures, which are tied to theoretically based constructs; and (2) facilitate the ability to share harmonized data resulting from the use of standardized measures. The first is accomplished by creating an online venue where a virtual community of researchers can collaborate together and come to consensus on measures by rating, commenting on, and viewing meta-data about the measures and associated constructs. The second is accomplished by connecting the constructs and measures to an ontological framework with data standards and common data elements such as the NCI Enterprise Vocabulary System (EVS) and the cancer Data Standards Repository (caDSR). This paper will describe the web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing.
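
    As a hedged sketch of the kind of data model such a repository implies (field names are invented; this is not the GEM database schema):

        # Toy data model for constructs, measures, and community ratings.
        # Field names are invented; this is not the actual GEM database schema.
        from dataclasses import dataclass, field

        @dataclass
        class Measure:
            name: str
            construct: str            # theoretical construct the measure is tied to
            ratings: list = field(default_factory=list)
            comments: list = field(default_factory=list)

            def mean_rating(self):
                return sum(self.ratings) / len(self.ratings) if self.ratings else None

        m = Measure(name="Perceived Risk Scale", construct="risk perception")
        m.ratings.extend([4, 5, 3])
        m.comments.append("Used in prior survey work.")
        print(m.construct, m.mean_rating())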

  13. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure so that stakeholders can control their own resources, which also helps ensure fair use of those resources. The collaborative control room is being developed using the open-source Access Grid software that enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP and included tools for run preparation, submission, monitoring and management. This approach saves user sites from the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  14. A 35 Year Earth Science Data Record of Gridded IR Atmospheric Radiances

    NASA Astrophysics Data System (ADS)

    Halem, M.; Chapman, D.; Nguyen, P.

    2008-12-01

    We present the generation of a 35 year Earth Science Data Record (ESDR) of gridded level 1B atmospheric radiances at a 250 km spatial resolution from sources of satellite data including the Vertical Temperature Profiler Radiometer (VTPR), High Resolution Infrared Sounder (HIRS2/3/4), and Atmospheric Infrared Sounder (AIRS). We use the MODIS long wave channels to validate the calibration of the AIRS and HIRS data. VTPR is an operational 8-spectral-channel infrared sounding system with an IFOV of around 55 km at nadir that operated from 1972 to 1979. The HIRS/2 sensor is a 20-spectral-channel instrument with an IFOV of approximately 20 km that flew from 1979 to 2001, forming a 22 year record. HIRS/3 is an advanced HIRS sounder that has flown on NOAA 15-17 from 1998 to the present. HIRS/4, essentially the same as HIRS/3 except for an IFOV of 10 km, has flown on the ESA MetOp-A satellite from 2006 to the present. AIRS, on the Aqua satellite launched in May 2002, has 2374 spectral channels from 3.7 μm to 15 μm and is well calibrated as compared with the MODIS channels on the same satellite. Based on the Aqua Senior Project Review of available flight fuel, power and orbital maneuvers, the life span of the Aqua satellite is estimated to extend to 2013. No such gridded data products of just the observed IR radiances are available, since the emphasis for these sensors was the inference of temperature profiles from the observations for use in weather analysis and prediction. We have developed a system, SOAR, that provides gridded radiance data for AIRS and MODIS that can meet the precision and accuracy required for a Fundamental Data Record (FDR). We are exploiting the IBM Cell blade cluster (at UMBC) of 250 processors to geolocate and grid the entire data volume of the AIRS and MODIS instruments, employing a data-intensive raycasting algorithm. The Observation Coverage (obscov) based geolocation significantly improves gridded accuracy by 1 Kelvin brightness temperature over most regions on Earth, when
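    As a simplified illustration of the gridding step described above (not the SOAR obscov/raycasting method), the Python sketch below bins synthetic swath brightness temperatures into equal-angle cells by averaging; the 2.5 degree cell size, roughly 250 km at the equator, and all variable names are assumptions.

      # Minimal sketch of gridding swath observations onto a coarse equal-angle
      # lat/lon grid by cell averaging. This is NOT the SOAR obscov/raycasting
      # method described above; it only illustrates the level 1B binning idea.
      import numpy as np

      def grid_average(lat, lon, tb, cell_deg=2.5):
          """Average brightness temperatures (tb) into cell_deg x cell_deg cells."""
          lat_edges = np.arange(-90.0, 90.0 + cell_deg, cell_deg)
          lon_edges = np.arange(-180.0, 180.0 + cell_deg, cell_deg)
          sums, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges], weights=tb)
          counts, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges])
          with np.errstate(invalid="ignore"):
              return sums / counts          # NaN where a cell has no observations

      # Synthetic example: 10,000 random observations.
      rng = np.random.default_rng(0)
      lat = rng.uniform(-90, 90, 10_000)
      lon = rng.uniform(-180, 180, 10_000)
      tb = 250 + 30 * np.cos(np.radians(lat)) + rng.normal(0, 2, lat.size)
      grid = grid_average(lat, lon, tb)
      print(grid.shape)                     # (72, 144) for 2.5-degree cells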

  15. [Experimentation with women: science fiction or reality?].

    PubMed

    Villar Amigó, Vicente M

    2008-01-01

    Many people will not have heard about the experimentation that has been, and continues to be, carried out on women, because much of the media makes no mention of the matter. Just a few examples that could be mentioned are experimentation with the contraceptive pill, forced sterilization, egg donation, surrogate motherhood, kidney and other organ donation, and unnecessary therapy and surgery. In a few cases such experimentation could well be termed exploitation of women, with all kinds of excuses or humanitarian reasons, and sometimes communitarian purposes and even reasons concerning possible benefits for the whole of society, being mentioned. The present work aims to stimulate reflection about some types of research which can only be regarded as exploitation of women.

  16. Surface Modeling and Grid Generation of Orbital Sciences X34 Vehicle. Phase 1

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    The surface modeling and grid generation requirements, motivations, and methods used to develop Computational Fluid Dynamic volume grids for the X34-Phase 1 are presented. The requirements set forth by the Aerothermodynamics Branch at the NASA Langley Research Center serve as the basis for the final techniques used in the construction of all volume grids, including grids for parametric studies of the X34. The Integrated Computer Engineering and Manufacturing code for Computational Fluid Dynamics (ICEM/CFD), the Grid Generation code (GRIDGEN), the Three-Dimensional Multi-block Advanced Grid Generation System (3DMAGGS) code, and Volume Grid Manipulator (VGM) code are used to enable the necessary surface modeling, surface grid generation, volume grid generation, and grid alterations, respectively. All volume grids generated for the X34, as outlined in this paper, were used for CFD simulations within the Aerothermodynamics Branch.
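    The sketch below shows 2-D transfinite interpolation, one of the basic algebraic techniques that structured grid generators automate; it is only an illustration of the idea and not the ICEM/CFD, GRIDGEN, 3DMAGGS or VGM workflow used for the X34.

      # Sketch of 2-D transfinite interpolation (TFI), a basic algebraic building
      # block of structured grid generation. Illustrative only; not the X34
      # toolchain described above.
      import numpy as np

      def tfi(bottom, top, left, right):
          """Fill a block interior from its four boundary curves.

          bottom, top: (ni, 2) arrays; left, right: (nj, 2) arrays.
          The corner points of adjoining curves must coincide.
          """
          ni, nj = bottom.shape[0], left.shape[0]
          xi = np.linspace(0.0, 1.0, ni)[:, None, None]     # (ni, 1, 1)
          eta = np.linspace(0.0, 1.0, nj)[None, :, None]    # (1, nj, 1)
          b, t = bottom[:, None, :], top[:, None, :]        # (ni, 1, 2)
          lf, rt = left[None, :, :], right[None, :, :]      # (1, nj, 2)
          linear = (1 - eta) * b + eta * t + (1 - xi) * lf + xi * rt
          corners = ((1 - xi) * (1 - eta) * bottom[0] + xi * eta * top[-1]
                     + xi * (1 - eta) * bottom[-1] + (1 - xi) * eta * top[0])
          return linear - corners                           # (ni, nj, 2) grid points

      # Example: a gently curved channel meshed with 21 x 11 points.
      s = np.linspace(0, 1, 21)
      bottom = np.column_stack([s, 0.1 * np.sin(np.pi * s)])
      top = np.column_stack([s, 1.0 + 0.1 * np.sin(np.pi * s)])
      tv = np.linspace(0, 1, 11)
      left = np.column_stack([np.zeros(11), tv])
      right = np.column_stack([np.ones(11), tv])
      print(tfi(bottom, top, left, right).shape)            # (21, 11, 2)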

  17. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617
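    As an illustration of the final matrix-assembly step only (the OSG-GEM Pegasus workflow itself handles trimming, alignment and quantification on OSG resources), the sketch below merges hypothetical per-sample count files into one gene expression matrix with pandas; the file naming and two-column format are assumptions.

      # Sketch of the final "matrix assembly" step only: merging per-sample gene
      # count files (two tab-separated columns: gene_id, count) into one gene
      # expression matrix. File names and format are hypothetical.
      import pandas as pd

      def build_gem(count_files):
          columns = []
          for path in count_files:
              sample = path.rsplit("/", 1)[-1].replace(".counts.txt", "")
              col = pd.read_csv(path, sep="\t", header=None,
                                names=["gene_id", sample], index_col="gene_id")
              columns.append(col)
          # Outer join keeps genes missing from some samples; fill those with 0.
          return pd.concat(columns, axis=1, join="outer").fillna(0).astype(int)

      # gem = build_gem(["sampleA.counts.txt", "sampleB.counts.txt"])  # hypothetical files
      # gem.to_csv("expression_matrix.tsv", sep="\t")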

  18. Scalability of network facing services used in the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Sfiligoi, I.; Pi, H.; Würthwein, F.; Theissen, C.; Dost, J. M.

    2011-12-01

    The Open Science Grid relies on several network facing services to deliver resources to its users. The major services are the Compute Elements, Storage Elements, Workload Management Systems and Information Systems. All of these services are exposed to traffic coming from all over the world in an unmanaged way, so it is very important to know how they behave at different levels of load. In this paper we present the methodology and the results of scalability and reliability tests performed by OSG on some of the above services. The major services being tested are the Condor batch system, the GT2, GRAM5 and CREAM CEs, and the BeStMan SRM SE.
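    The sketch below illustrates the general shape of a client-side scalability probe: issue many concurrent requests and record success rate and latency as load grows. It is not the OSG test harness; the HTTP endpoint is hypothetical, and the real tests exercised Condor, GRAM/CREAM and BeStMan through their own protocols.

      # Generic sketch of a client-side load test: issue N concurrent requests
      # against a service endpoint and record success rate and latency.
      import time
      import urllib.request
      from concurrent.futures import ThreadPoolExecutor

      def probe(url, timeout=10.0):
          start = time.perf_counter()
          try:
              with urllib.request.urlopen(url, timeout=timeout) as resp:
                  ok = 200 <= resp.status < 300
          except Exception:
              ok = False
          return ok, time.perf_counter() - start

      def load_test(url, requests=100, concurrency=20):
          with ThreadPoolExecutor(max_workers=concurrency) as pool:
              results = list(pool.map(lambda _: probe(url), range(requests)))
          latencies = sorted(t for ok, t in results if ok)
          success = sum(ok for ok, _ in results) / len(results)
          p95 = latencies[int(0.95 * len(latencies)) - 1] if latencies else float("nan")
          return {"success_rate": success, "p95_latency_s": p95}

      # print(load_test("https://example.org/health", requests=50, concurrency=10))  # hypothetical URL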

  19. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  20. The NASA/GSFC Advanced Data Grid: A Prototype for Future Earth Science Ground System Architectures

    NASA Technical Reports Server (NTRS)

    Gasster, Samuel D.; Lee, Craig; Davis, Brooks; Clark, Matt; AuYeung, Mike; Wilson, John R.; Ladwig, Debra M.

    2003-01-01

    Contents include the following: background and motivation; Grid computing concepts; Advanced Data Grid (ADG) prototype development; ADG requirements and operations concept; ADG architecture; ADG implementation; ADG test plan; ADG schedule; and summary and status.

  1. The Structure of Scientific Arguments by Secondary Science Teachers: Comparison of Experimental and Historical Science Topics

    ERIC Educational Resources Information Center

    Gray, Ron; Kang, Nam-Hwa

    2014-01-01

    Just as scientific knowledge is constructed using distinct modes of inquiry (e.g. experimental or historical), arguments constructed during science instruction may vary depending on the mode of inquiry underlying the topic. The purpose of this study was to examine whether and how secondary science teachers construct scientific arguments during…

  2. The Virtual Kidney: an eScience interface and Grid portal.

    PubMed

    Harris, Peter J; Buyya, Rajkumar; Chu, Xingchen; Kobialka, Tom; Kazmierczak, Ed; Moss, Robert; Appelbe, William; Hunter, Peter J; Thomas, S Randall

    2009-06-13

    The Virtual Kidney uses a web interface and distributed computing to provide experimental scientists and analysts with access to computational simulations and knowledge databases hosted in geographically separated laboratories. Users can explore a variety of complex models without requiring the specific programming environment in which applications have been developed. This initiative exploits high-bandwidth communication networks for collaborative research and for shared access to knowledge resources. The Virtual Kidney has been developed within a specialist community of renal scientists but is transferable to other areas of research requiring interaction between published literature and databases, theoretical models and simulations and the formulation of effective experimental designs. A web-based three-dimensional interface provides access to experimental data, a parameter database and mathematical models. A multi-scale kidney reconstruction includes blood vessels and serially sectioned nephrons. Selection of structures provides links to the database, returning parameter values and extracts from the literature. Models are run locally or remotely with a Grid resource broker managing scheduling, monitoring and visualization of simulation results and application, credential and resource allocation. Simulation results are viewed graphically or as scaled colour gradients on the Virtual Kidney structures, allowing visual and quantitative appreciation of the effects of simulated parameter changes. PMID:19414450

  3. Environmental Science. An Experimental Programme for Primary Teachers.

    ERIC Educational Resources Information Center

    Linke, R. D.

    An experimental course covering some of the fundamental principles and terminology associated with environmental science and the application of these principles to various contemporary problems is summarized in this report. The course involved a series of lectures together with a program of specific seminar and discussion topics presented by the…

  4. An Illustration of the Experimenter Expectancy Effect in School Science

    ERIC Educational Resources Information Center

    Allen, Michael; Briten, Elizabeth

    2012-01-01

    Two groups of year 6 pupils (age 10-11 years) each experienced science practical lessons that were essentially identical but for one difference: one group (theory-led) were told by the teacher what result they should expect, and the other group (hypothetico-deductive) were not. The theory-led group demonstrated experimental bias, recording results…

  5. Experimental Study of Two Phase Flow Behavior Past BWR Spacer Grids

    SciTech Connect

    Ratnayake, Ruwan K.; Hochreiter, L.E.; Ivanov, K.N.; Cimbala, J.M.

    2002-07-01

    Performance of best estimate codes used in the nuclear industry can be significantly improved by reducing the empiricism embedded in their constitutive models. Spacer grids have been found to have an important impact on the maximum allowable Critical Heat Flux within the fuel assembly of a nuclear reactor core. Therefore, incorporation of suitable spacer grids models can improve the critical heat flux prediction capability of best estimate codes. Realistic modeling of entrainment behavior of spacer grids requires understanding the different mechanisms that are involved. Since visual information pertaining to the entrainment behavior of spacer grids cannot possibly be obtained from operating nuclear reactors, experiments have to be designed and conducted for this specific purpose. Most of the spacer grid experiments available in literature have been designed in view of obtaining quantitative data for the purpose of developing or modifying empirical formulations for heat transfer, critical heat flux or pressure drop. Very few experiments have been designed to provide fundamental information which can be used to understand spacer grid effects and phenomena involved in two phase flow. Air-water experiments were conducted to obtain visual information on the two-phase flow behavior both upstream and downstream of Boiling Water Reactor (BWR) spacer grids. The test section was designed and constructed using prototypic dimensions such as the channel cross-section, rod diameter and other spacer grid configurations of a typical BWR fuel assembly. The test section models the flow behavior in two adjacent sub channels in the BWR core. A portion of a prototypic BWR spacer grid accounting for two adjacent channels was used with industrial mild steel rods for the purpose of representing the channel internals. Symmetry was preserved in this practice, so that the channel walls could effectively be considered as the channel boundaries. Thin films were established on the rod surfaces

  6. SciFlo: Semantically-Enabled Grid Workflow for Collaborative Science

    NASA Astrophysics Data System (ADS)

    Yunck, T.; Wilson, B. D.; Raskin, R.; Manipon, G.

    2005-12-01

    SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and the Grid Computing standards (WS-* standards and the Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable SOAP Services, native executables, local command-line scripts, and python codes into a distributed computing flow (a graph of operators). SciFlo's XML dataflow documents can be a mixture of concrete operators (fully bound operations) and abstract template operators (late binding via semantic lookup). All data objects and operators can be both simply typed (simple and complex types in XML schema) and semantically typed using controlled vocabularies (linked to OWL ontologies such as SWEET). By exploiting ontology-enhanced search and inference, one can discover (and automatically invoke) Web Services and operators that have been semantically labeled as performing the desired transformation, and adapt a particular invocation to the proper interface (number, types, and meaning of inputs and outputs). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. A Visual Programming tool is also being developed, but it is not required. Once an analysis has been specified for a granule or day of data, it can be easily repeated with different control parameters and over months or years of data. SciFlo uses and preserves semantics, and also generates and infers new semantic annotations. Specifically, the SciFlo engine uses semantic metadata to
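    As a toy illustration of the dataflow-document idea (not the actual SciFlo schema, which also carries typing and semantic annotations), the sketch below parses a small XML flow and derives an execution order with a topological sort; all element and attribute names are invented.

      # Toy illustration of a dataflow document and its execution order. The XML
      # element and attribute names here are invented for the example.
      import xml.etree.ElementTree as ET
      from graphlib import TopologicalSorter   # Python 3.9+

      DOC = """
      <flow>
        <op id="subset_airs"/>
        <op id="subset_modis"/>
        <op id="coregister" needs="subset_airs subset_modis"/>
        <op id="compare"    needs="coregister"/>
      </flow>
      """

      def execution_order(xml_text):
          graph = {}
          for op in ET.fromstring(xml_text).findall("op"):
              graph[op.get("id")] = set(op.get("needs", "").split())
          return list(TopologicalSorter(graph).static_order())

      print(execution_order(DOC))   # e.g. ['subset_airs', 'subset_modis', 'coregister', 'compare']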

  7. OASIS: a data and software distribution service for Open Science Grid

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; De Stefano, J.; Hover, J.; Quick, R.; Teige, S.

    2014-06-01

    The Open Science Grid encourages the concept of software portability: a user's scientific application should be able to run at as many sites as possible. It is necessary to provide a mechanism for OSG Virtual Organizations to install software at sites. Since its initial release, the OSG Compute Element has provided an application software installation directory to Virtual Organizations, where they can create their own sub-directory, install software into that sub-directory, and have the directory shared on the worker nodes at that site. The current model has shortcomings with regard to permissions, policies, versioning, and the lack of a unified, collective procedure or toolset for deploying software across all sites. Therefore, a new mechanism for distributing data and software is desirable. The architecture for the OSG Application Software Installation Service (OASIS) is a server-client model: the software and data are installed only once in a single place, and are automatically distributed to all client sites simultaneously. Central file distribution offers other advantages, including server-side authentication and authorization, activity records, quota management, data validation and inspection, and well-defined versioning and deletion policies. The architecture, as well as a complete analysis of the current implementation, will be described in this paper.
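    The sketch below illustrates the "install once, validate everywhere" idea behind central distribution using content hashes: a publisher builds a catalog of file digests and clients check their copies against it. It is an illustration of the validation concept only, not the OASIS implementation; all paths are hypothetical.

      # Sketch of "publish once, validate everywhere" using content hashes.
      import hashlib
      import os

      def sha256_of(path, chunk=1 << 20):
          h = hashlib.sha256()
          with open(path, "rb") as f:
              while True:
                  block = f.read(chunk)
                  if not block:
                      break
                  h.update(block)
          return h.hexdigest()

      def publish_catalog(root):
          """Server side: map each relative file path to its digest."""
          catalog = {}
          for dirpath, _, files in os.walk(root):
              for name in files:
                  full = os.path.join(dirpath, name)
                  catalog[os.path.relpath(full, root)] = sha256_of(full)
          return catalog

      def validate(root, catalog):
          """Client side: return paths that are missing or whose content differs."""
          bad = []
          for rel, digest in catalog.items():
              full = os.path.join(root, rel)
              if not os.path.exists(full) or sha256_of(full) != digest:
                  bad.append(rel)
          return bad

      # catalog = publish_catalog("/srv/software/myVO")     # hypothetical paths
      # print(validate("/local/cache/myVO", catalog))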

  8. Practical use of a framework for network science experimentation

    NASA Astrophysics Data System (ADS)

    Toth, Andrew; Bergamaschi, Flavio

    2014-06-01

    In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, called the International Technology Alliance (ITA) in Network and Information Sciences, to address fundamental issues concerning network and information sciences that will enhance decision making for coalition operations, enable rapid, secure formation of ad hoc teams in coalition environments, and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through collaboration between ARL and IBM UK to characterize and define a software stack and tooling that has become the reference framework for network science experimentation in support of the validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA), in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components (the Apollo Fact-Finder tool and the Medusa Crowd Sensing application), the limitations identified, and how they will be addressed in future work.

  9. ISOGA: Integrated Services Optical Grid Architecture for Emerging E-Science Collaborative Applications

    SciTech Connect

    Oliver Yu

    2008-11-28

    This final report describes the accomplishments in the ISOGA (Integrated Services Optical Grid Architecture) project. ISOGA enables efficient deployment of existing and emerging collaborative grid applications with increasingly diverse multimedia communication requirements over a wide-area multi-domain optical network grid, and provides collaborative scientists with fast retrieval and seamless browsing of distributed scientific multimedia datasets over a wide-area optical network grid. The project focuses on research and development in the following areas: the polymorphic optical network control planes to enable multiple switching and communication services simultaneously; the intelligent optical grid user-network interface to enable user-centric network control and monitoring; and the seamless optical grid dataset browsing interface to enable fast retrieval of local/remote datasets for visualization and manipulation.

  10. Comments on cognitive science in the experimental analysis of behavior

    PubMed Central

    Morris, Edward K.; Higgins, Stephen T.; Bickel, Warren K.

    1982-01-01

    Arguments are increasingly being made for the inclusion of cognitive science in the experimental analysis of behavior (TEAB). These arguments are described, and a critical analysis of them is presented, especially in regard to the logic of objective inference and the renewed use of cognitive intervening variables. In addition, one particular defining feature of cognitive processes (i.e., the absence of an immediate controlling stimulus) is described, along with alternative points of view stressing molar-molecular levels of analysis and historical causation. Finally, comments are made on the use of cognitive concepts and language in the behavioral sciences. On all of these issues, counter-arguments are based on available material in behavior analysis metatheory, concepts, and experimental practices. PMID:22478563

  11. Experimental Physical Sciences Vistas: MaRIE (draft)

    SciTech Connect

    Shlachter, Jack

    2010-09-08

    To achieve breakthrough scientific discoveries in the 21st century, a convergence and integration of world-leading experimental facilities and capabilities with theory, modeling, and simulation is necessary. In this issue of Experimental Physical Sciences Vistas, I am excited to present our plans for Los Alamos National Laboratory's future flagship experimental facility, MaRIE (Matter-Radiation Interactions in Extremes). MaRIE is a facility that will provide transformational understanding of matter in extreme conditions required to reduce or resolve key weapons performance uncertainties, develop the materials needed for advanced energy systems, and transform our ability to create materials by design. Our unique role in materials science starting with the Manhattan Project has positioned us well to develop a contemporary materials strategy pushing the frontiers of controlled functionality - the design and tailoring of a material for the unique demands of a specific application. Controlled functionality requires improvement in understanding of the structure and properties of materials in order to synthesize and process materials with unique characteristics. In the nuclear weapons program today, improving data and models to increase confidence in the stockpile can take years from concept to new knowledge. Our goal with MaRIE is to accelerate this process by enhancing predictive capability - the ability to compute a priori the observables of an experiment or test and pertinent confidence intervals using verified and validated simulation tools. It is a science-based approach that includes the use of advanced experimental tools, theoretical models, and multi-physics codes, simultaneously dealing with multiple aspects of physical operation of a system that are needed to develop an increasingly mature predictive capability. This same approach is needed to accelerate improvements to other systems such as nuclear reactors. MaRIE will be valuable to many national security

  12. Animal experimentation in forensic sciences: How far have we come?

    PubMed

    Cattaneo, C; Maderna, E; Rendinelli, A; Gibelli, D

    2015-09-01

    In the third millennium, where ethical, ethological and cultural evolution seem to be leading more and more towards an inter-species society, the issue of animal experimentation is a moral dilemma. Speaking from a self-interested human perspective, avoiding all animal testing where human disease and therapy are concerned may be very difficult or even impossible; such testing may not be so easily justifiable when suffering, or killing, of non-human animals is inflicted for forensic research. In order to verify how forensic scientists are evolving on this ethical issue, we undertook a systematic review of the current literature. We investigated the frequency of animal experimentation in forensic studies in the past 15 years and trends in publication in the main forensic science journals. Types of species, lesions inflicted, manner of sedation or anesthesia, and euthanasia were examined in a total of 404 articles reviewed, among which 279 (69.1%) concerned studies involving animals sacrificed exclusively for the sake of the experiment. Killing still frequently includes painful methods such as blunt trauma, electrocution, mechanical asphyxia, hypothermia, and even exsanguination; of all these animals, apparently only 60.8% were anesthetized. The most recent call for a severe reduction, if not a total halt, to the use of animals in forensic sciences was made by Bernard Knight in 1992. In fact, the principle of reduction and replacement, frequently respected in clinical research, must be considered the basis for forensic science research needing animals. PMID:26216717

  14. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  15. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis.

    PubMed

    Duarte, Afonso M S; Psomopoulos, Fotis E; Blanchet, Christophe; Bonvin, Alexandre M J J; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C; de Lucas, Jesus M; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  17. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

    PubMed Central

    Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  18. Boundary condition identification for a grid model by experimental and numerical dynamic analysis

    NASA Astrophysics Data System (ADS)

    Mao, Qiang; Devitis, John; Mazzotti, Matteo; Bartoli, Ivan; Moon, Franklin; Sjoblom, Kurt; Aktan, Emin

    2015-04-01

    There is a growing need to characterize unknown foundations and assess substructures in existing bridges. It is becoming an important issue for the serviceability and safety of bridges as well as for the possibility of partial reuse of existing infrastructures. Within this broader context, this paper investigates the possibility of identifying, locating and quantifying changes of boundary conditions by leveraging a simply supported grid structure with a composite deck. Multi-reference impact tests are performed on the grid model, and one supporting bearing is modified by replacing a steel cylindrical roller with a roller of compliant material. Impact-based modal analysis provides global modal parameters, such as damped natural frequencies, mode shapes and the flexibility matrix, that are used as indicators of boundary condition changes. An updating process combining a hybrid optimization algorithm and the finite element software suite ABAQUS is presented in this paper. The updated ABAQUS model of the grid, which simulates the supporting bearing with springs, is used to detect and quantify the change of the boundary conditions.
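    To make the updating idea concrete, the sketch below tunes the stiffness of a single boundary spring in a reduced two-degree-of-freedom model so that its natural frequencies match "measured" values. It is illustrative only: the study above updates a full ABAQUS model with a hybrid optimizer, and all masses, stiffnesses and frequencies here are invented.

      # Reduced sketch of the model-updating idea: represent the modified bearing
      # as a spring of unknown stiffness k and tune k so that the model's natural
      # frequencies match "measured" ones. The 2-DOF system and all numbers are
      # illustrative only.
      import numpy as np
      from scipy.linalg import eigh
      from scipy.optimize import minimize_scalar

      M = np.diag([100.0, 80.0])            # kg, lumped masses
      k_deck = 5.0e6                        # N/m, fixed internal stiffness

      def frequencies(k_bearing):
          K = np.array([[k_bearing + k_deck, -k_deck],
                        [-k_deck,             k_deck]])
          lam = eigh(K, M, eigvals_only=True)      # lam = omega^2
          return np.sqrt(lam) / (2 * np.pi)        # natural frequencies in Hz

      # Synthetic "test" data: true bearing stiffness 2.0e6 N/m plus measurement noise.
      f_measured = frequencies(2.0e6) + np.array([0.05, -0.08])

      res = minimize_scalar(lambda k: np.sum((frequencies(k) - f_measured) ** 2),
                            bounds=(1.0e5, 1.0e7), method="bounded")
      print(f"identified bearing stiffness = {res.x:.3g} N/m")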

  19. Effects of mesh style and grid convergence on particle deposition in bifurcating airway models with comparisons to experimental data.

    PubMed

    Longest, P Worth; Vinchurkar, Samir

    2007-04-01

    A number of research studies have employed a wide variety of mesh styles and levels of grid convergence to assess velocity fields and particle deposition patterns in models of branching biological systems. Generating structured meshes based on hexahedral elements requires significant time and effort; however, these meshes are often associated with high quality solutions. Unstructured meshes that employ tetrahedral elements can be constructed much faster but may increase levels of numerical diffusion, especially in tubular flow systems with a primary flow direction. The objective of this study is to better establish the effects of mesh generation techniques and grid convergence on velocity fields and particle deposition patterns in bifurcating respiratory models. In order to achieve this objective, four widely used mesh styles including structured hexahedral, unstructured tetrahedral, flow adaptive tetrahedral, and hybrid grids have been considered for two respiratory airway configurations. Initial particle conditions tested are based on the inlet velocity profile or the local inlet mass flow rate. Accuracy of the simulations has been assessed by comparisons to experimental in vitro data available in the literature for the steady-state velocity field in a single bifurcation model as well as the local particle deposition fraction in a double bifurcation model. Quantitative grid convergence was assessed based on a grid convergence index (GCI), which accounts for the degree of grid refinement. The hexahedral mesh was observed to have GCI values that were an order of magnitude below the unstructured tetrahedral mesh values for all resolutions considered. Moreover, the hexahedral mesh style provided GCI values of approximately 1% and reduced run times by a factor of 3. Based on comparisons to empirical data, it was shown that inlet particle seedings should be consistent with the local inlet mass flow rate. Furthermore, the mesh style was found to have an observable
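    For reference, the sketch below evaluates the three-grid grid convergence index in the usual Roache form with a constant refinement ratio; the safety factor of 1.25 is the conventional three-grid choice, and the sample solution values are invented rather than taken from the study.

      # Standard three-grid grid convergence index (GCI) calculation with a
      # constant refinement ratio r. Sample values are made up for illustration.
      import math

      def gci_fine(f_fine, f_medium, f_coarse, r, safety=1.25):
          """Return (observed order p, GCI of the fine grid as a fraction)."""
          p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
          eps = abs((f_medium - f_fine) / f_fine)     # relative change, fine vs medium
          return p, safety * eps / (r ** p - 1)

      # Example: deposition fractions from three grids refined by r = 2.
      p, gci = gci_fine(f_fine=0.172, f_medium=0.168, f_coarse=0.150, r=2.0)
      print(f"observed order p = {p:.2f}, GCI_fine = {100 * gci:.2f}%")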

  1. Analyzing Sustainable Energy Opportunities for a Small Scale Off-Grid Facility: A Case Study at Experimental Lakes Area (ELA), Ontario

    NASA Astrophysics Data System (ADS)

    Duggirala, Bhanu

    This thesis explored opportunities to reduce energy demand and the feasibility of renewable energy at an off-grid science "community" called the Experimental Lakes Area (ELA) in Ontario. Being off-grid, ELA is completely dependent on diesel and propane fuel supply for all its electrical and heating needs, which makes ELA vulnerable to fluctuating fuel prices. As a result, ELA emits a large amount of greenhouse gases (GHG) for its size. Energy efficiency and renewable energy technologies can reduce energy consumption and consequently energy cost, as well as GHG emissions. Energy efficiency was very important to ELA due to the elevated fuel costs at this remote location. Minor upgrades to lighting, equipment and the building envelope were able to reduce energy costs and reduce load. Efficient energy-saving measures were recommended that save on operating and maintenance costs, namely changing to LED lights, replacing old equipment such as refrigerators, and downsizing ice makers. This resulted in a 4.8% load reduction and subsequently reduced the initial capital cost by $27,000 for biomass, $49,500 for wind power and $136,500 for solar power. Many alternative energies show promise as potential energy sources to reduce the diesel and propane consumption at ELA, including wind energy, solar heating and biomass. A biomass-based CHP system using the existing diesel generators as back-up has the shortest payback period of the technologies modeled: 4.1 years at $0.80 per liter of diesel; as the diesel price approaches $2.00 per liter, the payback period falls to 0.9 years, at roughly 50% of present generation costs. Biomass has been successfully tried and tested in many off-grid communities, particularly in small-scale off-grid settings in North America and internationally. Also, the site-specific solar and wind data show that ELA has potential to harvest renewable resources and produce heat and power at competitive
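    The payback behavior described above follows from simple arithmetic: capital cost divided by the annual value of displaced fuel. The sketch below shows the pattern with placeholder numbers; the capital cost and fuel volumes are assumptions, not the thesis figures.

      # Simple payback arithmetic illustrating why the payback period shortens as
      # the displaced diesel gets more expensive. Capital cost and fuel use are
      # hypothetical placeholders.
      def simple_payback(capital_cost, litres_displaced_per_year,
                         diesel_price_per_litre, non_fuel_savings_per_year=0.0):
          annual_savings = (litres_displaced_per_year * diesel_price_per_litre
                            + non_fuel_savings_per_year)
          return capital_cost / annual_savings

      for price in (0.80, 1.20, 1.60, 2.00):
          years = simple_payback(capital_cost=400_000,
                                 litres_displaced_per_year=120_000,
                                 diesel_price_per_litre=price)
          print(f"diesel at ${price:.2f}/L -> payback of about {years:.1f} years")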

  2. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    NASA Astrophysics Data System (ADS)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schema that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is built so that it is generic enough to be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, a web-services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.
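    At its core, ticket exchange requires mapping one system's fields and status vocabulary onto another's. The sketch below shows that mapping step in isolation; every field and status name is invented, and the real GGUS/Remedy/RT/ServiceNow schemas and the GOC-TX web-service plumbing differ.

      # Sketch of the schema-mapping step at the heart of ticket exchange. All
      # field and status names are invented for illustration.
      FIELD_MAP = {"summary": "short_description", "detail": "description",
                   "reporter": "caller_id"}
      STATUS_MAP = {"open": "New", "in progress": "Work in Progress",
                    "solved": "Resolved", "closed": "Closed"}

      def translate(ticket, field_map=FIELD_MAP, status_map=STATUS_MAP):
          out = {field_map[k]: v for k, v in ticket.items() if k in field_map}
          out["state"] = status_map.get(ticket.get("status", "open"), "New")
          return out

      src = {"summary": "Jobs held at site X", "detail": "Submission errors since 09:00",
             "reporter": "osg-user", "status": "in progress"}
      print(translate(src))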

  3. Experimental Evaluation of Load Rejection Over-Voltage from Grid-Tied Solar Inverters

    SciTech Connect

    Nelson, Austin; Hoke, Anderson; Chakraborty, Sudipta; Ropp, Michael; Chebahtah, Justin; Wang, Trudie; Zimmerly, Brian

    2015-06-14

    This paper investigates the impact of load rejection over-voltage (LRO) from commercially available grid-tied photovoltaic (PV) inverters. LRO can occur when a breaker opens and the power output from a distributed energy resource (DER) exceeds the load. Simplified models of current-controlled inverters can over-predict LRO magnitudes, thus it is useful to quantify the effect through laboratory testing. The load rejection event was replicated using a hardware testbed at the National Renewable Energy Laboratory (NREL), and a set of commercially available PV inverters was tested to quantify the impact of LRO for a range of generation-to-load ratios. The magnitude and duration of the over-voltage events are reported in this paper along with a discussion of characteristic inverter output behavior. The results for the inverters under test showed that maximum over-voltage magnitudes were less than 200% of nominal voltage, and much lower in many test cases. These research results are important because utilities that interconnect inverter-based DER need to understand their characteristics under abnormal grid conditions.
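    The headline metric, peak over-voltage as a percentage of nominal, can be computed from a sampled voltage waveform as the maximum per-cycle RMS relative to the nominal RMS. The sketch below does this on a synthetic waveform; the sampling rate, nominal voltage and event shape are assumptions, not NREL test data.

      # Computing peak over-voltage as a percent of nominal from a sampled
      # voltage waveform. The waveform below is synthetic.
      import numpy as np

      def max_overvoltage_percent(v_samples, fs, f_nominal=60.0, v_nominal_rms=240.0):
          """Max per-cycle RMS voltage, expressed as a percentage of nominal RMS."""
          samples_per_cycle = int(round(fs / f_nominal))
          n_cycles = len(v_samples) // samples_per_cycle
          cycles = v_samples[:n_cycles * samples_per_cycle].reshape(n_cycles, -1)
          rms = np.sqrt((cycles ** 2).mean(axis=1))
          return 100.0 * rms.max() / v_nominal_rms

      # Synthetic event: 240 V RMS rising to 1.5 x nominal for a few cycles.
      fs, f = 10_000, 60.0
      t = np.arange(0, 0.5, 1 / fs)
      amp = 240 * np.sqrt(2) * np.where((t > 0.2) & (t < 0.25), 1.5, 1.0)
      v = amp * np.sin(2 * np.pi * f * t)
      print(f"{max_overvoltage_percent(v, fs):.0f}% of nominal")   # about 150%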

  4. [Science museums and psychology: interactivity, experimentation, and context].

    PubMed

    Colinvaux, Dominique

    2005-01-01

    The article reflects on the notion of the museum experience from the perspective of a visitor to a science and technology museum. Unlike studies that postulate a generic, abstract 'model visitor', the goal was to discuss the perspectives of the visitor as a psychological being, and to this end the research relied on the notion of interactivity. Using two classic psychology studies analyzing the behavior of children and adolescents, the current study first focused on the notion of experimentation, characterized as an interaction between subject and object. It then explored interactions between subjects and contexts, approaching from the notion of mediated action. My conclusion is that a museum experience should, on the one hand, take into account the visitor's ability to act, ask, and experiment and, on the other, the specific museum contexts that invite and propose but may also limit these very chances to act, question, and experiment.

  5. EverVIEW: A visualization platform for hydrologic and Earth science gridded data

    NASA Astrophysics Data System (ADS)

    Romañach, Stephanie S.; McKelvy, Mark; Suir, Kevin; Conzelmann, Craig

    2015-03-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.
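    The sketch below steps through the time slices of a gridded NetCDF variable, the kind of data EverVIEW animates; it assumes the netCDF4-python package, CF-style time units, and hypothetical file and variable names.

      # Minimal example of stepping through time slices of a gridded NetCDF
      # variable. File and variable names are hypothetical; assumes CF-style
      # time units and the netCDF4-python package.
      from netCDF4 import Dataset, num2date
      import numpy as np

      with Dataset("water_surface.nc") as ds:             # hypothetical file
          time = ds.variables["time"]
          stage = ds.variables["stage"]                   # assumed shape: (time, y, x)
          for i, when in enumerate(num2date(time[:], time.units)):
              frame = np.ma.filled(stage[i, :, :], np.nan)
              print(when, "mean stage:", float(np.nanmean(frame)))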

  6. EverVIEW: a visualization platform for hydrologic and Earth science gridded data

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig

    2015-01-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.

  7. The Distinction between Experimental and Historical Sciences as a Framework for Improving Classroom Inquiry

    ERIC Educational Resources Information Center

    Gray, Ron

    2014-01-01

    Inquiry experiences in secondary science classrooms are heavily weighted toward experimentation. We know, however, that many fields of science (e.g., evolutionary biology, cosmology, and paleontology), while they may utilize experiments, are not justified by experimental methodologies. With the focus on experimentation in schools, these fields of…

  8. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation, and Research

    NASA Astrophysics Data System (ADS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-06-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing -1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.

  9. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation and Research

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-01-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing-1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.

  10. Students' epistemologies about experimental physics: Validating the Colorado Learning Attitudes about Science Survey for experimental physics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-06-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder and elsewhere, we developed the Colorado Learning Attitudes about Science Survey for experimental physics (E-CLASS). Previous work with this assessment has included establishing the accuracy and clarity of the instrument through student interviews and preliminary testing. Several years of data collection at multiple institutions has resulted in a growing national data set of student responses. Here, we report on results of the analysis of these data to investigate the statistical validity and reliability of the E-CLASS as a measure of students' epistemologies for a broad student population. We find that the E-CLASS demonstrates an acceptable level of both validity and reliability on measures of item and test discrimination, test-retest reliability, partial-sample reliability, internal consistency, concurrent validity, and convergent validity. We also examine students' responses using principal component analysis and find that, as expected, the E-CLASS does not exhibit strong factors (a.k.a. categories).
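    Two of the reliability measures named above, internal consistency (Cronbach's alpha) and test-retest correlation, are easy to state explicitly; the sketch below computes both on a small synthetic response matrix, which stands in for, and is unrelated to, the E-CLASS data.

      # Internal consistency (Cronbach's alpha) and test-retest correlation on a
      # synthetic response matrix. Synthetic data only; not E-CLASS responses.
      import numpy as np

      def cronbach_alpha(scores):
          """scores: (n_students, n_items) array of item scores."""
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]
          item_var = scores.var(axis=0, ddof=1).sum()
          total_var = scores.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      rng = np.random.default_rng(1)
      ability = rng.normal(size=200)                             # latent score per student
      items = ability[:, None] + rng.normal(0, 1.0, (200, 10))   # 10 correlated items
      test1 = ability + rng.normal(0, 0.5, 200)                  # first administration
      test2 = ability + rng.normal(0, 0.5, 200)                  # later administration

      print("alpha:", round(cronbach_alpha(items), 2))
      print("test-retest r:", round(np.corrcoef(test1, test2)[0, 1], 2))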

  11. Scientific Grid computing.

    PubMed

    Coveney, Peter V

    2005-08-15

    We introduce a definition of Grid computing which is adhered to throughout this Theme Issue. We compare the evolution of the World Wide Web with current aspirations for Grid computing and indicate areas that need further research and development before a generally usable Grid infrastructure becomes available. We discuss work that has been done in order to make scientific Grid computing a viable proposition, including the building of Grids, middleware developments, computational steering and visualization. We review science that has been enabled by contemporary computational Grids, and associated progress made through the widening availability of high performance computing.

  14. New Source Code: Spelman Women Transforming the Grid of Science and Technology

    NASA Astrophysics Data System (ADS)

    Okonkwo, Holly

    From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece in understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explore the narrative experiences of members of the Spelman campus community and immerse myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse of diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time and as subjectivities, and beyond the deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place and its institutional mission, and they are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.

  15. Experimental Comparison of Inquiry and Direct Instruction in Science

    ERIC Educational Resources Information Center

    Cobern, William W.; Schuster, David; Adams, Betty; Applegate, Brooks; Skjold, Brandy; Undreiu, Adriana; Loving, Cathleen C.; Gobert, Janice D.

    2010-01-01

    There are continuing educational and political debates about "inquiry" versus "direct" teaching of science. Traditional science instruction has been largely direct but in the US, recent national and state science education standards advocate inquiry throughout K-12 education. While inquiry-based instruction has the advantage of modelling aspects…

  16. Ground-based magnetometer arrays and geomagnetically induced currents in power grids: science and operations

    NASA Astrophysics Data System (ADS)

    Thomson, A. W.; Beggan, C.; Kelly, G.

    2012-12-01

    Space weather impacts on worldwide technological infrastructures are likely to be at their greatest between 2012 and 2015, during the peak and early descending phase of the current solar cycle. Examples of infrastructures at risk include power grids, pipelines, railways, communications, satellite operations, high latitude air travel and global navigation satellite systems. For example, severe magnetic storms in March 1989 and October 2003, near the peaks of recent solar cycles, were particularly significant in causing problems for a wide variety of technologies. Further back in time, severe storms in September 1859 and May 1921 are known to have been a problem for the more rudimentary technologies of the time. In this talk we will review how magnetic observatory data can best contribute to ongoing efforts to develop new space weather data products, particularly in aiding the management of electrical power transmission networks. Examples of existing and perhaps some suggestions for new data products and services will be given. Throughout, the need for near to real time data will be emphasised. We will also emphasise the importance of developing regional magnetometer networks and promoting magnetic data sharing to help turn research into operations. Developing research consortia, for example the European EURISGIC project (www.eurisgic.eu), in which magnetic and other data, as well as expertise, are pooled and shared, is also recommended and adds to our ability to monitor the dynamic state of magnetospheric and ionospheric currents. We will discuss how industry currently perceives the space weather hazard, using recent examples from the power industry, where the concerns are with the risk to high voltage transformers and the safe and uninterrupted distribution of electrical power. Industry measurements of geomagnetically induced currents (GIC) are also vital for the validation of scientific models of the flow of GIC in power systems. Examples of GIC data sources and …

  17. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    SciTech Connect

    Jablonowski, Christiane

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project …
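    As a sketch of the adaptive-refinement idea described above, the following flags cells for refinement wherever the relative vorticity of a toy 2-D velocity field exceeds a threshold, a generic criterion for tracking features such as tropical cyclones; the field, grid spacing, and threshold are invented, and Chombo's actual tagging logic is more elaborate.

```python
import numpy as np

def refinement_flags(u, v, dx, dy, vort_threshold=1.0e-5):
    """Flag cells whose relative vorticity |dv/dx - du/dy| exceeds an (arbitrary) threshold."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    return np.abs(dvdx - dudy) > vort_threshold

# Toy 2-D velocity field: a single vortex on a uniform 100 km grid (assumed spacing).
ny, nx, dx = 64, 64, 1.0e5
y, x = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
r2 = (x - nx / 2) ** 2 + (y - ny / 2) ** 2
u = -(y - ny / 2) * np.exp(-r2 / 50.0)   # m/s
v = (x - nx / 2) * np.exp(-r2 / 50.0)

flags = refinement_flags(u, v, dx, dx)
print(f"cells flagged for refinement: {int(flags.sum())} of {flags.size}")
```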

  18. Experimental Comparison of Inquiry and Direct Instruction in Science

    ERIC Educational Resources Information Center

    Cobern, William; Schuster, David; Adams, Betty

    2010-01-01

    It is evident that "experientially-based" instruction and "active student engagement" are advantageous for effective science learning. However, "hands-on" and "minds-on" aspects can occur in both inquiry and direct science instruction, and convincing comparative evidence for the superiority of either mode remains rare. Thus, the pertinent question…

  19. Experimental Investigation of the Behavior of Sub-Grid Scale Motions in Turbulent Shear Flow

    NASA Technical Reports Server (NTRS)

    Cantwell, Brian

    1992-01-01

    Experiments have been carried out on a vertical jet of helium issuing into a co-flow of air at a fixed exit velocity ratio of 2.0. At all the experimental conditions studied, the flow exhibits a strong self excited periodicity. The natural frequency behavior of the jet, the underlying fine-scale flow structure, and the transition to turbulence have been studied over a wide range of flow conditions. The experiments were conducted in a variable pressure facility which made it possible to vary the Reynolds number and Richardson number independently. A stroboscopic schlieren system was used for flow visualization and single-component Laser Doppler Anemometry was used to measure the axial component of velocity. The flow exhibits several interesting features. The presence of co-flow eliminates the random meandering typical of buoyant plumes in a quiescent environment and the periodicity of the helium jet under high Richardson number conditions is striking. Under these conditions transition to turbulence consists of a rapid but highly structured and repeatable breakdown and intermingling of jet and freestream fluid. At Ri = 1.6 the three-dimensional structure of the flow is seen to repeat from cycle to cycle. The point of transition moves closer to the jet exit as either the Reynolds number or the Richardson number increases. The wavelength of the longitudinal instability increases with Richardson number. At low Richardson numbers, the natural frequency scales on an inertial time scale. At high Richardson number the natural frequency scales on a buoyancy time scale. The transition from one flow regime to another occurs over a narrow range of Richardson numbers from 0.7 to 1. A buoyancy Strouhal number is used to correlate the high Richardson number frequency behavior.
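    The inertial versus buoyancy scaling discussed above can be made concrete with standard textbook definitions of the Richardson number and the corresponding Strouhal numbers; the densities, jet diameter, velocities, and frequency below are illustrative values only, and the paper's exact definitions may differ.

```python
import math

# Illustrative values (not taken from the experiments).
rho_jet, rho_coflow = 0.166, 1.20     # helium and air densities [kg/m^3], approximate
D = 0.01                              # jet exit diameter [m] (assumed)
U_jet, U_coflow = 2.0, 1.0            # exit velocities [m/s], velocity ratio 2.0
f = 20.0                              # observed self-excited frequency [Hz] (assumed)
g = 9.81

# Richardson number: buoyancy relative to inertia (one common convention).
Ri = g * (rho_coflow - rho_jet) * D / (rho_jet * (U_jet - U_coflow) ** 2)

# Inertial and buoyancy time scales, and the corresponding Strouhal numbers.
t_inertial = D / (U_jet - U_coflow)
t_buoyancy = math.sqrt(rho_jet * D / (g * (rho_coflow - rho_jet)))
print(f"Ri = {Ri:.2f}")
print(f"St (inertial scaling) = {f * t_inertial:.2f}")
print(f"St (buoyancy scaling) = {f * t_buoyancy:.2f}")
```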

  20. Vanguard: A New Science Mission For Experimental Astrobiology

    NASA Astrophysics Data System (ADS)

    Ellery, A.; Wynn-Williams, D.; Edwards, H.; Dickensheets, D.; Welch, C.; Curley, A.

    As an alternative to technically and financially problemat ic sample return missions, a rover-mounted laser Raman spectrometer sensitive to biomolecules and their mineral substrata is a promising alternative in the search for evidence of former life on Mars. We presented a new remote in situ analysis package being designed for experimental astrobiology on terrestrial-type planetary surfaces. The science is based on the hypothesis that if life arose on Mars, the selective pressure of solar radiation would have led to the evolution of pigmented systems to harness the energy of sunlight and to protect cells from concurrent UV stress. Microbial communities would have therefore become stratified by the light gradient, and our remote system would penetrate the near-subsurface profile in a vertical transect of horizontal strata in ancient sediments (such as palaeolake beds). The system will include an extensive array of robotic support to translocate and deploy a Raman spectrometer detectors beneath the surface of Mars ­ it will comprise of a base station lander to support communications, a robotic micro-rover to permit well- separated triplicate profiles made by three ground-penetrating moles mounted in a vertical configuration. Each mole will deploy a tether carrying fibre optic cables coupling the Raman spectrometer onboard the rover and the side-scanning sensor head on the mole. The complete system has been named Vanguard, and it represents a close collaboration between a space robotics engineer (Ellery), an astrobiologist (Wynn-Williams), a molecular spectroscopist (Edwards), an opto-electronic technologist (Dickensheets), a spacecraft engineer (Welch) and a robotic vision specialist (Curley). The autonomy requirement for the Vanguard instrument requires that significant scientific competence is imparted to the instrument through an expert system to ensure that quick-look analysis is performed onboard in real-time as the mole penetrates beneath the surface. Onboard

  1. FermiGrid

    SciTech Connect

    Yocum, D.R.; Berman, E.; Canal, P.; Chadwick, K.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Sharma, N.; Timm, S.; /Fermilab

    2007-05-01

    As one of the founding members of the Open Science Grid Consortium (OSG), Fermilab enables coherent access to its production resources through the Grid infrastructure system called FermiGrid. This system successfully provides for centrally managed grid services, opportunistic resource access, development of OSG Interfaces for Fermilab, and an interface to the Fermilab dCache system. FermiGrid supports virtual organizations (VOs) including high energy physics experiments (USCMS, MINOS, D0, CDF, ILC), astrophysics experiments (SDSS, Auger, DES), biology experiments (GADU, Nanohub) and educational activities.

  2. Grid Architecture 2

    SciTech Connect

    Taft, Jeffrey D.

    2016-01-01

    The report describes work done on Grid Architecture under the auspices of the Department of Energy Office of Electricity Delivery and Energy Reliability in 2015. As described in the first Grid Architecture report, the primary purpose of this work is to provide stakeholder insight about grid issues so as to enable superior decision making on their part. Doing this requires the creation of various work products, including often complex diagrams, analyses, and explanations. This report provides architectural insights into several important grid topics and also describes work done to advance the science of Grid Architecture.

  3. AGILE/GRID Science Alert Monitoring System: The Workflow and the Crab Flare Case

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Conforti, V.; Parmiggiani, N.

    2013-10-01

    During the first five years of the AGILE mission we have observed many gamma-ray transients of Galactic and extragalactic origin. A fast reaction to unexpected transient events is a crucial part of the AGILE monitoring program, because the follow-up of astrophysical transients is a key point for this space mission. We present the workflow and the software developed by the AGILE Team to perform the automatic analysis for the detection of gamma-ray transients. In addition, an App for iPhone will be released enabling the Team to access the monitoring system through mobile phones. In September 2010 the science alert monitoring system presented in this paper recorded a transient phenomenon from the Crab Nebula, generating an automated alert sent via email and SMS two hours after the end of an AGILE satellite orbit, i.e. two hours after the Crab flare itself; for this discovery AGILE won the 2012 Bruno Rossi prize. The alert system is designed for maximum reaction speed, and in this, as in many other cases, AGILE has demonstrated that the reaction speed of the monitoring system is crucial for the scientific return of the mission.
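    A toy illustration of the kind of per-orbit check such an automated alert system performs (this is not the AGILE Team's actual pipeline or statistics): compute the significance of an on-source excess over background after each orbit and trigger a notification when it crosses a threshold. The Li & Ma (1983) formula is a standard choice for counting statistics, and all counts below are invented.

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983) significance of an on-source excess over background."""
    if n_on == 0 or n_off == 0:
        return 0.0
    term_on = n_on * math.log((1 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * math.log((1 + alpha) * n_off / (n_on + n_off))
    return math.sqrt(2.0 * (term_on + term_off))

def check_orbit(source, n_on, n_off, alpha=0.2, threshold_sigma=5.0):
    """After each orbit's data are processed, alert if the excess is significant."""
    sigma = li_ma_significance(n_on, n_off, alpha)
    if sigma >= threshold_sigma:
        # A real system would send e-mail/SMS here; we only print.
        print(f"ALERT: {source} detected at {sigma:.1f} sigma")
    return sigma

check_orbit("Crab region (toy counts)", n_on=120, n_off=300)
```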

  4. The DOE SunShot Initiative: Science and Technology to enable Solar Electricity at Grid Parity

    NASA Astrophysics Data System (ADS)

    Ramesh, Ramamoorthy

    2012-02-01

    The SunShot Initiative's mission is to develop solar energy technologies through a collaborative national push to make solar Photovoltaic (PV) and Concentrated Solar Power (CSP) energy technologies cost-competitive with fossil-fuel-based energy by reducing the cost of solar energy systems by approximately 75 percent before 2020. Reducing the total installed cost for utility-scale solar electricity to roughly 6 cents per kilowatt hour (about $1 per watt installed) without subsidies will result in rapid, large-scale adoption of solar electricity across the United States and the world. Achieving this goal will require significant reductions and technological innovations in all PV system components, namely modules, power electronics, and balance of systems (BOS), which includes all other components and costs required for a fully installed system, including permitting and inspection costs. This investment will re-establish American technological and market leadership, improve the nation's energy security, strengthen U.S. economic competitiveness and catalyze domestic economic growth in the global clean energy race. SunShot is a cooperative program across DOE, involving the Office of Science, the Office of Energy Efficiency and Renewable Energy, and ARPA-E.
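    A back-of-the-envelope check of how an installed cost of roughly $1 per watt maps to electricity around 6 cents per kilowatt hour; the capacity factor, lifetime, discount rate, and O&M figure below are illustrative assumptions, not SunShot parameters.

```python
# All parameters below are illustrative assumptions, not SunShot figures.
capex_per_w = 1.00          # installed system cost [$ per watt]
om_per_w_yr = 0.01          # annual O&M [$ per watt per year]
capacity_factor = 0.20      # average output / nameplate
lifetime_years = 30
discount_rate = 0.08        # real discount rate

# Capital recovery factor annualizes the upfront cost over the system lifetime.
crf = discount_rate * (1 + discount_rate) ** lifetime_years / \
      ((1 + discount_rate) ** lifetime_years - 1)

annual_cost = capex_per_w * crf + om_per_w_yr      # $ per watt per year
annual_kwh = capacity_factor * 8760 / 1000.0       # kWh generated per watt per year
lcoe_cents = 100.0 * annual_cost / annual_kwh
print(f"LCOE under these assumptions: {lcoe_cents:.1f} cents/kWh")
```

    Under these assumptions the result lands near the ~6 cents/kWh figure quoted above; different financing or capacity-factor assumptions shift it noticeably.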

  5. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  6. Learning Political Science with Prediction Markets: An Experimental Study

    ERIC Educational Resources Information Center

    Ellis, Cali Mortenson; Sami, Rahul

    2012-01-01

    Prediction markets are designed to aggregate the information of many individuals to forecast future events. These markets provide participants with an incentive to seek information and a forum for interaction, making markets a promising tool to motivate student learning. We carried out a quasi-experiment in an introductory political science class…

  7. Early Adolescence: Using Consumer Science to Develop Experimental Techniques.

    ERIC Educational Resources Information Center

    Padilla, Michael

    1981-01-01

    Describes several consumer science activities useful for introducing process skills for the middle/junior high school student. Activities described include testing laundry detergent effectiveness for stain removal, comparison of quantities in fast foods, and various activities concerning tests of product claims. (DS)

  8. Geometric and Applied Optics, Science (Experimental): 5318.04.

    ERIC Educational Resources Information Center

    Sanderson, Robert C.

    This unit of instruction presents a laboratory-oriented course which relates the sources and behaviors of light to man's control and uses of light. Successful completion of Algebra I and Plane Geometry is strongly recommended as an indicator of likely success. The course is recommended if the student plans further studies in science, optical technology, or…

  9. Experimenter Confirmation Bias and the Correction of Science Misconceptions

    ERIC Educational Resources Information Center

    Allen, Michael; Coole, Hilary

    2012-01-01

    This paper describes a randomised educational experiment (n = 47) that examined two different teaching methods and compared their effectiveness at correcting one science misconception using a sample of trainee primary school teachers. The treatment was designed to promote engagement with the scientific concept by eliciting emotional responses from…

  10. Accounting for reciprocal host-microbiome interactions in experimental science.

    PubMed

    Stappenbeck, Thaddeus S; Virgin, Herbert W

    2016-06-08

    Mammals are defined by their metagenome, a combination of host and microbiome genes. This knowledge presents opportunities to further basic biology with translation to human diseases. However, the now-documented influence of the metagenome on experimental results and the reproducibility of in vivo mammalian models present new challenges. Here we provide the scientific basis for calling on all investigators, editors and funding agencies to embrace changes that will enhance reproducible and interpretable experiments by accounting for metagenomic effects. Implementation of new reporting and experimental design principles will improve experimental work, speed discovery and translation, and properly use substantial investments in biomedical research.

  11. The First Language in Science Class: A Quasi-Experimental Study in Late French Immersion

    ERIC Educational Resources Information Center

    Turnbull, Miles; Cormier, Marianne; Bourque, Jimmy

    2011-01-01

    This article reports analysis of data collected from a quasi-experimental study in 2 Canadian late French immersion science classes. We examine if, how, and when the first language (L1) is used when students in the first years of their second language learning talk about complex science concepts. We compare differences in groups following a…

  12. Physics design of a 100 keV acceleration grid system for the diagnostic neutral beam for international tokamak experimental reactor.

    PubMed

    Singh, M J; De Esch, H P L

    2010-01-01

    This paper describes the physics design of a 100 keV, 60 A H(-) accelerator for the diagnostic neutral beam (DNB) for the international tokamak experimental reactor (ITER). The accelerator is a three-grid system comprising 1280 apertures, arranged in 16 beam groups of 80 apertures each. Several computer codes have been used to optimize the design, which follows the same philosophy as the ITER Design Description Document (DDD) 5.3 and the 1 MeV heating and current drive beam line [R. Hemsworth, H. Decamps, J. Graceffa, B. Schunke, M. Tanaka, M. Dremel, A. Tanga, H. P. L. De Esch, F. Geli, J. Milnes, T. Inoue, D. Marcuzzi, P. Sonato, and P. Zaccaria, Nucl. Fusion 49, 045006 (2009)]. The aperture shapes, intergrid distances, and the extractor voltage have been optimized to minimize the beamlet divergence. To suppress the acceleration of coextracted electrons, permanent magnets have been incorporated in the extraction grid, downstream of the cooling water channels. The electron power loads on the extractor and the grounded grids have been calculated assuming 1 coextracted electron per ion. The beamlet divergence is calculated to be 4 mrad. At present the design of the filter field of the RF-based ion sources for ITER is not fixed; therefore, a few configurations have been considered. Their effect on the transmission of the electrons and beams through the accelerator has been studied. The OPERA-3D code has been used to estimate the aperture offset steering constant of the grounded grid and the extraction grid, the space charge interaction between the beamlets, and the kerb design required to compensate for this interaction. All beamlets in the DNB must be focused to a single point in the duct, 20.665 m from the grounded grid, and the required geometrical aimings and aperture offsets have been calculated.
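    The geometric aiming described above can be sketched as follows: a beamlet at radius r from the beam axis must be steered by atan(r/L) toward the focal point L = 20.665 m downstream, and with an aperture-offset steering constant k the required offset is that angle divided by k. The steering constant and beamlet radii below are hypothetical, whereas the paper obtains its values from the OPERA-3D code.

```python
import math

L_focus = 20.665      # grounded grid to focal point [m], from the abstract
k_steer = 5.0         # steering constant [mrad of deflection per mm of aperture offset] (assumed)

def aiming(r_beamlet):
    """Aiming angle and grounded-grid aperture offset for a beamlet at radius r [m]."""
    theta_mrad = math.atan(r_beamlet / L_focus) * 1.0e3
    offset_mm = theta_mrad / k_steer
    return theta_mrad, offset_mm

for r in (0.05, 0.15, 0.25):          # hypothetical beamlet radii [m]
    theta, offset = aiming(r)
    print(f"r = {r:.2f} m -> aiming angle {theta:5.2f} mrad, aperture offset {offset:.2f} mm")
```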

  13. Life Science Research and Drug Discovery at the Turn of the 21st Century: The Experience of SwissBioGrid

    PubMed Central

    den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph

    2009-01-01

    Background: It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling “in-silico” the processes observed “in-vitro.” The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. Results: SwissBioGrid was established to provide computational support to two pilot projects: one for proteomics data analysis, and the other for high-throughput molecular docking (“virtual screening”) to find new drugs for neglected diseases (specifically, for dengue fever). The proteomics project was an example of a large-scale data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the …

  14. Improving plant bioaccumulation science through consistent reporting of experimental data.

    PubMed

    Fantke, Peter; Arnot, Jon A; Doucette, William J

    2016-10-01

    Experimental data and models for plant bioaccumulation of organic contaminants play a crucial role for assessing the potential human and ecological risks associated with chemical use. Plants are receptor organisms and direct or indirect vectors for chemical exposures to all other organisms. As new experimental data are generated they are used to improve our understanding of plant-chemical interactions that in turn allows for the development of better scientific knowledge and conceptual and predictive models. The interrelationship between experimental data and model development is an ongoing, never-ending process needed to advance our ability to provide reliable quality information that can be used in various contexts including regulatory risk assessment. However, relatively few standard experimental protocols for generating plant bioaccumulation data are currently available and because of inconsistent data collection and reporting requirements, the information generated is often less useful than it could be for direct applications in chemical assessments and for model development and refinement. We review existing testing guidelines, common data reporting practices, and provide recommendations for revising testing guidelines and reporting requirements to improve bioaccumulation knowledge and models. This analysis provides a list of experimental parameters that will help to develop high quality datasets and support modeling tools for assessing bioaccumulation of organic chemicals in plants and ultimately addressing uncertainty in ecological and human health risk assessments.

  16. An Experimental Evaluation of the Effects of ESCP and General Science on the Development of Interdisciplinary Science Concepts by Ninth Grade Students.

    ERIC Educational Resources Information Center

    Coleman, Esther Montague

    This study was an experimental evaluation of achievement in understanding interdisciplinary science concepts by ninth grade students enrolled in two different integrated science courses. The experimental group used "Investigating the Earth", the textbook/laboratory program, developed by the Earth Science Curriculum Project (ESCP) staff. The…

  17. Experimental study of an active grid-generated shearless mixing layer and comparisons with large-eddy simulation

    NASA Astrophysics Data System (ADS)

    Kang, Hyung Suk; Meneveau, Charles

    2008-12-01

    A shearless mixing layer characterized by interactions between two regions with different turbulence intensities but without mean shear is investigated experimentally in a wind tunnel. Reynolds numbers higher than those of prior studies [B. Gilbert, "Diffusion mixing in grid turbulence without mean shear," J. Fluid Mech. 100, 349 (1980); S. Veeravalli and Z. Warhaft, "The shearless turbulent mixing layer," J. Fluid Mech. 207, 191 (1989); B. Knaepen, O. Debliquy, and D. Carati, "Direct numerical simulation and large-eddy simulation of a shear-free mixing layer," J. Fluid Mech. 514, 153 (2004); D. Tordella and M. Iovieno, "Numerical experiments on the intermediate asymptotics of shear-free turbulent transport and diffusion," J. Fluid Mech. 549, 429 (2006); D. A. Briggs, J. H. Ferziger, J. R. Koseff, and S. G. Monismith, "Entrainment in a shear-free turbulent mixing layer," J. Fluid Mech. 310, 215 (1996)] are achieved by using an active grid with rotating winglets on one-half of its cross section. Stationary flow-conditioning fine meshes are used to avoid mean velocity gradients. Measurements are performed at five different downstream wind-tunnel locations using an X-type hot-wire probe and a stereoscopic particle image velocimetry system. The Reynolds numbers based on the Taylor microscale in the high- and low-kinetic energy regions are 170 and 88, respectively. The energy and integral length-scale ratios between the two regions are 4.27 and 1.73, respectively. The inlet turbulence in the upper and lower portions of the shearless mixing layer is not fully isotropic, with the streamwise velocity fluctuations being between 6% and 13% higher than the cross-stream ones. Fundamental statistical properties of the flow are documented and analyzed at various scales using band-pass box-filtered velocities. Downstream evolution of variance and half-width of the mixing layer, skewness and flatness factors, as well as the statistics of two-point velocity increments at various
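    A small sketch of the scale-by-scale statistics mentioned above: box-filter a one-dimensional velocity signal and compute the skewness and flatness of its two-point increments. The signal, filter width, and separation are synthetic and stand in for the measured hot-wire and PIV data.

```python
import numpy as np

def box_filter(u, width):
    """Top-hat (box) filter of integer width applied along a 1-D signal."""
    kernel = np.ones(width) / width
    return np.convolve(u, kernel, mode="same")

def increment_stats(u, r):
    """Skewness and flatness of the two-point velocity increment du(r) = u(x+r) - u(x)."""
    du = u[r:] - u[:-r]
    du = du - du.mean()
    m2, m3, m4 = (du**2).mean(), (du**3).mean(), (du**4).mean()
    return m3 / m2**1.5, m4 / m2**2

rng = np.random.default_rng(1)
u = np.cumsum(rng.standard_normal(100_000)) * 1.0e-2   # toy correlated velocity signal [m/s]
u_filtered = box_filter(u, width=33)
skew, flat = increment_stats(u_filtered, r=16)
print(f"skewness = {skew:.2f}, flatness = {flat:.2f}")
```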

  18. An 11-year global gridded aerosol optical thickness reanalysis (v1.0) for atmospheric and climate sciences

    NASA Astrophysics Data System (ADS)

    Lynch, Peng; Reid, Jeffrey S.; Westphal, Douglas L.; Zhang, Jianglong; Hogan, Timothy F.; Hyer, Edward J.; Curtis, Cynthia A.; Hegg, Dean A.; Shi, Yingxi; Campbell, James R.; Rubin, Juli I.; Sessions, Walter R.; Turk, F. Joseph; Walker, Annette L.

    2016-04-01

    While stand-alone satellite and model aerosol products see wide utilization, there is a significant need in numerous atmospheric and climate applications for a fused product on a regular grid. Aerosol data assimilation is an operational reality at numerous centers, and like meteorological reanalyses, aerosol reanalyses will see significant use in the near future. Here we present a standardized 2003-2013 global 1 × 1° and 6-hourly modal aerosol optical thickness (AOT) reanalysis product. This data set can be applied to basic and applied Earth system science studies of significant aerosol events, aerosol impacts on numerical weather prediction, and electro-optical propagation and sensor performance, among other uses. This paper describes the science of how to develop and score an aerosol reanalysis product. This reanalysis utilizes a modified Navy Aerosol Analysis and Prediction System (NAAPS) at its core and assimilates quality-controlled retrievals of AOT from the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua and the Multi-angle Imaging SpectroRadiometer (MISR) on Terra. The aerosol source functions, including dust and smoke, were regionally tuned to obtain the best match between the model fine- and coarse-mode AOTs and the Aerosol Robotic Network (AERONET) AOTs. Other model processes, including deposition, were tuned to minimize the AOT difference between the model and satellite AOT. Aerosol wet deposition in the tropics is driven with satellite-retrieved precipitation, rather than the model field. The final reanalyzed fine- and coarse-mode AOT at 550 nm is shown to have good agreement with AERONET observations, with global mean root mean square error around 0.1 for both fine- and coarse-mode AOTs. This paper includes a discussion of issues particular to aerosol reanalyses that make them distinct from standard meteorological reanalyses, considerations for extending such a reanalysis outside of the NASA A-Train era, and examples of how …
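    The scoring step mentioned above (global RMSE of about 0.1 against AERONET) can be sketched as a simple collocation of a gridded AOT field with point observations; the grid values and station list below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
lats = np.arange(-89.5, 90.0, 1.0)                 # 1-degree grid cell centres
lons = np.arange(-179.5, 180.0, 1.0)
aot_grid = rng.uniform(0.02, 0.6, size=(lats.size, lons.size))   # synthetic 550 nm AOT field

# Hypothetical AERONET-style observations: (latitude, longitude, observed AOT).
obs = np.array([[ 38.5,  -9.1, 0.12],
                [  1.3, 103.8, 0.45],
                [-15.9,  -5.7, 0.20]])

def nearest_cell(lat, lon):
    """Indices of the 1x1 degree cell nearest to the station."""
    return np.abs(lats - lat).argmin(), np.abs(lons - lon).argmin()

model_at_sites = np.array([aot_grid[nearest_cell(la, lo)] for la, lo, _ in obs])
rmse = np.sqrt(np.mean((model_at_sites - obs[:, 2]) ** 2))
print(f"RMSE versus the toy station set: {rmse:.3f}")
```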

  19. The Beliefs and Behaviors of Pupils in an Experimental School: The Science Lab.

    ERIC Educational Resources Information Center

    Lancy, David F.

    This booklet, the second in a series, reports on the results of a year-long research project conducted in an experimental school associated with the Learning Research and Development Center, University of Pittsburgh. Specifically, this is a report of findings pertaining to one major setting in the experimental school, the science lab. The science…

  20. Is Physicality an Important Aspect of Learning through Science Experimentation among Kindergarten Students?

    ERIC Educational Resources Information Center

    Zacharia, Zacharias C.; Loizou, Eleni; Papaevripidou, Marios

    2012-01-01

    The purpose of this study was to investigate whether physicality (actual and active touch of concrete material), as such, is a necessity for science experimentation learning at the kindergarten level. We compared the effects of student experimentation with Physical Manipulatives (PM) and Virtual Manipulatives (VM) on kindergarten students'…

  1. Challenges facing production grids

    SciTech Connect

    Pordes, Ruth; /Fermilab

    2007-06-01

    Today's global communities of users expect quality of service from distributed Grid systems equivalent to that of their local data centers. This must be coupled to ubiquitous access to the ensemble of processing and storage resources across multiple Grid infrastructures. We are still facing significant challenges in meeting these expectations, especially in the underlying security, a sustainable and successful economic model, and smoothing the boundaries between administrative and technical domains. Using the Open Science Grid as an example, I examine the status and challenges of Grids operating in production today.

  2. Experimenter Confirmation Bias and the Correction of Science Misconceptions

    NASA Astrophysics Data System (ADS)

    Allen, Michael; Coole, Hilary

    2012-06-01

    This paper describes a randomised educational experiment (n = 47) that examined two different teaching methods and compared their effectiveness at correcting one science misconception using a sample of trainee primary school teachers. The treatment was designed to promote engagement with the scientific concept by eliciting emotional responses from learners that were triggered by their own confirmation biases. The treatment group showed superior learning gains to control at post-test immediately after the lesson, although benefits had dissipated after 6 weeks. Findings are discussed with reference to the conceptual change paradigm and to the importance of feeling emotion during a learning experience, having implications for the teaching of pedagogies to adults that have been previously shown to be successful with children.

  3. Students' Epistemologies about Experimental Physics: Validating the Colorado Learning Attitudes about Science Survey for Experimental Physics

    ERIC Educational Resources Information Center

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-01-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder…

  4. A New Virtual and Remote Experimental Environment for Teaching and Learning Science

    NASA Astrophysics Data System (ADS)

    Lustigova, Zdena; Lustig, Frantisek

    This paper describes how a scientifically exact, problem-solving-oriented remote and virtual experimental environment might help to build a new strategy for science education. The main features are: remote observation and control of real-world phenomena, their processing and evaluation, and verification of hypotheses combined with the development of critical thinking, supported by sophisticated tools for searching, classifying and storing relevant information and by a collaborative environment supporting argumentative writing and teamwork, public presentations and defense of achieved results, all either in real presence, in telepresence or in a combination of both. Only then can real understanding of generalized science laws and their consequences be developed. This science learning and teaching environment (called ROL, Remote and Open Laboratory) has been developed and used at Charles University in Prague since 1996; it has been offered to science students in both formal and informal learning and, since 2003, also to science teachers within their professional development studies.

  5. Considerations for Life Science experimentation on the Space Shuttle.

    PubMed

    Souza, K A; Davies, P; Rossberg Walker, K

    1992-10-01

    The conduct of Life Science experiments aboard the Shuttle Spacelab presents unaccustomed challenges to scientists. Not only is one confronted with the challenge of conducting an experiment in the unique microgravity environment of an orbiting spacecraft, but there are also the challenges of conducting experiments remotely, using equipment, techniques, chemicals, and materials that may differ from those typically used in one's own laboratory. Then there is the question of "controls." How does one study the effects of altered gravitational fields on biological systems and control for other variables like vibration, acceleration, noise, temperature, humidity, and the logistics of specimen transport? Typically, the scientist new to space research has neither considered all of these potential problems nor the data at hand with which to tackle them. This paper will explore some of these issues and provide pertinent data from recent Space Shuttle flights that will assist the new as well as the experienced scientist in dealing with the challenges of conducting research under spaceflight conditions.

  6. Considerations for Life Science experimentation on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Souza, K. A.; Davies, P.; Rossberg Walker, K.

    1992-01-01

    The conduct of Life Science experiments aboard the Shuttle Spacelab presents unaccustomed challenges to scientists. Not only is one confronted with the challenge of conducting an experiment in the unique microgravity environment of an orbiting spacecraft, but there are also the challenges of conducting experiments remotely, using equipment, techniques, chemicals, and materials that may differ from those typically used in one's own laboratory. Then there is the question of "controls." How does one study the effects of altered gravitational fields on biological systems and control for other variables like vibration, acceleration, noise, temperature, humidity, and the logistics of specimen transport? Typically, the scientist new to space research has neither considered all of these potential problems nor the data at hand with which to tackle them. This paper will explore some of these issues and provide pertinent data from recent Space Shuttle flights that will assist the new as well as the experienced scientist in dealing with the challenges of conducting research under spaceflight conditions.

  7. Social Science and Neuroscience beyond Interdisciplinarity: Experimental Entanglements

    PubMed Central

    Callard, Felicity

    2015-01-01

    This article is an account of the dynamics of interaction across the social sciences and neurosciences. Against an arid rhetoric of ‘interdisciplinarity’, it calls for a more expansive imaginary of what experiment – as practice and ethos – might offer in this space. Arguing that opportunities for collaboration between social scientists and neuroscientists need to be taken seriously, the article situates itself against existing conceptualizations of these dynamics, grouping them under three rubrics: ‘critique’, ‘ebullience’ and ‘interaction’. Despite their differences, each insists on a distinction between sociocultural and neurobiological knowledge, or does not show how a more entangled field might be realized. The article links this absence to the ‘regime of the inter-’, an ethic of interdisciplinarity that guides interaction between disciplines on the understanding of their pre-existing separateness. The argument of the paper is thus twofold: (1) that, contra the ‘regime of the inter-’, it is no longer practicable to maintain a hygienic separation between sociocultural webs and neurobiological architecture; (2) that the cognitive neuroscientific experiment, as a space of epistemological and ontological excess, offers an opportunity to researchers, from all disciplines, to explore and register this realization. PMID:25972621

  8. Science and society: different bioethical approaches towards animal experimentation.

    PubMed

    Brom, Frans W A

    2002-01-01

    … respect their integrity. By weighing these prima facie duties, the moral problem of animal experimentation consists in finding which duty actually has to be considered the decisive one. It will be argued that these three views, even though they all justify animal experimentation to some extent, will do so in practice under different conditions. Many current conflicts regarding the use of animals for research may be better understood in light of the conflict between the three bioethical perspectives provided by these views.

  9. The Earth System Grid Federation (ESGF): Climate Science Infrastructure for Large-scale Data Management and Dissemination

    NASA Astrophysics Data System (ADS)

    Williams, D. N.

    2015-12-01

    Progress in understanding and predicting climate change requires advanced tools to securely store, manage, access, process, analyze, and visualize enormous and distributed data sets. Only then can climate researchers understand the effects of climate change across all scales and use this information to inform policy decisions. With the advent of major international climate modeling intercomparisons, a need emerged within the climate-change research community to develop efficient, community-based tools to obtain relevant meteorological and other observational data, develop custom computational models, and export analysis tools for climate-change simulations. While many nascent efforts to fill these gaps appeared, they were not integrated and therefore did not benefit from collaborative development. Sharing huge data sets was difficult, and the lack of data standards prevented the merger of output data from different modeling groups. Thus began one of the largest-ever collaborative data efforts in climate science, resulting in the Earth System Grid Federation (ESGF), which is now used to disseminate model, observational, and reanalysis data for research assessed by the Intergovernmental Panel on Climate Change (IPCC). Today, ESGF is an open-source petabyte-level data storage and dissemination operational code-base that manages secure resources essential for climate change study. It is designed to remain robust even as data volumes grow exponentially. The internationally distributed, peer-to-peer ESGF "data cloud" archive represents the culmination of an effort that began in the late 1990s. ESGF portals are gateways to scientific data collections hosted at sites around the globe that allow the user to register and potentially access the entire ESGF network of data and services. The growing international interest in ESGF development efforts has attracted many others who want to make their data more widely available and easy to use. For example, the World Climate

  10. Safe Grid

    NASA Technical Reports Server (NTRS)

    Chow, Edward T.; Stewart, Helen; Korsmeyer, David (Technical Monitor)

    2003-01-01

    The biggest users of GRID technologies came from the science and technology communities, which consist of government, industry and academia (national and international). The NASA GRID is moving into a higher technology readiness level (TRL) today, and as a joint effort among these leaders within government, academia, and industry, the NASA GRID plans to extend availability to enable scientists and engineers across these geographical boundaries to collaborate in solving important problems facing the world in the 21st century. In order to enable NASA programs and missions to use IPG resources for program and mission design, the IPG capabilities need to be accessible from inside the NASA center networks. However, because different NASA centers maintain different security domains, GRID penetration across different firewalls is a concern for center security staff. This is the reason why some IPG resources have been separated from the NASA center network. Also, because of center network security and ITAR concerns, the NASA IPG resource owner may not have full control over who can access the resources remotely from outside the NASA center. In order to obtain organizational approval for secured remote access, the IPG infrastructure needs to be adapted to work with the NASA business process. Improvements need to be made before the IPG can be used for NASA program and mission development. The Secured Advanced Federated Environment (SAFE) technology is designed to provide federated security across NASA center and NASA partner security domains. Instead of one giant center firewall, which can be difficult to modify for different GRID applications, the SAFE "micro security domain" provides a large number of professionally managed "micro firewalls" that allow NASA centers to accept remote IPG access without the worry of damaging other center resources. The SAFE policy-driven, capability-based federated security mechanism can enable jointly organization- and resource-owner-approved remote …

  11. "Exploratory experimentation" as a probe into the relation between historiography and philosophy of science.

    PubMed

    Schickore, Jutta

    2016-02-01

    This essay utilizes the concept "exploratory experimentation" as a probe into the relation between historiography and philosophy of science. The essay traces the emergence of the historiographical concept "exploratory experimentation" in the late 1990s. The reconstruction of the early discussions about exploratory experimentation shows that the introduction of the concept had unintended consequences: Initially designed to debunk philosophical ideas about theory testing, the concept "exploratory experimentation" quickly exposed the poverty of our conceptual tools for the analysis of experimental practice. Looking back at a number of detailed analyses of experimental research, we can now appreciate that the concept of exploratory experimentation is too vague and too elusive to fill the desideratum whose existence it revealed.

  12. "Exploratory experimentation" as a probe into the relation between historiography and philosophy of science.

    PubMed

    Schickore, Jutta

    2016-02-01

    This essay utilizes the concept "exploratory experimentation" as a probe into the relation between historiography and philosophy of science. The essay traces the emergence of the historiographical concept "exploratory experimentation" in the late 1990s. The reconstruction of the early discussions about exploratory experimentation shows that the introduction of the concept had unintended consequences: Initially designed to debunk philosophical ideas about theory testing, the concept "exploratory experimentation" quickly exposed the poverty of our conceptual tools for the analysis of experimental practice. Looking back at a number of detailed analyses of experimental research, we can now appreciate that the concept of exploratory experimentation is too vague and too elusive to fill the desideratum whose existence it revealed. PMID:26774065

  13. Analysis and Experimental Verification of New Power Flow Control for Grid-Connected Inverter with LCL Filter in Microgrid

    PubMed Central

    Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang

    2014-01-01

    The microgrid is an effective way to integrate distributed energy resources into utility networks. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method. PMID:24672304
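    For context on what "steady-state power flow regulation" refers to, the following is the generic textbook relation for the active and reactive power exchanged through a predominantly inductive coupling impedance; it is not the paper's small-signal model, and the voltages, filter inductance, and power angle are illustrative assumptions.

```python
import math

V_inv, V_grid = 232.0, 230.0      # per-phase RMS voltage magnitudes [V] (assumed)
L_filter = 5.0e-3                 # effective coupling inductance of the LCL filter [H] (assumed)
X = 2 * math.pi * 50.0 * L_filter # coupling reactance at 50 Hz [ohm]
delta = math.radians(3.0)         # power angle between inverter and grid voltages (assumed)

P = V_inv * V_grid / X * math.sin(delta)                 # active power per phase [W]
Q = (V_inv**2 - V_inv * V_grid * math.cos(delta)) / X    # reactive power per phase [var]
print(f"P = {P:.0f} W per phase, Q = {Q:.0f} var per phase")
```

    A small-signal analysis of the kind the abstract describes would linearize relations like these around an operating point; the sketch only evaluates the steady-state operating point itself.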

  15. Characterizing the Experimental Procedure in Science Laboratories: A Preliminary Step towards Students Experimental Design

    ERIC Educational Resources Information Center

    Girault, Isabelle; d'Ham, Cedric; Ney, Muriel; Sanchez, Eric; Wajeman, Claire

    2012-01-01

    Many studies have stressed students' lack of understanding of experiments in laboratories. Some researchers suggest that if students design all or parts of an entire experiment, as part of an inquiry-based approach, it would overcome certain difficulties. It requires that a procedure be written for experimental design. The aim of this paper is to…

  16. Students' transition from an engineering model to a science model of experimentation

    NASA Astrophysics Data System (ADS)

    Schauble, Leona; Klopfer, Leopold E.; Raghavan, Kalyani

    This study investigates the hypothesis that when children are engaged in science experiments, the goal of which is to understand relations among causes and effects, they often use the engineering model of experimentation, characterized by the more familiar goal of manipulating variables to produce a desired outcome. Sixteen fifth- and sixth-graders worked on two experimentation problems consistent with the engineering and science models, respectively. The context in which these problems were framed was also varied, to encourage adoption of either an engineering or science model. Over six 40-min sessions, the group achieved significant increases in the percentages of inferences about variables that were both correct and valid. Improvement was greatest for those who began with the engineering problem and then went on to the science problem. The science model was associated with broader exploration, more selectiveness about evidence interpreted, and greater attention to establishing that some variables are not causal. The findings suggest that research on scientific inquiry processes should attend not only to the science content students are reasoning about, but also to their beliefs about the goals of inquiry.

  17. General Science, Ninth Grade: Theme III and Theme IV. Student Laboratory Manual. Experimental.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.

    This document is the student laboratory manual that was designed to accompany some of the experimental activities found in the teacher's guide to this general science course for ninth graders. It contains laboratory worksheets for lessons on such topics as: (1) soil; (2) hazardous waste; (3) wildlife refuges; (4) the water cycle; (5) water…

  18. Opening Possibilities in Experimental Science and Its History: Critical Explorations with Pendulums and Singing Tubes

    ERIC Educational Resources Information Center

    Cavicchi, Elizabeth

    2008-01-01

    A teacher and a college student explore experimental science and its history by reading historical texts, and responding with replications and experiments of their own. A curriculum of ever-widening possibilities evolves in their ongoing interactions with each other, history, and such materials as pendulums, flame, and resonant singing tubes.…

  19. Views of the STS-5 Science Press briefing with Student Experimenters

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Views of the STS-5 Science Press briefing with Student Experimenters. Photos include Michelle Issel of Wallingford, Connecticut, showing her student experiment dealing with the formation of crystals in a weightless environment (37862), and Aaron Gillette of Winter Haven, Florida, displaying his student experiment dealing with the growth of Porifera in zero gravity (37863).

  20. An Experimental Learning Resources Center and a New Curriculum in the Social Sciences.

    ERIC Educational Resources Information Center

    Leeb, David

    At Mercer County Community College (New Jersey) an experimental learning resources center and a new curriculum in the social sciences were developed with the primary objectives of: (1) keeping more minority-group students in school, (2) reducing their withdrawal rate, (3) developing assessment techniques accommodating inner-city populations, (4)…

  1. General Science, Ninth Grade: Theme I and Theme II. Student Laboratory Manual. Experimental.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.

    This ninth grade student manual was developed to be used in conjunction with some of the experimental science activities described in the teacher's guide. It contains laboratory worksheets for: (1) measurement; (2) basic energy concepts; (3) heat energy; (4) light; (5) sound; (6) electricity; and (7) present and future energy resources. Additional…

  2. Experimental Evaluations of Elementary Science Programs: A Best-Evidence Synthesis

    ERIC Educational Resources Information Center

    Slavin, Robert E.; Lake, Cynthia; Hanley, Pam; Thurston, Allen

    2014-01-01

    This article presents a systematic review of research on the achievement outcomes of all types of approaches to teaching science in elementary schools. Study inclusion criteria included use of randomized or matched control groups, a study duration of at least 4 weeks, and use of achievement measures independent of the experimental treatment. A…

  3. Mathematics and Experimental Sciences in the FRG-Upper Secondary Schools. Occasional Paper 40.

    ERIC Educational Resources Information Center

    Steiner, Hans-Georg

    The mathematics and experimental science courses in the programs of the upper secondary school in the Federal Republic of Germany (FRG) are discussed. The paper addresses: (1) the two "secondary levels" within the FRG school system, indicating that the Secondary I-Level (SI) comprises grades 5 through 9 or 10 while the Secondary II-Level (SII)…

  4. Factors Influencing Students' Choice(s) of Experimental Science Subjects within the International Baccalaureate Diploma Programme

    ERIC Educational Resources Information Center

    James, Kieran

    2007-01-01

    This article outlines a study conducted in Finland and Portugal into the reasons why International Baccalaureate (IB) Diploma Programme (DP) students choose particular Experimental Science (Group 4) subjects. Its findings suggest that interest, enjoyment, university course and career requirements have most influence on students' choices.…

  5. Experimental evaluation of prefiltering for 56 Gbaud DP-QPSK signal transmission in 75 GHz WDM grid

    NASA Astrophysics Data System (ADS)

    Borkowski, Robert; de Carvalho, Luis Henrique H.; Silva, Edson Porto da; Diniz, Júlio César M.; Zibar, Darko; de Oliveira, Júlio César R. F.; Tafur Monroy, Idelfonso

    2014-01-01

    We investigate optical prefiltering for 56 Gbaud (224 Gbit/s) electrical time-division multiplexed (ETDM) dual polarization (DP) quaternary phase shift keying (QPSK) transmission. Different transmitter-side optical filter shapes are tested and their bandwidths are varied. A comparison of the studied filter shapes shows an advantage for a pre-emphasis filter. Subsequently, we perform a fiber transmission of the 56 Gbaud DP-QPSK signal filtered with the 65 GHz pre-emphasis filter to fit the 75 GHz transmission grid. The bit error rate (BER) of the signal remains below the forward error correction (FEC) limit after 300 km of fiber propagation.
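
    The quoted line rate and grid fit can be sanity-checked from the modulation parameters alone; the short Python sketch below (framing overhead ignored, as a simplifying assumption) reproduces the 224 Gbit/s figure and the spectral efficiency implied by the 75 GHz slot.

```python
# Sanity check of the line rate and spectral efficiency quoted in the record.
# Assumes ideal DP-QPSK with no overhead accounting (an assumption, not stated there).

baud_rate_gbaud = 56       # symbol rate per polarization
bits_per_symbol = 2        # QPSK carries 2 bits per symbol
polarizations = 2          # dual polarization (DP)
grid_width_ghz = 75        # WDM grid slot

line_rate_gbit = baud_rate_gbaud * bits_per_symbol * polarizations
spectral_efficiency = line_rate_gbit / grid_width_ghz

print(f"line rate: {line_rate_gbit} Gbit/s")                        # 224 Gbit/s
print(f"spectral efficiency: {spectral_efficiency:.2f} bit/s/Hz")   # ~2.99
```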

  6. Grid in Geosciences

    NASA Astrophysics Data System (ADS)

    Petitdidier, Monique; Schwichtenberg, Horst

    2010-05-01

    The worldwide Earth science community covers a mosaic of disciplines and players such as academia, industry, national surveys, and international organizations. It provides a scientific basis for addressing societal issues, which requires that the Earth science community utilize massive amounts of data, both in real time and in delayed mode. These data are usually distributed among many different organizations and data centers. This need to exploit massive, distributed data holdings explains the Earth science community's interest in Grid technology, an interest also visible in the variety of applications ported and tools developed. In parallel with participation in EGEE (Enabling Grids for E-sciencE), other projects involving Earth science disciplines were or have been carried out as EGEE-related projects, such as CYCLOPS, SEEGrid, EELA2 and EUASIA, or outside it, e.g., in the framework of WGISS/CEOS. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity have been deployed successfully on the Grid. In order to fulfill risk-management requirements, several prototype applications combining OGC (Open Geospatial Consortium) components with Grid middleware have been deployed; examples include flood and Black Sea Catchment monitoring in hydrology, and fire monitoring. Meteorological, pollution and climate applications are based on meteorological models ported to the Grid, such as MM5 (Mesoscale Model), WRF (Weather Research and Forecasting), RAMS (Regional Atmospheric Modeling System) and CAM (Community Atmosphere Model). Seismological Grid applications are numerous in regions where earthquakes are frequent and local computing resources are limited; interfaces and gateways have therefore been developed to facilitate access to data and specific software and to avoid duplication of work. A portal has also been deployed to give academic users access to the commercial seismological software Geocluster. In this presentation examples of such applications will

  7. The semantic web and knowledge grids.

    PubMed

    Goble, Carole; Stevens, Robert; Bechhofer, Sean

    2005-01-01

    The Semantic Web and the Knowledge Grid are recently proposed technological solutions to distributed knowledge management. Early experimental applications from the Life Science community indicate that the approaches have promise and suggest that this community is an appropriate nursery for grounding, developing and hardening the current, rather immature, machinery needed to deliver on these technological visions, which thus far have been dominated by technological curiosity rather than application-led practicality and relevance. Further necessary developments in theory, infrastructure, tools, and content management should and could be steered opportunistically by the needs and applications of Life Science.

  8. The resisted rise of randomisation in experimental design: British agricultural science, c.1910-1930.

    PubMed

    Berry, Dominic

    2015-09-01

    The most conspicuous form of agricultural experiment is the field trial, and within the history of such trials, the arrival of the randomised control trial (RCT) is considered revolutionary. Originating with R.A. Fisher within British agricultural science in the 1920s and 1930s, the RCT has since become one of the most prodigiously used experimental techniques throughout the natural and social sciences. Philosophers of science have already scrutinised the epistemological uniqueness of RCTs, undermining their status as the 'gold standard' in experimental design. The present paper introduces a historical case study from the origins of the RCT, uncovering the initially cool reception given to this method by agricultural scientists at the University of Cambridge and the (Cambridge based) National Institute of Agricultural Botany. Rather than giving further attention to the RCT, the paper focuses instead on a competitor method-the half-drill strip-which both predated the RCT and remained in wide use for at least a decade beyond the latter's arrival. In telling this history, John Pickstone's Ways of Knowing is adopted, as the most flexible and productive way to write the history of science, particularly when sciences and scientists have to work across a number of different kinds of place. It is shown that those who resisted the RCT did so in order to preserve epistemic and social goals that randomisation would have otherwise run a tractor through. PMID:26205200

  9. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASN's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: The scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user if the tool designer: The computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  10. Very high temperature chemistry: Science justification for containerless experimentation in space

    NASA Technical Reports Server (NTRS)

    Hofmeister, William H.; Nordine, Paul

    1990-01-01

    A summary is presented of the justification for application of containerless processing in space to high temperature science. Low earth orbit offers a gravitational environment that allows samples to be positioned in an experimental apparatus by very small forces. Well controlled experiments become possible on reactive materials at high temperatures in a reasonably quiescent state and without container contamination. This provides an opportunity to advance the science of high temperature chemistry that can only be realized with a commitment by NASA to provide advanced facilities for in-space containerless study of materials at very high temperature.

  11. A Blend of Romanism and Germanism: Experimental Science Instruction in Belgian State Secondary Education, 1880-1914

    ERIC Educational Resources Information Center

    Onghena, Sofie

    2013-01-01

    A case study of secondary experimental science instruction in Belgium demonstrates the importance of cross-national communication in the study of science education. Belgian secondary science education in the years 1880-1914 had a clear internationalist dimension. French and German influences turn out to have been essential, stimulated by the fact…

  12. Experimental Characterization of a Grid-Loss Event on a 2.5-MW Dynamometer Using Advanced Operational Modal Analysis: Preprint

    SciTech Connect

    Helsen, J.; Weijtjens, W.; Guo, Y.; Keller, J.; McNiff, B.; Devriendt, C.; Guillaume, P.

    2015-02-01

    This paper experimentally investigates a worst-case grid-loss event conducted on the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) drivetrain mounted on the 2.5-MW NREL dynamic nacelle test rig. The GRC drivetrain has a directly grid-coupled, fixed-speed asynchronous generator. The main goal is the assessment of the dynamic content driving this particular event, with emphasis on the high-speed stage of the GRC gearbox. In addition to external accelerometers, high-frequency-sampled strain-gauge measurements were used to assess torque fluctuations and bending moments at both the nacelle main shaft and the gearbox high-speed shaft (HSS) through the entire duration of the event. Modal analysis was conducted using a polyreference Least Squares Complex Frequency-domain (pLSCF) modal identification estimator. The event driving the torsional resonance was identified. Moreover, the pLSCF estimator identified the main drivetrain resonances based on a combination of acceleration and strain measurements. Without external action during the grid-loss event, a mode shape characterized by counter-phase rotation of the rotor and generator rotor, determined by the drivetrain flexibility and rotor inertias, was the main driver of the event. This behavior resulted in significant torque oscillations with large-amplitude negative torque periods. Based on tooth strain measurements of the HSS pinion, this work showed that at each zero crossing the teeth lost contact and came into contact with the backside flank. In addition, dynamic nontorque loads between the gearbox and generator at the HSS played an important role, as indicated by the strain-gauge measurements.

  13. Data Grid Implementations

    SciTech Connect

    Moore, Reagan W.; Studham, Ronald S.; Rajasekar, Arcot; Watson, Chip; Stockinger, Heinz; Kunszt, Peter; Charlie Catlett and Ian Foster

    2002-02-27

    Data grids link distributed, heterogeneous storage resources into a coherent data management system. From a user perspective, the data grid provides a uniform name space across the underlying storage systems, while supporting retrieval and storage of files. In the high energy physics community, at least six data grids have been implemented for the storage and distribution of experimental data. Data grids are also being used to support projects as diverse as digital libraries (National Library of Medicine Visible Embryo project), federation of multiple astronomy sky surveys (NSF National Virtual Observatory project), and integration of distributed data sets (Long Term Ecological Reserve). Data grids also form the core interoperability mechanisms for creating persistent archives, in which data collections are migrated to new technologies over time. The ability to provide a uniform name space across multiple administration domains is becoming a critical component of national-scale, collaborative projects.
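
    As a rough illustration of the "uniform name space" idea described above, the sketch below maps one logical file name to physical replicas held by different storage systems; the catalog structure, names and URLs are hypothetical and are not taken from any particular data-grid middleware.

```python
# Toy illustration of a data-grid uniform name space: one logical name resolves
# to replicas held by different storage systems and sites. All names and URLs
# below are hypothetical.

replica_catalog = {
    "lfn:/experiment/run42/event_data.root": [
        "gsiftp://storage.site-a.example/data/run42/event_data.root",
        "srm://tape.site-b.example/archive/run42/event_data.root",
    ],
}

def resolve(logical_name: str, prefer: str = "gsiftp") -> str:
    """Return a physical replica for a logical file name, preferring one access scheme."""
    replicas = replica_catalog[logical_name]
    for url in replicas:
        if url.startswith(prefer):
            return url
    return replicas[0]  # fall back to any available replica

print(resolve("lfn:/experiment/run42/event_data.root"))
```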

  14. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3

  15. Highly transparent low resistance Ga doped ZnO/Cu grid double layers prepared at room temperature

    NASA Astrophysics Data System (ADS)

    Jang, Cholho; Zhizhen, Ye; Jianguo, Lü

    2015-12-01

    Ga-doped ZnO (GZO)/Cu grid double-layer structures were prepared at room temperature (RT). We have studied the electrical and optical characteristics of the GZO/Cu grid double layer as a function of the Cu grid spacing distance. The optical transmittance and sheet resistance of the GZO/Cu grid double layer are higher than those of the GZO/Cu film double layer regardless of the Cu grid spacing distance, and both increase as the Cu grid spacing distance increases. The calculated transmittance and sheet resistance of the GZO/Cu grid double layer closely follow the experimentally observed trends. For the GZO/Cu grid double layer with a Cu grid spacing distance of 1 mm, the highest figure of merit (ΦTC = 6.19 × 10⁻³ Ω⁻¹) was obtained. In this case, the transmittance, resistivity and filling factor (FF) of the GZO/Cu grid double layer are 83.74%, 1.10 × 10⁻⁴ Ω·cm and 0.173, respectively. Project supported by the Key Project of the National Natural Science Foundation of China (No. 91333203), the Program for Innovative Research Team in University of Ministry of Education of China (No. IRT13037), the National Natural Science Foundation of China (No. 51172204), and the Zhejiang Provincial Department of Science and Technology of China (No. 2010R50020).
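
    The record does not say which figure of merit is used; if it is Haacke's ΦTC = T¹⁰/Rs, a common choice for transparent conductors, then the quoted transmittance and ΦTC imply the sheet resistance computed below. This is an assumption-based cross-check, not a value reported in the paper.

```python
# Cross-check under the assumption that the figure of merit is Haacke's
# phi_TC = T**10 / R_sheet (not stated in the record).

T = 0.8374          # optical transmittance quoted in the record
phi_tc = 6.19e-3    # figure of merit in 1/ohm, quoted in the record

implied_sheet_resistance = T**10 / phi_tc
print(f"implied sheet resistance ≈ {implied_sheet_resistance:.1f} Ω/sq")  # ~27 Ω/sq
```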

  16. Experimental performance evaluation of software defined networking (SDN) based data communication networks for large scale flexi-grid optical networks.

    PubMed

    Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo

    2014-04-21

    Software defined networking (SDN) has become a focus of the current information and communication technology field because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. The optical transport network is also regarded as an important application scenario for SDN, which is adopted as the enabling technology of the data communication network (DCN) instead of generalized multi-protocol label switching (GMPLS). However, the practical performance of SDN-based DCN for large-scale optical networks, which is very important for technology selection in future optical network deployment, has not been evaluated up to now. In this paper we have built a large-scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of the SDN-based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time have been demonstrated under various network environments, such as different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof point for future network deployment. PMID:24787842
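
    For readers unfamiliar with the blocking-probability metric reported above, the toy simulator below illustrates it on a single flexi-grid link with first-fit spectrum assignment. It is a deliberately minimal sketch and does not reproduce the paper's 1000-node testbed, SDN control plane, or traffic model; all parameter values are invented.

```python
import random

# Toy flexi-grid link: lightpath requests ask for a few contiguous spectrum slots
# and are blocked if no contiguous gap is free. Illustrative only.

SLOTS = 64      # spectrum slots on the link (hypothetical)
HOLD = 20.0     # mean holding time, arbitrary units (hypothetical)

def first_fit(occupied, width):
    """Return start index of a free contiguous gap of `width` slots, or None."""
    run = 0
    for i in range(SLOTS):
        run = run + 1 if not occupied[i] else 0
        if run == width:
            return i - width + 1
    return None

def simulate(n_requests=10000, arrival_rate=1.0, seed=1):
    random.seed(seed)
    active, t, blocked = [], 0.0, 0
    for _ in range(n_requests):
        t += random.expovariate(arrival_rate)
        active = [(end, s, w) for (end, s, w) in active if end > t]  # expire finished paths
        occupied = [False] * SLOTS
        for _, s, w in active:
            for i in range(s, s + w):
                occupied[i] = True
        width = random.choice([2, 3, 4])          # slots requested by this lightpath
        start = first_fit(occupied, width)
        if start is None:
            blocked += 1
        else:
            active.append((t + random.expovariate(1.0 / HOLD), start, width))
    return blocked / n_requests

print(f"blocking probability ≈ {simulate():.3f}")
```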

  17. Experimental performance evaluation of software defined networking (SDN) based data communication networks for large scale flexi-grid optical networks.

    PubMed

    Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo

    2014-04-21

    Software defined networking (SDN) has become a focus of the current information and communication technology field because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. The optical transport network is also regarded as an important application scenario for SDN, which is adopted as the enabling technology of the data communication network (DCN) instead of generalized multi-protocol label switching (GMPLS). However, the practical performance of SDN-based DCN for large-scale optical networks, which is very important for technology selection in future optical network deployment, has not been evaluated up to now. In this paper we have built a large-scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of the SDN-based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time have been demonstrated under various network environments, such as different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof point for future network deployment.

  18. MAGNETIC GRID

    DOEpatents

    Post, R.F.

    1960-08-01

    An electronic grid is designed employing magnetic forces for controlling the passage of charged particles. The grid is particularly applicable to use in gas-filled tubes such as ignitrons, thyratrons, etc., since the magnetic grid action is impartial to the polarity of the charged particles and, accordingly, the sheath effects encountered with electrostatic grids are not present. The grid comprises a conductor having sections spaced apart and extending in substantially opposite directions in the same plane, the ends of the conductor being adapted for connection to a current source.

  19. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3

  20. Current Grid operation and future role of the Grid

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    2012-12-01

    Grid-like technologies and approaches have become an integral part of HEP experiments. Some other scientific communities also use similar technologies for data-intensive computations. The distinctive feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the more burden falls on operations, and vice versa. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. Reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from a situation where Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world where the Grid will ultimately reach the status of a generic public computing and storage service provider and permanent national and international Grid infrastructures will be established. How far we will be able to advance along this path depends on us. If no standardisation and convergence efforts take place

  1. Implementing Production Grids

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Ziobarth, John (Technical Monitor)

    2002-01-01

    We have presented the essence of experience gained in building two production Grids, and provided some of the global context for this work. As the reader might imagine, there were a lot of false starts, refinements to the approaches and to the software, and several substantial integration projects (SRB and Condor integrated with Globus) to get where we are today. However, the point of this paper is to try and make it substantially easier for others to get to the point where the Information Power Grid (IPG) and the DOE Science Grids are today. This is what is needed in order to move us toward the vision of a common cyber infrastructure for science. The author would also like to remind the readers that this paper primarily represents the actual experiences that resulted from specific architectural and software choices during the design and implementation of these two Grids. The choices made were dictated by the criteria laid out in section 1. There is a lot more Grid software available today than there was four years ago, and several of these packages are being integrated into IPG and the DOE Grids. However, the foundation choices of Globus, SRB, and Condor would not be significantly different today than they were four years ago. Nonetheless, if the GGF is successful in its work - and we have every reason to believe that it will be - then in a few years we will see that the 28 functions provided by these packages will be defined in terms of protocols and MIS, and there will be several robust implementations available for each of the basic components, especially the Grid Common Services. The impact of the emerging Web Grid Services work is not yet clear. It will likely have a substantial impact on building higher level services; however, it is the opinion of the author that this will in no way obviate the need for the Grid Common Services. These are the foundation of Grids, and the focus of almost all of the operational and persistent infrastructure aspects of Grids.

  2. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    SciTech Connect

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  3. Can Jurors Recognize Missing Control Groups, Confounds, and Experimenter Bias in Psychological Science?

    PubMed Central

    McAuliff, Bradley D.; Kovera, Margaret Bull; Nunez, Gabriel

    2010-01-01

    This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed. PMID:18587635

  4. Can jurors recognize missing control groups, confounds, and experimenter bias in psychological science?

    PubMed

    McAuliff, Bradley D; Kovera, Margaret Bull; Nunez, Gabriel

    2009-06-01

    This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed.

  5. A Blend of Romanism and Germanism: Experimental Science Instruction in Belgian State Secondary Education, 1880-1914

    NASA Astrophysics Data System (ADS)

    Onghena, Sofie

    2013-04-01

    A case study of secondary experimental science instruction in Belgium demonstrates the importance of cross-national communication in the study of science education. Belgian secondary science education in the years 1880-1914 had a clear internationalist dimension. French and German influences turn out to have been essential, stimulated by the fact that Belgium, as a result of its geographical position, considered itself as the centre of scientific relations between France and Germany, and as actually strengthened by its linguistic and cultural dualism in this regard. This pursuit of internationalist nationalism also affected the configuration of chemistry and physics as experimental courses at Belgian Royal State Schools, although the years preceding WWI are usually characterized as a period of rising nationalism in science, with countries such as Germany and France as prominent actors. To what extent did France and Germany influence Belgian debates on science education, science teachers' training, the use of textbooks, and the instalment of school laboratories and teaching collections?

  6. Fibonacci Grids

    NASA Technical Reports Server (NTRS)

    Swinbank, Richard; Purser, James

    2006-01-01

    Recent years have seen a resurgence of interest in a variety of non-standard computational grids for global numerical prediction. The motivation has been to reduce problems associated with the converging meridians and the polar singularities of conventional regular latitude-longitude grids. A further impetus has come from the adoption of massively parallel computers, for which it is necessary to distribute work equitably across the processors; this is more practicable for some non-standard grids. Desirable attributes of a grid for high-order spatial finite differencing are: (i) geometrical regularity; (ii) a homogeneous and approximately isotropic spatial resolution; (iii) a low proportion of the grid points where the numerical procedures require special customization (such as near coordinate singularities or grid edges). One family of grid arrangements which, to our knowledge, has never before been applied to numerical weather prediction, but which appears to offer several technical advantages, is what we shall refer to as "Fibonacci grids". They can be thought of as mathematically ideal generalizations of the patterns occurring naturally in the spiral arrangements of seeds and fruit found in sunflower heads and pineapples (to give two of the many botanical examples). These grids possess virtually uniform and highly isotropic resolution, with an equal area for each grid point. There are only two compact singular regions on a sphere that require customized numerics. We demonstrate the practicality of these grids in shallow water simulations, and discuss the prospects for efficiently using these frameworks in three-dimensional semi-implicit and semi-Lagrangian weather prediction or climate models.
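
    A minimal sketch of the underlying construction, assuming the standard golden-angle (Fibonacci) spiral placement on the sphere; the operational grids discussed in the record are more elaborate, so this is illustrative only.

```python
import math

def fibonacci_sphere(n: int):
    """Return n roughly equal-area points (lat, lon in radians) on the unit sphere,
    placed along a golden-angle spiral -- the basic idea behind Fibonacci grids."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))   # ~2.39996 rad
    points = []
    for i in range(n):
        z = 1.0 - (2.0 * i + 1.0) / n                 # uniform spacing in z => equal areas
        lat = math.asin(z)
        lon = (i * golden_angle) % (2.0 * math.pi)
        points.append((lat, lon))
    return points

grid = fibonacci_sphere(1000)
print(grid[:3])
```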

  7. The Frequency of Hands-On Experimentation and Student Attitudes toward Science: A Statistically Significant Relation (2005-51-Ornstein)

    ERIC Educational Resources Information Center

    Ornstein, Avi

    2006-01-01

    Attitudinal data tested hypotheses that students have more positive attitudes toward science when teachers regularly emphasize hands-on laboratory activities and when students more frequently experience higher levels of experimentation or inquiry. The first predicted that students would have more positive attitudes toward science in classrooms…

  8. Early laboratories c.1600 - c.1800 and the location of experimental science.

    PubMed

    Crosland, Maurice

    2005-04-01

    Surprisingly little attention has been given hitherto to the definition of the laboratory. A space has to be specially adapted to deserve that title. It would be easy to assume that the two leading experimental sciences, physics and chemistry, have historically depended in a similar way on access to a laboratory. But while chemistry, through its alchemical ancestry with batteries of stills, had many fully fledged laboratories by the seventeenth century, physics was discovering the value of mathematics. Even experimental physics was content to make use of almost any indoor space, if not outdoors, ignoring the possible value of a laboratory. The development of the physics laboratory had to wait until the nineteenth century.

  9. The EUAsiaGrid Project

    NASA Astrophysics Data System (ADS)

    Paganoni, Marco

    The EUAsiaGrid proposal contributes to the aims of the Research Infrastructures part of the EU Seventh Framework Programme (FP7) by promoting interoperation between the European and the Asian-Pacific Grids. The project, with a total number of 15 partners coordinated by INFN, started on April 1st 2008. It will disseminate knowledge about the EGEE Grid infrastructure, organize specific training events and support applications both within scientific communities that already have long experience with Computing Grids (High Energy Physics, Computational Chemistry, Bioinformatics and Biomedicine) and within more recent ones (Social Sciences, Disaster Mitigation, Cultural Heritage). Ultimately the EUAsiaGrid project will pave the way towards a common e-Infrastructure with the European and the Asian Grids.

  10. Spaceflight Operations Services Grid (SOSG) Prototype Implementation and Feasibility Study

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.; Lisotta, Anthony J.; Redman, Sandra

    2004-01-01

    Science Operations Services Grid is focusing on building a prototype grid-based environment that incorporates existing and new spaceflight services to enable current and future NASA programs with cost savings and new and evolvable methods to conduct science in a distributed environment. The Science Operations Services Grid (SOSG) will provide a distributed environment for widely disparate organizations to conduct their systems and processes in a more efficient and cost-effective manner. These organizations include those that: 1) engage in space-based science and operations, 2) develop space-based systems and processes, and 3) conduct scientific research, bringing together disparate scientific disciplines like geology and oceanography to create new information. In addition, educational outreach will be significantly enhanced by providing to schools the same tools used by NASA, with the ability of the schools to actively participate on many levels in the science generated by NASA from space and on the ground. The services range from voice, video and telemetry processing and display to data mining, high-level processing and visualization tools, all accessible from a single portal. In this environment, users would not require high-end systems or processes at their home locations to use these services. Also, the user would need to know minimal details about the applications in order to utilize the services. In addition, security at all levels is an underlying goal of the project. The Science Operations Services Grid will focus on four tools that are currently used by the ISS Payload community along with nine more that are new to the community. Under the prototype, four Grid virtual organizations (VOs) will be developed to represent four types of users. They are a Payload (experimenters) VO, a Flight Controllers VO, an Engineering and Science Collaborators VO and an Education and Public Outreach VO. The User-based services will be implemented to replicate the operational voice

  11. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them.

  12. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them. PMID:25311906

  13. Striped ratio grids for scatter estimation

    NASA Astrophysics Data System (ADS)

    Hsieh, Scott S.; Wang, Adam S.; Star-Lack, Josh

    2016-03-01

    Striped ratio grids are a new concept for scatter management in cone-beam CT. These grids are a modification of conventional anti-scatter grids and consist of stripes which alternate between high grid ratio and low grid ratio. Such a grid is related to existing hardware concepts for scatter estimation such as blocker-based methods or primary modulation, but rather than modulating the primary, the striped ratio grid modulates the scatter. The transitions between adjacent stripes can be used to estimate and subtract the remaining scatter. However, these transitions could be contaminated by variation in the primary radiation. We describe a simple nonlinear image processing algorithm to estimate scatter, and proceed to validate the striped ratio grid on experimental data from a pelvic phantom. The striped ratio grid is emulated by combining data from two scans with different grids. Preliminary results are encouraging and show a significant reduction of scatter artifacts.
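
    A toy numerical illustration of why alternating grid ratios expose the scatter: if the primary is locally equal under two adjacent stripes while their scatter transmissions differ, the stripe-to-stripe difference isolates the scatter. This is not the authors' nonlinear algorithm, and the transmission values below are invented.

```python
# Idealized two-stripe scatter estimate. Assumes equal primary transmission under
# both stripes and known (hypothetical) scatter transmissions -- simplifications
# the actual algorithm does not rely on.

t_scatter_low, t_scatter_high = 0.45, 0.15   # hypothetical scatter transmissions of the stripes
P, S = 1000.0, 400.0                         # "true" primary and scatter (unknown in practice)

m_low = P + t_scatter_low * S                # measurement under the low-ratio stripe
m_high = P + t_scatter_high * S              # measurement under the high-ratio stripe

S_est = (m_low - m_high) / (t_scatter_low - t_scatter_high)
P_est = m_low - t_scatter_low * S_est
print(S_est, P_est)                          # recovers 400.0 and 1000.0 in this idealized case
```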

  14. FermiGrid - experience and future plans

    SciTech Connect

    Chadwick, K.; Berman, E.; Canal, P.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Timm, S.; Yocum, D.; /Fermilab

    2007-09-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. In order to better serve this community, Fermilab has placed its production computer resources in a Campus Grid infrastructure called 'FermiGrid'. The FermiGrid infrastructure allows the large experiments at Fermilab to have priority access to their own resources, enables sharing of these resources in an opportunistic fashion, and movement of work (jobs, data) between the Campus Grid and National Grids such as Open Science Grid and the WLCG. FermiGrid resources support multiple Virtual Organizations (VOs), including VOs from the Open Science Grid (OSG), EGEE and the Worldwide LHC Computing Grid Collaboration (WLCG). Fermilab also makes leading contributions to the Open Science Grid in the areas of accounting, batch computing, grid security, job management, resource selection, site infrastructure, storage management, and VO services. Through the FermiGrid interfaces, authenticated and authorized VOs and individuals may access our core grid services, the 10,000+ Fermilab resident CPUs, near-petabyte (including CMS) online disk pools and the multi-petabyte Fermilab Mass Storage System. These core grid services include a site wide Globus gatekeeper, VO management services for several VOs, Fermilab site authorization services, grid user mapping services, as well as job accounting and monitoring, resource selection and data movement services. Access to these services is via standard and well-supported grid interfaces. We will report on the user experience of using the FermiGrid campus infrastructure interfaced to a national cyberinfrastructure--the successes and the problems.

  15. Solar Fridges and Personal Power Grids: How Berkeley Lab is Fighting Global Poverty (LBNL Science at the Theater)

    SciTech Connect

    Buluswar, Shashi; Gadgil, Ashok

    2012-11-26

    At this November 26, 2012 Science at the Theater, scientists discussed the recently launched LBNL Institute for Globally Transformative Technologies (LIGTT) at Berkeley Lab. LIGTT is an ambitious mandate to discover and develop breakthrough technologies for combating global poverty. It was created with the belief that solutions will require more advanced R&D and a deep understanding of market needs in the developing world. Berkeley Lab's Ashok Gadgil, Shashi Buluswar and seven other LIGTT scientists discussed what it takes to develop technologies that will impact millions of people. These include: 1) Fuel efficient stoves for clean cooking: Our scientists are improving the Berkeley Darfur Stove, a high efficiency stove used by over 20,000 households in Darfur; 2) The ultra-low energy refrigerator: A lightweight, low-energy refrigerator that can be mounted on a bike so crops can survive the trip from the farm to the market; 3) The solar OB suitcase: A low-cost package of the five most critical biomedical devices for maternal and neonatal clinics; 4) UV Waterworks: A device for quickly, safely and inexpensively disinfecting water of harmful microorganisms.

  16. Using R in experimental design with BIBD: An application in health sciences

    NASA Astrophysics Data System (ADS)

    Oliveira, Teresa A.; Francisco, Carla; Oliveira, Amílcar; Ferreira, Agostinho

    2016-06-01

    Considering the implementation of an Experimental Design, in any field, the experimenter must pay particular attention and look for the best strategies in the following steps: planning the design selection, conducting the experiments, collecting the observed data, and proceeding to the analysis and interpretation of results. The focus is on providing both a deep understanding of the problem under research and a powerful experimental process at a reduced cost. Mainly because it allows sources of variation to be separated, Experimental Design has long been strongly recommended in the Health Sciences. Particular attention has been devoted to Block Designs, and more precisely to Balanced Incomplete Block Designs; in this case the relevance stems from the fact that these designs allow a number of treatments larger than the block size to be tested simultaneously. Our example refers to a possible study of inter-rater reliability in Parkinson's disease, using the UPDRS (Unified Parkinson's Disease Rating Scale) to test whether there are significant differences between the specialists who evaluate the patients' performances. Statistical studies on this disease were described, for example, in Richards et al. (1994), where the authors investigate the inter-rater reliability of the Unified Parkinson's Disease Rating Scale Motor Examination. We consider a simulation of a practical situation in which patients were observed by different specialists and the UPDRS scores assessing the impact of Parkinson's disease on patients were recorded. Assigning treatments to the subjects following a particular BIBD(9,24,8,3,2) structure, we illustrate that BIB Designs can be used as a powerful tool to solve emerging problems in this area. Since a structure with repeated blocks allows some block contrasts to have minimum variance (see Oliveira et al. (2006)), the design with cardinality 12 was selected for the example. R software was used for the computations.
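
    The BIBD parameters quoted above can be checked against the standard counting identities; the snippet below (written in Python rather than the R used by the authors) verifies them for BIBD(9, 24, 8, 3, 2). It is a quick arithmetic check, not a construction of the design actually used in the study.

```python
# Necessary conditions for a balanced incomplete block design with parameters
# (v, b, r, k, lambda) = (9, 24, 8, 3, 2), as quoted in the record.

v, b, r, k, lam = 9, 24, 8, 3, 2

assert b * k == v * r, "each of the v treatments must appear r times across b blocks of size k"
assert lam * (v - 1) == r * (k - 1), "every pair of treatments must co-occur in exactly lambda blocks"
assert b >= v, "Fisher's inequality"

print("parameters satisfy the necessary BIBD conditions")
```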

  17. Development and validation of the Colorado learning attitudes about science survey for experimental physics

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Finkelstein, Noah; Lewandowski, H. J.

    2013-01-01

    As part of a comprehensive effort to transform our undergraduate physics laboratories and evaluate the impacts of these efforts, we have developed the Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS). The E-CLASS assesses the changes in students' attitudes about a variety of scientific laboratory practices before and after a lab course and compares attitudes with perceptions of the course grading requirements and laboratory practices. The E-CLASS is designed to give researchers insight into students' attitudes and also to provide actionable evidence to instructors looking for feedback on their courses. We present the development, validation, and preliminary results from the initial implementation of the survey in three undergraduate physics lab courses.

  18. Experimental Design and Bioinformatics Analysis for the Application of Metagenomics in Environmental Sciences and Biotechnology.

    PubMed

    Ju, Feng; Zhang, Tong

    2015-11-01

    Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights of microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.

  19. A grid service-based tool for hyperspectral imaging analysis

    NASA Astrophysics Data System (ADS)

    Carvajal, Carmen L.; Lugo, Wilfredo; Rivera, Wilson; Sanabria, John

    2005-06-01

    This paper outlines the design and implementation of Grid-HSI, a Service Oriented Architecture-based Grid application to enable hyperspectral imaging analysis. Grid-HSI provides users with a transparent interface to access computational resources and perform hyperspectral imaging analysis remotely through a set of Grid services. Grid-HSI is composed of a Portal Grid Interface, a Data Broker and a set of specialized Grid services. Grid-based applications, contrary to other client-server approaches, provide the capabilities of persistence and potentially transient processes on the web. Our experimental results on Grid-HSI show the suitability of the prototype system to perform hyperspectral imaging analysis efficiently.

  20. Smart Grid Integration Laboratory

    SciTech Connect

    Troxell, Wade

    2011-12-22

    The initial federal funding for the Colorado State University Smart Grid Integration Laboratory is through a Congressionally Directed Project (CDP), DE-OE0000070 Smart Grid Integration Laboratory. The original program was requested in three one-year increments for staff acquisition, curriculum development, and instrumentation, all of which will benefit the Laboratory. This report focuses on the initial phase of staff acquisition, which was directed and administered by DOE NETL/West Virginia under Project Officer Tom George. Using this CDP funding, we have developed the leadership and intellectual capacity for the SGIC. This was accomplished by investing in (hiring) a core team of Smart Grid Systems engineering faculty focused on education, research, and innovation of a secure and smart grid infrastructure. The Smart Grid Integration Laboratory will be housed with the separately funded Integrid Laboratory as part of CSU's overall Smart Grid Integration Center (SGIC). The period of performance of this grant was 10/1/2009 to 9/30/2011, which included one no-cost extension due to time delays in faculty hiring. The Smart Grid Integration Laboratory's focus is to build foundations that help graduate and undergraduate students acquire systems engineering knowledge; conduct innovative research; and team externally with smart grid organizations. The results of the separately funded Smart Grid Workforce Education Workshop (May 2009), sponsored by the City of Fort Collins, Northern Colorado Clean Energy Cluster, Colorado State University Continuing Education, Spirae, and Siemens, have been used to guide the hiring of faculty, the program curriculum and the education plan. This project develops faculty leaders with the intellectual capacity to inspire their students to become leaders that substantially contribute to the development and maintenance of Smart Grid infrastructure through topics such as: (1) Distributed energy systems modeling and control; (2) Energy and power conversion; (3) Simulation of

  1. Helping parents to motivate adolescents in mathematics and science: an experimental test of a utility-value intervention.

    PubMed

    Harackiewicz, Judith M; Rozek, Christopher S; Hulleman, Chris S; Hyde, Janet S

    2012-08-01

    The pipeline toward careers in science, technology, engineering, and mathematics (STEM) begins to leak in high school, when some students choose not to take advanced mathematics and science courses. We conducted a field experiment testing whether a theory-based intervention that was designed to help parents convey the importance of mathematics and science courses to their high school-aged children would lead them to take more mathematics and science courses in high school. The three-part intervention consisted of two brochures mailed to parents and a Web site, all highlighting the usefulness of STEM courses. This relatively simple intervention led students whose parents were in the experimental group to take, on average, nearly one semester more of science and mathematics in the last 2 years of high school, compared with the control group. Parents are an untapped resource for increasing STEM motivation in adolescents, and the results demonstrate that motivational theory can be applied to this important pipeline problem.

  2. NASA's Participation in the National Computational Grid

    NASA Technical Reports Server (NTRS)

    Feiereisen, William J.; Zornetzer, Steve F. (Technical Monitor)

    1998-01-01

    Over the last several years it has become evident that the character of NASA's supercomputing needs has changed. One of the major missions of the agency is to support the design and manufacture of aero- and space-vehicles with technologies that will significantly reduce their cost. It is becoming clear that improvements in the process of aerospace design and manufacturing will require a high performance information infrastructure that allows geographically dispersed teams to draw upon resources that are broader than traditional supercomputing. A computational grid draws together our information resources into one system. We can foresee the time when a Grid will allow engineers and scientists to use the tools of supercomputers, databases and online experimental devices in a virtual environment to collaborate with distant colleagues. The concept of a computational grid has been spoken of for many years, but several recent events are conspiring to allow us to actually build one. In late 1997 the National Science Foundation initiated the Partnerships for Advanced Computational Infrastructure (PACI), which is built around the idea of distributed high performance computing. The Alliance, led by the National Computational Science Alliance (NCSA), and the National Partnership for Advanced Computational Infrastructure (NPACI), led by the San Diego Supercomputing Center, have been instrumental in drawing together the "Grid Community" to identify the technology bottlenecks and propose a research agenda to address them. During the same period NASA has begun to reformulate parts of two major high performance computing research programs to concentrate on distributed high performance computing and has banded together with the PACI centers to address the research agenda in common.

  3. Grid reliability

    NASA Astrophysics Data System (ADS)

    Saiz, P.; Andreeva, J.; Cirstoiu, C.; Gaidioz, B.; Herrala, J.; Maguire, E. J.; Maier, G.; Rocha, R.

    2008-07-01

    Thanks to the Grid, users have access to computing resources distributed all over the world. The Grid hides the complexity and the differences of its heterogeneous components. In such a distributed system, it is clearly very important that errors are detected as soon as possible, and that the procedure to solve them is well established. We focused on two of its main elements: the workload and the data management systems. We developed an application to investigate the efficiency of the different centres. Furthermore, our system can be used to categorize the most common error messages, and control their time evolution.
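
    The bookkeeping described here (categorize the most common error messages and follow their time evolution per site) can be sketched in a few lines. The sketch below is only an illustration of that idea; the error strings, categories, and site names are invented and are not from the authors' system.

```python
# Toy categorization of grid error messages per day and per site.
# Error texts, categories, and sites are invented for illustration.
from collections import Counter, defaultdict

ERROR_CATEGORIES = {
    "proxy expired": "authentication",
    "no space left": "storage",
    "connection timed out": "network",
}

def categorize(message):
    for pattern, category in ERROR_CATEGORIES.items():
        if pattern in message.lower():
            return category
    return "other"

log = [
    ("2008-07-01", "SiteA", "Proxy expired while submitting job"),
    ("2008-07-01", "SiteB", "Connection timed out contacting SE"),
    ("2008-07-02", "SiteB", "No space left on device"),
]

per_day = defaultdict(Counter)
for day, site, message in log:
    per_day[day][categorize(message)] += 1

for day in sorted(per_day):
    print(day, dict(per_day[day]))
```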

  4. Gratia: New Challenges in Grid Accounting

    NASA Astrophysics Data System (ADS)

    Canal, Philippe

    2011-12-01

    Gratia originated as an accounting system for batch systems and Linux process accounting. In production since 2006 at FNAL, it was adopted by the Open Science Grid as a distributed, grid-wide accounting system in 2007. Since its adoption, Gratia's challenge has been to adapt to an explosive increase in data volume and to handle several new categories of accounting data. Gratia now accounts for regular grid jobs, file transfers, glide-in jobs, and the state of grid services. We show that Gratia gives access to a thorough picture of the OSG and discuss the complexity caused by newer grid techniques such as pilot jobs, job forwarding, and backfill.

  5. Experimental investigations of the nonlinear dynamics of a complex space-charge configuration inside and around a grid cathode with hole

    NASA Astrophysics Data System (ADS)

    Teodorescu-Soare, C. T.; Dimitriu, D. G.; Ionita, C.; Schrittwieser, R. W.

    2016-03-01

    When a metallic grid cathode with a small hole is biased negatively beyond a critical value of the applied potential, a complex space-charge structure appears inside and around the grid cathode. The static current-voltage characteristic of the discharge shows one or two current jumps (the number depending on the working gas pressure), one of them being of hysteretic type. Electrical probe measurements show a positive potential inside the grid cathode with respect to the potential applied to it. This is interpreted as being due to the hollow cathode effect. Thus, the inner fireball appears around the virtual anode inside the grid cathode. For more negative potentials, the electrons inside the cathode reach sufficient energy to penetrate the inner sheath near the cathode, passing through the hole and giving rise to a second fireball-like structure located outside the cathode. This second structure interacts with the negative glow of the discharge. The recorded time series of the discharge current oscillations reveal strongly nonlinear dynamics of the complex space-charge structure: by changing the negative potential applied to the grid cathode, the structure passes through different dynamic states involving chaos, quasi-periodicity, intermittency and period-doubling bifurcations, appearing as a competition among different routes to chaos.

  6. Heritage Education: Exploring the Conceptions of Teachers and Administrators from the Perspective of Experimental and Social Science Teaching

    ERIC Educational Resources Information Center

    Perez, Roque Jimenez; Lopez, Jose Maria Cuenca; Listan, D. Mario Ferreras

    2010-01-01

    This paper describes a research project into heritage education. Taking an interdisciplinary perspective from within the field of Experimental and Social Science Education, it presents an analysis of teachers' and administrators' conceptions of heritage, its teaching and its dissemination in Spain. A statistical description is provided of the…

  7. Changes in Critical Thinking Skills Following a Course on Science and Pseudoscience: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    McLean, Carmen P.; Miller, Nathan A.

    2010-01-01

    We assessed changes in paranormal beliefs and general critical thinking skills among students (n = 23) enrolled in an experimental course designed to teach distinguishing science from pseudoscience and a comparison group of students (n = 30) in an advanced research methods course. On average, both courses were successful in reducing paranormal…

  8. 'Mind genomics': the experimental, inductive science of the ordinary, and its application to aspects of food and feeding.

    PubMed

    Moskowitz, Howard R

    2012-11-01

    The paper introduces the empirical science of 'mind genomics', whose objective is to understand the dimensions of ordinary, everyday experience, identify mind-set segments of people who value different aspects of that everyday experience, and then assign a new person to a mind-set by a statistically appropriate procedure. By studying different experiences using experimental design of ideas, 'mind genomics' constructs an empirical, inductive science of perception and experience, layer by layer. The ultimate objective of 'mind genomics' is a large-scale science of experience created using induction, with the science based upon emergent commonalities across many different types of daily experience. The particular topic investigated in the paper is the experience of healthful snacks, what makes a person 'want' them, and the dollar value of different sensory aspects of the healthful snack.

  9. Apollo-Soyuz pamphlet no. 9: General science. [experimental design in Astronomy, Biology, Geophysics, Aeronomy and Materials science

    NASA Technical Reports Server (NTRS)

    Page, L. W.; From, T. P.

    1977-01-01

    The objectives and planning activities for the Apollo-Soyuz mission are summarized. Aspects of the space flight considered include the docking module and launch configurations, spacecraft orbits, and weightlessness. The 28 NASA experiments conducted onboard the spacecraft are summarized. The contributions of the mission to the fields of astronomy, geoscience, biology, and materials sciences resulting from the experiments are explored.

  10. An infrastructure for the integration of geoscience instruments and sensors on the Grid

    NASA Astrophysics Data System (ADS)

    Pugliese, R.; Prica, M.; Kourousias, G.; Del Linz, A.; Curri, A.

    2009-04-01

    The Grid, as a computing paradigm, has long held the attention of both academia and industry [1]. The distributed and expandable nature of its general architecture results in scalability and more efficient utilisation of computing infrastructures. The scientific community, including that of the geosciences, often handles problems with very high requirements in data processing, transfer, and storage [2,3]. This has raised interest in Grid technologies, but these are often viewed solely as an access gateway to HPC. Suitable Grid infrastructures could provide the geoscience community with additional benefits such as sharing, remote access, and control of scientific systems. These systems can be scientific instruments, sensors, robots, cameras and any other device used in the geosciences. A practical, general, and feasible solution for Grid-enabling such devices requires non-intrusive extensions to core parts of the current Grid architecture. We propose an extended version of an architecture [4] that can serve as the solution to this problem. The solution we propose is called the Grid Instrument Element (IE) [5]. It is an addition to the existing core Grid components, the Computing Element (CE) and the Storage Element (SE), which serve the purposes their names suggest. The IE and the related technologies have been developed in the EU project on the Deployment of Remote Instrumentation Infrastructure (DORII). In DORII, partners from various scientific communities, including those of earthquake, environmental, and experimental science, have adopted the Instrument Element technology in order to integrate their devices into the Grid. The Oceanographic and coastal observation and modelling Mediterranean Ocean Observing Network (OGS), a DORII partner, is in the process of deploying the above-mentioned Grid technologies on two types of observational modules: Argo profiling floats and a novel Autonomous Underwater Vehicle (AUV

  11. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing both the basic and the extended features of the two technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all these problems becomes an important requirement. The Grid promotes and facilitates the secure interoperation of heterogeneous, distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and message transfer based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between computational grid and

  12. Science, suffrage, and experimentation: Mary Putnam Jacobi and the controversy over vivisection in late nineteenth-century America.

    PubMed

    Bittel, Carla Jean

    2005-01-01

    This article examines the medical activism of the New York physician Mary Putnam Jacobi (1842-1906), to illustrate the problems of gender and science at the center of the vivisection debate in late nineteenth-century America. In the post-Civil War era, individuals both inside and outside the medical community considered vivisection to be a controversial practice. Physicians divided over the value of live animal experimentation, while reformers and activists campaigned against it. Jacobi stepped into the center of the controversy and tried to use her public defense of experimentation to the advantage of women in the medical profession. Her advocacy of vivisection was part of her broader effort to reform medical education, especially at women's institutions. It was also a political strategy aimed at associating women with scientific practices to advance a women's rights agenda. Her work demonstrates how debates over women in medicine and science in medicine, suffrage, and experimentation overlapped at a critical moment of historical transition.

  13. LAPS Grid generation and adaptation

    NASA Astrophysics Data System (ADS)

    Pagliantini, Cecilia; Delzanno, Gia Luca; Guo, Zehua; Srinivasan, Bhuvana; Tang, Xianzhu; Chacon, Luis

    2011-10-01

    LAPS uses a common-data framework in which a general purpose grid generation and adaptation package in toroidal and simply connected domains is implemented. The initial focus is on implementing the Winslow/Laplace-Beltrami method for generating non-overlapping block structured grids. This is to be followed by a grid adaptation scheme based on the Monge-Kantorovich optimal transport method [Delzanno et al., J. Comput. Phys., 227 (2008), 9841-9864], that equidistributes application-specified error. As an initial set of applications, we will lay out grids for an axisymmetric mirror, a field reversed configuration, and an entire poloidal cross section of a tokamak plasma reconstructed from a C-Mod experimental shot. These grids will then be used for computing the plasma equilibrium and transport in accompanying presentations. A key issue for Monge-Kantorovich grid optimization is the choice of error or monitor function for equidistribution. We will compare the Operator Recovery Error Source Detector (ORESD) [Lapenta, Int. J. Num. Meth. Eng., 59 (2004), 2065-2087], the Tau method and a strategy based on grid coarsening [Zhang et al., AIAA J., 39 (2001), 1706-1715] to find an "optimal" grid. Work supported by DOE OFES.
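
    To make the idea of elliptic grid generation concrete, the sketch below relaxes the interior nodes of a distorted structured grid by Jacobi iteration of Laplace's equation for the node coordinates. This is only a simplified stand-in for the Winslow/Laplace-Beltrami generator mentioned above, not the LAPS implementation, and the distorted test grid is an assumption for illustration.

```python
# Simplified structured-grid smoothing: Jacobi iteration of Laplace's equation
# for the node coordinates (a rough stand-in for a Winslow-type generator).
import numpy as np

def laplace_smooth(x, y, iterations=200):
    """Relax interior nodes toward the average of their four neighbors."""
    for _ in range(iterations):
        x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] +
                                x[1:-1, 2:] + x[1:-1, :-2])
        y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] +
                                y[1:-1, 2:] + y[1:-1, :-2])
    return x, y

# Start from a deliberately distorted grid on the unit square.
n = 21
u, v = np.meshgrid(np.linspace(0.0, 1.0, n), np.linspace(0.0, 1.0, n))
x = u + 0.1 * np.sin(2 * np.pi * v)
y = v + 0.1 * np.sin(2 * np.pi * u)
# Pin the boundary nodes to the undistorted square.
x[0, :], x[-1, :], x[:, 0], x[:, -1] = u[0, :], u[-1, :], u[:, 0], u[:, -1]
y[0, :], y[-1, :], y[:, 0], y[:, -1] = v[0, :], v[-1, :], v[:, 0], v[:, -1]

x, y = laplace_smooth(x, y)
print("max deviation from the uniform grid:", np.max(np.abs(x - u)))
```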

  14. Grid artifact reduction for direct digital radiography detectors based on rotated stationary grids with homomorphic filtering

    SciTech Connect

    Kim, Dong Sik; Lee, Sanggyun

    2013-06-15

    Purpose: Grid artifacts are caused when using the antiscatter grid in obtaining digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted especially for the direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for the grid artifact reduction based on the band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
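
    The homomorphic idea (treat the grid pattern as multiplicative, take the logarithm so it becomes additive, suppress it with a band-stop filter in the frequency domain, then exponentiate) can be sketched as follows. The synthetic image, grid frequency, and filter width below are assumptions chosen for illustration; they are not the paper's parameters or algorithm.

```python
# Hedged sketch of homomorphic grid-artifact suppression:
# log -> frequency-domain band-stop (notch) filter -> exp.
import numpy as np

def notch_filter_mask(shape, grid_freq, half_width):
    """Band-stop mask that zeroes a narrow band around +/- grid_freq
    (cycles/pixel) along the vertical axis, where a horizontal grid
    pattern concentrates its energy."""
    rows, cols = shape
    fy = np.fft.fftfreq(rows)                      # vertical spatial frequencies
    mask = np.ones(shape)
    band = np.abs(np.abs(fy) - grid_freq) < half_width
    mask[band, :] = 0.0
    return mask

rng = np.random.default_rng(0)
rows, cols = 256, 256
# Synthetic "anatomy": a smooth vertical ramp plus noise (always positive).
anatomy = 100.0 + 50.0 * np.linspace(0.0, 1.0, rows)[:, None] \
          + rng.normal(0.0, 1.0, (rows, cols))
grid_freq = 0.2                                    # assumed grid frequency
grid_pattern = 1.0 + 0.2 * np.cos(2 * np.pi * grid_freq * np.arange(rows))[:, None]
image = anatomy * grid_pattern                     # multiplicative grid model

log_img = np.log(image)
spectrum = np.fft.fft2(log_img)
filtered = np.fft.ifft2(spectrum * notch_filter_mask(image.shape, grid_freq, 0.02)).real
restored = np.exp(filtered)

print("residual grid modulation:", np.std((restored / anatomy).mean(axis=1)))
```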

  15. The National Grid Project: A system overview

    NASA Technical Reports Server (NTRS)

    Gaither, Adam; Gaither, Kelly; Jean, Brian; Remotigue, Michael; Whitmire, John; Soni, Bharat; Thompson, Joe; Dannenhoffer, John; Weatherill, Nigel

    1995-01-01

    The National Grid Project (NGP) is a comprehensive numerical grid generation software system that is being developed at the National Science Foundation (NSF) Engineering Research Center (ERC) for Computational Field Simulation (CFS) at Mississippi State University (MSU). NGP is supported by a coalition of U.S. industries and federal laboratories. The objective of the NGP is to significantly decrease the amount of time it takes to generate a numerical grid for complex geometries and to increase the quality of these grids to enable computational field simulations for applications in industry. A geometric configuration can be discretized into grids (or meshes) that have two fundamental forms: structured and unstructured. Structured grids are formed by intersecting curvilinear coordinate lines and are composed of quadrilateral (2D) and hexahedral (3D) logically rectangular cells. The connectivity of a structured grid provides for trivial identification of neighboring points by incrementing coordinate indices. Unstructured grids are composed of cells of any shape (commonly triangles, quadrilaterals, tetrahedra and hexahedra), but do not have trivial identification of neighbors by incrementing an index. For unstructured grids, a set of points and an associated connectivity table is generated to define unstructured cell shapes and neighboring points. Hybrid grids are a combination of structured grids and unstructured grids. Chimera (overset) grids are intersecting or overlapping structured grids. The NGP system currently provides a user interface that integrates both 2D and 3D structured and unstructured grid generation, a solid modeling topology data management system, an internal Computer Aided Design (CAD) system based on Non-Uniform Rational B-Splines (NURBS), a journaling language, and a grid/solution visualization system.
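
    The contrast drawn above between structured grids (neighbors found by incrementing coordinate indices) and unstructured grids (neighbors gathered through a connectivity table) can be made concrete with a small sketch. The arrays below are toy examples, not NGP data structures.

```python
# Toy contrast between structured and unstructured neighbor lookup.
import numpy as np

# Structured grid: node coordinates in a 2-D array; the neighbors of node
# (i, j) are simply (i+/-1, j) and (i, j+/-1).
ni, nj = 4, 3
x, y = np.meshgrid(np.linspace(0, 1, ni), np.linspace(0, 1, nj), indexing="ij")
print("structured node array shape:", x.shape)

def structured_neighbors(i, j):
    candidates = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in candidates if 0 <= a < ni and 0 <= b < nj]

print("structured neighbors of (1, 1):", structured_neighbors(1, 1))

# Unstructured grid: points plus a connectivity table of triangles; neighbors
# must be gathered by scanning the cells that contain the point.
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
triangles = [(0, 1, 2), (1, 3, 2)]                 # connectivity table

def unstructured_neighbors(p):
    neighbors = set()
    for tri in triangles:
        if p in tri:
            neighbors.update(tri)
    neighbors.discard(p)
    return sorted(neighbors)

print("unstructured neighbors of point 1:", unstructured_neighbors(1))
```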

  16. Experimental Methods to Evaluate Science Utility Relative to the Decadal Survey

    NASA Technical Reports Server (NTRS)

    Widergren, Cynthia

    2012-01-01

    The driving factor for a competed mission is the science it plans to perform once it has reached its target body. These science goals are derived from the science recommended by the most current Decadal Survey. This work focuses on the science goals in previous Venus mission proposals with respect to the 2013 Decadal Survey. By looking at how the goals compare to the survey and how much confidence NASA has in the mission's ability to accomplish them, a method was created to assess the science return utility of each mission. This method can be used as a tool for future Venus mission formulation and serves as a starting point for the future development of science utility assessment tools.

  17. [Progress in development and application of experimental facilities for life sciences onboard the International Space Station].

    PubMed

    Guo, Shuang-sheng; Fu, Lan; Ai, Wei-dang

    2003-12-01

    The construction of the International Space Station (ISS) will be completed soon, and life sciences studies are among the important tasks to be carried out onboard. Therefore, various facilities for life science flight experiments are being developed for diverse objectives; some of them have been completed and have passed ground-based simulation experiments or airplane parabolic flight tests, and are being scheduled for spaceflight experiments. This article comprehensively reviews recent progress in the development of these experimental facilities and the onboard experiments, in the hope that it will provide a reference for related fields of research.

  18. Grids: The Top Ten Questions

    DOE PAGES

    Schopf, Jennifer M.; Nitzberg, Bill

    2002-01-01

    The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science point of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.

  19. Experimental setup and the system performance for single-grid-based phase-contrast x-ray imaging (PCXI) with a microfocus x-ray tube

    NASA Astrophysics Data System (ADS)

    Lim, Hyunwoo; Park, Yeonok; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Park, Chulkyu; Woo, Taeho; Lee, Minsik; Kim, Jinsoo; Chung, Nagkun; Kim, Jinwon; Kim, Jinguk

    2015-08-01

    In this work, we investigated a simplified approach to phase-contrast x-ray imaging (PCXI) using a single antiscatter grid and a microfocus x-ray tube, which has the potential to open the way to more widespread use of PCXI in related application areas. We established a table-top setup for PCXI studies of biological and non-biological samples and investigated the system performance. The PCXI system consists of a focused-linear grid with a strip density of 200 lines/in. (JPI Healthcare Corp.), a microfocus x-ray tube with a focal spot size of about 5 μm (Hamamatsu, L7910), and a high-resolution CMOS imaging detector with a pixel size of 48 μm (Rad-icon Imaging Corp., Shad-o-Box 2048). Using our prototype system, we successfully obtained attenuation, scattering, and differential phase-contrast x-ray images of improved visibility from the raw images of several samples at x-ray tube conditions of 50 kVp and 6 mAs. Our initial results indicate that the single-grid-based approach is a useful method for PCXI, offering great simplicity and minimal requirements on setup alignment.

  20. An Experimental Examination of Quick Writing in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Benedek-Wood, Elizabeth; Mason, Linda H.; Wood, Philip H.; Hoffman, Katie E.; McGuire, Ashley

    2014-01-01

    A staggered A-B design study was used to evaluate the effects of Self- Regulated Strategy Development (SRSD) instruction for quick writing in middle school science across four classrooms. A sixth-grade science teacher delivered all students' writing assessment and SRSD instruction for informative quick writing. Results indicated that…

  1. An Experimental Study of the Use of Programmed Instruction in a University Physical Science Laboratory.

    ERIC Educational Resources Information Center

    Barnes, Marjorie R.

    Presented are the procedures, results, and conclusions of a study designed to compare the effectiveness of programed instructional materials with that of conventional materials in university physical science laboratory classes. The subjects were students enrolled in two similar freshmen-level physical science general education courses who were…

  2. Understanding Scientific Methodology in the Historical and Experimental Sciences via Language Analysis

    ERIC Educational Resources Information Center

    Dodick, Jeff; Argamon, Shlomo; Chase, Paul

    2009-01-01

    A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually "do" science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields.…

  3. Animal Science Technology. An Experimental Developmental Program. Volume II, Curriculum Course Outlines.

    ERIC Educational Resources Information Center

    Brant, Herman G.

    This volume, the second of a two part evaluation report, is devoted exclusively to the presentation of detailed course outlines representing an Animal Science Technology curriculum. Arranged in 6 terms of study (2 academic years), outlines are included on such topics as: (1) Introductory Animal Science, (2) General Microbiology, (3) Zoonoses, (4)…

  4. Correlated Curriculum Program: An Experimental Program. Science Level 1 (9A, 9B, 10A).

    ERIC Educational Resources Information Center

    Loebl, Stanley, Ed.; And Others

    The unit plans in Correlated Science 1 are intended to be of use to the teacher in both lesson and team planning. The course in science was designed for optimum correlation with the work done in business, health, and industrial careers. Behavioral objectives, class routines, time allotments, student evaluation, and the design of the manual are…

  5. Your World and Welcome To It, Science (Experimental): 5314.03.

    ERIC Educational Resources Information Center

    Kleinman, David Z.

    Presented is a beginning course in biology with emphasis on ecology for students with limited interest and few experiences in science. These students most likely will not take many more science courses. Included are the basic ecological concepts of communities, population, societies and the effects humans have on the environment. Like all other…

  6. From Ions to Wires to the Grid: The Transformational Science of LANL Research in High-Tc Superconducting Tapes and Electric Power Applications

    SciTech Connect

    Marken, Ken

    2009-05-20

    The Department of Energy (DOE) Office of Electricity Delivery and Energy Reliability (OE) has been tasked to lead national efforts to modernize the electric grid, enhance security and reliability of the energy infrastructure, and facilitate recovery from disruptions to energy supplies. LANL has pioneered the development of coated conductors – high-temperature superconducting (HTS) tapes – which permit dramatically greater current densities than conventional copper cable, and enable new technologies to secure the national electric grid. Sustained world-class research from concept, demonstration, transfer, and ongoing industrial support has moved this idea from the laboratory to the commercial marketplace.

  7. From Ions to Wires to the Grid: The Transformational Science of LANL Research in High-Tc Superconducting Tapes and Electric Power Applications

    SciTech Connect

    Marken, Ken

    2009-05-20

    The Department of Energy (DOE) Office of Electricity Delivery and Energy Reliability (OE) has been tasked to lead national efforts to modernize the electric grid, enhance security and reliability of the energy infrastructure, and facilitate recovery from disruptions to energy supplies. LANL has pioneered the development of coated conductors - high-temperature superconducting (HTS) tapes - which permit dramatically greater current densities than conventional copper cable, and enable new technologies to secure the national electric grid. Sustained world-class research from concept, demonstration, transfer, and ongoing industrial support has moved this idea from the laboratory to the commercial marketplace.

  8. From Ions to Wires to the Grid: The Transformational Science of LANL Research in High-Tc Superconducting Tapes and Electric Power Applications

    ScienceCinema

    Marken, Ken [Superconductivity Technology Center, Los Alamos, New Mexico, United States]

    2016-07-12

    The Department of Energy (DOE) Office of Electricity Delivery and Energy Reliability (OE) has been tasked to lead national efforts to modernize the electric grid, enhance security and reliability of the energy infrastructure, and facilitate recovery from disruptions to energy supplies. LANL has pioneered the development of coated conductors – high-temperature superconducting (HTS) tapes – which permit dramatically greater current densities than conventional copper cable, and enable new technologies to secure the national electric grid. Sustained world-class research from concept, demonstration, transfer, and ongoing industrial support has moved this idea from the laboratory to the commercial marketplace.

  9. GridMan: A grid manipulation system

    NASA Technical Reports Server (NTRS)

    Eiseman, Peter R.; Wang, Zhu

    1992-01-01

    GridMan is an interactive grid manipulation system. It operates on grids to produce new grids which conform to user demands. The input grids are not constrained to come from any particular source. They may be generated by algebraic methods, elliptic methods, hyperbolic methods, parabolic methods, or some combination of methods. These methods are included in the various available structured grid generation codes, which perform the basic assembly function for the various elements of the initial grid. For block structured grids, the assembly can be quite complex due to the large number of block corners, edges, and faces for which various connections and orientations must be properly identified. The grid generation codes are distinguished among themselves by their balance between interactive and automatic actions and by their modest variations in control. The basic form of GridMan provides a much more substantial level of grid control and will take its input from any of the structured grid generation codes. The communication link to the outside codes is a data file which contains the grid or a section of the grid.

  10. Pedagogical experimentation with participatory science in a European class in France.

    NASA Astrophysics Data System (ADS)

    Burgio, Marion

    2015-04-01

    In France, a European class is a class in which a subject is taught in a foreign language, for example science in English. In my European class, over a seven-week session, I led group-work activities on different participatory science actions, with groups of three or four 16-year-old students. Each group chose one type of participatory science activity: leading a videoconference with an IODP mission on board the JOIDES Resolution, or being part of a "science songs community" with Tom McFadden. The students divided the work: some studied the websites and contacted the people involved to present the pedagogical or scientific background of their subject, while others worked on a concrete production, such as organizing a videoconference with the JOIDES Resolution or creating a pedagogical song about geology. I will present some results of their work and explain the students' motivation linked to this active learning method.

  11. "They Sweat for Science": The Harvard Fatigue Laboratory and Self-Experimentation in American Exercise Physiology.

    PubMed

    Johnson, Andi

    2015-08-01

    In many scientific fields, the practice of self-experimentation waned over the course of the twentieth century. For exercise physiologists working today, however, the practice of self-experimentation is alive and well. This paper considers the role of the Harvard Fatigue Laboratory and its scientific director, D. Bruce Dill, in legitimizing the practice of self-experimentation in exercise physiology. Descriptions of self-experimentation are drawn from papers published by members of the Harvard Fatigue Lab. Attention is paid to the ethical and practical justifications for self-experimentation in both the lab and the field. Born out of the practical, immediate demands of fatigue protocols, self-experimentation performed the long-term, epistemological function of uniting physiological data across time and space, enabling researchers to contribute to a general human biology program.

  12. Science to the people! (and experimental politics): searching for the roots of participatory discourse in science and technology in the 1970s in France.

    PubMed

    Quet, Mathieu

    2014-08-01

    The current conception of political participation in governmental institutions is deeply marked by the notions of deliberation and precaution. This normative conception of participatory politics neglects, backgrounds or disqualifies other participatory practices, in so far as they are not connected to deliberation and precaution. However, participation has not always been defined in such a restricted way: the current conception of participation is a product of the 1980s and 1990s. In this paper, the meaning ascribed to the notion of participation in the 1970s in France is explored through the study of discourses produced in three fields: the Science Policy Division of the OECD, the French radical science movement, and the emerging STS academic field. As is shown, some of the bases of the current notion of participation originate in the 1970s. Nevertheless, it is argued that in these years, the notion of participation has more to do with experimentation than with deliberation and precaution. Therefore, the conception of participation in the 1970s differs greatly from the current one. Methodologically, this paper combines tools offered by the social history of science and the French school of discourse analysis.

  13. A Case-Based Approach Improves Science Students' Experimental Variable Identification Skills

    ERIC Educational Resources Information Center

    Grunwald, Sandra; Hartman, Andrew

    2010-01-01

    Incorporation of experimental case studies into the laboratory curriculum increases students' abilities to identify experimental variables that affect the outcome of an experiment. Here the authors describe how such case studies were incorporated using an online course management system into a biochemistry laboratory curriculum and the assessment…

  14. Data Grid Management Systems

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.; Jagatheesan, Arun; Rajasekar, Arcot; Wan, Michael; Schroeder, Wayne

    2004-01-01

    The "Grid" is an emerging infrastructure for coordinating access across autonomous organizations to distributed, heterogeneous computation and data resources. Data grids are being built around the world as the next generation data handling systems for sharing, publishing, and preserving data residing on storage systems located in multiple administrative domains. A data grid provides logical namespaces for users, digital entities and storage resources to create persistent identifiers for controlling access, enabling discovery, and managing wide area latencies. This paper introduces data grids and describes data grid use cases. The relevance of data grids to digital libraries and persistent archives is demonstrated, and research issues in data grids and grid dataflow management systems are discussed.

  15. Qualitative Quantitative and Experimental Concept Possession, Criteria for Identifying Conceptual Change in Science Education

    ERIC Educational Resources Information Center

    Lappi, Otto

    2013-01-01

    Students sometimes misunderstand or misinterpret scientific content because of persistent misconceptions that need to be overcome by science education--a learning process typically called conceptual change. The acquisition of scientific content matter thus requires a transformation of the initial knowledge-state of a common-sense picture of the…

  16. Getting "What Works" Working: Building Blocks for the Integration of Experimental and Improvement Science

    ERIC Educational Resources Information Center

    Peterson, Amelia

    2016-01-01

    As a systemic approach to improving educational practice through research, "What Works" has come under repeated challenge from alternative approaches, most recently that of improvement science. While "What Works" remains a dominant paradigm for centralized knowledge-building efforts, there is need to understand why this…

  17. Mathematics Through Science, Part III: An Experimental Approach to Functions. Teacher's Commentary. Revised Edition.

    ERIC Educational Resources Information Center

    Bolduc, Elroy J., Jr.; And Others

    The purpose of this project is to teach learning and understanding of mathematics at the ninth grade level through the use of science experiments. This part of the program contains significant amounts of material normally found in a beginning algebra class. The material should be found useful for classes in general mathematics as a preparation for…

  18. Combined numerical and experimental investigations of local hydrodynamics and coolant flow mass transfer in Kvadrat-type fuel assemblies of PWR reactors with mixing grids

    NASA Astrophysics Data System (ADS)

    Dmitriev, S. M.; Samoilov, O. B.; Khrobostov, A. E.; Varentsov, A. V.; Dobrov, A. A.; Doronkov, D. V.; Sorokin, V. D.

    2014-08-01

    Results of research on local hydrodynamics and coolant-flow mass transfer in the characteristic zones of PWR reactor fuel assemblies fitted with belts of mixing spacer grids are presented. The investigations were carried out on an aerodynamic rig using the admixture diffusion (tracer-gas) method. Certain specific features of coolant flow in the fuel rod bundles of Kvadrat-type fuel assemblies were revealed during the experiments. The obtained results were included in the database used for verifying computational fluid dynamics codes and for detailed cell-wise calculations of reactor cores with Kvadrat-type fuel assemblies. The results can also be used for more exact determination of local coolant flow hydrodynamic and mass transfer characteristics when assessing the thermal reliability of PWR reactor cores.

  19. Science, suffrage, and experimentation: Mary Putnam Jacobi and the controversy over vivisection in late nineteenth-century America.

    PubMed

    Bittel, Carla Jean

    2005-01-01

    This article examines the medical activism of the New York physician Mary Putnam Jacobi (1842-1906), to illustrate the problems of gender and science at the center of the vivisection debate in late nineteenth-century America. In the post-Civil War era, individuals both inside and outside the medical community considered vivisection to be a controversial practice. Physicians divided over the value of live animal experimentation, while reformers and activists campaigned against it. Jacobi stepped into the center of the controversy and tried to use her public defense of experimentation to the advantage of women in the medical profession. Her advocacy of vivisection was part of her broader effort to reform medical education, especially at women's institutions. It was also a political strategy aimed at associating women with scientific practices to advance a women's rights agenda. Her work demonstrates how debates over women in medicine and science in medicine, suffrage, and experimentation overlapped at a critical moment of historical transition. PMID:16327083

  20. Critical need for family-based, quasi-experimental designs in integrating genetic and social science research.

    PubMed

    D'Onofrio, Brian M; Lahey, Benjamin B; Turkheimer, Eric; Lichtenstein, Paul

    2013-10-01

    Researchers have identified environmental risks that predict subsequent psychological and medical problems. Based on these correlational findings, researchers have developed and tested complex developmental models and have examined biological moderating factors (e.g., gene-environment interactions). In this context, we stress the critical need for researchers to use family-based, quasi-experimental designs when trying to integrate genetic and social science research involving environmental variables because these designs rigorously examine causal inferences by testing competing hypotheses. We argue that sibling comparison, offspring of twins or siblings, in vitro fertilization designs, and other genetically informed approaches play a unique role in bridging gaps between basic biological and social science research. We use studies on maternal smoking during pregnancy to exemplify these principles.

  1. Science.

    ERIC Educational Resources Information Center

    Roach, Linda E., Ed.

    This document contains the following papers on science instruction and technology: "A 3-D Journey in Space: A New Visual Cognitive Adventure" (Yoav Yair, Rachel Mintz, and Shai Litvak); "Using Collaborative Inquiry and Interactive Technologies in an Environmental Science Project for Middle School Teachers: A Description and Analysis" (Patricia…

  2. A New Elliptical Grid Clustering Method

    NASA Astrophysics Data System (ADS)

    Guansheng, Zheng

    A new grid-based clustering method is presented in this paper. The method first performs unsupervised learning on high-dimensional data. It maps the data onto a multi-dimensional space and applies a linear transformation to the feature space, instead of to the objects themselves, before applying a grid-clustering step. Unlike conventional methods, it uses multidimensional hyper-ellipse grid cells. Some case studies and ideas on how to use the algorithm are described. The experimental results show that EGC can discover clusters with irregular shapes.
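
    The general approach (transform the feature space linearly, bin the transformed points into grid cells, and treat densely occupied cells as cluster cores) can be sketched as below. This is a plain grid-density illustration under assumed data and an assumed transformation matrix; it is not the EGC algorithm itself.

```python
# Grid-density clustering sketch: linearly transform the features, bin points
# into grid cells, and report cells whose occupancy exceeds a threshold.
# Data, transformation, and threshold are assumptions for illustration.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)
cluster_a = rng.normal([0.0, 0.0], 0.3, size=(100, 2))
cluster_b = rng.normal([3.0, 1.0], 0.3, size=(100, 2))
data = np.vstack([cluster_a, cluster_b])

A = np.array([[1.0, 0.5],            # assumed linear transformation of the
              [0.0, 1.0]])           # feature space (a simple shear)
transformed = data @ A.T

cell_size = 1.0
cells = defaultdict(list)
for idx, point in enumerate(transformed):
    cells[tuple(np.floor(point / cell_size).astype(int))].append(idx)

dense = {cell: members for cell, members in cells.items() if len(members) >= 15}
print("dense cells and their sizes:", {c: len(m) for c, m in dense.items()})
```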

  3. Modeling of the charge-state separation at ITEP experimental facility for material science based on a Bernas ion source.

    PubMed

    Barminova, H Y; Saratovskyh, M S

    2016-02-01

    An experiment automation system is to be developed for the experimental facility for materials science at ITEP, which is based on a Bernas ion source. The program CAMFT is expected to be incorporated into the experiment automation software. CAMFT was developed to simulate the motion of intense charged-particle bunches in external magnetic fields of arbitrary geometry by means of an accurate solution of the particle equations of motion. The program allows bunch intensities of up to 10^10 particles per bunch to be considered. Preliminary calculations were performed on the ITEP supercomputer. Results of simulations of the beam pre-acceleration and the subsequent turn in the magnetic field are presented for different initial conditions.
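
    The underlying task, integrating the equation of motion of a charged particle in an external magnetic field, can be illustrated with a standard Boris-type push. The sketch below is a generic single-particle example with assumed field strength, charge-to-mass ratio, and time step; it is not the CAMFT program.

```python
# Generic Boris-type integration of dv/dt = (q/m) v x B for one particle.
# Field, particle parameters, and step count are assumed for illustration.
import numpy as np

q_over_m = 9.58e7                     # proton charge-to-mass ratio, C/kg
B = np.array([0.0, 0.0, 0.1])         # uniform 0.1 T field along z (assumed)
dt = 1.0e-9

x = np.zeros(3)
v = np.array([1.0e5, 0.0, 0.0])       # 100 km/s initial velocity

for _ in range(1000):
    t = 0.5 * q_over_m * B * dt       # half-step rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v + np.cross(v, t)
    v = v + np.cross(v_prime, s)      # rotate the velocity about B
    x = x + v * dt                    # advance the position

print("position after 1000 steps on the gyro-orbit:", x)
print("speed drift (should be ~0):", abs(np.linalg.norm(v) - 1.0e5))
```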

  4. Using newly-designed lint cleaner grid bars to remove seed coat fragments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An experiment was conducted to remove seed coat fragments at the saw-type lint cleaner using newly-designed grid bars. The test consisted of five experimental grid bar designs and one control. The experimental grid bars had angles from the sharp toe of the grid bar (or the angle from vertical) of ...

  5. Spatial services grid

    NASA Astrophysics Data System (ADS)

    Cao, Jian; Li, Qi; Cheng, Jicheng

    2005-10-01

    This paper discusses the concept, key technologies and main applications of the Spatial Services Grid. Grid computing and Web service technologies are playing a revolutionary role in the study of spatial information services. The concept of the SSG (Spatial Services Grid) is put forward based on the SIG (Spatial Information Grid) and OGSA (Open Grid Services Architecture). Firstly, grid computing is reviewed, together with the key technologies of the SIG and their main applications. Secondly, the relationships among grid computing and three kinds of SIG in the broad sense--the SDG (spatial data grid), the SIG (spatial information grid) and the SSG (spatial services grid)--are discussed. Thirdly, the key technologies of the SSG are put forward. Finally, three representative applications of the SSG are discussed. The first is an urban location-based services grid, which is a typical spatial services grid and can be built on OGSA and a digital city platform. The second is a regional sustainable development grid, which is key to urban development. The third is a regional disaster and emergency management services grid.

  6. Experimental and credentialing capital: an adaptable framework for facilitating science outreach for underrepresented youth.

    PubMed

    Drazan, John F; D'Amato, Anthony R; Winkelman, Max A; Littlejohn, Aaron J; Johnson, Christopher; Ledet, Eric H; Eglash, Ron

    2015-08-01

    Increasing the numbers of black, latino and native youth in STEM careers is both an important way to reduce poverty in low income communities, and a contribution to the diversity of thought and experience that drives STEM research. But underrepresented youth are often alienated from STEM. Two new forms of social capital have been identified that can be combined to create a learning environment in which students and researchers can meet and explore an area of shared interest. Experimental capital refers to the intrinsic motivation that students can develop when they learn inquiry techniques for exploring topics that they feel ownership over. Credentialing capital denotes a shared interest and ability between all parties engaged in the experimental endeavor. These two forms of social capital form an adaptable framework for researchers to use to create effective outreach programs. In this case study sports biomechanics was utilized as the area of shared interest and understanding the slam dunk was used as experimental capital.

  7. Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses

    NASA Astrophysics Data System (ADS)

    Wieman, Carl

    2015-09-01

    Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and unsatisfying, and "cookbook." Here, both to help understand the reason for such discrepant views and to aid in the design of instructional lab courses, I compare the mental tasks or types of thinking ("cognitive task analysis") associated with a physicist doing tabletop experimental research with the cognitive tasks of students in an introductory physics instructional lab involving traditional verification/confirmation exercises.

  8. Virtual and Physical Experimentation in Inquiry-Based Science Labs: Attitudes, Performance and Access

    ERIC Educational Resources Information Center

    Pyatt, Kevin; Sims, Rod

    2012-01-01

    This study investigated the learning dimensions that occur in physical and virtual inquiry-based lab investigations, in first-year secondary chemistry classes. This study took place over a 2 year period and utilized an experimental crossover design which consisted of two separate trials of laboratory investigation. Assessment data and attitudinal…

  9. Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses

    ERIC Educational Resources Information Center

    Wieman, Carl

    2015-01-01

    Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and…

  10. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis

    2014-06-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGS) is well supported and made easier through the use of input flags and parameters.

  11. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    SciTech Connect

    Box, Dennis

    2014-01-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGS) is well supported and made easier through the use of input flags and parameters.

  12. Grid-free compressive beamforming.

    PubMed

    Xenaki, Angeliki; Gerstoft, Peter

    2015-04-01

    The direction-of-arrival (DOA) estimation problem involves the localization of a few sources from a limited number of observations on an array of sensors, thus it can be formulated as a sparse signal reconstruction problem and solved efficiently with compressive sensing (CS) to achieve high-resolution imaging. On a discrete angular grid, the CS reconstruction degrades due to basis mismatch when the DOAs do not coincide with the angular directions on the grid. To overcome this limitation, a continuous formulation of the DOA problem is employed and an optimization procedure is introduced, which promotes sparsity on a continuous optimization variable. The DOA estimation problem with infinitely many unknowns, i.e., source locations and amplitudes, is solved over a few optimization variables with semidefinite programming. The grid-free CS reconstruction provides high-resolution imaging even with non-uniform arrays, single-snapshot data and under noisy conditions as demonstrated on experimental towed array data.
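
    The grid-based formulation described above (a steering matrix over a discrete angular grid and a sparse source vector recovered by l1-regularized least squares) can be sketched as follows. The array geometry, noise level, and the plain ISTA solver are assumptions chosen for illustration; the paper's actual grid-free semidefinite-programming method is not shown here.

```python
# Grid-based compressive DOA sketch: steering matrix on a discrete angular
# grid, sparse recovery via ISTA for min 0.5*||Ax - y||^2 + lam*||x||_1.
# Geometry, noise, and solver settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, spacing = 20, 0.5                       # half-wavelength line array
angles = np.deg2rad(np.arange(-90, 91, 2.0))       # discrete angular grid
positions = np.arange(n_sensors) * spacing

def steering(theta):
    return np.exp(2j * np.pi * positions * np.sin(theta))

A = np.column_stack([steering(t) for t in angles])  # sensors x grid points

true_doas = np.deg2rad([-20.0, 30.0])
y = sum(steering(t) for t in true_doas)
y = y + 0.05 * (rng.standard_normal(n_sensors) + 1j * rng.standard_normal(n_sensors))

lam = 2.0
step = 1.0 / np.linalg.norm(A, 2) ** 2              # 1 / (largest singular value)^2
x = np.zeros(A.shape[1], dtype=complex)
for _ in range(500):
    grad = A.conj().T @ (A @ x - y)
    z = x - step * grad
    # Complex soft-thresholding: shrink the magnitude, keep the phase.
    x = np.maximum(np.abs(z) - step * lam, 0.0) * np.exp(1j * np.angle(z))

peaks = np.argsort(np.abs(x))[-2:]
print("estimated DOAs (deg):", np.sort(np.rad2deg(angles[peaks])))
```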

  13. Grid-free compressive beamforming.

    PubMed

    Xenaki, Angeliki; Gerstoft, Peter

    2015-04-01

    The direction-of-arrival (DOA) estimation problem involves the localization of a few sources from a limited number of observations on an array of sensors, thus it can be formulated as a sparse signal reconstruction problem and solved efficiently with compressive sensing (CS) to achieve high-resolution imaging. On a discrete angular grid, the CS reconstruction degrades due to basis mismatch when the DOAs do not coincide with the angular directions on the grid. To overcome this limitation, a continuous formulation of the DOA problem is employed and an optimization procedure is introduced, which promotes sparsity on a continuous optimization variable. The DOA estimation problem with infinitely many unknowns, i.e., source locations and amplitudes, is solved over a few optimization variables with semidefinite programming. The grid-free CS reconstruction provides high-resolution imaging even with non-uniform arrays, single-snapshot data and under noisy conditions as demonstrated on experimental towed array data. PMID:25920844

  14. Grid-based Visualization Framework

    NASA Astrophysics Data System (ADS)

    Thiebaux, M.; Tangmunarunkit, H.; Kesselman, C.

    2003-12-01

    Advances in science and engineering have put high demands on tools for high-performance large-scale visual data exploration and analysis. For example, earthquake scientists can now study earthquake phenomena from first principle physics-based simulations. These simulations can generate large amounts of data, possibly high spatial resolution, and long time series. Single-system visualization software running on commodity machines cannot scale up to the large amounts of data generated by these simulations. To address this problem, we propose a flexible and extensible Grid-based visualization framework for time-critical, interactively controlled visual browsing of spatially and temporally large datasets in a Grid environment. Our framework leverages Grid resources for scalable computation and data storage to maintain performance and interactivity with large visualization jobs. Our framework utilizes Globus Toolkit 2.4 components for security (i.e., GSI), resource allocation and management (i.e., DUROC, GRAM) and communication (i.e., Globus-IO) to couple commodity desktops with remote, scalable storage and computational resources in a Grid for interactive data exploration. There are two major components in this framework---Grid Data Transport (GDT) and the Grid Visualization Utility (GVU). GDT provides libraries for performing parallel data filtering and parallel data exchange among Grid resources. GDT allows arbitrary data filtering to be integrated into the system. It also facilitates multi-tiered pipeline topology construction of compute resources and displays. In addition to scientific visualization applications, GDT can be used to support other applications that require parallel processing and parallel transfer of partial ordered independent files, such as file-set transfer. On top of GDT, we have developed the Grid Visualization Utility (GVU), which is designed to assist visualization dataset management, including file formatting, data transport and automatic

  15. Experimental education of Astronomy across the seedbeds of investigation in sciences

    NASA Astrophysics Data System (ADS)

    Taborda, E.

    2009-05-01

    In Colombia, our geographic location helps when carrying out academic work in astronomical observation: almost the whole night sky of both the northern and southern hemispheres can be observed in a single night. This makes our work as educators easier and helps students learn astronomy and the related sciences and connect them with fundamental areas such as mathematics, physics, chemistry, biology, art, technology, geography and history, among others. In our presentation we will show the results of three years of this descriptive study carried out with primary and high school students. We are seeking financial support to attend this event.

  16. Large-Scale Experimental Planetary Science Meets Planetary Defense: Deorbiting an Asteroidal Satellite

    NASA Technical Reports Server (NTRS)

    Cintala, M. J.; Durda, D. D.; Housen, K. R.

    2005-01-01

    Other than remote-sensing and spacecraft-derived data, the only information that exists regarding the physical and chemical properties of asteroids is that inferred through calculations, numerical simulations, extrapolation of experiments, and meteorite studies. Our understanding of the dynamics of accretion of planetesimals, collisional disruption of asteroids, and the macroscopic, shock-induced modification of the surfaces of such small objects is also, for the most part, founded on similar inferences. While considerable strides have been made in improving the state of asteroid science, too many unknowns remain to assert that we understand the parameters necessary for the more practical problem of deflecting an asteroid or asteroid pair on an Earth-intersecting trajectory. Many of these deficiencies could be reduced or eliminated by intentionally deorbiting an asteroidal satellite and monitoring the resulting collision between it and the primary asteroid, a capability that is well within the limitations of current technology.

  17. Parallel grid population

    SciTech Connect

    Wald, Ingo; Ize, Santiago

    2015-07-28

    Parallel population of a grid with a plurality of objects using a plurality of processors. One example embodiment is a method for parallel population of a grid with a plurality of objects using a plurality of processors. The method includes a first act of dividing a grid into n distinct grid portions, where n is the number of processors available for populating the grid. The method also includes acts of dividing a plurality of objects into n distinct sets of objects, assigning a distinct set of objects to each processor such that each processor determines by which distinct grid portion(s) each object in its distinct set of objects is at least partially bounded, and assigning a distinct grid portion to each processor such that each processor populates its distinct grid portion with any objects that were previously determined to be at least partially bounded by its distinct grid portion.
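
    A minimal sketch of the two-phase scheme described above, using a one-dimensional grid split into slabs and Python processes standing in for the processors. The data layout, helper names and interval-overlap test are assumptions made for illustration, not the patented implementation.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

N_PORTIONS = 4                                            # one grid portion (and one object set) per processor
PORTION_EDGES = np.linspace(0.0, 1.0, N_PORTIONS + 1)     # split a unit 1-D grid into slabs

def classify(objects):
    """Phase 1: for a worker's object set, decide which portion(s) at least partially bound each object."""
    buckets = {p: [] for p in range(N_PORTIONS)}
    for lo, hi in objects:                                # objects are 1-D intervals here
        for p in range(N_PORTIONS):
            if lo < PORTION_EDGES[p + 1] and hi > PORTION_EDGES[p]:
                buckets[p].append((float(lo), float(hi)))
    return buckets

def populate(args):
    """Phase 2: a worker populates its assigned portion with every object bound by it."""
    portion, objs = args
    return portion, sorted(objs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    starts = rng.random(1000)
    objects = np.column_stack([starts, starts + 0.05])    # 1000 small intervals
    object_sets = np.array_split(objects, N_PORTIONS)     # one distinct object set per processor

    with ProcessPoolExecutor(max_workers=N_PORTIONS) as pool:
        partials = list(pool.map(classify, object_sets))                           # phase 1 in parallel
        merged = {p: [o for part in partials for o in part[p]] for p in range(N_PORTIONS)}
        populated = dict(pool.map(populate, merged.items()))                       # phase 2 in parallel
    print({p: len(v) for p, v in populated.items()})
```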

  18. Proposal for grid computing for nuclear applications

    SciTech Connect

    Idris, Faridah Mohamad; Ismail, Saaidi; Haris, Mohd Fauzi B.; Sulaiman, Mohamad Safuan B.; Aslan, Mohd Dzul Aiman Bin.; Samsudin, Nursuliza Bt.; Ibrahim, Maizura Bt.; Ahmad, Megat Harun Al Rashid B. Megat; Yazid, Hafizal B.; Jamro, Rafhayudi B.; Azman, Azraf B.; Rahman, Anwar B. Abdul; Ibrahim, Mohd Rizal B. Mamat; Muhamad, Shalina Bt. Sheik; Hassan, Hasni; Abdullah, Wan Ahmad Tajuddin Wan; Ibrahim, Zainol Abidin; Zolkapli, Zukhaimira; Anuar, Afiq Aizuddin; Norjoharuddeen, Nurfikri; and others

    2014-02-12

    The use of computer clusters for computational sciences, including computational physics, is vital as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form that supplies computational power to any node within the grid that needs it has now become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to run the application and speed up the computing process.
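
    To make the Monte Carlo use case concrete, the sketch below splits a simple hit-or-miss estimate of pi across worker processes, each process standing in for a node that receives a share of the samples. The worker count and sample sizes are arbitrary placeholders, not details from the paper.

```python
import numpy as np
from multiprocessing import Pool

def mc_chunk(args):
    """One worker's share of a Monte Carlo estimate of pi (hit-or-miss in the unit square)."""
    n_samples, seed = args
    rng = np.random.default_rng(seed)
    x, y = rng.random(n_samples), rng.random(n_samples)
    return int(np.count_nonzero(x * x + y * y <= 1.0))

if __name__ == "__main__":
    n_workers, n_per_worker = 8, 1_000_000                # each "node" crunches its own block
    jobs = [(n_per_worker, seed) for seed in range(n_workers)]
    with Pool(n_workers) as pool:
        hits = sum(pool.map(mc_chunk, jobs))
    print("pi ~", 4.0 * hits / (n_workers * n_per_worker))
```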

  19. Proposal for grid computing for nuclear applications

    NASA Astrophysics Data System (ADS)

    Idris, Faridah Mohamad; Abdullah, Wan Ahmad Tajuddin Wan; Ibrahim, Zainol Abidin; Zolkapli, Zukhaimira; Anuar, Afiq Aizuddin; Norjoharuddeen, Nurfikri; Ali, Mohd Adli bin Md; Mohamed, Abdul Aziz; Ismail, Roslan; Ahmad, Abdul Rahim; Ismail, Saaidi; Haris, Mohd Fauzi B.; Sulaiman, Mohamad Safuan B.; Aslan, Mohd Dzul Aiman Bin.; Samsudin, Nursuliza Bt.; Ibrahim, Maizura Bt.; Ahmad, Megat Harun Al Rashid B. Megat; Yazid, Hafizal B.; Jamro, Rafhayudi B.; Azman, Azraf B.; Rahman, Anwar B. Abdul; Ibrahim, Mohd Rizal B. Mamat @; Muhamad, Shalina Bt. Sheik; Hassan, Hasni; Sjaugi, Farhan

    2014-02-01

    The use of computer clusters for computational sciences, including computational physics, is vital as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form that supplies computational power to any node within the grid that needs it has now become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to run the application and speed up the computing process.

  20. Research exemption/experimental use in the European Union: patents do not block the progress of science.

    PubMed

    Jaenichen, Hans-Rainer; Pitz, Johann

    2015-02-01

    In the public debate about patents, specifically in the area of biotechnology, the position has been taken that patents block the progress of science. As we demonstrate in this review, this is not the case in the European Union (EU). The national patent acts of the EU member states define research and experimental use exemptions from patent infringement that allow sufficient room for research activities to promote innovation. This review provides a comparative overview of the legal requirements and the extent and limitations of experimental use exemptions, including the so-called Bolar provision, in Germany, the United Kingdom, France, Spain, Italy, and The Netherlands. The legal framework in the respective countries is illustrated with reference to practical examples concerning tests on patent-protected genetic targets and antibodies. Specific questions concerning the use of patent-protected research tools, the outsourcing of research activities, and the use of preparatory and supplying acts for experimental purposes that are necessary for conducting experiments are covered.

  1. Research exemption/experimental use in the European Union: patents do not block the progress of science.

    PubMed

    Jaenichen, Hans-Rainer; Pitz, Johann

    2015-02-01

    In the public debate about patents, specifically in the area of biotechnology, the position has been taken that patents block the progress of science. As we demonstrate in this review, this is not the case in the European Union (EU). The national patent acts of the EU member states define research and experimental use exemptions from patent infringement that allow sufficient room for research activities to promote innovation. This review provides a comparative overview of the legal requirements and the extent and limitations of experimental use exemptions, including the so-called Bolar provision, in Germany, the United Kingdom, France, Spain, Italy, and The Netherlands. The legal framework in the respective countries is illustrated with reference to practical examples concerning tests on patent-protected genetic targets and antibodies. Specific questions concerning the use of patent-protected research tools, the outsourcing of research activities, and the use of preparatory and supplying acts for experimental purposes that are necessary for conducting experiments are covered. PMID:25377145

  2. The role of experimental science in ICF -- examples from X-ray diagnostics and targets

    NASA Astrophysics Data System (ADS)

    Kilkenny, J. D.

    2016-10-01

    The USA Inertial Confinement Fusion (ICF) Program evolved from the Nuclear Test Program which had restricted shot opportunities for experimentalists to develop sophisticated experimental techniques. In contrast the ICF program in the US was able to increase the shot availability on its large facilities, and develop sophisticated targets and diagnostics to measure and understand the properties of the high energy density plasmas (HEDP) formed. Illustrative aspects of this evolution at Lawrence Livermore National Laboratory (LLNL), with examples of the development of diagnostics and target fabrication are described.

  3. GNARE: an environment for Grid-based high-throughput genome analysis.

    SciTech Connect

    Sulakhe, D.; Rodriguez, A.; D'Souza, M.; Wilde, M.; Nefedova, V.; Foster, I.; Maltsev, N.; Mathematics and Computer Science; Univ. of Chicago

    2005-01-01

    Recent progress in genomics and experimental biology has brought exponential growth of the biological information available for computational analysis in public genomics databases. However, applying the potentially enormous scientific value of this information to the understanding of biological systems requires computing and data storage technology of an unprecedented scale. The grid, with its aggregated and distributed computational and storage infrastructure, offers an ideal platform for high-throughput bioinformatics analysis. To leverage this we have developed the Genome Analysis Research Environment (GNARE) - a scalable computational system for the high-throughput analysis of genomes, which provides an integrated database and computational backend for data-driven bioinformatics applications. GNARE efficiently automates the major steps of genome analysis including acquisition of data from multiple genomic databases; data analysis by a diverse set of bioinformatics tools; and storage of results and annotations. High-throughput computations in GNARE are performed using distributed heterogeneous grid computing resources such as Grid2003, TeraGrid, and the DOE science grid. Multi-step genome analysis workflows involving massive data processing, the use of application-specific tools and algorithms and updating of an integrated database to provide interactive Web access to results are all expressed and controlled by a 'virtual data' model which transparently maps computational workflows to distributed grid resources. This paper describes how Grid technologies such as Globus, Condor, and the Gryphyn virtual data system were applied in the development of GNARE. It focuses on our approach to Grid resource allocation and to the use of GNARE as a computational framework for the development of bioinformatics applications.
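
    A toy illustration of the multi-step workflow idea above: steps of a small genome-analysis pipeline are dispatched in dependency order, with independent steps running concurrently. The step names and dispatch mechanism are illustrative assumptions, not the GNARE or virtual-data interfaces.

```python
from graphlib import TopologicalSorter          # Python 3.9+
from concurrent.futures import ThreadPoolExecutor

# A toy three-stage analysis workflow: fetch genome data, run two tools, store annotations.
workflow = {
    "fetch_sequences":   set(),
    "run_blast":         {"fetch_sequences"},
    "run_hmmer":         {"fetch_sequences"},
    "store_annotations": {"run_blast", "run_hmmer"},
}

def run_step(name):
    print(f"dispatching {name} to a grid resource")       # stand-in for a real job submission
    return name

ts = TopologicalSorter(workflow)
ts.prepare()
with ThreadPoolExecutor(max_workers=4) as pool:
    while ts.is_active():
        ready = list(ts.get_ready())                      # steps whose dependencies are satisfied
        for done in pool.map(run_step, ready):            # run independent steps concurrently
            ts.done(done)
```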

  4. A Moving Grid Capability for NPARC

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    1998-01-01

    Version 3.1 of the NPARC computational fluid dynamics flow solver introduces a capability to solve unsteady flow on moving multi-block, structured grids with nominally second-order time accuracy. The grid motion is due to segments of the boundary grid that translate and rotate in a rigid-body manner or deform. The grid is regenerated at each time step to accommodate the boundary grid motion. The flow equations and computational models sense the moving grid through the grid velocities, which are computed from a time-difference of the grids at two consecutive time levels. For three-dimensional flow domains, it is assumed that the grid retains a planar character with respect to one coordinate. The application and accuracy of NPARC v3.1 are demonstrated for flow about a flying wedge, a rotating flap, a collapsing bump in a duct, and the unstart/restart flow in a variable-geometry inlet. The results compare well with analytic and experimental results.
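
    The grid-velocity computation mentioned above amounts to a first-order time difference of the node coordinates at two consecutive time levels. A small numpy sketch follows; the array shapes and the rigid-rotation test case are assumptions for illustration only.

```python
import numpy as np

def grid_velocity(x_prev, x_curr, dt):
    """First-order grid velocity from node coordinates at two consecutive time levels.
    x_prev, x_curr: arrays of shape (ni, nj, 2) holding (x, y) for a structured block."""
    return (x_curr - x_prev) / dt

# Toy example: a small structured grid rotated rigidly through d_theta over one time step.
ni, nj, dt, d_theta = 5, 4, 1.0e-3, 0.01
xi, eta = np.meshgrid(np.linspace(0, 1, ni), np.linspace(0, 1, nj), indexing="ij")
x_prev = np.stack([xi, eta], axis=-1)
c, s = np.cos(d_theta), np.sin(d_theta)
x_curr = x_prev @ np.array([[c, s], [-s, c]])             # rotate each (x, y) node by d_theta
v = grid_velocity(x_prev, x_curr, dt)
print(v.shape, np.abs(v).max())
```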

  5. Experimental pain processing in individuals with cognitive impairment: current state of the science.

    PubMed

    Defrin, Ruth; Amanzio, Martina; de Tommaso, Marina; Dimova, Violeta; Filipovic, Sasa; Finn, David P; Gimenez-Llort, Lydia; Invitto, Sara; Jensen-Dahm, Christina; Lautenbacher, Stefan; Oosterman, Joukje M; Petrini, Laura; Pick, Chaim G; Pickering, Gisele; Vase, Lene; Kunz, Miriam

    2015-08-01

    Cognitive impairment (CI) can develop during the course of ageing and is a feature of many neurological and neurodegenerative diseases. Many individuals with CI have substantial, sustained, and complex health care needs, which frequently include pain. However, individuals with CI can have difficulty communicating the features of their pain to others, which in turn presents a significant challenge for effective diagnosis and treatment of their pain. Herein, we review the literature on responsivity of individuals with CI to experimental pain stimuli. We discuss pain responding across a large number of neurological and neurodegenerative disorders in which CI is typically present. Overall, the existing data suggest that pain processing is altered in most individuals with CI compared with cognitively intact matched controls. The precise nature of these alterations varies with the type of CI (or associated clinical condition) and may also depend on the type of pain stimulation used and the type of pain responses assessed. Nevertheless, it is clear that regardless of the etiology of CI, patients do feel noxious stimuli, with more evidence for hypersensitivity than hyposensitivity to these stimuli compared with cognitively unimpaired individuals. Our current understanding of the neurobiological mechanisms underpinning these alterations is limited but may be enhanced through the use of animal models of CI, which also exhibit alterations in nociceptive responding. Further research using additional behavioural indices of pain is warranted. Increased understanding of altered experimental pain processing in CI will facilitate the development of improved diagnostic and therapeutic approaches for pain in individuals with CI.

  6. Experimental pain processing in individuals with cognitive impairment: current state of the science.

    PubMed

    Defrin, Ruth; Amanzio, Martina; de Tommaso, Marina; Dimova, Violeta; Filipovic, Sasa; Finn, David P; Gimenez-Llort, Lydia; Invitto, Sara; Jensen-Dahm, Christina; Lautenbacher, Stefan; Oosterman, Joukje M; Petrini, Laura; Pick, Chaim G; Pickering, Gisele; Vase, Lene; Kunz, Miriam

    2015-08-01

    Cognitive impairment (CI) can develop during the course of ageing and is a feature of many neurological and neurodegenerative diseases. Many individuals with CI have substantial, sustained, and complex health care needs, which frequently include pain. However, individuals with CI can have difficulty communicating the features of their pain to others, which in turn presents a significant challenge for effective diagnosis and treatment of their pain. Herein, we review the literature on responsivity of individuals with CI to experimental pain stimuli. We discuss pain responding across a large number of neurological and neurodegenerative disorders in which CI is typically present. Overall, the existing data suggest that pain processing is altered in most individuals with CI compared with cognitively intact matched controls. The precise nature of these alterations varies with the type of CI (or associated clinical condition) and may also depend on the type of pain stimulation used and the type of pain responses assessed. Nevertheless, it is clear that regardless of the etiology of CI, patients do feel noxious stimuli, with more evidence for hypersensitivity than hyposensitivity to these stimuli compared with cognitively unimpaired individuals. Our current understanding of the neurobiological mechanisms underpinning these alterations is limited but may be enhanced through the use of animal models of CI, which also exhibit alterations in nociceptive responding. Further research using additional behavioural indices of pain is warranted. Increased understanding of altered experimental pain processing in CI will facilitate the development of improved diagnostic and therapeutic approaches for pain in individuals with CI. PMID:26181216

  7. Virtual and Physical Experimentation in Inquiry-Based Science Labs: Attitudes, Performance and Access

    NASA Astrophysics Data System (ADS)

    Pyatt, Kevin; Sims, Rod

    2012-02-01

    This study investigated the learning dimensions that occur in physical and virtual inquiry-based lab investigations, in first-year secondary chemistry classes. This study took place over a 2-year period and utilized an experimental crossover design which consisted of two separate trials of laboratory investigation. Assessment data and attitudinal data were gathered and analyzed to measure the instructional value of physical and virtual lab experiences in terms of student performance and attitudes. Statistical tests for differences of means were conducted on the assessment data. Student attitudes towards virtual experiences in comparison to physical lab experiences were measured using a newly created Virtual and Physical Experimentation Questionnaire (VPEQ). VPEQ was specifically developed for this study, and included new scales of Usefulness of Lab, and Equipment Usability which measured attitudinal dimensions in virtual and physical lab experiences. A factor analysis was conducted for questionnaire data, and reliability of the scales and internal consistency of items within scales were calculated. The new scales were statistically valid and reliable. The instructional value of physical and virtual lab experiences was comparable in terms of student performance. Students showed preference towards the virtual medium in their lab experiences. Students showed positive attitudes towards physical and virtual experiences, and demonstrated a preference towards inquiry-based experiences, physical or virtual. Students found virtual experiences to have higher equipment usability as well as a higher degree of open-endedness. With regard to student access to inquiry-based lab experiences, virtual and online alternatives were viewed favorably by students.
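
    Reliability and internal-consistency checks of the kind reported for the VPEQ scales are commonly computed with a statistic such as Cronbach's alpha. The abstract does not say which statistic was used, so the snippet below is a generic sketch on synthetic Likert-style scores, not a reproduction of the study's analysis.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()            # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)             # variance of the total scale score
    return (k / (k - 1.0)) * (1.0 - item_var / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(120, 1))                        # a single underlying attitude
scores = latent + 0.5 * rng.normal(size=(120, 6))         # six Likert-style items on one scale
print(round(cronbach_alpha(scores), 2))                   # high alpha -> internally consistent scale
```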

  8. Political science. Reverse-engineering censorship in China: randomized experimentation and participant observation.

    PubMed

    King, Gary; Pan, Jennifer; Roberts, Margaret E

    2014-08-22

    Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored. PMID:25146296

  9. Experimental studies in fluid mechanics and materials science using acoustic levitation

    NASA Technical Reports Server (NTRS)

    Trinh, E. H.; Robey, J.; Arce, A.; Gaspar, M.

    1987-01-01

    Ground-based and short-duration low gravity experiments have been carried out with the use of ultrasonic levitators to study the dynamics of freely suspended liquid drops under the influence of predominantly capillary and acoustic radiation forces. Some of the effects of the levitating field on the shape as well as the fluid flow fields within the drop have been determined. The development and refinement of measurement techniques using levitated drops with size on the order of 2 mm in diameter have yielded methods having direct application to experiments in microgravity. In addition, containerless melting, undercooling, and freezing of organic materials as well as low melting metals have provided experimental data and observations on the application of acoustic positioning techniques to materials studies.

  10. Political science. Reverse-engineering censorship in China: randomized experimentation and participant observation.

    PubMed

    King, Gary; Pan, Jennifer; Roberts, Margaret E

    2014-08-22

    Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored.

  11. [Experimental, innovative and standard procedures. Ethics and science in the introduction of medical technology].

    PubMed

    Pons, J M V

    2003-01-01

    The dividing lines between experimental, innovative and standard medical procedures are frequently blurred in current clinical practice. This is even more true in the fields of surgery and implantable devices. These differ substantially from pharmacological interventions, which are better regulated. However, the character of the various medical interventions applied in human subjects should be ethically and scientifically delimited as clearly as possible. This task cannot be abandoned to personal discretion and criteria, which are currently used, especially in the field of surgical innovation. External and independent review of the risk-benefit ratio of proposed innovations should enable specification of the particular features of a technique in the patient-doctor relationship, as well as the ethical and scientific requirements for more appropriate evaluation.

  12. Collar grids for intersecting geometric components within the Chimera overlapped grid scheme

    NASA Technical Reports Server (NTRS)

    Parks, Steven J.; Buning, Pieter G.; Chan, William M.; Steger, Joseph L.

    1991-01-01

    A method for overcoming problems with using the Chimera overset grid scheme in the region of intersecting geometry components is presented. A 'collar grid' resolves the intersection region and provides communication between the component grids. This approach is validated by comparing computed and experimental data for a flow about a wing/body configuration. Application of the collar grid scheme to the Orbiter fuselage and vertical tail intersection in a computation of the full Space Shuttle launch vehicle demonstrates its usefulness for simulation of flow about complex aerospace vehicles.

  13. Corrosion chemistry closing comments: opportunities in corrosion science facilitated by operando experimental characterization combined with multi-scale computational modelling.

    PubMed

    Scully, John R

    2015-01-01

    Recent advances in characterization tools, computational capabilities, and theories have created opportunities for advancement in understanding of solid-fluid interfaces at the nanoscale in corroding metallic systems. The Faraday Discussion on Corrosion Chemistry in 2015 highlighted some of the current needs, gaps and opportunities in corrosion science. Themes were organized into several hierarchical categories that provide an organizational framework for corrosion. Opportunities to develop fundamental physical and chemical data which will enable further progress in thermodynamic and kinetic modelling of corrosion were discussed. These will enable new and better understanding of unit processes that govern corrosion at the nanoscale. Additional topics discussed included scales, films and oxides, fluid-surface and molecular-surface interactions, selected topics in corrosion science and engineering as well as corrosion control. Corrosion science and engineering topics included complex alloy dissolution, local corrosion, and modelling of specific corrosion processes that are made up of collections of temporally and spatially varying unit processes such as oxidation, ion transport, and competitive adsorption. Corrosion control and mitigation topics covered some new insights on coatings and inhibitors. Further advances in operando or in situ experimental characterization strategies at the nanoscale combined with computational modelling will enhance progress in the field, especially if coupling across length and time scales can be achieved incorporating the various phenomena encountered in corrosion. Readers are encouraged not only to use this ad hoc organizational scheme to guide their immersion into the current opportunities in corrosion chemistry, but also to find value in the information presented in their own ways.

  14. Method of grid generation

    DOEpatents

    Barnette, Daniel W.

    2002-01-01

    The present invention provides a method of grid generation that uses the geometry of the problem space and the governing relations to generate a grid. The method can generate a grid with minimized discretization errors, and with minimal user interaction. The method of the present invention comprises assigning grid cell locations so that, when the governing relations are discretized using the grid, at least some of the discretization errors are substantially zero. Conventional grid generation is driven by the problem space geometry; grid generation according to the present invention is driven by problem space geometry and by governing relations. The present invention accordingly can provide two significant benefits: more efficient and accurate modeling since discretization errors are minimized, and reduced cost grid generation since less human interaction is required.

  15. Dynamic Power Grid Simulation

    2015-09-14

    GridDyn is part of a power grid simulation toolkit. The code is designed using modern object-oriented C++ methods, utilizing C++11 and recent Boost libraries, to ensure compatibility with multiple operating systems and environments.

  16. IPG Power Grid Overview

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas

    2003-01-01

    This presentation will describe what is meant by grids and then cover the current state of the IPG. This will include an overview of the middleware that is key to the operation of the grid. The presentation will then describe some of the future directions that are planned for the IPG. Finally the presentation will conclude with a brief overview of the Global Grid Forum, which is a key activity that will contribute to the successful availability of grid components.

  17. Public judgment on science expenditure in the national budget of Japan: An experimental approach to examining the effects of unpacking science.

    PubMed

    Yokoyama, Hiromi M; Nakayachi, Kazuya

    2014-07-01

    How does the public assess an appropriate financial allocation to science promotion? This article empirically examined the subadditivity effect in the judgment of budgetary allocation. Results of the first experiment showed that the ratio of the national budget allocated for science promotion by participants increased when science was decomposed into more specific categories compared to when it was presented as "science promotion" alone. Consistent with these findings, results of the second experiment showed that the allotment ratio to science promotion decreased when the number of other expenditure items increased. Meanwhile, the third experiment revealed that in the case of a budgetary cutback, the total amount taken from science promotion greatly increased when science was decomposed into subcategories. The subadditivity effect and the increase in the total allotment ratio from unpacking science promotion were confirmed by these three experiments not only on budgetary allocation but also on budgetary cutback.

  18. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  19. DOE SciDAC’s Earth System Grid Center for Enabling Technologies Final Report for University of Southern California Information Sciences Institute

    SciTech Connect

    Chervenak, Ann Louise

    2013-12-19

    The mission of the Earth System Grid Federation (ESGF) is to provide the worldwide climate-research community with access to the data, information, model codes, analysis tools, and intercomparison capabilities required to make sense of enormous climate data sets. Its specific goals are to (1) provide an easy-to-use and secure web-based data access environment for data sets; (2) add value to individual data sets by presenting them in the context of other data sets and tools for comparative analysis; (3) address the specific requirements of participating organizations with respect to bandwidth, access restrictions, and replication; (4) ensure that the data are readily accessible through the analysis and visualization tools used by the climate research community; and (5) transfer infrastructure advances to other domain areas. For the ESGF, the U.S. Department of Energy’s (DOE’s) Earth System Grid Center for Enabling Technologies (ESG-CET) team has led international development and delivered a production environment for managing and accessing ultra-scale climate data. This production environment includes multiple national and international climate projects (such as the Community Earth System Model and the Coupled Model Intercomparison Project), ocean model data (such as the Parallel Ocean Program), observation data (Atmospheric Radiation Measurement Best Estimate, Carbon Dioxide Information and Analysis Center, Atmospheric Infrared Sounder, etc.), and analysis and visualization tools, all serving a diverse user community. These data holdings and services are distributed across multiple ESG-CET sites (such as ANL, LANL, LBNL/NERSC, LLNL/PCMDI, NCAR, and ORNL) and at unfunded partner sites, such as the Australian National University National Computational Infrastructure, the British Atmospheric Data Centre, the National Oceanic and Atmospheric Administration Geophysical Fluid Dynamics Laboratory, the Max Planck Institute for Meteorology, the German Climate Computing

  20. Integrating science and language: Creation and testing of a pedagogical model to improve science learning in a francophone minority setting

    NASA Astrophysics Data System (ADS)

    Cormier, Marianne

    The weak science results of students in francophone minority settings on national and international assessments prompted a search for solutions. The goal of this thesis was to create and test a pedagogical model for science teaching in a minority-language setting. Because the student population of this setting shows varying degrees of French-language proficiency, several language elements (writing, discussion and reading) were integrated into science learning. We recommended beginning the learning process with rather informal language elements (journal writing, discussions in dyads...) and progressing toward more formal language activities (writing scientific reports or explanations). With respect to science learning, the model advocated a conceptual-change approach of socio-constructivist inspiration while relying strongly on experiential learning. In testing the model, we wanted to know whether it produced conceptual change in the students and whether, at the same time, their scientific vocabulary was enriched. We also sought to understand how the students experienced their learning within this pedagogical model. A fifth-grade class at the Grande-Digue school, in southeastern New Brunswick, took part in the trial of the model by studying the local salt marshes. In initial interviews, we noticed that the students' knowledge of salt marshes was limited. Although they were aware that marshes are natural places, they could not necessarily describe them with precision. We also found that the students mostly used common words (plants, birds, insects) to describe the marsh. The results obtained indicate that the students have

  1. Understanding The Smart Grid

    SciTech Connect

    2007-11-15

    The report provides an overview of what the Smart Grid is and what is being done to define and implement it. The electric industry is preparing to undergo a transition from a centralized, producer-controlled network to a decentralized, user-interactive one. Not only will the technology involved in the electric grid change, but the entire business model of the industry will change too. A major objective of the report is to identify the changes that the Smart Grid will bring about so that industry participants can be prepared to face them. A concise overview of the development of the Smart Grid is provided. It presents an understanding of what the Smart Grid is, what new business opportunities or risks might come about due to its introduction, and what activities are already taking place regarding defining or implementing the Smart Grid. This report will be of interest to the utility industry, energy service providers, aggregators, and regulators. It will also be of interest to home/building automation vendors, information technology vendors, academics, consultants, and analysts. The scope of the report includes an overview of the Smart Grid which identifies the main components of the Smart Grid, describes its characteristics, and describes how the Smart Grid differs from the current electric grid. The overview also identifies the key concepts involved in the transition to the Smart Grid and explains why a Smart Grid is needed by identifying the deficiencies of the current grid and the need for new investment. The report also looks at the impact of the Smart Grid, identifying other industries which have gone through a similar transition, identifying the overall benefits of the Smart Grid, and discussing the impact of the Smart Grid on industry participants. Furthermore, the report looks at current activities to implement the Smart Grid including utility projects, industry collaborations, and government initiatives. Finally, the report takes a look at key technology

  2. Navigation in Grid Space with the NAS Grid Benchmarks

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We present a navigational tool for computational grids. The navigational process is based on measuring the grid characteristics with the NAS Grid Benchmarks (NGB) and using the measurements to assign tasks of a grid application to the grid machines. The tool allows the user to explore the grid space and to navigate the execution of a grid application to minimize its turnaround time. We introduce the notion of gridscape as a user view of the grid and show how it can be measured by NGB. Then we demonstrate how the gridscape can be used with two different schedulers to navigate a grid application through a rudimentary grid.
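
    The navigation step -- using measured machine characteristics to place the tasks of a grid application -- can be pictured with a simple greedy scheduler: largest tasks first, each assigned to the machine that would finish it earliest given its measured speed. The speed numbers and the scheduling rule are illustrative assumptions, not the NGB tool itself.

```python
def assign_tasks(task_work, machine_speed):
    """Greedy scheduling: largest tasks first, each onto the machine that finishes it earliest.
    task_work: work units per task; machine_speed: measured work units per second per machine."""
    finish = [0.0] * len(machine_speed)                   # current finish time per machine
    schedule = []
    for work in sorted(task_work, reverse=True):
        m = min(range(len(machine_speed)),
                key=lambda i: finish[i] + work / machine_speed[i])
        finish[m] += work / machine_speed[m]
        schedule.append((work, m, finish[m]))
    return schedule, max(finish)

# "Gridscape" measured by a benchmark run: three machines with different effective speeds (made-up ratings).
speeds = [4.0, 2.0, 1.0]
tasks = [30, 25, 20, 10, 10, 5]                           # work units for the tasks of a grid application
_, makespan = assign_tasks(tasks, speeds)
print("estimated turnaround:", makespan)
```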

  3. Systems Manual for the Experimental Literature Collection and Reference Retrieval System of the Center for the Information Sciences. Experimental Retrieval Systems Studies, Report Number 2.

    ERIC Educational Resources Information Center

    Anderson, Ronald R.; Taylor, Robert S.

    The manual describes and documents the retrieval system in terms of its tape and disk file programs and its search programs as used by the Lehigh Center for the Information Sciences for selected current literature of the information sciences, about 2500 document references. The system is presently on-line via teletype and conversion is in process…

  4. Emerging trends: grid technology in pathology.

    PubMed

    Bueno, Gloria; García-Rojo, Marcial; Déniz, Oscar; Fernández-Carrobles, María del Milagro; Vállez, Noelia; Salido, Jesús; García-González, Jesús

    2012-01-01

    Grid technology has enabled clustering and access to, and interaction among, a wide variety of geographically distributed resources such as supercomputers, storage systems, data sources, instruments as well as special devices and services, realizing network-centric operations. Their main applications include large scale computational and data intensive problems in science and engineering. Grids are likely to have a deep impact on health related applications. Moreover, they seem to be suitable for tissue-based diagnosis. They offer a powerful tool to deal with current challenges in many biomedical domains involving complex anatomical and physiological modeling of structures from images, or the assembly and analysis of large image databases. This chapter analyzes the general structures and functions of a Grid environment implemented for tissue-based diagnosis on digital images. Moreover, it presents a Grid middleware implemented by the authors for diagnostic pathology applications. The chapter is a review of the work done as part of the European COST project EUROTELEPATH. PMID:22925801

  5. Grid enabled Service Support Environment - SSE Grid

    NASA Astrophysics Data System (ADS)

    Goor, Erwin; Paepen, Martine

    2010-05-01

    The SSEGrid project is an ESA/ESRIN project which started in 2009 and is executed by two Belgian companies, Spacebel and VITO, and one Dutch company, Dutch Space. The main project objectives are the introduction of a Grid-based processing on demand infrastructure at the Image Processing Centre for earth observation products at VITO and the inclusion of Grid processing services in the Service Support Environment (SSE) at ESRIN. The Grid-based processing on demand infrastructure is meant to support a Grid processing on demand model for Principal Investigators (PI) and allow the design and execution of multi-sensor applications with geographically spread data while minimising the transfer of huge volumes of data. In the first scenario, 'support a Grid processing on demand model for Principal Investigators', we aim to provide processing power close to the EO-data at the processing and archiving centres. We will allow a PI (non-Grid expert user) to upload his own algorithm, as a process, and his own auxiliary data from the SSE Portal and use them in an earth observation workflow on the SSEGrid Infrastructure. The PI can design and submit workflows using his own processes, processes made available by VITO/ESRIN and possibly processes from other users that are available on the Grid. These activities must be user-friendly and must not require detailed knowledge about the underlying Grid middleware. In the second scenario we aim to design, implement and demonstrate a methodology to set up an earth observation processing facility, which uses large volumes of data from various geographically spread sensors. The aim is to provide solutions for problems that we face today, like wasting bandwidth by copying large volumes of data to one location. We will avoid this by processing the data where they are. The multi-mission Grid-based processing on demand infrastructure will allow developing and executing complex and massive multi-sensor data (re-)processing applications more

  6. Securing smart grid technology

    NASA Astrophysics Data System (ADS)

    Chaitanya Krishna, E.; Kosaleswara Reddy, T.; Reddy, M. YogaTeja; Reddy G. M., Sreerama; Madhusudhan, E.; AlMuhteb, Sulaiman

    2013-03-01

    In developing countries, electrical energy is very important for all-round development, since thousands of dollars can be saved and invested in other sectors. For the growing demand for power, the existing hierarchical, centrally controlled grid of the 20th century is not sufficient. To produce and deliver an effective power supply for industry and people, we need smarter electrical grids that address the challenges of the existing power grid. The smart grid can be considered a modern electric power grid infrastructure for enhanced efficiency and reliability through automated control, high-power converters, a modern communications infrastructure along with modern IT services, sensing and metering technologies, and modern energy management techniques based on the optimization of demand, energy and network availability. The main objective of this paper is to provide a contemporary look at the current state of the art in smart grid communications as well as critical issues in smart grid technologies, primarily information and communication technology (ICT) issues such as security and efficiency at the communications layer. In this paper we propose a new model for security in smart grid technology that contains a Security Module (SM) along with DEM, which will enhance security in the grid. It is expected that this paper will provide a better understanding of the technologies, potential advantages and research challenges of the smart grid and provoke interest among the research community to further explore this promising research area.

  7. Grid Application for the BaBar Experiment

    SciTech Connect

    Khan, A.; Wilson, F.; /Rutherford

    2006-08-14

    This paper discusses the use of e-Science Grid in providing computational resources for modern international High Energy Physics (HEP) experiments. We investigate the suitability of the current generation of Grid software to provide the necessary resources to perform large-scale simulation of the experiment and analysis of data in the context of multinational collaboration.

  8. Impingement-Current-Erosion Characteristics of Accelerator Grids on Two-Grid Ion Thrusters

    NASA Technical Reports Server (NTRS)

    Barker, Timothy

    1996-01-01

    Accelerator grid sputter erosion resulting from charge-exchange-ion impingement is considered to be a primary cause of failure for electrostatic ion thrusters. An experimental method was developed and implemented to measure erosion characteristics of ion-thruster accel-grids for two-grid systems as a function of beam current, accel-grid potential, and facility background pressure. Intricate accelerator grid erosion patterns, which are typically produced in a short time (a few hours), are shown. Accelerator grid volumetric and depth-erosion rates are calculated from these erosion patterns and reported for each of the parameters investigated. A simple theoretical volumetric erosion model yields results that are compared to experimental findings. Results from the model and experiments agree to within 10%, thereby verifying the testing technique. In general, the local distribution of erosion is concentrated in pits between three adjacent holes and trenches that join pits. The shapes of the pits and trenches are shown to be dependent upon operating conditions. Increases in beam current and the accel-grid voltage magnitude lead to deeper pits and trenches. Competing effects cause complex changes in depth-erosion rates as background pressure is increased. Shape factors that describe pits and trenches (i.e. ratio of the average erosion width to the maximum possible width) are also affected in relatively complex ways by changes in beam current, accel-grid voltage magnitude, and background pressure. In all cases, however, gross volumetric erosion rates agree with theoretical predictions.
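
    One common back-of-the-envelope form of a volumetric sputter-erosion estimate is impingement ion flux times sputter yield times the volume occupied per sputtered atom. The functional form and the molybdenum property values below are assumptions chosen for illustration only, not the paper's model or its measured data.

```python
# Generic volumetric sputter-erosion estimate for an accel grid (illustrative assumptions).
E_CHARGE = 1.602e-19          # C per ion
N_AVOGADRO = 6.022e23         # atoms per mol

def volumetric_erosion_rate(i_impingement_A, sputter_yield, molar_mass_g, density_g_cm3):
    """Eroded volume per second (cm^3/s) from charge-exchange-ion impingement."""
    ions_per_s = i_impingement_A / E_CHARGE
    atoms_per_s = ions_per_s * sputter_yield
    cm3_per_atom = molar_mass_g / (density_g_cm3 * N_AVOGADRO)
    return atoms_per_s * cm3_per_atom

# Placeholder numbers for a molybdenum accel grid (not values from the paper).
rate = volumetric_erosion_rate(i_impingement_A=2e-3, sputter_yield=0.4,
                               molar_mass_g=95.95, density_g_cm3=10.2)
print(f"{rate * 3600:.3e} cm^3 eroded per hour")
```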

  9. ITIL and Grid services at GridKa

    NASA Astrophysics Data System (ADS)

    Marten, H.; Koenig, T.

    2010-04-01

    The Steinbuch Centre for Computing (SCC) is a new organizational unit of the Karlsruhe Institute of Technology (KIT). Founded in February 2008 as a merger of the previous Institute for Scientific Computing of Forschungszentrum Karlsruhe and the Computing Centre of the Technical University Karlsruhe, SCC provides a broad spectrum of IT services for 8,000 employees and 18,000 students and carries out research and development in key areas of information technology under the same roof. SCC is also known to host the German WLCG [1] Tier-1 centre GridKa. In order to accompany the merging of the two existing computing centres located at a distance of about 10 km and to provide common first class services for science, SCC has selected the IT service management according to the industrial quasi-standard "IT Infrastructure Library (ITIL)" [3] as a strategic element. The paper discusses the implementation of a few ITIL key components from the perspective of a Scientific Computing Centre using examples of Grid services at GridKa.

  10. Grid computing and biomolecular simulation.

    PubMed

    Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W

    2005-08-15

    Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.
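
    For the Replica Exchange methodology mentioned above, the core operation is a Metropolis-style swap between replicas running at neighbouring temperatures. A minimal sketch of that acceptance test follows; the unit convention, temperatures and energies are placeholders, not values or code from the project described.

```python
import numpy as np

def attempt_swap(energies, betas, i, j, rng):
    """Metropolis criterion for exchanging replicas i and j in parallel tempering:
    accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
    delta = (betas[i] - betas[j]) * (energies[i] - energies[j])
    if delta >= 0 or rng.random() < np.exp(delta):
        energies[i], energies[j] = energies[j], energies[i]   # swap configurations (represented by energies here)
        return True
    return False

rng = np.random.default_rng(2)
k_B = 0.0019872                                   # kcal/(mol K), an assumed unit convention
temps = np.array([300.0, 320.0, 340.0, 360.0])
betas = 1.0 / (k_B * temps)
energies = np.array([-120.0, -115.0, -112.0, -108.0])   # illustrative replica energies
print("swap 0<->1 accepted:", attempt_swap(energies, betas, 0, 1, rng))
```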

  11. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  12. The experimental teaching reform in biochemistry and molecular biology for undergraduate students in Peking University Health Science Center.

    PubMed

    Yang, Xiaohan; Sun, Luyang; Zhao, Ying; Yi, Xia; Zhu, Bin; Wang, Pu; Lin, Hong; Ni, Juhua

    2015-01-01

    Since 2010, second-year undergraduate students of an eight-year training program leading to a Doctor of Medicine degree or Doctor of Philosophy degree in Peking University Health Science Center (PKUHSC) have been required to enter the "Innovative talent training project." During that time, the students joined a research lab and participated in some original research work. There is a critical educational need to prepare these students for the increasing accessibility of research experience. The redesigned experimental curriculum of biochemistry and molecular biology was developed to fulfill such a requirement, which keeps two original biochemistry experiments (Gel filtration and Enzyme kinetics) and adds a new two-experiment component called "Analysis of anti-tumor drug induced apoptosis." The additional component, also known as the "project-oriented experiment" or the "comprehensive experiment," consists of Western blotting and a DNA laddering assay to assess the effects of etoposide (VP16) on the apoptosis signaling pathways. This reformed laboratory teaching system aims to enhance the participating students' overall understanding of important biological research techniques and the instrumentation involved, and to foster a better understanding of the research process all within a classroom setting. Student feedback indicated that the updated curriculum helped them improve their operational and self-learning capability, and helped to increase their understanding of theoretical knowledge and actual research processes, which laid the groundwork for their future research work.

  13. Solar cell grid patterns

    NASA Technical Reports Server (NTRS)

    Yasui, R. K.; Berman, P. A. (Inventor)

    1976-01-01

    A grid pattern is described for a solar cell of the type which includes a semiconductive layer doped to a first polarity and a top counter-doped layer. The grid pattern comprises a plurality of concentric conductive grids of selected geometric shapes which are centered about the center of the exposed active surface of the counter-doped layer. Connected to the grids is one or more conductors which extend to the cell's periphery. For the pattern area, the grids and conductors are arranged in the pattern to minimize the maximum distance which any injected majority carriers have to travel to reach any of the grids or conductors. The pattern has a multiaxes symmetry with respect to the cell center to minimize the maximum temperature differentials between points on the cell surface and to provide a more uniform temperature distribution across the cell face.

  14. A grid amplifier

    NASA Technical Reports Server (NTRS)

    Kim, Moonil; Weikle, Robert M., II; Hacker, Jonathan B.; Delisio, Michael P.; Rutledge, David B.; Rosenberg, James J.; Smith, R. P.

    1991-01-01

    A 50-MESFET grid amplifier is reported that has a gain of 11 dB at 3.3 GHz. The grid isolates the input from the output by using vertical polarization for the input beam and horizontal polarization for the transmitted output beam. The grid unit cell is a two-MESFET differential amplifier. A simple calibration procedure allows the gain to be calculated from a relative power measurement. This grid is a hybrid circuit, but the structure is suitable for fabrication as a monolithic wafer-scale integrated circuit, particularly at millimeter wavelengths.

  15. Enhanced Elliptic Grid Generation

    NASA Technical Reports Server (NTRS)

    Kaul, Upender K.

    2007-01-01

    An enhanced method of elliptic grid generation has been invented. Whereas prior methods require user input of certain grid parameters, this method provides for these parameters to be determined automatically. "Elliptic grid generation" signifies generation of generalized curvilinear coordinate grids through solution of elliptic partial differential equations (PDEs). Usually, such grids are fitted to bounding bodies and used in numerical solution of other PDEs like those of fluid flow, heat flow, and electromagnetics. Such a grid is smooth and has continuous first and second derivatives (and possibly also continuous higher-order derivatives), grid lines are appropriately stretched or clustered, and grid lines are orthogonal or nearly so over most of the grid domain. The source terms in the grid-generating PDEs (hereafter called "defining" PDEs) make it possible for the grid to satisfy requirements for clustering and orthogonality properties in the vicinity of specific surfaces in three dimensions or in the vicinity of specific lines in two dimensions. The grid parameters in question are decay parameters that appear in the source terms of the inhomogeneous defining PDEs. The decay parameters are characteristic lengths in exponential- decay factors that express how the influences of the boundaries decrease with distance from the boundaries. These terms govern the rates at which distance between adjacent grid lines change with distance from nearby boundaries. Heretofore, users have arbitrarily specified decay parameters. However, the characteristic lengths are coupled with the strengths of the source terms, such that arbitrary specification could lead to conflicts among parameter values. Moreover, the manual insertion of decay parameters is cumbersome for static grids and infeasible for dynamically changing grids. In the present method, manual insertion and user specification of decay parameters are neither required nor allowed. Instead, the decay parameters are
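
    A stripped-down version of the elliptic step the abstract builds on: interior grid nodes are relaxed by iterating a discrete Laplace equation with boundary nodes held fixed. The Poisson source terms and decay parameters that the invention automates are deliberately omitted, and the bumpy-wall geometry is an arbitrary example, not one from the patent.

```python
import numpy as np

def laplace_smooth_grid(x, y, n_iter=2000):
    """Simplified elliptic smoothing: relax interior nodes toward the average of their four
    neighbours while boundary nodes stay fixed (no source terms, no decay parameters)."""
    for _ in range(n_iter):
        x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
        y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])
    return x, y

# A distorted initial grid whose lower boundary follows a Gaussian bump.
ni, nj = 41, 21
xi = np.linspace(0.0, 1.0, ni)
eta = np.linspace(0.0, 1.0, nj)
x = np.tile(xi[:, None], (1, nj))
lower = 0.1 * np.exp(-50.0 * (xi - 0.5) ** 2)             # bump on the lower wall
y = lower[:, None] + (1.0 - lower[:, None]) * eta[None, :]
x, y = laplace_smooth_grid(x, y)
print("interior node sample:", x[20, 10], y[20, 10])
```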

  16. Spaceflight Operations Services Grid (SOSG)

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.

    2004-01-01

    environment that incorporates existing and new spaceflight services into a standards-based framework providing current and future NASA programs with cost savings and new and evolvable methods to conduct science. This project will demonstrate how the use of new programming paradigms such as web and grid services can provide three significant benefits to the cost-effective delivery of spaceflight services. They will enable applications to operate more efficiently by being able to utilize pooled resources. They will also permit the reuse of common services to rapidly construct new and more powerful applications. Finally they will permit easy and secure access to services via a combination of grid and portal technology by a distributed user community consisting of NASA operations centers, scientists, the educational community and even the general population as outreach. The approach will be to deploy existing mission support applications such as the Telescience Resource Kit (TReK) and new applications under development, such as the Grid Video Distribution System (GViDS), together with existing grid applications and services such as high-performance computing and visualization services provided by NASA's Information Power Grid (IPG) in the MSFC's Payload Operations Integration Center (POIC) HOSC Annex. Once the initial applications have been moved to the grid, a process will begin to apply the new programming paradigms to integrate them where possible. For example, with GViDS, instead of viewing the Distribution service as an application that must run on a single node, the new approach is to build it such that it can be dispatched across a pool of resources in response to dynamic loads. To make this a reality, reusable services will be critical, such as a brokering service to locate appropriate resource within the pool. This brokering service can then be used by other applications such as the TReK. To expand further, if the GViDS application is constructed using a services

  18. Consideration of Experimental Approaches in the Physical and Biological Sciences in Designing Long-Term Watershed Studies in Forested Landscapes

    NASA Astrophysics Data System (ADS)

    Stallard, R. F.

    2011-12-01

    The importance of biological processes in controlling weathering, erosion, stream-water composition, soil formation, and overall landscape development is generally accepted. The U.S. Geological Survey (USGS) Water, Energy, and Biogeochemical Budgets (WEBB) Project in eastern Puerto Rico and Panama and the Smithsonian Tropical Research Institute (STRI) Panama Canal Watershed Experiment (PCWE) are landscape-scale studies based in the humid tropics where the warm temperatures, moist conditions, and luxuriant vegetation promote especially rapid biological and chemical processes - photosynthesis, respiration, decay, and chemical weathering. In both studies, features of small-watershed, large-watershed, and landscape-scale-biology experiments are blended to satisfy the research needs of the physical and biological sciences. The WEBB Project has successfully synthesized its first fifteen years of data, and has addressed the influence of land cover and of geologic, topographic, and hydrologic variability, including huge storms, on a wide range of hydrologic, physical, and biogeochemical processes. The ongoing PCWE should provide a similar synthesis of a moderate-sized humid tropical watershed. The PCWE and the Agua Salud Project (ASP) within the PCWE are now addressing the role of land cover (mature forests, pasture, invasive-grass dominated, secondary succession, native species plantation, and teak) at scales ranging from small watersheds to the whole Panama Canal watershed. Biologists have participated in the experimental design at both watershed scales, and small (0.1 ha) to large (50 ha) forest-dynamics plots have a central role in interfacing between physical scientists and biologists. In these plots, repeated, high-resolution mapping of all woody plants greater than 1-cm diameter provides a description of population changes through time, presumably reflecting individual life histories, interactions with other organisms, and the influence of landscape processes and climate

  19. 75 FR 57006 - Addressing Policy and Logistical Challenges to Smart Grid Implementation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... solutions being developed by the Smart Grid Subcommittee of the National Science and Technology Council's... series of RFIs issued by DOE regarding smart grid implementation. Prior RFIs sought comment on data access, data usage and privacy issues, and on communications requirements for the smart grid. In this...

  20. Geometric grid generation

    NASA Technical Reports Server (NTRS)

    Ives, David

    1995-01-01

    This paper presents a highly automated hexahedral grid generator based on extensive geometrical and solid modeling operations, developed in response to a vision of a designer-driven, one-day-turnaround CFD process, which implies a designer-driven, one-hour grid generation process.

  1. Internet 2 Access Grid.

    ERIC Educational Resources Information Center

    Simco, Greg

    2002-01-01

    Discussion of the Internet 2 Initiative, which is based on collaboration among universities, businesses, and government, focuses on the Access Grid, a Computational Grid that includes interactive multimedia within high-speed networks to provide resources to enable remote collaboration among the research community. (Author/LRW)

  2. Security for grids

    SciTech Connect

    Humphrey, Marty; Thompson, Mary R.; Jackson, Keith R.

    2005-08-14

    Securing a Grid environment presents a distinctive set of challenges. This paper groups the activities that need to be secured into four categories: naming and authentication; secure communication; trust, policy, and authorization; and enforcement of access control. It examines the current state of the art in securing these processes and introduces new technologies that promise to meet the security requirements of Grids more completely.

  3. Transforming Power Grid Operations

    SciTech Connect

    Huang, Zhenyu; Guttromson, Ross T.; Nieplocha, Jarek; Pratt, Robert G.

    2007-04-15

    While computation is used to plan, monitor, and control power grids, some of the computational technologies now used are more than a hundred years old, and the complex interactions of power grid components impede real-time operations. Thus it is hard to speed up “state estimation,” the procedure used to estimate the status of the power grid from measured input. State estimation is the core of grid operations, including contingency analysis, automatic generation control, and optimal power flow. The rate at which state estimation and contingency analysis are conducted (currently about every 5 minutes) needs to increase radically so that contingency analysis is comprehensive and is conducted in real time. Further, traditional state estimation is based on a power flow model and only provides a static snapshot—a tiny piece of the state of a large-scale dynamic machine. Bringing dynamic aspects into real-time grid operations poses an even bigger challenge. Working with the latest, most advanced computing techniques and hardware, researchers at Pacific Northwest National Laboratory (PNNL) intend to transform grid operations by increasing computational speed and improving accuracy. Traditional power grid computation is conducted on single PC hardware platforms. This article shows how traditional power grid computation can be reformulated to take advantage of advanced computing techniques and be converted to high-performance computing platforms (e.g., PC clusters, reconfigurable hardware, scalable multicore shared memory computers, or multithreaded architectures). The improved performance is expected to have a huge impact on how power grids are operated and managed and ultimately will lead to greater reliability and better asset utilization for the power industry. New computational capabilities will be tested and demonstrated on the comprehensive grid operations platform in the Electricity Infrastructure Operations Center, which is a newly commissioned PNNL facility for
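
    As an illustration of the state estimation step described above, the sketch below shows the textbook weighted-least-squares formulation for a linearized measurement model z = H x + e; it is not PNNL's high-performance reformulation, and all names are illustrative.

        import numpy as np

        def wls_state_estimate(H, z, sigma):
            """Weighted least-squares state estimate for a linearized model z = H x + e.

            H     : (m, n) measurement Jacobian
            z     : (m,)   measurement vector
            sigma : (m,)   measurement standard deviations
            Minimizes (z - H x)^T W (z - H x) with W = diag(1 / sigma**2).
            """
            W = np.diag(1.0 / sigma**2)
            G = H.T @ W @ H                      # gain matrix
            return np.linalg.solve(G, H.T @ W @ z)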

  4. D. Carlos de Braganca, a Pioneer of Experimental Marine Oceanography: Filling the Gap between Formal and Informal Science Education

    ERIC Educational Resources Information Center

    Faria, Claudia; Pereira, Goncalo; Chagas, Isabel

    2012-01-01

    The activities presented in this paper are part of a wider project that investigates the effects of infusing the history of science in science teaching, toward students' learning and attitude. Focused on the work of D. Carlos de Braganca, King of Portugal from 1889 to 1908, and a pioneer oceanographer, the activities are addressed at the secondary…

  5. Decentral Smart Grid Control

    NASA Astrophysics Data System (ADS)

    Schäfer, Benjamin; Matthiae, Moritz; Timme, Marc; Witthaut, Dirk

    2015-01-01

    Stable operation of complex flow and transportation networks requires balanced supply and demand. For the operation of electric power grids—due to their increasing fraction of renewable energy sources—a pressing challenge is to fit the fluctuations in decentralized supply to the distributed and temporally varying demands. To achieve this goal, common smart grid concepts suggest collecting consumer demand data, evaluating them centrally against the current supply, and sending price information back to customers so that they can decide about usage. Besides restrictions regarding cyber security, privacy protection and large required investments, it remains unclear how such central smart grid options guarantee overall stability. Here we propose a Decentral Smart Grid Control, where the price is directly linked to the local grid frequency at each customer. The grid frequency provides all necessary information about the current power balance such that it is sufficient to match supply and demand without the need for a centralized IT infrastructure. We analyze the performance and the dynamical stability of the power grid with such a control system. Our results suggest that the proposed Decentral Smart Grid Control is feasible independent of effective measurement delays, if frequencies are averaged over sufficiently large time intervals.
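
    As a toy illustration of the control idea (price tied to the locally measured, time-averaged frequency), the sketch below uses purely illustrative constants; the paper's actual contribution concerns the dynamical stability of the coupled system, not this particular pricing function.

        import numpy as np

        def local_price(freq_samples_hz, base_price=0.20, gain=0.05, f_nominal=50.0):
            """Toy decentral pricing rule: when the averaged local grid frequency is
            below nominal (supply scarce) the price rises; above nominal it falls.
            freq_samples_hz : recent frequency measurements forming the averaging window
            """
            f_avg = np.mean(freq_samples_hz)
            return base_price - gain * (f_avg - f_nominal)

        # Example: a window of slightly low frequencies yields a price above base.
        print(local_price([49.95, 49.96, 49.97]))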

  6. Grid Connected Functionality

    DOE Data Explorer

    Baker, Kyri; Jin, Xin; Vaidynathan, Deepthi; Jones, Wesley; Christensen, Dane; Sparn, Bethany; Woods, Jason; Sorensen, Harry; Lunacek, Monte

    2016-08-04

    Dataset demonstrating the potential benefits that residential buildings can provide for frequency regulation services in the electric power grid. In a hardware-in-the-loop (HIL) implementation, simulated homes along with a physical laboratory home are coordinated via a grid aggregator, and it is shown that their aggregate response has the potential to follow the regulation signal on a timescale of seconds. Connected (communication-enabled) devices in the National Renewable Energy Laboratory's (NREL's) Energy Systems Integration Facility (ESIF) received demand response (DR) requests from a grid aggregator, and the devices responded accordingly to meet the signal while satisfying user comfort bounds and physical hardware limitations.
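
    The sketch below is a minimal, hypothetical illustration of an aggregator splitting a regulation request across connected devices while respecting per-device flexibility limits (stand-ins for the comfort bounds and hardware limitations mentioned above); it is not NREL's actual coordination scheme.

        def dispatch_regulation(request_kw, device_limits_kw):
            """Split an aggregate regulation request evenly across devices,
            clipping each share to that device's flexibility limit.
            request_kw       : signed power change requested by the grid aggregator
            device_limits_kw : per-device maximum absolute adjustment in kW
            Returns per-device setpoint changes and the unserved remainder.
            """
            share = request_kw / len(device_limits_kw)
            setpoints = [max(-lim, min(lim, share)) for lim in device_limits_kw]
            return setpoints, request_kw - sum(setpoints)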

  7. Dynamic Load Balancing for Adaptive Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Saini, Subhash (Technical Monitor)

    1998-01-01

    Dynamic mesh adaptation on unstructured grids is a powerful tool for computing unsteady three-dimensional problems that require grid modifications to efficiently resolve solution features. By locally refining and coarsening the mesh to capture phenomena of interest, such procedures make standard computational methods more cost effective. Highly refined meshes are required to accurately capture shock waves, contact discontinuities, vortices, and shear layers in fluid flow problems. Adaptive meshes have also proved to be useful in several other areas of computational science and engineering like computer vision and graphics, semiconductor device modeling, and structural mechanics. Local mesh adaptation provides the opportunity to obtain solutions that are comparable to those obtained on globally-refined grids but at a much lower cost. Additional information is contained in the original extended abstract.

  8. Grid Computing Education Support

    SciTech Connect

    Steven Crumb

    2008-01-15

    The GGF Student Scholar program gave GGF the opportunity to bring over sixty qualified graduate and undergraduate students with interests in grid technologies to its three annual events over the three-year program.

  9. Space Development Grid Portal

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2004-01-01

    This viewgraph presentation provides information on the development of a portal to provide secure and distributed grid computing for Payload Operations Integrated Center and Mission Control Center ground services.

  10. Exploring Hypersonic, Unstructured-Grid Issues through Structured Grids

    NASA Technical Reports Server (NTRS)

    Mazaheri, Ali R.; Kleb, Bill

    2007-01-01

    Pure-tetrahedral unstructured grids have been shown to produce asymmetric heat transfer rates for symmetric problems. Meanwhile, two-dimensional structured grids produce symmetric solutions and, as documented here, introducing a spanwise degree of freedom to these structured grids also yields symmetric solutions. The effects of grid skewness and other perturbations of structured grids are investigated to uncover possible mechanisms behind the unstructured-grid solution asymmetries. By using controlled experiments around a known, good solution, the effects of particular grid pathologies are uncovered. These structured-grid experiments reveal that similar solution degradation occurs as for unstructured grids, especially for heat transfer rates. Non-smooth grids within the boundary layer are also shown to produce large local errors in heat flux but do not affect surface pressures.

  11. Near-Body Grid Adaption for Overset Grids

    NASA Technical Reports Server (NTRS)

    Buning, Pieter G.; Pulliam, Thomas H.

    2016-01-01

    A solution adaption capability for curvilinear near-body grids has been implemented in the OVERFLOW overset grid computational fluid dynamics code. The approach follows closely that used for the Cartesian off-body grids, but inserts refined grids in the computational space of original near-body grids. Refined curvilinear grids are generated using parametric cubic interpolation, with one-sided biasing based on curvature and stretching ratio of the original grid. Sensor functions, grid marking, and solution interpolation tasks are implemented in the same fashion as for off-body grids. A goal-oriented procedure, based on largest error first, is included for controlling growth rate and maximum size of the adapted grid system. The adaption process is almost entirely parallelized using MPI, resulting in a capability suitable for viscous, moving body simulations. Two- and three-dimensional examples are presented.

  12. Using Grid Benchmarks for Dynamic Scheduling of Grid Applications

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert

    2003-01-01

    Navigation or dynamic scheduling of applications on computational grids can be improved through the use of an application-specific characterization of grid resources. Current grid information systems provide a description of the resources, but do not contain any application-specific information. We define a GridScape as the dynamic state of the grid resources. We measure the dynamic performance of these resources using the grid benchmarks. Then we use the GridScape for automatic assignment of the tasks of a grid application to grid resources. The scalability of the system is achieved by limiting the navigation overhead to a few percent of the application resource requirements. Our task submission and assignment protocol guarantees that the navigation system does not cause grid congestion. On a synthetic data mining application we demonstrate that GridScape-based task assignment reduces the application turnaround time.
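
    Since the abstract does not spell out the assignment algorithm, the sketch below is only a plausible illustration of benchmark-informed scheduling: each task goes to the resource with the earliest estimated finish time, where the estimate comes from a measured benchmark rate rather than a static resource description. All names are hypothetical.

        def assign_tasks(tasks, resources):
            """Greedy benchmark-informed task assignment.
            tasks     : dict task_id -> work units
            resources : dict resource_id -> {"rate": measured work units/s,
                                             "busy_until": queued seconds}
            Returns dict task_id -> resource_id.
            """
            schedule = {}
            for task, work in sorted(tasks.items(), key=lambda kv: -kv[1]):
                best = min(resources, key=lambda r: resources[r]["busy_until"]
                                                    + work / resources[r]["rate"])
                resources[best]["busy_until"] += work / resources[best]["rate"]
                schedule[task] = best
            return schedule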

  13. Beyond grid security

    NASA Astrophysics Data System (ADS)

    Hoeft, B.; Epting, U.; Koenig, T.

    2008-07-01

    While many fields relevant to Grid security are already covered by existing working groups, their remit rarely goes beyond the scope of the Grid infrastructure itself. However, security issues pertaining to the internal set-up of compute centres have at least as much impact on Grid security. Thus, this talk will present briefly the EU ISSeG project (Integrated Site Security for Grids). In contrast to groups such as OSCT (Operational Security Coordination Team) and JSPG (Joint Security Policy Group), the purpose of ISSeG is to provide a holistic approach to security for Grid computer centres, from strategic considerations to an implementation plan and its deployment. The generalised methodology of Integrated Site Security (ISS) is based on the knowledge gained during its implementation at several sites as well as through security audits, and this will be briefly discussed. Several examples of ISS implementation tasks at the Forschungszentrum Karlsruhe will be presented, including segregation of the network for administration and maintenance and the implementation of Application Gateways. Furthermore, the web-based ISSeG training material will be introduced. This aims to offer ISS implementation guidance to other Grid installations in order to help avoid common pitfalls.

  14. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC)'s Catalogue Service - Web Information Model, this paper proposes a new information model for Geospatial Catalogue Web Service, named GCWS, which can securely provide Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services under the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service (CSW), and refers to the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, especially query on-demand data in the virtual community and get it back through the data-related services which provide functions such as subsetting, reformatting, reprojection etc. This work facilitates geospatial resource sharing and interoperation under the Grid environment, and makes geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also allows researchers to focus on science, and not on issues with computing ability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  15. GRIDS: Grid-Scale Rampable Intermittent Dispatchable Storage

    SciTech Connect

    2010-09-01

    GRIDS Project: The 12 projects that comprise ARPA-E’s GRIDS Project, short for “Grid-Scale Rampable Intermittent Dispatchable Storage,” are developing storage technologies that can store renewable energy for use at any location on the grid at an investment cost less than $100 per kilowatt hour. Flexible, large-scale storage would create a stronger and more robust electric grid by enabling renewables to contribute to reliable power generation.

  16. The Benefits of Grid Networks

    ERIC Educational Resources Information Center

    Tennant, Roy

    2005-01-01

    In the article, the author talks about the benefits of grid networks. In speaking of grid networks the author is referring to both networks of computers and networks of humans connected together in a grid topology. Examples are provided of how grid networks are beneficial today and the ways in which they have been used.

  17. Computer Code Generates Homotopic Grids

    NASA Technical Reports Server (NTRS)

    Moitra, Anutosh

    1992-01-01

    HOMAR is a computer code that uses a homotopic procedure to produce two-dimensional grids in cross-sectional planes; these grids are then stacked to produce quasi-three-dimensional grid systems for aerospace configurations. The program produces grids for use in both Euler and Navier-Stokes computations of flows. Written in FORTRAN 77.
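
    As a minimal sketch of the underlying idea (not HOMAR's actual algebra), a homotopy can blend an inner and an outer cross-section curve into a planar grid, and such planar grids can then be stacked along the body axis.

        import numpy as np

        def homotopic_grid(inner, outer, n_layers):
            """Blend two cross-section curves into a 2-D grid by linear homotopy.
            inner, outer : (n_points, 2) arrays sampled at matching parameter values
            Returns an (n_layers, n_points, 2) array; layer 0 is inner, the last is outer.
            """
            t = np.linspace(0.0, 1.0, n_layers)[:, None, None]
            return (1.0 - t) * inner[None] + t * outer[None]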

  18. Unstructured grids on SIMD torus machines

    NASA Technical Reports Server (NTRS)

    Bjorstad, Petter E.; Schreiber, Robert

    1994-01-01

    Unstructured grids lead to unstructured communication on distributed memory parallel computers, a problem that has been considered difficult. Here, we consider adaptive, offline communication routing for a SIMD processor grid. Our approach is empirical. We use large data sets drawn from supercomputing applications instead of an analytic model of communication load. The chief contribution of this paper is an experimental demonstration of the effectiveness of certain routing heuristics. Our routing algorithm is adaptive, nonminimal, and is generally designed to exploit locality. We have a parallel implementation of the router, and we report on its performance.

  19. GridLAB-D/SG

    SciTech Connect

    2011-08-30

    GridLAB-D is a new power system simulation tool that provides valuable information to users who design and operate electric power transmission and distribution systems, and to utilities that wish to take advantage of the latest smart grid technology. This special release of GridLAB-D was developed to study the proposed Smart Grid technology that is used by Battelle Memorial Institute in the AEP gridSMART demonstration project in Northeast Columbus, Ohio.

  20. Spaceflight Operations Services Grid Prototype

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Mehrotra, Piyush; Lisotta, Anthony

    2004-01-01

    NASA over the years has developed many types of technologies and conducted various types of science resulting in numerous variations of operations, data and applications. For example, operations range from deep space projects managed by JPL, Saturn and Shuttle operations managed from JSC and KSC, ISS science operations managed from MSFC and numerous low earth orbit satellites managed from GSFC, all of which are varied and intrinsically different but require many of the same types of services to fulfill their missions. Also, large data sets (databases) of Shuttle flight data, solar system projects and earth observing data exist which, because of their varied and sometimes outdated technologies, have not been fully examined for additional information and knowledge. Many of the applications/systems supporting operational services, e.g. voice, video, telemetry and commanding, are outdated and obsolete. The vast amounts of data are located in various formats, at various locations and range over many years. The ability to conduct unified space operations, access disparate data sets and to develop systems and services that can provide operational services does not currently exist in any useful form. In addition, adding new services to existing operations is generally expensive and, with the current budget constraints, not feasible on any broad level of implementation. To understand these services, a discussion of each one follows. The Spaceflight User-based Services are those services required to conduct space flight operations. Grid Services are those Grid services that will be used to overcome, through middleware software, some or all of the problems that currently exist. In addition, Network Services will be discussed briefly. Network Services are crucial to any type of remedy and are evolving adequately to support any technology currently in development.

  1. Simulation of Unsteady Flows Using an Unstructured Navier-Stokes Solver on Moving and Stationary Grids

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Vatsa, Veer N.; Atkins, Harold L.

    2005-01-01

    We apply an unsteady Reynolds-averaged Navier-Stokes (URANS) solver for unstructured grids to unsteady flows on moving and stationary grids. Example problems considered are relevant to active flow control and stability and control. Computational results are presented using the Spalart-Allmaras turbulence model and are compared to experimental data. The effects of grid and time-step refinement are examined.

  2. Two grid iteration with a conjugate gradient fine grid smoother applied to a groundwater flow model

    SciTech Connect

    Hagger, M.J.; Spence, A.; Cliffe, K.A.

    1994-12-31

    This talk is concerned with the efficient solution of Ax=b, where A is a large, sparse, symmetric positive definite matrix arising from a standard finite element discretisation of the groundwater flow problem ∇·(k∇p) = 0. Here k is the coefficient of rock permeability in applications and is highly discontinuous. The discretisation is carried out using the Harwell NAMMU finite element package, using, for 2D, 9-node biquadratic rectangular elements, and 27-node biquadratics for 3D. The aim is to develop a robust technique for iterative solutions of 3D problems based on a regional groundwater flow model of a geological area with sharply varying hydrogeological properties. Numerical experiments with polynomial preconditioned conjugate gradient methods on a 2D groundwater flow model were found to yield very poor results, converging very slowly. In order to utilise the fact that A comes from the discretisation of a PDE the authors try the two grid method as is well analysed from studies of multigrid methods, see for example "Multi-Grid Methods and Applications" by W. Hackbusch. Specifically they consider two discretisations resulting in stiffness matrices A_N and A_n, of size N and n respectively, where N > n, for both a model problem and the geological model. They perform a number of conjugate gradient steps on the fine grid, i.e. using A_N, followed by an exact coarse grid solve, using A_n, and then update the fine grid solution, the exact coarse grid solve being done using a frontal method factorisation of A_n. Note that in the context of the standard two grid method this is equivalent to using conjugate gradients as a fine grid smoothing step. Experimental results are presented to show the superiority of the two grid iteration method over the polynomial preconditioned conjugate gradient method.
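
    The cycle described above is simple enough to sketch directly; the version below assumes the restriction and prolongation operators and both stiffness matrices are supplied as sparse matrices, and it stands in for, rather than reproduces, the NAMMU/frontal-solver implementation.

        from scipy.sparse.linalg import cg, spsolve

        def two_grid_cycle(A_fine, A_coarse, R, P, b, x0, n_smooth=5):
            """One two-grid iteration for A_fine x = b with a CG fine-grid smoother.
            R : restriction (coarse x fine), P : prolongation (fine x coarse).
            """
            # 1. Smooth: a few conjugate gradient iterations on the fine grid.
            x, _ = cg(A_fine, b, x0=x0, maxiter=n_smooth)
            # 2. Restrict the fine-grid residual to the coarse grid.
            r_coarse = R @ (b - A_fine @ x)
            # 3. Exact coarse-grid solve (sparse direct solve here).
            e_coarse = spsolve(A_coarse, r_coarse)
            # 4. Prolongate the correction and update the fine-grid iterate.
            return x + P @ e_coarse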

  3. Complex Volume Grid Generation Through the Use of Grid Reusability

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    This paper presents a set of surface and volume grid generation techniques which reuse existing surface and volume grids. These methods use combinations of data manipulations to reduce grid generation time, improve grid characteristics, and increase the capabilities of existing domain discretization software. The manipulation techniques utilize physical and computational domains to produce basis functions on which to operate, modifying grid character and smoothing grids using Trans-Finite Interpolation, a vector interpolation method, and a parametric re-mapping technique. With these new techniques, inviscid grids can be converted to viscous grids, multiple zone grid adaption can be performed to improve CFD solver efficiency, and topological changes to improve modeling of flow fields can be done simply and quickly. Examples of these capabilities are illustrated as applied to various configurations.
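
    Since Trans-Finite Interpolation is named as one of the manipulation tools, a standard 2-D transfinite (Coons) interpolation is sketched below for reference; it is the generic formula, not the paper's specific grid-reuse machinery.

        import numpy as np

        def transfinite_interpolation(bottom, top, left, right):
            """Fill a 2-D grid from four boundary curves by transfinite interpolation.
            bottom, top : (ni, 2) arrays; left, right : (nj, 2) arrays with matching corners.
            Returns an (ni, nj, 2) array of grid-point coordinates.
            """
            ni, nj = bottom.shape[0], left.shape[0]
            u = np.linspace(0.0, 1.0, ni)[:, None, None]
            v = np.linspace(0.0, 1.0, nj)[None, :, None]
            linear = ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
                      + (1 - u) * left[None, :, :] + u * right[None, :, :])
            corners = ((1 - u) * (1 - v) * bottom[0] + u * (1 - v) * bottom[-1]
                       + (1 - u) * v * top[0] + u * v * top[-1])
            return linear - corners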

  4. An embedded grid formulation applied to a delta wing

    NASA Technical Reports Server (NTRS)

    Krist, Sherrie L.; Thomas, James L.; Sellers, William L., III; Kjelgaard, Scott O.

    1990-01-01

    Applications using a three-dimensional embedded grid scheme are made to high angle of attack viscous flow over two bodies: a slender cone using the conical approximation and a 75 deg swept delta wing. The embedded grids are used principally to improve the numerical resolution of the separated vortical flow above the body. Detailed comparisons are made with experimental measurements of the velocity field over the delta wing. The prediction of the maximum streamwise velocity is improved using two levels of embedded grid refinement but is still less than the experimental measurements available from a laser velocimeter.

  5. An Adaptive Unstructured Grid Method by Grid Subdivision, Local Remeshing, and Grid Movement

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    1999-01-01

    An unstructured grid adaptation technique has been developed and successfully applied to several three dimensional inviscid flow test cases. The approach is based on a combination of grid subdivision, local remeshing, and grid movement. For solution adaptive grids, the surface triangulation is locally refined by grid subdivision, and the tetrahedral grid in the field is partially remeshed at locations of dominant flow features. A grid redistribution strategy is employed for geometric adaptation of volume grids to moving or deforming surfaces. The method is automatic and fast and is designed for modular coupling with different solvers. Several steady state test cases with different inviscid flow features were tested for grid/solution adaptation. In all cases, the dominant flow features, such as shocks and vortices, were accurately and efficiently predicted with the present approach. A new and robust method of moving tetrahedral "viscous" grids is also presented and demonstrated on a three-dimensional example.

  6. Grid Data Management and Customer Demands at MeteoSwiss

    NASA Astrophysics Data System (ADS)

    Rigo, G.; Lukasczyk, Ch.

    2010-09-01

    Data grids constitute the required input form for a variety of applications. Therefore, customers increasingly expect climate services to not only provide measured data, but also grids of these with the required configurations on an operational basis. Currently, MeteoSwiss is establishing a production chain for delivering data grids by subscription directly from the data warehouse in order to meet the demand for precipitation data grids by governmental, business and science customers. The MeteoSwiss data warehouse runs on an Oracle database linked with an ArcGIS Standard edition geodatabase. The grids are produced by Unix-based software written in R called GRIDMCH which extracts the station data from the data warehouse and stores the files in the file system. By scripts, the netcdf-v4 files are imported via an FME interface into the database. Currently daily and monthly deliveries of daily precipitation grids are available from MeteoSwiss with a spatial resolution of 2.2 km x 2.2 km. These daily delivered grids are preliminary, based on 100 measuring sites, whilst the grid of the monthly delivery of daily sums is calculated from about 430 stations. Crucial for uptake by the customers is understanding of, and trust in, the new grid product. Clearly stating needs which can be covered by grid products, the customers require a certain lead time to develop applications making use of the particular grid. Therefore, early contacts and continuous attendance, as well as flexibility in adjusting the production process to fulfill emerging customer needs, are important during the introduction period. Gridding over complex terrain can lead to temporally elevated uncertainties in certain areas depending on the weather situation and coverage of measurements. Therefore, careful instructions on the quality and use and the possibility to communicate the uncertainties of gridded data proved to be essential, especially to the business and science customers who require

  7. The PacCAF Grid portal for the CDF experiment

    NASA Astrophysics Data System (ADS)

    Hou, Suen

    Distributed computing for the CDF experiment has been developed and is evolving towards shared resources on the computing Grid. Dedicated CAFs (CDF Analysis Farm) were constructed on Condor pools with a suite of services for user authentication, software distribution, and network connection to worker nodes. With the Condor Glide-in mechanism, the CAFs are extended to use dynamic worker pools collected from the Grid. The PacCAF (Pacific CAF) is the Glide CAF thus built to provide a single-point portal to LCG (LHC Computing Grid) and OSG (Open Science Grid) sites in the Pacific Asia region. We discuss the implementation and service as a late-binding solution towards Grid computing.

  8. NREL Smart Grid Projects

    SciTech Connect

    Hambrick, J.

    2012-01-01

    Although implementing Smart Grid projects at the distribution level provides many advantages and opportunities for advanced operation and control, a number of significant challenges must be overcome to maintain the high level of safety and reliability that the modern grid must provide. For example, while distributed generation (DG) promises to provide opportunities to increase reliability and efficiency and may provide grid support services such as volt/var control, the presence of DG can impact distribution operation and protection schemes. Additionally, the intermittent nature of many DG energy sources such as photovoltaics (PV) can present a number of challenges to voltage regulation, etc. This presentation provides an overview of a number of Smart Grid projects being performed by the National Renewable Energy Laboratory (NREL) along with utility, industry, and academic partners. These projects include modeling and analysis of high penetration PV scenarios (with and without energy storage), development and testing of interconnection and microgrid equipment, as well as the development and implementation of advanced instrumentation and data acquisition used to analyze the impacts of intermittent renewable resources. Additionally, standards development associated with DG interconnection and analysis as well as Smart Grid interoperability will be discussed.

  9. The Experimental Teaching Reform in Biochemistry and Molecular Biology for Undergraduate Students in Peking University Health Science Center

    ERIC Educational Resources Information Center

    Yang, Xiaohan; Sun, Luyang; Zhao, Ying; Yi, Xia; Zhu, Bin; Wang, Pu; Lin, Hong; Ni, Juhua

    2015-01-01

    Since 2010, second-year undergraduate students of an eight-year training program leading to a Doctor of Medicine degree or Doctor of Philosophy degree in Peking University Health Science Center (PKUHSC) have been required to enter the "Innovative talent training project." During that time, the students joined a research lab and…

  10. Exploring the Opinions of Pre-Service Science Teachers in Their Experimental Designs Prepared Based on Various Approaches

    ERIC Educational Resources Information Center

    Benzer, Elif

    2015-01-01

    Students working in laboratories in the 21st century prefer to take part in experiments as active participants, coming up with their own designs and projects by developing new ideas and problems rather than implementing those dictated by others. The science teachers that would have the students…

  11. An Experimental Study into the Effect of Science Teaching on the Fourth-Grade Child's Concept of Piagetian Physical Causality.

    ERIC Educational Resources Information Center

    Gann, Louise L.; Fowler, H. Seymour

    The purpose of this study was to investigate the effect of selected science experiences on fourth-grade students' concepts of Piagetian physical causality, namely, animism and dynamism. Vocabulary ability and Piagetian developmental stages were assessed by the "Concept Assessment Kit Conservation" and "Metropolitan Achievement Test." Three-hundred…

  12. Fusion Data Grid Service

    NASA Astrophysics Data System (ADS)

    Shasharina, Svetlana; Wang, Nanbor

    2004-11-01

    Simulations and experiments in the fusion and plasma physics community generate large datasets at remote sites. Visualization and analysis of these datasets are difficult because of the incompatibility among the various data formats adopted by simulation, experiments, and analysis tools, and the large sizes of analyzed data. Grids and Web Services technologies are capable of providing solutions for such heterogeneous settings, but need to be customized to the field-specific needs and merged with distributed technologies currently used by the community. This paper describes how we are addressing these issues in the Fusion Grid Service under development. We also present performance results of relevant data transfer mechanisms including binary SOAP, DIME, GridFTP, MDSplus and CORBA. We will describe the status of data converters (between HDF5 and MDSplus data types), developed in collaboration with MIT (J. Stillerman). Finally, we will analyze bottlenecks of the MDSplus data transfer mechanism (work performed in collaboration with General Atomics: D. Schissel and M. Qian).

  13. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provides seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of large-scale computing nodes into a distributed environment, remote access to high data rate instruments, and an exploratory grid environment.

  14. Robotic Exploration and Science in Pits and Caves: Results from Three Years and Counting of Analog Field Experimentation

    NASA Astrophysics Data System (ADS)

    Wong, U. Y.; Whittaker, W. L.

    2015-10-01

    Robots are poised to access, investigate, and model planetary caves. We present the results of a multi-year campaign to develop robotic technologies for this domain, anchored by the most comprehensive analog field experimentation to date.

  15. Modelling noise propagation using Grid Resources. Progress within GDI-Grid

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

    GDI-Grid (English: SDI-Grid) is a research project funded by the German Ministry for Science and Education (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and identifying the potential of utilizing the superior storage capacities and computational power of grid infrastructures for geospatial applications while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC webservice interfaces for Web Mapping (WMS), Feature access (Web Feature Service), Coverage access (Web Coverage Service) and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by standardization bodies for Grid computing and spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing and a noise propagation simulation. The latter scenario is addressed by Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, which is adapting its LimA software to utilize grid resources. Noise mapping of e.g. traffic noise in urban agglomerates and along major trunk roads is a recurring demand of the EU Noise Directive. Input data comprise the road network and traffic, terrain, buildings and noise protection screens, as well as the population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on local geometry. For each of the segments, the propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation
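
    As a highly simplified, illustrative stand-in for the propagation step (free-field spherical spreading and energetic summation only, with none of the diffraction, ground or screening corrections that LimA and the EU directive methods require), a receiver level could be accumulated as follows; all names and constants are illustrative.

        import numpy as np

        def receiver_level(receiver, segments, max_range=2000.0):
            """Toy free-field noise level at a receiver from many source segments.
            receiver : (x, y) position in metres
            segments : list of ((x, y), Lw) pairs, Lw = segment sound power level in dB
            Uses L = Lw - 20*log10(r) - 11 per segment and energetic summation.
            """
            rx, ry = receiver
            energies = []
            for (sx, sy), lw in segments:
                r = max(np.hypot(sx - rx, sy - ry), 1.0)
                if r <= max_range:
                    energies.append(10.0 ** ((lw - 20.0 * np.log10(r) - 11.0) / 10.0))
            return 10.0 * np.log10(sum(energies)) if energies else -np.inf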

  17. GridPV Toolbox

    SciTech Connect

    Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago; Reno, Matthew; Coogan, Kyle

    2014-07-15

    Matlab Toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulations functions are included to show potential uses of the toolbox functions.

  18. Quasi-optical overmoded waveguide frequency multiplier grid arrays

    NASA Astrophysics Data System (ADS)

    Rosenau, Steven Andrew

    There is a growing need for compact, lightweight, inexpensive high power millimeter wave sources. Frequency multipliers can provide these sources by efficiently converting high power microwave signals to millimeter frequencies. Quasi-optical frequency multiplier grid arrays, comprised of hundreds to thousands of varactor devices and antennas on a single wafer, utilize spatial power combining to significantly increase power handling capability beyond that of a single device. In this dissertation work, theoretical and experimental investigations of frequency multiplier grid arrays have been conducted with a specific focus on overmoded waveguide systems. The principles of frequency multipliers and quasi-optical grid array power combining are presented. Simulation, design and experimental measurement techniques are described for both frequency tripler and doubler grid arrays. During this dissertation work, several quantum barrier varactor frequency tripler grid array systems and Schottky varactor frequency doubler grid array systems were designed, fabricated and tested. A frequency tripler grid array system, containing an innovative integrated output structure, achieved a multiplication efficiency of 3.4% and an output power of 148 mW. The two most efficient frequency doubler grid array systems achieved 11.7% multiplication efficiency and 0.41 W output power.

  19. Essential Grid Workflow Monitoring Elements

    SciTech Connect

    Gunter, Daniel K.; Jackson, Keith R.; Konerding, David E.; Lee,Jason R.; Tierney, Brian L.

    2005-07-01

    Troubleshooting Grid workflows is difficult. A typical workflow involves a large number of components (networks, middleware, hosts, etc.) that can fail. Even when monitoring data from all these components is accessible, it is hard to tell whether failures and anomalies in these components are related to a given workflow. For the Grid to be truly usable, much of this uncertainty must be eliminated. We propose two new Grid monitoring elements, Grid workflow identifiers and consistent component lifecycle events, that will make Grid troubleshooting easier, and thus make Grids more usable, by simplifying the correlation of Grid monitoring data with a particular Grid workflow.
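
    A minimal sketch of the proposal, with hypothetical field names: every component emits lifecycle events stamped with a shared workflow identifier, so that monitoring records from different middleware layers can later be joined on that identifier.

        import json, time, uuid

        def lifecycle_event(workflow_id, component, state):
            """One monitoring record tying a component lifecycle state
            (e.g. 'submitted', 'running', 'failed') to a global workflow id."""
            return json.dumps({
                "workflow.id": workflow_id,
                "component": component,
                "state": state,
                "timestamp": time.time(),
            })

        wf_id = str(uuid.uuid4())                     # one id for the whole workflow
        print(lifecycle_event(wf_id, "gridftp.transfer", "running"))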

  20. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.

  1. A VO-Driven Astronomical Data Grid in China

    NASA Astrophysics Data System (ADS)

    Cui, C.; He, B.; Yang, Y.; Zhao, Y.

    2010-12-01

    With the implementation of many ambitious observation projects, including LAMOST, FAST, and the Antarctic observatory at Dome A, observational astronomy in China is stepping into a brand new era with an emerging data avalanche. In the era of e-Science, both these cutting-edge projects and traditional astronomy research need much more powerful data management, sharing and interoperability. Based on the data-grid concept and taking advantage of the IVOA interoperability technologies, China-VO is developing a VO-driven astronomical data grid environment to enable multi-wavelength science and large database science. In this paper, the latest progress and data flow of LAMOST, the architecture of the data grid, and its support for the VO are discussed.

  2. Grid computing enhances standards-compatible geospatial catalogue service

    NASA Astrophysics Data System (ADS)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkits are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share and

  3. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  4. Unlocking the smart grid

    SciTech Connect

    Rokach, Joshua Z.

    2010-10-15

    The country has progressed in a relatively short time from rotary dial phones to computers, cell phones, and iPads. With proper planning and orderly policy implementation, the same will happen with the Smart Grid. Here are some suggestions on how to proceed. (author)

  5. APEC Smart Grid Initiative

    SciTech Connect

    Bloyd, Cary N.

    2012-03-01

    This brief paper describes the activities of the Asia Pacific Economic Cooperation (APEC) Smart Grid Initiative (ASGI) which is being led by the U.S. and developed by the APEC Energy Working Group. In the paper, I describe the origin of the initiative and briefly mention the four major elements of the initiative along with existing APEC projects which support it.

  6. NSTAR Smart Grid Pilot

    SciTech Connect

    Rabari, Anil; Fadipe, Oloruntomi

    2014-03-31

    NSTAR Electric & Gas Corporation (“the Company”, or “NSTAR”) developed and implemented a Smart Grid pilot program beginning in 2010 to demonstrate the viability of leveraging existing automated meter reading (“AMR”) deployments to provide much of the Smart Grid functionality of advanced metering infrastructure (“AMI”), but without the large capital investment that AMI rollouts typically entail. In particular, a central objective of the Smart Energy Pilot was to enable residential dynamic pricing (time-of-use “TOU” and critical peak rates and rebates) and two-way direct load control (“DLC”) by continually capturing AMR meter data transmissions and communicating through customer-sited broadband connections in conjunction with a standards-based home area network (“HAN”). The pilot was supported by the U.S. Department of Energy (“DOE”) through the Smart Grid Demonstration program. NSTAR was very pleased to receive not only funding support from DOE, but also guidance and support from DOE throughout the pilot. NSTAR is also pleased to report to the DOE that it was able to execute and deliver a successful pilot on time and on budget. NSTAR looks forward to future opportunities to work with the DOE and others on smart grid projects.

  7. Can Clouds replace Grids? Will Clouds replace Grids?

    NASA Astrophysics Data System (ADS)

    Shiers, J. D.

    2010-04-01

    The world's largest scientific machine - comprising dual 27 km circular proton accelerators cooled to 1.9 K and located some 100 m underground - currently relies on major production Grid infrastructures for the offline computing needs of the 4 main experiments that will take data at this facility. After many years of sometimes difficult preparation, the computing service has been declared "open" and ready to meet the challenges that will come shortly when the machine restarts in 2009. But the service is not without its problems: reliability - as seen by the experiments, as opposed to that measured by the official tools - still needs to be significantly improved. Prolonged downtimes or degradations of major services or even complete sites are still too common and the operational and coordination effort to keep the overall service running is probably not sustainable at this level. Recently "Cloud Computing" - in terms of pay-per-use fabric provisioning - has emerged as a potentially viable alternative but with rather different strengths and no doubt weaknesses too. Based on the concrete needs of the LHC experiments - where the total data volume that will be acquired over the full lifetime of the project, including the additional data copies that are required by the Computing Models of the experiments, approaches 1 Exabyte - we analyze the pros and cons of Grids versus Clouds. This analysis covers not only technical issues - such as those related to demanding database and data management needs - but also sociological aspects, which cannot be ignored, whether in terms of funding or in the wider context of the essential but often overlooked role of science in society, education and economy.

  8. The surveillance error grid.

    PubMed

    Klonoff, David C; Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B; Kirkman, M Sue; Kovatchev, Boris

    2014-07-01

    Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, have developed a new error grid, called the surveillance error grid (SEG) as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor. The impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph per its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), Parkes error grid (PEG), and SEG. SEG action boundaries were consistent across scenarios, regardless of whether the patient was type 1 or type 2 or using insulin or not. No significant differences were noted between responses of adult/pediatric or 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to
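
    As a small illustration of the underlying bookkeeping (pairing reference and monitor readings and computing their error), the sketch below stops short of assigning SEG risk zones, since the zone boundaries come from the published survey and are not reproduced here.

        def error_pairs(reference_mgdl, monitor_mgdl):
            """Pair reference and monitor glucose readings (mg/dL) and compute
            absolute and relative errors; plotting such pairs on a published
            error grid (CEG, PEG or SEG) assigns each pair a risk zone."""
            return [{"reference": ref,
                     "monitor": mon,
                     "abs_error": mon - ref,
                     "rel_error_pct": 100.0 * (mon - ref) / ref}
                    for ref, mon in zip(reference_mgdl, monitor_mgdl)]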

  9. A comparative analysis of dynamic grids vs. virtual grids using the A3pviGrid framework

    PubMed Central

    Shankaranarayanan, Avinas; Amaldas, Christine

    2010-01-01

    With the proliferation of quad/multi-core microprocessors in mainstream platforms such as desktops and workstations, a large number of unused CPU cycles can be utilized for running virtual machines (VMs) as dynamic nodes in distributed environments. Grid services and their service-oriented business broker, now termed cloud computing, could deploy image-based virtualization platforms enabling agent-based resource management and dynamic fault management. In this paper we present an efficient way of utilizing heterogeneous virtual machines on idle desktops as an environment for consumption of high performance grid services. Spurious and exponential increases in the size of datasets are constant concerns in the medical and pharmaceutical industries due to the constant discovery and publication of large sequence databases. Traditional algorithms are not designed to handle large data sizes under sudden and dynamic changes in the execution environment, as previously discussed. This research was undertaken to compare our previous results with those obtained by running the same test dataset on a virtual Grid platform using virtual machines (virtualization). The implemented architecture, A3pviGrid, utilizes game-theoretic optimization and agent-based team formation (coalition) algorithms to improve upon scalability with respect to team formation. Due to the dynamic nature of distributed systems (as discussed in our previous work), all interactions were made local within a team, transparently. This paper is a proof of concept of an experimental mini-Grid test-bed compared to running the platform on local virtual machines on a local test cluster. This was done to give every agent its own execution platform, enabling anonymity and better control of the dynamic environmental parameters. We also analyze the performance and scalability of BLAST in a multiple-virtual-node setup and present our findings. This paper is an extension of our previous research on improving the BLAST application framework

  10. Towards Hybrid Overset Grid Simulations of the Launch Environment

    NASA Astrophysics Data System (ADS)

    Moini-Yekta, Shayan

    A hybrid overset grid approach has been developed for the design and analysis of launch vehicles and facilities in the launch environment. The motivation for the hybrid grid methodology is to reduce the turn-around time of computational fluid dynamic simulations and improve the ability to handle complex geometry and flow physics. The LAVA (Launch Ascent and Vehicle Aerodynamics) hybrid overset grid scheme consists of two components: an off-body immersed-boundary Cartesian solver with block-structured adaptive mesh refinement and a near-body unstructured body-fitted solver. Two-way coupling is achieved through overset connectivity between the off-body and near-body grids. This work highlights verification using code-to-code comparisons and validation using experimental data for the individual and hybrid solver. The hybrid overset grid methodology is applied to representative unsteady 2D trench and 3D generic rocket test cases.
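
    For illustration, the following minimal Python sketch shows the kind of donor-cell interpolation an overset method performs when filling fringe points of a near-body grid from a background Cartesian grid. It is a generic bilinear-sampling toy whose grid, field and fringe-point coordinates are invented for the example; it is not the LAVA connectivity or coupling algorithm.

        import numpy as np

        def bilinear_sample(field, x0, y0, dx, dy, pts):
            """Sample a Cartesian background field at arbitrary (x, y) receiver points,
            roughly what an overset scheme does when filling fringe points."""
            out = np.empty(len(pts))
            for k, (x, y) in enumerate(pts):
                i = int((x - x0) // dx)                 # donor cell index in x
                j = int((y - y0) // dy)                 # donor cell index in y
                s = (x - (x0 + i * dx)) / dx            # local coordinates inside the donor cell
                t = (y - (y0 + j * dy)) / dy
                out[k] = ((1 - s) * (1 - t) * field[i, j] + s * (1 - t) * field[i + 1, j]
                          + (1 - s) * t * field[i, j + 1] + s * t * field[i + 1, j + 1])
            return out

        # hypothetical background grid carrying a smooth field, plus two fringe points
        x = np.linspace(0.0, 1.0, 51)
        y = np.linspace(0.0, 1.0, 51)
        F = np.sin(np.pi * x)[:, None] * np.cos(np.pi * y)[None, :]
        fringe = [(0.313, 0.642), (0.777, 0.123)]
        print(bilinear_sample(F, 0.0, 0.0, x[1] - x[0], y[1] - y[0], fringe))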

  11. Spectral methods on arbitrary grids

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Gottlieb, David

    1995-01-01

    Stable and spectrally accurate numerical methods are constructed on arbitrary grids for partial differential equations. These new methods are equivalent to conventional spectral methods but do not rely on specific grid distributions. Specifically, we show how to implement Legendre Galerkin, Legendre collocation, and Laguerre Galerkin methodology on arbitrary grids.
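
    To make the idea of spectral-type differentiation on non-standard node sets concrete, the sketch below assembles the global polynomial (barycentric Lagrange) differentiation matrix for an arbitrary grid and checks it on a smooth function. This is only an illustration of polynomial differentiation on arbitrary nodes, not the Legendre Galerkin or collocation construction of the paper, and the perturbed node set is an assumption of the example.

        import numpy as np

        def diff_matrix(x):
            """First-derivative matrix for global polynomial interpolation on the
            arbitrary nodes x (barycentric form)."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            w = np.array([1.0 / np.prod(x[j] - np.delete(x, j)) for j in range(n)])
            D = np.zeros((n, n))
            for i in range(n):
                for j in range(n):
                    if i != j:
                        D[i, j] = (w[j] / w[i]) / (x[i] - x[j])
            np.fill_diagonal(D, -D.sum(axis=1))   # each row must annihilate constants
            return D

        # an "arbitrary" grid: Chebyshev points with the interior nodes randomly perturbed
        n = 16
        x = np.cos(np.pi * np.arange(n) / (n - 1))[::-1]
        x[1:-1] += 0.2 * (np.random.rand(n - 2) - 0.5) * np.diff(x)[:-1]
        D = diff_matrix(x)
        err = np.max(np.abs(D @ np.sin(np.pi * x) - np.pi * np.cos(np.pi * x)))
        print(f"max derivative error on {n} arbitrary nodes: {err:.2e}")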

  12. Grid Interaction Technical Team Roadmap

    SciTech Connect

    2013-06-01

    The mission of the Grid Interaction Technical Team (GITT) is to support a transition scenario to large scale grid-connected vehicle charging with transformational technology, proof of concept and information dissemination. The GITT facilitates technical coordination and collaboration between vehicle-grid connectivity and communication activities among U.S. DRIVE government and industry partners.

  13. Ion Engine Grid Gap Measurements

    NASA Technical Reports Server (NTRS)

    Soulas, George C.; Frandina, Michael M.

    2004-01-01

    A simple technique for measuring the grid gap of an ion engine's ion optics during startup and steady-state operation was demonstrated with beam extraction. The grid gap at the center of the ion optics assembly was measured with a long distance microscope that was focused onto an alumina pin that protruded through the center accelerator grid aperture and was mechanically attached to the screen grid. This measurement technique was successfully applied to a 30 cm titanium ion optics assembly mounted onto an NSTAR engineering model ion engine. The grid gap and each grid's movement during startup from room temperature to both full and low power were measured. The grid gaps with and without beam extraction were found to be significantly different. The grid gaps at the ion optics center were both significantly smaller than the cold grid gap and different at the two power levels examined. To avoid issues associated with a small grid gap during thruster startup with titanium ion optics, one simple method is to operate the thruster initially without beam extraction to heat the ion optics. Another possible method is to apply high voltage to the grids prior to igniting the discharge because power deposition to the grids from the plasma is lower with beam extraction than without. Further testing would be required to confirm this approach.

  14. Experimental capabilities of 0.4 PW, 1 shot/min Scarlet laser facility for high energy density science.

    PubMed

    Poole, P L; Willis, C; Daskalova, R L; George, K M; Feister, S; Jiang, S; Snyder, J; Marketon, J; Schumacher, D W; Akli, K U; Van Woerkom, L; Freeman, R R; Chowdhury, E A

    2016-06-10

    We report on the recently completed 400 TW upgrade to the Scarlet laser at The Ohio State University. Scarlet is a Ti:sapphire-based ultrashort pulse system that delivers >10 J in 30 fs pulses to a 2 μm full width at half-maximum focal spot, resulting in intensities exceeding 5×10²¹ W/cm². The laser fires at a repetition rate of once per minute and is equipped with a suite of on-demand and on-shot diagnostics detailed here, allowing for rapid collection of experimental statistics. As part of the upgrade, the entire laser system has been redesigned to facilitate consistent, characterized high intensity data collection at high repetition rates. The design and functionality of the laser and target chambers are described along with initial data from commissioning experimental shots. PMID:27409030

  15. Experimental capabilities of 0.4 PW, 1 shot/min Scarlet laser facility for high energy density science.

    PubMed

    Poole, P L; Willis, C; Daskalova, R L; George, K M; Feister, S; Jiang, S; Snyder, J; Marketon, J; Schumacher, D W; Akli, K U; Van Woerkom, L; Freeman, R R; Chowdhury, E A

    2016-06-10

    We report on the recently completed 400 TW upgrade to the Scarlet laser at The Ohio State University. Scarlet is a Ti:sapphire-based ultrashort pulse system that delivers >10 J in 30 fs pulses to a 2 μm full width at half-maximum focal spot, resulting in intensities exceeding 5×10²¹ W/cm². The laser fires at a repetition rate of once per minute and is equipped with a suite of on-demand and on-shot diagnostics detailed here, allowing for rapid collection of experimental statistics. As part of the upgrade, the entire laser system has been redesigned to facilitate consistent, characterized high intensity data collection at high repetition rates. The design and functionality of the laser and target chambers are described along with initial data from commissioning experimental shots.

  16. Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid

    SciTech Connect

    2012-02-08

    GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud’s data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell’s GridControl will focus on 4 elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.

  17. Grid generation and inviscid flow computation about aircraft geometries

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.

    1989-01-01

    Grid generation and Euler flow about fighter aircraft are described. A fighter aircraft geometry is specified by an area-ruled fuselage with an internal duct, cranked delta wing or strake/wing combinations, canard and/or horizontal tail surfaces, and vertical tail surfaces. The initial step before grid generation and flow computation is the determination of a suitable grid topology. The external grid topology that has been applied is called a dual-block topology, which is a patched C¹-continuous multiple-block system in which inner blocks cover the highly swept part of a cranked wing or strake, the rearward inner part of the wing, and tail components. Outer blocks cover the remainder of the fuselage, the outer part of the wing, and canards, and extend to the far-field boundaries. The grid generation is based on transfinite interpolation with Lagrangian blending functions. This procedure has been applied to the Langley experimental fighter configuration and a modified F-18 configuration. Supersonic flow between Mach 1.3 and 2.5 and angles of attack between 0 degrees and 10 degrees have been computed with associated Euler solvers based on the finite-volume approach. When coupling geometric details such as boundary layer diverter regions, duct regions with inlets and outlets, or slots with the general external grid, imposing C¹ continuity can be extremely tedious. The approach taken here is to patch blocks together at common interfaces where there is no grid continuity, but to enforce conservation in the finite-volume solution. The key to this technique is how to obtain the information required for a conservative interface. The Ramshaw technique, which automates the computation of proportional areas of two overlapping grids on a planar surface and is suitable for coding, was used. The researchers generated internal duct grids for the Langley experimental fighter configuration independent of the external grid topology, with a conservative interface at the inlet and outlet.
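
    The transfinite interpolation step mentioned above admits a very compact statement. The following sketch is a minimal two-dimensional transfinite interpolation with linear (first-degree Lagrangian) blending functions, applied to a hypothetical bumped-channel boundary; it illustrates only the interpolation formula, not the dual-block fighter-configuration topology of the paper.

        import numpy as np

        def tfi_grid(bottom, top, left, right, ni, nj):
            """2-D transfinite interpolation with linear Lagrangian blending.
            bottom/top take xi in [0, 1]; left/right take eta in [0, 1]; each
            returns (x, y) arrays, and the four curves must agree at the corners."""
            xi = np.linspace(0.0, 1.0, ni)[:, None]     # shape (ni, 1)
            eta = np.linspace(0.0, 1.0, nj)[None, :]    # shape (1, nj)
            xb, yb = bottom(xi[:, 0]); xt, yt = top(xi[:, 0])
            xl, yl = left(eta[0, :]);  xr, yr = right(eta[0, :])
            X = np.zeros((ni, nj)); Y = np.zeros((ni, nj))
            for F, fb, ft, fl, fr in ((X, xb, xt, xl, xr), (Y, yb, yt, yl, yr)):
                F[:] = ((1 - eta) * fb[:, None] + eta * ft[:, None]
                        + (1 - xi) * fl[None, :] + xi * fr[None, :]
                        - ((1 - xi) * (1 - eta) * fb[0] + xi * (1 - eta) * fb[-1]
                           + (1 - xi) * eta * ft[0] + xi * eta * ft[-1]))
            return X, Y

        # hypothetical channel with a bumped lower wall
        bottom = lambda s: (s, 0.1 * np.sin(np.pi * s))
        top = lambda s: (s, np.ones_like(s))
        left = lambda t: (np.zeros_like(t), t)
        right = lambda t: (np.ones_like(t), t)
        X, Y = tfi_grid(bottom, top, left, right, 41, 21)
        print(X.shape, Y.shape)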

  18. Teachers' personal didactical models and obstacles to professional development: Case-studies with secondary experimental science teachers

    NASA Astrophysics Data System (ADS)

    Wamba Aguado, Ana Maria

    The aim of this thesis has been to elaborate criteria which characterise how teachers teach, as a curriculum component of their professional knowledge, and to infer the obstacles which hinder their desired professional development, in such a way that these can be considered in the design of proposals for teacher training in secondary education. An additional objective was to elaborate and validate data analysis instruments. Case studies were carried out on three natural science secondary teachers with more than ten years' experience, enabling the characterisation of the teachers' conceptions of science and science teaching as well as the description of their classroom practice. Finally, with the help of these data together with the material used by the teachers, the teachers' personal didactical models and the obstacles to their professional development could be inferred. Data collection instruments included a questionnaire used to structure a semi-structured interview, video recordings of each teacher's classroom intervention corresponding to a teaching unit taught over a two-week period, and all the written material produced for the unit. For the data analysis, a taxonomy of classroom intervention patterns and a progression hypothesis towards desirable professional knowledge were elaborated, from the perspective of a research-in-the-classroom model and according to a system of categories and subcategories which refer to teachers' concepts of scientific knowledge, school knowledge, how to teach, and evaluation. From the interview and the questionnaire, a profile of expressed conceptions was obtained. The intervention profile was obtained from the classroom recordings, according to the patterns identified and their sequencing, both of which determine the characteristic structures and routines of these teachers. An outcome of these results was the validation of the previously mentioned taxonomy as an instrument of

  19. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1985

    1985-01-01

    Presents 23 experiments, activities, field projects and computer programs in the biological and physical sciences. Instructional procedures, experimental designs, materials, and background information are suggested. Topics include fluid mechanics, electricity, crystals, arthropods, limpets, acid neutralization, and software evaluation. (ML)

  20. Reading to learn experimental practice: The role of text and firsthand experience in the acquisition of an abstract science principle

    NASA Astrophysics Data System (ADS)

    Richmond, Erica Kesin

    2008-10-01

    From the onset of schooling, texts are used as important educational tools. In the primary years, they are integral to learning how to decode and develop fluency. In the later elementary years, they are often essential to the acquisition of academic content. Unfortunately, many children experience difficulties with this process, which is due in large part to their unfamiliarity with the genre of academic texts. The articles presented in this dissertation share an underlying theme of how to develop children's ability to comprehend and learn from academic, and specifically, non-narrative texts. The first article reviews research on the development of non-narrative discourse to elucidate the linguistic precursors to non-narrative text comprehension. The second and third articles draw from an empirical study that investigated the best way to integrate text, manipulation, and first-hand experience for children's acquisition and application of an abstract scientific principle. The scientific principle introduced in the study was the Control of Variables Strategy (CVS), a fundamental idea underlying scientific reasoning and a strategy for designing unconfounded experiments. Eight grade 4 classes participated in the study (N = 129), in one of three conditions: (a) read procedural text and manipulate experimental materials, (b) listen to procedural text and manipulate experimental materials, or (c) read procedural text with no opportunity to manipulate experimental materials. Findings from the study indicate that children who had the opportunity to read and manipulate materials were most effective at applying the strategy to designing and justifying unconfounded experiments, and evaluating written and physical experimental designs; however, there was no effect of instructional condition on a written assessment of evaluating familiar and unfamiliar experimental designs one week after the intervention. These results suggest that the acquisition and application of an abstract

  1. Les apports de l'experimentation assistee par ordinateur (ExAO) en pedagogie par projet en Sciences de la nature au collegial

    NASA Astrophysics Data System (ADS)

    Marcotte, Alice

    The goals of this research were to conceptualize and to produce a test synthesis model for the Sciences program, in which the student had to demonstrate his or her competency using the approach Considering New Situations from Acquired Knowledge. The test took the form of a student-structured project utilizing the experimental process: the student's scientific investigation was supported and facilitated by computer-assisted experimentation (CAEx). The model of action was elaborated in developmental research within the school setting, tested in biology, and continued in an interdisciplinary context. Our study focused on the advantages and the constraints of this new learning environment, which modifies laboratories that use traditional instrumentation. The final aim of the research was not to evaluate a type of test synthesis, but to propose and to improve this model of test synthesis based on the experimental process and supported by CAEx. In order to implement the competency approach within an integration activity, we chose a cooperative learning environment contained within the pedagogical project. This didactic environment was inspired by socio-constructivism, which involves students in open scientific problem-solving. Computer-assisted experimentation turned out to be a valuable tool for this environment, facilitating the implementation of the scientific process through increased induction. Resistance from a reality that must be confronted and cannot be circumvented changes students' perception of scientific knowledge. They learn to integrate the building of this knowledge, and then to realize the extent of their learning and their training. Students' opinions, gathered from questionnaires, reveal that they perceive favorably this type of environment in interaction with their peers and the experimentation. In addition to this new knowledge on CAEx within the pedagogical project, the products of this research include a teaching guide for the test synthesis, a booklet featuring the projects carried out

  2. Gridded electron reversal ionizer

    NASA Technical Reports Server (NTRS)

    Chutjian, Ara (Inventor)

    1993-01-01

    A gridded electron reversal ionizer forms a three-dimensional cloud of zero or near-zero energy electrons in a cavity within a filament structure surrounding a central electrode having holes through which the sample gas, at reduced pressure, enters an elongated reversal volume. The resultant negative ion stream is applied to a mass analyzer. The reduced electron and ion space-charge limitations of this configuration enhance detection sensitivity for material to be detected by electron attachment, such as narcotic and explosive vapors. Positive ions may be generated by producing electrons of higher energy, sufficient to ionize the target gas, then pulsing the grid negative to stop the electron flow and pulsing the extraction aperture positive to draw out the positive ions.

  3. Smart Grid Demonstration Project

    SciTech Connect

    Miller, Craig; Carroll, Paul; Bell, Abigail

    2015-03-11

    The National Rural Electric Cooperative Association (NRECA) organized the NRECA-U.S. Department of Energy (DOE) Smart Grid Demonstration Project (DE-OE0000222) to install and study a broad range of advanced smart grid technologies in a demonstration that spanned 23 electric cooperatives in 12 states. More than 205,444 pieces of electronic equipment and more than 100,000 minor items (brackets, labels, mounting hardware, fiber optic cable, etc.) were installed to upgrade and enhance the efficiency, reliability, and resiliency of the power networks at the participating co-ops. The objective of this project was to build a path for other electric utilities, and particularly electric cooperatives, to adopt emerging smart grid technology when it can improve utility operations, thus advancing the co-ops' familiarity and comfort with such technology. Specifically, the project executed multiple subprojects employing a range of emerging smart grid technologies to test their cost-effectiveness and, where the technology demonstrated value, provided case studies that will enable other electric utilities, particularly electric cooperatives, to use these technologies. NRECA structured the project according to the following three areas: demonstration of smart grid technology; advancement of standards to enable the interoperability of components; and improvement of grid cyber security. We termed these three areas Technology Deployment Study, Interoperability, and Cyber Security. Although the deployment of technology and studying the demonstration projects at co-ops accounted for the largest portion of the project budget by far, we see our accomplishments in each of the areas as critical to advancing the smart grid. All project deliverables have been published. Technology Deployment Study: The deliverable was a set of 11 single-topic technical reports in areas related to the listed technologies. Each of these reports has already been submitted to DOE, distributed to co-ops, and

  4. Wireless Communications in Smart Grid

    NASA Astrophysics Data System (ADS)

    Bojkovic, Zoran; Bakmaz, Bojan

    Communication networks play a crucial role in smart grid, as the intelligence of this complex system is built based on information exchange across the power grid. Wireless communications and networking are among the most economical ways to build the essential part of the scalable communication infrastructure for smart grid. In particular, wireless networks will be deployed widely in the smart grid for automatic meter reading, remote system and customer site monitoring, as well as equipment fault diagnosing. With an increasing interest from both the academic and industrial communities, this chapter systematically investigates recent advances in wireless communication technology for the smart grid.

  5. On transferring the grid technology to the biomedical community.

    PubMed

    Mohammed, Yassene; Sax, Ulrich; Dickmann, Frank; Lippert, Joerg; Solodenko, Juri; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which resulted in the Grid. The inter domain transfer process of this technology has been an intuitive process. Some difficulties facing the life science community can be understood using the Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies that have achieved certain stability. Grid and Cloud solutions are technologies that are still in flux. We illustrate how Grid computing creates new difficulties for the technology transfer process that are not considered in Bozeman's model. We show why the success of health Grids should be measured by the qualified scientific human capital and opportunities created, and not primarily by the market impact. With two examples we show how the Grid technology transfer theory corresponds to the reality. We conclude with recommendations that can help improve the adoption of Grid solutions into the biomedical community. These results give a more concise explanation of the difficulties most life science IT projects are facing in the late funding periods, and show some leveraging steps which can help to overcome the "vale of tears". PMID:20543424

  6. TRMM Gridded Text Products

    NASA Technical Reports Server (NTRS)

    Stocker, Erich Franz

    2007-01-01

    NASA's Tropical Rainfall Measuring Mission (TRMM) has many products that contain instantaneous or gridded rain rates, often among many other parameters. However, because of their completeness, these products can often seem intimidating to users who simply want surface rain rates. For example, one of the gridded monthly products contains well over 200 parameters. It is clear that if only rain rates are desired, this many parameters might prove intimidating. In addition, for many good reasons these products are archived and currently distributed in HDF format. This also can be an inhibiting factor in using TRMM rain rates. To provide a simple format and isolate just the rain rates from the many other parameters, the TRMM project created a series of gridded products in ASCII text format. This paper describes the various text rain rate products produced. It provides detailed information about parameters and how they are calculated. It also gives detailed format information. These products are used in a number of applications within the TRMM processing system. The products are produced from the swath instantaneous rain rates and contain information from the three major TRMM instruments: radar, radiometer, and combined. They are simple to use, human readable, and small to download.
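
    To illustrate how little code a plain-text gridded product requires of its users, the sketch below parses a tiny whitespace-delimited latitude/longitude/rain-rate listing into a 2-D array. The column layout shown here is a hypothetical stand-in invented for the example, not the documented TRMM text format.

        import io
        import numpy as np

        # hypothetical layout: one line per grid cell, "lat lon rain_mm_per_hr",
        # with '#' lines treated as comments (NOT the official TRMM format)
        sample = io.StringIO("""
        # lat   lon     rain_mm_per_hr
        -0.5    120.5   0.8
        -0.5    121.5   1.2
         0.5    120.5   0.0
         0.5    121.5   2.4
        """)

        lat, lon, rr = np.loadtxt(sample, comments="#", unpack=True)
        lats, lons = np.unique(lat), np.unique(lon)
        grid = np.full((lats.size, lons.size), np.nan)
        grid[np.searchsorted(lats, lat), np.searchsorted(lons, lon)] = rr
        print(grid)   # 2 x 2 array of rain rates; NaN would mark missing cells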

  7. Constructing the ASCI computational grid

    SciTech Connect

    BEIRIGER,JUDY I.; BIVENS,HUGH P.; HUMPHREYS,STEVEN L.; JOHNSON,WILBUR R.; RHEA,RONALD E.

    2000-06-01

    The Accelerated Strategic Computing Initiative (ASCI) computational grid is being constructed to interconnect the high performance computing resources of the nuclear weapons complex. The grid will simplify access to the diverse computing, storage, network, and visualization resources, and will enable the coordinated use of shared resources regardless of location. To match existing hardware platforms, required security services, and current simulation practices, the Globus MetaComputing Toolkit was selected to provide core grid services. The ASCI grid extends Globus functionality by operating as an independent grid, incorporating Kerberos-based security, interfacing to Sandia's Cplant(TM), and extending job monitoring services. To fully meet ASCI's needs, the architecture layers distributed work management and criteria-driven resource selection services on top of Globus. These services simplify the grid interface by allowing users to simply request "run code X anywhere". This paper describes the initial design and prototype of the ASCI grid.

  8. 3D Structured Grid Adaptation

    NASA Technical Reports Server (NTRS)

    Banks, D. W.; Hafez, M. M.

    1996-01-01

    Grid adaptation for structured meshes is the art of using information from an existing, but poorly resolved, solution to automatically redistribute the grid points in such a way as to improve the resolution in regions of high error, and thus the quality of the solution. This involves: (1) generating a grid via some standard algorithm, (2) calculating a solution on this grid, (3) adapting the grid to this solution, (4) recalculating the solution on this adapted grid, and (5) repeating steps 3 and 4 to satisfaction. Steps 3 and 4 can be repeated until some 'optimal' grid is converged to, but typically this is not worth the effort and just two or three repeat calculations are necessary. They also may be repeated every 5-10 time steps for unsteady calculations.
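
    The five-step loop above can be made concrete with a one-dimensional toy: generate a grid, evaluate a stand-in "solution" with a sharp layer on it, redistribute the points by equidistributing a gradient-based weight, and repeat steps 3 and 4 a few times. This is a generic equidistribution sketch with assumed functions and parameters, not the adaptation scheme of the report.

        import numpy as np

        def solution(x):
            # stand-in for a computed solution with a sharp layer near x = 0.5
            return np.tanh(40.0 * (x - 0.5))

        def adapt(x, u, alpha=1.0):
            """One equidistribution pass: move nodes so the weight
            w = sqrt(1 + alpha*(du/dx)^2) integrates to the same value on every cell."""
            w = np.sqrt(1.0 + alpha * np.gradient(u, x) ** 2)
            wc = 0.5 * (w[:-1] + w[1:]) * np.diff(x)       # weight carried by each cell
            s = np.concatenate(([0.0], np.cumsum(wc)))     # monotone mapping x -> s
            targets = np.linspace(0.0, s[-1], len(x))      # equal weight per new cell
            return np.interp(targets, s, x)                # invert the mapping

        x = np.linspace(0.0, 1.0, 41)      # (1) generate an initial grid
        for _ in range(3):                 # (5) repeat steps 3 and 4 a few times
            u = solution(x)                # (2)/(4) "solve" on the current grid
            x = adapt(x, u)                # (3) adapt the grid to the solution
        print("smallest cell after adaptation:", np.diff(x).min())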

  9. Progress in Grid Generation: From Chimera to DRAGON Grids

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Kao, Kai-Hsiung

    1994-01-01

    Hybrid grids, composed of structured and unstructured grids, combine the best features of both. The chimera method is a major stepping stone toward a hybrid grid, from which the present approach has evolved. The chimera grid comprises a set of overlapping structured grids which are independently generated and body-fitted, yielding a high quality grid readily accessible for efficient solution schemes. The chimera method has been shown to be efficient for generating grids about complex geometries and has been demonstrated to deliver accurate aerodynamic prediction of complex flows. While its geometrical flexibility is attractive, interpolation of data in the overlapped regions, which in today's 3D practice is done in a nonconservative fashion, is not. In the present paper we propose a hybrid grid scheme that maximizes the advantages of the chimera scheme and adopts the strengths of the unstructured grid while at the same time keeping its weaknesses minimal. Like the chimera method, we first divide up the physical domain by a set of structured body-fitted grids which are separately generated and overlaid throughout a complex configuration. To eliminate any pure data manipulation which does not necessarily follow the governing equations, we use unstructured grids only to directly replace the regions of arbitrarily overlapped grids. This new adaptation of the chimera approach is coined the DRAGON grid. The unstructured grid region sandwiched between the structured grids is limited in size, resulting in only a small increase in memory and computational effort. The DRAGON method has three important advantages: (1) preserving the strengths of the chimera grid; (2) eliminating difficulties sometimes encountered in the chimera scheme, such as orphan points and poor-quality interpolation stencils; and (3) making grid communication fully conservative and consistent insofar as the governing equations are concerned. To demonstrate its use, the governing equations are

  10. Ion beamlet steering for two-grid electrostatic thrusters. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Homa, J. M.

    1984-01-01

    An experimental study of ion beamlet steering, in which the direction of beamlets emitted from a two-grid aperture system is controlled by relative translation of the grids, is described. The results can be used to design electrostatic accelerating devices for which the direction and focus of emerging beamlets are important. Deflection and divergence angle data are presented for two-grid systems as a function of the relative lateral displacement of the holes in these grids. At large displacements, accelerator grid impingements become excessive; this determines the maximum allowable displacement and, as a result, the useful range of beamlet deflection. Beamlet deflection is shown to vary linearly with grid offset angle over this range. The divergence of the beamlets is found to be unaffected by deflection over the useful range of beamlet deflection. The grids of a typical dished-grid ion thruster are examined to determine the effects of thermally induced grid distortion and prescribed offsets of grid hole centerlines on the characteristics of the emerging beamlets. The results are used to determine the region on the grid surface where ion beamlet deflections exceed the useful range. Over this region, high accelerator grid impingement currents and rapid grid erosion are predicted.

  11. Enhancing control of grid distribution in algebraic grid generation

    NASA Technical Reports Server (NTRS)

    Steinthorsson, E.; Shih, T. I.-P.; Roelke, R. J.

    1992-01-01

    Three techniques are presented to enhance the control of grid-point distribution for a class of algebraic grid generation methods known as the two-, four- and six-boundary methods. First, multidimensional stretching functions are presented, and a technique is devised to construct them based on the desired distribution of grid points along certain boundaries. Second, a normalization procedure is proposed which allows more effective control over orthogonality of grid lines at boundaries and curvature of grid lines near boundaries. And third, interpolating functions based on tension splines are introduced to control curvature of grid lines in the interior of the spatial domain. In addition to these three techniques, consistency conditions are derived which must be satisfied by all user-specified data employed in the grid generation process to control grid-point distribution. The usefulness of the techniques developed in this study was demonstrated by using them in conjunction with the two- and four-boundary methods to generate several grid systems, including a three-dimensional grid system in the coolant passage of a radial turbine blade with serpentine channels and pin fins.
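
    A minimal one-dimensional example of a stretching function, of the kind such methods use to control grid-point distribution, is the hyperbolic-tangent clustering below, with a single parameter setting how strongly points are drawn toward one boundary. It is a generic illustration only, not the multidimensional stretching functions or tension-spline interpolants developed in the paper.

        import numpy as np

        def tanh_cluster(n, beta=2.5):
            """Distribute n points on [0, 1], clustered toward x = 0.
            Larger beta gives stronger clustering at the boundary."""
            s = np.linspace(0.0, 1.0, n)
            return 1.0 + np.tanh(beta * (s - 1.0)) / np.tanh(beta)

        x = tanh_cluster(21)
        print("first spacing:", x[1] - x[0], "  last spacing:", x[-1] - x[-2])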

  12. Lattice QCD thermodynamics on the Grid

    NASA Astrophysics Data System (ADS)

    Mościcki, Jakub T.; Woś, Maciej; Lamanna, Massimo; de Forcrand, Philippe; Philipsen, Owe

    2010-10-01

    We describe how we simultaneously used O(10) nodes of the EGEE Grid, accumulating ca. 300 CPU-years in 2-3 months, to determine an important property of Quantum Chromodynamics. We explain how Grid resources were exploited efficiently and with ease, using a user-level overlay based on the Ganga and DIANE tools on top of the standard Grid software stack. Application-specific scheduling and resource selection based on simple but powerful heuristics allowed us to improve the efficiency of the processing and to obtain the desired scientific results by a specified deadline. This is also a demonstration of the combined use of supercomputers, to calculate the initial state of the QCD system, and Grids, to perform the subsequent massively distributed simulations. The QCD simulation was performed on a 16×4 lattice. Keeping the strange quark mass at its physical value, we reduced the masses of the up and down quarks until, under an increase of temperature, the system underwent a second-order phase transition to a quark-gluon plasma. Then we measured the response of this system to an increase in the quark density. We find that the transition is smoothened rather than sharpened. If confirmed on a finer lattice, this finding makes it unlikely for ongoing experimental searches to find a QCD critical point at small chemical potential.

  13. Design and manufacturing of interlocked composite grids

    NASA Astrophysics Data System (ADS)

    Han, Dongyup

    Composite grid structures made from pultruded unidirectional glass or carbon ribs provide a performance/cost combination unmatched by any composite panel. A new manufacturing method for an ortho-grid using slotted joints and adhesive bonding ("Interlocked Composite Grid" or ICG) has been developed. The high structural performance of the grid is derived from the uni-plies and the efficient load transfer mechanism. Pultrusion is one of the cheapest, fastest, and most reliable manufacturing processes for composite sections. Pultruded ribs, along with the simple assembly concept, lead to a low-cost structure. Also, the flexibility in assembly eliminates the size limitation, and large civil composite structures can be built. Two different equivalent stiffness models, the equivalent plate stiffness matrices and the equivalent engineering constants, have been formulated. The former model, more accurate than the equivalent engineering constants, includes the effects of the slots, the internal ribs, and the skins. The latter is used for establishing simple design guidelines. The equivalent stiffness models have been verified with numerical analysis and experimental data. The simplicity and flexibility of the design of an ICG have been demonstrated by sample design problems. Also, an approximate cost estimation rule has been established. ICG beams and panels have been built and tested under static and dynamic flexural loading. Superior mechanical properties, such as high damage tolerance, resilience, and durability, have been demonstrated. The failure mode has been identified.

  14. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to, but not on, the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI C, the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. The memory is allocated dynamically; therefore, memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time, there is always an active object, which is drawn in magenta, or in its highlighted color as defined by the resource file which will be discussed later.

  15. Service Oriented Gridded Atmospheric Radiances (SOAR)

    NASA Astrophysics Data System (ADS)

    Halem, M.; Goldberg, M. D.; Tilmes, C.; Zhou, L.; Shen, S.; Yesha, Y.

    2005-12-01

    responsively meeting diverse user-specified requests in terms of the spatial and temporal compositing of radiance fields. Moreover, the volume of sounder data records produced from current and future instruments varies from GBs to TBs per day, and gridding these sounding data can thin the volume to KBs to MBs per day, making them easier to download to desktops and laptops. This will not only better serve a wider earth science community but also make these capabilities more readily useful to the education community. This presentation will describe the rationale for the project, give an overview of the system architecture, describe the framework for executing the applications on the distributed cluster, and present examples of gridded service requests that are currently available. This demonstration project represents a foundation for the development of a distributed web service architecture that will be able to invoke requested services for temperature and moisture retrievals for arbitrary integrated gridded radiance data sets. We plan to extend the framework to accommodate such services for other earth observing instruments as well.
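
    The basic gridding step that makes such volume reduction possible is simple to sketch: scattered swath samples are averaged into regular latitude/longitude bins. The toy below does this with synthetic "radiances" on an assumed 2-degree grid; it illustrates bin averaging only and is not the SOAR processing chain.

        import numpy as np

        rng = np.random.default_rng(1)
        lat = rng.uniform(-60.0, 60.0, 5000)          # scattered footprint centers
        lon = rng.uniform(-180.0, 180.0, 5000)
        tb = 250.0 + 0.3 * lat + rng.normal(0.0, 2.0, lat.size)   # synthetic brightness temps

        res = 2.0                                     # assumed grid resolution in degrees
        ny, nx = int(180 / res), int(360 / res)
        i = ((lat + 90.0) // res).astype(int)         # row index of each sample
        j = ((lon + 180.0) // res).astype(int)        # column index of each sample
        flat = i * nx + j
        sums = np.bincount(flat, weights=tb, minlength=ny * nx)
        counts = np.bincount(flat, minlength=ny * nx)
        with np.errstate(invalid="ignore"):
            gridded = (sums / counts).reshape(ny, nx) # NaN where no samples fell
        print("filled cells:", np.isfinite(gridded).sum(), "of", gridded.size)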

  16. The BioGRID interaction database: 2015 update.

    PubMed

    Chatr-Aryamontri, Andrew; Breitkreutz, Bobby-Joe; Oughtred, Rose; Boucher, Lorrie; Heinicke, Sven; Chen, Daici; Stark, Chris; Breitkreutz, Ashton; Kolas, Nadine; O'Donnell, Lara; Reguly, Teresa; Nixon, Julie; Ramage, Lindsay; Winter, Andrew; Sellam, Adnane; Chang, Christie; Hirschman, Jodi; Theesfeld, Chandra; Rust, Jennifer; Livstone, Michael S; Dolinski, Kara; Tyers, Mike

    2015-01-01

    The Biological General Repository for Interaction Datasets (BioGRID: http://thebiogrid.org) is an open access database that houses genetic and protein interactions curated from the primary biomedical literature for all major model organism species and humans. As of September 2014, the BioGRID contains 749,912 interactions as drawn from 43,149 publications that represent 30 model organisms. This interaction count represents a 50% increase compared to our previous 2013 BioGRID update. BioGRID data are freely distributed through partner model organism databases and meta-databases and are directly downloadable in a variety of formats. In addition to general curation of the published literature for the major model species, BioGRID undertakes themed curation projects in areas of particular relevance for biomedical sciences, such as the ubiquitin-proteasome system and various human disease-associated interaction networks. BioGRID curation is coordinated through an Interaction Management System (IMS) that facilitates the compilation of interaction records through structured evidence codes, phenotype ontologies, and gene annotation. The BioGRID architecture has been improved in order to support a broader range of interaction and post-translational modification types, to allow the representation of more complex multi-gene/protein interactions, to account for cellular phenotypes through structured ontologies, to expedite curation through semi-automated text-mining approaches, and to enhance curation quality control.

  17. The BioGRID interaction database: 2015 update

    PubMed Central

    Chatr-aryamontri, Andrew; Breitkreutz, Bobby-Joe; Oughtred, Rose; Boucher, Lorrie; Heinicke, Sven; Chen, Daici; Stark, Chris; Breitkreutz, Ashton; Kolas, Nadine; O'Donnell, Lara; Reguly, Teresa; Nixon, Julie; Ramage, Lindsay; Winter, Andrew; Sellam, Adnane; Chang, Christie; Hirschman, Jodi; Theesfeld, Chandra; Rust, Jennifer; Livstone, Michael S.; Dolinski, Kara; Tyers, Mike

    2015-01-01

    The Biological General Repository for Interaction Datasets (BioGRID: http://thebiogrid.org) is an open access database that houses genetic and protein interactions curated from the primary biomedical literature for all major model organism species and humans. As of September 2014, the BioGRID contains 749 912 interactions as drawn from 43 149 publications that represent 30 model organisms. This interaction count represents a 50% increase compared to our previous 2013 BioGRID update. BioGRID data are freely distributed through partner model organism databases and meta-databases and are directly downloadable in a variety of formats. In addition to general curation of the published literature for the major model species, BioGRID undertakes themed curation projects in areas of particular relevance for biomedical sciences, such as the ubiquitin-proteasome system and various human disease-associated interaction networks. BioGRID curation is coordinated through an Interaction Management System (IMS) that facilitates the compilation of interaction records through structured evidence codes, phenotype ontologies, and gene annotation. The BioGRID architecture has been improved in order to support a broader range of interaction and post-translational modification types, to allow the representation of more complex multi-gene/protein interactions, to account for cellular phenotypes through structured ontologies, to expedite curation through semi-automated text-mining approaches, and to enhance curation quality control. PMID:25428363

  18. Smart Grid Risk Management

    NASA Astrophysics Data System (ADS)

    Abad Lopez, Carlos Adrian

    Current electricity infrastructure is being stressed from several directions: high demand, unreliable supply, extreme weather conditions, and accidents, among others. Infrastructure planners have traditionally focused only on the cost of the system; today, resilience and sustainability are increasingly important. In this dissertation, we develop computational tools for efficiently managing electricity resources to help create a more reliable and sustainable electrical grid. The tools we present in this work will help electric utilities coordinate demand to allow the smooth and large-scale integration of renewable sources of energy into traditional grids, as well as provide infrastructure planners and operators in developing countries a framework for making informed planning and control decisions in the presence of uncertainty. Demand-side management is considered the most viable solution for maintaining grid stability as generation from intermittent renewable sources increases. Demand-side management, particularly demand response (DR) programs that attempt to alter the energy consumption of customers either by using price-based incentives or up-front power interruption contracts, is more cost-effective and sustainable in addressing short-term supply-demand imbalances than the alternative of increasing fossil fuel-based fast spinning reserves. An essential step in compensating participating customers and benchmarking the effectiveness of DR programs is to be able to independently detect the load reduction from observed meter data. Electric utilities implementing automated DR programs through direct load control switches are also interested in detecting the reduction in demand to efficiently pinpoint non-functioning devices and reduce maintenance costs. We develop sparse optimization methods for detecting a small change in the demand for electricity of a customer in response to a price change or signal from the utility
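
    As a toy version of the detection problem described above (not the dissertation's method), the sketch below simulates a day of meter readings containing a short demand-response curtailment and recovers the sparse change by L1-regularized estimation; with an identity design matrix the lasso reduces to soft-thresholding of the residual between the observed load and an assumed known baseline.

        import numpy as np

        rng = np.random.default_rng(0)
        T = 96                                       # one day of 15-minute meter readings
        baseline = 2.0 + 0.5 * np.sin(2 * np.pi * np.arange(T) / T)   # assumed known baseline (kW)
        change = np.zeros(T)
        change[60:68] = -0.6                         # short demand-response curtailment
        observed = baseline + change + 0.1 * rng.standard_normal(T)

        # lasso with an identity design reduces to soft-thresholding the residual:
        # minimize 0.5*||r - c||^2 + lam*||c||_1  ==>  c = sign(r) * max(|r| - lam, 0)
        r = observed - baseline
        lam = 0.3
        c_hat = np.sign(r) * np.maximum(np.abs(r) - lam, 0.0)
        print("intervals flagged as responding:", np.flatnonzero(c_hat))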

  19. Earth System Grid and EGI interoperability

    NASA Astrophysics Data System (ADS)

    Raciazek, J.; Petitdidier, M.; Gemuend, A.; Schwichtenberg, H.

    2012-04-01

    The Earth Science data centers have developed a data grid called the Earth System Grid Federation (ESGF) to give the scientific community worldwide access to CMIP5 (Coupled Model Inter-comparison Project 5) climate data. The CMIP5 data will make it possible to evaluate the impact of climate change in various environmental and societal areas, such as regional climate, extreme events, agriculture, insurance… The ESGF grid provides services such as searching, browsing and downloading of datasets. At the security level, ESGF data access is protected by an authentication mechanism. An ESGF-trusted X509 Short-Lived EEC certificate with the correct roles/attributes is required to get access to the data in a non-interactive way (e.g. from a worker node). To access ESGF from EGI (i.e. by earth science applications running on the EGI infrastructure), the security incompatibility between the two grids is the challenge: the EGI proxy certificate is neither trusted by ESGF nor does it contain the correct roles/attributes. To solve this problem, we decided to use a Credential Translation Service (CTS) to translate the EGI X509 proxy certificate into the ESGF Short-Lived EEC certificate (the CTS will issue ESGF certificates based on EGI certificate authentication). From the end-user perspective, the main steps to use the CTS are: the user binds his two identities (EGI and ESGF) together in the CTS using the CTS web interface (this step has to be done only once) and then requests an ESGF Short-Lived EEC certificate every time one is needed, using a command-line tool. The implementation of the CTS is on-going. It is based on the open source MyProxy software stack, which is used in many grid infrastructures. On the client side, the "myproxy-logon" command-line tool is used to request the certificate translation. A new option has been added to "myproxy-logon" to select the original certificate (in our case, the EGI one). On the server side, the MyProxy server operates in Certificate Authority mode, with a new module

  20. The pilot way to Grid resources using glideinWMS

    SciTech Connect

    Sfiligoi, Igor; Bradley, Daniel C.; Holzman, Burt; Mhashilkar, Parag; Padhi, Sanjay; Wurthwein, Frank; /UC, San Diego

    2010-09-01

    Grid computing has become very popular in big and widespread scientific communities with high computing demands, like high energy physics. Computing resources are being distributed over many independent sites with only a thin layer of Grid middleware shared between them. This deployment model has proven to be very convenient for computing resource providers, but has introduced several problems for the users of the system, the three major ones being the complexity of job scheduling, the nonuniformity of compute resources, and the lack of good job monitoring. Pilot jobs address all of the above problems by creating a virtual private computing pool on top of Grid resources. This paper presents both the general pilot concept and a concrete implementation, called glideinWMS, deployed in the Open Science Grid.
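
    The pilot idea is easy to caricature in a few lines: pilots start on resources first and then pull user jobs from a central queue, so users see one uniform virtual pool. The toy below is a single-process sketch of that pull pattern only; glideinWMS itself is built on HTCondor glideins, and none of the names here correspond to its actual components.

        import queue
        import threading
        import time

        user_jobs = queue.Queue()            # central queue standing in for the user job list
        for i in range(6):
            user_jobs.put(f"job-{i}")

        def pilot(site):
            """A pilot that has already started on some Grid site and now pulls payloads."""
            while True:
                try:
                    job = user_jobs.get_nowait()   # pull work instead of being pushed jobs
                except queue.Empty:
                    return                         # no work left: the pilot exits quietly
                time.sleep(0.01)                   # pretend to run the payload
                print(f"pilot at {site} finished {job}")
                user_jobs.task_done()

        pilots = [threading.Thread(target=pilot, args=(s,)) for s in ("siteA", "siteB")]
        for p in pilots:
            p.start()
        user_jobs.join()                     # all payloads done, independent of which site ran them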

  1. Integrating Grid Services into the Cray XT4 Environment

    SciTech Connect

    NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy

    2009-05-01

    The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped-down compute-node operating system without dynamic library support, a shared-root environment and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.

  2. Grist : grid-based data mining for astronomy

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S.; Miller, Craig D.; Walia, Harshpreet; Williams, Roy; Djorgovski, S. George; Graham, Matthew J.; Mahabal, Ashish; Babu, Jogesh; Berk, Daniel E. Vanden; Nichol, Robert

    2004-01-01

    The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the 'hyperatlas' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.

  3. Are Clinical Trials With Mesenchymal Stem/Progenitor Cells too Far Ahead of the Science? Lessons From Experimental Hematology

    PubMed Central

    Prockop, Darwin J; Prockop, Susan E; Bertoncello, Ivan

    2014-01-01

    The cells referred to as mesenchymal stem/progenitor cells (MSCs) are currently being used to treat thousands of patients with diseases of essentially all the organs and tissues of the body. Strikingly positive results have been reported in some patients, but there have been few prospective controlled studies. Also, the reasons for the beneficial effects are frequently unclear. As a result there has been a heated debate as to whether the clinical trials with these new cell therapies are too far ahead of the science. The debate is not easily resolved, but important insights are provided by the 60-year history that was required to develop the first successful stem cell therapy, the transplantation of hematopoietic stem cells. The history indicates that development of a dramatically new therapy usually requires patience and a constant dialogue between basic scientists and physicians carrying out carefully designed clinical trials. It also suggests that the field can be moved forward by establishing better records of how MSCs are prepared, by establishing a large supply of reference MSCs that can be used to validate assays and compare MSCs prepared in different laboratories, and by continuing efforts to establish in vivo assays for the efficacy of MSCs. Stem Cells 2014;32:3055–3061 PMID:25100155

  4. Are clinical trials with mesenchymal stem/progenitor cells too far ahead of the science? Lessons from experimental hematology.

    PubMed

    Prockop, Darwin J; Prockop, Susan E; Bertoncello, Ivan

    2014-12-01

    The cells referred to as mesenchymal stem/progenitor cells (MSCs) are currently being used to treat thousands of patients with diseases of essentially all the organs and tissues of the body. Strikingly positive results have been reported in some patients, but there have been few prospective controlled studies. Also, the reasons for the beneficial effects are frequently unclear. As a result there has been a heated debate as to whether the clinical trials with these new cell therapies are too far ahead of the science. The debate is not easily resolved, but important insights are provided by the 60-year history that was required to develop the first successful stem cell therapy, the transplantation of hematopoietic stem cells. The history indicates that development of a dramatically new therapy usually requires patience and a constant dialogue between basic scientists and physicians carrying out carefully designed clinical trials. It also suggests that the field can be moved forward by establishing better records of how MSCs are prepared, by establishing a large supply of reference MSCs that can be used to validate assays and compare MSCs prepared in different laboratories, and by continuing efforts to establish in vivo assays for the efficacy of MSCs.

  5. Folding of electrostatically charged beads-on-a-string as an experimental realization of a theoretical model in polymer science.

    PubMed

    Reches, Meital; Snyder, Phillip W; Whitesides, George M

    2009-10-20

    The "beads-on-a-string" model for folding of polymers is a cornerstone of theoretical polymer science. This communication describes a physical model of beads-on-a-string, based on the folding of flexible strings of electrostatically charged beads in two dimensions. The system comprises millimeter-scale Teflon and Nylon-6,6 (spherical or cylindrical) beads (approximately 6 mm in diameter) separated by smaller (approximately 3 mm) poly(methyl methacrylate) (PMMA) spherical beads, threaded on a flexible string. The smaller, uncharged beads define the distances between the larger beads, and control the flexibility of the string. During agitation of the sequence of beads on a planar, horizontal paper surface, tribocharging generates opposite electrostatic charges on the larger Nylon and Teflon beads, but leaves the smaller PMMA beads essentially uncharged; the resulting electrostatic interactions cause the string to fold. Examination and comparison of two models--one physical and one theoretical--may offer a new approach to understanding folding, collapse, and molecular recognition at an abstract level, with particular opportunity to explore the influence of the flexibility of the string and the shape of the beads on the pattern and rate of folding. The physical system is, thus, an analog computer, simulating the theoretical beads-on-a-string model in two dimensions; this system makes it possible to test hypotheses connecting "sequence" to "folding", rapidly and conveniently, while exploring nonlinearities and other complexities omitted from the theoretical model.

  6. At the source of western science: the organization of experimentalism at the Accademia del Cimento (1657-1667).

    PubMed

    Beretta, M

    2000-05-01

    The Accademia del Cimento, founded by the Medici princes, Ferdinando II, Grand Duke of Tuscany, and his brother, Leopoldo, later Cardinal, had members and programmes of research very different from earlier academies in Italy. The Cimento foreshadowed later European academies and institutions specifically devoted to research and improvement of natural knowledge. It issued only one publication, the Saggi di naturali esperienze, and most of the observations and experimental results from its brief life remain unpublished. The Roman Accademia fisica-matematica, associated with Queen Christina of Sweden, continued to some extent its emphasis on experiment, while The Royal Society, with which it maintained links, placed even greater reliance on experiment and its validation through unvarnished publication. Comparisons between the Cimento and its contemporaries, The Royal Society and the French academy, illuminate the origin of scientific institutions in the early modern period.

  7. Between parasitic theory and experimental oncology: a proposal for systematizing oncological science in Portugal, 1889-1945.

    PubMed

    Costa, Rui Manuel Pinto

    2012-06-01

    This article deals with the bio-medical investigation of cancer studies in Portugal between 1889 and 1945. By examining the main works produced between the end of the nineteenth century and the middle of the twentieth century, it has been possible to illuminate and define a field of scientific endeavour which has been the scope of little study to date. Starting from the introduction and consolidation of the defining principles of experimental oncology, distinct phases can be discerned in the production of scientific material, alternating between support for the dominant theories and the application of methods for artificially creating the disease. In accordance with the principal phases of investigation, a brief systematic overview of the scope of these oncological studies is presented.

  9. Experimental Investigation of Space Radiation Processing in Lunar Soil Ilmenite: Combining Perspectives from Surface Science and Transmission Electron Microscopy

    NASA Technical Reports Server (NTRS)

    Christoffersen, R.; Keller, L. P.; Rahman, Z.; Baragiola, R.

    2010-01-01

    Energetic ions mostly from the solar wind play a major role in lunar space weathering because they contribute structural and chemical changes to the space-exposed surfaces of lunar regolith grains. In mature mare soils, ilmenite (FeTiO3) grains in the finest size fraction have been shown in transmission electron microscope (TEM) studies to exhibit key differences in their response to space radiation processing relative to silicates [1,2,3]. In ilmenite, solar ion radiation alters host grain outer margins to produce 10-100 nm thick layers that are microstructurally complex, but dominantly crystalline compared to the amorphous radiation-processed rims on silicates [1,2,3]. Spatially well-resolved analytical TEM measurements also show nm-scale compositional and chemical state changes in these layers [1,3]. These include shifts in Fe/Ti ratio from strong surface Fe-enrichment (Fe/Ti >> 1), to Fe depletion (Fe/Ti < 1) at 40-50 nm below the grain surface [1,3]. These compositional changes are not observed in the radiation-processed rims on silicates [4]. Several mechanism(s) to explain the overall relations in the ilmenite grain rims by radiation processing and/or additional space weathering processes were proposed by [1], and remain under current consideration [3]. A key issue has concerned the ability of ion radiation processing alone to produce some of the deeper- penetrating compositional changes. In order to provide some experimental constraints on these questions, we have performed a combined X-ray photoelectron spectroscopy (XPS) and field-emission scanning transmission electron (FE-STEM) study of experimentally ion-irradiated ilmenite. A key feature of this work is the combination of analytical techniques sensitive to changes in the irradiated samples at depth scales going from the immediate surface (approx.5 nm; XPS), to deeper in the grain interior (5-100 nm; FE-STEM).

  10. AstroGrid: the UK's Virtual Observatory Initiative

    NASA Astrophysics Data System (ADS)

    Mann, Robert G.; Astrogrid Consortium; Lawrence, Andy; Davenhall, Clive; Mann, Bob; McMahon, Richard; Irwin, Mike; Walton, Nic; Rixon, Guy; Watson, Mike; Osborne, Julian; Page, Clive; Allan, Peter; Giaretta, David; Perry, Chris; Pike, Dave; Sherman, John; Murtagh, Fionn; Harra, Louise; Bentley, Bob; Mason, Keith; Garrington, Simon

    AstroGrid is the UK's Virtual Observatory (VO) initiative. It brings together the principal astronomical data centres in the UK, and has been funded to the tune of ~£5M over the next three years, via PPARC, as part of the UK e-science programme. Its twin goals are the provision of the infrastructure and tools for the federation and exploitation of large astronomical (X-ray to radio), solar and space plasma physics datasets, and the delivery of federations of current datasets for its user communities to exploit using those tools. Whilst AstroGrid's work will be centred on existing and future (e.g. VISTA) UK datasets, it will seek solutions to generic VO problems and will contribute to the developing international virtual observatory framework: AstroGrid is a member of the EU-funded Astrophysical Virtual Observatory project, has close links to a second EU Grid initiative, the European Grid of Solar Observations (EGSO), and will seek an active role in the development of the common standards on which the international virtual observatory will rely. In this paper we shall primarily describe the concrete plans for AstroGrid's one-year Phase A study, which will centre on: (i) the definition of detailed science requirements through community consultation; (ii) the undertaking of a "functionality market survey" to test the utility of existing technologies for the VO; and (iii) a pilot programme of database federations, each addressing different aspects of the general database federation problem. Further information can be found on the AstroGrid website.

  11. Experimental evidence shows no fractionation of strontium isotopes ((87)Sr/(86)Sr) among soil, plants, and herbivores: implications for tracking wildlife and forensic science.

    PubMed

    Flockhart, D T Tyler; Kyser, T Kurt; Chipley, Don; Miller, Nathan G; Norris, D Ryan

    2015-01-01

    Strontium isotopes ((87)Sr/(86)Sr) can be useful biological markers for a wide range of forensic science applications, including wildlife tracking. However, one of the main advantages of using (87)Sr/(86)Sr values, that there is no fractionation from geological bedrock sources through the food web, also happens to be a critical assumption that has never been tested experimentally. We test this assumption by measuring (87)Sr/(86)Sr values across three trophic levels in a controlled greenhouse experiment. Adult monarch butterflies were raised on obligate larval host milkweed plants that were, in turn, grown on seven different soil types collected across Canada. We found no significant differences between (87)Sr/(86)Sr values in leachable Sr from soil minerals, organic soil, milkweed leaves, and monarch butterfly wings. Our results suggest that strontium isoscapes developed from (87)Sr/(86)Sr values in bedrock or soil may serve as a reliable biological marker in forensic science for a range of taxa and across large geographic areas.

  13. The International Symposium on Grids and Clouds

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 is the decennium anniversary of the ISGC which over the last decade has tracked the convergence, collaboration and innovation of individual researchers across the Asia Pacific region to a coherent community. With the continuous support and dedication from the delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. The emergence of large-scale research infrastructures and instruments that has produced a torrent of electronic data is forcing a generational change in the scientific process and the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure that connects the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics, Biomedicine & Life Sciences, Earth Science, Environmental Changes and Natural Disaster Mitigation, Humanities & Social Sciences, Operations & Management, Middleware & Interoperability, Security and Networking, Infrastructure Clouds & Virtualisation, Business Models & Sustainability, Data Management, Distributed Volunteer & Desktop Grid Computing, High Throughput Computing, and High Performance, Manycore & GPU Computing.

  14. Past and future of ESA Earth Observation Grid

    NASA Astrophysics Data System (ADS)

    Fusco, L.; Cossu, R.

    Due to its intensive data processing and highly distributed organization, the multidisciplinary Earth Science (ES) applications community is uniquely positioned for the uptake and exploitation of Grid technologies. In this paper, we describe a number of initiatives that the European Space Agency is carrying out, focusing on an ES e-collaboration platform that makes use of Grid and SOA technologies. Starting from the experience gained so far with ESA Grid Processing on Demand, and the results of the EC-funded DEGREE project, we will discuss the vision of a dedicated ES platform. The aim is to enable scientists to locate, access, combine and integrate historical and fresh Earth-related data from space, airborne and in-situ sensors archived in large distributed repositories. The big challenge is allowing Earth Science communities to easily and quickly derive objective information and share knowledge based on all environmentally sensitive domains. The GENESI-DR project is already taking the first steps in developing this platform.

  15. Grid crusher apparatus and method

    SciTech Connect

    McDaniels, J.D. Jr.

    1994-01-11

    A grid crusher apparatus and method are provided for a nuclear fuel rod consolidation system. Spacer grids are crushed within a basket which is then placed in a storage canister. The grid crusher apparatus has a ram assembly and a basket driving mechanism. The ram assembly has a sleeve ram and a central ram. The sleeve ram surrounds the central ram which is longitudinally movable within the sleeve ram. The central ram protrudes from the sleeve ram at a ram contact end and is retractable upon application of a preselected force to the central ram so that the central ram is flush with the sleeve ram at the ram contact end. The basket driving mechanism is configured to move the basket containing a spacer grid towards the ram contact end so that the spacer grid is crushed within the basket. The spacer grid is crushed by the combination of successive forces from the central ram and the sleeve ram, respectively. Essentially, the central portion of the spacer grid is crushed first, and then the remaining outer portion of the spacer grid is crushed to complete the crushing action of the spacer grid. The foregoing process is repeated for other spacer grids until the basket reaches a predetermined allowable capacity, and then the basket is stored in a storage canister. 11 figs.

  16. Evaluating the Information Power Grid using the NAS Grid Benchmarks

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Frumkin, Michael A.

    2004-01-01

    The NAS Grid Benchmarks (NGB) are a collection of synthetic distributed applications designed to rate the performance and functionality of computational grids. We compare several implementations of the NGB to determine the programmability and efficiency of NASA's Information Power Grid (IPG), whose services are mostly based on the Globus Toolkit. We report on the overheads involved in porting existing NGB reference implementations to the IPG. No changes were made to the component tasks of the NGB, but the results indicate that the usability and efficiency of the IPG can still be improved.

  17. The Volume Grid Manipulator (VGM): A Grid Reusability Tool

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    This document is a manual describing how to use the Volume Grid Manipulation (VGM) software. The code is specifically designed to alter or manipulate existing surface and volume structured grids to improve grid quality through the reduction of grid line skewness, removal of negative volumes, and adaption of surface and volume grids to flow field gradients. The software uses a command language to perform all manipulations thereby offering the capability of executing multiple manipulations on a single grid during an execution of the code. The command language can be input to the VGM code by a UNIX style redirected file, or interactively while the code is executing. The manual consists of 14 sections. The first is an introduction to grid manipulation; where it is most applicable and where the strengths of such software can be utilized. The next two sections describe the memory management and the manipulation command language. The following 8 sections describe simple and complex manipulations that can be used in conjunction with one another to smooth, adapt, and reuse existing grids for various computations. These are accompanied by a tutorial section that describes how to use the commands and manipulations to solve actual grid generation problems. The last two sections are a command reference guide and trouble shooting sections to aid in the use of the code as well as describe problems associated with generated scripts for manipulation control.

  18. GridPP: the UK grid for particle physics.

    PubMed

    Britton, D; Cass, A J; Clarke, P E L; Coles, J; Colling, D J; Doyle, A T; Geddes, N I; Gordon, J C; Jones, R W L; Kelsey, D P; Lloyd, S L; Middleton, R P; Patrick, G N; Sansum, R A; Pearce, S E

    2009-06-28

    The start-up of the Large Hadron Collider (LHC) at CERN, Geneva, presents a huge challenge in processing and analysing the vast amounts of scientific data that will be produced. The architecture of the worldwide grid that will handle 15 PB of particle physics data annually from this machine is based on a hierarchical tiered structure. We describe the development of the UK component (GridPP) of this grid from a prototype system to a full exploitation grid for real data analysis. This includes the physical infrastructure, the deployment of middleware, operational experience and the initial exploitation by the major LHC experiments. PMID:19451101

  20. National Grid Deep Energy Retrofit Pilot Program—Clark Residence

    SciTech Connect

    2010-03-30

    In this case study, Building Science Corporation partnered with the local utility company, National Grid, on a deep energy retrofit pilot for Massachusetts homes. This project involved the renovation of an 18th-century Cape-style building and achieved a super-insulated enclosure (R-35 walls, R-50+ roof, R-20+ foundation), extensive water management improvements, a high-efficiency water heater, and state-of-the-art ventilation.

  1. GRID[subscript C] Renewable Energy Data Streaming into Classrooms

    ERIC Educational Resources Information Center

    DeLuca, V. William; Carpenter, Pam; Lari, Nasim

    2010-01-01

    For years, researchers have shown the value of using real-world data to enhance instruction in mathematics, science, and social studies. In an effort to help develop students' higher-order thinking skills in a data-rich learning environment, Green Research for Incorporating Data in the Classroom (GRID[subscript C]), a National Science…

  2. 75 FR 6414 - Consumer Interface With the Smart Grid

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-09

    ... electronic mail. Electronic mail responses will be re-posted on the online forum. Instructions are provided...://blog.ostp.gov/category/smart-grid . Via E-mail: smartgrid@ostp.gov . Mail: Office of Science and... sensitive personal information or proprietary information. If you submit an e-mail comment, your...

  3. A Diagnostic Study of Computer Application of Structural Communication Grid

    ERIC Educational Resources Information Center

    Bahar, Mehmet; Aydin, Fatih; Karakirik, Erol

    2009-01-01

    In this article, Structural communication grid (SCG), an alternative measurement and evaluation technique, has been firstly summarised and the design, development and implementation of a computer based SCG system have been introduced. The system is then tested on a sample of 154 participants consisting of candidate students, science teachers and…

  4. Adventures in Computational Grids

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Sometimes one supercomputer is not enough. Or your local supercomputers are busy, or not configured for your job. Or you don't have any supercomputers. You might be trying to simulate worldwide weather changes in real time, requiring more compute power than you could get from any one machine. Or you might be collecting microbiological samples on an island, and need to examine them with a special microscope located on the other side of the continent. These are the times when you need a computational grid.

  5. TASMANIAN Sparse Grids Module

    SciTech Connect

    Munster, Drayton; Stoyanov, Miroslav

    2013-09-20

    Sparse grids are the family of methods of choice for multidimensional integration and interpolation in a low to moderate number of dimensions. The method is to extend a one-dimensional set of abscissas, weights and basis functions by taking a subset of all possible tensor products. The module provides the ability to create global and local approximations based on polynomials and wavelets. The software has three components: a library, a wrapper for the library that provides a command line interface via text files, and a MATLAB interface via the command line tool.
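
    The selection rule behind such modules can be sketched generically (this is the classic isotropic Smolyak construction, not TASMANIAN's API; the level-to-point growth rule m(l) below is an assumption for illustration):

      from itertools import product
      import math

      def m(level):
          """Points of a nested 1-D rule at a given level (Clenshaw-Curtis-like growth)."""
          return 1 if level == 0 else 2 ** level + 1

      def sparse_index_set(dim, max_level):
          """Multi-indices kept by the isotropic total-level (Smolyak) selection."""
          return [idx for idx in product(range(max_level + 1), repeat=dim)
                  if sum(idx) <= max_level]

      dim, max_level = 4, 5
      kept = sparse_index_set(dim, max_level)
      sparse_bound = sum(math.prod(m(l) for l in idx) for idx in kept)  # ignores nesting overlap
      full_tensor = m(max_level) ** dim
      print(f"tensor-product terms kept: {len(kept)} of {(max_level + 1) ** dim}")
      print(f"point count: sparse bound {sparse_bound} vs full tensor grid {full_tensor}")

    Keeping only index combinations whose levels sum to the target level is what makes the point count grow far more slowly with dimension than the full tensor product.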

  8. Prepares Overset Grids for Processing

    SciTech Connect

    Barnette, Daniel W.

    1998-04-22

    Many large and complex computational problems require multiple, structured, generically overlapped (overset) grids to obtain numerical solutions in a timely manner. BREAKUP significantly reduces required compute times by preparing overset grids for processing on massively parallel computers. BREAKUP subdivides the original grids for use on a user-specified number of parallel processors. Grid-to-grid and intragrid communications are maintained in the parallel environment via connectivity tables generated by BREAKUP. The subgrids are formed to be statically load balanced and to incur a minimum of communication between the subgrids. When the output of BREAKUP is submitted to an appropriately modified flow solver, subgrid solutions will be updated simultaneously. This contrasts to the much less efficient solution method of updating each original grid sequentially as done in the past.
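
    The core idea, splitting each structured grid into roughly equal-cell pieces and handing processors work in proportion to grid size, can be sketched in a few lines (a toy stand-in, not the BREAKUP code; the grid sizes and single-axis split are invented for the example):

      def split_grid(shape, nparts):
          """Split a structured grid (ni, nj, nk) into index ranges along its longest axis."""
          axis = max(range(3), key=lambda a: shape[a])
          base, extra = divmod(shape[axis], nparts)
          ranges, start = [], 0
          for p in range(nparts):
              size = base + (1 if p < extra else 0)
              ranges.append((axis, start, start + size))
              start += size
          return ranges

      # Two overset component grids; processors are assigned in proportion to cell count
      # so the static load stays balanced.
      grids = {"background": (201, 101, 61), "wing": (181, 61, 41)}
      cells = {name: s[0] * s[1] * s[2] for name, s in grids.items()}
      total, nprocs = sum(cells.values()), 8
      for name, shape in grids.items():
          nparts = max(1, round(nprocs * cells[name] / total))
          print(name, "->", split_grid(shape, nparts))

    The real tool additionally builds the grid-to-grid and intragrid connectivity tables that keep the subgrids communicating correctly; that bookkeeping is omitted here.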

  9. Experimental errors?

    NASA Astrophysics Data System (ADS)

    Downie, Neil; Turner, Jez

    2014-07-01

    In reply to Matin Durrani's article “Experimental mistake” (May p15, see also http://ow.ly/vDYlM) criticizing plans to base A-level science exams in England entirely on written tests, with practical skills noted as a separate grade.

  10. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008-2013

    NASA Astrophysics Data System (ADS)

    Ortega, J.; Turnipseed, A.; Guenther, A. B.; Karl, T. G.; Day, D. A.; Gochis, D.; Huffman, J. A.; Prenni, A. J.; Levin, E. J. T.; Kreidenweis, S. M.; DeMott, P. J.; Tobo, Y.; Patton, E. G.; Hodzic, A.; Cui, Y.; Harley, P. C.; Hornbrook, R. H.; Apel, E. C.; Monson, R. K.; Eller, A. S. D.; Greenberg, J. P.; Barth, M.; Campuzano-Jost, P.; Palm, B. B.; Jimenez, J. L.; Aiken, A. C.; Dubey, M. K.; Geron, C.; Offenberg, J.; Ryan, M. G.; Fornwalt, P. J.; Pryor, S. C.; Keutsch, F. N.; DiGangi, J. P.; Chan, A. W. H.; Goldstein, A. H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, C. A.; Mauldin, R. L., III; Smith, J. N.

    2014-01-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and inter-relationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air, but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include: - soil property measurements, - hydrological studies, - measurements of high-frequency turbulence parameters, - eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy, - biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry, - aerosol number and mass distributions, - chemical speciation of aerosol particles, - characterization of ice and cloud condensation nuclei, - trace gas measurements, and - model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurement, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  11. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008 to 2013

    NASA Astrophysics Data System (ADS)

    Ortega, J.; Turnipseed, A.; Guenther, A. B.; Karl, T. G.; Day, D. A.; Gochis, D.; Huffman, J. A.; Prenni, A. J.; Levin, E. J. T.; Kreidenweis, S. M.; DeMott, P. J.; Tobo, Y.; Patton, E. G.; Hodzic, A.; Cui, Y. Y.; Harley, P. C.; Hornbrook, R. S.; Apel, E. C.; Monson, R. K.; Eller, A. S. D.; Greenberg, J. P.; Barth, M. C.; Campuzano-Jost, P.; Palm, B. B.; Jimenez, J. L.; Aiken, A. C.; Dubey, M. K.; Geron, C.; Offenberg, J.; Ryan, M. G.; Fornwalt, P. J.; Pryor, S. C.; Keutsch, F. N.; DiGangi, J. P.; Chan, A. W. H.; Goldstein, A. H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, C. A.; Mauldin, R. L.; Smith, J. N.

    2014-06-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and inter-relationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include - soil property measurements; - hydrological studies; - measurements of high-frequency turbulence parameters; - eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy; - determination of biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry; - aerosol number and mass distributions; - chemical speciation of aerosol particles; - characterization of ice and cloud condensation nuclei; - trace gas measurements; and - model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurements, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these studies are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  12. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008 to 2013

    SciTech Connect

    Ortega, John; Turnipseed, A.; Guenther, Alex B.; Karl, Thomas G.; Day, D. A.; Gochis, David; Huffman, J. A.; Prenni, Anthony J.; Levin, E. J.; Kreidenweis, Sonia M.; DeMott, Paul J.; Tobo, Y.; Patton, E. G.; Hodzic, Alma; Cui, Y. Y.; Harley, P.; Hornbrook, R. S.; Apel, E. C.; Monson, Russell K.; Eller, A. S.; Greenberg, J. P.; Barth, Mary; Campuzano-Jost, Pedro; Palm, B. B.; Jiminez, J. L.; Aiken, A. C.; Dubey, Manvendra K.; Geron, Chris; Offenberg, J.; Ryan, M. G.; Fornwalt, Paula J.; Pryor, S. C.; Keutsch, Frank N.; DiGangi, J. P.; Chan, A. W.; Goldstein, Allen H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, Chris; Mauldin, R. L.; Smith, James N.

    2014-01-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and interrelationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include – soil property measurements; – hydrological studies; – measurements of high-frequency turbulence parameters; – eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy; – determination of biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry; – aerosol number and mass distributions; – chemical speciation of aerosol particles; – characterization of ice and cloud condensation nuclei; – trace gas measurements; and – model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurements, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these studies are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  13. On unstructured grids and solvers

    NASA Technical Reports Server (NTRS)

    Barth, T. J.

    1990-01-01

    The fundamentals and the state-of-the-art technology for unstructured grids and solvers are highlighted. Algorithms and techniques pertinent to mesh generation are discussed. It is shown that grid generation and grid manipulation schemes rely on fast multidimensional searching. Flow solution techniques for the Euler equations, which can be derived from the integral form of the equations are discussed. Sample calculations are also provided.
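
    The "fast multidimensional searching" that such grid generation and manipulation schemes rely on is typically a spatial tree query; a minimal sketch follows (the random point cloud is just a stand-in for real mesh nodes):

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(0)
      nodes = rng.random((100_000, 3))       # mesh node coordinates
      tree = cKDTree(nodes)                  # build once, O(N log N)

      queries = rng.random((5, 3))
      dist, idx = tree.query(queries)        # nearest-node lookup, roughly O(log N) each
      for q, d, i in zip(queries, dist, idx):
          print(f"point {np.round(q, 3)} -> nearest node {i} at distance {d:.4f}")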

  14. Java Parallel Secure Stream for Grid Computing

    SciTech Connect

    Chen, Jie; Akers, Walter; Chen, Ying; Watson, William

    2001-09-01

    The emergence of high speed wide area networks makes grid computing a reality. However grid applications that need reliable data transfer still have difficulties achieving optimal TCP performance due to network tuning of TCP window size to improve the bandwidth and to reduce latency on a high speed wide area network. This paper presents a pure Java package called JPARSS (Java Parallel Secure Stream) that divides data into partitions that are sent over several parallel Java streams simultaneously and allows Java or Web applications to achieve optimal TCP performance in a grid environment without the necessity of tuning the TCP window size. Several experimental results are provided to show that using parallel streams is more effective than tuning TCP window size. In addition an X.509 certificate based single sign-on mechanism and SSL based connection establishment are integrated into this package. Finally a few applications using this package will be discussed.

  15. Smart Grid Enabled EVSE

    SciTech Connect

    None, None

    2014-10-15

    The combined team of GE Global Research, Federal Express, National Renewable Energy Laboratory, and Consolidated Edison has successfully achieved the established goals contained within the Department of Energy’s Smart Grid Capable Electric Vehicle Supply Equipment funding opportunity. The final program product, shown charging two vehicles in Figure 1, reduces by nearly 50% the total installed system cost of the electric vehicle supply equipment (EVSE) as well as enabling a host of new Smart Grid enabled features. These include bi-directional communications, load control, utility message exchange and transaction management information. Using the new charging system, Utilities or energy service providers will now be able to monitor transportation related electrical loads on their distribution networks, send load control commands or preferences to individual systems, and then see measured responses. Installation owners will be able to authorize usage of the stations, monitor operations, and optimally control their electricity consumption. These features and cost reductions have been developed through a total system design solution.

  16. Grid Task Execution

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2007-01-01

    IPG Execution Service is a framework that reliably executes complex jobs on a computational grid, and is part of the IPG service architecture designed to support location-independent computing. The new grid service enables users to describe the platform on which they need a job to run, which allows the service to locate the desired platform, configure it for the required application, and execute the job. After a job is submitted, users can monitor it through periodic notifications, or through queries. Each job consists of a set of tasks that performs actions such as executing applications and managing data. Each task is executed based on a starting condition that is an expression of the states of other tasks. This formulation allows tasks to be executed in parallel, and also allows a user to specify tasks to execute when other tasks succeed, fail, or are canceled. The two core components of the Execution Service are the Task Database, which stores tasks that have been submitted for execution, and the Task Manager, which executes tasks in the proper order, based on the user-specified starting conditions, and avoids overloading local and remote resources while executing tasks.
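
    A stripped-down sketch of that scheduling idea (not the IPG Execution Service code) is to store, for each task, a starting condition evaluated against the states of the other tasks and to keep sweeping until nothing else becomes eligible:

      SUCCEEDED, FAILED, PENDING = "succeeded", "failed", "pending"

      def run(tasks):
          """tasks: name -> (condition(states) -> bool, action() -> True on success)."""
          states = {name: PENDING for name in tasks}
          progress = True
          while progress:
              progress = False
              for name, (condition, action) in tasks.items():
                  if states[name] == PENDING and condition(states):
                      states[name] = SUCCEEDED if action() else FAILED
                      progress = True
          return states

      tasks = {
          "fetch":   (lambda s: True,                      lambda: True),
          "compute": (lambda s: s["fetch"] == SUCCEEDED,   lambda: False),  # simulate a failure
          "publish": (lambda s: s["compute"] == SUCCEEDED, lambda: True),
          "cleanup": (lambda s: s["compute"] in (SUCCEEDED, FAILED), lambda: True),
      }
      print(run(tasks))   # publish never becomes eligible; cleanup runs on failure

    The conditions here play the role of the user-specified starting expressions: independent tasks run as soon as they are eligible, and recovery tasks can be keyed to the success or failure of earlier ones.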

  17. Colorado Electrical Transmission Grid

    SciTech Connect

    Zehner, Richard E.

    2012-02-01

    Citation Information: Originator: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder Originator: Xcel Energy Publication Date: 2012 Title: Colorado XcelEnergy NonXcel Transmission Network Edition: First Publication Information: Publication Place: Earth Science & Observation Center, Cooperative Institute for Research in Environmental Science (CIRES), University of Colorado, Boulder Publisher: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder Description: This layer contains transmission network of Colorado Spatial Domain: Extent: Top: 4540689.017558 m Left: 160606.141934 m Right: 758715.946645 m Bottom: 4098910.893397 m Contact Information: Contact Organization: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder Contact Person: Khalid Hussein Address: CIRES, Ekeley Building Earth Science & Observation Center (ESOC) 216 UCB City: Boulder State: CO Postal Code: 80309-0216 Country: USA Contact Telephone: 303-492-6782 Spatial Reference Information: Coordinate System: Universal Transverse Mercator (UTM) WGS 1984 Zone 13N False Easting: 500000.00000000 False Northing: 0.00000000 Central Meridian: -105.00000000 Scale Factor: 0.99960000 Latitude of Origin: 0.00000000 Linear Unit: Meter Datum: World Geodetic System 1984 (WGS 1984) Prime Meridian: Greenwich Angular Unit: Degree Digital Form: Format Name: Shapefile

  18. A Direct Experimental Evidence For the New Thermodynamic Boundary in the Supercritical State: Implications for Earth and Planetary Sciences.

    NASA Astrophysics Data System (ADS)

    Bolmatov, D.

    2015-12-01

    While scientists have a good theoretical understanding of the heat capacity of both solids and gases, a general theory of the heat capacity of liquids has always remained elusive. Apart from being an awkward hole in our knowledge, heat capacity - the amount of heat needed to change a substance's temperature by a certain amount - is a relevant quantity that it would be nice to be able to predict. I will introduce a phonon-based approach to liquids and supercritical fluids to describe their thermodynamics in terms of sound propagation. I will show that the internal liquid energy has transverse sound propagation gaps and explain their evolution with temperature variations on the P-T diagram. I will explain how this theoretical framework covers the Debye theory of solids, the phonon theory of liquids, and thermodynamic limits such as the Dulong-Petit and the ideal gas thermodynamic limits. As a result, the experimental evidence for the new thermodynamic boundary in the supercritical state (the Frenkel line) on the P-T phase diagram will be demonstrated. Then, I will report on inelastic X-ray scattering experiments combined with molecular dynamics simulations on deeply supercritical Ar. The presented results unveil the mechanism and regimes of sound propagation in liquid matter and provide compelling evidence for the adiabatic-to-isothermal longitudinal sound propagation transition. As a result, a universal link will be demonstrated between the positive sound dispersion (PSD) phenomenon and the origin of transverse sound propagation revealing the viscous-to-elastic crossover in compressed liquids. Both can be considered as a universal fingerprint of the dynamic response of a liquid. They can then be used for signal detection and analysis of the dynamic response in deep water and other fluids, which is relevant for describing the thermodynamics of gas giants. The consequences of this finding will be discussed, including a physically justified way to demarcate the

  19. Density separation of solids in ferrofluids with magnetic grids

    SciTech Connect

    Fay, H.; Quets, J.M.

    1980-04-01

    Nonmagnetic solids in a superparamagnetic ferrofluid are subjected to body forces proportional to the intensity of magnetization of the fluid and the gradient of the magnetic field. An apparent density of the fluid can be defined from the force equations, and since the apparent density can be much larger than the true density, it is possible to levitate or float dense objects. Mixtures of solids with a density greater than the apparent density sink while lower density solids float. In practice it is difficult to create a uniform gradient over a large volume and single gap magnetic separators require very large magnets or have a limited throughput. To overcome that problem, multiple gap magnetic grids have been designed. Such grids consist of planar arrays of parallel bars of alternating polarity, driven by permanent magnets. When immersed in ferrofluid, magnetic grids create nonuniform field gradients and apparent densities in the fluid. However, both analysis and experimental measurements show that the grid acts as a barrier to particles below a critical density, while permitting more dense particles to fall through the grid. Thus, a magnetic grid filter can be used as a high throughput binary separator of solids according to their densities. Such filters can be cascaded for more complex separations. Several magnetic grid filters have been designed, built, and tested. Magnetic measurements qualitatively agree with the theoretical predictions. Experiments with synthetic mixtures have demonstrated that good binary separations can be made.
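
    A minimal version of the underlying force balance, with illustrative numbers rather than the authors' experimental values, shows how the "apparent density" sorts particles at the grid:

      # A nonmagnetic particle in a magnetized ferrofluid sees the fluid density
      # raised by the magnetic body force mu0 * M * dH/dz acting like extra gravity.
      MU0 = 4e-7 * 3.141592653589793          # vacuum permeability, T*m/A

      def apparent_density(rho_fluid, magnetization, field_gradient, g=9.81):
          """rho_fluid [kg/m^3], magnetization M [A/m], vertical gradient dH/dz [A/m^2]."""
          return rho_fluid + MU0 * magnetization * field_gradient / g

      rho_app = apparent_density(rho_fluid=1200.0,       # ferrofluid density
                                 magnetization=3.0e4,    # moderate ferrofluid M (assumed)
                                 field_gradient=1.5e6)   # gradient near the grid bars (assumed)
      for name, rho in [("quartz", 2650.0), ("copper", 8960.0)]:
          verdict = "held above the grid" if rho < rho_app else "falls through the grid"
          print(f"apparent density {rho_app:,.0f} kg/m^3: {name} ({rho:,.0f} kg/m^3) {verdict}")

    With these assumed values the apparent density lands between the two solids, which is the binary separation the grid filter exploits.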

  20. Topology and grid adaption for high-speed flow computations

    NASA Technical Reports Server (NTRS)

    Abolhassani, Jamshid S.; Tiwari, Surendra N.

    1989-01-01

    This study investigates the effects of grid topology and grid adaptation on numerical solutions of the Navier-Stokes equations. In the first part of this study, a general procedure is presented for computation of high-speed flow over complex three-dimensional configurations. The flow field is simulated on the surface of a Butler wing in a uniform stream. Results are presented for Mach number 3.5 and a Reynolds number of 2,000,000. The O-type and H-type grids have been used for this study, and the results are compared together and with other theoretical and experimental results. The results demonstrate that while the H-type grid is suitable for the leading and trailing edges, a more accurate solution can be obtained for the middle part of the wing with an O-type grid. In the second part of this study, methods of grid adaption are reviewed and a method is developed with the capability of adapting to several variables. This method is based on a variational approach and is an algebraic method. Also, the method has been formulated in such a way that there is no need for any matrix inversion. This method is used in conjunction with the calculation of hypersonic flow over a blunt-nose body. A movie has been produced which shows simultaneously the transient behavior of the solution and the grid adaption.

  1. Grid Generation Techniques Utilizing the Volume Grid Manipulator

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1998-01-01

    This paper presents grid generation techniques available in the Volume Grid Manipulation (VGM) code. The VGM code is designed to manipulate existing line, surface and volume grids to improve the quality of the data. It embodies an easy to read rich language of commands that enables such alterations as topology changes, grid adaption and smoothing. Additionally, the VGM code can be used to construct simplified straight lines, splines, and conic sections which are common curves used in the generation and manipulation of points, lines, surfaces and volumes (i.e., grid data). These simple geometric curves are essential in the construction of domain discretizations for computational fluid dynamic simulations. By comparison to previously established methods of generating these curves interactively, the VGM code provides control of slope continuity and grid point-to-point stretchings as well as quick changes in the controlling parameters. The VGM code offers the capability to couple the generation of these geometries with an extensive manipulation methodology in a scripting language. The scripting language allows parametric studies of a vehicle geometry to be efficiently performed to evaluate favorable trends in the design process. As examples of the powerful capabilities of the VGM code, a wake flow field domain will be appended to an existing X33 Venturestar volume grid; negative volumes resulting from grid expansions to enable flow field capture on a simple geometry, will be corrected; and geometrical changes to a vehicle component of the X33 Venturestar will be shown.

  2. From the grid to the smart grid, topologically

    NASA Astrophysics Data System (ADS)

    Pagani, Giuliano Andrea; Aiello, Marco

    2016-05-01

    In its more visionary acceptation, the smart grid is a model of energy management in which the users are engaged in producing energy as well as consuming it, while having information systems fully aware of the energy demand-response of the network and of dynamically varying prices. A natural question is then: to make the smart grid a reality will the distribution grid have to be upgraded? We assume a positive answer to the question and we consider the lower layers of medium and low voltage to be the most affected by the change. In our previous work, we analyzed samples of the Dutch distribution grid (Pagani and Aiello, 2011) and we considered possible evolutions of these using synthetic topologies modeled after studies of complex systems in other technological domains (Pagani and Aiello, 2014). In this paper, we take an extra important step by defining a methodology for evolving any existing physical power grid to a good smart grid model, thus laying the foundations for a decision support system for utilities and governmental organizations. In doing so, we consider several possible evolution strategies and apply them to the Dutch distribution grid. We show how increasing connectivity is beneficial in realizing more efficient and reliable networks. Our proposal is topological in nature, enhanced with economic considerations of the costs of such evolutions in terms of cabling expenses and economic benefits of evolving the grid.

  3. The impact of the topology on cascading failures in a power grid model

    NASA Astrophysics Data System (ADS)

    Koç, Yakup; Warnier, Martijn; Mieghem, Piet Van; Kooij, Robert E.; Brazier, Frances M. T.

    2014-05-01

    Cascading failures are one of the main reasons for large scale blackouts in power transmission grids. Secure electrical power supply requires, together with careful operation, a robust design of the electrical power grid topology. Currently, the impact of the topology on grid robustness is mainly assessed by purely topological approaches, that fail to capture the essence of electric power flow. This paper proposes a metric, the effective graph resistance, to relate the topology of a power grid to its robustness against cascading failures by deliberate attacks, while also taking the fundamental characteristics of the electric power grid into account such as power flow allocation according to Kirchhoff laws. Experimental verification on synthetic power systems shows that the proposed metric reflects the grid robustness accurately. The proposed metric is used to optimize a grid topology for a higher level of robustness. To demonstrate its applicability, the metric is applied on the IEEE 118 bus power system to improve its robustness against cascading failures.
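
    One standard formulation of the metric is compact: for a connected graph with Laplacian eigenvalues mu_i, the effective graph resistance is N times the sum of the reciprocals of the non-zero eigenvalues, and lower values indicate more, better-balanced paths. A small sketch on toy graphs (not the IEEE 118-bus system):

      import networkx as nx
      import numpy as np

      def effective_graph_resistance(G):
          n = G.number_of_nodes()
          mu = np.linalg.eigvalsh(nx.laplacian_matrix(G).toarray().astype(float))
          return n * np.sum(1.0 / mu[mu > 1e-9])   # drop the zero eigenvalue

      ring = nx.cycle_graph(20)
      reinforced = ring.copy()
      reinforced.add_edges_from([(0, 10), (5, 15)])   # add two long-range "lines"

      for name, G in [("ring", ring), ("ring + 2 chords", reinforced)]:
          print(f"{name}: R = {effective_graph_resistance(G):.1f}")

    Adding the two chords lowers the effective resistance, mirroring the paper's use of the metric to pick topology changes that raise robustness against cascades.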

  4. NAS Grid Benchmarks: A Tool for Grid Space Exploration

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; VanderWijngaart, Rob F.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We present an approach for benchmarking services provided by computational Grids. It is based on the NAS Parallel Benchmarks (NPB) and is called NAS Grid Benchmark (NGB) in this paper. We present NGB as a data flow graph encapsulating an instance of an NPB code in each graph node, which communicates with other nodes by sending/receiving initialization data. These nodes may be mapped to the same or different Grid machines. Like NPB, NGB will specify several different classes (problem sizes). NGB also specifies the generic Grid services sufficient for running the benchmark. The implementor has the freedom to choose any specific Grid environment. However, we describe a reference implementation in Java, and present some scenarios for using NGB.

  5. Grid Erosion Modeling of the NEXT Ion Thruster Optics

    NASA Technical Reports Server (NTRS)

    Ernhoff, Jerold W.; Boyd, Iain D.; Soulas, George (Technical Monitor)

    2003-01-01

    Results from several different computational studies of the NEXT ion thruster optics are presented. A study of the effect of beam voltage on accelerator grid aperture wall erosion shows a non-monotonic, complex behavior. Comparison to experimental performance data indicates improvements in simulation of the accelerator grid current, as well as very good agreement with other quantities. Also examined is the effect of ion optics choice on the thruster life, showing that TAG optics provide better margin against electron backstreaming than NSTAR optics. The model is used to predict the change in performance with increasing accelerator grid voltage, showing that although the current collected on the accel grid downstream face increases, the erosion rate decreases. A study is presented for varying doubly-ionized Xenon current fraction. The results show that performance data is not extremely sensitive to the current fraction.

  6. TIGER: Turbomachinery interactive grid generation

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.; Shih, Ming-Hsin; Janus, J. Mark

    1992-01-01

    A three dimensional, interactive grid generation code, TIGER, is being developed for analysis of flows around ducted or unducted propellers. TIGER is a customized grid generator that combines new technology with methods from general grid generation codes. The code generates multiple block, structured grids around multiple blade rows with a hub and shroud for either C grid or H grid topologies. The code is intended for use with a Euler/Navier-Stokes solver also being developed, but is general enough for use with other flow solvers. TIGER features a silicon graphics interactive graphics environment that displays a pop-up window, graphics window, and text window. The geometry is read as a discrete set of points with options for several industrial standard formats and NASA standard formats. Various splines are available for defining the surface geometries. Grid generation is done either interactively or through a batch mode operation using history files from a previously generated grid. The batch mode operation can be done either with a graphical display of the interactive session or with no graphics so that the code can be run on another computer system. Run time can be significantly reduced by running on a Cray-YMP.

  7. Structured and unstructured grid generation.

    PubMed

    Thompson, J F; Weatherill, N P

    1992-01-01

    Current techniques in composite-block-structured grid generation and unstructured grid generation for general 3D geometries are summarized, including both algebraic and elliptic generation procedures for the former and Delaunay tessellations for the latter. Citations of relevant theory are given. Examples of applications for several geometries are included. PMID:1424687
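
    A minimal example of the Delaunay tessellation side of this, using scipy's wrapper around Qhull (the random points are only a stand-in for real boundary and field points):

      import numpy as np
      from scipy.spatial import Delaunay

      rng = np.random.default_rng(1)
      points = rng.random((200, 2))              # 2-D point cloud
      tri = Delaunay(points)

      print("triangles:", len(tri.simplices))
      print("first triangle (point indices):", tri.simplices[0])
      # find_simplex locates the triangle containing an arbitrary query point
      print("query point lands in simplex:", tri.find_simplex([[0.5, 0.5]])[0])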

  8. Intelligent automated surface grid generation

    NASA Technical Reports Server (NTRS)

    Yao, Ke-Thia; Gelsey, Andrew

    1995-01-01

    The goal of our research is to produce a flexible, general grid generator for automated use by other programs, such as numerical optimizers. The current trend in the gridding field is toward interactive gridding. Interactive gridding more readily taps into the spatial reasoning abilities of the human user through the use of a graphical interface with a mouse. However, a sometimes fruitful approach to generating new designs is to apply an optimizer with shape modification operators to improve an initial design. In order for this approach to be useful, the optimizer must be able to automatically grid and evaluate the candidate designs. This paper describes an intelligent gridder that is capable of analyzing the topology of the spatial domain and predicting approximate physical behaviors based on the geometry of the spatial domain to automatically generate grids for computational fluid dynamics simulators. Typically gridding programs are given a partitioning of the spatial domain to assist the gridder. Our gridder is capable of performing this partitioning. This enables the gridder to automatically grid spatial domains of a wide range of configurations.

  9. Grid generation using classical techniques

    NASA Technical Reports Server (NTRS)

    Moretti, G.

    1980-01-01

    A brief historical review of conformal mapping and its applications to problems in fluid mechanics and electromagnetism is presented. The use of conformal mapping as a grid generator is described. The philosophy of the 'closed form' approach and its application to a Neumann problem is discussed. Karman-Trefftz mappings and grids for ablated, three dimensional bodies are also discussed.
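
    The grid-generation use of conformal mapping is easy to demonstrate with the simpler Joukowski map z = zeta + c^2/zeta (the Karman-Trefftz map mentioned above generalizes it); the circle offset and scale below are illustrative choices, not values from the report:

      import numpy as np

      c = 1.0
      center = -0.08 + 0.08j                  # offset gives a cambered, airfoil-like image
      radius = abs(c - center)                # circle passes through zeta = c (trailing edge)

      theta = np.linspace(0.0, 2.0 * np.pi, 121)        # "around the body" direction
      r = radius * np.linspace(1.0, 3.0, 25)[:, None]   # "away from the body" direction
      zeta = center + r * np.exp(1j * theta)            # uniform polar grid in the circle plane
      z = zeta + c**2 / zeta                            # mapped, body-fitted O-grid

      x, y = z.real, z.imag
      print("grid shape:", x.shape)                     # (25, 121) structured grid
      print(f"chord extent: {x.min():.3f} to {x.max():.3f}")

    Because the map is conformal, the orthogonality of the polar grid carries over to the physical grid away from critical points, which is much of the appeal of the 'closed form' approach.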

  10. On Multigrid for Overlapping Grids

    SciTech Connect

    Henshaw, W

    2004-01-13

    The solution of elliptic partial differential equations on composite overlapping grids using multigrid is discussed. An approach is described that provides a fast and memory efficient scheme for the solution of boundary value problems in complex geometries. The key aspects of the new scheme are an automatic coarse grid generation algorithm, an adaptive smoothing technique for adjusting residuals on different component grids, and the use of local smoothing near interpolation boundaries. Other important features include optimizations for Cartesian component grids, the use of over-relaxed Red-Black smoothers and the generation of coarse grid operators through Galerkin averaging. Numerical results in two and three dimensions show that very good multigrid convergence rates can be obtained for both Dirichlet and Neumann/mixed boundary conditions. A comparison to Krylov based solvers shows that the multigrid solver can be much faster and require significantly less memory.
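
    The ingredients named above (smoothing, coarse-grid correction, interpolation of corrections) can be shown on the simplest possible case; this is a textbook 1-D Poisson V-cycle, not the composite-overlapping-grid solver itself:

      import numpy as np

      def apply_A(u, h):                        # -u'' with zero Dirichlet boundaries
          Au = 2.0 * u
          Au[1:] -= u[:-1]
          Au[:-1] -= u[1:]
          return Au / h**2

      def smooth(u, f, h, sweeps=3, omega=2.0 / 3.0):   # weighted Jacobi
          for _ in range(sweeps):
              u = u + omega * (h**2 / 2.0) * (f - apply_A(u, h))
          return u

      def restrict(r):                          # full weighting to the coarse grid
          return 0.25 * r[0:-2:2] + 0.5 * r[1:-1:2] + 0.25 * r[2::2]

      def prolong(ec, n_fine):                  # linear interpolation of the correction
          e = np.zeros(n_fine)
          e[1::2] = ec
          e[2:-1:2] = 0.5 * (ec[:-1] + ec[1:])
          e[0], e[-1] = 0.5 * ec[0], 0.5 * ec[-1]
          return e

      def v_cycle(u, f, h):
          n = len(u)
          if n <= 3:
              return smooth(u, f, h, sweeps=50)    # "solve" the coarsest level
          u = smooth(u, f, h)
          ec = v_cycle(np.zeros((n - 1) // 2), restrict(f - apply_A(u, h)), 2.0 * h)
          return smooth(u + prolong(ec, n), f, h)

      n = 2**7 - 1
      h = 1.0 / (n + 1)
      x = np.linspace(h, 1.0 - h, n)
      f = np.pi**2 * np.sin(np.pi * x)          # exact solution u = sin(pi*x)
      u = np.zeros(n)
      for k in range(8):
          u = v_cycle(u, f, h)
          print(f"cycle {k + 1}: residual norm = {np.linalg.norm(f - apply_A(u, h)):.2e}")

    The overlapping-grid scheme described in the report layers on top of this: automatic coarse-grid generation for each component grid, smoothing and interpolation near the overlap boundaries, and coarse operators built by Galerkin averaging.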

  11. Theory of the dynamic response of a coplanar grid semiconductor detector

    SciTech Connect

    Kozorezov, A. G.; Wigmore, J. K.; Owens, A.; Peacock, A.

    2007-07-09

    The authors have developed a theoretical model for the response of a coplanar grid semiconductor detector to hard x- and {gamma}-ray radiation. Carrier drift trajectories were obtained by solving the coupled dynamical equations for carriers driven by electrostatic fields of the coplanar grid configuration. The pulse spectra calculated by summing the individual contributions for all carriers are compared to experimental results for a large volume optimized cadmium zinc telluride coplanar grid detector and good agreement is obtained.

  12. A business model for the establishment of the European grid infrastructure

    NASA Astrophysics Data System (ADS)

    Candiello, A.; Cresti, D.; Ferrari, T.; Karagiannis, F.; Kranzlmueller, D.; Louridas, P.; Mazzucato, M.; Matyska, L.; Perini, L.; Schauerhammer, K.; Ullmann, K.; Wilson, M.

    2010-04-01

    An international grid has been built in Europe during the past years in the framework of various EC-funded projects to support the growth of e-Science. After several years of work spent to increase the scale of the infrastructure, to expand the user community and improve the availability of the services delivered, effort is now concentrating on the creation of a new organizational model, capable of fulfilling the vision of a sustainable European grid infrastructure. The European Grid Initiative (EGI) is the proposed framework to seamlessly link at a global level the European national grid e-Infrastructures operated by the National Grid Initiatives and European International Research Organizations, and based on a European Unified Middleware Distribution, which will be the result of a joint effort of various European grid Middleware Consortia. This paper describes the requirements that EGI addresses, the actors contributing to its foundation, the offering and the organizational structure that constitute the EGI business model.

  13. Framework for Interactive Parallel Dataset Analysis on the Grid

    SciTech Connect

    Alexander, David A.; Ananthan, Balamurali; Johnson, Tony; Serbo, Victor; /SLAC

    2007-01-10

    We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back to the client, and to construct professional-quality visualizations of the results.
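
    A stripped-down, local analogue of that pattern (distribute an analysis function over dataset partitions, then merge the partial results) can be written with the standard library; the real framework does this across Grid worker nodes via Web Services, which is not reproduced here:

      from concurrent.futures import ProcessPoolExecutor
      import random

      def analyze(partition):
          """User-supplied analysis code run on one partition: a toy histogram."""
          hist = [0] * 10
          for value in partition:
              hist[min(int(value * 10), 9)] += 1
          return hist

      def merge(results):
          return [sum(bins) for bins in zip(*results)]

      if __name__ == "__main__":
          random.seed(42)
          dataset = [random.random() for _ in range(1_000_000)]
          partitions = [dataset[i::8] for i in range(8)]     # 8 "worker" partitions

          with ProcessPoolExecutor(max_workers=8) as pool:
              partial = list(pool.map(analyze, partitions))

          print("merged histogram:", merge(partial))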

  14. Tools and Techniques for Measuring and Improving Grid Performance

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.

  15. Aeroacoustic Simulation of Nose Landing Gear on Adaptive Unstructured Grids With FUN3D

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Khorrami, Mehdi R.; Park, Michael A.; Lockhard, David P.

    2013-01-01

    Numerical simulations have been performed for a partially-dressed, cavity-closed nose landing gear configuration that was tested in NASA Langley's closed-wall Basic Aerodynamic Research Tunnel (BART) and in the University of Florida's open-jet acoustic facility known as the UFAFF. The unstructured-grid flow solver FUN3D, developed at NASA Langley Research Center, is used to compute the unsteady flow field for this configuration. Starting with a coarse grid, a series of successively finer grids were generated using the adaptive gridding methodology available in the FUN3D code. A hybrid Reynolds-averaged Navier-Stokes/large eddy simulation (RANS/LES) turbulence model is used for these computations. Time-averaged and instantaneous solutions obtained on these grids are compared with the measured data. In general, the correlation with the experimental data improves with grid refinement. A similar trend is observed for sound pressure levels obtained by using these CFD solutions as input to a Ffowcs Williams-Hawkings noise propagation code to compute the farfield noise levels. In general, the numerical solutions obtained on adapted grids compare well with the hand-tuned enriched fine grid solutions and experimental data. In addition, the grid adaptation strategy discussed here simplifies the grid generation process and results in improved computational efficiency of CFD simulations.

  16. Optimizing solar-cell grid geometry

    NASA Technical Reports Server (NTRS)

    Crossley, A. P.

    1969-01-01

    Trade-off analysis and mathematical expressions calculate optimum grid geometry in terms of various cell parameters. Determination of the grid geometry provides proper balance between grid resistance and cell output to optimize the energy conversion process.

  17. Single grid accelerator for an ion thrustor

    NASA Technical Reports Server (NTRS)

    Margosian, P. M.; Nakanishi, S. (Inventor)

    1973-01-01

    A single grid accelerator system for an ion thrustor is discussed. A layer of dielectric material is interposed between the metal accelerator grid and the chamber containing an ionized propellant to protect the grid against sputtering erosion.

  18. Hybrid Scheduling Model for Independent Grid Tasks

    PubMed Central

    Shanthini, J.; Kalaikumaran, T.; Karthik, S.

    2015-01-01

    Grid computing facilitates resource sharing across geographically distributed administrative domains. Scheduling in a distributed heterogeneous environment is intrinsically hard because of the heterogeneous nature of the resource collection. Makespan and tardiness are two different scheduling measures, and much previous research has concentrated on reducing makespan, which reflects machine utilization. In this paper, we propose a hybrid scheduling algorithm for scheduling independent grid tasks with the objective of reducing the total weighted tardiness of the tasks. Tardiness measures due-date performance, which has a direct impact on the cost of executing the jobs. We propose the BG_ATC algorithm, a combination of best gap (BG) search and the Apparent Tardiness Cost (ATC) indexing rule, and implement the two algorithms in two different phases of the scheduling process. A comparison with various benchmark algorithms shows that our algorithm outperforms them. PMID:26543897
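
    The abstract does not spell out the ATC rule, so the sketch below uses the standard Apparent Tardiness Cost index from the scheduling literature, I_j(t) = (w_j / p_j) * exp(-max(d_j - p_j - t, 0) / (k * p_mean)), applied as a greedy dispatching rule on a single resource. The job data and the look-ahead parameter k are invented, and the paper's BG_ATC combination with best-gap search is not reproduced.

```python
# Illustrative sketch of the standard Apparent Tardiness Cost (ATC) dispatching
# index (the paper's exact BG_ATC variant may differ): at time t, always run the
# waiting task with the largest index.
import math

def atc_index(task, t, k, p_mean):
    """task: dict with weight w, processing time p, due date d."""
    slack = max(task["d"] - task["p"] - t, 0.0)
    return (task["w"] / task["p"]) * math.exp(-slack / (k * p_mean))

def atc_dispatch(tasks, k=2.0):
    """Sequence tasks on a single resource by repeatedly taking the max-index task."""
    t, order = 0.0, []
    remaining = list(tasks)
    while remaining:
        p_mean = sum(x["p"] for x in remaining) / len(remaining)
        best = max(remaining, key=lambda x: atc_index(x, t, k, p_mean))
        remaining.remove(best)
        t += best["p"]
        order.append((best["name"], t, best["w"] * max(t - best["d"], 0.0)))
    return order   # (task, completion time, weighted tardiness)

jobs = [{"name": "j1", "p": 4, "d": 6, "w": 2},
        {"name": "j2", "p": 2, "d": 5, "w": 1},
        {"name": "j3", "p": 3, "d": 12, "w": 3}]
print(atc_dispatch(jobs))
```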

  19. Hybrid Scheduling Model for Independent Grid Tasks.

    PubMed

    Shanthini, J; Kalaikumaran, T; Karthik, S

    2015-01-01

    Grid computing facilitates resource sharing across geographically distributed administrative domains. Scheduling in a distributed heterogeneous environment is intrinsically hard because of the heterogeneous nature of the resource collection. Makespan and tardiness are two different scheduling measures, and much previous research has concentrated on reducing makespan, which reflects machine utilization. In this paper, we propose a hybrid scheduling algorithm for scheduling independent grid tasks with the objective of reducing the total weighted tardiness of the tasks. Tardiness measures due-date performance, which has a direct impact on the cost of executing the jobs. We propose the BG_ATC algorithm, a combination of best gap (BG) search and the Apparent Tardiness Cost (ATC) indexing rule, and implement the two algorithms in two different phases of the scheduling process. A comparison with various benchmark algorithms shows that our algorithm outperforms them. PMID:26543897

  20. Hybrid Scheduling Model for Independent Grid Tasks.

    PubMed

    Shanthini, J; Kalaikumaran, T; Karthik, S

    2015-01-01

    Grid computing facilitates resource sharing across geographically distributed administrative domains. Scheduling in a distributed heterogeneous environment is intrinsically hard because of the heterogeneous nature of the resource collection. Makespan and tardiness are two different scheduling measures, and much previous research has concentrated on reducing makespan, which reflects machine utilization. In this paper, we propose a hybrid scheduling algorithm for scheduling independent grid tasks with the objective of reducing the total weighted tardiness of the tasks. Tardiness measures due-date performance, which has a direct impact on the cost of executing the jobs. We propose the BG_ATC algorithm, a combination of best gap (BG) search and the Apparent Tardiness Cost (ATC) indexing rule, and implement the two algorithms in two different phases of the scheduling process. A comparison with various benchmark algorithms shows that our algorithm outperforms them.

  1. Grid-enabled Web Services for Geospatial Interoperability

    NASA Astrophysics Data System (ADS)

    Chen, A.; di, L.; Bai, Y.; Wei, Y.

    2006-05-01

    Geospatial interoperability technology makes better and easier use of the huge volume of distributed heterogeneous geospatial data and services in Earth-related science research and applications. The Open Geospatial Consortium (OGC) has been developing interoperable Web service specifications, such as the Web Coverage Service (WCS), Web Map Service (WMS), Web Feature Service (WFS) and Catalog Service for Web (CSW), to promote geospatial interoperability in the distributed environment. These specifications are widely used by the geospatial community for sharing data and services. Due to the complex nature of Earth-related science research and applications, a geoprocessing task is normally composed of many interrelated computation steps in the web service environment, and mechanisms for cooperation and security between geospatial web services are needed. Grid, as a promising e-science infrastructure, promotes and facilitates the secure interoperation and collaboration of distributed heterogeneous resources. In this paper, we discuss the technology for enabling OGC-based geospatial interoperability in a Globus-based Grid environment. First, a new Grid-enabled catalogue services model for secure registry, discovery and access of geospatial data and services was developed. The model not only combines the information schemas of the Grid Metadata Catalog Service (MCS)/Replica Location Service (RLS) and the OGC Catalog Service for Web (CSW), but also exploits geospatial metadata standards including ISO 19115, ISO 19115-2, the FGDC Content Standard for Geospatial Metadata, and NASA ECS Metadata. Based on the model, the Grid-enabled CSW (GCSW) service is developed. The service preserves the OGC CSW interface while providing naming and location transparency by mapping the Grid MCS/RLS information model to the OGC CSW information model. Moreover, the OGC CSW model is extended to accommodate more than 40 mandatory metadata elements needed for describing the properties

  2. SimpleVisGrid: Grid Services for Visualization of Diverse Biomedical Knowledge and Molecular Systems Data

    PubMed Central

    Stokes, Todd H.; Wang, May D.

    2016-01-01

    Biomedical data visualization is a great challenge due to the scale, complexity, and diversity of systems, system component interactions and experimental data. Standards for interoperable data are a good start to addressing these problems, but standardization of visualization technologies is an emerging topic. SimpleVisGrid builds on Cancer Biomedical Informatics Grid (caBIG) common infrastructure for cancer research, and clearly specifies and extends three standard data formats for inputs and outputs to grid services: comma-separated values (CSV), Portable Network Graphics (PNG), and Scalable Vector Graphics (SVG). Four prototype visualizations are available: 2D array data quality visualization, correlation heatmaps between high-dimensional data and associated meta-data, feature landscapes, and biochemical or semantic network graphs. The services and data model are prepared for submission for caBIG Silver-level compatibility review and for integration into automated research workflows. Making these tools available to caBIG developers and ultimately to biomedical researchers can (1) help with biomedical communication, discovery, and decision-making, (2) encourage more research on standardization of visualization formats, and (3) improve the efficiency of large data transfers across the grid. PMID:19964624

  3. Implicit large eddy simulation of a scalar mixing layer in fractal grid turbulence

    NASA Astrophysics Data System (ADS)

    Watanabe, Tomoaki; Sakai, Yasuhiko; Nagata, Kouji; Ito, Yasumasa; Hayase, Toshiyuki

    2016-07-01

    A scalar mixing layer in fractal grid turbulence is simulated by the implicit large eddy simulation (ILES) using low-pass filtering as an implicit subgrid-scale model. The square-type fractal grid with three fractal iterations is used for generating turbulence. The streamwise evolutions of the streamwise velocity statistics obtained in the ILES are in good agreement with the experimental results. The ILES results are used for investigating the development of the scalar mixing layer behind the fractal grid. The results show that the vertical development of the scalar mixing layer strongly depends on the spanwise location. Near the fractal grid, the scalar mixing layer rapidly develops just behind the largest grid bars owing to the vertical turbulent transport. The scalar mixing layer near the fractal grid also develops outside the largest grid bars because the scalar is transported between the outside and back of the largest grid bars by the spanwise turbulent transport. In the downstream region, the scalar mixing layer develops more rapidly near the grid centerline by the vertical turbulent transport and by the spanwise one, which transports the scalar between the back of the largest grid bars and both the centerline and outer edge of the fractal grid. Then, the mean scalar profile becomes nearly homogeneous in the spanwise direction.

  4. Grid Integration Studies: Data Requirements, Greening the Grid

    SciTech Connect

    Katz, Jessica

    2015-06-01

    A grid integration study is an analytical framework used to evaluate a power system with high penetration levels of variable renewable energy (VRE). A grid integration study simulates the operation of the power system under different VRE scenarios, identifying reliability constraints and evaluating the cost of actions to alleviate those constraints. These VRE scenarios establish where, how much, and over what timeframe to build generation and transmission capacity, ideally capturing the spatial diversity benefits of wind and solar resources. The results help build confidence among policymakers, system operators, and investors to move forward with plans to increase the amount of VRE on the grid.

  5. Random grid fern for visual tracking

    NASA Astrophysics Data System (ADS)

    Cheng, Fei; Liu, Kai; Zhang, Jin; Li, YunSong

    2014-05-01

    Visual tracking is one of the significant research directions in computer vision. Although the standard random-ferns tracking method performs well with a random spatial arrangement of binary tests, the effect of image locality on the ferns' descriptive ability is ignored, which prevents them from describing the object accurately and robustly. This paper proposes a novel spatial arrangement of binary tests that divides the bounding box into grid cells in order to keep more details of the image for visual tracking. Experimental results show that this method improves tracking accuracy effectively.
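
    The grid-structured arrangement of binary tests can be sketched as follows: the bounding box is divided into cells, each cell contributes one or more intensity-comparison tests, and the test outcomes are packed into a fern index. The cell count, test count and random patch below are illustrative, not the paper's configuration.

```python
# Illustrative sketch (not the paper's implementation): a fern whose binary
# intensity tests are arranged on a grid over the bounding box, so every cell
# contributes local detail to the fern's index.
import numpy as np

rng = np.random.default_rng(0)

def make_grid_fern(box_w, box_h, grid=4, tests_per_cell=1):
    """Pre-draw pixel-pair tests, one or more per grid cell of the bounding box."""
    cw, ch = box_w // grid, box_h // grid
    pairs = []
    for gy in range(grid):
        for gx in range(grid):
            for _ in range(tests_per_cell):
                x1, y1 = gx * cw + rng.integers(cw), gy * ch + rng.integers(ch)
                x2, y2 = gx * cw + rng.integers(cw), gy * ch + rng.integers(ch)
                pairs.append(((x1, y1), (x2, y2)))
    return pairs

def fern_code(patch, pairs):
    """Evaluate the binary tests and pack the outcomes into one integer code."""
    code = 0
    for (x1, y1), (x2, y2) in pairs:
        code = (code << 1) | int(patch[y1, x1] < patch[y2, x2])
    return code

patch = rng.integers(0, 256, size=(32, 32))     # stand-in for a tracked image patch
pairs = make_grid_fern(32, 32, grid=4)
print(fern_code(patch, pairs))
```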

  6. Celebrating the 65th anniversary of the Russian Federal Nuclear Center — All-Russian Research Institute of Experimental Physics (Scientific session of the Physical Sciences Division of the Russian Academy of Sciences, 6 October 2010)

    NASA Astrophysics Data System (ADS)

    2011-04-01

    A scientific session of the Physical Sciences Division of the Russian Academy of Sciences (RAS) took place on 6 October 2010 in the Conference Hall of the Lebedev Physical Institute, RAS (FIAN) on the occasion of the 65th anniversary of the founding of the Russian Federal Nuclear Center — All-Russian Research Institute of Experimental Physics (RFNC-VNIIEF). The agenda of the session announced on the website www.gpad.ac.ru of the RAS Physical Sciences Division listed the following reports: (1) Ilkaev R I (RFNC-VNIIEF, Sarov, Nizhny Novgorod region). Opening remarks "On the fundamental physics research programs at RFNC-VNIIEF"; (2) Mikhailov A L (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Hydrodynamic instabilities in various media"; (3) Trunin R F (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Study of extreme states of metals using shock waves"; (4) Ivanovskii A V (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Explosive magnetic energy generators and their application in research"; (5) Podurets A M (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "X-ray studies of the structure of matter in shock waves"; (6) Garanin S G (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "High-power lasers in studies of the physics of hot, dense plasma and thermonuclear fusion"; (7) Selemir V D (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Physics research in ultrahigh magnetic fields"; (8) Mkhitar'yan L S (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Gasdynamic thermonuclear fusion." Articles based on reports 1-7 are published below. An extended version of report 3, written as a review paper, will be published in a later issue of Physics-Uspekhi. • Fundamental physics research at the All-Russian Research Institute of Experimental Physics, R I Ilkaev Physics-Uspekhi, 2011, Volume 54, Number 4, Pages 387-392 • Hydrodynamic instabilities, A L Mikhailov, N V Nevmerzhitskii, V A Raevskii Physics-Uspekhi, 2011, Volume 54, Number 4, Pages 392-397 • Extreme states of metals: investigation using shock

  7. National Smart Water Grid

    SciTech Connect

    Beaulieu, R A

    2009-07-13

    The United States repeatedly experiences floods along the Midwest's large rivers and droughts in the arid Western States that cause traumatic environmental conditions with huge economic impact. With an integrated approach and solution these problems can be alleviated. Tapping into the Mississippi River and its tributaries, the world's third largest fresh water river system, during flood events will mitigate the damage of flooding and provide a new source of fresh water to the Western States. The trend of increased flooding on the Midwest's large rivers is supported by a growing body of scientific literature. The Colorado River Basin and the western states are experiencing a protracted multi-year drought. Fresh water can be pumped via pipelines from areas of overabundance/flood to areas of drought or high demand. Calculations document that 10 to 60 million acre-feet (maf) of fresh water per flood event can be captured from the Midwest's rivers and pumped via pipelines to the Colorado River and introduced upstream of Lake Powell, Utah, to destinations near Denver, Colorado, and used in areas along the pipelines. Water users of the Colorado River include the cities in southern Nevada, southern California, northern Arizona, Colorado, Utah, Indian Tribes, and Mexico. The proposed start and end points, and routes of the pipelines are documented, including information on rights-of-way necessary for state and federal permits. A National Smart Water Grid™ (NSWG) Project will create thousands of new jobs for construction, operation, and maintenance and save billions of tax dollars in drought and flood damage reparations. The socio-economic benefits of the NSWG include decreased flooding in the Midwest; increased agriculture, recreation and tourism; improved national security, transportation, and fishery and wildlife habitats; mitigated regional climate change and global warming such as increased carbon capture; decreased salinity in Colorado River water crossing the US

  8. An electrostatic analog for generating cascade grids

    NASA Technical Reports Server (NTRS)

    Adamczyk, J. J.

    1980-01-01

    Accurate and efficient numerical simulation of flows through turbomachinery blade rows depends on the topology of the computational grids. These grids must reflect the periodic nature of turbomachinery blade row geometries and conform to the blade shapes. Three types of grids can be generated that meet these minimal requirements: through-flow grids, O-type grids, and C-type grids. A procedure which can be used to generate all three types of grids is presented. The resulting grids are orthogonal and can be stretched to capture the essential physics of the flow. A discussion is also presented detailing the extension of the generation procedure to three dimensional geometries.

  9. GridOPTICS Software System

    2014-02-24

    GridOPTICS Software System (GOSS) is a middleware that facilitates the creation of new, modular and flexible operational and planning platforms that can meet the challenges of the next-generation power grid. GOSS enables the Department of Energy, power system utilities, and vendors to build better tools faster. GOSS makes it possible to integrate Future Power Grid Initiative software products/prototypes into existing power grid software systems, including the PNNL PowerNet and EIOC environments. GOSS is designed to allow power grid applications developed for different underlying software platforms installed in different utilities to communicate with ease. This can be done in compliance with existing security and data sharing policies between the utilities. GOSS supports not only one-to-one data transfer between applications but also a publisher/subscriber scheme. To support the interoperability requirements of future EMS, GOSS is designed for CIM compliance. In addition, it supports authentication and authorization capabilities to protect the system from cyber threats. In summary, the contributions of the GOSS middleware are as follows: • A platform to support future EMS development. • A middleware that promotes interoperability between power grid applications. • A distributed architecture that separates data sources from power grid applications. • Support for data exchange with either one-to-one or publisher/subscriber interfaces. • An authentication and authorization scheme for limiting access to data between utilities.

  10. GridOPTICS Software System

    SciTech Connect

    Akyol, Bora A.; Ciraci, Selim; Gibson, Tara; Rice, Mark; Sharma, Poorva; Yin, Jian; Allwardt, Craig; /PNNL

    2014-02-24

    GridOPTICS Software System (GOSS) is a middleware that facilitates the creation of new, modular and flexible operational and planning platforms that can meet the challenges of the next-generation power grid. GOSS enables the Department of Energy, power system utilities, and vendors to build better tools faster. GOSS makes it possible to integrate Future Power Grid Initiative software products/prototypes into existing power grid software systems, including the PNNL PowerNet and EIOC environments. GOSS is designed to allow power grid applications developed for different underlying software platforms installed in different utilities to communicate with ease. This can be done in compliance with existing security and data sharing policies between the utilities. GOSS supports not only one-to-one data transfer between applications but also a publisher/subscriber scheme. To support the interoperability requirements of future EMS, GOSS is designed for CIM compliance. In addition, it supports authentication and authorization capabilities to protect the system from cyber threats. In summary, the contributions of the GOSS middleware are as follows: • A platform to support future EMS development. • A middleware that promotes interoperability between power grid applications. • A distributed architecture that separates data sources from power grid applications. • Support for data exchange with either one-to-one or publisher/subscriber interfaces. • An authentication and authorization scheme for limiting access to data between utilities.
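
    GOSS itself is middleware with its own API; the snippet below only illustrates, with invented names, the two data-exchange patterns mentioned above: one-to-one request/response between a client and a named service, and topic-based publish/subscribe fan-out.

```python
# Minimal illustration of the two exchange patterns mentioned above
# (one-to-one request/response and publish/subscribe). This is not the GOSS
# API, just the pattern, with invented topic and service names.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subs = defaultdict(list)      # topic -> list of subscriber callbacks
        self._handlers = {}                 # service name -> request handler

    # publisher/subscriber: many receivers per topic
    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for cb in self._subs[topic]:
            cb(message)

    # one-to-one: a named service answers a single requester
    def register_service(self, name, handler):
        self._handlers[name] = handler

    def request(self, name, payload):
        return self._handlers[name](payload)

bus = MessageBus()
bus.subscribe("pmu/measurements", lambda m: print("app A got", m))
bus.subscribe("pmu/measurements", lambda m: print("app B got", m))
bus.register_service("topology", lambda q: {"buses": 3, "query": q})

bus.publish("pmu/measurements", {"bus": 1, "freq_hz": 59.98})
print(bus.request("topology", {"area": "west"}))
```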

  11. A Java commodity grid kit.

    SciTech Connect

    von Laszewski, G.; Foster, I.; Gawor, J.; Lane, P.; Mathematics and Computer Science

    2001-07-01

    In this paper we report on the features of the Java Commodity Grid Kit. The Java CoG Kit provides middleware for accessing Grid functionality from the Java framework. The Java CoG Kit middleware is general enough to support the design of a variety of advanced Grid applications with quite different user requirements. Access to the Grid is established via Globus protocols, allowing the Java CoG Kit also to communicate with the C Globus reference implementation. Thus, the Java CoG Kit provides Grid developers with the ability to utilize the Grid, as well as numerous additional libraries and frameworks developed by the Java community to enable network, Internet, enterprise, and peer-to-peer computing. A variety of projects have successfully used the client libraries of the Java CoG Kit to access Grids driven by the C Globus software. In this paper we also report on the efforts to develop server-side Java CoG Kit components. As part of this research we have implemented a prototype pure Java resource management system that enables one to run Globus jobs on platforms on which a Java virtual machine is supported, including Windows NT machines.

  12. Buildings-to-Grid Technical Opportunities: From the Grid Perspective

    SciTech Connect

    Kropski, Ben; Pratt, Rob

    2014-03-28

    This paper outlines the nature of the power grid, lists challenges and barriers to the implementation of a transactive energy ecosystem, and provides concept solutions to current technological impediments.

  13. Running medical image analysis on GridFactory desktop grid.

    PubMed

    Orellana, Frederik; Niinimaki, Marko; Zhou, Xin; Rosendahl, Peter; Müller, Henning; Waananen, Anders

    2009-01-01

    At the Geneva University Hospitals, work is in progress to establish a computing facility for medical image analysis, potentially using several hundred desktop computers. Typically, hospitals do not have a computer infrastructure dedicated to research, nor can the data leave the hospital network, for reasons of privacy. For this purpose, a novel batch system called GridFactory has been tested alongside the well-known batch system Condor. GridFactory's main benefits, compared to other batch systems, lie in its virtualization support and firewall friendliness. The tests involved running visual feature extraction on 50,000 anonymized medical images on a small local grid of 20 desktop computers. A comparison with a Condor-based batch system on the same computers is then presented. The performance of GridFactory is found to be satisfactory. PMID:19593040

  14. Simulation of an Isolated Tiltrotor in Hover with an Unstructured Overset-Grid RANS Solver

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, Elizabeth M.; Biedron, Robert T.

    2009-01-01

    An unstructured overset-grid Reynolds-averaged Navier-Stokes (RANS) solver, FUN3D, is used to simulate an isolated tiltrotor in hover. An overview of the computational method is presented as well as the details of the overset-grid systems. Steady-state computations within a noninertial reference frame define the performance trends of the rotor across a range of the experimental collective settings. Results are presented to show the effects of off-body grid refinement and blade grid refinement. The computed performance and blade loading trends show good agreement with experimental results and previously published structured overset-grid computations. Off-body flow features indicate a significant improvement in the resolution of the first perpendicular blade vortex interaction with background grid refinement across the collective range. Considering experimental data uncertainty and effects of transition, the prediction of figure of merit on the baseline and refined grid is reasonable at the higher collective range, within 3 percent of the measured values. At the lower collective settings, the computed figure of merit is approximately 6 percent lower than the experimental data. A comparison of steady and unsteady results shows that with temporal refinement, the dynamic results closely match the steady-state noninertial results, which gives confidence in the accuracy of the dynamic overset-grid approach.

  15. Overset grids in compressible flow

    NASA Technical Reports Server (NTRS)

    Eberhardt, S.; Baganoff, D.

    1985-01-01

    Numerical experiments have been performed to investigate the importance of boundary data handling with overset grids in computational fluid dynamics. Experience in using embedded grid techniques in compressible flow has shown that shock waves which cross grid boundaries become ill defined and convergence is generally degraded. Numerical boundary schemes were studied to investigate the cause of these problems and a viable solution was generated using the method of characteristics to define a boundary scheme. The model test problem investigated consisted of a detached shock wave on a 2-dimensional Mach 2 blunt, cylindrical body.

  16. Grid Visualization Tool

    NASA Technical Reports Server (NTRS)

    Chouinard, Caroline; Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steven

    2005-01-01

    The Grid Visualization Tool (GVT) is a computer program for displaying the path of a mobile robotic explorer (rover) on a terrain map. The GVT reads a map-data file in either portable graymap (PGM) or portable pixmap (PPM) format, representing a gray-scale or color map image, respectively. The GVT also accepts input from path-planning and activity-planning software. From these inputs, the GVT generates a map overlaid with one or more rover path(s), waypoints, locations of targets to be explored, and/or target-status information (indicating success or failure in exploring each target). The display can also indicate different types of paths or path segments, such as the path actually traveled versus a planned path or the path traveled to the present position versus planned future movement along a path. The program provides for updating of the display in real time to facilitate visualization of progress. The size of the display and the map scale can be changed as desired by the user. The GVT was written in the C++ language using the Open Graphics Library (OpenGL) software. It has been compiled for both Sun Solaris and Linux operating systems.

  17. National transmission grid study

    SciTech Connect

    Abraham, Spencer

    2003-05-31

    The National Energy Policy Plan directed the U.S. Department of Energy (DOE) to conduct a study to examine the benefits of establishing a national electricity transmission grid and to identify transmission bottlenecks and measures to address them. DOE began by conducting an independent analysis of U.S. electricity markets and identifying transmission system bottlenecks using DOE’s Policy Office Electricity Modeling System (POEMS). DOE’s analysis, presented in Section 2, confirms the central role of the nation’s transmission system in lowering costs to consumers through increased trade. More importantly, DOE’s analysis also confirms the results of previous studies, which show that transmission bottlenecks and related transmission system market practices are adding hundreds of millions of dollars to consumers’ electricity bills each year. A more detailed technical overview of the use of POEMS is provided in Appendix A. DOE led an extensive, open, public input process and heard a wide range of comments and recommendations that have all been considered. More than 150 participants registered for three public workshops held in Detroit, MI (September 24, 2001); Atlanta, GA (September 26, 2001); and Phoenix, AZ (September 28, 2001).

  18. Symbolic Constraint Maintenance Grid

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    Version 3.1 of Symbolic Constraint Maintenance Grid (SCMG) is a software system that provides a general conceptual framework for utilizing pre-existing programming techniques to perform symbolic transformations of data. SCMG also provides a language (and an associated communication method and protocol) for representing constraints on the original non-symbolic data. SCMG provides a facility for exchanging information between numeric and symbolic components without knowing the details of the components themselves. In essence, it integrates symbolic software tools (for diagnosis, prognosis, and planning) with non-artificial-intelligence software. SCMG executes a process of symbolic summarization and monitoring of continuous time series data that are being abstractly represented as symbolic templates of information exchange. This summarization process enables such symbolic- reasoning computing systems as artificial- intelligence planning systems to evaluate the significance and effects of channels of data more efficiently than would otherwise be possible. As a result of the increased efficiency in representation, reasoning software can monitor more channels and is thus able to perform monitoring and control functions more effectively.
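
    The symbolic summarization step can be pictured with a toy example: continuous samples are mapped to symbols by thresholds, and runs of identical symbols are collapsed into episodes that a planner could reason over. The thresholds, symbols and channel values below are invented and are not SCMG's actual representation language.

```python
# Toy illustration of symbolic summarization of a continuous channel (not the
# SCMG language itself): samples are binned into symbols by thresholds, and
# runs of identical symbols are collapsed into (symbol, start, end) episodes.
def symbolize(samples, thresholds=(0.3, 0.7), symbols=("LOW", "NOMINAL", "HIGH")):
    out = []
    for x in samples:
        level = sum(x > t for t in thresholds)   # count of thresholds exceeded
        out.append(symbols[level])
    return out

def summarize(symbol_seq):
    episodes, start = [], 0
    for i in range(1, len(symbol_seq) + 1):
        if i == len(symbol_seq) or symbol_seq[i] != symbol_seq[start]:
            episodes.append((symbol_seq[start], start, i - 1))
            start = i
    return episodes

channel = [0.1, 0.2, 0.4, 0.5, 0.9, 0.95, 0.6, 0.2]
print(summarize(symbolize(channel)))
# [('LOW', 0, 1), ('NOMINAL', 2, 3), ('HIGH', 4, 5), ('NOMINAL', 6, 6), ('LOW', 7, 7)]
```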

  19. Skeletal muscle grids for assessing current distributions from defibrillation shocks.

    PubMed

    Schmidt, J; Gatlin, B; Eason, J; Koomullil, G; Pilkington, T

    1992-01-01

    This paper utilizes a structured and an unstructured grid representation of a torso with an anisotropic skeletal muscle to assess current distributions from defibrillation shocks. The results show that a finite-element solution on an unstructured grid of 400,000 elements (60,000 nodes) achieves comparable current distributions with a finite-difference solution on a structured grid that uses approximately the same number of nodes. Moreover, a finite-element solution on a 65,000-element (10,500 nodes) unstructured grid yielded fractional percent current results within 5% of the finer grids. The structured and unstructured grid models are used to investigate recent interpretations of experimental data that concluded that more than 80% of the total defibrillation current is shunted by the anisotropic skeletal muscle thoracic cage. It is concluded that these interpretations, which were based on a one-dimensional resistive network representation of the three-dimensional defibrillation situation, overestimate by 25% the current shunted by the anisotropic thoracic cage. PMID:1424684

  20. Spatial grid services for adaptive spatial query optimization

    NASA Astrophysics Data System (ADS)

    Gao, Bingbo; Xie, Chuanjie; Sheng, Wentao

    2008-10-01

    Spatial information sharing and integration have now become an important issue in Geographical Information Science (GIS). Web Service technologies provide an easy and standard way to share spatial resources over the network, and grid technologies, which aim at sharing resources such as data, storage, and computational power, can help the sharing go deeper. However, the dynamic nature of the grid complicates spatial query optimization, a problem that is more pronounced in the GIS domain because spatial operations are both CPU-intensive and data-intensive. To address this problem, a new grid framework is employed to provide standard spatial services that can also manage and report their state information to the coordinator responsible for distributed spatial query optimization.

  1. Sensitivity of 30-cm mercury bombardment ion thruster characteristics to accelerator grid design

    NASA Technical Reports Server (NTRS)

    Rawlin, V. K.

    1978-01-01

    The design of ion optics for bombardment thrusters strongly influences overall performance and lifetime. The operation of a 30 cm thruster with accelerator grid open area fractions ranging from 43 to 24 percent was evaluated and compared with experimental and theoretical results. Ion optics properties measured included the beam current extraction capability, the minimum accelerator grid voltage to prevent backstreaming, ion beamlet diameter as a function of radial position on the grid and accelerator grid hole diameter, and the high-energy, high-angle ion beam edge location. Discharge chamber properties evaluated were propellant utilization efficiency, minimum discharge power per beam amp, and minimum discharge voltage.

  2. isochrones: Stellar model grid package

    NASA Astrophysics Data System (ADS)

    Morton, Timothy D.

    2015-03-01

    Isochrones, written in Python, simplifies common tasks often done with stellar model grids, such as simulating synthetic stellar populations, plotting evolution tracks or isochrones, or estimating the physical properties of a star given photometric and/or spectroscopic observations.

  3. Modal Analysis for Grid Operation

    SciTech Connect

    2011-03-03

    The MANGO software provides a solution for improving the small-signal stability of power systems by adjusting operator-controllable variables using PMU measurements. System oscillation problems are one of the major threats to grid stability and reliability in California and the Western Interconnection. These problems result in power fluctuations and lower grid operating efficiency, and may even lead to large-scale grid breakup and outages. The MANGO software addresses this problem by automatically generating recommended operating procedures, termed Modal Analysis for Grid Operation (MANGO), to improve the damping of inter-area oscillation modes. The MANGO procedure includes three steps: recognizing small-signal stability problems, implementing an operating point adjustment using modal sensitivity, and evaluating the effectiveness of the adjustment. The MANGO software package is designed to help implement the MANGO procedure.
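
    The three-step procedure can be sketched schematically. In the snippet below, an invented linear relationship between a tie-line flow and the damping ratio of one inter-area mode stands in for the measurement-based modal estimation and sensitivities that MANGO actually uses; all numbers are purely illustrative.

```python
# Schematic sketch of the three MANGO steps described above, with an invented
# linear model in place of real PMU-based modal estimation: (1) detect a poorly
# damped mode, (2) adjust an operator-controllable variable along its modal
# sensitivity, (3) re-evaluate the damping.
def estimate_damping(operating_point):
    # Stand-in for measurement-based modal analysis: damping ratio of the
    # critical inter-area mode as an (assumed) function of a tie-line flow.
    return 0.02 + 0.0004 * (1500.0 - operating_point["tie_flow_mw"])

def mango_adjust(operating_point, target_damping=0.05,
                 sensitivity=-0.0004, max_iters=10, step_limit=100.0):
    """Iteratively adjust the tie-line flow until the mode meets the damping target."""
    for _ in range(max_iters):
        zeta = estimate_damping(operating_point)          # step 1: recognize the problem
        if zeta >= target_damping:
            break
        delta = (target_damping - zeta) / sensitivity     # step 2: use modal sensitivity
        delta = max(min(delta, step_limit), -step_limit)  # respect operating limits
        operating_point["tie_flow_mw"] += delta
    return operating_point, estimate_damping(operating_point)  # step 3: evaluate

print(mango_adjust({"tie_flow_mw": 1500.0}))
```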

  4. Assistive Awareness in Smart Grids

    NASA Astrophysics Data System (ADS)

    Bourazeri, Aikaterini; Almajano, Pablo; Rodriguez, Inmaculada; Lopez-Sanchez, Maite

    The following sections are included: * Introduction * Background * The User-Infrastructure Interface * User Engagement through Assistive Awareness * Research Impact * Serious Games for Smart Grids * Serious Game Technology * Game scenario * Game mechanics * Related Work * Summary and Conclusions

  5. Grid-Enabled High Energy Physics Research using a Beowulf Cluster

    NASA Astrophysics Data System (ADS)

    Mahmood, Akhtar

    2005-04-01

    At Edinboro University of Pennsylvania, we have built an 8-node, 25 Gflops Beowulf Cluster with 2.5 TB of disk storage space to carry out grid-enabled, data-intensive high energy physics research for the ATLAS experiment via Grid3. We will describe how we built and configured our Cluster, which we have named the Sphinx Beowulf Cluster, and present the results of our cluster benchmark studies and the run-time plots of several parallel application codes. Once fully functional, the Cluster will be part of Grid3 [www.ivdgl.org/grid3]. The current ATLAS simulation grid application models the entire physical process, from the proton-antiproton collisions and the detector's response to the collision debris through the complete reconstruction of the event from analyses of these responses. The end result is a detailed set of data that simulates the real physical collision event inside a particle detector. The Grid is the new IT infrastructure for 21st-century science -- a new computing paradigm that is poised to transform the practice of large-scale data-intensive research in science and engineering. The Grid will allow scientists worldwide to view and analyze huge amounts of data flowing from the large-scale experiments in High Energy Physics. The Grid is expected to bring together geographically and organizationally dispersed computational resources, such as CPUs, storage systems, communication systems, and data sources.

  6. CDF GlideinWMS usage in grid computing of high energy physics

    SciTech Connect

    Zvada, Marian; Benjamin, Doug; Sfiligoi, Igor; /Fermilab

    2010-01-01

    Many members of large science collaborations already have specialized grids available to advance their research, but the need for more computing resources for data analysis has forced the Collider Detector at Fermilab (CDF) collaboration to move beyond the use of dedicated resources and start exploiting Grid resources. Nowadays, the CDF experiment relies increasingly on glidein-based computing pools for data reconstruction, Monte Carlo production, and user data analysis, serving over 400 users through the central analysis farm middleware (CAF) on top of the Condor batch system and the CDF Grid infrastructure. Condor has a distributed architecture, and its glidein mechanism of pilot jobs is ideal for abstracting Grid computing by creating a virtual private computing pool. We present the first production use of the generic pilot-based Workload Management System (glideinWMS), an implementation of the pilot mechanism based on the Condor distributed infrastructure. CDF Grid computing uses glideinWMS for data reconstruction on the FNAL campus Grid, and for user analysis and Monte Carlo production across the Open Science Grid (OSG). We review this computing model and the setup used, including the CDF-specific configuration within the glideinWMS system, which provides powerful scalability and makes Grid computing work like a local batch environment, with the ability to handle more than 10,000 running jobs at a time.

  7. CDF GlideinWMS usage in Grid computing of high energy physics

    NASA Astrophysics Data System (ADS)

    Zvada, Marian; Benjamin, Doug; Sfiligoi, Igor

    2010-04-01

    Many members of large science collaborations already have specialized grids available to advance their research, but the need for more computing resources for data analysis has forced the Collider Detector at Fermilab (CDF) collaboration to move beyond the use of dedicated resources and start exploiting Grid resources. Nowadays, the CDF experiment relies increasingly on glidein-based computing pools for data reconstruction, Monte Carlo production, and user data analysis, serving over 400 users through the central analysis farm middleware (CAF) on top of the Condor batch system and the CDF Grid infrastructure. Condor has a distributed architecture, and its glidein mechanism of pilot jobs is ideal for abstracting Grid computing by creating a virtual private computing pool. We present the first production use of the generic pilot-based Workload Management System (glideinWMS), an implementation of the pilot mechanism based on the Condor distributed infrastructure. CDF Grid computing uses glideinWMS for data reconstruction on the FNAL campus Grid, and for user analysis and Monte Carlo production across the Open Science Grid (OSG). We review this computing model and the setup used, including the CDF-specific configuration within the glideinWMS system, which provides powerful scalability and makes Grid computing work like a local batch environment, with the ability to handle more than 10,000 running jobs at a time.

  8. Smart Wire Grid: Resisting Expectations

    ScienceCinema

    Ramsay, Stewart; Lowe, DeJim

    2016-07-12

    Smart Wire Grid's DSR technology (Discrete Series Reactor) can be quickly deployed on electrical transmission lines to create intelligent mesh networks capable of quickly rerouting electricity to get power where and when it's needed the most. With their recent ARPA-E funding, Smart Wire Grid has been able to move from prototype and field testing to building out a US manufacturing operation in just under a year.

  9. Parallel Power Grid Simulation Toolkit

    SciTech Connect

    Smith, Steve; Kelley, Brian; Banks, Lawrence; Top, Philip; Woodward, Carol

    2015-09-14

    ParGrid is a 'wrapper' that integrates a coupled Power Grid Simulation toolkit consisting of a library to manage the synchronization and communication of independent simulations. The library included in ParGrid, named FSKIT, is intended to support the coupling of multiple continuous and discrete-event parallel simulations. The code is designed using modern object-oriented C++ methods, utilizing C++11 and current Boost libraries to ensure compatibility with multiple operating systems and environments.

  10. Reinventing Batteries for Grid Storage

    ScienceCinema

    Banerjee, Sanjoy

    2016-07-12

    The City University of New York's Energy Institute, with the help of ARPA-E funding, is creating safe, low cost, rechargeable, long lifecycle batteries that could be used as modular distributed storage for the electrical grid. The batteries could be used at the building level or the utility level to offer benefits such as capture of renewable energy, peak shaving and microgridding, for a safer, cheaper, and more secure electrical grid.

  11. Smart Wire Grid: Resisting Expectations

    SciTech Connect

    Ramsay, Stewart; Lowe, DeJim

    2014-03-03

    Smart Wire Grid's DSR technology (Discrete Series Reactor) can be quickly deployed on electrical transmission lines to create intelligent mesh networks capable of quickly rerouting electricity to get power where and when it's needed the most. With their recent ARPA-E funding, Smart Wire Grid has been able to move from prototype and field testing to building out a US manufacturing operation in just under a year.

  12. Towards Smart Grid Dynamic Ratings

    NASA Astrophysics Data System (ADS)

    Cheema, Jamal; Clark, Adrian; Kilimnik, Justin; Pavlovski, Chris; Redman, David; Vu, Maria

    2011-08-01

    The energy distribution industry is giving greater attention to smart grid solutions as a means of increasing the capabilities, efficiency and reliability of the electrical power network. The smart grid makes use of intelligent monitoring and control devices throughout the distribution network to report on electrical properties such as voltage, current and power, as well as to raise network alarms and events. A further aspect of the smart grid is the dynamic rating of the network's electrical assets. This fundamentally involves rating the load current capacity of electrical assets including feeders, transformers and switches. The mainstream approach to rating assets is to apply the vendor plate rating, which often under-utilizes assets or, in some cases, over-utilizes them when environmental conditions reduce the effective rated capacity, potentially reducing lifetime. Using active intelligence, we have developed a rating system that rates assets in real time based upon several events. This allows for a far more efficient and reliable electrical grid and further extends the life and reliability of the electrical network. In this paper we describe our architecture and the observations made during development and live deployment of the solution into operation. We also illustrate how this solution blends with the smart grid by proposing a dynamic rating system for the smart grid.
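
    A dynamic rating boils down to re-solving a thermal limit with live environmental inputs instead of the worst-case assumptions behind the nameplate value. The sketch below uses a deliberately simplified steady-state heat balance with invented coefficients; it is not the authors' model nor a standard such as IEEE 738, but it shows why a cool, windy period supports a higher rating than a hot, still one.

```python
# Simplified, illustrative dynamic line rating: a steady-state heat balance
# I^2 * R = q_convective + q_radiative - q_solar, solved for current. The
# coefficients are invented placeholders.
import math

def dynamic_rating_amps(t_ambient_c, wind_mps, solar_wm2,
                        t_conductor_max_c=75.0, r_ohm_per_m=7e-5):
    dt = t_conductor_max_c - t_ambient_c                        # allowed temperature rise
    q_conv = (1.0 + 3.0 * math.sqrt(max(wind_mps, 0.1))) * dt   # W/m, toy convection term
    q_rad = 0.02 * dt                                           # W/m, toy radiation term
    q_sun = 0.01 * solar_wm2                                    # W/m, toy solar gain
    q_net = max(q_conv + q_rad - q_sun, 0.0)
    return math.sqrt(q_net / r_ohm_per_m)                       # ampacity in amps

# A cool, windy night supports far more current than a hot, still, sunny afternoon:
print(round(dynamic_rating_amps(t_ambient_c=10, wind_mps=5, solar_wm2=0)))
print(round(dynamic_rating_amps(t_ambient_c=40, wind_mps=0.2, solar_wm2=1000)))
```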

  13. Navier-Stokes simulation of rotor-body flowfield in hover using overset grids

    NASA Technical Reports Server (NTRS)

    Srinivasan, G. R.; Ahmad, J. U.

    1993-01-01

    A free-wake Navier-Stokes numerical scheme and multiple Chimera overset grids have been utilized for calculating the quasi-steady hovering flowfield of a Boeing-360 rotor mounted on an axisymmetric whirl-tower. The entire geometry of this rotor-body configuration is gridded-up with eleven different overset grids. The composite grid has 1.3 million grid points for the entire flow domain. The numerical results, obtained using coarse grids and a rigid rotor assumption, show a thrust value that is within 5% of the experimental value at a flow condition of M_tip = 0.63, theta_c = 8 deg, and Re = 2.5 x 10^6. The numerical method thus demonstrates the feasibility of using a multi-block scheme for calculating the flowfields of complex configurations consisting of rotating and non-rotating components.

  14. On the application of Chimera/unstructured hybrid grids for conjugate heat transfer

    NASA Technical Reports Server (NTRS)

    Kao, Kai-Hsiung; Liou, Meng-Sing

    1995-01-01

    A hybrid grid system that combines the Chimera overset grid scheme and an unstructured grid method is developed to study fluid flow and heat transfer problems. With the proposed method, the solid structural region, in which only the heat conduction is considered, can be easily represented using an unstructured grid method. As for the fluid flow region external to the solid material, the Chimera overset grid scheme has been shown to be very flexible and efficient in resolving complex configurations. The numerical analyses require the flow field solution and material thermal response to be obtained simultaneously. A continuous transfer of temperature and heat flux is specified at the interface, which connects the solid structure and the fluid flow as an integral system. Numerical results are compared with analytical and experimental data for a flat plate and a C3X cooled turbine cascade. A simplified drum-disk system is also simulated to show the effectiveness of this hybrid grid system.
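
    The interface treatment described above, in which temperature and heat flux are exchanged continuously between the fluid-flow solver and the solid conduction region, can be reduced to a 1-D toy: a Dirichlet-Neumann iteration between two conducting regions, under-relaxed until both quantities match at the interface. All material values are invented.

```python
# 1-D toy of the interface coupling described above: the "fluid" region is
# solved with a prescribed interface temperature (Dirichlet), hands its heat
# flux to the "solid" region (Neumann), and the interface temperature is
# relaxed until temperature and flux are continuous. Parameters are invented.
def fluid_flux_to_interface(t_left, t_iface, k=10.0, length=0.1):
    # steady 1-D conduction stand-in for the flow solver: flux toward the interface
    return k * (t_left - t_iface) / length

def solid_interface_temperature(q_in, t_right, k=1.0, length=0.05):
    # heat conduction in the solid with the incoming flux imposed at the interface
    return t_right + q_in * length / k

def couple(t_left=500.0, t_right=300.0, relax=0.2, tol=1e-8, max_iters=200):
    t_iface = 0.5 * (t_left + t_right)            # initial interface guess
    for _ in range(max_iters):
        q = fluid_flux_to_interface(t_left, t_iface)
        t_new = solid_interface_temperature(q, t_right)
        if abs(t_new - t_iface) < tol:
            break
        t_iface += relax * (t_new - t_iface)      # under-relaxed update
    return t_iface, q

print(couple())   # converges to roughly (466.67, 3333.3) for these toy values
```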

  15. Grid-Optimization Program for Photovoltaic Cells

    NASA Technical Reports Server (NTRS)

    Daniel, R. E.; Lee, T. S.

    1986-01-01

    The CELLOPT program was developed to assist in designing the grid pattern of current-conducting material on a photovoltaic cell. It analyzes the parasitic resistance losses and shadow loss associated with a metallized grid pattern on both round and rectangular solar cells. Though it can perform sensitivity studies, it is used primarily to optimize the grid design in terms of bus bars and grid lines by minimizing power loss. CELLOPT is written in APL.
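
    The balance CELLOPT strikes can be seen in a one-variable toy: widening the spacing between grid lines reduces shadowing but increases the resistive loss in the emitter sheet between lines. The sketch below uses the textbook emitter-loss expression rho_sheet * J_mp * s^2 / (12 * V_mp) and invented cell parameters; it is not the APL program.

```python
# Worked toy example of the trade-off CELLOPT optimizes: shadow loss grows as
# grid lines get denser, emitter resistive loss grows as they get sparser.
# All parameter values are illustrative, not taken from the program.
import numpy as np

rho_sheet = 100.0      # emitter sheet resistance, ohm/square
j_mp = 0.030           # current density at the maximum power point, A/cm^2
v_mp = 0.5             # voltage at the maximum power point, V
line_width = 0.01      # grid line width, cm

def fractional_loss(spacing_cm):
    shadow = line_width / spacing_cm
    resistive = rho_sheet * j_mp * spacing_cm ** 2 / (12.0 * v_mp)
    return shadow + resistive

spacings = np.linspace(0.05, 0.5, 200)          # candidate line spacings, cm
losses = fractional_loss(spacings)
best = spacings[np.argmin(losses)]
print(f"optimum spacing ~{best:.3f} cm, total fractional loss ~{losses.min():.3%}")
```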

  16. ASCR Science Network Requirements

    SciTech Connect

    Dart, Eli; Tierney, Brian

    2009-08-24

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In April 2009 ESnet and the Office of Advanced Scientific Computing Research (ASCR), of the DOE Office of Science, organized a workshop to characterize the networking requirements of the programs funded by ASCR. The ASCR facilities anticipate significant increases in wide area bandwidth utilization, driven largely by the increased capabilities of computational resources and the wide scope of collaboration that is a hallmark of modern science. Many scientists move data sets between facilities for analysis, and in some cases (for example the Earth System Grid and the Open Science Grid), data distribution is an essential component of the use of ASCR facilities by scientists. Due to the projected growth in wide area data transfer needs, the ASCR supercomputer centers all expect to deploy and use 100 Gigabit per second networking technology for wide area connectivity as soon as that deployment is financially feasible. In addition to the network connectivity that ESnet provides, the ESnet Collaboration Services (ECS) are critical to several science communities. ESnet identity and trust services, such as the DOEGrids certificate authority, are widely used both by the supercomputer centers and by collaborations such as Open Science Grid (OSG) and the Earth System Grid (ESG). Ease of use is a key determinant of the scientific utility of network-based services. Therefore, a key enabling aspect for scientists beneficial use of high

  17. Grid accounting service: state and future development

    SciTech Connect

    Levshina, T.; Sehgal, C.; Bockelman, B.; Weitzel, D.; Guru, A.

    2014-01-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and identity of users using resources. The accounting service is important for verifying pledged resource allocation per particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and Holland Computing Center at University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machines provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the direction of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.

  18. The European Grid of Solar Observations (EGSO)

    NASA Astrophysics Data System (ADS)

    Bentley, R. D.; EGSO Team

    2002-05-01

    A major hurdle in the analysis of solar data is finding what data are available and retrieving those that are needed. Planned space- and ground-based instruments will produce huge volumes of data, and even taking continuous technical advances into account, it is clear that a new approach to the way we use these data is needed. The European Grid of Solar Observations (EGSO) is a Grid test-bed that will change the way users analyze solar data. EGSO will federate solar data archives across Europe and beyond, and will create the tools to select, process and retrieve distributed and heterogeneous solar data. It will provide mechanisms to produce standardized observing catalogues for space and ground-based observations, and the tools to create solar feature catalogues that will facilitate the selection of solar data based on features, events and phenomena. In essence, EGSO will provide the fabric of a virtual observatory. EGSO is funded under the IST (Information Society Technologies) thematic programme of the European Commission's Fifth Framework Programme (FP5). The project started in March 2002 and will last for 3 years. The EGSO consortium comprises 10 institutes from Europe and the US, and is led by the Mullard Space Science Laboratory (MSSL) of University College London (UCL). EGSO plans to work closely with groups funded under NASA's Virtual Solar Observatory (VSO) initiative, and with the team at Lockheed-Martin who are doing similar work within the ILWS programme.

  19. Grid accounting service: state and future development

    NASA Astrophysics Data System (ADS)

    Levshina, T.; Sehgal, C.; Bockelman, B.; Weitzel, D.; Guru, A.

    2014-06-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and identity of users using resources. The accounting service is important for verifying pledged resource allocation per particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and Holland Computing Center at University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machines provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the direction of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.

  20. ReSS: Resource Selection Service for National and Campus Grid Infrastructure

    SciTech Connect

    Mhashilkar, Parag; Garzoglio, Gabriele; Levshina, Tanya; Timm, Steve; /Fermilab

    2009-05-01

    The Open Science Grid (OSG) offers access to around a hundred compute elements (CEs) and storage elements (SEs) via standard Grid interfaces. The Resource Selection Service (ReSS) is a push-based workload management system that is integrated with the OSG information systems and resources. ReSS integrates standard Grid tools such as Condor, as a brokering service, and the gLite CEMon, for gathering and publishing resource information in GLUE schema format. ReSS is used in OSG by Virtual Organizations (VOs) such as the Dark Energy Survey (DES), DZero and the Engagement VO, and is also used as a resource selection service for campus Grids such as FermiGrid. VOs use ReSS to automate resource selection in their workload management systems to run jobs over the grid. In the past year, the system has been enhanced to enable publication and selection of storage resources and of any special software or software libraries (such as MPI libraries) installed at computing resources. In this paper, we discuss the Resource Selection Service and its typical usage on the two scales of a National Cyber Infrastructure Grid, such as OSG, and of a campus Grid, such as FermiGrid.
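
    ReSS delegates the actual matching to Condor ClassAd matchmaking over GLUE-schema attributes; the toy below only mimics that selection step with plain dictionaries and invented attribute names: filter the resources that satisfy the job's requirements, then rank the survivors.

```python
# Toy imitation of the resource-selection step ReSS performs (real ReSS uses
# Condor ClassAd matchmaking over GLUE-schema attributes; all names here are
# invented): keep resources matching the job's requirements, then rank them.
resources = [
    {"site": "SiteA", "free_cpus": 120, "mpi": False, "storage_gb": 800,  "rank_load": 0.4},
    {"site": "SiteB", "free_cpus": 60,  "mpi": True,  "storage_gb": 2000, "rank_load": 0.1},
    {"site": "SiteC", "free_cpus": 500, "mpi": True,  "storage_gb": 1000, "rank_load": 0.7},
]

job = {"needs_cpus": 50, "needs_mpi": True, "needs_storage_gb": 500}

def matches(res, job):
    return (res["free_cpus"] >= job["needs_cpus"]
            and res["storage_gb"] >= job["needs_storage_gb"]
            and (res["mpi"] or not job["needs_mpi"]))

candidates = [r for r in resources if matches(r, job)]
best = min(candidates, key=lambda r: r["rank_load"])   # prefer the least-loaded site
print(best["site"])                                    # SiteB for this toy data
```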