Science.gov

Sample records for science grid experimental

  1. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    SciTech Connect

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  2. The Open Science Grid

    SciTech Connect

    Pordes, Ruth; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Wurthwein, Frank; Gardner, Rob; Wilde, Mike; /Chicago U. /Indiana U.

    2007-06-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing communities and engagement with new ones. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared and common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  3. New Science on the Open Science Grid

    SciTech Connect

    Pordes, Ruth; Altunay, Mine; Avery, Paul; Bejan, Alina; Blackburn, Kent; Blatecky, Alan; Gardner, Rob; Kramer, Bill; Livny, Miron; McGee, John; Potekhin, Maxim; /Fermilab /Florida U. /Chicago U. /Caltech /LBL, Berkeley /Wisconsin U., Madison /Indiana U. /Brookhaven /UC, San Diego

    2008-06-01

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. The transformation from the existing to the new frequently requires significant sociological and organizational changes. OSG leverages its deliverables to the large-scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement and the distributed facility. As a companion to the poster and tutorial at SciDAC 2008, this paper gives both a brief general description and some specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  4. Space-based Science Operations Grid Prototype

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Welch, Clara L.; Redman, Sandra

    2004-01-01

    science experimenters. There is an international aspect to the Grid involving the America's Pathway (AMPath) network, the Chilean REUNA Research and Education Network and the University of Chile in Santiago that will further demonstrate how extensively these services can be used. From the user's perspective, the Prototype will provide a single interface and logon to these varied services without the complexity of knowing the wheres and hows of each service. There is a separate and deliberate emphasis on security. Security will be addressed by specifically outlining the different approaches and tools used. Grid technology, unlike the Internet, is being designed with security in mind. In addition, we will show the locations, configurations and network paths associated with each service and virtual organization. We will discuss the separate virtual organizations that we define for the varied user communities. These will include certain, as yet undetermined, space-based science functions and/or processes and will include specific virtual organizations required for public and educational outreach and science and engineering collaboration. We will also discuss the Grid Prototype performance and the potential for further Grid applications to both space-based and ground-based projects and processes. In this paper and presentation we will detail each service and how they are integrated using Grid

  5. Grid for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Petitdidier, Monique; Schwichtenberg, Horst

    2013-04-01

    Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change and new energies. The main critical point is that, on one hand, civil society and the public ask for certainties, i.e. precise values with a small error range, concerning predictions at short, medium and long term in all domains; on the other hand, Science can mainly answer only in terms of probability of occurrence. To improve the answers or decrease the uncertainties, (1) new observational networks have been deployed in order to have better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites. Following the OECD recommendations on the openness of research and public sector data, more and more data are available to academic organisations and SMEs; (2) new algorithms and methodologies have been developed to handle the huge data processing and assimilation into simulations, using new technologies and compute resources. Finally, our total knowledge about the complex Earth system is contained in models and measurements; how we put them together has to be managed cleverly. The technical challenge is to put together databases and computing resources to answer the ES challenges. However, all these applications are computationally intensive. Different compute solutions are available, depending on the characteristics of the applications. One of them is the Grid, which is especially efficient for independent or embarrassingly parallel jobs related to statistical and parametric studies. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity have been deployed successfully on the Grid. In order to fulfill the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. The Grid has permitted via a huge number of runs to
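
    The point about the Grid being well suited to independent or embarrassingly parallel jobs can be made concrete with a minimal sketch. The sketch below is not from the paper: run_simulation is a hypothetical placeholder for any Earth Science model run, and a local process pool merely stands in for grid worker nodes.

        # Minimal sketch of an embarrassingly parallel parametric study.
        # run_simulation is a hypothetical placeholder for one model run;
        # on a real grid each parameter set would become an independent job.
        from concurrent.futures import ProcessPoolExecutor
        from itertools import product

        def run_simulation(params):
            rainfall, friction = params
            # Placeholder "model": any self-contained computation goes here.
            return {"rainfall": rainfall, "friction": friction,
                    "peak_flow": rainfall * (1.0 - friction)}

        if __name__ == "__main__":
            # Parameter grid: every combination is an independent run.
            param_sets = list(product([10, 20, 50, 100], [0.1, 0.2, 0.3]))
            with ProcessPoolExecutor() as pool:   # stands in for grid workers
                results = list(pool.map(run_simulation, param_sets))
            print(len(results), "independent runs completed")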

  6. Enabling Campus Grids with Open Science Grid Technology

    NASA Astrophysics Data System (ADS)

    Weitzel, Derek; Bockelman, Brian; Fraser, Dan; Pordes, Ruth; Swanson, David

    2011-12-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.
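
    As an illustration of the kind of bridging described above, the sketch below writes an HTCondor grid-universe submit description that forwards a job to a non-Condor (PBS) cluster and hands it to condor_submit. This is a minimal sketch, not the Holland Computing Center's actual bridging daemon; the host name and user are hypothetical, and it assumes an HTCondor installation where the "batch" grid type (as used by Bosco-style bridging) is available.

        # Sketch: route a job to a non-Condor (PBS) cluster through HTCondor's
        # grid universe, roughly the mechanism a campus bridge builds on.
        # Assumptions: condor_submit is on PATH and the "batch pbs" grid type
        # is configured; user@hcc-cluster.example.edu is a hypothetical endpoint.
        import subprocess, textwrap

        submit_description = textwrap.dedent("""\
            universe      = grid
            grid_resource = batch pbs user@hcc-cluster.example.edu
            executable    = analysis.sh
            output        = job.out
            error         = job.err
            log           = job.log
            queue
        """)

        with open("bridge_job.sub", "w") as f:
            f.write(submit_description)

        subprocess.run(["condor_submit", "bridge_job.sub"], check=True)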

  7. Neutron Science TeraGrid Gateway

    SciTech Connect

    Lynch, Vickie E; Chen, Meili; Cobb, John W; Kohl, James Arthur; Miller, Stephen D; Speirs, David A; Vazhkudai, Sudharshan S

    2010-01-01

    The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge, its principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  8. Neutron Science TeraGrid Gateway

    NASA Astrophysics Data System (ADS)

    Lynch, Vickie; Chen, Meili; Cobb, John; Kohl, Jim; Miller, Steve; Speirs, David; Vazhkudai, Sudharshan

    2010-11-01

    The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge, its principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  9. Trends in life science grid: from computing grid to knowledge grid

    PubMed Central

    Konagaya, Akihiko

    2006-01-01

    Background Grid computing has great potential to become a standard cyberinfrastructure for life sciences, which often require high-performance computing and large-scale data handling that exceed the computing capacity of a single institution. Results This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities that share tacit knowledge. Conclusion By extending the concept of the grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:17254294

  10. Changing from computing grid to knowledge grid in life-science grid.

    PubMed

    Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy

    2009-09-01

    Grid computing has great potential to become a standard cyberinfrastructure for life sciences that often require high-performance computing and large-scale data handling, which exceeds the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to a large amount of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in a library and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations - for example, grid computing. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities that share tacit knowledge. By extending the concept of the grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:19579217

  11. TeraGrid Gateways for Earth Science

    NASA Astrophysics Data System (ADS)

    Wilkins-Diehr, Nancy

    2010-05-01

    The increasingly digital component of science today poses exciting challenges and opportunities for researchers. Whether it's streaming data from sensors to computations, tagging video in the study of language patterns, or using geographic information systems to anticipate the spread of disease, the challenges are enormous and continue to grow. The existence of advanced cyberinfrastructure (CI) tools, or science gateways, can significantly increase the productivity of researchers facing the most difficult challenges - in some cases making the impossible possible. The TeraGrid Science Gateways program works to incorporate high-end resources through these community-designed interfaces. This talk will present an overview of TeraGrid's gateway program and highlight several gateways in atmospheric science, earth sciences, geography and regional science, geophysics, global atmospheric research, materials research and seismology.

  12. Parallel Grid Manipulations in Earth Science Calculations

    NASA Technical Reports Server (NTRS)

    Sawyer, W.; Lucchesi, R.; daSilva, A.; Takacs, L. L.

    1999-01-01

    sparse interpolation with little data locality between the physical lat-lon grid and a pole-rotated computational grid - can be solved efficiently and at the GFlop/s rates needed to solve tomorrow's high-resolution earth science models. In the subsequent presentation we will discuss the design and implementation of PILGRIM as well as a number of the problems it is required to solve. Some conclusions will be drawn about the potential performance of the overall earth science models on the supercomputer platforms foreseen for these problems.
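
    To make "sparse interpolation with little data locality" concrete: regridding between the lat-lon grid and the rotated computational grid can be viewed as multiplication by a sparse matrix whose nonzeros reference source points scattered across processors. The sketch below is illustrative only (it is not PILGRIM) and uses random stencils and weights purely to show the structure of the operation.

        # Illustrative sketch: regridding as a sparse matrix-vector product.
        # Each target point depends on a handful of scattered source points,
        # which is why the parallel version needs irregular communication.
        import numpy as np
        from scipy.sparse import csr_matrix

        n_src, n_tgt, stencil = 10000, 8000, 4   # toy sizes, 4-point stencil
        rng = np.random.default_rng(0)

        rows = np.repeat(np.arange(n_tgt), stencil)
        cols = rng.integers(0, n_src, size=n_tgt * stencil)  # scattered sources
        w = rng.random(n_tgt * stencil)
        w /= w.reshape(n_tgt, stencil).sum(axis=1).repeat(stencil)  # normalize rows

        W = csr_matrix((w, (rows, cols)), shape=(n_tgt, n_src))
        field_src = rng.random(n_src)            # field on the lat-lon grid
        field_tgt = W @ field_src                # interpolated onto rotated grid
        print(field_tgt.shape)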

  13. Grids for Dummies: Featuring Earth Science Data Mining Application

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2002-01-01

    This viewgraph presentation discusses the concept and advantages of linking computers together into data grids, an emerging technology for managing information across institutions, and potential users of data grids. The logistics of access to a grid, including the use of the World Wide Web to access grids, and security concerns are also discussed. The potential usefulness of data grids to the earth science community is also discussed, as well as the Global Grid Forum, and other efforts to establish standards for data grids.

  14. The Neutron Science TeraGrid Gateway, a TeraGrid Science Gateway to Support the Spallation Neutron Source

    SciTech Connect

    Cobb, John W; Geist, Al; Kohl, James Arthur; Miller, Stephen D; Peterson, Peter F; Pike, Gregory; Reuter, Michael A; Swain, William; Vazhkudai, Sudharshan S; Vijayakumar, Nithya N

    2006-01-01

    The National Science Foundation's (NSF's) Extensible Terascale Facility (ETF), or TeraGrid [1], is entering its operational phase. One ETF science gateway effort is the Neutron Science TeraGrid Gateway (NSTG). The Oak Ridge National Laboratory (ORNL) resource provider effort (ORNL-RP), during construction and now in operations, is bridging a large-scale experimental community and the TeraGrid as a large-scale national cyberinfrastructure. Of particular emphasis is collaboration with the Spallation Neutron Source (SNS) at ORNL. The U.S. Department of Energy's (DOE's) SNS [2] at ORNL will be commissioned in spring of 2006 as the world's brightest source of neutrons. Neutron science users can run experiments; generate datasets; perform data reduction and analysis; visualize results; collaborate with remote users; and archive long-term data in repositories with curation services. The ORNL-RP and the SNS data analysis group have spent 18 months developing and exploring user requirements, including the creation of prototypical services such as facility portal, data, and application execution services. We describe results from these efforts and discuss implications for science gateway creation. Finally, we show incorporation into implementation planning for the NSTG and SNS architectures. The plan is for primarily portal-based user interaction supported by a service-oriented architecture for functional implementation.

  15. Technology for a NASA Space-Based Science Operations Grid

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.

    2003-01-01

    This viewgraph presentation provides an overview of a proposal to develop a space-based operations grid in support of space-based science experiments. The development of such a grid would provide a dynamic, secure and scalable architecture based on standards and next-generation reusable software and would enable greater science collaboration and productivity through the use of shared resources and distributed computing. The authors propose developing this concept for use on payload experiments carried aboard the International Space Station. Topics covered include: grid definitions, portals, grid development and coordination, grid technology and potential uses of such a grid.

  16. The Open Science Grid status and architecture

    SciTech Connect

    Pordes, Ruth; Petravick, Don; Kramer, Bill; Olsen, James D.; Livny, Miron; Roy, Gordon A.; Avery, Paul Ralph; Blackburn, Kent; Wenaus, Torre J.; Wuerthwein, Frank K.; Foster, Ian; /Chicago U. /Indiana U.

    2007-09-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the Worldwide LHC Computing Grid on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES) and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated, community-specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  17. The Open Science Grid status and architecture

    NASA Astrophysics Data System (ADS)

    Pordes, R.; Petravick, D.; Kramer, B.; Olson, D.; Livny, M.; Roy, A.; Avery, P.; Blackburn, K.; Wenaus, T.; Würthwein, F.; Foster, I.; Gardner, R.; Wilde, M.; Blatecky, A.; McGee, J.; Quick, R.

    2008-07-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the Worldwide LHC Computing Grid on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES) and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated, community-specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  18. European grid services for global earth science

    NASA Astrophysics Data System (ADS)

    Brewer, S.; Sipos, G.

    2012-04-01

    This presentation will provide an overview of the distributed computing services that the European Grid Infrastructure (EGI) offers to the Earth Sciences community and also explain the processes whereby Earth Science users can engage with the infrastructure. One of the main overarching goals for EGI over the coming year is to diversify its user base. EGI therefore - through the National Grid Initiatives (NGIs) that provide the bulk of resources that make up the infrastructure - offers a number of routes whereby users, either individually or as communities, can make use of its services. At one level there are two approaches to working with EGI: users can make use of existing resources and contribute to their evolution and configuration, or they can work with EGI, and hence the NGIs, to incorporate their own resources into the infrastructure and take advantage of EGI's monitoring, networking and managing services. Adopting the latter approach does not imply a loss of ownership of the resources. Both of these approaches are entirely applicable to the Earth Sciences community: the former because researchers within this field have been involved with EGI (and previously EGEE) as a Heavy User Community, and the latter because they have very specific needs, such as incorporating HPC services into their workflows, which will require multi-skilled interventions to fully provide such services. In addition to the technical support services that EGI has been offering for the last year or so - the applications database, the training marketplace and the Virtual Organisation services - there now exists a dynamic short-term project framework that can be utilised to establish and operate services for Earth Science users. During this talk we will present a summary of various on-going projects that will be of interest to Earth Science users, with the intention that suggestions for future projects will emerge from the subsequent discussions: • The Federated Cloud Task

  19. AstroGrid-D: Grid technology for astronomical science

    NASA Astrophysics Data System (ADS)

    Enke, Harry; Steinmetz, Matthias; Adorf, Hans-Martin; Beck-Ratzka, Alexander; Breitling, Frank; Brüsemeister, Thomas; Carlson, Arthur; Ensslin, Torsten; Högqvist, Mikael; Nickelt, Iliya; Radke, Thomas; Reinefeld, Alexander; Reiser, Angelika; Scholl, Tobias; Spurzem, Rainer; Steinacker, Jürgen; Voges, Wolfgang; Wambsganß, Joachim; White, Steve

    2011-02-01

    We present the status and results of AstroGrid-D, a joint effort of astrophysicists and computer scientists to employ grid technology for scientific applications. AstroGrid-D provides access to a network of distributed machines with a set of commands as well as software interfaces. It allows simple use of compute and storage facilities and makes it possible to schedule or monitor compute tasks and data management. It is based on the Globus Toolkit middleware (GT4). Chapter 1 describes the context which led to the demand for advanced software solutions in Astrophysics, and we state the goals of the project. We then present characteristic astrophysical applications that have been implemented on AstroGrid-D in Chapter 2. We describe simulations of different complexity, compute-intensive calculations running on multiple sites (Section 2.1), and advanced applications for specific scientific purposes (Section 2.2), such as a connection to robotic telescopes (Section 2.2.3). These examples show how grid execution improves, for example, the scientific workflow. Chapter 3 explains the software tools and services that we adapted or newly developed. Section 3.1 focuses on the administrative aspects of the infrastructure, to manage users and monitor activity. Section 3.2 characterises the central components of our architecture: the AstroGrid-D information service to collect and store metadata, a file management system, the data management system, and a job manager for automatic submission of compute tasks. We summarise the successfully established infrastructure in Chapter 4, concluding with our future plans to establish AstroGrid-D as a platform of modern e-Astronomy.

  20. Virtual Experiments on the Neutron Science TeraGrid Gateway

    NASA Astrophysics Data System (ADS)

    Lynch, V. E.; Cobb, J. W.; Farhi, E.; Miller, S. D.; Taylor, M.

    The TeraGrid's outreach effort to the neutron science community is creating an environment that is encouraging the exploration of advanced cyberinfrastructure being incorporated into facility operations in a way that leverages facility operations to multiply the scientific output of its users, including many NSF supported scientists in many disciplines. The Neutron Science TeraGrid Gateway serves as an exploratory incubator for several TeraGrid projects. Virtual neutron scattering experiments from one exploratory project will be highlighted.

  1. Virtual Experiments on the Neutron Science TeraGrid Gateway

    SciTech Connect

    Lynch, Vickie E; Cobb, John W; Farhi, Emmanuel N; Miller, Stephen D; Taylor, M

    2008-01-01

    The TeraGrid's outreach effort to the neutron science community is creating an environment that is encouraging the exploration of advanced cyberinfrastructure being incorporated into facility operations in a way that leverages facility operations to multiply the scientific output of its users, including many NSF supported scientists in many disciplines. The Neutron Science TeraGrid Gateway serves as an exploratory incubator for several TeraGrid projects. Virtual neutron scattering experiments from one exploratory project will be highlighted.

  2. Public storage for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Levshina, T.; Guru, A.

    2014-06-01

    The Open Science Grid infrastructure doesn't provide efficient means to manage the public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators and VO support personnel is required to allocate or rescind storage space. One of the main requirements for the Public Storage implementation is that it should use the SRM or GridFTP protocols to access the Storage Elements provided by the OSG sites and not put any additional burden on sites. By policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data management workflow includes pre-staging common data on sites before a job's execution, then storing the output data produced by a job on a worker node for subsequent download to a local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System (iRODS) developed at RENCI as a front-end service to the OSG SEs. The current architecture, state of deployment and performance test results will be discussed. We will also provide examples of current usage of the system by beta users.
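
    The data-handling step described above (uploading worker-node output to a Storage Element and remembering where it went) can be sketched as follows. This is not the proposed iRODS-based service; it is a minimal illustration that assumes the GridFTP client globus-url-copy is installed, a valid grid proxy exists, and gsiftp://se.example.org/osg/store/ is a hypothetical Storage Element path the job may write to.

        # Sketch: push a job's output file to a Storage Element over GridFTP
        # and record its location for later retrieval. Hypothetical endpoint;
        # assumes a valid grid proxy and globus-url-copy on PATH.
        import json, os, subprocess

        LOCAL_FILE = "output.root"
        SE_URL = "gsiftp://se.example.org/osg/store/user/alice/output.root"

        subprocess.run(
            ["globus-url-copy", "file://" + os.path.abspath(LOCAL_FILE), SE_URL],
            check=True,
        )

        # Keep track of where the data ended up so it can be downloaded later.
        with open("replica_catalog.json", "a") as cat:
            cat.write(json.dumps({"lfn": LOCAL_FILE, "pfn": SE_URL}) + "\n")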

  3. Migrating Open Science Grid to RPMs

    NASA Astrophysics Data System (ADS)

    Roy, Alain

    2012-12-01

    We recently completed a significant transition in the Open Science Grid (OSG) in which we moved our software distribution mechanism from the useful but niche system called Pacman to a community-standard native package system, RPM. In this paper we explore some of the lessons learned during this transition as well as our earlier work, lessons that we believe are valuable not only for software distribution and packaging, but also for software engineering in a distributed computing environment where reliability is critical. We discuss the benefits found in moving to a community standard, including the abilities to reuse existing packaging, to donate existing packaging back to the community, and to leverage existing skills in the community. We describe our approach to testing in which we test our software against multiple versions of the OS, including pre-releases of the OS, in order to find surprises before our users do. Finally, we discuss our large-scale evaluation testing and community testing, which are essential for both quality and community acceptance.

  4. Open computing grid for molecular science and engineering.

    PubMed

    Sild, Sulev; Maran, Uko; Lomaka, Andre; Karelson, Mati

    2006-01-01

    Grid is an emerging infrastructure for distributed computing that provides secure and scalable mechanisms for discovering and accessing remote software and data resources. Applications built on this infrastructure have great potential for addressing and solving large-scale chemical, pharmaceutical, and materials science problems. The article describes the concept behind grid computing and presents the OpenMolGRID system, an open computing grid for molecular science and engineering. This system provides grid-enabled components, such as a data warehouse for chemical data, software for building QSPR/QSAR models, and molecular engineering tools for generating compounds with predefined chemical properties or biological activities. The article also provides an overview of the availability of chemical applications on the grid. PMID:16711713

  5. Direct experimental determination of Frisch grid inefficiency in ionization chamber

    NASA Astrophysics Data System (ADS)

    Khriachkov, V. A.; Goverdovski, A. A.; Ketlerov, V. V.; Mitrofanov, V. F.; Semenova, N. N.

    1997-07-01

    The present work describes a method for the direct experimental determination of the Frisch grid inefficiency in an ionization chamber. The method is based on analysis of the anode signal recorded with a waveform digitizer. It is shown that calculated grid inefficiency values can differ considerably from the measured ones.

  6. Pilot job accounting and auditing in Open Science Grid

    SciTech Connect

    Sfiligoi, Igor; Green, Chris; Quinn, Greg; Thain, Greg; /Wisconsin U., Madison

    2008-06-01

    The Grid accounting and auditing mechanisms were designed under the assumption that users would submit their jobs directly to the Grid gatekeepers. However, many groups are starting to use pilot-based systems, where users submit jobs to a centralized queue from which they are subsequently transferred to the Grid resources by the pilot infrastructure. While this approach greatly improves the user experience, it does disrupt the established accounting and auditing procedures. Open Science Grid deploys gLExec on the worker nodes to keep the pilot-related accounting and auditing information and centralizes the accounting collection with GRATIA.

  7. Grid Technology as a Cyber Infrastructure for Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    This paper describes how grids and grid service technologies can be used to develop an infrastructure for the Earth Science community. This cyberinfrastructure would be populated with a hierarchy of services, including discipline-specific services such as those needed by the Earth Science community as well as a set of core services that are needed by most applications. This core would include data-oriented services used for accessing and moving data as well as computer-oriented services used to broker access to resources and control the execution of tasks on the grid. The availability of such an Earth Science cyberinfrastructure would ease the development of Earth Science applications. With such a cyberinfrastructure, application workflows could be created to extract data from one or more of the Earth Science archives and then process it by passing it through various persistent services that are part of the persistent cyberinfrastructure, such as services to perform subsetting, reformatting, data mining and map projections.

  8. Nuclear test experimental science

    SciTech Connect

    Struble, G.L.; Middleton, C.; Bucciarelli, G.; Carter, J.; Cherniak, J.; Donohue, M.L.; Kirvel, R.D.; MacGregor, P.; Reid, S.

    1989-01-01

    This report discusses research being conducted at Lawrence Livermore Laboratory under the following topics: prompt diagnostics; experimental modeling, design, and analysis; detector development; streak-camera data systems; weapons supporting research.

  9. A Grid Metadata Service for Earth and Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni

    2010-05-01

    Critical challenges for climate modeling researchers are strongly connected with the increasingly complex simulation models and the huge quantities of datasets produced. Future trends in climate modeling will only increase computational and storage requirements. For this reason the ability to transparently access both computational and data resources for large-scale, complex climate simulations must be considered a key requirement for Earth Science and Environmental distributed systems. From the data management perspective, (i) the quantity of data will continuously increase, (ii) data will become more and more distributed and widespread, (iii) data sharing/federation will represent a key challenge among different sites distributed worldwide, and (iv) the potential community of users (large and heterogeneous) will be interested in discovering experimental results, searching metadata, browsing collections of files, comparing different results, displaying output, etc. A key element for carrying out data search and discovery, and for managing and accessing huge, distributed amounts of data, is the metadata handling framework. What we propose for the management of distributed datasets is the GRelC service (a data grid solution focusing on metadata management). Unlike classical approaches, the proposed data-grid solution is able to address scalability, transparency, security, efficiency and interoperability. The GRelC service we propose is able to provide access to metadata stored in different and widespread data sources (relational databases running on top of MySQL, Oracle, DB2, etc., leveraging SQL as query language, as well as XML databases - XIndice, eXist, and libxml2-based documents - adopting either XPath or XQuery), providing a strong data virtualization layer in a grid environment. Such a technological solution for distributed metadata management (i) leverages well-known, widely adopted standards (W3C, OASIS, etc.); (ii) supports role-based management (based on VOMS), which
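
    To illustrate what "SQL as query language ... adopting either XPath or XQuery" means in practice, the sketch below runs the same metadata lookup against a relational store and an XML document using only the Python standard library. It is a generic illustration, not the GRelC API; the table, element and attribute names are hypothetical.

        # Generic sketch: the same metadata question ("which datasets cover 2009?")
        # answered against a relational source (SQL) and an XML source (XPath-style
        # queries via ElementTree). Schema and element names are hypothetical.
        import sqlite3
        import xml.etree.ElementTree as ET

        # Relational metadata source.
        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE dataset (name TEXT, year INTEGER)")
        db.executemany("INSERT INTO dataset VALUES (?, ?)",
                       [("precip_eu", 2009), ("temp_global", 2010)])
        sql_hits = [r[0] for r in
                    db.execute("SELECT name FROM dataset WHERE year = 2009")]

        # XML metadata source.
        xml_doc = """<catalog>
          <dataset name="aerosol_it" year="2009"/>
          <dataset name="runoff_de" year="2011"/>
        </catalog>"""
        root = ET.fromstring(xml_doc)
        xml_hits = [d.get("name") for d in root.findall(".//dataset[@year='2009']")]

        print(sql_hits + xml_hits)  # a virtualization layer would merge such results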

  10. Unlocking the potential of smart grid technologies with behavioral science

    PubMed Central

    Sintov, Nicole D.; Schultz, P. Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings. PMID:25914666

  11. Unlocking the potential of smart grid technologies with behavioral science.

    PubMed

    Sintov, Nicole D; Schultz, P Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings. PMID:25914666

  12. ISS Space-Based Science Operations Grid for the Ground Systems Architecture Workshop (GSAW)

    NASA Technical Reports Server (NTRS)

    Welch, Clara; Bradford, Bob

    2003-01-01

    Contents include the following: What is a grid? Benefits of a grid to space-based science operations. Our approach. Scope of the prototype grid. The security question. Short-term objectives. Long-term objectives. Space-based services required for operations. The prototype. Scope of the prototype grid. Prototype service layout. Space-based science grid service components.

  13. Optimal response to attacks on the open science grids.

    SciTech Connect

    Altunay, M.; Leyffer, S.; Linderoth, J. T.; Xie, Z.

    2011-01-01

    Cybersecurity is a growing concern, especially in open grids, where attack propagation is easy because of prevalent collaborations among thousands of users and hundreds of institutions. The collaboration rules that typically govern large science experiments as well as social networks of scientists span institutional security boundaries. A common concern is that the increased openness may allow malicious attackers to spread more readily around the grid. We consider how to optimally respond to attacks in open grid environments. To show how and why attacks spread more readily around the grid, we first discuss how collaborations manifest themselves in the grids and form the collaboration network graph, and how this collaboration network graph affects the security threat levels of grid participants. We present two mixed-integer programming (MIP) models to find the optimal response to attacks in open grid environments, and also calculate the threat level associated with each grid participant. Given an attack scenario, our optimal response model aims to minimize the threat levels at unaffected participants while maximizing the uninterrupted scientific production (continuing collaborations). By adopting some of the collaboration rules (e.g., suspending a collaboration or shutting down a site), the model finds the optimal response to subvert an attack scenario.
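
    A schematic formulation helps make the trade-off concrete. The following is not the authors' model but an illustrative sketch under assumed notation: binary variables decide which collaboration edges to suspend, continuous variables track threat levels, and the objective balances residual threat against lost scientific production.

        % Illustrative attack-response MIP sketch (not the paper's model).
        % x_{ij} = 1 if collaboration edge (i,j) is suspended, t_i = threat at site i,
        % A = set of compromised sites, p_{ij} in [0,1] = propagation factor,
        % w_{ij} = scientific value of collaboration (i,j), lambda = trade-off weight.
        \begin{align*}
        \min_{x,\,t}\quad & \sum_{i \notin A} t_i \;+\; \lambda \sum_{(i,j) \in E} w_{ij}\, x_{ij} \\
        \text{s.t.}\quad  & t_i \;\ge\; p_{ij}\, t_j \;-\; x_{ij} && \forall\, (i,j) \in E \\
                          & t_i \;=\; 1 && \forall\, i \in A \\
                          & x_{ij} \in \{0,1\}, \quad 0 \le t_i \le 1 &&
        \end{align*}
        % Setting x_{ij} = 1 switches off the propagation constraint on edge (i,j)
        % (valid big-M with M = 1 since t_j <= 1) at the cost of the lost value w_{ij}.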

  14. Data Grid tools: enabling science on big distributed data

    NASA Astrophysics Data System (ADS)

    Allcock, Bill; Chervenak, Ann; Foster, Ian; Kesselman, Carl; Livny, Miron

    2005-01-01

    A particularly demanding and important challenge that we face as we attempt to construct the distributed computing machinery required to support SciDAC goals is the efficient, high-performance, reliable, secure, and policy-aware management of large-scale data movement. This problem is fundamental to diverse application domains including experimental physics (high energy physics, nuclear physics, light sources), simulation science (climate, computational chemistry, fusion, astrophysics), and large-scale collaboration. In each case, highly distributed user communities require high-speed access to valuable data, whether for visualization or analysis. The quantities of data involved (terabytes to petabytes), the scale of the demand (hundreds or thousands of users, data-intensive analyses, real-time constraints), and the complexity of the infrastructure that must be managed (networks, tertiary storage systems, network caches, computers, visualization systems) make the problem extremely challenging. Data management tools developed under the auspices of the SciDAC Data Grid Middleware project have become the de facto standard for data management in projects worldwide. Day in and day out, these tools provide the "plumbing" that allows scientists to do more science on an unprecedented scale in production environments.

  15. An Experimental Framework for Executing Applications in Dynamic Grid Environments

    NASA Technical Reports Server (NTRS)

    Huedo, Eduardo; Montero, Ruben S.; Llorente, Ignacio M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Grid opens up opportunities for resource-starved scientists and engineers to harness highly distributed computing resources. A number of Grid middleware projects are currently available to support the simultaneous exploitation of heterogeneous resources distributed across different administrative domains. However, efficient job submission and management continue to be far from accessible to ordinary scientists and engineers due to the dynamic and complex nature of the Grid. This report describes a new Globus framework that allows an easier and more efficient execution of jobs in a 'submit and forget' fashion. Adaptation to dynamic Grid conditions is achieved by supporting automatic application migration following performance degradation, 'better' resource discovery, requirement change, owner decision or remote resource failure. The report also includes experimental results of the behavior of our framework on the TRGP testbed.
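
    The 'submit and forget' behaviour with automatic migration can be summarised as a control loop. The sketch below is a schematic reconstruction, not the Globus-based framework itself: discover_resources, submit, monitor and cancel are hypothetical placeholders for the corresponding middleware calls, and the migration triggers simply mirror those listed in the abstract.

        # Schematic "submit and forget" loop with migration on performance
        # degradation, better-resource discovery, or remote failure.
        # discover_resources/submit/monitor/cancel are hypothetical stand-ins
        # for the underlying grid middleware calls.
        import time

        def run_adaptively(job, discover_resources, submit, monitor, cancel,
                           poll_seconds=60):
            resource = max(discover_resources(job), key=lambda r: r.rank)
            handle = submit(job, resource)
            while True:
                status = monitor(handle)          # e.g. DONE, RUNNING, FAILED
                if status.state == "DONE":
                    return status.result
                best = max(discover_resources(job), key=lambda r: r.rank)
                migrate = (
                    status.state == "FAILED"          # remote resource failure
                    or status.performance_degraded    # self-reported slowdown
                    or best.rank > resource.rank      # a "better" resource appeared
                )
                if migrate:
                    cancel(handle)                    # stop/checkpoint current run
                    resource, handle = best, submit(job, best)
                time.sleep(poll_seconds)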

  16. Enabling Science and Engineering Applications on the Grid

    SciTech Connect

    Seidel, Ed

    2004-08-25

    The Grid has the potential to fundamentally change the way science and engineering are done. The aggregate power of computing resources connected by networks - of the Grid - exceeds that of any single supercomputer by many orders of magnitude. At the same time, our ability to carry out computations of the scale and level of detail required, for example, to study the Universe or simulate a rocket engine, is severely constrained by available computing power. Hence, such applications should be one of the main driving forces behind the development of Grid computing. I will discuss some large-scale applications, including simulations of colliding black holes, and show how they are driving the development of Grid computing technology. Applications are already being developed that are not only aware of their needs, but also of the resources available to them on the Grid. They will be able to adapt themselves automatically to respond to their changing needs, to spawn off tasks on other resources, and to adapt to the changing characteristics of the Grid, including machine and network loads and availability. I will discuss a number of innovative scenarios for computing on the Grid enabled by such technologies, and demonstrate how close these are to being a reality.

  17. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential to revolutionize the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing for less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  18. Earth Science applications on Grid -advantages and limitations

    NASA Astrophysics Data System (ADS)

    Petitdidier, M.; Schwichtenberg, H.

    2012-04-01

    Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change and new energies. Our total knowledge about the complex Earth system is contained in models and measurements; how we put them together has to be managed cleverly. The technical challenge is to put together databases and computing resources to answer the ES challenges. The main critical point is that, on one hand, civil society and the public ask for certainties, i.e. precise values with a small error range, concerning predictions at short, medium and long term in all domains; on the other hand, Science can mainly answer only in terms of probability of occurrence. To improve the answers or decrease the uncertainties, (1) new observational networks have been deployed in order to have better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites; (2) new algorithms and methodologies have been developed using new technologies and compute resources. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity were deployed successfully on the Grid. In order to fulfill the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. The Grid has permitted uncertainties to be decreased by increasing the probability of occurrence via a larger number of runs. Some limitations are related to combining databases outside the grid infrastructure with grid compute resources, and to real-time applications that need resource reservation in order to ensure results at a given time. As a matter of fact, ES scientists use different compute resources according to the phase of their application; they are used to working in large projects and sharing their results. They need a service-oriented architecture and a platform of

  19. DZero data-intensive computing on the Open Science Grid

    SciTech Connect

    Abbott, B.; Baranovski, A.; Diesburg, M.; Garzoglio, G.; Kurca, T.; Mhashilkar, P.; /Fermilab

    2007-09-01

    High energy physics experiments periodically reprocess data in order to take advantage of improved understanding of the detector and the data processing code. Between February and May 2007, the DZero experiment reprocessed a substantial fraction of its dataset. This consisted of half a billion events, corresponding to about 100 TB of data, organized in 300,000 files. The activity utilized resources from sites around the world, including a dozen sites participating in the Open Science Grid consortium (OSG). About 1,500 jobs were run every day across the OSG, consuming and producing hundreds of gigabytes of data. Access to OSG computing and storage resources was coordinated by the SAM-Grid system. This system organized job access to a complex topology of data queues and job scheduling to clusters, using a SAM-Grid to OSG job forwarding infrastructure. For the first time in the lifetime of the experiment, a data-intensive production activity was managed on a general-purpose grid such as the OSG. This paper describes the implications of using OSG, where all resources are granted following an opportunistic model, the challenges of operating a data-intensive activity over such a large computing infrastructure, and the lessons learned throughout the project.
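
    As a rough consistency check on the quoted figures (this arithmetic is not from the paper), the dataset works out to roughly:

        \[
        \frac{100\ \text{TB}}{3\times 10^{5}\ \text{files}} \approx 330\ \text{MB per file},
        \qquad
        \frac{100\ \text{TB}}{5\times 10^{8}\ \text{events}} \approx 200\ \text{kB per event}.
        \]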

  20. Open Science Grid: Linking Universities and Laboratories In National Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Avery, Paul

    2011-10-01

    Open Science Grid is a consortium of researchers from universities and national laboratories that operates a national computing infrastructure serving large-scale scientific and engineering research. While OSG's scale has been primarily driven by the demands of the LHC experiments, it currently serves particle and nuclear physics, gravitational wave searches, digital astronomy, genomic science, weather forecasting, molecular modeling, structural biology and nanoscience. The OSG distributed computing facility links campus and regional computing resources and is a major component of the Worldwide LHC Computing Grid (WLCG) that handles the massive computing and storage needs of experiments at the Large Hadron Collider. This collaborative work has provided a wealth of results, including powerful new software tools and services; a uniform packaging scheme (the Virtual Data Toolkit) that simplifies software deployment across many sites in the US and Europe; integration of complex tools and services in large science applications; multiple education and outreach projects; and new approaches to integrating advanced network infrastructure in scientific computing applications. More importantly, OSG has provided unique collaborative opportunities between researchers in a variety of research disciplines.

  1. Using Computing and Data Grids for Large-Scale Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  2. Experimental results of an iodine plasma in PEGASES gridded thruster

    NASA Astrophysics Data System (ADS)

    Grondein, Pascaline; Aanesland, Ane

    2015-09-01

    In the electric gridded thruster PEGASES, both positive and negative ions are expelled after extraction from an ion-ion plasma. This ion-ion plasma is formed downstream of a localized magnetic field placed a few centimeters from the ionization region, which traps and cools down the electrons to allow better attachment to the electronegative gas. For this thruster concept, iodine has emerged as the most attractive option. Heavy and diatomic, and therefore good for high thrust, it also has a low ionization threshold and high electronegativity, which lead to high ion-ion densities and low RF power. After the proof of concept of PEGASES using SF6 as propellant, we present here experimental results for an iodine plasma studied inside the PEGASES thruster. Solid at standard temperature and pressure, the iodine is heated so that it sublimates, then injected into the chamber, where the neutral gas is heated and ionized. The whole injection system is heated to avoid deposition on surfaces, and a mass flow controller allows fine control of the neutral gas mass flow. A 3D translation stage inside the vacuum chamber allows volumetric plasma studies using electrostatic probes. The results are also compared with the global model dedicated to iodine as a propellant for electric gridded thrusters. This work has been done within the LABEX Plas@par project, and received financial state aid managed by the Agence Nationale de la Recherche as part of the programme "Investissements d'avenir".

  3. e-Science, caGrid, and Translational Biomedical Research

    PubMed Central

    Saltz, Joel; Kurc, Tahsin; Hastings, Shannon; Langella, Stephen; Oster, Scott; Ervin, David; Sharma, Ashish; Pan, Tony; Gurcan, Metin; Permar, Justin; Ferreira, Renato; Payne, Philip; Catalyurek, Umit; Caserta, Enrico; Leone, Gustavo; Ostrowski, Michael C.; Madduri, Ravi; Foster, Ian; Madhavan, Subhashree; Buetow, Kenneth H.; Shanbhag, Krishnakant; Siegel, Eliot

    2011-01-01

    Translational research projects target a wide variety of diseases, test many different kinds of biomedical hypotheses, and employ a large assortment of experimental methodologies. Diverse data, complex execution environments, and demanding security and reliability requirements make the implementation of these projects extremely challenging and require novel e-Science technologies. PMID:21311723

  4. Physical Science Laboratory Manual, Experimental Version.

    ERIC Educational Resources Information Center

    Cooperative General Science Project, Atlanta, GA.

    Provided are physical science laboratory experiments which have been developed and used as a part of an experimental one year undergraduate course in general science for non-science majors. The experiments cover a limited number of topics representative of the scientific enterprise. Some of the topics are pressure and buoyancy, heat, motion,…

  5. Grid infrastructure to support science portals for large scale instruments.

    SciTech Connect

    von Laszewski, G.; Foster, I.

    1999-09-29

    Soon, a new generation of scientific workbenches will be developed as a collaborative effort among various research institutions in the US. These scientific workbenches will be accessed in the Web via portals. Reusable components are needed to build such portals for different scientific disciplines, allowing uniform desktop access to remote resources. Such components will include tools and services enabling easy collaboration, job submission, job monitoring, component discovery, and persistent object storage. Based on experience gained from Grand Challenge applications for large-scale instruments, we demonstrate how Grid infrastructure components can be used to support the implementation of science portals. The availability of these components will simplify the prototype implementation of a common portal architecture.

  6. Commissioning the HTCondor-CE for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Cartwright, T.; Frey, J.; Fajardo, E. M.; Lin, B.; Selmeci, M.; Tannenbaum, T.; Zvada, M.

    2015-12-01

    The HTCondor-CE is the next-generation gateway software for the Open Science Grid (OSG). It is responsible for providing a network service which authorizes remote users and provides a resource provisioning service (other well-known gateways include Globus GRAM, CREAM, ARC-CE, and OpenStack Nova). Based on the venerable HTCondor software, this new CE is simply a highly-specialized configuration of HTCondor. It was developed and adopted to provide the OSG with more flexible, scalable, and easier-to-manage gateway software. Further, the focus of the HTCondor-CE is not job submission (as in GRAM or CREAM) but resource provisioning. This software does not exist in a vacuum: to deploy this gateway across the OSG, we had to integrate it with the CE configuration, deploy a corresponding information service, coordinate with sites, and overhaul our documentation.

  7. SCEC Earthworks: A TeraGrid Science Gateway

    NASA Astrophysics Data System (ADS)

    Francoeur, H.; Muench, J.; Okaya, D.; Maechling, P.; Deelman, E.; Mehta, G.

    2006-12-01

    SCEC Earthworks is a scientific gateway designed to provide community wide access to the TeraGrid. Earthworks provides its users with a portal based interface for easily running anelastic wave propagation (AWM) simulations. Using Gridsphere and several portlets developed as a collaborative effort with IRIS, Earthworks enables users to run simulations without any knowledge of the underlying workflow technology needed to utilize the TeraGrid. The workflow technology behind Earthworks has been developed as a collaborative effort between SCEC and the Information Sciences Institute (ISI). Earthworks uses a complex software stack to translate abstract workflows defined by the user into a series of jobs that run on a number of computational resources. These computational resources include a combination of servers provided by SCEC, USC High Performance Computing Center and NSF TeraGrid supercomputer facilities. Workflows are constructed after input from the user is passed via a Java based interface to the Earthworks backend, where a DAX (directed acyclic graph in XML) is generated. This DAX describes each step of the workflow including its inputs, outputs, and arguments, as well as the parent child relationships between each process. The DAX is then handed off to the Virtual Data System (VDS) and Pegasus provided by ISI, which translate it from an abstract workflow to a concrete workflow by filling in logical file and application names with their physical path and location. This newly created DAG (directed acyclic graph) is handed off to the Condor scheduler. The bottom part of the software stack is a Globus installation at each site that provides local transfer and resource management capabilities. Resources across different sites are transparently managed and tracked by VDS which allows greater flexibility in running the workflows. After a workflow is completed, products and metadata are registered with integrated data management tools. This allows for metadata querying
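
    The workflow idea described above, jobs with inputs, outputs, and arguments plus parent-child relationships, resolved into an executable order, can be illustrated with a small sketch. The class, job names, and executables below are hypothetical stand-ins; this is not the SCEC/Pegasus DAX schema or ISI code.

```python
from collections import defaultdict, deque

class Workflow:
    """Tiny abstract-workflow container: jobs plus parent->child edges.
    Hypothetical sketch, not the DAX/DAG machinery named in the record."""

    def __init__(self):
        self.jobs = {}                    # job_id -> {"exe": ..., "args": [...]}
        self.children = defaultdict(set)  # parent_id -> {child_id, ...}
        self.parents = defaultdict(set)   # child_id  -> {parent_id, ...}

    def add_job(self, job_id, exe, args):
        self.jobs[job_id] = {"exe": exe, "args": list(args)}

    def add_dependency(self, parent_id, child_id):
        self.children[parent_id].add(child_id)
        self.parents[child_id].add(parent_id)

    def topological_order(self):
        """Order jobs so every parent runs before its children."""
        indegree = {j: len(self.parents[j]) for j in self.jobs}
        ready = deque(j for j, d in indegree.items() if d == 0)
        order = []
        while ready:
            job = ready.popleft()
            order.append(job)
            for child in self.children[job]:
                indegree[child] -= 1
                if indegree[child] == 0:
                    ready.append(child)
        if len(order) != len(self.jobs):
            raise ValueError("workflow contains a cycle")
        return order

# Toy example: mesh generation -> wave propagation -> product generation.
wf = Workflow()
wf.add_job("mesh", "gen_mesh", ["--region", "LA"])
wf.add_job("awm", "run_awm", ["--mesh", "mesh.out"])
wf.add_job("post", "make_products", ["--in", "awm.out"])
wf.add_dependency("mesh", "awm")
wf.add_dependency("awm", "post")
print(wf.topological_order())  # ['mesh', 'awm', 'post']
```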

  8. Who Needs Plants? Science (Experimental).

    ERIC Educational Resources Information Center

    Ropeik, Bernard H.; Kleinman, David Z.

    The basic elective course in introductory botany is designed for secondary students who probably will not continue study in plant science. The objectives of the course are to help the student 1) identify, compare and differentiate types of plants; 2) identify plant cell structures; 3) distinguish between helpful and harmful plants; 4) predict…

  9. Secure Grid Services for Cooperative Work in Medicine and Life Science

    NASA Astrophysics Data System (ADS)

    Weisbecker, Anette; Falkner, Jürgen

    MediGRID provides a grid infrastructure to solve challenging problems in medicine and the life sciences by enhancing productivity and by enabling location-independent, interdisciplinary collaboration. The usage of grid technology has enabled the development of new applications and services for research in medicine and the life sciences. In order to enlarge the range of services and to reach a broader range of users, sustainable business models are needed. In Services@MediGRID, methods for monitoring, accounting, and billing that fulfil the high security demands of medicine and the life sciences will be developed. The different requirements of academic and industrial grid customers are also considered in order to establish sustainable business models for grid computing.

  10. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.
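
    As a rough illustration of the bookkeeping such a validation system performs, submitting synthetic jobs per site and workflow type and aggregating success/failure counts, here is a minimal hypothetical sketch. The site names, workflow catalogue, and the `run_synthetic_job` stub are invented and do not reflect the PanDA-based toolset described above.

```python
import random
from collections import Counter

# Hypothetical site list and synthetic workflow catalogue.
SITES = ["SITE_A", "SITE_B"]
WORKFLOWS = {
    "cpu_burn": {"tests": ["CE scheduling"]},
    "stage_io": {"tests": ["CE scheduling", "SE transfer"]},
}

def run_synthetic_job(site, workflow):
    """Stand-in for submitting a pilot-style job and waiting for its outcome."""
    return random.random() > 0.1      # pretend ~90% of jobs succeed

def validate(sites, workflows, jobs_per_combo=10):
    """Inject jobs for every (site, workflow) pair and tally the outcomes."""
    report = Counter()
    for site in sites:
        for name in workflows:
            for _ in range(jobs_per_combo):
                outcome = "ok" if run_synthetic_job(site, name) else "failed"
                report[(site, name, outcome)] += 1
    return report

for (site, wf, outcome), n in sorted(validate(SITES, WORKFLOWS).items()):
    print(f"{site:8s} {wf:10s} {outcome:7s} {n}")
```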

  11. Reputation, Pricing and the E-Science Grid

    NASA Astrophysics Data System (ADS)

    Anandasivam, Arun; Neumann, Dirk

    One of the fundamental aspects of efficient Grid usage is the optimization of resource allocation among the participants. However, this has not yet materialized. Each user is a self-interested participant trying to maximize his utility, where the utility is determined not only by the fastest completion time but by the prices as well. Future revenues are influenced by users' reputation. Reputation mechanisms help to build trust between loosely coupled and geographically distributed participants. Providers need an incentive not to selfishly cancel jobs or privilege their own jobs. In this chapter we present first an offline scheduling mechanism with a fixed price. Jobs are collected by a broker and scheduled to machines. The goal of the broker is to balance the load and to maximize the revenue in the network. Consumers can submit their jobs according to their preferences, while taking the incentives of the broker into account. This mechanism does not consider reputation. In a second step a reputation-based pricing mechanism for a simple but fair pricing of resources is analyzed. In e-Science, researchers do not appreciate idiosyncratic pricing strategies and policies; their interest lies in doing research in an efficient manner. Consequently, in our mechanism the price is tightly coupled to the reputation of a site to guarantee fairness of pricing and facilitate price determination. Furthermore, the price is not the only parameter, as completion time plays an important role when deadlines have to be met. We provide a flexible utility and decision model for every participant and analyze the outcome of our reputation-based pricing system via simulation.
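
    A minimal sketch of the kind of utility and decision model the chapter describes is shown below, assuming a hypothetical rule in which price scales with site reputation and a consumer maximizes utility subject to a deadline. The functional forms and numbers are illustrative, not the authors' model.

```python
# Hypothetical decision model: price coupled to reputation, consumer picks
# the site with the highest utility among offers that meet the deadline.
def site_price(base_price, reputation):
    """Price tightly coupled to reputation (reputation in [0, 1])."""
    return base_price * (0.5 + reputation)

def utility(budget, price, completion_time, deadline):
    if completion_time > deadline:
        return float("-inf")          # deadline violated: offer is worthless
    return (budget - price) + 1.0 / completion_time

sites = [
    {"name": "site1", "base_price": 10.0, "reputation": 0.9, "time": 4.0},
    {"name": "site2", "base_price": 10.0, "reputation": 0.4, "time": 3.0},
]
best = max(
    sites,
    key=lambda s: utility(20.0, site_price(s["base_price"], s["reputation"]), s["time"], 6.0),
)
print(best["name"])   # site2: cheaper and still meets the deadline
```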

  12. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on the Grid Analysis and Display System (GrADS), a practical tool for earth science visualization, are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  13. Diagramming the path of a seed coat fragment on experimental lint cleaner grid bars

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An experiment was run to determine how a seed coat fragment reacts after colliding with newly-designed grid bars mounted on a lint cleaner simulator. A high-speed video camera recorded the action that took place. Ten experimental grid bars were used in the test. The included angle of the sharp to...

  14. An overview of Grid portal technologies for the development of HMR science gateways

    NASA Astrophysics Data System (ADS)

    D'Agostino, D.

    2012-04-01

    Grid portals and related technologies represent an easy and transparent way for scientists to interact with Distributed Computing Infrastructures (DCIs) such as the Grid and the Cloud. Many toolkits and frameworks are available, both commercial and open source, but there is a lack of best practices, customization methodologies and dedicated high-level service repositories that allow the fast development of specialized scientific gateways in Europe. Starting from the US TeraGrid-XSEDE experience, in this contribution the most interesting portal toolkits and related European projects are analyzed with the perspective of developing a science gateway for the HMR community within the Distributed Research Infrastructure for Hydrometeorology (DRIHM) project.

  15. The Open Science Grid - Support for Multi-Disciplinary Team Science - the Adolescent Years

    NASA Astrophysics Data System (ADS)

    Bauerdick, Lothar; Ernst, Michael; Fraser, Dan; Livny, Miron; Pordes, Ruth; Sehgal, Chander; Würthwein, Frank; Open Science Grid

    2012-12-01

    As it enters adolescence the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools and deep experience. We discuss examples of these including: the pilot-job overlay concepts and technologies now in use throughout OSG and delivering 1.4 Million CPU hours/day; the role of campus infrastructures, built out from concepts of sharing across multiple local faculty clusters (already put to good use by many of the HEP Tier-2 sites in the US); the work towards the use of clouds and access to high throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communities with general tools derived from the experience of the parochial tools in HEP (integration of Globus Online, prototyping with IRODS, investigations into Wide Area Lustre). We will also review our activities and experiences as HTC Service Provider to the recently awarded NSF XD XSEDE project, the evolution of the US NSF TeraGrid project, and how we are extending the reach of HTC through this activity to the increasingly broad national cyberinfrastructure. We believe that a coordinated view of the HPC and HTC resources in the US will further expand their impact on scientific discovery.

  16. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  17. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high performance graphics workstations; and interoperable data standards and translators. To meet these needs, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. A Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  18. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  19. Experimenter's Laboratory for Visualized Interactive Science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Rodier, Daniel R.; Klemp, Marjorie K.

    1994-01-01

    ELVIS (Experimenter's Laboratory for Visualized Interactive Science) is an interactive visualization environment that enables scientists, students, and educators to visualize and analyze large, complex, and diverse sets of scientific data. It accomplishes this by presenting the data sets as 2-D, 3-D, color, stereo, and graphic images with movable and multiple light sources combined with displays of solid-surface, contours, wire-frame, and transparency. By simultaneously rendering diverse data sets acquired from multiple sources, formats, and resolutions and by interacting with the data through an intuitive, direct-manipulation interface, ELVIS provides an interactive and responsive environment for exploratory data analysis.

  20. Experimental Demonstration of a Self-organized Architecture for Emerging Grid Computing Applications on OBS Testbed

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Hong, Xiaobin; Wu, Jian; Lin, Jintong

    As Grid computing continues to gain popularity in the industry and research community, it also attracts more attention at the customer level. The large number of users and the high frequency of job requests in the consumer market make this challenging. Clearly, the current Client/Server (C/S)-based architectures will become infeasible for supporting large-scale Grid applications due to their poor scalability and poor fault tolerance. In this paper, based on our previous works [1, 2], a novel self-organized architecture to realize a highly scalable and flexible platform for Grids is proposed. Experimental results show that this architecture is suitable and efficient for consumer-oriented Grids.

  1. Analysis of the current use, benefit, and value of the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Pordes, R.; Open Science Grid Executive Board, the; Weichel, J.

    2010-04-01

    The Open Science Grid usage has ramped up more than 25% in the past twelve months due to both the increase in throughput of the core stakeholders - US LHC, LIGO and Run II - and the increase in usage by non-physics communities. It is important to understand the value that collaborative projects, such as the OSG, contribute to the scientific community. This needs to be cognizant of the environment of commercial cloud offerings, the evolving and maturing middleware for grid-based distributed computing, and the evolution in science and research dependence on computation. We present a first categorization of OSG value and an analysis across several different aspects of the Consortium's goals and activities. Lastly, we present some of the upcoming challenges of the LHC data analysis ramp-up and our ongoing contributions to the Worldwide LHC Computing Grid.

  2. Analysis of the Current Use, Benefit, and Value of the Open Science Grid

    SciTech Connect

    Pordes, R.; /Fermilab

    2009-04-01

    The Open Science Grid usage has ramped up more than 25% in the past twelve months due to both the increase in throughput of the core stakeholders - US LHC, LIGO and Run II - and the increase in usage by non-physics communities. It is important to understand the value that collaborative projects, such as the OSG, contribute to the scientific community. This needs to be cognizant of the environment of commercial cloud offerings, the evolving and maturing middleware for grid-based distributed computing, and the evolution in science and research dependence on computation. We present a first categorization of OSG value and an analysis across several different aspects of the Consortium's goals and activities. Lastly, we present some of the upcoming challenges of the LHC data analysis ramp-up and our ongoing contributions to the Worldwide LHC Computing Grid.

  3. AMP: a science-driven web-based application for the TeraGrid

    NASA Astrophysics Data System (ADS)

    Woitaszek, M.; Metcalfe, T.; Shorrock, I.

    The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.
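
    The separation the authors highlight, web-interface code kept cleanly apart from grid-interface code, can be sketched in a few lines of plain Python. The class, handler, target star, and frequency values below are hypothetical illustrations, not AMP's actual Django views or grid back end.

```python
# Hypothetical sketch of keeping the web layer and the grid-interface layer
# separate; the web handler never touches the compute resource directly.
class GridInterface:
    """Thin facade over the remote compute resource; web code only sees this."""

    def __init__(self):
        self._jobs = {}

    def submit(self, star_id, frequencies):
        job_id = f"job-{len(self._jobs) + 1}"
        self._jobs[job_id] = {"star": star_id, "freqs": list(frequencies), "state": "QUEUED"}
        return job_id

    def status(self, job_id):
        return self._jobs[job_id]["state"]

def handle_submit(request_params, grid):
    """Web-facing handler: parse the request, delegate to the grid layer,
    and return something serializable for the page or template to render."""
    job_id = grid.submit(request_params["star_id"], request_params["frequencies"])
    return {"job_id": job_id, "status": grid.status(job_id)}

grid = GridInterface()
print(handle_submit({"star_id": "HD 49933", "frequencies": [1657.0, 1714.3]}, grid))
```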

  4. The Grid Analysis and Display System (GRADS): A practical tool for Earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III

    1992-01-01

    We propose to develop and enhance a workstation-based grid analysis and display software system for Earth science data set browsing, sampling and manipulation. The system will be coupled to a supercomputer in a distributed computing environment for near real-time interaction between scientists and computational results.

  5. The Grid Analysis and Display System (GRADS): A practical tool for Earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III

    1993-01-01

    We propose to develop and enhance a workstation based grid analysis and display software system for Earth science dataset browsing, sampling and manipulation. The system will be coupled to a supercomputer in a distributed computing environment for near real-time interaction between scientists and computational results.

  6. Remote Job Testing for the Neutron Science TeraGrid Gateway

    SciTech Connect

    Lynch, Vickie E; Cobb, John W; Miller, Stephen D; Reuter, Michael A; Smith, Bradford C

    2009-01-01

    Remote job execution gives neutron science facilities access to high performance computing such as the TeraGrid. A scientific community can use community software with a community certificate and account through a common interface of a portal. Results show this approach is successful, but with more testing and problem solving, we expect remote job executions to become more reliable.

  7. Experimental Evaluation of Electric Power Grid Visualization Tools in the EIOC

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin; Dalton, Angela C.

    2009-12-01

    The present study follows an initial human factors evaluation of four electric power grid visualization tools and reports on an empirical evaluation of two of the four tools: Graphical Contingency Analysis, and Phasor State Estimator. The evaluation was conducted within specific experimental studies designed to measure the impact on decision making performance.

  8. The GENIUS Grid Portal and robot certificates: a new tool for e-Science

    PubMed Central

    Barbera, Roberto; Donvito, Giacinto; Falzone, Alberto; La Rocca, Giuseppe; Milanesi, Luciano; Maggi, Giorgio Pietro; Vicario, Saverio

    2009-01-01

    Background Grid technology is the computing model which allows users to share a wide plethora of distributed computational resources regardless of their geographical location. Up to now, the strict security policy required in order to access distributed computing resources has been a rather big limiting factor when trying to broaden the usage of Grids into a wide community of users. Grid security is indeed based on the Public Key Infrastructure (PKI) of X.509 certificates, and the procedure to get and manage those certificates is unfortunately not straightforward. A first step to make Grids more appealing for new users has recently been achieved with the adoption of robot certificates. Methods Robot certificates have recently been introduced to perform automated tasks on Grids on behalf of users. They are extremely useful, for instance, to automate grid service monitoring, data processing production, and distributed data collection systems. Basically, these certificates can be used to identify a person responsible for an unattended service or process acting as client and/or server. Robot certificates can be installed on a smart card and used behind a portal by everyone interested in running the related applications in a Grid environment using a user-friendly graphic interface. In this work, the GENIUS Grid Portal, powered by EnginFrame, has been extended in order to support the new authentication based on the adoption of these robot certificates. Results The work carried out and reported in this manuscript is particularly relevant for all users who are not familiar with personal digital certificates and the technical aspects of the Grid Security Infrastructure (GSI). The valuable benefits introduced by robot certificates in e-Science can thus be extended to users belonging to several scientific domains, providing an asset in raising Grid awareness among a wide number of potential users. Conclusion The adoption of Grid portals extended with robot certificates, can really

  9. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    SciTech Connect

    Livny, Miron; Shank, James; Ernst, Michael; Blackburn, Kent; Goasguen, Sebastien; Tuts, Michael; Gibbons, Lawrence; Pordes, Ruth; Sliz, Piotr; Deelman, Ewa; Barnett, William; Olson, Doug; McGee, John; Cowles, Robert; Wuerthwein, Frank; Gardner, Robert; Avery, Paul; Wang, Shaowen; Lincoln, David Swanson

    2015-02-11

    Under this SciDAC-2 grant the project’s goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and the European Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  10. Fully Automated Single-Zone Elliptic Grid Generation for Mars Science Laboratory (MSL) Aeroshell and Canopy Geometries

    NASA Technical Reports Server (NTRS)

    Kaul, Upender K.

    2008-01-01

    A procedure for generating smooth uniformly clustered single-zone grids using enhanced elliptic grid generation has been demonstrated here for the Mars Science Laboratory (MSL) geometries such as aeroshell and canopy. The procedure obviates the need for generating multizone grids for such geometries, as reported in the literature. This has been possible because the enhanced elliptic grid generator automatically generates clustered grids without manual prescription of decay parameters needed with the conventional approach. In fact, these decay parameters are calculated as decay functions as part of the solution, and they are not constant over a given boundary. Since these decay functions vary over a given boundary, orthogonal grids near any arbitrary boundary can be clustered automatically without having to break up the boundaries and the corresponding interior domains into various zones for grid generation.
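
    For orientation, classical elliptic grid generation can be reduced, in its simplest form, to Laplacian smoothing of the grid coordinates with fixed boundary points. The sketch below shows only that simplified step on a toy single-zone domain; it omits the clustering control (decay-function) terms that the enhanced generator described here computes automatically, and the domain and grid sizes are made up.

```python
import numpy as np

def laplace_smooth(x, y, iterations=200):
    """Laplacian smoothing of grid coordinates on a single zone: each interior
    node is relaxed toward the average of its four neighbours while boundary
    nodes stay fixed. No clustering/control (decay-function) terms here."""
    for _ in range(iterations):
        x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
        y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])
    return x, y

# Algebraic starting grid on a quarter annulus (a stand-in body-fitted domain).
ni, nj = 41, 21
xi = np.linspace(0.0, 1.0, ni)[:, None]    # varies along rows
eta = np.linspace(0.0, 1.0, nj)[None, :]   # varies along columns
r = 1.0 + eta                              # inner radius 1, outer radius 2
theta = 0.5 * np.pi * xi
x = r * np.cos(theta)
y = r * np.sin(theta)
x, y = laplace_smooth(x, y)
print(x.shape, y.shape)                    # (41, 21) twice
```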

  11. GENESIS SciFlo: Enabling Multi-Instrument Atmospheric Science Using Grid Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Tang, B.; Manipon, G.; Yunck, T.; Fetzer, E.; Braverman, A.; Dobinson, E.

    2004-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of web services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations will include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we are developing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and the Grid Computing standards (Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable web services and executable operators into a distributed computing flow (operator tree). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out

  12. Space science experimentation automation and support

    NASA Technical Reports Server (NTRS)

    Frainier, Richard J.; Groleau, Nicolas; Shapiro, Jeff C.

    1994-01-01

    This paper outlines recent work done at the NASA Ames Artificial Intelligence Research Laboratory on automation and support of science experiments on the US Space Shuttle in low earth orbit. Three approaches to increasing the science return of these experiments using emerging automation technologies are described: remote control (telescience), science advisors for astronaut operators, and fully autonomous experiments. The capabilities and limitations of these approaches are reviewed.

  13. ReSS: A Resource Selection Service for the Open Science Grid

    SciTech Connect

    Garzoglio, Gabriele; Levshina, Tanya; Mhashilkar, Parag; Timm, Steve; /Fermilab

    2008-01-01

    The Open Science Grid offers access to hundreds of computing and storage resources via standard Grid interfaces. Before the deployment of an automated resource selection system, users had to submit jobs directly to these resources. They would manually select a resource and specify all relevant attributes in the job description prior to submitting the job. The necessity of human intervention in resource selection and attribute specification hinders automated job management components from accessing OSG resources and is inconvenient for the users. The Resource Selection Service (ReSS) project addresses these shortcomings. The system integrates Condor technology, for the core match-making service, with the gLite CEMon component, for gathering and publishing resource information in the Glue Schema format. Each one of these components communicates over secure protocols via web services interfaces. The system is currently used in production on OSG by the DZero Experiment, the Engagement Virtual Organization, and the Dark Energy. It is also the resource selection service for the Fermilab Campus Grid, FermiGrid. ReSS is considered a lightweight solution to push-based workload management. This paper describes the architecture, performance, and typical usage of the system.
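
    A minimal sketch of ClassAd-style match-making, a job's requirements filtered against advertised resource attributes and the matches ranked, is shown below. The attribute names, ranking rule, and site list are hypothetical; this is not the ReSS or Condor code itself.

```python
# Hypothetical resource advertisements and a simple requirements/rank selector.
RESOURCES = [
    {"name": "site_a", "free_cpus": 120, "os": "linux", "storage_gb": 500, "queue_len": 12},
    {"name": "site_b", "free_cpus": 16,  "os": "linux", "storage_gb": 50,  "queue_len": 2},
]

def matches(job, resource):
    """Requirements expression: every job constraint must be satisfied."""
    return (resource["free_cpus"] >= job["min_cpus"]
            and resource["os"] == job["os"]
            and resource["storage_gb"] >= job["min_storage_gb"])

def select_resource(job, resources):
    """Filter by requirements, then rank: shorter queue first, more CPUs second."""
    candidates = [r for r in resources if matches(job, r)]
    candidates.sort(key=lambda r: (r["queue_len"], -r["free_cpus"]))
    return candidates[0]["name"] if candidates else None

job = {"min_cpus": 8, "os": "linux", "min_storage_gb": 20}
print(select_resource(job, RESOURCES))   # site_b (shortest queue among matches)
```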

  14. Grid Technology as a Cyberinfrastructure for Delivering High-End Services to the Earth and Space Science Community

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    Grid technology consists of middleware that permits distributed computations, data and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and then can utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises that are under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means for implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Service Description Language, which provides a description of the interface and how other applications can access it. The emerging Semantic Grid work seeks to associate sufficient semantic information with each grid service such that applications will be able to automatically select, compose and, if necessary, substitute available equivalent services in order to assemble collections of services that are most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform those applications that have community-wide applicability into persistent grid services that are made widely available to their respective communities. In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid

  15. Ultrasonic Technique for Experimental Investigation of Statistical Characteristics of Grid Generated Turbulence.

    NASA Astrophysics Data System (ADS)

    Andreeva, Tatiana; Durgin, William

    2001-11-01

    This paper focuses on ultrasonic measurements of a grid-generated turbulent flow using the travel-time technique. In the present work an attempt to describe a turbulent flow by means of statistics of ultrasound wave propagation time is undertaken in combination with the Kolmogorov (2/3)-power law. There are two objectives in the current research work. The first is to demonstrate an application of the travel-time ultrasonic technique for data acquisition in the grid-generated turbulence produced in a wind tunnel. The second is to use the experimental data to verify or refute the analytically obtained expression for travel-time dispersion as a function of velocity fluctuation metrics. The theoretical analysis and derivation of that formula are based on Kolmogorov theory. The series of experiments was conducted at different wind speeds and distances from the grid, giving rise to different values of the dimensional turbulence characteristic coefficient K. Theoretical analysis based on the experimental data reveals a strong dependence of the turbulence characteristic K on the mean wind velocity. Tabulated values of the turbulence characteristic coefficient may be used to further understand the effect of turbulence on sound propagation.
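
    For reference, the Kolmogorov two-thirds law invoked here states that, in the inertial range, the second-order velocity structure function scales with the two-thirds power of the separation (standard textbook form, not the paper's travel-time expression):

```latex
\left\langle \left[ v(\mathbf{x}+\mathbf{r}) - v(\mathbf{x}) \right]^{2} \right\rangle
  = C\,\varepsilon^{2/3}\, r^{2/3}, \qquad \eta \ll r \ll L,
```

    where \varepsilon is the mean energy dissipation rate, C is a constant of order unity, \eta is the Kolmogorov microscale, and L is the integral scale.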

  16. Thermoplastic Composites Reinforced with Textile Grids: Development of a Manufacturing Chain and Experimental Characterisation

    NASA Astrophysics Data System (ADS)

    Böhm, R.; Hufnagl, E.; Kupfer, R.; Engler, T.; Hausding, J.; Cherif, C.; Hufenbach, W.

    2013-12-01

    A significant improvement in the properties of plastic components can be achieved by introducing flexible multiaxial textile grids as reinforcement. This reinforcing concept is based on the layerwise bonding of biaxially or multiaxially oriented, completely stretched filaments of high-performance fibers, e.g. glass or carbon, and thermoplastic components, using modified warp knitting techniques. Such pre-consolidated grid-like textiles are particularly suitable for use in injection moulding, since the grid geometry is very robust with respect to flow pressure and temperature on the one hand and possesses an adjustable spacing to enable a complete filling of the mould cavity on the other hand. The development of pre-consolidated textile grids and their further processing into composites form the basis for providing tailored parts with a large number of additional integrated functions like fibrous sensors or electroconductive fibres. Composites reinforced in that way allow new product groups for promising lightweight structures to be opened up in the future. The article describes the manufacturing process of this new composite class and its variability regarding reinforcement and function integration. An experimentally based study of the mechanical properties is performed. For this purpose, quasi-static and highly dynamic tensile tests have been carried out as well as impact penetration experiments. The reinforcing potential of the multiaxial grids is demonstrated by evaluating drop-tower experiments on automotive components. It has been shown that the load-adapted reinforcement enables a significant local or global improvement of the properties of plastic components depending on industrial requirements.

  17. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results

    PubMed Central

    Humada, Ali M.; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M.; Ahmed, Mushtaq N.

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent IL, the reverse diode saturation current Io, and the diode ideality factor n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, as well as the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable in terms of the installation of a grid-connected PV system in fluctuating climatic conditions. PMID:27035575
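
    The three-parameter model named here (IL, Io, n) corresponds to the ideal single-diode equation I = IL - Io(exp(V/(n Ns Vt)) - 1). The sketch below evaluates that standard equation for made-up parameter values; the module size, photocurrent, and saturation current are assumptions for illustration, not the paper's fitted values.

```python
import numpy as np

def pv_current(v, i_l, i_o, n, cells_in_series=60, temp_k=298.15):
    """Ideal single-diode PV model with three parameters:
    I = IL - Io * (exp(V / (n * Ns * Vt)) - 1), where Vt = k*T/q is the
    thermal voltage. Series and shunt resistance are neglected, matching
    a three-parameter formulation."""
    k = 1.380649e-23       # Boltzmann constant, J/K
    q = 1.602176634e-19    # elementary charge, C
    v_t = k * temp_k / q   # ~25.7 mV at 25 degrees C
    return i_l - i_o * np.expm1(v / (n * cells_in_series * v_t))

# Sweep an I-V curve for illustrative (made-up) parameter values.
v = np.linspace(0.0, 40.0, 400)
i = pv_current(v, i_l=8.5, i_o=1e-6, n=1.3)
p = v * i
print(f"Voc ~ {v[i > 0][-1]:.1f} V, max power ~ {p.max():.0f} W")
```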

  18. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results.

    PubMed

    Humada, Ali M; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M; Ahmed, Mushtaq N

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent IL, the reverse diode saturation current Io, and the diode ideality factor n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, as well as the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable in terms of the installation of a grid-connected PV system in fluctuating climatic conditions. PMID:27035575

  19. The Montage architecture for grid-enabled science processing of large, distributed datasets

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui

    2004-01-01

    Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available computational resources. Therefore, state-of-the-art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.

  20. Materials Science and Materials Chemistry for Large Scale Electrochemical Energy Storage: From Transportation to Electrical Grid

    SciTech Connect

    Liu, Jun; Zhang, Jiguang; Yang, Zhenguo; Lemmon, John P.; Imhoff, Carl H.; Graff, Gordon L.; Li, Liyu; Hu, Jian Z.; Wang, Chong M.; Xiao, Jie; Xia, Guanguang; Viswanathan, Vilayanur V.; Baskaran, Suresh; Sprenkle, Vincent L.; Li, Xiaolin; Shao, Yuyan; Schwenzer, Birgit

    2013-02-15

    Large-scale electrical energy storage has become more important than ever for reducing fossil energy consumption in transportation and for the widespread deployment of intermittent renewable energy in the electric grid. However, significant challenges exist for its application. Here, the status and challenges are reviewed from the perspective of materials science and materials chemistry in electrochemical energy storage technologies, such as Li-ion batteries, sodium (sulfur and metal halide) batteries, Pb-acid batteries, redox flow batteries, and supercapacitors. Perspectives and approaches are introduced for emerging battery designs and new chemistry combinations to reduce the cost of energy storage devices.

  1. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data
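
    The "tree of operators" idea above can be illustrated with a toy evaluator: each node names an operator whose inputs are the outputs of its children. The operator names and stand-in functions below are invented for illustration; this is not the SciFlo engine or its SOAP/REST services.

```python
# Hypothetical minimal operator-tree evaluator in the spirit of a dataflow engine.
OPERATORS = {
    "fetch_airs":  lambda: [300.1, 299.8, 301.2],         # stand-in for a data-access service
    "fetch_gps":   lambda: [299.9, 300.0, 300.8],
    "co_register": lambda a, b: list(zip(a, b)),           # stand-in for re-gridding/co-registration
    "mean_diff":   lambda pairs: sum(x - y for x, y in pairs) / len(pairs),
}

def execute(node):
    """Depth-first evaluation of an operator tree: children first, then the node."""
    name, children = node["op"], node.get("children", [])
    inputs = [execute(child) for child in children]
    return OPERATORS[name](*inputs)

flow = {
    "op": "mean_diff",
    "children": [{
        "op": "co_register",
        "children": [{"op": "fetch_airs"}, {"op": "fetch_gps"}],
    }],
}
print(execute(flow))   # mean AIRS-GPS difference over the toy samples
```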

  2. Experimental optimization of the FireFly 600 photovoltaic off-grid system.

    SciTech Connect

    Boyson, William Earl; Orozco, Ron; Ralph, Mark E.; Brown, Marlene Laura; King, David L.; Hund, Thomas D.

    2003-10-01

    A comprehensive evaluation and experimental optimization of the FireFly™ 600 off-grid photovoltaic system manufactured by Energia Total, Ltd. was conducted at Sandia National Laboratories in May and June of 2001. This evaluation was conducted at the request of the manufacturer and addressed performance of individual system components, overall system functionality and performance, safety concerns, and compliance with applicable codes and standards. A primary goal of the effort was to identify areas for improvement in performance, reliability, and safety. New system test procedures were developed during the effort.

  3. Experimental Observation of a Periodically Oscillating Plasma Sphere in a Gridded Inertial Electrostatic Confinement Device

    SciTech Connect

    Park, J.; Nebel, R.A.; Stange, S.; Murali, S. Krupakar

    2005-07-01

    The periodically oscillating plasma sphere (POPS) [D. C. Barnes and R. A. Nebel, Phys. Plasmas 5, 2498 (1998).] oscillation has been observed in a gridded inertial electrostatic confinement device. In these experiments, ions in the virtual cathode exhibit resonant behavior when driven at the POPS frequency. Excellent agreement between the observed POPS resonance frequency and theoretical predictions has been observed for a wide range of potential well depths and for three different ion species. The results provide the first experimental validation of the POPS concept proposed by Barnes and Nebel [R. A. Nebel and D. C. Barnes, Fusion Technol. 34, 28 (1998).].

  4. Moving off the grid in an experimental, compressively sampled photonic link.

    PubMed

    Nichols, J M; McLaughlin, C V; Bucholtz, F

    2015-07-13

    Perhaps the largest obstacle to practical compressive sampling is an inability to accurately and sparsely describe the data one seeks to recover, due to a poor choice of signal model parameters. In such cases the recovery process will yield artifacts or, in many cases, fail completely. This work represents the first demonstration of a solution to this so-called "off-grid" problem in an experimental, compressively sampled system. Specifically, we show that an Alternating Convex Search algorithm is able to significantly reduce these data model errors in harmonic signal recovery. PMID:26191864
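
    As a generic illustration of the alternating idea, a convex least-squares amplitude fit combined with a local frequency refinement so the estimate can move off the initial grid point, here is a single-tone NumPy sketch. It is not the authors' Alternating Convex Search algorithm or their photonic-link data; the signal, sampling pattern, and step schedule are made up.

```python
import numpy as np

def design(f, t):
    """Two-column dictionary for a real sinusoid at frequency f: [cos, sin]."""
    return np.column_stack([np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)])

def ls_fit(f, t, y):
    """Convex step: least-squares amplitudes at a fixed frequency."""
    A = design(f, t)
    amps, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.linalg.norm(y - A @ amps), amps

def off_grid_fit(y, t, f0, n_iters=40, step=0.2):
    """Alternate the convex amplitude fit with a local frequency refinement,
    letting the frequency estimate move away from the initial grid point f0."""
    f = f0
    best_res, amps = ls_fit(f, t, y)
    for _ in range(n_iters):
        improved = False
        for f_try in (f - step, f + step):
            res, a = ls_fit(f_try, t, y)
            if res < best_res:
                f, best_res, amps, improved = f_try, res, a, True
        if not improved:
            step *= 0.5                      # shrink the frequency search
    return f, amps

# Toy demo: the true tone lies between coarse grid points.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 1.0, 64))      # non-uniform (compressive-style) sample times
f_true = 10.37
y = 1.5 * np.cos(2 * np.pi * f_true * t) + 0.4 * np.sin(2 * np.pi * f_true * t)
y += 0.01 * rng.standard_normal(t.size)
f_est, _ = off_grid_fit(y, t, f0=10.0)
print(f"estimated frequency: {f_est:.3f} (true {f_true})")
```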

  5. Visual monitoring of autonomous life sciences experimentation

    NASA Technical Reports Server (NTRS)

    Blank, G. E.; Martin, W. N.

    1987-01-01

    The design and implementation of a computerized visual monitoring system to aid in the monitoring and control of life sciences experiments on board a space station were investigated. A likely multiprocessor design was chosen, a plausible life science experiment with which to work was defined, the theoretical issues involved in programming a visual monitoring system for the experiment on the multiprocessor were considered, a system for monitoring the experiment was designed, and simulations of such a system were implemented on a network of Apollo workstations.

  6. Experimental control requirements for life sciences

    NASA Technical Reports Server (NTRS)

    Berry, W. E.; Sharp, J. C.

    1978-01-01

    The Life Sciences dedicated Spacelab will enable scientists to test hypotheses in various disciplines. Building upon experience gained in mission simulations, orbital flight test experiments, and the first three Spacelab missions, NASA will be able to progressively develop the engineering and management capabilities necessary for the first Life Sciences Spacelab. Development of experiments for these missions will require implementation of life-support systems not previously flown in space. Plant growth chambers, animal holding facilities, aquatic specimen life-support systems, and centrifuge-mounted specimen holding units are examples of systems currently being designed and fabricated for flight.

  7. Environmental Science, Grade 9. Experimental Curriculum Bulletin.

    ERIC Educational Resources Information Center

    Bernstein, Leonard, Ed.

    This is the teacher's guide for the required, interdisciplinary, ninth-year environmental science course for the New York City Schools. One hundred twenty lesson plans, divided into nine units, are presented. Areas of study include the living and non-living environment, ecosystems, population, urban ecology, energy and technology, pollution, and…

  8. Geomorphology, Science (Experimental): 5343.09.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    Performance objectives are stated for this secondary school instructional unit concerned with aspects of earth science with emphases on the internal and external forces that bring about changes in the earth's crust. Lists of films and state-adopted and other texts are presented. Included are a course outline summarizing the unit content; numerous…

  9. RAON experimental facilities for nuclear science

    SciTech Connect

    Kwon, Y. K.; Kim, Y. K.; Komatsubara, T.; Moon, J. Y.; Park, J. S.; Shin, T. S.; Kim, Y. J.

    2014-05-02

    The Rare Isotope Science Project (RISP) was established in December 2011 and has put considerable effort into the design and construction of the accelerator complex facility named “RAON”. RAON is a rare isotope (RI) beam facility that aims to provide various RI beams of proton- and neutron-rich nuclei as well as a variety of stable ion beams over a wide range of energies up to a few hundred MeV/nucleon for research in basic science and applications. Proposed research programs for nuclear physics and nuclear astrophysics at RAON include studies of the properties of exotic nuclei, the equation of state of nuclear matter, the origin of the universe, the process of nucleosynthesis, super heavy elements, etc. Various high performance magnetic spectrometers for nuclear science have been designed: KOBRA (KOrea Broad acceptance Recoil spectrometer and Apparatus), LAMPS (Large Acceptance Multi-Purpose Spectrometer), and ZDS (Zero Degree Spectrometer). The status of these spectrometers for nuclear science will be presented along with a brief report on RAON.

  10. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus, despite the cost of solar PV decreasing, the cost of integrating solar power will increase as penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices, and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive, and markets are still being designed to leverage their full potential and mitigate their limitations (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output), thereby providing the grid advance warning to schedule ancillary generation more accurately, or to curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by grid operators, we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy, and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first area required atmospheric science and engineering research, while the second required detailed knowledge of energy markets and power engineering. Motivated by this background, we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting, especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market. (b) Development of a sky imager to provide short term

  11. Nuclear Test-Experimental Science: Annual report, fiscal year 1988

    SciTech Connect

    Struble, G.L.; Donohue, M.L.; Bucciarelli, G.; Hymer, J.D.; Kirvel, R.D.; Middleton, C.; Prono, J.; Reid, S.; Strack, B.

    1988-01-01

    Fiscal year 1988 has been a significant, rewarding, and exciting period for Lawrence Livermore National Laboratory's nuclear testing program. It was significant in that the Laboratory's new director chose to focus strongly on the program's activities and to commit to a revitalized emphasis on testing and the experimental science that underlies it. It was rewarding in that revolutionary new measurement techniques were fielded on recent important and highly complicated underground nuclear tests with truly incredible results. And it was exciting in that the sophisticated and fundamental problems of weapons science that are now being addressed experimentally are yielding new challenges and understanding in ways that stimulate and reward the brightest and best of scientists. During FY88 the program was reorganized to emphasize our commitment to experimental science. The name of the program was changed to reflect this commitment, becoming the Nuclear Test-Experimental Science (NTES) Program.

  12. Grid-enabled measures: using Science 2.0 to standardize measures and share data.

    PubMed

    Moser, Richard P; Hesse, Bradford W; Shaikh, Abdul R; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry Y; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-05-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment, a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute (NCI) with two overarching goals: (1) promote the use of standardized measures, which are tied to theoretically based constructs; and (2) facilitate the ability to share harmonized data resulting from the use of standardized measures. The first is accomplished by creating an online venue where a virtual community of researchers can collaborate and come to consensus on measures by rating, commenting on, and viewing meta-data about the measures and associated constructs. The second is accomplished by connecting the constructs and measures to an ontological framework with data standards and common data elements such as the NCI Enterprise Vocabulary System (EVS) and the cancer Data Standards Repository (caDSR). This paper will describe the Web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing. PMID:21521586

  13. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure, so that stakeholders can control their own resources while fair use of those resources is ensured. The collaborative control room is being developed using the open-source Access Grid software that enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP and included tools for run preparation, submission, monitoring and management. This approach saves user sites from the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  14. Space materials science experimental facilities in China

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Jin, Weiqing

    Three typical facilities for materials science research under microgravity in China are introduced in this paper. The multi-task materials processing facility was developed for crystal growth and alloy solidification onboard Chinese Shenzhou spacecraft, and more than ten types of different materials have been processed successfully in space. The in-situ observation facility was designed for mechanism research on oxide single crystals in space, and it has been carried into space onboard both a Chinese recoverable satellite and a Shenzhou spacecraft. The comprehensive materials processing facility has recently been developed for utilization onboard the future spacelab in the manned spaceflight project in China. Both the achievements and the recent progress of materials research hardware in China are also summarized in this paper.

  15. Modular experimental platform for science and applications

    NASA Technical Reports Server (NTRS)

    Hill, A. S.

    1984-01-01

    A modularized, standardized spacecraft bus, known as MESA, suitable for a variety of science and applications missions is discussed. The basic bus consists of a simple structural arrangement housing attitude control, telemetry/command, electrical power, propulsion and thermal control subsystems. The general arrangement allows extensive subsystem adaptation to mission needs. Kits provide for the addition of tape recorders, increased power levels and propulsion growth. Both 3-axis and spin stabilized flight proven attitude control subsystems are available. The MESA bus can be launched on Ariane, as a secondary payload for low cost, or on the STS with a PAM-D or other suitable upper stage. Multi-spacecraft launches are possible with either booster. Launch vehicle integration is simple and cost-effective. The low cost of the MESA bus is achieved by the extensive utilization of existing subsystem design concepts and equipment, and efficient program management and test integration techniques.

  16. Surface Modeling and Grid Generation of Orbital Sciences X34 Vehicle. Phase 1

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    The surface modeling and grid generation requirements, motivations, and methods used to develop Computational Fluid Dynamic volume grids for the X34-Phase 1 are presented. The requirements set forth by the Aerothermodynamics Branch at the NASA Langley Research Center serve as the basis for the final techniques used in the construction of all volume grids, including grids for parametric studies of the X34. The Integrated Computer Engineering and Manufacturing code for Computational Fluid Dynamics (ICEM/CFD), the Grid Generation code (GRIDGEN), the Three-Dimensional Multi-block Advanced Grid Generation System (3DMAGGS) code, and Volume Grid Manipulator (VGM) code are used to enable the necessary surface modeling, surface grid generation, volume grid generation, and grid alterations, respectively. All volume grids generated for the X34, as outlined in this paper, were used for CFD simulations within the Aerothermodynamics Branch.

  17. Experimental investigation of the dynamics of a vibrating grid in superfluid 4He over a range of temperatures and pressures.

    PubMed

    Charalambous, D; Skrbek, L; Hendry, P C; McClintock, P V E; Vinen, W F

    2006-09-01

    In an earlier paper [Nichol, Phys. Rev. E, 70, 056307 (2004)] some of the present authors presented the results of an experimental study of the dynamics of a stretched grid driven into vibration at or near its resonant frequency in isotopically pure superfluid 4He over a range of pressures at a very low temperature, where the density of normal fluid is negligible. In this paper we present the results of a similar study, based on a different grid, but now including the temperature range where the normal fluid density is no longer insignificant. The new grid is very similar to the old one except for a small difference in the character of its surface roughness. In many respects the results at low temperature are similar to those for the old grid. At low amplitudes the results are somewhat history dependent, but in essence there is no damping greater than that in vacuo. At a critical amplitude corresponding to a velocity of about 50 mm/s there is a sudden and large increase in damping, which can be attributed to the generation of new vortex lines. Strange shifts in the resonant frequency at intermediate amplitudes observed with the old grid are no longer seen; they must therefore have been associated with the different surface roughness, or perhaps were due simply to some artifact of the old grid, the details of which we are currently unable to determine. With the new grid we have studied both the damping at low amplitudes due to excitations of the normal fluid, and the dependence of the supercritical damping on temperature. We present evidence that in helium at low amplitudes there may be some enhancement in the effective mass of the grid in addition to that associated with potential flow of the helium. In some circumstances small satellite resonances are seen near the main fundamental grid resonance, which are attributed to coupling to some other oscillatory system within the experimental cell. PMID:17025743

  18. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617
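
    To make the workflow structure concrete, the sketch below shows the typical shape of such a split-align-quantify-merge DAG in plain Python. It is an illustrative assumption, not the actual OSG-GEM Pegasus workflow, and the tool names ("aligner", "quantifier", "merge_counts") are placeholders.

        from collections import namedtuple

        Job = namedtuple("Job", ["name", "cmd", "parents"])

        def build_gem_dag(sample, n_chunks):
            """Build a toy DAG: align each paired-end chunk, quantify it, then merge into one GEM."""
            jobs = []
            for i in range(n_chunks):
                jobs.append(Job(f"align_{i}",
                                f"aligner --ref genome.idx --in {sample}_chunk{i}_R1.fq {sample}_chunk{i}_R2.fq "
                                f"--out {sample}_{i}.bam",
                                parents=[]))
                jobs.append(Job(f"quant_{i}",
                                f"quantifier {sample}_{i}.bam --out counts_{i}.tsv",
                                parents=[f"align_{i}"]))
            jobs.append(Job("merge_gem",
                            "merge_counts " + " ".join(f"counts_{i}.tsv" for i in range(n_chunks)) + " --out gem.tsv",
                            parents=[f"quant_{i}" for i in range(n_chunks)]))
            return jobs

        for job in build_gem_dag("sampleA", n_chunks=3):
            print(job.name, "<-", job.parents)

    In the real workflow a manager such as Pegasus/HTCondor schedules each node of the DAG onto OSG resources; the sketch only captures the dependency structure the abstract describes.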

  19. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  20. The Structure of Scientific Arguments by Secondary Science Teachers: Comparison of Experimental and Historical Science Topics

    ERIC Educational Resources Information Center

    Gray, Ron; Kang, Nam-Hwa

    2014-01-01

    Just as scientific knowledge is constructed using distinct modes of inquiry (e.g. experimental or historical), arguments constructed during science instruction may vary depending on the mode of inquiry underlying the topic. The purpose of this study was to examine whether and how secondary science teachers construct scientific arguments during…

  1. The NASA/GSFC Advanced Data Grid: A Prototype for Future Earth Science Ground System Architectures

    NASA Technical Reports Server (NTRS)

    Gasster, Samuel D.; Lee, Craig; Davis, Brooks; Clark, Matt; AuYeung, Mike; Wilson, John R.; Ladwig, Debra M.

    2003-01-01

    Contents include the following: Background and motivation. Grid computing concepts. Advanced data grid (ADG) prototype development. ADG requirements and operations concept. ADG architecture. ADG implementation. ADG test plan. ADG schedule. Summary and status.

  2. The Virtual Kidney: an eScience interface and Grid portal.

    PubMed

    Harris, Peter J; Buyya, Rajkumar; Chu, Xingchen; Kobialka, Tom; Kazmierczak, Ed; Moss, Robert; Appelbe, William; Hunter, Peter J; Thomas, S Randall

    2009-06-13

    The Virtual Kidney uses a web interface and distributed computing to provide experimental scientists and analysts with access to computational simulations and knowledge databases hosted in geographically separated laboratories. Users can explore a variety of complex models without requiring the specific programming environment in which applications have been developed. This initiative exploits high-bandwidth communication networks for collaborative research and for shared access to knowledge resources. The Virtual Kidney has been developed within a specialist community of renal scientists but is transferable to other areas of research requiring interaction between published literature and databases, theoretical models and simulations and the formulation of effective experimental designs. A web-based three-dimensional interface provides access to experimental data, a parameter database and mathematical models. A multi-scale kidney reconstruction includes blood vessels and serially sectioned nephrons. Selection of structures provides links to the database, returning parameter values and extracts from the literature. Models are run locally or remotely with a Grid resource broker managing scheduling, monitoring and visualization of simulation results and application, credential and resource allocation. Simulation results are viewed graphically or as scaled colour gradients on the Virtual Kidney structures, allowing visual and quantitative appreciation of the effects of simulated parameter changes. PMID:19414450

  3. An Illustration of the Experimenter Expectancy Effect in School Science

    ERIC Educational Resources Information Center

    Allen, Michael; Briten, Elizabeth

    2012-01-01

    Two groups of year 6 pupils (age 10-11 years) each experienced science practical lessons that were essentially identical but for one difference: one group (theory-led) were told by the teacher what result they should expect, and the other group (hypothetico-deductive) were not. The theory-led group demonstrated experimental bias, recording results…

  4. Environmental Science. An Experimental Programme for Primary Teachers.

    ERIC Educational Resources Information Center

    Linke, R. D.

    An experimental course covering some of the fundamental principles and terminology associated with environmental science and the application of these principles to various contemporary problems is summarized in this report. The course involved a series of lectures together with a program of specific seminar and discussion topics presented by the…

  5. Practical use of a framework for network science experimentation

    NASA Astrophysics Data System (ADS)

    Toth, Andrew; Bergamaschi, Flavio

    2014-06-01

    In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, called the International Technology Alliance (ITA)1 In Network and Information Sciences, to address fundamental issues concerning Network and Information Sciences that will enhance decision making for coalition operations and enable rapid, secure formation of ad hoc teams in coalition environments and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through collaboration between ARL and IBM UK to characterize and dene a software stack and tooling that has become the reference framework for network science experimentation in support for validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA)2 , in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components, the Apollo Fact-Finder3 tool and the Medusa Crowd Sensing4 application, the limitations identified and how they shall be addressed in future work.

  6. SciFlo: Semantically-Enabled Grid Workflow for Collaborative Science

    NASA Astrophysics Data System (ADS)

    Yunck, T.; Wilson, B. D.; Raskin, R.; Manipon, G.

    2005-12-01

    SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and the Grid Computing standards (WS-* standards and the Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable SOAP Services, native executables, local command-line scripts, and python codes into a distributed computing flow (a graph of operators). SciFlo's XML dataflow documents can be a mixture of concrete operators (fully bound operations) and abstract template operators (late binding via semantic lookup). All data objects and operators can be both simply typed (simple and complex types in XML schema) and semantically typed using controlled vocabularies (linked to OWL ontologies such as SWEET). By exploiting ontology-enhanced search and inference, one can discover (and automatically invoke) Web Services and operators that have been semantically labeled as performing the desired transformation, and adapt a particular invocation to the proper interface (number, types, and meaning of inputs and outputs). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. A Visual Programming tool is also being developed, but it is not required. Once an analysis has been specified for a granule or day of data, it can be easily repeated with different control parameters and over months or years of data. SciFlo uses and preserves semantics, and also generates and infers new semantic annotations. Specifically, the SciFlo engine uses semantic metadata to
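
    The distinctive step described above is the late binding of "abstract" operators to concrete services via semantic labels. The following is a minimal sketch of that idea (assumed, not SciFlo's actual engine); the registry entries, URLs, and SWEET-style labels are placeholders.

        REGISTRY = [
            {"service": "http://example.org/regrid_soap", "performs": "Regridding",
             "inputs": ["SWEET:AtmosphericTemperature"], "outputs": ["SWEET:AtmosphericTemperature"]},
            {"service": "http://example.org/subset_soap", "performs": "SpatialSubsetting",
             "inputs": ["SWEET:Radiance"], "outputs": ["SWEET:Radiance"]},
        ]

        def resolve(abstract_op):
            """Return concrete services whose semantic labels satisfy the abstract operator."""
            return [s["service"] for s in REGISTRY
                    if s["performs"] == abstract_op["performs"]
                    and set(abstract_op["inputs"]) <= set(s["inputs"])]

        print(resolve({"performs": "Regridding", "inputs": ["SWEET:AtmosphericTemperature"]}))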

  7. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-12-01

    Modern long-baseline neutrino experiments like the NOvA experiment at Fermilab require large-scale, compute-intensive simulations of their neutrino beam fluxes and backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources.

  8. OASIS: a data and software distribution service for Open Science Grid

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; De Stefano, J.; Hover, J.; Quick, R.; Teige, S.

    2014-06-01

    The Open Science Grid encourages the concept of software portability: a user's scientific application should be able to run at as many sites as possible. It is necessary to provide a mechanism for OSG Virtual Organizations to install software at sites. Since its initial release, the OSG Compute Element has provided an application software installation directory to Virtual Organizations, where they can create their own sub-directory, install software into that sub-directory, and have the directory shared on the worker nodes at that site. The current model has shortcomings with regard to permissions, policies, versioning, and the lack of a unified, collective procedure or toolset for deploying software across all sites. Therefore, a new mechanism for data and software distribution is desirable. The architecture for the OSG Application Software Installation Service (OASIS) is a server-client model: the software and data are installed only once in a single place, and are automatically distributed to all client sites simultaneously. Central file distribution offers other advantages, including server-side authentication and authorization, activity records, quota management, data validation and inspection, and well-defined versioning and deletion policies. The architecture, as well as a complete analysis of the current implementation, will be described in this paper.

  9. Experimental Study of Two Phase Flow Behavior Past BWR Spacer Grids

    SciTech Connect

    Ratnayake, Ruwan K.; Hochreiter, L.E.; Ivanov, K.N.; Cimbala, J.M.

    2002-07-01

    Performance of best estimate codes used in the nuclear industry can be significantly improved by reducing the empiricism embedded in their constitutive models. Spacer grids have been found to have an important impact on the maximum allowable Critical Heat Flux within the fuel assembly of a nuclear reactor core. Therefore, incorporation of suitable spacer grid models can improve the critical heat flux prediction capability of best estimate codes. Realistic modeling of the entrainment behavior of spacer grids requires understanding the different mechanisms that are involved. Since visual information pertaining to the entrainment behavior of spacer grids cannot possibly be obtained from operating nuclear reactors, experiments have to be designed and conducted for this specific purpose. Most of the spacer grid experiments available in the literature have been designed with a view to obtaining quantitative data for the purpose of developing or modifying empirical formulations for heat transfer, critical heat flux or pressure drop. Very few experiments have been designed to provide fundamental information that can be used to understand spacer grid effects and the phenomena involved in two-phase flow. Air-water experiments were conducted to obtain visual information on the two-phase flow behavior both upstream and downstream of Boiling Water Reactor (BWR) spacer grids. The test section was designed and constructed using prototypic dimensions such as the channel cross-section, rod diameter and other spacer grid configurations of a typical BWR fuel assembly. The test section models the flow behavior in two adjacent subchannels in the BWR core. A portion of a prototypic BWR spacer grid accounting for two adjacent channels was used with industrial mild steel rods for the purpose of representing the channel internals. Symmetry was preserved in this practice, so that the channel walls could effectively be considered as the channel boundaries. Thin films were established on the rod surfaces

  10. ISOGA: Integrated Services Optical Grid Architecture for Emerging E-Science Collaborative Applications

    SciTech Connect

    Oliver Yu

    2008-11-28

    This final report describes the accomplishments in the ISOGA (Integrated Services Optical Grid Architecture) project. ISOGA enables efficient deployment of existing and emerging collaborative grid applications with increasingly diverse multimedia communication requirements over a wide-area multi-domain optical network grid; and enables collaborative scientists with fast retrieval and seamless browsing of distributed scientific multimedia datasets over a wide-area optical network grid. The project focuses on research and development in the following areas: the polymorphic optical network control planes to enable multiple switching and communication services simultaneously; the intelligent optical grid user-network interface to enable user-centric network control and monitoring; and the seamless optical grid dataset browsing interface to enable fast retrieval of local/remote dataset for visualization and manipulation.

  11. Systematic control of experimental inconsistency in combinatorial materials science.

    PubMed

    Sharma, Asish Kumar; Kulshreshtha, Chandramouli; Sohn, Keemin; Sohn, Kee-Sun

    2009-01-01

    We developed a method to systematically control experimental inconsistency, which is one of the most troublesome and difficult problems in high-throughput combinatorial experiments. The topic of experimental inconsistency is never addressed, even though all scientists in the field of combinatorial materials science face this very serious problem. Experimental inconsistency and material property were selected as dual objective functions that were simultaneously optimized. Specifically, in an attempt to search for promising phosphors with high reproducibility, photoluminescence (PL) intensity was maximized, and experimental inconsistency was minimized by employing a multiobjective evolutionary optimization-assisted combinatorial materials search (MOEO combinatorial materials search) strategy. A tetravalent manganese-doped alkaline-earth germanium/titanium oxide system was used as a model system to be screened using the MOEO combinatorial materials search. As a result of MOEO reiteration, we identified a halide-detached deep red phosphor with improved PL intensity and reliable reproducibility. PMID:19061418
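
    As a hedged illustration of the dual-objective idea (not the authors' MOEO code), the sketch below scores each composition by mean PL intensity (to maximize) and replicate-to-replicate scatter (to minimize), then keeps the Pareto-optimal candidates. All intensity values are hypothetical.

        import numpy as np

        def objectives(replicate_intensities):
            """Return (mean PL intensity to maximize, standard deviation to minimize)."""
            arr = np.asarray(replicate_intensities, dtype=float)
            return arr.mean(), arr.std(ddof=1)

        def pareto_front(samples):
            """samples: dict of composition name -> list of replicate PL intensities."""
            scored = {name: objectives(reps) for name, reps in samples.items()}
            front = []
            for name, (mu, sd) in scored.items():
                dominated = any(m2 >= mu and s2 <= sd and (m2 > mu or s2 < sd)
                                for n2, (m2, s2) in scored.items() if n2 != name)
                if not dominated:
                    front.append(name)
            return front

        library = {"comp_A": [95, 102, 99], "comp_B": [120, 60, 140], "comp_C": [110, 108, 112]}
        print(pareto_front(library))   # compositions that are both bright and reproducible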

  12. Development of experimental systems for material sciences under microgravity

    NASA Technical Reports Server (NTRS)

    Tanii, Jun; Obi, Shinzo; Kamimiyata, Yotsuo; Ajimine, Akio

    1988-01-01

    As part of the Space Experiment Program of the Society of Japanese Aerospace Companies, three experimental systems (G452, G453, G454) have been developed for materials science studies under microgravity by the NEC Corporation. These systems are to be flown as Get Away Special payloads for studying the feasibility of producing new materials. Together with the experimental modules carrying the hardware specific to the experiment, the three systems all comprise standard subsystems consisting of a power supply, sequence controller, temperature controller, data recorder, and video recorder.

  13. Minimum Learning Essentials: Science. Chemistry, Earth Science, Biology, Physics, General Science. Experimental Edition 0/4.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.

    This guide presents the "minimum teaching essentials" published by the New York City Board of Education, for science education in grades 9-12. Covered are: biology, physics, earth science, and chemistry. Work study skills for all subjects are given with content areas, performance objectives, and suggested classroom activities. (APM)

  14. Experimental Physical Sciences Vistas: MaRIE (draft)

    SciTech Connect

    Shlachter, Jack

    2010-09-08

    To achieve breakthrough scientific discoveries in the 21st century, a convergence and integration of world-leading experimental facilities and capabilities with theory, modeling, and simulation is necessary. In this issue of Experimental Physical Sciences Vistas, I am excited to present our plans for Los Alamos National Laboratory's future flagship experimental facility, MaRIE (Matter-Radiation Interactions in Extremes). MaRIE is a facility that will provide transformational understanding of matter in extreme conditions required to reduce or resolve key weapons performance uncertainties, develop the materials needed for advanced energy systems, and transform our ability to create materials by design. Our unique role in materials science starting with the Manhattan Project has positioned us well to develop a contemporary materials strategy pushing the frontiers of controlled functionality - the design and tailoring of a material for the unique demands of a specific application. Controlled functionality requires improvement in understanding of the structure and properties of materials in order to synthesize and process materials with unique characteristics. In the nuclear weapons program today, improving data and models to increase confidence in the stockpile can take years from concept to new knowledge. Our goal with MaRIE is to accelerate this process by enhancing predictive capability - the ability to compute a priori the observables of an experiment or test and pertinent confidence intervals using verified and validated simulation tools. It is a science-based approach that includes the use of advanced experimental tools, theoretical models, and multi-physics codes, simultaneously dealing with multiple aspects of physical operation of a system that are needed to develop an increasingly mature predictive capability. This same approach is needed to accelerate improvements to other systems such as nuclear reactors. MaRIE will be valuable to many national security

  15. Animal experimentation in forensic sciences: How far have we come?

    PubMed

    Cattaneo, C; Maderna, E; Rendinelli, A; Gibelli, D

    2015-09-01

    In the third millennium, where ethical, ethological and cultural evolution seem to be leading more and more towards an inter-species society, the issue of animal experimentation is a moral dilemma. Speaking from a self-interested human perspective, avoiding all animal testing where human disease and therapy are concerned may be very difficult or even impossible; such testing may not be so easily justifiable when suffering, or killing, of non-human animals is inflicted for forensic research. In order to verify how forensic scientists are evolving on this ethical issue, we undertook a systematic review of the current literature. We investigated the frequency of animal experimentation in forensic studies in the past 15 years and trends in publication in the main forensic science journals. Types of species, lesions inflicted, manner of sedation or anesthesia and euthanasia were examined in a total of 404 articles reviewed, among which 279 (69.1%) concerned studies involving animals sacrificed exclusively for the sake of the experiment. Killing still frequently includes painful methods such as blunt trauma, electrocution, mechanical asphyxia, hypothermia, and even exsanguination; of all these animals, apparently only 60.8% were anesthetized. The most recent call for a severe reduction if not a total halt to the use of animals in forensic sciences was made by Bernard Knight in 1992. In fact, the principle of reduction and replacement, frequently respected in clinical research, must be considered the basis for forensic science research needing animals. PMID:26216717

  16. Teaching science problem solving: An overview of experimental work

    NASA Astrophysics Data System (ADS)

    Taconis, R.; Ferguson-Hessler, M. G. M.; Broekkamp, H.

    2001-04-01

    The traditional approach to teaching science problem solving is having the students work individually on a large number of problems. This approach has long been overtaken by research suggesting and testing other methods, which are expected to be more effective. To get an overview of the characteristics of good and innovative problem-solving teaching strategies, we performed an analysis of a number of articles published between 1985 and 1995 in high-standard international journals, describing experimental research into the effectiveness of a wide variety of teaching strategies for science problem solving. To characterize the teaching strategies found, we used a model of the capacities needed for effective science problem solving, composed of a knowledge base and a skills base. The relations between the cognitive capacities required by the experimental or control treatments and those of the model were specified and used as independent variables. Other independent variables were learning conditions such as feedback and group work. As a dependent variable we used standardized learning effects. We identified 22 articles describing 40 experiments that met the standards we deemed necessary for a meta-analysis. These experiments were analyzed both with quantitative (correlational) methods and with a systematic qualitative method. A few of the independent variables were found to characterize effective strategies for teaching science problem solving. Effective treatments all gave attention to the structure and function (the schemata) of the knowledge base, whereas attention to knowledge of strategy and the practice of problem solving turned out to have little effect. As for learning conditions, both providing the learners with guidelines and criteria they can use in judging their own problem-solving process and products, and providing immediate feedback to them were found to be important prerequisites for the acquisition of problem-solving skills. Group work did not lead to
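
    The dependent variable above, a standardized learning effect, is commonly computed as an effect size such as Cohen's d with a pooled standard deviation. The sketch below shows that computation for a hypothetical experimental-versus-control comparison; it is one common choice, not necessarily the authors' exact formula.

        import math

        def cohens_d(mean_exp, sd_exp, n_exp, mean_ctl, sd_ctl, n_ctl):
            """Standardized mean difference with a pooled standard deviation."""
            pooled_sd = math.sqrt(((n_exp - 1) * sd_exp**2 + (n_ctl - 1) * sd_ctl**2)
                                  / (n_exp + n_ctl - 2))
            return (mean_exp - mean_ctl) / pooled_sd

        # hypothetical post-test scores for an experimental vs. control teaching strategy
        print(round(cohens_d(72.0, 10.0, 30, 65.0, 11.0, 32), 2))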

  17. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

    PubMed Central

    Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.

    2015-01-01

    With the increasingly rapid growth of data in the life sciences, we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  18. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  19. Boundary condition identification for a grid model by experimental and numerical dynamic analysis

    NASA Astrophysics Data System (ADS)

    Mao, Qiang; Devitis, John; Mazzotti, Matteo; Bartoli, Ivan; Moon, Franklin; Sjoblom, Kurt; Aktan, Emin

    2015-04-01

    There is a growing need to characterize unknown foundations and assess substructures in existing bridges. It is becoming an important issue for the serviceability and safety of bridges as well as for the possibility of partial reuse of existing infrastructure. Within this broader context, this paper investigates the possibility of identifying, locating and quantifying changes of boundary conditions by leveraging a simply supported grid structure with a composite deck. Multi-reference impact tests are performed on the grid model, and one supporting bearing is modified by replacing a steel cylindrical roller with a roller of compliant material. Impact-based modal analysis provides global modal parameters such as damped natural frequencies, mode shapes and the flexibility matrix, which are used as indicators of boundary condition changes. An updating process combining a hybrid optimization algorithm and the finite element software suite ABAQUS is presented in this paper. The updated ABAQUS model of the grid, which simulates the supporting bearing with springs, is used to detect and quantify the change of the boundary conditions.
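
    A hedged sketch of the kind of updating objective involved (not the paper's ABAQUS-coupled implementation): mismatch in natural frequencies plus (1 - MAC) for paired mode shapes is minimized over the support spring stiffness. The toy FE surrogate, target values and bounds below are placeholders standing in for the real finite element call.

        import numpy as np
        from scipy.optimize import minimize

        def mac(phi_test, phi_fem):
            """Modal Assurance Criterion between two mode-shape vectors."""
            return np.dot(phi_test, phi_fem) ** 2 / (np.dot(phi_test, phi_test) * np.dot(phi_fem, phi_fem))

        def run_fem(spring_stiffness):
            """Placeholder for the FE solve (ABAQUS in the paper); returns frequencies and shapes."""
            k = spring_stiffness[0]
            freqs = np.array([10.0, 25.0]) * (k / (k + 1e6)) ** 0.25   # toy stiffness-to-frequency model
            shapes = np.eye(2)
            return freqs, shapes

        F_TEST, PHI_TEST = np.array([9.2, 23.8]), np.eye(2)            # "measured" values (hypothetical)

        def objective(x):
            f_fem, phi_fem = run_fem(x)
            freq_err = np.sum(((f_fem - F_TEST) / F_TEST) ** 2)
            mac_err = sum(1.0 - mac(PHI_TEST[:, i], phi_fem[:, i]) for i in range(2))
            return freq_err + mac_err

        result = minimize(objective, x0=[5e5], bounds=[(1e3, 1e9)], method="L-BFGS-B")
        print("identified support stiffness:", result.x[0])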

  20. Effects of mesh style and grid convergence on particle deposition in bifurcating airway models with comparisons to experimental data.

    PubMed

    Longest, P Worth; Vinchurkar, Samir

    2007-04-01

    A number of research studies have employed a wide variety of mesh styles and levels of grid convergence to assess velocity fields and particle deposition patterns in models of branching biological systems. Generating structured meshes based on hexahedral elements requires significant time and effort; however, these meshes are often associated with high quality solutions. Unstructured meshes that employ tetrahedral elements can be constructed much faster but may increase levels of numerical diffusion, especially in tubular flow systems with a primary flow direction. The objective of this study is to better establish the effects of mesh generation techniques and grid convergence on velocity fields and particle deposition patterns in bifurcating respiratory models. In order to achieve this objective, four widely used mesh styles including structured hexahedral, unstructured tetrahedral, flow adaptive tetrahedral, and hybrid grids have been considered for two respiratory airway configurations. Initial particle conditions tested are based on the inlet velocity profile or the local inlet mass flow rate. Accuracy of the simulations has been assessed by comparisons to experimental in vitro data available in the literature for the steady-state velocity field in a single bifurcation model as well as the local particle deposition fraction in a double bifurcation model. Quantitative grid convergence was assessed based on a grid convergence index (GCI), which accounts for the degree of grid refinement. The hexahedral mesh was observed to have GCI values that were an order of magnitude below the unstructured tetrahedral mesh values for all resolutions considered. Moreover, the hexahedral mesh style provided GCI values of approximately 1% and reduced run times by a factor of 3. Based on comparisons to empirical data, it was shown that inlet particle seedings should be consistent with the local inlet mass flow rate. Furthermore, the mesh style was found to have an observable
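
    For reference, the grid convergence index used above is conventionally computed with Roache's formulation from two solutions on systematically refined grids; the excerpt does not state the exact safety factor or observed order used in the study, so the constants below are the commonly quoted ones rather than values taken from the paper.

        % Roache's grid convergence index for a fine-grid solution f_1 and a coarser
        % solution f_2 obtained with refinement ratio r and observed order of accuracy p.
        % F_s is a safety factor (commonly 1.25 for three-grid studies, 3 for two grids).
        \[
          \varepsilon = \frac{f_2 - f_1}{f_1}, \qquad
          \mathrm{GCI}_{\text{fine}} = \frac{F_s\,\lvert\varepsilon\rvert}{r^{p} - 1}
        \]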

  1. Experimental demonstration of OpenFlow-based control plane for elastic lightpath provisioning in Flexi-Grid optical networks.

    PubMed

    Zhang, Jiawei; Zhang, Jie; Zhao, Yongli; Yang, Hui; Yu, Xiaosong; Wang, Lei; Fu, Xihua

    2013-01-28

    Due to its prominent performance in network virtualization and programmability, OpenFlow is widely regarded as a promising control plane technology in packet-switched IP networks as well as wavelength-switched optical networks. To apply software-programmable features to future optical networks, we propose an OpenFlow-based control plane for Flexi-Grid optical networks. Experimental results demonstrate the feasibility of dynamic lightpath establishment and adjustment via an extended OpenFlow protocol. Wireshark captures of the signaling procedure are presented. Additionally, the overall latency, including signaling and hardware, for lightpath setup and adjustment is also reported. PMID:23389119

  2. Science of Geological Carbon Sequestration: Integration of Experimentation and Simulation.

    SciTech Connect

    Zhang, D.; Hall, M. L.; Higdon, D.; Hollis, W. K.; Kaszuba, J.; Lichtner, P.; Pawar, R.; Zhao, Y.; Chen, S.; Grigg, R.

    2003-08-04

    This LDRD-DR will develop and enhance the science and technology needed to safely and effectively sequester carbon dioxide (CO2) in geologic formations for the long term. There is consensus in the scientific community that increased levels of greenhouse gases such as CO2 are adversely affecting the global environment as evidenced by recent trends in global warming and dramatic changes in weather patterns. Geologic sequestration represents an immediately available, low-cost option for mitigating the global environmental impact of CO2 by removing large amounts of the gas from the atmosphere. The main limitation of this approach is the limited knowledge of the fundamental science that governs the physical and chemical behavior of (supercritical) CO2 during and after injection into the host geologic environment. Key scientific issues revolve around determination of the ultimate fate of injected CO2, which is governed by permeability/porosity relations in the multi-phase CO2-brine(-oil) systems as well as the reactivity and integrity of the host rock. We propose a combined experimental and theoretical investigation to determine key parameters and incorporate them into coupled microscopic and macroscopic numerical CO2 flow and reaction models. This problem provides an excellent opportunity to utilize unique LANL resources including the Supercritical Fluids Facility (SCRUB) for dynamic (flow-through) studies of supercritical CO2 (scCO2); LANSCE for microscale investigation of pore structure and reaction products; and hydrothermal reaction laboratories for long-term flow and reaction studies. These facilities will allow us to obtain crucial experimental data that could not be easily obtained at any other research facility in the world. The experimental data will be used to develop and validate coupled flow and reaction models that build on existing state-of-the-art modeling capabilities in EES, T and D Divisions. Carbon

  3. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    NASA Astrophysics Data System (ADS)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

    Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schemas that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is generic enough that it can be customized for nearly any ticketing system with a web-service interface with only minor changes. This allows us to be flexible and to rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods including mail, a web services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.
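
    The following is a generic sketch of the pull-translate-push pattern behind such web-service synchronization (an assumption for illustration, not the actual GOC-TX code); the endpoint URLs, field names, and tokens are hypothetical placeholders.

        import requests

        FIELD_MAP = {"short_description": "subject", "comments": "body", "state": "status"}

        def pull_update(source_url, ticket_id, token):
            """Fetch the latest ticket record from the source system's web-service API."""
            resp = requests.get(f"{source_url}/tickets/{ticket_id}",
                                headers={"Authorization": f"Bearer {token}"}, timeout=30)
            resp.raise_for_status()
            return resp.json()

        def translate(record):
            """Map source-system fields onto the target system's schema."""
            return {dst: record.get(src) for src, dst in FIELD_MAP.items()}

        def push_update(target_url, ticket_id, payload, token):
            """Post the translated update to the target system."""
            resp = requests.post(f"{target_url}/tickets/{ticket_id}/updates", json=payload,
                                 headers={"Authorization": f"Bearer {token}"}, timeout=30)
            resp.raise_for_status()

        # Example wiring (placeholder endpoints; enable only with real URLs and tokens):
        # update = pull_update("https://ggus.example.org/api", "12345", token="...")
        # push_update("https://servicenow.example.org/api", "12345", translate(update), token="...")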

  4. Analyzing Sustainable Energy Opportunities for a Small Scale Off-Grid Facility: A Case Study at Experimental Lakes Area (ELA), Ontario

    NASA Astrophysics Data System (ADS)

    Duggirala, Bhanu

    This thesis explored opportunities to reduce energy demand and assessed renewable energy feasibility at an off-grid science "community" called the Experimental Lakes Area (ELA) in Ontario. Being off-grid, ELA is completely dependent on diesel and propane fuel supply for all its electrical and heating needs, which makes ELA vulnerable to fluctuating fuel prices. As a result, ELA emits a large amount of greenhouse gases (GHG) for its size. Energy efficiency and renewable energy technologies can reduce energy consumption and consequently energy cost, as well as GHG. Energy efficiency was very important to ELA due to the elevated fuel costs at this remote location. Minor upgrades to lighting, equipment and the building envelope were able to reduce energy costs and reduce load. Efficient energy-saving measures were recommended that save on operating and maintenance costs, namely changing to LED lights, replacing old equipment like refrigerators, and downsizing ice makers. This resulted in a 4.8% load reduction and subsequently reduced the initial capital cost for biomass by $27,000, by $49,500 for wind power and by $136,500 for solar power. Many alternative energies show promise as potential energy sources to reduce the diesel and propane consumption at ELA, including wind energy, solar heating and biomass. A biomass-based CHP system using the existing diesel generators as back-up has the shortest payback period of the technologies modeled. The biomass-based CHP system has a payback period of 4.1 years at $0.80 per liter of diesel; as the diesel price approaches $2.00 per liter, the payback period reduces to 0.9 years, with generation costs at roughly 50% of present generation costs. Biomass has been successfully tried and tested in many off-grid communities, particularly in small-scale off-grid settings, in North America and internationally. Also, the site-specific solar and wind data show that ELA has potential to harvest renewable resources and produce heat and power at competitive
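
    The payback figures above follow from simple capital-cost-over-annual-savings arithmetic. The sketch below shows that mechanics with hypothetical inputs (the thesis' capital and fuel-consumption figures are not reproduced in the excerpt); it illustrates the direction of the effect of rising diesel prices, not the exact 4.1-year and 0.9-year results.

        def simple_payback(capital_cost, litres_displaced_per_year, diesel_price_per_litre,
                           other_annual_costs=0.0):
            """Years to recover capital from avoided fuel purchases."""
            annual_savings = litres_displaced_per_year * diesel_price_per_litre - other_annual_costs
            return capital_cost / annual_savings

        # Hypothetical biomass CHP: the same system pays back much faster as diesel prices rise.
        for price in (0.80, 2.00):
            years = simple_payback(capital_cost=400_000,
                                   litres_displaced_per_year=120_000,
                                   diesel_price_per_litre=price)
            print(f"diesel at ${price:.2f}/L -> payback {years:.1f} years")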

  5. Experimental Evaluation of Load Rejection Over-Voltage from Grid-Tied Solar Inverters

    SciTech Connect

    Nelson, Austin; Hoke, Anderson; Chakraborty, Sudipta; Ropp, Michael; Chebahtah, Justin; Wang, Trudie; Zimmerly, Brian

    2015-06-14

    This paper investigates the impact of load rejection over-voltage (LRO) from commercially available grid-tied photovoltaic (PV) inverters. LRO can occur when a breaker opens and the power output from a distributed energy resource (DER) exceeds the load. Simplified models of current-controlled inverters can over-predict LRO magnitudes, thus it is useful to quantify the effect through laboratory testing. The load rejection event was replicated using a hardware testbed at the National Renewable Energy Laboratory (NREL), and a set of commercially available PV inverters was tested to quantify the impact of LRO for a range of generation-to-load ratios. The magnitude and duration of the over-voltage events are reported in this paper along with a discussion of characteristic inverter output behavior. The results for the inverters under test showed that maximum over-voltage magnitudes were less than 200% of nominal voltage, and much lower in many test cases. These research results are important because utilities that interconnect inverter-based DER need to understand their characteristics under abnormal grid conditions.

  6. EverVIEW: a visualization platform for hydrologic and Earth science gridded data

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig

    2015-01-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.
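
    As a sketch of the kind of NetCDF access EverVIEW wraps in its graphical displays (this uses the common netCDF4 Python library rather than EverVIEW itself; the file and variable names are hypothetical), a time-series grid can be stepped through like this:

        from netCDF4 import Dataset

        with Dataset("everglades_stage.nc") as nc:             # hypothetical NetCDF file
            stage = nc.variables["water_surface_elevation"]    # assumed dims: (time, y, x)
            times = nc.variables["time"][:]
            for t in range(len(times)):                        # "animate" through the time steps
                frame = stage[t, :, :]
                print(f"step {t}: mean stage = {frame.mean():.2f}")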

  7. EverVIEW: A visualization platform for hydrologic and Earth science gridded data

    NASA Astrophysics Data System (ADS)

    Romañach, Stephanie S.; McKelvy, Mark; Suir, Kevin; Conzelmann, Craig

    2015-03-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.

  8. Pre-Service Teachers' Use of Improvised and Virtual Laboratory Experimentation in Science Teaching

    ERIC Educational Resources Information Center

    Bhukuvhani, Crispen; Kusure, Lovemore; Munodawafa, Violet; Sana, Abel; Gwizangwe, Isaac

    2010-01-01

    This research surveyed 11 purposely sampled Bindura University of Science Education (Zimbabwe) Bachelor of Science Education Honours Part III pre-service science teachers' use of improvised and virtual laboratory experimentation in science teaching. A self-designed four-point Likert scale twenty-item questionnaire was used. SPSS Version 10 was…

  9. The Distinction between Experimental and Historical Sciences as a Framework for Improving Classroom Inquiry

    ERIC Educational Resources Information Center

    Gray, Ron

    2014-01-01

    Inquiry experiences in secondary science classrooms are heavily weighted toward experimentation. We know, however, that many fields of science (e.g., evolutionary biology, cosmology, and paleontology), while they may utilize experiments, are not justified by experimental methodologies. With the focus on experimentation in schools, these fields of…

  10. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation, and Research

    NASA Astrophysics Data System (ADS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-06-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing -1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.
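
    As an illustrative sketch (assumed, not part of IMAGESEER itself), a GeoTIFF product of the kind described above can be loaded with the rasterio library and run through a naive brightness-threshold cloud mask; the file name, band index, and threshold are placeholders.

        import numpy as np
        import rasterio

        with rasterio.open("landsat5_scene.tif") as src:        # hypothetical GeoTIFF from the database
            band = src.read(1).astype(np.float32)               # first spectral band
            print("CRS:", src.crs, "size:", src.width, "x", src.height)

        cloud_mask = band > 0.6 * band.max()                    # crude threshold, for illustration only
        print("cloudy pixels:", int(cloud_mask.sum()))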

  11. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation and Research

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-01-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing-1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.

  12. Experimental demonstration of an OpenFlow based software-defined optical network employing packet, fixed and flexible DWDM grid technologies on an international multi-domain testbed.

    PubMed

    Channegowda, M; Nejabati, R; Rashidi Fard, M; Peng, S; Amaya, N; Zervas, G; Simeonidou, D; Vilalta, R; Casellas, R; Martínez, R; Muñoz, R; Liu, L; Tsuritani, T; Morita, I; Autenrieth, A; Elbers, J P; Kostecki, P; Kaczmarek, P

    2013-03-11

    Software defined networking (SDN) and flexible grid optical transport technology are two key technologies that allow network operators to customize their infrastructure based on application requirements, thereby minimizing the extra capital and operational costs required for hosting new applications. In this paper we report, for the first time, on the design, implementation and demonstration of a novel OpenFlow-based SDN unified control plane allowing seamless operation across heterogeneous state-of-the-art optical and packet transport domains. We verify and experimentally evaluate OpenFlow protocol extensions for flexible DWDM grid transport technology along with its integration with fixed DWDM grid and layer-2 packet switching. PMID:23482120
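
    For readers unfamiliar with the fixed versus flexible DWDM grids mentioned above, the short sketch below evaluates the ITU-T G.694.1 flexible-grid relations (nominal central frequency 193.1 THz + n x 6.25 GHz, slot width m x 12.5 GHz). It is a generic illustration of the grid arithmetic only and is not taken from the paper's OpenFlow extensions.

      # Illustrative ITU-T G.694.1 flexible-grid arithmetic (not from the paper):
      # nominal central frequency f = 193.1 THz + n * 6.25 GHz, slot width = m * 12.5 GHz.
      def flexgrid_channel(n: int, m: int):
          """Return (central frequency in THz, slot width in GHz) for grid indices n, m."""
          f_center_thz = 193.1 + n * 0.00625
          slot_width_ghz = m * 12.5
          return f_center_thz, slot_width_ghz

      # Example: a 37.5 GHz-wide slot centred at 193.125 THz.
      print(flexgrid_channel(n=4, m=3))   # -> (193.125, 37.5)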

  13. SEE-GRID eInfrastructure for Regional eScience

    NASA Astrophysics Data System (ADS)

    Prnjat, Ognjen; Balaz, Antun; Vudragovic, Dusan; Liabotis, Ioannis; Sener, Cevat; Marovic, Branko; Kozlovszky, Miklos; Neagu, Gabriel

    In the past 6 years, a number of targeted initiatives, funded by the European Commission via its information society and RTD programmes and Greek infrastructure development actions, have articulated successful regional development actions in South-East Europe that can be used as a role model for other international developments. The SEEREN (South-East European Research and Education Networking initiative) project, through its two phases, established the SEE segment of the pan-European GÉANT network and successfully connected the research and scientific communities in the region. Currently, the SEE-LIGHT project is working towards establishing a dark-fiber backbone that will interconnect most national Research and Education networks in the region. On the distributed computing and storage provisioning (i.e., Grid) plane, the SEE-GRID (South-East European GRID e-Infrastructure Development) project, similarly through its two phases, has established a strong human network in the area of scientific computing, has set up a powerful regional Grid infrastructure, and has attracted a number of applications from different fields and countries throughout South-East Europe. The current SEE-GRID-SCI project, ending in April 2010, empowers the regional user communities from the fields of meteorology, seismology and environmental protection in common use and sharing of the regional e-Infrastructure. Current technical initiatives in formulation are focusing on a set of coordinated actions in the area of HPC and application fields making use of HPC initiatives. Finally, the current SEERA-EI project brings together policy makers - programme managers from 10 countries in the region. The project aims to establish a communication platform between programme managers, pave the way towards a common e-Infrastructure strategy and vision, and implement concrete actions for common funding of electronic infrastructures on the regional level. The regional vision on establishing an e

  14. Students' epistemologies about experimental physics: Validating the Colorado Learning Attitudes about Science Survey for experimental physics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-06-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder and elsewhere, we developed the Colorado Learning Attitudes about Science Survey for experimental physics (E-CLASS). Previous work with this assessment has included establishing the accuracy and clarity of the instrument through student interviews and preliminary testing. Several years of data collection at multiple institutions has resulted in a growing national data set of student responses. Here, we report on results of the analysis of these data to investigate the statistical validity and reliability of the E-CLASS as a measure of students' epistemologies for a broad student population. We find that the E-CLASS demonstrates an acceptable level of both validity and reliability on measures of item and test discrimination, test-retest reliability, partial-sample reliability, internal consistency, concurrent validity, and convergent validity. We also examine students' responses using principal component analysis and find that, as expected, the E-CLASS does not exhibit strong factors (a.k.a. categories).
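
    Two of the reliability measures named above, internal consistency (commonly reported as Cronbach's alpha) and a principal component decomposition, can be computed from a response matrix with a few lines of numpy. The sketch below uses randomly generated Likert-style responses purely as a placeholder for the E-CLASS data set, which is not reproduced here.

      # Hedged illustration of two measures mentioned in the abstract: Cronbach's
      # alpha (internal consistency) and principal components of the item
      # correlation matrix. The response matrix here is synthetic placeholder data.
      import numpy as np

      def cronbach_alpha(responses):
          """responses: (n_students, n_items) array of scored item responses."""
          n_items = responses.shape[1]
          item_vars = responses.var(axis=0, ddof=1)
          total_var = responses.sum(axis=1).var(ddof=1)
          return n_items / (n_items - 1) * (1.0 - item_vars.sum() / total_var)

      def principal_components(responses):
          """Eigenvalues of the item correlation matrix, largest first."""
          corr = np.corrcoef(responses, rowvar=False)
          return np.linalg.eigvalsh(corr)[::-1]

      rng = np.random.default_rng(0)
      fake_scores = rng.integers(1, 6, size=(200, 30)).astype(float)  # 200 students, 30 items
      print("alpha:", round(cronbach_alpha(fake_scores), 3))
      print("leading eigenvalues:", np.round(principal_components(fake_scores)[:5], 2))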

  15. New Source Code: Spelman Women Transforming the Grid of Science and Technology

    NASA Astrophysics Data System (ADS)

    Okonkwo, Holly

    From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece to understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explore the narrative experiences of members of the Spelman campus community and immerse myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse of diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time, as subjectivities and the deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place and its institutional mission, and are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.

  16. Experimental Study of Homogeneous Isotropic Slowly-Decaying Turbulence in Giant Grid-Wind Tunnel Set Up

    NASA Astrophysics Data System (ADS)

    Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration

    2014-11-01

    We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow with very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely big so that, even with the large separation of scales, the smallest scales would be well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8 m diameter test section) was chosen as the limit of the biggest large scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly-decaying homogeneous isotropic turbulence in a large region of the test section, with minimal structural risk. An international team of researchers collected hot wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot tube anemometry, cold wire thermometry and high-speed particle tracking data of this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next 2 years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is formed by an international team of scientists to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.
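
    To give a feel for the scale separation quoted above, the rough sketch below applies the textbook homogeneous-isotropic-turbulence relations Re_L ~ 3 Re_lambda^2 / 20 and L/eta ~ Re_L^(3/4). The assumed integral length scale is a placeholder, not a measured ESWIRP value, so the resulting Kolmogorov scales are only order-of-magnitude estimates.

      # Rough, hedged estimate of the integral-to-Kolmogorov scale ratio using
      # textbook isotropic-turbulence relations; the integral scale below is an
      # assumed placeholder, not an ESWIRP measurement.
      def scale_ratio(re_lambda: float) -> float:
          re_L = 3.0 * re_lambda**2 / 20.0      # Re_L ~ 3 Re_lambda^2 / 20
          return re_L**0.75                     # L / eta ~ Re_L^(3/4)

      for re_lambda in (400, 800):
          ratio = scale_ratio(re_lambda)
          L_assumed_m = 3.0                     # placeholder integral scale of a few metres
          print(f"Re_lambda={re_lambda}: L/eta ~ {ratio:.0f}, "
                f"eta ~ {1e3 * L_assumed_m / ratio:.1f} mm for L = {L_assumed_m} m")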

  17. Grid oscillators

    NASA Technical Reports Server (NTRS)

    Popovic, Zorana B.; Kim, Moonil; Rutledge, David B.

    1988-01-01

    Loading a two-dimensional grid with active devices offers a means of combining the power of solid-state oscillators in the microwave and millimeter-wave range. The grid structure allows a large number of negative resistance devices to be combined. This approach is attractive because the active devices do not require an external locking signal, and the combining is done in free space. In addition, the loaded grid is a planar structure amenable to monolithic integration. Measurements on a 25-MESFET grid at 9.7 GHz show power-combining and frequency-locking without an external locking signal, with an ERP of 37 W. Experimental far-field patterns agree with theoretical results obtained using reciprocity.

  18. Data Grids: a new computational infrastructure for data-intensive science.

    PubMed

    Avery, Paul

    2002-06-15

    Twenty-first-century scientific and engineering enterprises are increasingly characterized by their geographic dispersion and their reliance on large data archives. These characteristics bring with them unique challenges. First, the increasing size and complexity of modern data collections require significant investments in information technologies to store, retrieve and analyse them. Second, the increased distribution of people and resources in these projects has made resource sharing and collaboration across significant geographic and organizational boundaries critical to their success. In this paper I explore how computing infrastructures based on Data Grids offer data-intensive enterprises a comprehensive, scalable framework for collaboration and resource sharing. A detailed example of a Data Grid framework is presented for a Large Hadron Collider experiment, where a hierarchical set of laboratory and university resources comprising petaflops of processing power and a multi-petabyte data archive must be efficiently used by a global collaboration. The experience gained with these new information systems, providing transparent managed access to massive distributed data collections, will be applicable to large-scale, data-intensive problems in a wide spectrum of scientific and engineering disciplines, and eventually in industry and commerce. Such systems will be needed in the coming decades as a central element of our information-based society. PMID:12804274

  19. Experimental Comparison of Inquiry and Direct Instruction in Science

    ERIC Educational Resources Information Center

    Cobern, William W.; Schuster, David; Adams, Betty; Applegate, Brooks; Skjold, Brandy; Undreiu, Adriana; Loving, Cathleen C.; Gobert, Janice D.

    2010-01-01

    There are continuing educational and political debates about "inquiry" versus "direct" teaching of science. Traditional science instruction has been largely direct but in the US, recent national and state science education standards advocate inquiry throughout K-12 education. While inquiry-based instruction has the advantage of modelling aspects…

  20. Experimental Comparison of Inquiry and Direct Instruction in Science

    ERIC Educational Resources Information Center

    Cobern, William; Schuster, David; Adams, Betty

    2010-01-01

    It is evident that "experientially-based" instruction and "active student engagement" are advantageous for effective science learning. However, "hands-on" and "minds-on" aspects can occur in both inquiry and direct science instruction, and convincing comparative evidence for the superiority of either mode remains rare. Thus, the pertinent question…

  1. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    SciTech Connect

    Jablonowski, Christiane

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  2. Vanguard: A New Science Mission For Experimental Astrobiology

    NASA Astrophysics Data System (ADS)

    Ellery, A.; Wynn-Williams, D.; Edwards, H.; Dickensheets, D.; Welch, C.; Curley, A.

    As an alternative to technically and financially problematic sample return missions, a rover-mounted laser Raman spectrometer sensitive to biomolecules and their mineral substrata is a promising option in the search for evidence of former life on Mars. We present a new remote in situ analysis package being designed for experimental astrobiology on terrestrial-type planetary surfaces. The science is based on the hypothesis that if life arose on Mars, the selective pressure of solar radiation would have led to the evolution of pigmented systems to harness the energy of sunlight and to protect cells from concurrent UV stress. Microbial communities would have therefore become stratified by the light gradient, and our remote system would penetrate the near-subsurface profile in a vertical transect of horizontal strata in ancient sediments (such as palaeolake beds). The system will include an extensive array of robotic support to translocate and deploy Raman spectrometer detectors beneath the surface of Mars. It will comprise a base station lander to support communications and a robotic micro-rover to permit well-separated triplicate profiles made by three ground-penetrating moles mounted in a vertical configuration. Each mole will deploy a tether carrying fibre optic cables coupling the Raman spectrometer onboard the rover and the side-scanning sensor head on the mole. The complete system has been named Vanguard, and it represents a close collaboration between a space robotics engineer (Ellery), an astrobiologist (Wynn-Williams), a molecular spectroscopist (Edwards), an opto-electronic technologist (Dickensheets), a spacecraft engineer (Welch) and a robotic vision specialist (Curley). The autonomy requirement for the Vanguard instrument requires that significant scientific competence is imparted to the instrument through an expert system to ensure that quick-look analysis is performed onboard in real-time as the mole penetrates beneath the surface. Onboard

  3. Experimental Investigation of the Behavior of Sub-Grid Scale Motions in Turbulent Shear Flow

    NASA Technical Reports Server (NTRS)

    Cantwell, Brian

    1992-01-01

    Experiments have been carried out on a vertical jet of helium issuing into a co-flow of air at a fixed exit velocity ratio of 2.0. At all the experimental conditions studied, the flow exhibits a strong self excited periodicity. The natural frequency behavior of the jet, the underlying fine-scale flow structure, and the transition to turbulence have been studied over a wide range of flow conditions. The experiments were conducted in a variable pressure facility which made it possible to vary the Reynolds number and Richardson number independently. A stroboscopic schlieren system was used for flow visualization and single-component Laser Doppler Anemometry was used to measure the axial component of velocity. The flow exhibits several interesting features. The presence of co-flow eliminates the random meandering typical of buoyant plumes in a quiescent environment and the periodicity of the helium jet under high Richardson number conditions is striking. Under these conditions transition to turbulence consists of a rapid but highly structured and repeatable breakdown and intermingling of jet and freestream fluid. At Ri = 1.6 the three-dimensional structure of the flow is seen to repeat from cycle to cycle. The point of transition moves closer to the jet exit as either the Reynolds number or the Richardson number increases. The wavelength of the longitudinal instability increases with Richardson number. At low Richardson numbers, the natural frequency scales on an inertial time scale. At high Richardson number the natural frequency scales on a buoyancy time scale. The transition from one flow regime to another occurs over a narrow range of Richardson numbers from 0.7 to 1. A buoyancy Strouhal number is used to correlate the high Richardson number frequency behavior.
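
    For context on the Richardson-number regimes discussed above, the sketch below evaluates one common definition of a jet Richardson number, Ri = g (rho_ambient - rho_jet) D / (rho_jet U^2), together with the corresponding inertial and buoyancy time scales. The numerical values are illustrative placeholders and do not reproduce the experimental conditions of the study.

      # Hedged illustration of one common jet Richardson number definition and the
      # associated inertial and buoyancy time scales. All values are placeholders.
      import math

      g = 9.81                        # m/s^2
      rho_air, rho_he = 1.2, 0.17     # kg/m^3, approximate ambient densities
      D, U = 0.05, 1.0                # placeholder jet diameter (m) and exit velocity (m/s)

      Ri = g * (rho_air - rho_he) * D / (rho_he * U**2)
      t_inertial = D / U
      t_buoyancy = math.sqrt(rho_he * D / (g * (rho_air - rho_he)))

      print(f"Ri = {Ri:.2f}, inertial time = {t_inertial:.3f} s, buoyancy time = {t_buoyancy:.3f} s")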

  4. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  5. The DOE SunShot Initiative: Science and Technology to enable Solar Electricity at Grid Parity

    NASA Astrophysics Data System (ADS)

    Ramesh, Ramamoorthy

    2012-02-01

    The SunShot Initiative's mission is to develop solar energy technologies through a collaborative national push to make solar Photovoltaic (PV) and Concentrated Solar Power (CSP) energy technologies cost-competitive with fossil-fuel-based energy by reducing the cost of solar energy systems by ~75 percent before 2020. Reducing the total installed cost for utility-scale solar electricity to roughly 6 cents per kilowatt hour ($1/Watt) without subsidies will result in rapid, large-scale adoption of solar electricity across the United States and the world. Achieving this goal will require significant reductions and technological innovations in all PV system components, namely modules, power electronics, and balance of systems (BOS), which includes all other components and costs required for a fully installed system including permitting and inspection costs. This investment will re-establish American technological and market leadership, improve the nation's energy security, strengthen U.S. economic competitiveness and catalyze domestic economic growth in the global clean energy race. SunShot is a cooperative program across DOE, involving the Office of Science, the Office of Energy Efficiency and Renewable Energy and ARPA-E.

  6. AGILE/GRID Science Alert Monitoring System: The Workflow and the Crab Flare Case

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Conforti, V.; Parmiggiani, N.

    2013-10-01

    During the first five years of the AGILE mission we have observed many gamma-ray transients of Galactic and extragalactic origin. A fast reaction to unexpected transient events is a crucial part of the AGILE monitoring program, because the follow-up of astrophysical transients is a key point for this space mission. We present the workflow and the software developed by the AGILE Team to perform the automatic analysis for the detection of gamma-ray transients. In addition, an App for iPhone will be released enabling the Team to access the monitoring system through mobile phones. In September 2010 the science alert monitoring system presented in this paper recorded a transient phenomenon from the Crab Nebula, generating an automated alert sent via email and SMS two hours after the end of an AGILE satellite orbit, i.e. two hours after the Crab flare itself: for this discovery AGILE won the 2012 Bruno Rossi prize. The alert system is designed for maximum speed, and in this case, as in many others, AGILE has demonstrated that the reaction speed of the monitoring system is crucial for the scientific return of the mission.
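
    The abstract above describes an automated detect-and-notify workflow. A generic sketch of such a loop (a significance test on binned counts followed by an e-mail alert) is given below; the significance formula, threshold, and addresses are placeholders and do not reproduce the AGILE Team's actual pipeline.

      # Generic detect-and-notify sketch inspired by the workflow described above.
      # The significance estimate, threshold, and e-mail details are placeholders;
      # this is not the AGILE pipeline.
      import math
      import smtplib
      from email.message import EmailMessage

      def excess_significance(counts_on: int, counts_bkg: float) -> float:
          """Crude excess significance: (counts - background) / sqrt(background)."""
          return (counts_on - counts_bkg) / math.sqrt(counts_bkg)

      def notify(subject: str, body: str) -> None:
          msg = EmailMessage()
          msg["From"] = "monitor@example.org"        # placeholder addresses
          msg["To"] = "duty-scientist@example.org"
          msg["Subject"] = subject
          msg.set_content(body)
          with smtplib.SMTP("localhost") as server:  # assumes a local mail relay
              server.send_message(msg)

      counts_on, counts_bkg, threshold = 180, 100.0, 5.0
      sigma = excess_significance(counts_on, counts_bkg)
      if sigma > threshold:
          notify("Transient candidate", f"Excess of {sigma:.1f} sigma in the last orbit.")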

  7. Early Adolescence: Using Consumer Science to Develop Experimental Techniques.

    ERIC Educational Resources Information Center

    Padilla, Michael

    1981-01-01

    Describes several consumer science activities useful for introducing process skills for the middle/junior high school student. Activities described include testing laundry detergent effectiveness for stain removal, comparison of quantities in fast foods, and various activities concerning tests of product claims. (DS)

  8. Learning Political Science with Prediction Markets: An Experimental Study

    ERIC Educational Resources Information Center

    Ellis, Cali Mortenson; Sami, Rahul

    2012-01-01

    Prediction markets are designed to aggregate the information of many individuals to forecast future events. These markets provide participants with an incentive to seek information and a forum for interaction, making markets a promising tool to motivate student learning. We carried out a quasi-experiment in an introductory political science class…

  9. Experimenter Confirmation Bias and the Correction of Science Misconceptions

    ERIC Educational Resources Information Center

    Allen, Michael; Coole, Hilary

    2012-01-01

    This paper describes a randomised educational experiment (n = 47) that examined two different teaching methods and compared their effectiveness at correcting one science misconception using a sample of trainee primary school teachers. The treatment was designed to promote engagement with the scientific concept by eliciting emotional responses from…

  10. Geometric and Applied Optics, Science (Experimental): 5318.04.

    ERIC Educational Resources Information Center

    Sanderson, Robert C.

    This unit of instruction presents a laboratory-oriented course which relates the sources and behaviors of light to man's control and uses of light. Successful completion of Algebra I and Plane Geometry is strongly recommended as indicators of success. The course is recommended if the student plans further studies in science, optical technology, or…

  11. Grid Architecture 2

    SciTech Connect

    Taft, Jeffrey D.

    2016-01-01

    The report describes work done on Grid Architecture under the auspices of the Department of Energy Office of Electricity Delivery and Energy Reliability in 2015. As described in the first Grid Architecture report, the primary purpose of this work is to provide stakeholder insight about grid issues so as to enable superior decision making on their part. Doing this requires the creation of various work products, including oft-times complex diagrams, analyses, and explanations. This report provides architectural insights into several important grid topics and also describes work done to advance the science of Grid Architecture.

  12. FermiGrid

    SciTech Connect

    Yocum, D.R.; Berman, E.; Canal, P.; Chadwick, K.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Sharma, N.; Timm, S.; /Fermilab

    2007-05-01

    As one of the founding members of the Open Science Grid Consortium (OSG), Fermilab enables coherent access to its production resources through the Grid infrastructure system called FermiGrid. This system successfully provides for centrally managed grid services, opportunistic resource access, development of OSG Interfaces for Fermilab, and an interface to the Fermilab dCache system. FermiGrid supports virtual organizations (VOs) including high energy physics experiments (USCMS, MINOS, D0, CDF, ILC), astrophysics experiments (SDSS, Auger, DES), biology experiments (GADU, Nanohub) and educational activities.

  13. Accounting for reciprocal host-microbiome interactions in experimental science.

    PubMed

    Stappenbeck, Thaddeus S; Virgin, Herbert W

    2016-06-01

    Mammals are defined by their metagenome, a combination of host and microbiome genes. This knowledge presents opportunities to further basic biology with translation to human diseases. However, the now-documented influence of the metagenome on experimental results and the reproducibility of in vivo mammalian models present new challenges. Here we provide the scientific basis for calling on all investigators, editors and funding agencies to embrace changes that will enhance reproducible and interpretable experiments by accounting for metagenomic effects. Implementation of new reporting and experimental design principles will improve experimental work, speed discovery and translation, and properly use substantial investments in biomedical research. PMID:27279212

  14. An Experimental Clinical Science Fellowship in Cardiovascular-Renal

    ERIC Educational Resources Information Center

    Chasis, Herbert; Campbell, Charles I.

    1974-01-01

    Describes the New York Heart Association's experimental program aimed at evaluating a method of developing physicians disciplined by research and competent both as teachers and in the care of patients (clinical scientists). (Author)

  15. Experimental evaluation of fiber-interspaced antiscatter grids for large patient imaging with digital x-ray systems

    NASA Astrophysics Data System (ADS)

    Fetterly, Kenneth A.; Schueler, Beth A.

    2007-08-01

    Radiographic imaging of large patients is compromised by x-ray scatter. Optimization of digital x-ray imaging systems used for projection radiography requires the use of the best possible antiscatter grid. The performance of antiscatter grids used in conjunction with digital x-ray imaging systems can be characterized through measurement of the signal-to-noise ratio (SNR) improvement factor (K_SNR). The SNR improvement factor of several linear, focused antiscatter grids was determined from measurements of the fundamental primary and scatter transmission fractions of the grids, as well as the inherent scatter-to-primary ratio (SPR) of the x-ray beam and scatter phantom. The inherent SPR and scatter transmission fractions were measured using a graduated lead beam stop method. The K_SNR of eight grids with line rates (N) in the range 40 to 80 cm⁻¹ and ratios (r) in the range 8:1 to 15:1 was measured. All of the grids had fiber interspace material and carbon-fiber covers. The scatter phantom used was Solid Water® with thickness 10 to 50 cm, and a 30 × 30 cm² field of view was used. All measurements were acquired using a 104 kVp x-ray beam. The SPR of the non-grid imaging condition ranged from 2.55 for the 10 cm phantom to 25.9 for the 50 cm phantom. The scatter transmission fractions ranged from a low of 0.083 for the N50 r15 grid to a high of 0.22 for the N40 r8 grid, and the primary transmission fractions ranged from a low of 0.69 for the N80 r15 grid to 0.76 for the N40 r8 grid. The SNR improvement factors ranged from 1.2 for the 10 cm phantom and N40 r8 grid to 2.09 for the 50 cm phantom and the best performing N50 r15, N44 r15 and N40 r14 grids.
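
    Under the usual quantum-noise-limited assumption, the SNR improvement factor can be written K_SNR = Tp * sqrt((1 + SPR) / (Tp + Ts * SPR)), where Tp and Ts are the primary and scatter transmission fractions. The sketch below evaluates this textbook relation with transmission values quoted in the abstract; the formula and the assumed Tp for the N50 r15 grid are stated here as assumptions rather than taken from the paper.

      # Standard quantum-noise-limited SNR improvement factor for an antiscatter
      # grid (assumed textbook form, not copied from the paper):
      #   K_SNR = Tp * sqrt((1 + SPR) / (Tp + Ts * SPR))
      import math

      def k_snr(tp: float, ts: float, spr: float) -> float:
          return tp * math.sqrt((1.0 + spr) / (tp + ts * spr))

      # Transmission values quoted in the abstract (approximate pairings).
      print(round(k_snr(tp=0.76, ts=0.22, spr=2.55), 2))   # ~1.25, compare with the reported 1.2
      print(round(k_snr(tp=0.70, ts=0.083, spr=25.9), 2))  # ~2.15, compare with the reported 2.09 (Tp assumed)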

  16. Space Science Education: An Experimental Study. Report of the Study Commission on Space Science Education.

    ERIC Educational Resources Information Center

    Vick, Raymond

    The implications of space science terminology and concepts for elementary science teaching are explored. Twenty-two concepts were identified which elementary and junior high school teachers were invited to introduce in their teaching. Booklets explaining the concepts were distributed together with report forms for teacher feedback. The numbers of…

  17. Life Science Research and Drug Discovery at the Turn of the 21st Century: The Experience of SwissBioGrid

    PubMed Central

    den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph

    2009-01-01

    Background It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling “in-silico” the processes observed “in-vitro.” The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. Results SwissBioGrid was established to provide computational support to two pilot projects: one for proteomics data analysis, and the other for high-throughput molecular docking (“virtual screening”) to find new drugs for neglected diseases (specifically, for dengue fever). The proteomics project was an example of a large-scale data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the

  18. An Experimental Evaluation of the Effects of ESCP and General Science on the Development of Interdisciplinary Science Concepts by Ninth Grade Students.

    ERIC Educational Resources Information Center

    Coleman, Esther Montague

    This study was an experimental evaluation of achievement in understanding interdisciplinary science concepts by ninth grade students enrolled in two different integrated science courses. The experimental group used "Investigating the Earth", the textbook/laboratory program, developed by the Earth Science Curriculum Project (ESCP) staff. The…

  19. The Beliefs and Behaviors of Pupils in an Experimental School: The Science Lab.

    ERIC Educational Resources Information Center

    Lancy, David F.

    This booklet, the second in a series, reports on the results of a year-long research project conducted in an experimental school associated with the Learning Research and Development Center, University of Pittsburgh. Specifically, this is a report of findings pertaining to one major setting in the experimental school, the science lab. The science…

  20. Is Physicality an Important Aspect of Learning through Science Experimentation among Kindergarten Students?

    ERIC Educational Resources Information Center

    Zacharia, Zacharias C.; Loizou, Eleni; Papaevripidou, Marios

    2012-01-01

    The purpose of this study was to investigate whether physicality (actual and active touch of concrete material), as such, is a necessity for science experimentation learning at the kindergarten level. We compared the effects of student experimentation with Physical Manipulatives (PM) and Virtual Manipulatives (VM) on kindergarten students'…

  1. Physics design of a 100 keV acceleration grid system for the diagnostic neutral beam for international tokamak experimental reactor

    SciTech Connect

    Singh, M. J.; De Esch, H. P. L.

    2010-01-15

    This paper describes the physics design of a 100 keV, 60 A H⁻ accelerator for the diagnostic neutral beam (DNB) for the international tokamak experimental reactor (ITER). The accelerator is a three-grid system comprising 1280 apertures, grouped in 16 groups with 80 apertures per beam group. Several computer codes have been used to optimize the design, which follows the same philosophy as the ITER Design Description Document (DDD) 5.3 and the 1 MeV heating and current drive beam line [R. Hemsworth, H. Decamps, J. Graceffa, B. Schunke, M. Tanaka, M. Dremel, A. Tanga, H. P. L. De Esch, F. Geli, J. Milnes, T. Inoue, D. Marcuzzi, P. Sonato, and P. Zaccaria, Nucl. Fusion 49, 045006 (2009)]. The aperture shapes, intergrid distances, and the extractor voltage have been optimized to minimize the beamlet divergence. To suppress the acceleration of coextracted electrons, permanent magnets have been incorporated in the extraction grid, downstream of the cooling water channels. The electron power loads on the extractor and the grounded grids have been calculated assuming 1 coextracted electron per ion. The beamlet divergence is calculated to be 4 mrad. At present the design for the filter field of the RF based ion sources for ITER is not fixed; therefore, a few configurations have been considered. Their effect on the transmission of the electrons and beams through the accelerator has been studied. The OPERA-3D code has been used to estimate the aperture offset steering constant of the grounded grid and the extraction grid, the space charge interaction between the beamlets and the kerb design required to compensate for this interaction. All beamlets in the DNB must be focused to a single point in the duct, 20.665 m from the grounded grid, and the required geometrical aimings and aperture offsets have been calculated.
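
    As a purely geometric illustration of the beamlet aiming described above, the sketch below computes the deflection angle needed for an off-axis aperture to hit a focal point 20.665 m downstream, and the aperture offset implied by an assumed steering constant. The steering constant and aperture positions are hypothetical; this does not reproduce the OPERA-3D modelling.

      # Geometric aiming illustration only: angle needed for a beamlet from an
      # off-axis aperture to reach the common focal point 20.665 m downstream,
      # and the aperture offset implied by an assumed steering constant.
      import math

      FOCAL_DISTANCE_M = 20.665          # from the abstract
      STEERING_MRAD_PER_MM = 1.0         # hypothetical aperture-offset steering constant

      def aiming(offset_from_axis_m: float):
          angle_mrad = 1e3 * math.atan(offset_from_axis_m / FOCAL_DISTANCE_M)
          aperture_offset_mm = angle_mrad / STEERING_MRAD_PER_MM
          return angle_mrad, aperture_offset_mm

      for r in (0.1, 0.3, 0.6):          # placeholder transverse aperture positions (m)
          angle, off = aiming(r)
          print(f"r = {r:.1f} m: aim angle ~ {angle:.1f} mrad, offset ~ {off:.2f} mm")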

  2. Physics design of a 100 keV acceleration grid system for the diagnostic neutral beam for international tokamak experimental reactor.

    PubMed

    Singh, M J; De Esch, H P L

    2010-01-01

    This paper describes the physics design of a 100 keV, 60 A H(-) accelerator for the diagnostic neutral beam (DNB) for the international tokamak experimental reactor (ITER). The accelerator is a three-grid system comprising 1280 apertures, grouped in 16 groups with 80 apertures per beam group. Several computer codes have been used to optimize the design, which follows the same philosophy as the ITER Design Description Document (DDD) 5.3 and the 1 MeV heating and current drive beam line [R. Hemsworth, H. Decamps, J. Graceffa, B. Schunke, M. Tanaka, M. Dremel, A. Tanga, H. P. L. De Esch, F. Geli, J. Milnes, T. Inoue, D. Marcuzzi, P. Sonato, and P. Zaccaria, Nucl. Fusion 49, 045006 (2009)]. The aperture shapes, intergrid distances, and the extractor voltage have been optimized to minimize the beamlet divergence. To suppress the acceleration of coextracted electrons, permanent magnets have been incorporated in the extraction grid, downstream of the cooling water channels. The electron power loads on the extractor and the grounded grids have been calculated assuming 1 coextracted electron per ion. The beamlet divergence is calculated to be 4 mrad. At present the design for the filter field of the RF based ion sources for ITER is not fixed; therefore, a few configurations have been considered. Their effect on the transmission of the electrons and beams through the accelerator has been studied. The OPERA-3D code has been used to estimate the aperture offset steering constant of the grounded grid and the extraction grid, the space charge interaction between the beamlets and the kerb design required to compensate for this interaction. All beamlets in the DNB must be focused to a single point in the duct, 20.665 m from the grounded grid, and the required geometrical aimings and aperture offsets have been calculated. PMID:20113091

  3. An 11-year global gridded aerosol optical thickness reanalysis (v1.0) for atmospheric and climate sciences

    NASA Astrophysics Data System (ADS)

    Lynch, Peng; Reid, Jeffrey S.; Westphal, Douglas L.; Zhang, Jianglong; Hogan, Timothy F.; Hyer, Edward J.; Curtis, Cynthia A.; Hegg, Dean A.; Shi, Yingxi; Campbell, James R.; Rubin, Juli I.; Sessions, Walter R.; Turk, F. Joseph; Walker, Annette L.

    2016-04-01

    While stand alone satellite and model aerosol products see wide utilization, there is a significant need in numerous atmospheric and climate applications for a fused product on a regular grid. Aerosol data assimilation is an operational reality at numerous centers, and like meteorological reanalyses, aerosol reanalyses will see significant use in the near future. Here we present a standardized 2003-2013 global 1 × 1° and 6-hourly modal aerosol optical thickness (AOT) reanalysis product. This data set can be applied to basic and applied Earth system science studies of significant aerosol events, aerosol impacts on numerical weather prediction, and electro-optical propagation and sensor performance, among other uses. This paper describes the science of how to develop and score an aerosol reanalysis product. This reanalysis utilizes a modified Navy Aerosol Analysis and Prediction System (NAAPS) at its core and assimilates quality controlled retrievals of AOT from the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua and the Multi-angle Imaging SpectroRadiometer (MISR) on Terra. The aerosol source functions, including dust and smoke, were regionally tuned to obtain the best match between the model fine- and coarse-mode AOTs and the Aerosol Robotic Network (AERONET) AOTs. Other model processes, including deposition, were tuned to minimize the AOT difference between the model and satellite AOT. Aerosol wet deposition in the tropics is driven with satellite-retrieved precipitation, rather than the model field. The final reanalyzed fine- and coarse-mode AOT at 550 nm is shown to have good agreement with AERONET observations, with global mean root mean square error around 0.1 for both fine- and coarse-mode AOTs. This paper includes a discussion of issues particular to aerosol reanalyses that make them distinct from standard meteorological reanalyses, considerations for extending such a reanalysis outside of the NASA A-Train era, and examples of how
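
    The verification statistic quoted above (a global mean root mean square error of about 0.1 in AOT against AERONET) is straightforward to reproduce for any matched set of retrievals. A minimal sketch with synthetic placeholder arrays follows; it is not the actual reanalysis or AERONET data.

      # Minimal sketch of the AOT verification statistic quoted above: root mean
      # square error of reanalysis AOT against matched AERONET AOT. The arrays
      # here are synthetic placeholders.
      import numpy as np

      def rmse(model_aot: np.ndarray, aeronet_aot: np.ndarray) -> float:
          return float(np.sqrt(np.mean((model_aot - aeronet_aot) ** 2)))

      rng = np.random.default_rng(1)
      aeronet = rng.gamma(shape=2.0, scale=0.1, size=1000)         # plausible AOT magnitudes
      model = aeronet + rng.normal(0.0, 0.08, size=aeronet.size)   # add ~0.08 error
      print("fine-mode RMSE (synthetic):", round(rmse(model, aeronet), 3))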

  4. Experimenter Confirmation Bias and the Correction of Science Misconceptions

    NASA Astrophysics Data System (ADS)

    Allen, Michael; Coole, Hilary

    2012-06-01

    This paper describes a randomised educational experiment ( n = 47) that examined two different teaching methods and compared their effectiveness at correcting one science misconception using a sample of trainee primary school teachers. The treatment was designed to promote engagement with the scientific concept by eliciting emotional responses from learners that were triggered by their own confirmation biases. The treatment group showed superior learning gains to control at post-test immediately after the lesson, although benefits had dissipated after 6 weeks. Findings are discussed with reference to the conceptual change paradigm and to the importance of feeling emotion during a learning experience, having implications for the teaching of pedagogies to adults that have been previously shown to be successful with children.

  5. Hybrid Grid Generation Using NW Grid

    SciTech Connect

    Jones-Oliveira, Janet B.; Oliveira, Joseph S.; Trease, Lynn L.; Trease, Harold E.; B.K. Soni, J. Hauser, J.F. Thompson, P.R. Eiseman

    2000-09-01

    We describe the development and use of a hybrid n-dimensional grid generation system called NWGRID. The Applied Mathematics Group at Pacific Northwest National Laboratory (PNNL) is developing this tool to support the Laboratory's computational science efforts in chemistry, biology, engineering and environmental (subsurface and atmospheric) modeling. NWGRID is a grid generation system designed for multi-scale, multi-material, multi-physics, time-dependent, 3-D, hybrid grids that are either statically adapted or evolved in time. NWGRID's capabilities include static and dynamic grids, hybrid grids, managing colliding surfaces, and grid optimization [using reconnections, smoothing, and adaptive mesh refinement (AMR) algorithms]. NWGRID's data structure can manage an arbitrary number of grid objects, each with an arbitrary number of grid attributes. NWGRID uses surface geometry to build volumes by using combinations of Boolean operators and order relations. Point distributions can be input directly, generated using ray-shooting techniques, or defined point-by-point. Connectivity matrices are then generated automatically for all variations of hybrid grids.
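
    The volume-from-surfaces approach mentioned above (combining surface geometry with Boolean operators) can be illustrated generically with implicit signed-distance functions, where union, intersection, and difference reduce to pointwise min/max operations. The sketch below is a generic constructive-solid-geometry illustration and is not NWGRID code or its API.

      # Generic constructive-solid-geometry illustration (not NWGRID code): volumes
      # are described by signed-distance functions, and Boolean combinations reduce
      # to pointwise min/max, which a mesher could then sample to place grid points.
      import numpy as np

      def sphere(center, radius):
          c = np.asarray(center, dtype=float)
          return lambda p: np.linalg.norm(np.asarray(p, dtype=float) - c) - radius

      def union(f, g):        return lambda p: min(f(p), g(p))
      def intersection(f, g): return lambda p: max(f(p), g(p))
      def difference(f, g):   return lambda p: max(f(p), -g(p))

      solid = difference(sphere((0, 0, 0), 1.0), sphere((0.8, 0, 0), 0.6))

      # Negative values are inside the solid; a grid generator would keep such points.
      for point in [(0, 0, 0), (0.9, 0, 0), (2, 0, 0)]:
          print(point, "inside" if solid(point) < 0 else "outside")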

  6. Nonlethal suppression: from basic science to operationally relevant experimentation

    NASA Astrophysics Data System (ADS)

    Servatius, Richard J.; Beck, Kevin D.

    2006-05-01

    Use of force justification, second nature to law enforcement personnel, is increasingly considered by military personnel, especially in military operations on urban terrain (MOUT) scenarios. In these situations, military and civilian law enforcement objectives are similar: exert control over individuals and groups with minimum force. Although the list of potential devices and systems grows, empirical demonstrations of effectiveness are lacking. Here, a position is presented regarding approaches to experimental analysis of nonlethal (a.k.a., less-than-lethal and less lethal) technologies and solutions. Appreciation of the concepts of suppression and its attendant behavioral variables will advance the development of nonlethal weapons and systems (NLW&S).

  7. Students' Epistemologies about Experimental Physics: Validating the Colorado Learning Attitudes about Science Survey for Experimental Physics

    ERIC Educational Resources Information Center

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-01-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder…

  8. The Art and Science of Experimentation in Quantum Physics

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2010-05-01

    Taking its historical point of departure in Heisenberg's work, this article offers a view of quantum mechanics as, arguably, the first truly experimental and truly mathematical physical theory, that is, a theory concerned with experimenting with nature and mathematics alike. It is truly experimental because it is not, as in classical physics, merely the independent behavior of the system considered, in other words, what happens in any event, that we track, but what kind of experiments we perform that defines what happens. By the same token, the theory is also truly mathematical because, at least in the interpretation adopted here, its mathematical formalism does not stand in the service of a mathematical description of (quantum) physical processes in space and time in the way the formalism of classical physics does, but is only used to predict the outcomes of relevant experiments. It also follows that quantum theories experiment more freely with mathematics itself, since we invent predictive mathematical schemes, rather than proceed by refining mathematically our phenomenal representations of nature, which process constrains us in classical mechanics.

  9. A New Virtual and Remote Experimental Environment for Teaching and Learning Science

    NASA Astrophysics Data System (ADS)

    Lustigova, Zdena; Lustig, Frantisek

    This paper describes how a scientifically exact and problem-solving-oriented remote and virtual science experimental environment might help to build a new strategy for science education. The main features are: remote observation and control of real-world phenomena, their processing and evaluation, verification of hypotheses combined with the development of critical thinking, supported by sophisticated relevant information search, classification and storing tools and a collaborative environment supporting argumentative writing and teamwork, public presentations and defense of achieved results, all either in real presence, in telepresence or in a combination of both. Only then can real understanding of generalized science laws and their consequences be developed. This science learning and teaching environment (called ROL - Remote and Open Laboratory) has been developed and used by Charles University in Prague since 1996, offered to science students in both formal and informal learning, and also to science teachers within their professional development studies since 2003.

  10. Social Science and Neuroscience beyond Interdisciplinarity: Experimental Entanglements

    PubMed Central

    Callard, Felicity

    2015-01-01

    This article is an account of the dynamics of interaction across the social sciences and neurosciences. Against an arid rhetoric of ‘interdisciplinarity’, it calls for a more expansive imaginary of what experiment – as practice and ethos – might offer in this space. Arguing that opportunities for collaboration between social scientists and neuroscientists need to be taken seriously, the article situates itself against existing conceptualizations of these dynamics, grouping them under three rubrics: ‘critique’, ‘ebullience’ and ‘interaction’. Despite their differences, each insists on a distinction between sociocultural and neurobiological knowledge, or does not show how a more entangled field might be realized. The article links this absence to the ‘regime of the inter-’, an ethic of interdisciplinarity that guides interaction between disciplines on the understanding of their pre-existing separateness. The argument of the paper is thus twofold: (1) that, contra the ‘regime of the inter-’, it is no longer practicable to maintain a hygienic separation between sociocultural webs and neurobiological architecture; (2) that the cognitive neuroscientific experiment, as a space of epistemological and ontological excess, offers an opportunity to researchers, from all disciplines, to explore and register this realization. PMID:25972621

  11. Considerations for Life Science experimentation on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Souza, K. A.; Davies, P.; Rossberg Walker, K.

    1992-01-01

    The conduct of Life Science experiments aboard the Shuttle Spacelab presents unaccustomed challenges to scientists. Not only is one confronted with the challenge of conducting an experiment in the unique microgravity environment of an orbiting spacecraft, but there are also the challenges of conducting experiments remotely, using equipment, techniques, chemicals, and materials that may differ from those standardly used in one's own laboratory. Then there is the question of "controls." How does one study the effects of altered gravitational fields on biological systems and control for other variables like vibration, acceleration, noise, temperature, humidity, and the logistics of specimen transport? Typically, the scientist new to space research has neither considered all of these potential problems nor has the data at hand with which to tackle the problems. This paper will explore some of these issues and provide pertinent data from recent Space Shuttle flights that will assist the new as well as the experienced scientist in dealing with the challenges of conducting research under spaceflight conditions.

  12. Considerations for Life Science experimentation on the Space Shuttle.

    PubMed

    Souza, K A; Davies, P; Rossberg Walker, K

    1992-10-01

    The conduct of Life Science experiments aboard the Shuttle Spacelab presents unaccustomed challenges to scientists. Not only is one confronted with the challenge of conducting an experiment in the unique microgravity environment of an orbiting spacecraft, but there are also the challenges of conducting experiments remotely, using equipment, techniques, chemicals, and materials that may differ from those standardly used in one's own laboratory. Then there is the question of "controls." How does one study the effects of altered gravitational fields on biological systems and control for other variables like vibration, acceleration, noise, temperature, humidity, and the logistics of specimen transport? Typically, the scientist new to space research has neither considered all of these potential problems nor has the data at hand with which to tackle the problems. This paper will explore some of these issues and provide pertinent data from recent Space Shuttle flights that will assist the new as well as the experienced scientist in dealing with the challenges of conducting research under spaceflight conditions. PMID:11537654

  13. Science and society: different bioethical approaches towards animal experimentation.

    PubMed

    Brom, Frans W A

    2002-01-01

    respect their integrity. By weighing these prima facie duties, the moral problem of animal experimentation exists in finding which duty actually has to be considered as the decisive duty. It will be argued that these three views, even though they will all justify animal experimentation to some extent, will do so in practice under different conditions. Many current conflicts regarding the use of animals for research may be better understood in light of the conflict between the three bioethical perspectives provided by these views. PMID:12098014

  14. Challenges facing production grids

    SciTech Connect

    Pordes, Ruth; /Fermilab

    2007-06-01

    Today's global communities of users expect quality of service from distributed Grid systems equivalent to that of their local data centers. This must be coupled to ubiquitous access to the ensemble of processing and storage resources across multiple Grid infrastructures. We are still facing significant challenges in meeting these expectations, especially in the underlying security, a sustainable and successful economic model, and smoothing the boundaries between administrative and technical domains. Using the Open Science Grid as an example, I examine the status and challenges of Grids operating in production today.

  15. The Earth System Grid Federation (ESGF): Climate Science Infrastructure for Large-scale Data Management and Dissemination

    NASA Astrophysics Data System (ADS)

    Williams, D. N.

    2015-12-01

    Progress in understanding and predicting climate change requires advanced tools to securely store, manage, access, process, analyze, and visualize enormous and distributed data sets. Only then can climate researchers understand the effects of climate change across all scales and use this information to inform policy decisions. With the advent of major international climate modeling intercomparisons, a need emerged within the climate-change research community to develop efficient, community-based tools to obtain relevant meteorological and other observational data, develop custom computational models, and export analysis tools for climate-change simulations. While many nascent efforts to fill these gaps appeared, they were not integrated and therefore did not benefit from collaborative development. Sharing huge data sets was difficult, and the lack of data standards prevented the merger of output data from different modeling groups. Thus began one of the largest-ever collaborative data efforts in climate science, resulting in the Earth System Grid Federation (ESGF), which is now used to disseminate model, observational, and reanalysis data for research assessed by the Intergovernmental Panel on Climate Change (IPCC). Today, ESGF is an open-source petabyte-level data storage and dissemination operational code-base that manages secure resources essential for climate change study. It is designed to remain robust even as data volumes grow exponentially. The internationally distributed, peer-to-peer ESGF "data cloud" archive represents the culmination of an effort that began in the late 1990s. ESGF portals are gateways to scientific data collections hosted at sites around the globe that allow the user to register and potentially access the entire ESGF network of data and services. The growing international interest in ESGF development efforts has attracted many others who want to make their data more widely available and easy to use. For example, the World Climate

  16. "Exploratory experimentation" as a probe into the relation between historiography and philosophy of science.

    PubMed

    Schickore, Jutta

    2016-02-01

    This essay utilizes the concept "exploratory experimentation" as a probe into the relation between historiography and philosophy of science. The essay traces the emergence of the historiographical concept "exploratory experimentation" in the late 1990s. The reconstruction of the early discussions about exploratory experimentation shows that the introduction of the concept had unintended consequences: Initially designed to debunk philosophical ideas about theory testing, the concept "exploratory experimentation" quickly exposed the poverty of our conceptual tools for the analysis of experimental practice. Looking back at a number of detailed analyses of experimental research, we can now appreciate that the concept of exploratory experimentation is too vague and too elusive to fill the desideratum whose existence it revealed. PMID:26774065

  17. Spline for blade grids design

    NASA Astrophysics Data System (ADS)

    Korshunov, Andrei; Shershnev, Vladimir; Korshunova, Ksenia

    2015-08-01

    Methods of designing blade grids for power machines, such as equal-thickness shapes built on a middle-line arc, or methods based on a target stress distribution, were invented a long time ago, are well described and are still in use. Science and technology have moved far beyond that time, and the laboriousness of experimental research, which involved unique equipment, requires the development of new, robust and flexible design methods that determine the optimal geometry of the flow passage. This investigation provides a simple and universal method of designing blades which, in comparison to currently used methods, requires significantly less input data but still provides accurate results. The described method is purely analytical for both the concave and convex sides of the blade, and therefore allows the curve behavior to be described at any point along the flow path. Compared with the blade grid designs currently used in industry, geometric parameters of the designs constructed with this method show a maximum deviation below 0.4%.
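
    A generic flavour of spline-based blade-profile construction is sketched below (this is illustrative only; the paper's specific analytical method is not given in the abstract): a cubic spline is fitted through a few camber-line control points and its derivative gives the local blade angle at any station along the flow path. The control-point values are placeholders.

      # Generic spline-based camber-line sketch (illustrative only; not the paper's
      # method): fit a cubic spline through control points and evaluate the local
      # blade angle from its derivative anywhere along the axial coordinate.
      import numpy as np
      from scipy.interpolate import CubicSpline

      x_ctrl = np.array([0.0, 0.25, 0.5, 0.75, 1.0])     # axial stations (placeholder)
      y_ctrl = np.array([0.0, 0.08, 0.12, 0.10, 0.02])   # camber-line heights (placeholder)

      camber = CubicSpline(x_ctrl, y_ctrl)
      slope = camber.derivative()

      for x in (0.0, 0.5, 1.0):
          angle_deg = np.degrees(np.arctan(slope(x)))
          print(f"x = {x:.2f}: y = {camber(x):.3f}, blade angle = {angle_deg:.1f} deg")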

  18. Analysis and experimental verification of new power flow control for grid-connected inverter with LCL filter in microgrid.

    PubMed

    Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang

    2014-01-01

    A microgrid is an effective way to integrate distributed energy resources into utility networks. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method. PMID:24672304
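
    As background to the power flow regulation discussed above, the classic relations for real and reactive power transferred through a mainly inductive coupling reactance, P = V1 V2 sin(delta)/X and Q = (V1 V2 cos(delta) - V2^2)/X, are evaluated in the sketch below. The voltage and reactance values are placeholders, and the paper's small-signal model itself is not reproduced here.

      # Textbook power transfer through an inductive coupling reactance X between
      # an inverter terminal voltage V1 (angle delta) and the grid voltage V2:
      #   P = V1*V2*sin(delta)/X,  Q = (V1*V2*cos(delta) - V2**2)/X
      # Placeholder per-phase RMS values; this is background, not the paper's model.
      import math

      def power_flow(v1: float, v2: float, delta_rad: float, x_ohm: float):
          p = v1 * v2 * math.sin(delta_rad) / x_ohm
          q = (v1 * v2 * math.cos(delta_rad) - v2**2) / x_ohm
          return p, q

      v1, v2, x = 230.0, 225.0, 1.5      # placeholder per-phase RMS volts and ohms
      for delta_deg in (2, 5, 10):
          p, q = power_flow(v1, v2, math.radians(delta_deg), x)
          print(f"delta = {delta_deg:2d} deg: P = {p:7.0f} W, Q = {q:7.0f} var")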

  19. Opening Possibilities in Experimental Science and Its History: Critical Explorations with Pendulums and Singing Tubes

    ERIC Educational Resources Information Center

    Cavicchi, Elizabeth

    2008-01-01

    A teacher and a college student explore experimental science and its history by reading historical texts, and responding with replications and experiments of their own. A curriculum of ever-widening possibilities evolves in their ongoing interactions with each other, history, and such materials as pendulums, flame, and resonant singing tubes.…

  20. Views of the STS-5 Science Press briefing with Student Experimenters

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Views of the STS-5 Science Press briefing with Student Experimenters. Photos include Michelle Issel of Wallingford, Connecticut, showing her student experiment dealing with the formation of crystals in a weightless environment (37862); Aaron Gillette of Winter Haven, Florida, displaying his student experiment dealing with the growth of Porifera in zero gravity (37863).

  1. The Role of the Scientific Discovery Narrative in Middle School Science Education: An Experimental Study

    ERIC Educational Resources Information Center

    Arya, Diana J.; Maul, Andrew

    2012-01-01

    In an experimental study (N = 209), the authors compared the effects of exposure to typical middle-school written science content when presented in the context of the scientific discovery narrative and when presented in a more traditional nonnarrative format on 7th and 8th grade students in the United States. The development of texts was…

  2. Factors Influencing Students' Choice(s) of Experimental Science Subjects within the International Baccalaureate Diploma Programme

    ERIC Educational Resources Information Center

    James, Kieran

    2007-01-01

    This article outlines a study conducted in Finland and Portugal into the reasons why International Baccalaureate (IB) Diploma Programme (DP) students choose particular Experimental Science (Group 4) subjects. Its findings suggest that interest, enjoyment, university course and career requirements have most influence on students' choices.…

  3. General Science, Ninth Grade: Theme III and Theme IV. Student Laboratory Manual. Experimental.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.

    This document is the student laboratory manual that was designed to accompany some of the experimental activities found in the teacher's guide to this general science course for ninth graders. It contains laboratory worksheets for lessons on such topics as: (1) soil; (2) hazardous waste; (3) wildlife refuges; (4) the water cycle; (5) water…

  4. Mathematics and Experimental Sciences in the FRG-Upper Secondary Schools. Occasional Paper 40.

    ERIC Educational Resources Information Center

    Steiner, Hans-Georg

    The mathematics and experimental science courses in the programs of the upper secondary school in the Federal Republic of Germany (FRG) are discussed. The paper addresses: (1) the two "secondary levels" within the FRG school system, indicating that the Secondary I-Level (SI) comprises grades 5 through 9 or 10 while the Secondary II-Level (SII)…

  5. The resisted rise of randomisation in experimental design: British agricultural science, c.1910-1930.

    PubMed

    Berry, Dominic

    2015-09-01

    The most conspicuous form of agricultural experiment is the field trial, and within the history of such trials, the arrival of the randomised control trial (RCT) is considered revolutionary. Originating with R.A. Fisher within British agricultural science in the 1920s and 1930s, the RCT has since become one of the most prodigiously used experimental techniques throughout the natural and social sciences. Philosophers of science have already scrutinised the epistemological uniqueness of RCTs, undermining their status as the 'gold standard' in experimental design. The present paper introduces a historical case study from the origins of the RCT, uncovering the initially cool reception given to this method by agricultural scientists at the University of Cambridge and the (Cambridge based) National Institute of Agricultural Botany. Rather than giving further attention to the RCT, the paper focuses instead on a competitor method-the half-drill strip-which both predated the RCT and remained in wide use for at least a decade beyond the latter's arrival. In telling this history, John Pickstone's Ways of Knowing is adopted, as the most flexible and productive way to write the history of science, particularly when sciences and scientists have to work across a number of different kinds of place. It is shown that those who resisted the RCT did so in order to preserve epistemic and social goals that randomisation would have otherwise run a tractor through. PMID:26205200

  6. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASN's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: The scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user if the tool designer: The computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  7. A Blend of Romanism and Germanism: Experimental Science Instruction in Belgian State Secondary Education, 1880-1914

    ERIC Educational Resources Information Center

    Onghena, Sofie

    2013-01-01

    A case study of secondary experimental science instruction in Belgium demonstrates the importance of cross-national communication in the study of science education. Belgian secondary science education in the years 1880-1914 had a clear internationalist dimension. French and German influences turn out to have been essential, stimulated by the fact…

  8. Probeware in 8th Grade Science: A Quasi-Experimental Study on Attitude and Achievement

    NASA Astrophysics Data System (ADS)

    Moyer, John F., III

    The use of probeware in the delivery of science instruction has become quite widespread over the past few decades. The current emphasis on Science, Technology, Engineering, and Mathematics (STEM) education, especially in the case of underrepresented populations, seems to have accelerated the inclusion of probeware into curriculum. This quasi-experimental study sought to examine the effects of a direct replacement of traditional science tools with computer-based probeware on student achievement and student attitude toward science. Data analysis was conducted for large comparison groups and then for target STEM groups of African-American, low socioeconomic status, and female. Student achievement was measured by the Energy Concept Inventory and student attitude was measured by the Attitude Toward Science Inventory. The results showed that probeware did not have a significant effect on student achievement for almost all comparison groups. Analysis of student attitude toward science revealed that the use of probeware significantly affected overall student attitude as well as student attitude in several disaggregated subscales of attitude. These findings hold for both the comparison groups and the target STEM groups. Limitations of the study and suggestions for future research are presented.

  9. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems); (3

  10. Experimental Characterization of a Grid-Loss Event on a 2.5-MW Dynamometer Using Advanced Operational Modal Analysis: Preprint

    SciTech Connect

    Helsen, J.; Weijtjens, W.; Guo, Y.; Keller, J.; McNiff, B.; Devriendt, C.; Guillaume, P.

    2015-02-01

    This paper experimentally investigates a worst-case grid-loss event conducted on the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) drivetrain mounted on the 2.5-MW NREL dynamic nacelle test rig. The GRC drivetrain has a directly grid-coupled, fixed-speed asynchronous generator. The main goal is to assess the dynamic content driving this particular event in the high-speed stage of the GRC gearbox. In addition to external accelerometers, high-frequency sampled measurements from strain gauges were used to assess torque fluctuations and bending moments at both the nacelle main shaft and the gearbox high-speed shaft (HSS) through the entire duration of the event. Modal analysis was conducted using a polyreference Least Squares Complex Frequency-domain (pLSCF) modal identification estimator. The event driving the torsional resonance was identified. Moreover, the pLSCF estimator identified the main drivetrain resonances based on a combination of acceleration and strain measurements. Without external action during the grid-loss event, a mode shape characterized by counter-phase rotation of the rotor and generator rotor, determined by the drivetrain flexibility and rotor inertias, was the main driver of the event. This behavior resulted in significant torque oscillations with large-amplitude negative torque periods. Based on tooth strain measurements of the HSS pinion, this work showed that at each zero-crossing, the teeth lost contact and came into contact with the backside flank. In addition, dynamic nontorque loads between the gearbox and generator at the HSS played an important role, as indicated by strain gauge measurements.

  11. Grid Work

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Pointwise Inc.'s Gridgen software is a system for the generation of 3D (three-dimensional) multiple-block, structured grids. Gridgen is a visually-oriented, graphics-based interactive code used to decompose a 3D domain into blocks, distribute grid points on curves, initialize and refine grid points on surfaces and initialize volume grid points. Gridgen is available to U.S. citizens and American-owned companies by license.

  12. A quasi-experimental quantitative study of the effect of IB on science performance

    NASA Astrophysics Data System (ADS)

    Healer, Margaret Irene

    The purpose of this quasi-experimental quantitative research study was to investigate the effect of participation in the International Baccalaureate (IB) program on science performance. The findings of the 2x3 mixed ANOVA and eta-squared analysis indicated a significant difference in science CSAP mean scores between the treatment group (IB students, n = 50) and the control group (non-IB students, n = 50) at the 5th through 10th grade levels. The analysis of data concluded that although scores declined between 5th, 8th, and 10th grades for both IB and non-IB students, a statistical difference between the two groups was indicated at each level in the area of science performance as measured by the CSAP assessment. Educational leaders can use the findings of this study to maximize student science achievement. Further research is recommended: a mixed-methods study to determine the effectiveness of participation in the IB Program, and a longitudinal study, with a larger sample of IB and non-IB students, of the specific pedagogical strategies used in relation to science performance.
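
    As a rough companion to the reported 2x3 design, the sketch below runs a plain two-way ANOVA on synthetic scores and derives eta squared for each effect as SS_effect / SS_total. It treats grade level as a simple between-subjects factor rather than the repeated measure used in the study, and all data are synthetic; it illustrates the effect-size calculation only, not the study's actual analysis.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      # Synthetic example only: 2 groups (IB / non-IB) x 3 grade levels, 50 per cell.
      rng = np.random.default_rng(0)
      rows = []
      for group, shift in [("IB", 2.0), ("nonIB", 0.0)]:
          for grade in ["5th", "8th", "10th"]:
              for score in rng.normal(loc=50.0 + shift, scale=5.0, size=50):
                  rows.append({"group": group, "grade": grade, "score": score})
      df = pd.DataFrame(rows)

      # Two-way ANOVA (between-subjects approximation of the mixed design).
      model = smf.ols("score ~ C(group) * C(grade)", data=df).fit()
      table = anova_lm(model, typ=2)

      # Eta squared per row: effect sum of squares over total sum of squares
      # (the value shown for the Residual row is not an effect size).
      table["eta_sq"] = table["sum_sq"] / table["sum_sq"].sum()
      print(table)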

  13. Data Grid Implementations

    SciTech Connect

    Moore, Reagan W.; Studham, Ronald S.; Rajasekar, Arcot; Watson, Chip; Stockinger, Heinz; Kunszt, Peter; Charlie Catlett and Ian Foster

    2002-02-27

    Data grids link distributed, heterogeneous storage resources into a coherent data management system. From a user perspective, the data grid provides a uniform name space across the underlying storage systems, while supporting retrieval and storage of files. In the high energy physics community, at least six data grids have been implemented for the storage and distribution of experimental data. Data grids are also being used to support projects as diverse as digital libraries (National Library of Medicine Visible Embryo project), federation of multiple astronomy sky surveys (NSF National Virtual Observatory project), and integration of distributed data sets (Long Term Ecological Reserve). Data grids also form the core interoperability mechanisms for creating persistent archives, in which data collections are migrated to new technologies over time. The ability to provide a uniform name space across multiple administration domains is becoming a critical component of national-scale, collaborative projects.

  14. Experimental performance evaluation of software defined networking (SDN) based data communication networks for large scale flexi-grid optical networks.

    PubMed

    Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo

    2014-04-21

    Software defined networking (SDN) has become the focus in the current information and communication technology area because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. The optical transport network is also regarded as an important application scenario for SDN, in which SDN is adopted as the enabling technology of the data communication network (DCN) instead of generalized multi-protocol label switching (GMPLS). However, the practical performance of SDN-based DCN for large-scale optical networks, which is very important for technology selection in future optical network deployment, has not been evaluated up to now. In this paper we have built a large-scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of SDN-based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters, including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time, have been demonstrated under various network environments, such as with different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof for future network deployment. PMID:24787842

  15. MAGNETIC GRID

    DOEpatents

    Post, R.F.

    1960-08-01

    An electronic grid is designed employing magnetic forces for controlling the passage of charged particles. The grid is particularly applicable to use in gas-filled tubes such as ignitrons, thyratrons, etc., since the magnetic grid action is impartial to the polarity of the charged particles and, accordingly, the sheath effects encountered with electrostatic grids are not present. The grid comprises a conductor having sections spaced apart and extending in substantially opposite directions in the same plane, the ends of the conductor being adapted for connection to a current source.

  16. Highly transparent low resistance Ga doped ZnO/Cu grid double layers prepared at room temperature

    NASA Astrophysics Data System (ADS)

    Jang, Cholho; Zhizhen, Ye; Jianguo, Lü

    2015-12-01

    Ga-doped ZnO (GZO)/Cu grid double-layer structures were prepared at room temperature (RT). We have studied the electrical and optical characteristics of the GZO/Cu grid double layer as a function of the Cu grid spacing distance. The optical transmittance and sheet resistance of the GZO/Cu grid double layer are higher than those of the GZO/Cu film double layer regardless of the Cu grid spacing distance, and both increase as the Cu grid spacing distance increases. The calculated values of transmittance and sheet resistance for the GZO/Cu grid double layer closely follow the trend of the experimentally observed ones. For the GZO/Cu grid double layer with a Cu grid spacing distance of 1 mm, the highest figure of merit (ΦTC = 6.19 × 10^-3 Ω^-1) was obtained. In this case, the transmittance, resistivity and filling factor (FF) of the GZO/Cu grid double layer are 83.74%, 1.10 × 10^-4 Ω·cm and 0.173, respectively. Project supported by the Key Project of the National Natural Science Foundation of China (No. 91333203), the Program for Innovative Research Team in University of Ministry of Education of China (No. IRT13037), the National Natural Science Foundation of China (No. 51172204), and the Zhejiang Provincial Department of Science and Technology of China (No. 2010R50020).
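
    The quoted figure of merit is consistent with the widely used Haacke definition, ΦTC = T^10 / R_sheet. Assuming that definition (the record itself does not state which one is used), the sketch below back-calculates the sheet resistance implied by the reported transmittance and ΦTC, plus a rough layer thickness from the reported resistivity under the simple relation R_sheet = resistivity / thickness.

      # Haacke figure of merit, phi_TC = T**10 / R_sheet (assumed definition).
      T = 0.8374            # reported optical transmittance
      phi_tc = 6.19e-3      # reported figure of merit, 1/ohm

      r_sheet = T ** 10 / phi_tc
      print(f"Implied sheet resistance: {r_sheet:.1f} ohm/sq")

      # Rough thickness implied by the reported resistivity, treating the
      # stack as a single uniform layer (a simplification).
      rho = 1.10e-4                        # reported resistivity, ohm*cm
      thickness_nm = rho / r_sheet * 1e7   # cm -> nm
      print(f"Implied effective thickness: {thickness_nm:.0f} nm")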

  18. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    SciTech Connect

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  19. Can Jurors Recognize Missing Control Groups, Confounds, and Experimenter Bias in Psychological Science?

    PubMed Central

    McAuliff, Bradley D.; Kovera, Margaret Bull; Nunez, Gabriel

    2010-01-01

    This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed. PMID:18587635

  20. A Blend of Romanism and Germanism: Experimental Science Instruction in Belgian State Secondary Education, 1880-1914

    NASA Astrophysics Data System (ADS)

    Onghena, Sofie

    2013-04-01

    A case study of secondary experimental science instruction in Belgium demonstrates the importance of cross-national communication in the study of science education. Belgian secondary science education in the years 1880-1914 had a clear internationalist dimension. French and German influences turn out to have been essential, stimulated by the fact that Belgium, as a result of its geographical position, considered itself as the centre of scientific relations between France and Germany, and as actually strengthened by its linguistic and cultural dualism in this regard. This pursuit of internationalist nationalism also affected the configuration of chemistry and physics as experimental courses at Belgian Royal State Schools, although the years preceding WWI are usually characterized as a period of rising nationalism in science, with countries such as Germany and France as prominent actors. To what extent did France and Germany influence Belgian debates on science education, science teachers' training, the use of textbooks, and the instalment of school laboratories and teaching collections?

  1. The Grid

    SciTech Connect

    White, Vicky

    2003-05-21

    By now almost everyone has heard of 'The Grid', or 'Grid Computing' as it should more properly be described. There are frequent articles in both the popular and scientific press talking about 'The Grid' or about some specific Grid project. The Run II experiments, US-CMS, BTeV, the Sloan Digital Sky Survey and the Lattice QCD folks are all incorporating aspects of Grid Computing in their plans, and the Fermilab Computing Division is supporting and encouraging these efforts. Why are we doing this and what does it have to do with running a physics experiment or getting scientific results? I will explore some of these questions and try to give an overview, not so much of the technical aspects of Grid Computing, but rather of what the phenomenon means for our field.

  2. Current Grid operation and future role of the Grid

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    2012-12-01

    Grid-like technologies and approaches have become an integral part of HEP experiments. Some other scientific communities also use similar technologies for data-intensive computations. The distinct feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the more burden falls on operations, and the other way around. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. The reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from the situation where Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world where the Grid will ultimately reach the status of a generic public computing and storage service provider, and permanent national and international Grid infrastructures will be established. How far we will be able to advance along this path depends on us. If no standardisation and convergence efforts take place

  3. The DESY Grid Centre

    NASA Astrophysics Data System (ADS)

    Haupt, A.; Gellrich, A.; Kemp, Y.; Leffhalm, K.; Ozerov, D.; Wegner, P.

    2012-12-01

    DESY is one of the world-wide leading centres for research with particle accelerators, synchrotron light and astroparticles. DESY participates in the LHC as a Tier-2 centre, supports ongoing analyses of HERA data, is a leading partner for the ILC, and runs the National Analysis Facility (NAF) for LHC and ILC in the framework of the Helmholtz Alliance "Physics at the Terascale". For research with synchrotron light, major new facilities are operated and built (FLASH, PETRA-III, and XFEL). DESY furthermore acts as a data Tier-1 centre for the neutrino detector IceCube. Established within the EGI project, DESY operates a grid infrastructure which supports a number of virtual organizations (VOs), including ATLAS, CMS, and LHCb. Furthermore, DESY hosts several HEP and non-HEP VOs, such as the HERA experiments and ILC, as well as photon science communities. Support for the new astroparticle physics VOs IceCube and CTA is currently being set up. As the global structure of the grid offers huge resources which are well suited to batch-like computing, DESY has set up the National Analysis Facility (NAF), which complements the grid to allow German HEP users to perform efficient data analysis. The grid infrastructure and the NAF use the same physics data, which are distributed via the grid. We call the conjunction of grid and NAF the DESY Grid Centre. In this contribution to CHEP2012 we discuss in depth the conceptual and operational aspects of our multi-VO and multi-community Grid Centre and present the system setup. We focus in particular on the interplay of grid and NAF and present operational experience.

  4. The Frequency of Hands-On Experimentation and Student Attitudes toward Science: A Statistically Significant Relation (2005-51-Ornstein)

    ERIC Educational Resources Information Center

    Ornstein, Avi

    2006-01-01

    Attitudinal data tested hypotheses that students have more positive attitudes toward science when teachers regularly emphasize hands-on laboratory activities and when students more frequently experience higher levels of experimentation or inquiry. The first predicted that students would have more positive attitudes toward science in classrooms…

  5. Evidence of Experimental Bias in the Life Sciences: Why We Need Blind Data Recording

    PubMed Central

    Lanfear, Robert; Jennions, Michael D.

    2015-01-01

    Observer bias and other “experimenter effects” occur when researchers’ expectations influence study outcome. These biases are strongest when researchers expect a particular result, are measuring subjective variables, and have an incentive to produce data that confirm predictions. To minimize bias, it is good practice to work “blind,” meaning that experimenters are unaware of the identity or treatment group of their subjects while conducting research. Here, using text mining and a literature review, we find evidence that blind protocols are uncommon in the life sciences and that nonblind studies tend to report higher effect sizes and more significant p-values. We discuss methods to minimize bias and urge researchers, editors, and peer reviewers to keep blind protocols in mind. PMID:26154287

  6. Evidence of Experimental Bias in the Life Sciences: Why We Need Blind Data Recording.

    PubMed

    Holman, Luke; Head, Megan L; Lanfear, Robert; Jennions, Michael D

    2015-07-01

    Observer bias and other "experimenter effects" occur when researchers' expectations influence study outcome. These biases are strongest when researchers expect a particular result, are measuring subjective variables, and have an incentive to produce data that confirm predictions. To minimize bias, it is good practice to work "blind," meaning that experimenters are unaware of the identity or treatment group of their subjects while conducting research. Here, using text mining and a literature review, we find evidence that blind protocols are uncommon in the life sciences and that nonblind studies tend to report higher effect sizes and more significant p-values. We discuss methods to minimize bias and urge researchers, editors, and peer reviewers to keep blind protocols in mind. PMID:26154287

  7. Implementing Production Grids

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Ziobarth, John (Technical Monitor)

    2002-01-01

    We have presented the essence of the experience gained in building two production Grids, and provided some of the global context for this work. As the reader might imagine, there were a lot of false starts, refinements to the approaches and to the software, and several substantial integration projects (SRB and Condor integrated with Globus) to get where we are today. However, the point of this paper is to try and make it substantially easier for others to get to the point where the Information Power Grid (IPG) and the DOE Science Grids are today. This is what is needed in order to move us toward the vision of a common cyber infrastructure for science. The author would also like to remind readers that this paper primarily represents the actual experiences that resulted from specific architectural and software choices during the design and implementation of these two Grids. The choices made were dictated by the criteria laid out in section 1. There is a lot more Grid software available today than there was four years ago, and several of these packages are being integrated into IPG and the DOE Grids. However, the foundation choices of Globus, SRB, and Condor would not be significantly different today than they were four years ago. Nonetheless, if the GGF is successful in its work - and we have every reason to believe that it will be - then in a few years we will see that the 28 functions provided by these packages will be defined in terms of protocols and APIs, and there will be several robust implementations available for each of the basic components, especially the Grid Common Services. The impact of the emerging Web Grid Services work is not yet clear. It will likely have a substantial impact on building higher level services; however, it is the opinion of the author that this will in no way obviate the need for the Grid Common Services. These are the foundation of Grids, and the focus of almost all of the operational and persistent infrastructure aspects of Grids.

  8. Fibonacci Grids

    NASA Technical Reports Server (NTRS)

    Swinbank, Richard; Purser, James

    2006-01-01

    Recent years have seen a resurgence of interest in a variety of non-standard computational grids for global numerical prediction. The motivation has been to reduce problems associated with the converging meridians and the polar singularities of conventional regular latitude-longitude grids. A further impetus has come from the adoption of massively parallel computers, for which it is necessary to distribute work equitably across the processors; this is more practicable for some non-standard grids. Desirable attributes of a grid for high-order spatial finite differencing are: (i) geometrical regularity; (ii) a homogeneous and approximately isotropic spatial resolution; (iii) a low proportion of the grid points where the numerical procedures require special customization (such as near coordinate singularities or grid edges). One family of grid arrangements which, to our knowledge, has never before been applied to numerical weather prediction, but which appears to offer several technical advantages, are what we shall refer to as "Fibonacci grids". They can be thought of as mathematically ideal generalizations of the patterns occurring naturally in the spiral arrangements of seeds and fruit found in sunflower heads and pineapples (to give two of the many botanical examples). These grids possess virtually uniform and highly isotropic resolution, with an equal area for each grid point. There are only two compact singular regions on a sphere that require customized numerics. We demonstrate the practicality of these grids in shallow water simulations, and discuss the prospects for efficiently using these frameworks in three-dimensional semi-implicit and semi-Lagrangian weather prediction or climate models.
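
    One common way to realize such a near-uniform arrangement on the sphere is the golden-angle Fibonacci lattice sketched below; this is a generic construction given for illustration and is not necessarily the exact variant used by the authors.

      import numpy as np

      def fibonacci_sphere(n_points=1000):
          """Near-uniform points on the unit sphere via the golden-angle spiral.

          Generic Fibonacci-lattice construction; the cited work may use a
          different variant.
          """
          golden_angle = np.pi * (3.0 - np.sqrt(5.0))   # about 2.39996 rad
          k = np.arange(n_points)
          z = 1.0 - 2.0 * (k + 0.5) / n_points          # evenly spaced in z
          r = np.sqrt(1.0 - z ** 2)
          theta = golden_angle * k
          return np.column_stack((r * np.cos(theta), r * np.sin(theta), z))

      pts = fibonacci_sphere(2000)
      print(pts.shape, np.allclose(np.linalg.norm(pts, axis=1), 1.0))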

  9. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them. PMID:25311906

  10. Spaceflight Operations Services Grid (SOSG) Prototype Implementation and Feasibility Study

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.; Lisotta, Anthony J.; Redman, Sandra

    2004-01-01

    The Science Operations Services Grid is focusing on building a prototype grid-based environment that incorporates existing and new spaceflight services to enable current and future NASA programs with cost savings and new and evolvable methods to conduct science in a distributed environment. The Science Operations Services Grid (SOSG) will provide a distributed environment for widely disparate organizations to conduct their systems and processes in a more efficient and cost effective manner. These organizations include those that: 1) engage in space-based science and operations, 2) develop space-based systems and processes, and 3) conduct scientific research, bringing together disparate scientific disciplines like geology and oceanography to create new information. In addition, educational outreach will be significantly enhanced by providing to schools the same tools used by NASA, with the ability of the schools to actively participate on many levels in the science generated by NASA from space and on the ground. The services range from voice, video and telemetry processing and display to data mining, high level processing and visualization tools, all accessible from a single portal. In this environment, users would not require high end systems or processes at their home locations to use these services. Also, the user would need to know minimal details about the applications in order to utilize the services. In addition, security at all levels is an underlying goal of the project. The Science Operations Services Grid will focus on four tools that are currently used by the ISS Payload community, along with nine more that are new to the community. Under the prototype, four Grid virtual organizations (VOs) will be developed to represent four types of users. They are a Payload (experimenters) VO, a Flight Controllers VO, an Engineering and Science Collaborators VO and an Education and Public Outreach VO. The User-based services will be implemented to replicate the operational voice

  11. LDCM Grid Prototype (LGP)

    NASA Technical Reports Server (NTRS)

    Weinstein, Beth; Lubelczyk, Jeff

    2006-01-01

    The LGP successfully demonstrated that grid technology could be used to create a collaboration among research scientists, their science development machines, and distributed data to create a science production system in a nationally distributed environment. Grid technology provides a low cost and effective method of enabling production of science products by the science community. To demonstrate this, the LGP partnered with NASA GSFC scientists and used their existing science algorithms to generate virtual Landsat-like data products using distributed data resources. The LGP created 48 output composite scenes with 4 input scenes each, for a total of 192 scenes processed in parallel. The demonstration took 12 hours, which beat the requirement by almost 50 percent, well within the LDCM requirement to process 250 scenes per day. The LGP project also showed the successful use of workflow tools to automate the processing. Investing in this technology has led to funding for a ROSES ACCESS proposal. The proposal intends to enable an expert science user to produce products from a number of similar distributed instrument data sets using the Land Cover Change Community-based Processing and Analysis System (LC-ComPS) Toolbox. The LC-ComPS Toolbox is a collection of science algorithms that enable the generation of data with ground resolution on the order of Landsat-class instruments.

  12. Solar Fridges and Personal Power Grids: How Berkeley Lab is Fighting Global Poverty (LBNL Science at the Theater)

    SciTech Connect

    Buluswar, Shashi; Gadgil, Ashok

    2012-11-26

    At this November 26, 2012 Science at the Theater event, scientists discussed the recently launched LBNL Institute for Globally Transformative Technologies (LIGTT) at Berkeley Lab. LIGTT has an ambitious mandate to discover and develop breakthrough technologies for combating global poverty. It was created with the belief that solutions will require more advanced R&D and a deep understanding of market needs in the developing world. Berkeley Lab's Ashok Gadgil, Shashi Buluswar and seven other LIGTT scientists discussed what it takes to develop technologies that will impact millions of people. These include: 1) Fuel-efficient stoves for clean cooking: Our scientists are improving the Berkeley Darfur Stove, a high-efficiency stove used by over 20,000 households in Darfur; 2) The ultra-low-energy refrigerator: A lightweight, low-energy refrigerator that can be mounted on a bike so crops can survive the trip from the farm to the market; 3) The solar OB suitcase: A low-cost package of the five most critical biomedical devices for maternal and neonatal clinics; 4) UV Waterworks: A device for quickly, safely and inexpensively disinfecting water of harmful microorganisms.

  13. Using R in experimental design with BIBD: An application in health sciences

    NASA Astrophysics Data System (ADS)

    Oliveira, Teresa A.; Francisco, Carla; Oliveira, Amílcar; Ferreira, Agostinho

    2016-06-01

    Considering the implementation of an experimental design, in any field, the experimenter must pay particular attention and look for the best strategies in the following steps: planning the design selection, conducting the experiments, collecting the observed data, and proceeding to analysis and interpretation of results. The focus is on providing both a deep understanding of the problem under research and a powerful experimental process at a reduced cost. Mainly thanks to the possibility of separating sources of variation, the importance of experimental design in the health sciences has been strongly recommended for a long time. Particular attention has been devoted to block designs, and more precisely to balanced incomplete block designs; in this case the relevance stems from the fact that these designs allow simultaneous testing of a number of treatments larger than the block size. Our example refers to a possible study of inter-rater reliability in Parkinson's disease, using the UPDRS (Unified Parkinson's Disease Rating Scale) to test whether there are significant differences between the specialists who evaluate the patients' performances. Statistical studies of this disease are described, for example, in Richards et al. (1994), where the authors investigate the inter-rater reliability of the Unified Parkinson's Disease Rating Scale motor examination. We consider a simulation of a practical situation in which patients were observed by different specialists and the UPDRS assessment of the impact of Parkinson's disease on patients was recorded. Assigning treatments to the subjects following a particular BIBD(9,24,8,3,2) structure, we illustrate that BIB designs can be used as a powerful tool to solve emerging problems in this area. Since a structure with repeated blocks allows some block contrasts to have minimum variance (see Oliveira et al., 2006), the design with cardinality 12 was selected for the example. R software was used for the computations.
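
    The quoted design parameters can be checked against the standard BIBD identities b*k = v*r and lambda*(v-1) = r*(k-1). The study itself used R; the quick check below is written in Python for illustration and reads BIBD(9, 24, 8, 3, 2) as (v, b, r, k, lambda), which is an assumption about the authors' parameter ordering.

      def check_bibd(v, b, r, k, lam):
          """Verify the two necessary parameter identities of a BIBD."""
          return {
              "b*k == v*r": b * k == v * r,
              "lambda*(v-1) == r*(k-1)": lam * (v - 1) == r * (k - 1),
          }

      # Parameters quoted in the record, read as (v, b, r, k, lambda).
      print(check_bibd(v=9, b=24, r=8, k=3, lam=2))   # both identities hold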

  14. Experimental Design and Bioinformatics Analysis for the Application of Metagenomics in Environmental Sciences and Biotechnology.

    PubMed

    Ju, Feng; Zhang, Tong

    2015-11-01

    Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights of microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation. PMID:26451629

  15. Striped ratio grids for scatter estimation

    NASA Astrophysics Data System (ADS)

    Hsieh, Scott S.; Wang, Adam S.; Star-Lack, Josh

    2016-03-01

    Striped ratio grids are a new concept for scatter management in cone-beam CT. These grids are a modification of conventional anti-scatter grids and consist of stripes which alternate between high grid ratio and low grid ratio. Such a grid is related to existing hardware concepts for scatter estimation such as blocker-based methods or primary modulation, but rather than modulating the primary, the striped ratio grid modulates the scatter. The transitions between adjacent stripes can be used to estimate and subtract the remaining scatter. However, these transitions could be contaminated by variation in the primary radiation. We describe a simple nonlinear image processing algorithm to estimate scatter, and proceed to validate the striped ratio grid on experimental data of a pelvic phantom. The striped ratio grid is emulated by combining data from two scans with different grids. Preliminary results are encouraging and show a significant reduction of scatter artifact.
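
    The record outlines, but does not specify, the scatter-estimation step. As a hedged sketch of the general idea only (not the authors' nonlinear algorithm), assume a pixel under a high-ratio stripe and its neighbour under a low-ratio stripe see the same primary P and scatter S, that primary transmission is (for simplicity) equal under both stripes, and that the scatter transmission factors differ; each pixel pair then yields a small linear system.

      import numpy as np

      def estimate_primary_scatter(i_high, i_low, tp=0.72, ts_high=0.10, ts_low=0.35):
          """Solve i_high = tp*P + ts_high*S and i_low = tp*P + ts_low*S per pair.

          The transmission factors are hypothetical calibration constants; the
          cited paper's algorithm is not reproduced here.
          """
          s = (i_low - i_high) / (ts_low - ts_high)
          p = (i_high - ts_high * s) / tp
          return p, np.clip(s, 0.0, None)

      # Hypothetical neighbouring readings across one stripe transition.
      p_est, s_est = estimate_primary_scatter(np.array([410.0]), np.array([480.0]))
      print(p_est, s_est)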

  16. FermiGrid - experience and future plans

    SciTech Connect

    Chadwick, K.; Berman, E.; Canal, P.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Timm, S.; Yocum, D.; /Fermilab

    2007-09-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. In order to better serve this community, Fermilab has placed its production computer resources in a Campus Grid infrastructure called 'FermiGrid'. The FermiGrid infrastructure allows the large experiments at Fermilab to have priority access to their own resources, enables sharing of these resources in an opportunistic fashion, and movement of work (jobs, data) between the Campus Grid and National Grids such as Open Science Grid and the WLCG. FermiGrid resources support multiple Virtual Organizations (VOs), including VOs from the Open Science Grid (OSG), EGEE and the Worldwide LHC Computing Grid Collaboration (WLCG). Fermilab also makes leading contributions to the Open Science Grid in the areas of accounting, batch computing, grid security, job management, resource selection, site infrastructure, storage management, and VO services. Through the FermiGrid interfaces, authenticated and authorized VOs and individuals may access our core grid services, the 10,000+ Fermilab resident CPUs, near-petabyte (including CMS) online disk pools and the multi-petabyte Fermilab Mass Storage System. These core grid services include a site wide Globus gatekeeper, VO management services for several VOs, Fermilab site authorization services, grid user mapping services, as well as job accounting and monitoring, resource selection and data movement services. Access to these services is via standard and well-supported grid interfaces. We will report on the user experience of using the FermiGrid campus infrastructure interfaced to a national cyberinfrastructure--the successes and the problems.

  17. Smart Grid Integration Laboratory

    SciTech Connect

    Troxell, Wade

    2011-12-22

    The initial federal funding for the Colorado State University Smart Grid Integration Laboratory is through a Congressionally Directed Project (CDP), DE-OE0000070 Smart Grid Integration Laboratory. The original program requested funding in three one-year increments for staff acquisition, curriculum development, and instrumentation, all of which will benefit the Laboratory. This report focuses on the initial phase of staff acquisition, which was directed and administered by DOE NETL/West Virginia under Project Officer Tom George. Using this CDP funding, we have developed the leadership and intellectual capacity for the SGIC. This was accomplished by investing in (hiring) a core team of Smart Grid systems engineering faculty focused on education, research, and innovation of a secure and smart grid infrastructure. The Smart Grid Integration Laboratory will be housed with the separately funded Integrid Laboratory as part of CSU's overall Smart Grid Integration Center (SGIC). The period of performance of this grant was 10/1/2009 to 9/30/2011, which included one no-cost extension due to time delays in faculty hiring. The Smart Grid Integration Laboratory's focus is to build foundations to help graduate and undergraduate students acquire systems engineering knowledge; conduct innovative research; and team externally with smart grid organizations. The results of the separately funded Smart Grid Workforce Education Workshop (May 2009), sponsored by the City of Fort Collins, Northern Colorado Clean Energy Cluster, Colorado State University Continuing Education, Spirae, and Siemens, have been used to guide the hiring of faculty, the program curriculum and the education plan. This project develops faculty leaders with the intellectual capacity to inspire students to become leaders that substantially contribute to the development and maintenance of Smart Grid infrastructure through topics such as: (1) Distributed energy systems modeling and control; (2) Energy and power conversion; (3) Simulation of

  18. NASA's Participation in the National Computational Grid

    NASA Technical Reports Server (NTRS)

    Feiereisen, William J.; Zornetzer, Steve F. (Technical Monitor)

    1998-01-01

    Over the last several years it has become evident that the character of NASA's supercomputing needs has changed. One of the major missions of the agency is to support the design and manufacture of aero- and space-vehicles with technologies that will significantly reduce their cost. It is becoming clear that improvements in the process of aerospace design and manufacturing will require a high performance information infrastructure that allows geographically dispersed teams to draw upon resources that are broader than traditional supercomputing. A computational grid draws together our information resources into one system. We can foresee the time when a Grid will allow engineers and scientists to use the tools of supercomputers, databases and on-line experimental devices in a virtual environment to collaborate with distant colleagues. The concept of a computational grid has been spoken of for many years, but several events in recent times are conspiring to allow us to actually build one. In late 1997 the National Science Foundation initiated the Partnerships for Advanced Computational Infrastructure (PACI), which is built around the idea of distributed high performance computing. The Alliance, led by the National Computational Science Alliance (NCSA), and the National Partnership for Advanced Computational Infrastructure (NPACI), led by the San Diego Supercomputer Center, have been instrumental in drawing together the "Grid Community" to identify the technology bottlenecks and propose a research agenda to address them. During the same period NASA has begun to reformulate parts of two major high performance computing research programs to concentrate on distributed high performance computing and has banded together with the PACI centers to address the research agenda in common.

  19. Changes in Critical Thinking Skills Following a Course on Science and Pseudoscience: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    McLean, Carmen P.; Miller, Nathan A.

    2010-01-01

    We assessed changes in paranormal beliefs and general critical thinking skills among students (n = 23) enrolled in an experimental course designed to teach distinguishing science from pseudoscience and a comparison group of students (n = 30) in an advanced research methods course. On average, both courses were successful in reducing paranormal…

  20. Heritage Education: Exploring the Conceptions of Teachers and Administrators from the Perspective of Experimental and Social Science Teaching

    ERIC Educational Resources Information Center

    Perez, Roque Jimenez; Lopez, Jose Maria Cuenca; Listan, D. Mario Ferreras

    2010-01-01

    This paper describes a research project into heritage education. Taking an interdisciplinary perspective from within the field of Experimental and Social Science Education, it presents an analysis of teachers' and administrators' conceptions of heritage, its teaching and its dissemination in Spain. A statistical description is provided of the…

  1. Apollo-Soyuz pamphlet no. 9: General science. [experimental design in Astronomy, Biology, Geophysics, Aeronomy and Materials science]

    NASA Technical Reports Server (NTRS)

    Page, L. W.; From, T. P.

    1977-01-01

    The objectives and planning activities for the Apollo-Soyuz mission are summarized. Aspects of the space flight considered include the docking module and launch configurations, spacecraft orbits, and weightlessness. The 28 NASA experiments conducted onboard the spacecraft are summarized. The contributions of the mission to the fields of astronomy, geoscience, biology, and materials sciences resulting from the experiments are explored.

  2. Experimental investigations of the nonlinear dynamics of a complex space-charge configuration inside and around a grid cathode with hole

    NASA Astrophysics Data System (ADS)

    Teodorescu-Soare, C. T.; Dimitriu, D. G.; Ionita, C.; Schrittwieser, R. W.

    2016-03-01

    When a metallic grid cathode with a small hole is biased negatively beyond a critical value of the applied potential, a complex space-charge structure appears inside and around the grid cathode. The static current-voltage characteristic of the discharge shows one or two current jumps (the number of current jumps depending on the working gas pressure), one of them being of hysteretic type. Electrical probe measurements show a positive potential inside the grid cathode with respect to the potential applied on it. This is interpreted as being due to the hollow cathode effect. Thus, the inner fireball appears around the virtual anode inside the grid cathode. For more negative potentials, the electrons inside the cathode reach sufficient energy to penetrate the inner sheath near the cathode, passing through the hole and giving rise to a second fireball-like structure located outside the cathode. This second structure interacts with the negative glow of the discharge. The recorded time series of the discharge current oscillations reveal strongly nonlinear dynamics of the complex space-charge structure: by changing the negative potential applied on the grid cathode, the structure passes through different dynamic states involving chaos, quasi-periodicity, intermittency and period-doubling bifurcations, appearing as a competition between different routes to chaos.

  3. Grid-Enabled Measures

    PubMed Central

    Moser, Richard P.; Hesse, Bradford W.; Shaikh, Abdul R.; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-01-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment — a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute with two overarching goals: (1) Promote the use of standardized measures, which are tied to theoretically based constructs; and (2) Facilitate the ability to share harmonized data resulting from the use of standardized measures. This is done by creating an online venue connected to the Cancer Biomedical Informatics Grid (caBIG®) where a virtual community of researchers can collaborate together and come to consensus on measures by rating, commenting and viewing meta-data about the measures and associated constructs. This paper will describe the web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing. PMID:21521586

  4. Experimental Methods to Evaluate Science Utility Relative to the Decadal Survey

    NASA Technical Reports Server (NTRS)

    Widergren, Cynthia

    2012-01-01

    The driving factor for a competed mission is the science it plans to perform once it has reached its target body. These science goals are derived from the science recommended by the most current Decadal Survey. This work focuses on science goals in previous Venus mission proposals with respect to the 2013 Decadal Survey. By looking at how the goals compare to the survey and how much confidence NASA has in the mission's ability to accomplish these goals, a method was created to assess the science return utility of each mission. This method can be used as a tool for future Venus mission formulation and serves as a starting point for the future development of science utility assessment tools.

  5. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing both the basic and the extended features of the two technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), an infrastructure capable of managing all those problems becomes an important need. The Grid promotes and facilitates the secure interoperation of heterogeneous distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling the interoperability of OGC Web services with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between computational grid and

  6. LAPS Grid generation and adaptation

    NASA Astrophysics Data System (ADS)

    Pagliantini, Cecilia; Delzanno, Gia Luca; Guo, Zehua; Srinivasan, Bhuvana; Tang, Xianzhu; Chacon, Luis

    2011-10-01

    LAPS uses a common-data framework in which a general purpose grid generation and adaptation package in toroidal and simply connected domains is implemented. The initial focus is on implementing the Winslow/Laplace-Beltrami method for generating non-overlapping block structured grids. This is to be followed by a grid adaptation scheme based on the Monge-Kantorovich optimal transport method [Delzanno et al., J. Comput. Phys., 227 (2008), 9841-9864], which equidistributes an application-specified error. As an initial set of applications, we will lay out grids for an axisymmetric mirror, a field reversed configuration, and an entire poloidal cross section of a tokamak plasma reconstructed from a CMOD experimental shot. These grids will then be used for computing the plasma equilibrium and transport in accompanying presentations. A key issue for Monge-Kantorovich grid optimization is the choice of error or monitor function for equidistribution. We will compare the Operator Recovery Error Source Detector (ORESD) [Lapenta, Int. J. Num. Meth. Eng., 59 (2004), 2065-2087], the Tau method and a strategy based on grid coarsening [Zhang et al., AIAA J., 39 (2001), 1706-1715] to find an ``optimal'' grid. Work supported by DOE OFES.
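
    As background for the two ingredients named in this abstract, stated here only in their generic textbook form (not as the LAPS implementation), the Winslow method requires the computational coordinates to be harmonic, while Monge-Kantorovich adaptation enforces equidistribution of a monitor function \rho:

        \nabla^2 \xi = 0, \qquad \nabla^2 \eta = 0 \quad \text{(Winslow, solved in inverted form for } x(\xi,\eta),\ y(\xi,\eta))

        \rho(\vec{x}) \, \det\!\left( \frac{\partial \vec{x}}{\partial \vec{\xi}} \right) = \text{const} \quad \text{(equidistribution of the monitor function)}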

  7. An infrastructure for the integration of geoscience instruments and sensors on the Grid

    NASA Astrophysics Data System (ADS)

    Pugliese, R.; Prica, M.; Kourousias, G.; Del Linz, A.; Curri, A.

    2009-04-01

    The Grid, as a computing paradigm, has long attracted the attention of both academia and industry[1]. The distributed and expandable nature of its general architecture results in scalability and more efficient utilisation of the computing infrastructures. The scientific community, including that of the geosciences, often handles problems with very high requirements in data processing, transferring, and storing[2,3]. This has raised interest in Grid technologies, but these are often viewed solely as an access gateway to HPC. Suitable Grid infrastructures could provide the geoscience community with additional benefits such as sharing, remote access and control of scientific systems. These systems can be scientific instruments, sensors, robots, cameras and any other device used in the geosciences. A practical, general, and feasible solution for Grid-enabling such devices requires non-intrusive extensions to core parts of the current Grid architecture. We propose an extended version of an architecture[4] that can serve as the solution to the problem. The solution we propose is called the Grid Instrument Element (IE) [5]. It is an addition to the existing core Grid parts, the Computing Element (CE) and the Storage Element (SE), which serve the purposes their names suggest. The IE we refer to, and the related technologies, have been developed in the EU project on the Deployment of Remote Instrumentation Infrastructure (DORII1). In DORII, partners from various scientific communities, including those of earthquake, environmental and experimental science, have adopted the Instrument Element technology in order to integrate their devices into the Grid. The Oceanographic and coastal observation and modelling Mediterranean Ocean Observing Network (OGS2), a DORII partner, is in the process of deploying the above mentioned Grid technologies on two types of observational modules: Argo profiling floats and a novel Autonomous Underwater Vehicle (AUV

  8. Animal Science Technology. An Experimental Developmental Program. Volume II, Curriculum Course Outlines.

    ERIC Educational Resources Information Center

    Brant, Herman G.

    This volume, the second of a two part evaluation report, is devoted exclusively to the presentation of detailed course outlines representing an Animal Science Technology curriculum. Arranged in 6 terms of study (2 academic years), outlines are included on such topics as: (1) Introductory Animal Science, (2) General Microbiology, (3) Zoonoses, (4)…

  9. Understanding Scientific Methodology in the Historical and Experimental Sciences via Language Analysis

    ERIC Educational Resources Information Center

    Dodick, Jeff; Argamon, Shlomo; Chase, Paul

    2009-01-01

    A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually "do" science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields.…

  10. Correlated Curriculum Program: An Experimental Program. Science Level 1 (9A, 9B, 10A).

    ERIC Educational Resources Information Center

    Loebl, Stanley, Ed.; And Others

    The unit plans in Correlated Science 1 are intended to be of use to the teacher in both lesson and team planning. The course in science was designed for optimum correlation with the work done in business, health, and industrial careers. Behavioral objectives, class routines, time allotments, student evaluation, and the design of the manual are…

  11. Your World and Welcome To It, Science (Experimental): 5314.03.

    ERIC Educational Resources Information Center

    Kleinman, David Z.

    Presented is a beginning course in biology with emphasis on ecology for students with limited interest and few experiences in science. These students most likely will not take many more science courses. Included are the basic ecological concepts of communities, population, societies and the effects humans have on the environment. Like all other…

  12. An Experimental Examination of Quick Writing in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Benedek-Wood, Elizabeth; Mason, Linda H.; Wood, Philip H.; Hoffman, Katie E.; McGuire, Ashley

    2014-01-01

    A staggered A-B design study was used to evaluate the effects of Self- Regulated Strategy Development (SRSD) instruction for quick writing in middle school science across four classrooms. A sixth-grade science teacher delivered all students' writing assessment and SRSD instruction for informative quick writing. Results indicated that…

  13. Grid artifact reduction for direct digital radiography detectors based on rotated stationary grids with homomorphic filtering

    SciTech Connect

    Kim, Dong Sik; Lee, Sanggyun

    2013-06-15

    Purpose: Grid artifacts are caused when using the antiscatter grid in obtaining digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted especially for the direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for the grid artifact reduction based on the band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
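
    As a rough illustration of the homomorphic idea described in this abstract (and only that: the rotated-grid frequency search, min-max optimization and detector-specific details are omitted, and the function and parameter names below are hypothetical), a multiplicative grid pattern becomes additive after taking the logarithm and can then be suppressed with a band-stop (notch) filter before exponentiating back:

        import numpy as np

        def homomorphic_grid_suppression(image, notch_centers, notch_radius):
            """Suppress a multiplicative grid pattern: take log(image) so the
            pattern becomes additive, zero narrow frequency bands (notches)
            around the grid frequencies, then exponentiate back."""
            log_img = np.log(np.clip(image, 1e-6, None))        # multiplicative -> additive
            spectrum = np.fft.fftshift(np.fft.fft2(log_img))

            rows, cols = image.shape
            u = np.arange(rows) - rows // 2
            v = np.arange(cols) - cols // 2
            V, U = np.meshgrid(v, u)                            # frequency coordinates

            mask = np.ones((rows, cols))
            for (fu, fv) in notch_centers:                      # grid harmonics and their mirrors
                for (cu, cv) in ((fu, fv), (-fu, -fv)):
                    mask[(U - cu) ** 2 + (V - cv) ** 2 < notch_radius ** 2] = 0.0

            filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask)).real
            return np.exp(filtered)                             # back to the intensity domain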

  14. From Ions to Wires to the Grid: The Transformational Science of LANL Research in High-Tc Superconducting Tapes and Electric Power Applications

    ScienceCinema

    Marken, Ken [Superconductivity Technology Center, Los Alamos, New Mexico, United States

    2010-01-08

    The Department of Energy (DOE) Office of Electricity Delivery and Energy Reliability (OE) has been tasked to lead national efforts to modernize the electric grid, enhance security and reliability of the energy infrastructure, and facilitate recovery from disruptions to energy supplies. LANL has pioneered the development of coated conductors, high-temperature superconducting (HTS) tapes, which permit dramatically greater current densities than conventional copper cable, and enable new technologies to secure the national electric grid. Sustained world-class research from concept, demonstration, transfer, and ongoing industrial support has moved this idea from the laboratory to the commercial marketplace.

  15. The National Grid Project: A system overview

    NASA Technical Reports Server (NTRS)

    Gaither, Adam; Gaither, Kelly; Jean, Brian; Remotigue, Michael; Whitmire, John; Soni, Bharat; Thompson, Joe; Dannenhoffer, John; Weatherill, Nigel

    1995-01-01

    The National Grid Project (NGP) is a comprehensive numerical grid generation software system that is being developed at the National Science Foundation (NSF) Engineering Research Center (ERC) for Computational Field Simulation (CFS) at Mississippi State University (MSU). NGP is supported by a coalition of U.S. industries and federal laboratories. The objective of the NGP is to significantly decrease the amount of time it takes to generate a numerical grid for complex geometries and to increase the quality of these grids to enable computational field simulations for applications in industry. A geometric configuration can be discretized into grids (or meshes) that have two fundamental forms: structured and unstructured. Structured grids are formed by intersecting curvilinear coordinate lines and are composed of quadrilateral (2D) and hexahedral (3D) logically rectangular cells. The connectivity of a structured grid provides for trivial identification of neighboring points by incrementing coordinate indices. Unstructured grids are composed of cells of any shape (commonly triangles, quadrilaterals, tetrahedra and hexahedra), but do not have trivial identification of neighbors by incrementing an index. For unstructured grids, a set of points and an associated connectivity table is generated to define unstructured cell shapes and neighboring points. Hybrid grids are a combination of structured grids and unstructured grids. Chimera (overset) grids are intersecting or overlapping structured grids. The NGP system currently provides a user interface that integrates both 2D and 3D structured and unstructured grid generation, a solid modeling topology data management system, an internal Computer Aided Design (CAD) system based on Non-Uniform Rational B-Splines (NURBS), a journaling language, and a grid/solution visualization system.
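
    To make the connectivity contrast in this abstract concrete, here is a minimal sketch (hypothetical helper functions, not part of NGP): on a structured grid, neighbors follow from incrementing the (i, j) indices, whereas an unstructured grid needs an explicit connectivity table.

        def structured_neighbors(i, j, ni, nj):
            """Neighbors of cell (i, j) in an ni x nj logically rectangular grid:
            just increment or decrement the coordinate indices."""
            candidates = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
            return [(a, b) for a, b in candidates if 0 <= a < ni and 0 <= b < nj]

        def unstructured_neighbors(cell, connectivity):
            """connectivity maps each cell id to the list of node ids it uses;
            two 2D cells are treated as neighbors if they share an edge (two nodes)."""
            nodes = set(connectivity[cell])
            return [c for c, n in connectivity.items()
                    if c != cell and len(nodes & set(n)) >= 2]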

  16. Grids: The Top Ten Questions

    DOE PAGES Beta

    Schopf, Jennifer M.; Nitzberg, Bill

    2002-01-01

    The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science point of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.

  17. Experimental setup and the system performance for single-grid-based phase-contrast x-ray imaging (PCXI) with a microfocus x-ray tube

    NASA Astrophysics Data System (ADS)

    Lim, Hyunwoo; Park, Yeonok; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Park, Chulkyu; Woo, Taeho; Lee, Minsik; Kim, Jinsoo; Chung, Nagkun; Kim, Jinwon; Kim, Jinguk

    2015-08-01

    In this work, we investigated a simplified approach to phase-contrast x-ray imaging (PCXI) using a single antiscatter grid and a microfocus x-ray tube, which has the potential to open the way to more widespread use of PCXI in related application areas. We established a table-top setup for PCXI studies of biological and non-biological samples and investigated the system performance. The PCXI system consists of a focused-linear grid having a strip density of 200 lines/in. (JPI Healthcare Corp.), a microfocus x-ray tube having a focal spot size of about 5 μm (Hamamatsu, L7910), and a high-resolution CMOS imaging detector having a pixel size of 48 μm (Rad-icon Imaging Corp., Shad-o-Box 2048). By using our prototype system, we successfully obtained attenuation, scattering, and differential phase-contrast x-ray images of improved visibility from the raw images of several samples at x-ray tube conditions of 50 kVp and 6 mAs. Our initial results indicate that the single-grid-based approach seems a useful method for PCXI with great simplicity and minimal requirements on the setup alignment.

  18. Arguing for Experimental "Facts" in Science: A Study of Research Article Results Sections in Biochemistry.

    ERIC Educational Resources Information Center

    Thompson, Dorothea K.

    1993-01-01

    Claims that the contextual nature of "results" sections in scientific articles remains largely unexplored. Examines scientific publications by biochemists. Identifies six rhetorical moves common to such articles. Demonstrates the rhetorical nature of science writing. (HB)

  19. Experimental stations as a tool to teach soil science at the University of Valencia

    NASA Astrophysics Data System (ADS)

    Cerdà, Artemi

    2010-05-01

    This paper presents the strategies used at the University of Valencia (Department of Geography, Soil Erosion and Degradation Research Group) to teach soil science in the Geography and Environmental Science degrees. The use of the Montesa and El Teularet research stations gives the students a better knowledge of soil science, as they can see the measurements being carried out in the field. Students visit the stations and contribute to measurements and sampling every season. The use of meteorological stations, erosion plots, soil moisture and soil temperature probes, and sampling gives the students the chance to understand the theoretical approach they are usually taught. This presentation will show how the students evolve and how their knowledge of soil science improves.

  20. Pedagogical experimentations about participating science, in a european class, in France.

    NASA Astrophysics Data System (ADS)

    Burgio, Marion

    2015-04-01

    In France, a European class is a class in which a subject is taught in a foreign language, for example science in English. In my European class, during a seven-week session, I led group work activities on different participatory science actions. Groups were composed of three or four 16-year-old students. Each group chose one type of participatory science activity, among them: leading a videoconference with an IODP mission on board the Joides Resolution, or being part of a "science songs community" with Tom Mc Fadden. They divided the work: some of them studied the websites and contacted the actors to present the pedagogical or scientific background of their subject, while others worked on a concrete production such as the organization of a videoconference with the Joides Resolution or the creation of a pedagogical song about geology. I will present some results of their work and explain the students' motivation linked to this active learning method.

  1. An Experimental Science Program with the Open Classroom Approach Based on the Philadelphia Primary Science Guide. Part I, Primary Science Unit and Part II, Primary Ecology Unit.

    ERIC Educational Resources Information Center

    Quinn, Jeanette; Carty, Elaine

    Reported is a project designed to correlate six units of study from the Philadelphia Elementary Science Guide and to incorporate them in such a way as to reduce the suggested 32 weeks of teaching time, for the individual units, to 10 weeks for all 6 units. This was necessitated by an interruption of the school year by a teachers' strike. Three…

  2. "They Sweat for Science": The Harvard Fatigue Laboratory and Self-Experimentation in American Exercise Physiology.

    PubMed

    Johnson, Andi

    2015-08-01

    In many scientific fields, the practice of self-experimentation waned over the course of the twentieth century. For exercise physiologists working today, however, the practice of self-experimentation is alive and well. This paper considers the role of the Harvard Fatigue Laboratory and its scientific director, D. Bruce Dill, in legitimizing the practice of self-experimentation in exercise physiology. Descriptions of self-experimentation are drawn from papers published by members of the Harvard Fatigue Lab. Attention is paid to the ethical and practical justifications for self-experimentation in both the lab and the field. Born out of the practical, immediate demands of fatigue protocols, self-experimentation performed the long-term, epistemological function of uniting physiological data across time and space, enabling researchers to contribute to a general human biology program. PMID:25139499

  3. GridMan: A grid manipulation system

    NASA Technical Reports Server (NTRS)

    Eiseman, Peter R.; Wang, Zhu

    1992-01-01

    GridMan is an interactive grid manipulation system. It operates on grids to produce new grids which conform to user demands. The input grids are not constrained to come from any particular source. They may be generated by algebraic methods, elliptic methods, hyperbolic methods, parabolic methods, or some combination of methods. The methods are included in the various available structured grid generation codes. These codes perform the basic assembly function for the various elements of the initial grid. For block structured grids, the assembly can be quite complex due to a large number of block corners, edges, and faces for which various connections and orientations must be properly identified. The grid generation codes are distinguished among themselves by their balance between interactive and automatic actions and by their modest variations in control. The basic form of GridMan provides a much more substantial level of grid control and will take its input from any of the structured grid generation codes. The communication link to the outside codes is a data file which contains the grid or section of grid.

  4. Science.

    ERIC Educational Resources Information Center

    Roach, Linda E., Ed.

    This document contains the following papers on science instruction and technology: "A 3-D Journey in Space: A New Visual Cognitive Adventure" (Yoav Yair, Rachel Mintz, and Shai Litvak); "Using Collaborative Inquiry and Interactive Technologies in an Environmental Science Project for Middle School Teachers: A Description and Analysis" (Patricia…

  5. A Case-Based Approach Improves Science Students' Experimental Variable Identification Skills

    ERIC Educational Resources Information Center

    Grunwald, Sandra; Hartman, Andrew

    2010-01-01

    Incorporation of experimental case studies into the laboratory curriculum increases students' abilities to identify experimental variables that affect the outcome of an experiment. Here the authors describe how such case studies were incorporated using an online course management system into a biochemistry laboratory curriculum and the assessment…

  6. Application of the Shockley-Ramo theorem on the grid inefficiency of Frisch grid ionization chambers

    NASA Astrophysics Data System (ADS)

    Göök, A.; Hambsch, F.-J.; Oberstedt, A.; Oberstedt, S.

    2012-02-01

    The concept of grid inefficiency in Frisch grid ionization chambers and its influence on the anode pulse shape is explained in terms of the Shockley-Ramo theorem for induced charges. The grid inefficiency correction is deduced from numerically calculated weighting potentials. A method to determine the correction factor experimentally is also presented. Experimental and calculated values of the correction factor are shown to be in good agreement.
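
    For context, the theorem invoked above can be stated compactly (the standard textbook form, not the paper's derivation): the charge induced on an electrode by a carrier of charge q moving between two points depends only on that electrode's weighting potential \varphi_w,

        \Delta Q = -q \,\bigl[ \varphi_w(\vec{r}_2) - \varphi_w(\vec{r}_1) \bigr].

    For an ideal Frisch grid the anode weighting potential would vanish everywhere between cathode and grid; the grid inefficiency is the residual non-zero value of \varphi_w in that region, which is why it can be read off numerically calculated weighting potentials.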

  7. Qualitative Quantitative and Experimental Concept Possession, Criteria for Identifying Conceptual Change in Science Education

    ERIC Educational Resources Information Center

    Lappi, Otto

    2013-01-01

    Students sometimes misunderstand or misinterpret scientific content because of persistent misconceptions that need to be overcome by science education--a learning process typically called conceptual change. The acquisition of scientific content matter thus requires a transformation of the initial knowledge-state of a common-sense picture of the…

  8. Mathematics Through Science, Part III: An Experimental Approach to Functions. Teacher's Commentary. Revised Edition.

    ERIC Educational Resources Information Center

    Bolduc, Elroy J., Jr.; And Others

    The purpose of this project is to teach learning and understanding of mathematics at the ninth grade level through the use of science experiments. This part of the program contains significant amounts of material normally found in a beginning algebra class. The material should be found useful for classes in general mathematics as a preparation for…

  9. Getting "What Works" Working: Building Blocks for the Integration of Experimental and Improvement Science

    ERIC Educational Resources Information Center

    Peterson, Amelia

    2016-01-01

    As a systemic approach to improving educational practice through research, "What Works" has come under repeated challenge from alternative approaches, most recently that of improvement science. While "What Works" remains a dominant paradigm for centralized knowledge-building efforts, there is need to understand why this…

  10. A Guide to Establishing a Science/Mathematics Research Program in High School. Experimental.

    ERIC Educational Resources Information Center

    Goodman, Harvey; And Others

    This guide has been designed to help teachers, supervisors, and administrators set up a science or mathematics research program which should provide students with a set of basic "tools" for use in problem solving situations. The guide is organized into 17 chapters. The first 15 chapters focus on: organizing a research program; recruiting students;…

  11. Science, suffrage, and experimentation: Mary Putnam Jacobi and the controversy over vivisection in late nineteenth-century America.

    PubMed

    Bittel, Carla Jean

    2005-01-01

    This article examines the medical activism of the New York physician Mary Putnam Jacobi (1842-1906), to illustrate the problems of gender and science at the center of the vivisection debate in late nineteenth-century America. In the post-Civil War era, individuals both inside and outside the medical community considered vivisection to be a controversial practice. Physicians divided over the value of live animal experimentation, while reformers and activists campaigned against it. Jacobi stepped into the center of the controversy and tried to use her public defense of experimentation to the advantage of women in the medical profession. Her advocacy of vivisection was part of her broader effort to reform medical education, especially at women's institutions. It was also a political strategy aimed at associating women with scientific practices to advance a women's rights agenda. Her work demonstrates how debates over women in medicine and science in medicine, suffrage, and experimentation overlapped at a critical moment of historical transition. PMID:16327083

  12. Critical Need for Family-Based, Quasi-Experimental Designs in Integrating Genetic and Social Science Research

    PubMed Central

    Lahey, Benjamin B.; Turkheimer, Eric; Lichtenstein, Paul

    2013-01-01

    Researchers have identified environmental risks that predict subsequent psychological and medical problems. Based on these correlational findings, researchers have developed and tested complex developmental models and have examined biological moderating factors (e.g., gene–environment interactions). In this context, we stress the critical need for researchers to use family-based, quasi-experimental designs when trying to integrate genetic and social science research involving environmental variables because these designs rigorously examine causal inferences by testing competing hypotheses. We argue that sibling comparison, offspring of twins or siblings, in vitro fertilization designs, and other genetically informed approaches play a unique role in bridging gaps between basic biological and social science research. We use studies on maternal smoking during pregnancy to exemplify these principles. PMID:23927516

  13. Critical need for family-based, quasi-experimental designs in integrating genetic and social science research.

    PubMed

    D'Onofrio, Brian M; Lahey, Benjamin B; Turkheimer, Eric; Lichtenstein, Paul

    2013-10-01

    Researchers have identified environmental risks that predict subsequent psychological and medical problems. Based on these correlational findings, researchers have developed and tested complex developmental models and have examined biological moderating factors (e.g., gene-environment interactions). In this context, we stress the critical need for researchers to use family-based, quasi-experimental designs when trying to integrate genetic and social science research involving environmental variables because these designs rigorously examine causal inferences by testing competing hypotheses. We argue that sibling comparison, offspring of twins or siblings, in vitro fertilization designs, and other genetically informed approaches play a unique role in bridging gaps between basic biological and social science research. We use studies on maternal smoking during pregnancy to exemplify these principles. PMID:23927516

  14. Unstructured grids for sonic-boom analysis

    NASA Technical Reports Server (NTRS)

    Fouladi, Kamran

    1993-01-01

    A fast and efficient unstructured grid scheme is evaluated for sonic-boom applications. The scheme is used to predict the near-field pressure signatures of a body of revolution at several body lengths below the configuration, and those results are compared with experimental data. The introduction of the 'sonic-boom grid topology' to this scheme makes it well suited for sonic-boom applications, thus providing an alternative to conventional multiblock structured grid schemes.

  15. Nurbs and grid generation

    SciTech Connect

    Barnhill, R.E.; Farin, G.; Hamann, B.

    1995-12-31

    This paper provides a basic overview of NURBS and their application to numerical grid generation. Curve/surface smoothing, accelerated grid generation, and the use of NURBS in a practical grid generation system are discussed.
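
    As background for this overview, a NURBS curve of degree p has the standard rational form (a textbook definition, independent of this paper):

        C(u) = \frac{\sum_{i=0}^{n} N_{i,p}(u)\, w_i\, \mathbf{P}_i}{\sum_{i=0}^{n} N_{i,p}(u)\, w_i},

    where the \mathbf{P}_i are control points, the w_i their weights, and the N_{i,p} are B-spline basis functions defined on a knot vector; the surfaces used in grid generation are the tensor-product analogue.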

  16. Modeling of the charge-state separation at ITEP experimental facility for material science based on a Bernas ion source

    NASA Astrophysics Data System (ADS)

    Barminova, H. Y.; Saratovskyh, M. S.

    2016-02-01

    An experiment automation system is to be developed for the experimental facility for materials science at ITEP, which is based on a Bernas ion source. The program CAMFT is planned to be incorporated into the experiment automation program. CAMFT is developed to simulate intense charged-particle bunch motion in external magnetic fields of arbitrary geometry by means of an accurate solution of the particle equations of motion. The program allows bunch intensities of up to 10^10 ppb to be considered. Preliminary calculations are performed on the ITEP supercomputer. The results of the simulation of the beam pre-acceleration and the subsequent turn in the magnetic field are presented for different initial conditions.

  17. Data Grid Management Systems

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.; Jagatheesan, Arun; Rajasekar, Arcot; Wan, Michael; Schroeder, Wayne

    2004-01-01

    The "Grid" is an emerging infrastructure for coordinating access across autonomous organizations to distributed, heterogeneous computation and data resources. Data grids are being built around the world as the next generation data handling systems for sharing, publishing, and preserving data residing on storage systems located in multiple administrative domains. A data grid provides logical namespaces for users, digital entities and storage resources to create persistent identifiers for controlling access, enabling discovery, and managing wide area latencies. This paper introduces data grids and describes data grid use cases. The relevance of data grids to digital libraries and persistent archives is demonstrated, and research issues in data grids and grid dataflow management systems are discussed.

  18. Experimental and credentialing capital: an adaptable framework for facilitating science outreach for underrepresented youth.

    PubMed

    Drazan, John F; D'Amato, Anthony R; Winkelman, Max A; Littlejohn, Aaron J; Johnson, Christopher; Ledet, Eric H; Eglash, Ron

    2015-08-01

    Increasing the numbers of black, latino and native youth in STEM careers is both an important way to reduce poverty in low income communities, and a contribution to the diversity of thought and experience that drives STEM research. But underrepresented youth are often alienated from STEM. Two new forms of social capital have been identified that can be combined to create a learning environment in which students and researchers can meet and explore an area of shared interest. Experimental capital refers to the intrinsic motivation that students can develop when they learn inquiry techniques for exploring topics that they feel ownership over. Credentialing capital denotes a shared interest and ability between all parties engaged in the experimental endeavor. These two forms of social capital form an adaptable framework for researchers to use to create effective outreach programs. In this case study sports biomechanics was utilized as the area of shared interest and understanding the slam dunk was used as experimental capital. PMID:26737094

  19. Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses

    NASA Astrophysics Data System (ADS)

    Wieman, Carl

    2015-09-01

    Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and unsatisfying, and "cookbook." Here, both to help understand the reason for such discrepant views and to aid in the design of instructional lab courses, I compare the mental tasks or types of thinking ("cognitive task analysis") associated with a physicist doing tabletop experimental research with the cognitive tasks of students in an introductory physics instructional lab involving traditional verification/confirmation exercises.

  20. Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses

    ERIC Educational Resources Information Center

    Wieman, Carl

    2015-01-01

    Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and…

  1. An Experimental Project Approach to Biology: Mastering the Interdisciplinary Skills at the Core of Science.

    ERIC Educational Resources Information Center

    Petersen, Chris E.

    2000-01-01

    Examines the educational benefits of an experimental project approach to students taking the last course of an introductory biology sequence. Educational benefits were defined in terms of analytical skills, knowledge of basic statistics, and experience with scientific writing. The study was viewed as a preliminary investigation, but one that…

  2. Virtual and Physical Experimentation in Inquiry-Based Science Labs: Attitudes, Performance and Access

    ERIC Educational Resources Information Center

    Pyatt, Kevin; Sims, Rod

    2012-01-01

    This study investigated the learning dimensions that occur in physical and virtual inquiry-based lab investigations, in first-year secondary chemistry classes. This study took place over a 2 year period and utilized an experimental crossover design which consisted of two separate trials of laboratory investigation. Assessment data and attitudinal…

  3. A New Elliptical Grid Clustering Method

    NASA Astrophysics Data System (ADS)

    Guansheng, Zheng

    A new grid-based clustering method is presented in this paper. The new method first performs unsupervised learning on high-dimensional data. The paper proposes a grid-based approach to clustering: it maps the data onto a multi-dimensional space, applies a linear transformation to the feature space instead of to the objects themselves, and then applies a grid-clustering method. Unlike conventional methods, it uses a multidimensional hyper-ellipse grid cell. Some case studies and ideas on how to use the algorithm are described. The experimental results show that EGC can discover irregularly shaped clusters.
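
    Since the abstract is terse, the sketch below shows only the generic grid-based clustering idea it builds on (rectangular cells, a density threshold and flood-fill merging); it is not the paper's elliptical-cell EGC algorithm, and every name and parameter is illustrative.

        import numpy as np
        from collections import defaultdict

        def grid_cluster(points, cell_size, min_pts=3):
            """Bin points into grid cells, keep cells with at least min_pts
            points, and merge face-adjacent dense cells into clusters."""
            cells = defaultdict(list)
            for idx, p in enumerate(points):
                cells[tuple((np.asarray(p) // cell_size).astype(int))].append(idx)

            dense = {c for c, members in cells.items() if len(members) >= min_pts}
            labels, cluster_id = {}, 0
            for seed in dense:
                if seed in labels:
                    continue
                stack = [seed]                    # flood fill over adjacent dense cells
                while stack:
                    c = stack.pop()
                    if c in labels:
                        continue
                    labels[c] = cluster_id
                    for dim in range(len(c)):
                        for step in (-1, 1):
                            nb = list(c)
                            nb[dim] += step
                            if tuple(nb) in dense:
                                stack.append(tuple(nb))
                cluster_id += 1
            # Points in sparse cells are treated as noise and labelled -1.
            return [labels.get(tuple((np.asarray(p) // cell_size).astype(int)), -1)
                    for p in points]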

  4. Spatial services grid

    NASA Astrophysics Data System (ADS)

    Cao, Jian; Li, Qi; Cheng, Jicheng

    2005-10-01

    This paper discusses the concept, key technologies and main applications of the Spatial Services Grid. Grid computing and Web service technologies are playing a revolutionary role in the study of spatial information services. The concept of the SSG (Spatial Services Grid) is put forward based on the SIG (Spatial Information Grid) and OGSA (Open Grid Services Architecture). Firstly, grid computing is reviewed, together with the key technologies of the SIG and their main applications. Secondly, grid computing and three kinds of SIG in the broad sense--the SDG (spatial data grid), the SIG (spatial information grid) and the SSG (spatial services grid)--and their relationships are presented. Thirdly, the key technologies of the SSG (spatial services grid) are put forward. Finally, three representative applications of the SSG (spatial services grid) are discussed. The first application is an urban location-based services grid, which is a typical spatial services grid and can be constructed on OGSA (Open Grid Services Architecture) and a digital city platform. The second application is a regional sustainable development grid, which is key to urban development. The third application is a regional disaster and emergency management services grid.

  5. Experimental education of Astronomy across the seedbeds of investigation in sciences

    NASA Astrophysics Data System (ADS)

    Taborda, E.

    2009-05-01

    In Colombia, the geographic situation helps us when carrying out academic work in astronomical observation, because almost the whole night sky of both the northern and southern hemispheres can be seen in a single night. This makes our work as educators easier and allows astronomy and the related sciences to help students learn and socialize in fundamental areas such as mathematics, physics, chemistry, biology, art, technology, geography and history, among others. Our presentation will show the results of three years of this descriptive research carried out with primary and high school students. We need financial support to assist with this event.

  6. Large-Scale Experimental Planetary Science Meets Planetary Defense: Deorbiting an Asteroidal Satellite

    NASA Technical Reports Server (NTRS)

    Cintala, M. J.; Durda, D. D.; Housen, K. R.

    2005-01-01

    Other than remote-sensing and spacecraft-derived data, the only information that exists regarding the physical and chemical properties of asteroids is that inferred through calculations, numerical simulations, extrapolation of experiments, and meteorite studies. Our understanding of the dynamics of accretion of planetesimals, collisional disruption of asteroids, and the macroscopic, shock-induced modification of the surfaces of such small objects is also, for the most part, founded on similar inferences. While considerable strides have been made in improving the state of asteroid science, too many unknowns remain to assert that we understand the parameters necessary for the more practical problem of deflecting an asteroid or asteroid pair on an Earth-intersecting trajectory. Many of these deficiencies could be reduced or eliminated by intentionally deorbiting an asteroidal satellite and monitoring the resulting collision between it and the primary asteroid, a capability that is well within the limitations of current technology.

  7. Research exemption/experimental use in the European Union: patents do not block the progress of science.

    PubMed

    Jaenichen, Hans-Rainer; Pitz, Johann

    2015-02-01

    In the public debate about patents, specifically in the area of biotechnology, the position has been taken that patents block the progress of science. As we demonstrate in this review, this is not the case in the European Union (EU). The national patent acts of the EU member states define research and experimental use exemptions from patent infringement that allow sufficient room for research activities to promote innovation. This review provides a comparative overview of the legal requirements and the extent and limitations of experimental use exemptions, including the so-called Bolar provision, in Germany, the United Kingdom, France, Spain, Italy, and The Netherlands. The legal framework in the respective countries is illustrated with reference to practical examples concerning tests on patent-protected genetic targets and antibodies. Specific questions concerning the use of patent-protected research tools, the outsourcing of research activities, and the use of preparatory and supplying acts for experimental purposes that are necessary for conducting experiments are covered. PMID:25377145

  8. Virtual and Physical Experimentation in Inquiry-Based Science Labs: Attitudes, Performance and Access

    NASA Astrophysics Data System (ADS)

    Pyatt, Kevin; Sims, Rod

    2012-02-01

    This study investigated the learning dimensions that occur in physical and virtual inquiry-based lab investigations, in first-year secondary chemistry classes. This study took place over a 2 year period and utilized an experimental crossover design which consisted of two separate trials of laboratory investigation. Assessment data and attitudinal data were gathered and analyzed to measure the instructional value of physical and virtual lab experiences in terms of student performance and attitudes. Test statistics were conducted for differences of means for assessment data. Student attitudes towards virtual experiences in comparison to physical lab experiences were measured using a newly created Virtual and Physical Experimentation Questionnaire (VPEQ). VPEQ was specifically developed for this study, and included new scales of Usefulness of Lab, and Equipment Usability which measured attitudinal dimensions in virtual and physical lab experiences. A factor analysis was conducted for questionnaire data, and reliability of the scales and internal consistency of items within scales were calculated. The new scales were statistically valid and reliable. The instructional value of physical and virtual lab experiences was comparable in terms of student performance. Students showed preference towards the virtual medium in their lab experiences. Students showed positive attitudes towards physical and virtual experiences, and demonstrated a preference towards inquiry-based experiences, physical or virtual. Students found virtual experiences to have higher equipment usability as well as a higher degree of open-endedness. In regards to student access to inquiry-based lab experiences, virtual and online alternatives were viewed favorably by students.

  9. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    SciTech Connect

    Box, Dennis

    2014-01-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGS) is well supported and made easier through the use of input flags and parameters.
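
    The DAG chaining mentioned above relies on standard HTCondor DAGMan description files (JOB and PARENT ... CHILD statements). The tiny generator below is only an illustration with made-up file names; it is not the jobsub command-line interface itself.

        def write_dag(path, jobs, deps):
            """Write a minimal HTCondor DAGMan file. `jobs` maps a job name to its
            submit file; `deps` is a list of (parent, child) name pairs."""
            lines = [f"JOB {name} {submit_file}" for name, submit_file in jobs.items()]
            lines += [f"PARENT {parent} CHILD {child}" for parent, child in deps]
            with open(path, "w") as fh:
                fh.write("\n".join(lines) + "\n")

        # Hypothetical three-stage chain: stage-in -> analysis -> stage-out.
        write_dag("workflow.dag",
                  {"STAGEIN": "stagein.sub", "ANALYZE": "analyze.sub", "STAGEOUT": "stageout.sub"},
                  [("STAGEIN", "ANALYZE"), ("ANALYZE", "STAGEOUT")])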

  10. Grid-free compressive beamforming.

    PubMed

    Xenaki, Angeliki; Gerstoft, Peter

    2015-04-01

    The direction-of-arrival (DOA) estimation problem involves the localization of a few sources from a limited number of observations on an array of sensors, thus it can be formulated as a sparse signal reconstruction problem and solved efficiently with compressive sensing (CS) to achieve high-resolution imaging. On a discrete angular grid, the CS reconstruction degrades due to basis mismatch when the DOAs do not coincide with the angular directions on the grid. To overcome this limitation, a continuous formulation of the DOA problem is employed and an optimization procedure is introduced, which promotes sparsity on a continuous optimization variable. The DOA estimation problem with infinitely many unknowns, i.e., source locations and amplitudes, is solved over a few optimization variables with semidefinite programming. The grid-free CS reconstruction provides high-resolution imaging even with non-uniform arrays, single-snapshot data and under noisy conditions as demonstrated on experimental towed array data. PMID:25920844
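
    For orientation, the conventional on-grid formulation that suffers from the basis mismatch described above can be written generically (symbols are not taken from the paper): with array data y and a steering matrix A whose columns correspond to the discrete angular grid,

        \hat{x} = \arg\min_{x} \; \|x\|_1 \quad \text{subject to} \quad \|Ax - y\|_2 \le \epsilon ,

    whereas the grid-free approach optimizes over a continuous angular variable, leading to the semidefinite program mentioned in the abstract.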

  11. Political science. Reverse-engineering censorship in China: randomized experimentation and participant observation.

    PubMed

    King, Gary; Pan, Jennifer; Roberts, Margaret E

    2014-08-22

    Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored. PMID:25146296

  12. Experimental studies in fluid mechanics and materials science using acoustic levitation

    NASA Technical Reports Server (NTRS)

    Trinh, E. H.; Robey, J.; Arce, A.; Gaspar, M.

    1987-01-01

    Ground-based and short-duration low gravity experiments have been carried out with the use of ultrasonic levitators to study the dynamics of freely suspended liquid drops under the influence of predominantly capillary and acoustic radiation forces. Some of the effects of the levitating field on the shape as well as the fluid flow fields within the drop have been determined. The development and refinement of measurement techniques using levitated drops with size on the order of 2 mm in diameter have yielded methods having direct application to experiments in microgravity. In addition, containerless melting, undercooling, and freezing of organic materials as well as low melting metals have provided experimental data and observations on the application of acoustic positioning techniques to materials studies.

  13. Parallel grid population

    DOEpatents

    Wald, Ingo; Ize, Santiago

    2015-07-28

    Parallel population of a grid with a plurality of objects using a plurality of processors. One example embodiment is a method for parallel population of a grid with a plurality of objects using a plurality of processors. The method includes a first act of dividing a grid into n distinct grid portions, where n is the number of processors available for populating the grid. The method also includes acts of dividing a plurality of objects into n distinct sets of objects, assigning a distinct set of objects to each processor such that each processor determines by which distinct grid portion(s) each object in its distinct set of objects is at least partially bounded, and assigning a distinct grid portion to each processor such that each processor populates its distinct grid portion with any objects that were previously determined to be at least partially bounded by its distinct grid portion.
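
    A serial stand-in for the two-phase scheme described in this abstract (each loop iteration plays the role of one processor; the bounds/overlaps/insert interfaces are hypothetical):

        def populate_grid_parallel(objects, grid_portions):
            """Two-phase population: (1) each processor takes a distinct set of
            objects and records which grid portion(s) at least partially bound
            each object; (2) each processor populates its own grid portion."""
            n = len(grid_portions)                      # one portion per processor
            object_sets = [objects[i::n] for i in range(n)]
            bound_by = {p: [] for p in range(n)}

            for proc in range(n):                       # phase 1
                for obj in object_sets[proc]:
                    for p, portion in enumerate(grid_portions):
                        if portion.overlaps(obj.bounds):
                            bound_by[p].append(obj)

            for proc in range(n):                       # phase 2
                for obj in bound_by[proc]:
                    grid_portions[proc].insert(obj)
            return grid_portions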

  14. Corrosion chemistry closing comments: opportunities in corrosion science facilitated by operando experimental characterization combined with multi-scale computational modelling.

    PubMed

    Scully, John R

    2015-01-01

    Recent advances in characterization tools, computational capabilities, and theories have created opportunities for advancement in understanding of solid-fluid interfaces at the nanoscale in corroding metallic systems. The Faraday Discussion on Corrosion Chemistry in 2015 highlighted some of the current needs, gaps and opportunities in corrosion science. Themes were organized into several hierarchical categories that provide an organizational framework for corrosion. Opportunities to develop fundamental physical and chemical data which will enable further progress in thermodynamic and kinetic modelling of corrosion were discussed. These will enable new and better understanding of unit processes that govern corrosion at the nanoscale. Additional topics discussed included scales, films and oxides, fluid-surface and molecular-surface interactions, selected topics in corrosion science and engineering as well as corrosion control. Corrosion science and engineering topics included complex alloy dissolution, local corrosion, and modelling of specific corrosion processes that are made up of collections of temporally and spatially varying unit processes such as oxidation, ion transport, and competitive adsorption. Corrosion control and mitigation topics covered some new insights on coatings and inhibitors. Further advances in operando or in situ experimental characterization strategies at the nanoscale combined with computational modelling will enhance progress in the field, especially if coupling across length and time scales can be achieved incorporating the various phenomena encountered in corrosion. Readers are encouraged not only to use this ad hoc organizational scheme to guide their immersion into the current opportunities in corrosion chemistry, but also to find value in the information presented in their own ways. PMID:26114392

  15. Distributed data mining on grids: services, tools, and applications.

    PubMed

    Cannataro, Mario; Congiusta, Antonio; Pugliese, Andrea; Talia, Domenico; Trunfio, Paolo

    2004-12-01

    Data mining algorithms are widely used today for the analysis of large corporate and scientific datasets stored in databases and data archives. Industry, science, and commerce fields often need to analyze very large datasets maintained over geographically distributed sites by using the computational power of distributed and parallel systems. The grid can play a significant role in providing an effective computational support for distributed knowledge discovery applications. For the development of data mining applications on grids we designed a system called Knowledge Grid. This paper describes the Knowledge Grid framework and presents the toolset provided by the Knowledge Grid for implementing distributed knowledge discovery. The paper discusses how to design and implement data mining applications by using the Knowledge Grid tools starting from searching grid resources, composing software and data components, and executing the resulting data mining process on a grid. Some performance results are also discussed. PMID:15619945

  16. A Moving Grid Capability for NPARC

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    1998-01-01

    Version 3.1 of the NPARC computational fluid dynamics flow solver introduces a capability to solve unsteady flow on moving multi-block, structured grids with nominally second-order time accuracy. The grid motion is due to segments of the boundary grid that translate and rotate in a rigid-body manner or deform. The grid is regenerated at each time step to accommodate the boundary grid motion. The flow equations and computational models sense the moving grid through the grid velocities, which are computed from a time-difference of the grids at two consecutive time levels. For three-dimensional flow domains, it is assumed that the grid retains a planar character with respect to one coordinate. The application and accuracy of NPARC v3.1 is demonstrated for flow about a flying wedge, a rotating flap, a collapsing bump in a duct, and the unstart/restart flow in a variable-geometry inlet. The results compare well with analytic and experimental results.
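
    The grid-velocity calculation mentioned above amounts to differencing the node coordinates at two consecutive time levels. The sketch below uses a simple first-order difference, which is an assumption about the exact formula; it only illustrates the idea, not NPARC's implementation.

        # Grid-point velocities from two consecutive grid levels by a simple time
        # difference (illustrative only; NPARC v3.1 may use a different formula).
        import numpy as np

        def grid_velocity(x_old, x_new, dt):
            """x_old, x_new: node coordinates at time levels n and n+1."""
            return (x_new - x_old) / dt

        # Example: a row of boundary nodes translating by 0.01 length units per step.
        x_n = np.linspace(0.0, 1.0, 11)
        x_np1 = x_n + 0.01
        print(grid_velocity(x_n, x_np1, dt=0.001))   # uniform grid velocity of 10.0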

  17. Integration of Science and Language: Creation and Testing of a Pedagogical Model to Improve Science Learning in a Francophone Minority Setting

    NASA Astrophysics Data System (ADS)

    Cormier, Marianne

    The weak science results of students in francophone minority communities on national and international assessments prompted a search for solutions. The purpose of this thesis was to create and test a pedagogical model for science teaching in a minority-language setting. Because students in this setting show varying degrees of French-language proficiency, several language elements (writing, discussion and reading) were integrated into the science learning. We recommended beginning the learning process with rather informal language elements (journal writing, discussions in dyads...) and progressing toward more formal language activities (writing scientific reports or explanations). With respect to science learning, the model advocated a socio-constructivist approach to conceptual change while relying strongly on experiential learning. When testing the model, we wanted to know whether it produced conceptual change in the students and whether, at the same time, their scientific vocabulary was enriched. We also sought to understand how the students experienced their learning within this pedagogical model. A fifth-grade class at the Grande-Digue school, in southeastern New Brunswick, took part in the trial of the model by studying the local salt marshes. In initial interviews, we noticed that the students' knowledge of salt marshes was limited. Although they were aware that the marshes were natural places, they could not necessarily describe them with precision. We also found that the students mostly used common words (plants, birds, insects) to describe the marsh. The results obtained indicate that the students

  18. Toilets and the Smart Grid: A role for history and art in communicating assessed science for Earth—The Operators' Manual

    NASA Astrophysics Data System (ADS)

    Alley, R. B.; Haines-Stiles, G.; Akuginow, E.

    2010-12-01

    Assessed science consistently shows that an economically efficient response to global warming would begin now, with the likelihood of side benefits including increased employment, security, and environmental quality. This result has been obtained consistently for many years, yet societal responses over this time have fallen well short of the economically efficient path, suggesting that society is being strongly influenced by additional considerations. First-hand experience indicates that many people, including many policy-makers, “know” global-warming “science” that did not come from the scientific assessment bodies or their participating scientists. Instead, this supposedly supporting science was provided by opponents of actions to deal with global warming, and was designed to be inaccurate and easily defeated (e.g., “All of global warming theory rests on the correlation between CO2 and temperature”, or “…rests on the hockey stick.”) A useful discussion of possible wise responses to the problem is difficult when so much that many people “know” just isn’t so. The inaccurate information has been presented very effectively, but we believe that accurate information can be presented even more effectively, honestly showing the costs and benefits of efficient response while explicitly addressing the widespread misconceptions. The history of previous environmental issues offers one path forward, with denial preceding solutions in such diverse cases as the San Francisco earthquake and toilets in Edinburgh. We will provide first-hand reports from preparation of an NSF Informal Science Education-funded project, Earth—The Operators’ Manual.

  19. DOE SciDAC’s Earth System Grid Center for Enabling Technologies Final Report for University of Southern California Information Sciences Institute

    SciTech Connect

    Chervenak, Ann Louise

    2013-12-19

    The mission of the Earth System Grid Federation (ESGF) is to provide the worldwide climate-research community with access to the data, information, model codes, analysis tools, and intercomparison capabilities required to make sense of enormous climate data sets. Its specific goals are to (1) provide an easy-to-use and secure web-based data access environment for data sets; (2) add value to individual data sets by presenting them in the context of other data sets and tools for comparative analysis; (3) address the specific requirements of participating organizations with respect to bandwidth, access restrictions, and replication; (4) ensure that the data are readily accessible through the analysis and visualization tools used by the climate research community; and (5) transfer infrastructure advances to other domain areas. For the ESGF, the U.S. Department of Energy’s (DOE’s) Earth System Grid Center for Enabling Technologies (ESG-CET) team has led international development and delivered a production environment for managing and accessing ultra-scale climate data. This production environment includes multiple national and international climate projects (such as the Community Earth System Model and the Coupled Model Intercomparison Project), ocean model data (such as the Parallel Ocean Program), observation data (Atmospheric Radiation Measurement Best Estimate, Carbon Dioxide Information and Analysis Center, Atmospheric Infrared Sounder, etc.), and analysis and visualization tools, all serving a diverse user community. These data holdings and services are distributed across multiple ESG-CET sites (such as ANL, LANL, LBNL/NERSC, LLNL/PCMDI, NCAR, and ORNL) and at unfunded partner sites, such as the Australian National University National Computational Infrastructure, the British Atmospheric Data Centre, the National Oceanic and Atmospheric Administration Geophysical Fluid Dynamics Laboratory, the Max Planck Institute for Meteorology, the German Climate Computing

  20. Method of grid generation

    DOEpatents

    Barnette, Daniel W.

    2002-01-01

    The present invention provides a method of grid generation that uses the geometry of the problem space and the governing relations to generate a grid. The method can generate a grid with minimized discretization errors, and with minimal user interaction. The method of the present invention comprises assigning grid cell locations so that, when the governing relations are discretized using the grid, at least some of the discretization errors are substantially zero. Conventional grid generation is driven by the problem space geometry; grid generation according to the present invention is driven by problem space geometry and by governing relations. The present invention accordingly can provide two significant benefits: more efficient and accurate modeling since discretization errors are minimized, and reduced cost grid generation since less human interaction is required.
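
    The core idea — placing grid points so that discretization errors vanish — can be illustrated with a standard one-dimensional example that is not taken from the patent; it is only meant to make "discretization errors substantially zero" concrete.

        % Illustration only (not from the patent): truncation error of a two-sided
        % difference on a nonuniform 1-D grid, with h_- = x_i - x_{i-1} and
        % h_+ = x_{i+1} - x_i.
        \[
        \frac{f_{i+1} - f_{i-1}}{h_+ + h_-}
          = f'(x_i) + \frac{h_+ - h_-}{2}\, f''(x_i) + O(h^2) .
        \]
        % Choosing the grid locations so that h_+ = h_- makes the leading
        % discretization error term identically zero -- the same principle, applied
        % to the governing relations of the problem, that drives the grid generation
        % described above.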

  1. Dynamic Power Grid Simulation

    Energy Science and Technology Software Center (ESTSC)

    2015-09-14

    GridDyn is part of a power grid simulation toolkit. The code is designed using modern object-oriented C++ methods, utilizing C++11 and recent Boost libraries to ensure compatibility with multiple operating systems and environments.

  2. Collar grids for intersecting geometric components within the Chimera overlapped grid scheme

    NASA Technical Reports Server (NTRS)

    Parks, Steven J.; Buning, Pieter G.; Chan, William M.; Steger, Joseph L.

    1991-01-01

    A method for overcoming problems with using the Chimera overset grid scheme in the region of intersecting geometry components is presented. A 'collar grid' resolves the intersection region and provides communication between the component grids. This approach is validated by comparing computed and experimental data for a flow about a wing/body configuration. Application of the collar grid scheme to the Orbiter fuselage and vertical tail intersection in a computation of the full Space Shuttle launch vehicle demonstrates its usefulness for simulation of flow about complex aerospace vehicles.

  3. IPG Power Grid Overview

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas

    2003-01-01

    This presentation will describe what is meant by grids and then cover the current state of the IPG. This will include an overview of the middleware that is key to the operation of the grid. The presentation will then describe some of the future directions that are planned for the IPG. Finally the presentation will conclude with a brief overview of the Global Grid Forum, which is a key activity that will contribute to the successful availability of grid components.

  4. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  5. Systems Manual for the Experimental Literature Collection and Reference Retrieval System of the Center for the Information Sciences. Experimental Retrieval Systems Studies, Report Number 2.

    ERIC Educational Resources Information Center

    Anderson, Ronald R.; Taylor, Robert S.

    The manual describes and documents the retrieval system in terms of its tape and disk file programs and its search programs as used by the Lehigh Center for the Information Sciences for selected current literature of the information sciences, about 2500 document references. The system is presently on-line via teletype and conversion is in process…

  6. Understanding The Smart Grid

    SciTech Connect

    2007-11-15

    The report provides an overview of what the Smart Grid is and what is being done to define and implement it. The electric industry is preparing to undergo a transition from a centralized, producer-controlled network to a decentralized, user-interactive one. Not only will the technology involved in the electric grid change, but the entire business model of the industry will change too. A major objective of the report is to identify the changes that the Smart Grid will bring about so that industry participants can be prepared to face them. A concise overview of the development of the Smart Grid is provided. It presents an understanding of what the Smart Grid is, what new business opportunities or risks might come about due to its introduction, and what activities are already taking place regarding defining or implementing the Smart Grid. This report will be of interest to the utility industry, energy service providers, aggregators, and regulators. It will also be of interest to home/building automation vendors, information technology vendors, academics, consultants, and analysts. The scope of the report includes an overview of the Smart Grid which identifies the main components of the Smart Grid, describes its characteristics, and describes how the Smart Grid differs from the current electric grid. The overview also identifies the key concepts involved in the transition to the Smart Grid and explains why a Smart Grid is needed by identifying the deficiencies of the current grid and the need for new investment. The report also looks at the impact of the Smart Grid, identifying other industries which have gone through a similar transition, identifying the overall benefits of the Smart Grid, and discussing the impact of the Smart Grid on industry participants. Furthermore, the report looks at current activities to implement the Smart Grid including utility projects, industry collaborations, and government initiatives. Finally, the report takes a look at key technology

  7. An Experimental Design to Study the Effectiveness of PBL in Higher Education, in First Year Science Students at a University in Peru, South America

    ERIC Educational Resources Information Center

    Alcazar, Maria Teresa Moreno; Fitzgerald, Victoria Landa

    2005-01-01

    An experimental study was designed to study the effectiveness of Problem Based Learning (PBL) in the context of higher education in an urban-city university in Lima, Peru. In the fall semester of 2004, eleven sections of Chemistry 1 were offered to first year students in the College of Science at this University. In six of these eleven sections…

  8. Grid quality improvement by a grid adaptation technique

    NASA Technical Reports Server (NTRS)

    Lee, K. D.; Henderson, T. L.; Choo, Y. K.

    1991-01-01

    A grid adaptation technique is presented which improves grid quality. The method begins with an assessment of grid quality by defining an appropriate grid quality measure. Then, undesirable grid properties are eliminated by a grid-quality-adaptive grid generation procedure. The same concept has been used for geometry-adaptive and solution-adaptive grid generation. The difference lies in the definition of the grid control sources; here, they are extracted from the distribution of a particular grid property. Several examples are presented to demonstrate the versatility and effectiveness of the method.
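
    The paper's exact grid quality measure is not given here, so the sketch below uses one plausible stand-in — maximum deviation of cell corner angles from 90 degrees on a 2-D structured grid — purely to make "assessing grid quality" concrete. The measure, array layout, and example grid are assumptions, not the authors' definition.

        # One possible grid-quality measure (an assumption): worst deviation of cell
        # corner angles from 90 degrees for a 2-D structured grid x[j, i], y[j, i].
        import numpy as np

        def max_skew_deg(x, y):
            worst = 0.0
            nj, ni = x.shape
            for j in range(nj - 1):
                for i in range(ni - 1):
                    # the four corner nodes of cell (j, i), in order around the quad
                    c = [(x[j, i], y[j, i]), (x[j, i + 1], y[j, i + 1]),
                         (x[j + 1, i + 1], y[j + 1, i + 1]), (x[j + 1, i], y[j + 1, i])]
                    for k in range(4):
                        a = np.subtract(c[(k - 1) % 4], c[k])
                        b = np.subtract(c[(k + 1) % 4], c[k])
                        cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
                        ang = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
                        worst = max(worst, abs(ang - 90.0))
            return worst

        # A perfectly Cartesian grid scores ~0; sheared or folded cells score higher.
        xx, yy = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
        print(max_skew_deg(xx, yy))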

  9. Navigation in Grid Space with the NAS Grid Benchmarks

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We present a navigational tool for computational grids. The navigational process is based on measuring the grid characteristics with the NAS Grid Benchmarks (NGB) and using the measurements to assign tasks of a grid application to the grid machines. The tool allows the user to explore the grid space and to navigate the execution of a grid application to minimize its turnaround time. We introduce the notion of gridscape as a user view of the grid and show how it can be measured by NGB. Then we demonstrate how the gridscape can be used with two different schedulers to navigate a grid application through a rudimentary grid.
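
    As a hypothetical illustration of using such measurements for scheduling (not the NGB tool itself), the sketch below turns per-machine benchmark scores into a greedy task-to-machine assignment aimed at minimizing turnaround time; the task names, machine names, and numbers are invented.

        # Greedy assignment of tasks to machines based on measured "speeds"
        # (a stand-in for an NGB-derived gridscape). Illustrative only.
        import heapq

        def assign(tasks, speeds):
            """tasks: {name: work units}; speeds: {machine: units per second}."""
            heap = [(0.0, m) for m in speeds]          # (projected finish time, machine)
            heapq.heapify(heap)
            plan = {}
            for name, work in sorted(tasks.items(), key=lambda kv: -kv[1]):  # big first
                finish, machine = heapq.heappop(heap)
                finish += work / speeds[machine]
                plan[name] = machine
                heapq.heappush(heap, (finish, machine))
            return plan, max(t for t, _ in heap)       # plan and projected turnaround

        tasks = {"BT.S": 120.0, "SP.S": 90.0, "LU.S": 60.0, "MG.S": 30.0}
        speeds = {"gridnode-a": 2.0, "gridnode-b": 1.0}
        plan, makespan = assign(tasks, speeds)
        print(plan, makespan)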

  10. Development and Evaluation of an Experimental Curriculum for the New Quincy (Mass.) Vocational-Technical School. The Science Curriculum.

    ERIC Educational Resources Information Center

    Champagne, Audrey; Albert, Anne

    Activities concerning the development of the science curriculum of Project ABLE are summarized. The science curriculum attempts to relate science content to vocational areas where applicable, but emphasizes generalizations which the student will apply in his specific vocational field. Intended for 10th, 11th, and 12th grade students, the…

  11. Emerging trends: grid technology in pathology.

    PubMed

    Bueno, Gloria; García-Rojo, Marcial; Déniz, Oscar; Fernández-Carrobles, María del Milagro; Vállez, Noelia; Salido, Jesús; García-González, Jesús

    2012-01-01

    Grid technology has enabled clustering and access to, and interaction among, a wide variety of geographically distributed resources such as supercomputers, storage systems, data sources, instruments as well as special devices and services, realizing network-centric operations. Their main applications include large scale computational and data intensive problems in science and engineering. Grids are likely to have a deep impact on health related applications. Moreover, they seem to be suitable for tissue-based diagnosis. They offer a powerful tool to deal with current challenges in many biomedical domains involving complex anatomical and physiological modeling of structures from images, or the assembly and analysis of large image databases. This chapter analyzes the general structures and functions of a Grid environment implemented for tissue-based diagnosis on digital images. Moreover, it presents a Grid middleware implemented by the authors for diagnostic pathology applications. The chapter is a review of the work done as part of the European COST project EUROTELEPATH. PMID:22925801

  12. On the Frisch-Grid signal in ionization chambers

    NASA Astrophysics Data System (ADS)

    Al-Adili, A.; Hambsch, F.-J.; Bencardino, R.; Pomp, S.; Oberstedt, S.; Zeynalov, Sh.

    2012-04-01

    A recent theoretical approach concerning the grid-inefficiency (GI) problem in Twin Frisch-Grid Ionization Chambers was validated experimentally. The experimental verification focused on the induced signal on the anode plate. In this work the investigation was extended by studying the grid signal. The aim was to verify the grid-signal dependency on the grid inefficiency σ. The measurements were made with fission fragments from 252Cf(sf), using two different grids, with 1 and 2 mm wire distances, leading to the GI values: σ=0.031 and σ=0.083, respectively. The theoretical grid signal was confirmed because the detected grid pulse-height distribution was smaller for the larger σ. By applying the additive GI correction approach, the two grid pulse heights were consistent. In the second part of the work, the corrected grid signal was used to deduce emission angles of the fission fragments. It is inconvenient to treat the grid signal by means of conventional analogue electronics, because of its bipolarity. Therefore, the anode and grid signals were summed to create a unipolar, angle-dependent pulse height. Until now the so-called summing method has been the well-established approach to deduce the angle from the grid signal. However, this operation relies strongly on an accurate and stable calibration between the two summed signals. By application of digital-signal processing, the grid signal's bipolarity is no longer an issue. Hence one can bypass the intermediate summation step of the two different pre-amplifier signals, which leads to higher stability. In this work the grid approach was compared to the summing method in three cases: 252Cf(sf), 235U(n,f) and 234U(n,f). By using the grid directly, the angular resolution was found equally good in the first case but gave 7% and 20% improvements, respectively, in the latter cases.

  13. Grid enabled Service Support Environment - SSE Grid

    NASA Astrophysics Data System (ADS)

    Goor, Erwin; Paepen, Martine

    2010-05-01

    The SSEGrid project is an ESA/ESRIN project which started in 2009 and is executed by two Belgian companies, Spacebel and VITO, and one Dutch company, Dutch Space. The main project objectives are the introduction of a Grid-based processing on demand infrastructure at the Image Processing Centre for earth observation products at VITO and the inclusion of Grid processing services in the Service Support Environment (SSE) at ESRIN. The Grid-based processing on demand infrastructure is meant to support a Grid processing on demand model for Principal Investigators (PI) and allow the design and execution of multi-sensor applications with geographically spread data while minimising the transfer of huge volumes of data. In the first scenario, 'support a Grid processing on demand model for Principal Investigators', we aim to provide processing power close to the EO-data at the processing and archiving centres. We will allow a PI (non-Grid expert user) to upload his own algorithm, as a process, and his own auxiliary data from the SSE Portal and use them in an earth observation workflow on the SSEGrid Infrastructure. The PI can design and submit workflows using his own processes, processes made available by VITO/ESRIN and possibly processes from other users that are available on the Grid. These activities must be user-friendly and must not require detailed knowledge about the underlying Grid middleware. In the second scenario we aim to design, implement and demonstrate a methodology to set up an earth observation processing facility, which uses large volumes of data from various geographically spread sensors. The aim is to provide solutions for problems that we face today, like wasting bandwidth by copying large volumes of data to one location. We will avoid this by processing the data where they are. The multi-mission Grid-based processing on demand infrastructure will allow developing and executing complex and massive multi-sensor data (re-)processing applications more

  14. 75 FR 6414 - Consumer Interface With the Smart Grid

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-09

    ... From the Federal Register Online via the Government Publishing Office OFFICE OF SCIENCE AND TECHNOLOGY POLICY Consumer Interface With the Smart Grid AGENCY: Office of Science and Technology Policy... this notice, the Office of Science and Technology Policy (OSTP) within the Executive Office of...

  15. Securing smart grid technology

    NASA Astrophysics Data System (ADS)

    Chaitanya Krishna, E.; Kosaleswara Reddy, T.; Reddy, M. YogaTeja; Reddy G. M., Sreerama; Madhusudhan, E.; AlMuhteb, Sulaiman

    2013-03-01

    In developing countries, electrical energy is very important for all-round development, saving thousands of dollars that can be invested in other sectors. The existing hierarchical, centrally controlled grid of the 20th century is not sufficient for the growing demand for power. To produce and deliver an effective power supply for industry and people, we need smarter electrical grids that address the challenges of the existing power grid. The smart grid can be considered a modern electric power grid infrastructure for enhanced efficiency and reliability through automated control, high-power converters, a modern communications infrastructure along with modern IT services, sensing and metering technologies, and modern energy management techniques based on the optimization of demand, energy and network availability, and so on. The main objective of this paper is to provide a contemporary look at the current state of the art in smart grid communications as well as critical issues in smart grid technologies, primarily information and communication technology (ICT) issues such as security and efficiency at the communications layer. In this paper we propose a new model for security in smart grid technology that contains a Security Module (SM) along with DEM, which will enhance security in the grid. It is expected that this paper will provide a better understanding of the technologies, potential advantages and research challenges of the smart grid and provoke interest among the research community to further explore this promising research area.

  16. ITIL and Grid services at GridKa

    NASA Astrophysics Data System (ADS)

    Marten, H.; Koenig, T.

    2010-04-01

    The Steinbuch Centre for Computing (SCC) is a new organizational unit of the Karlsruhe Institute of Technology (KIT). Founded in February 2008 as a merger of the previous Institute for Scientific Computing of Forschungszentrum Karlsruhe and the Computing Centre of the Technical University Karlsruhe, SCC provides a broad spectrum of IT services for 8,000 employees and 18,000 students and carries out research and development in key areas of information technology under the same roof. SCC is also known to host the German WLCG [1] Tier-1 centre GridKa. In order to accompany the merging of the two existing computing centres located at a distance of about 10 km and to provide common first class services for science, SCC has selected IT service management according to the industrial quasi-standard "IT Infrastructure Library (ITIL)" [3] as a strategic element. The paper discusses the implementation of a few ITIL key components from the perspective of a Scientific Computing Centre using examples of Grid services at GridKa.

  17. Grid Application for the BaBar Experiment

    SciTech Connect

    Khan, A.; Wilson, F.; /Rutherford

    2006-08-14

    This paper discusses the use of e-Science Grid in providing computational resources for modern international High Energy Physics (HEP) experiments. We investigate the suitability of the current generation of Grid software to provide the necessary resources to perform large-scale simulation of the experiment and analysis of data in the context of multinational collaboration.

  18. DICOM image communication in globus-based medical grids.

    PubMed

    Vossberg, Michal; Tolxdorff, Thomas; Krefting, Dagmar

    2008-03-01

    Grid computing, the collaboration of distributed resources across institutional borders, is an emerging technology to meet the rising demand on computing power and storage capacity in fields such as high-energy physics, climate modeling, or more recently, life sciences. A secure, reliable, and highly efficient data transport plays an integral role in such grid environments and even more so in medical grids. Unfortunately, many grid middleware distributions, such as the well-known Globus Toolkit, lack the integration of the world-wide medical image communication standard Digital Imaging and Communication in Medicine (DICOM). Currently, the DICOM protocol first needs to be converted to the file transfer protocol (FTP) that is offered by the grid middleware. This effectively reduces most of the advantages and security an integrated network of DICOM devices offers. In this paper, a solution is proposed that adapts the DICOM protocol to the Globus grid security infrastructure and utilizes routers to transparently route traffic to and from DICOM systems. Thus, all legacy DICOM devices can be seamlessly integrated into the grid without modifications. A prototype of the grid routers with the most important DICOM functionality has been developed and successfully tested in the MediGRID test bed, the German grid project for life sciences. PMID:18348944
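
    The routers described here wrap DICOM traffic in the Globus grid security infrastructure; that wrapping is beyond a short sketch, but the transparent-routing idea can be caricatured with a bare TCP relay. The hypothetical Python sketch below simply forwards bytes between a local DICOM device and a remote endpoint; the hosts, ports, and the omission of GSI security are assumptions, and this is not the MediGRID implementation.

        # Illustrative only: a bare TCP relay standing in for the grid router's
        # transparent forwarding (the real routers add Globus security on top).
        import socket
        import threading

        LISTEN = ("0.0.0.0", 11112)                 # where local DICOM devices connect
        UPSTREAM = ("remote-pacs.example", 11112)   # hypothetical remote DICOM endpoint

        def pump(src, dst):
            """Copy bytes from src to dst until either side closes."""
            try:
                while True:
                    data = src.recv(4096)
                    if not data:
                        break
                    dst.sendall(data)
            except OSError:
                pass
            finally:
                src.close()
                dst.close()

        def serve():
            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(LISTEN)
            srv.listen()
            while True:
                client, _ = srv.accept()
                upstream = socket.create_connection(UPSTREAM)
                threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
                threading.Thread(target=pump, args=(upstream, client), daemon=True).start()

        if __name__ == "__main__":
            serve()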

  19. Impingement-Current-Erosion Characteristics of Accelerator Grids on Two-Grid Ion Thrusters

    NASA Technical Reports Server (NTRS)

    Barker, Timothy

    1996-01-01

    Accelerator grid sputter erosion resulting from charge-exchange-ion impingement is considered to be a primary cause of failure for electrostatic ion thrusters. An experimental method was developed and implemented to measure erosion characteristics of ion-thruster accel-grids for two-grid systems as a function of beam current, accel-grid potential, and facility background pressure. Intricate accelerator grid erosion patterns, that are typically produced in a short time (a few hours), are shown. Accelerator grid volumetric and depth-erosion rates are calculated from these erosion patterns and reported for each of the parameters investigated. A simple theoretical volumetric erosion model yields results that are compared to experimental findings. Results from the model and experiments agree to within 10%, thereby verifying the testing technique. In general, the local distribution of erosion is concentrated in pits between three adjacent holes and trenches that join pits. The shapes of the pits and trenches are shown to be dependent upon operating conditions. Increases in beam current and the accel-grid voltage magnitude lead to deeper pits and trenches. Competing effects cause complex changes in depth-erosion rates as background pressure is increased. Shape factors that describe pits and trenches (i.e. ratio of the average erosion width to the maximum possible width) are also affected in relatively complex ways by changes in beam current, accel-grid voltage magnitude, and background pressure. In all cases, however, gross volumetric erosion rates agree with theoretical predictions.

  20. Ambiguities in the grid-inefficiency correction for Frisch-Grid Ionization Chambers

    NASA Astrophysics Data System (ADS)

    Al-Adili, A.; Hambsch, F.-J.; Bencardino, R.; Oberstedt, S.; Pomp, S.

    2012-05-01

    Ionization chambers with Frisch grids have been very successfully applied to neutron-induced fission-fragment studies during the past 20 years. They are radiation resistant and can be easily adapted to the experimental conditions. The use of Frisch grids has the advantage to remove the angular dependency from the charge induced on the anode plate. However, due to the Grid Inefficiency (GI) in shielding the charges, the anode signal remains slightly angular dependent. The correction for the GI is, however, essential to determine the correct energy of the ionizing particles. GI corrections can amount to a few percent of the anode signal. Presently, two contradicting correction methods are considered in literature. The first method adding the angular-dependent part of the signal to the signal pulse height; the second method subtracting the former from the latter. Both additive and subtractive approaches were investigated in an experiment where a Twin Frisch-Grid Ionization Chamber (TFGIC) was employed to detect the spontaneous fission fragments (FF) emitted by a 252Cf source. Two parallel-wire grids with different wire spacing (1 and 2 mm, respectively), were used individually, in the same chamber side. All the other experimental conditions were unchanged. The 2 mm grid featured more than double the GI of the 1 mm grid. The induced charge on the anode in both measurements was compared, before and after GI correction. Before GI correction, the 2 mm grid resulted in a lower pulse-height distribution than the 1 mm grid. After applying both GI corrections to both measurements only the additive approach led to consistent grid independent pulse-height distributions. The application of the subtractive correction on the contrary led to inconsistent, grid-dependent results. It is also shown that the impact of either of the correction methods is small on the FF mass distributions of 235U(nth, f).

  1. Solar cell grid patterns

    NASA Technical Reports Server (NTRS)

    Yasui, R. K.; Berman, P. A. (Inventor)

    1976-01-01

    A grid pattern is described for a solar cell of the type which includes a semiconductive layer doped to a first polarity and a top counter-doped layer. The grid pattern comprises a plurality of concentric conductive grids of selected geometric shapes which are centered about the center of the exposed active surface of the counter-doped layer. Connected to the grids are one or more conductors that extend to the cell's periphery. For the pattern area, the grids and conductors are arranged in the pattern to minimize the maximum distance which any injected majority carriers have to travel to reach any of the grids or conductors. The pattern has a multiaxes symmetry with respect to the cell center to minimize the maximum temperature differentials between points on the cell surface and to provide a more uniform temperature distribution across the cell face.

  2. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  3. FURSMASA: a new approach to rapid scoring functions that uses a MD-averaged potential energy grid and a solvent-accessible surface area term with parameters GA fit to experimental data.

    PubMed

    Pearlman, David A; Rao, B Govinda; Charifson, Paul

    2008-05-15

    We demonstrate a new approach to the development of scoring functions through the formulation and parameterization of a new function, which can be used both for rapidly ranking the binding of ligands to proteins and for estimating relative aqueous molecular solubilities. The intent of this work is to introduce a new paradigm for creation of scoring functions, wherein we impose the following criteria upon the function: (1) simple; (2) intuitive; (3) requires no postparameterization tweaking; (4) can be applied (without reparameterization) to multiple target systems; and (5) can be rapidly evaluated for any potential ligand. Following these criteria, a new function, FURSMASA (function for rapid scoring using an MD-averaged grid and the accessible surface area) has been developed. Three novel features of the function include: (1) use of an MD-averaged potential energy grid for ligand-protein interactions, rather than a simple static grid; (2) inclusion of a term that depends on changes in the solvent-accessible surface area on an atomic (not molecular) basis; and (3) use of the recently derived predictive index (PI) target when optimizing the function, which focuses the function on its intended purpose of relative ranking. A genetic algorithm is used to optimize the function against test data sets that include ligands for the following proteins: IMPDH, p38, gyrase B, HIV-1, and TACE, as well as the Syracuse Research solubility database. We find that the function is predictive, and can simultaneously fit all the test data sets with cross-validated predictive indices ranging from 0.68 to 0.82. As a test of the ability of this function to predict binding for systems not in the training set, the resulting fitted FURSMASA function is then applied to 23 ligands of the COX-2 enzyme. Comparing the results for COX-2 against those obtained using a variety of well-known rapid scoring functions demonstrates that FURSMASA outperforms all of them in terms of the PI and
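
    The predictive index (PI) used as the optimization target is a pairwise rank-agreement score. The sketch below follows the commonly cited definition (pair weights equal to the experimental differences, sign set by whether the predicted ordering of each pair is correct); treating this as the paper's exact formula is an assumption, and the example affinities are invented.

        # Pairwise predictive index: +1 for a perfectly rank-ordered prediction,
        # -1 for a fully anti-correlated one (definition assumed, not quoted).
        def predictive_index(experimental, predicted):
            num = den = 0.0
            n = len(experimental)
            for i in range(n):
                for j in range(i + 1, n):
                    w = abs(experimental[j] - experimental[i])
                    de = experimental[j] - experimental[i]
                    dp = predicted[j] - predicted[i]
                    c = 0.0 if dp == 0 else (1.0 if de * dp > 0 else -1.0)
                    num += w * c
                    den += w
            return num / den if den else 0.0

        # Invented example affinities: correct ranking gives 1.0, reversed gives -1.0.
        print(predictive_index([-9.1, -8.2, -7.5, -6.0], [-10.0, -8.9, -8.0, -5.5]))
        print(predictive_index([-9.1, -8.2, -7.5, -6.0], [-5.5, -8.0, -8.9, -10.0]))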

  4. Consideration of Experimental Approaches in the Physical and Biological Sciences in Designing Long-Term Watershed Studies in Forested Landscapes

    NASA Astrophysics Data System (ADS)

    Stallard, R. F.

    2011-12-01

    The importance of biological processes in controlling weathering, erosion, stream-water composition, soil formation, and overall landscape development is generally accepted. The U.S. Geological Survey (USGS) Water, Energy, and Biogeochemical Budgets (WEBB) Project in eastern Puerto Rico and Panama and the Smithsonian Tropical Research Institute (STRI) Panama Canal Watershed Experiment (PCWE) are landscape-scale studies based in the humid tropics where the warm temperatures, moist conditions, and luxuriant vegetation promote especially rapid biological and chemical processes - photosynthesis, respiration, decay, and chemical weathering. In both studies features of small-watershed, large-watershed, and landscape-scale-biology experiments are blended to satisfy the research needs of the physical and biological sciences. The WEBB Project has successfully synthesized its first fifteen years of data, and has addressed the influence of land cover, geologic, topographic, and hydrologic variability, including huge storms on a wide range of hydrologic, physical, and biogeochemical processes. The ongoing PCWE should provide a similar synthesis of a moderate-sized humid tropical watershed. The PCWE and the Agua Salud Project (ASP) within the PCWE are now addressing the role of land cover (mature forests, pasture, invasive-grass dominated, secondary succession, native species plantation, and teak) at scales ranging from small watersheds to the whole Panama Canal watershed. Biologists have participated in the experimental design at both watershed scales, and small (0.1 ha) to large (50 ha) forest-dynamic plots have a central role in interfacing between physical scientists and biologists. In these plots, repeated, high-resolution mapping of all woody plants greater than 1-cm diameter provides a description of population changes through time presumably reflecting individual life histories, interactions with other organisms and the influence of landscape processes and climate

  5. Enhanced Elliptic Grid Generation

    NASA Technical Reports Server (NTRS)

    Kaul, Upender K.

    2007-01-01

    An enhanced method of elliptic grid generation has been invented. Whereas prior methods require user input of certain grid parameters, this method provides for these parameters to be determined automatically. "Elliptic grid generation" signifies generation of generalized curvilinear coordinate grids through solution of elliptic partial differential equations (PDEs). Usually, such grids are fitted to bounding bodies and used in numerical solution of other PDEs like those of fluid flow, heat flow, and electromagnetics. Such a grid is smooth and has continuous first and second derivatives (and possibly also continuous higher-order derivatives), grid lines are appropriately stretched or clustered, and grid lines are orthogonal or nearly so over most of the grid domain. The source terms in the grid-generating PDEs (hereafter called "defining" PDEs) make it possible for the grid to satisfy requirements for clustering and orthogonality properties in the vicinity of specific surfaces in three dimensions or in the vicinity of specific lines in two dimensions. The grid parameters in question are decay parameters that appear in the source terms of the inhomogeneous defining PDEs. The decay parameters are characteristic lengths in exponential- decay factors that express how the influences of the boundaries decrease with distance from the boundaries. These terms govern the rates at which distance between adjacent grid lines change with distance from nearby boundaries. Heretofore, users have arbitrarily specified decay parameters. However, the characteristic lengths are coupled with the strengths of the source terms, such that arbitrary specification could lead to conflicts among parameter values. Moreover, the manual insertion of decay parameters is cumbersome for static grids and infeasible for dynamically changing grids. In the present method, manual insertion and user specification of decay parameters are neither required nor allowed. Instead, the decay parameters are
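
    For context, the classical Thompson-Thames-Mastin form of the defining PDEs shows where such decay parameters enter; quoting it here is an assumption about notation, since the abstract does not reproduce the equations, and the enhanced method determines the analogous parameters automatically instead of taking them from the user.

        % Classical TTM control functions (xi equation shown; Q is analogous).
        \[
        \nabla^2 \xi = P(\xi,\eta), \qquad \nabla^2 \eta = Q(\xi,\eta),
        \]
        \[
        P(\xi,\eta) = -\sum_i a_i\,\mathrm{sgn}(\xi-\xi_i)\,e^{-c_i\,|\xi-\xi_i|}
                      -\sum_j b_j\,\mathrm{sgn}(\xi-\xi_j)\,
                       e^{-d_j\sqrt{(\xi-\xi_j)^{2}+(\eta-\eta_j)^{2}}},
        \]
        % where the c_i and d_j are the exponential decay parameters (characteristic
        % lengths) that set how quickly the attraction toward the grid lines xi = xi_i
        % and the points (xi_j, eta_j) dies off with distance.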

  6. A grid amplifier

    NASA Technical Reports Server (NTRS)

    Kim, Moonil; Weikle, Robert M., II; Hacker, Jonathan B.; Delisio, Michael P.; Rutledge, David B.; Rosenberg, James J.; Smith, R. P.

    1991-01-01

    A 50-MESFET grid amplifier is reported that has a gain of 11 dB at 3.3 GHz. The grid isolates the input from the output by using vertical polarization for the input beam and horizontal polarization for the transmitted output beam. The grid unit cell is a two-MESFET differential amplifier. A simple calibration procedure allows the gain to be calculated from a relative power measurement. This grid is a hybrid circuit, but the structure is suitable for fabrication as a monolithic wafer-scale integrated circuit, particularly at millimeter wavelengths.

  7. D. Carlos de Braganca, a Pioneer of Experimental Marine Oceanography: Filling the Gap between Formal and Informal Science Education

    ERIC Educational Resources Information Center

    Faria, Claudia; Pereira, Goncalo; Chagas, Isabel

    2012-01-01

    The activities presented in this paper are part of a wider project that investigates the effects of infusing the history of science in science teaching, toward students' learning and attitude. Focused on the work of D. Carlos de Braganca, King of Portugal from 1889 to 1908, and a pioneer oceanographer, the activities are addressed at the secondary…

  8. Spaceflight Operations Services Grid (SOSG)

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.

    2004-01-01

    environment that incorporates existing and new spaceflight services into a standards-based framework providing current and future NASA programs with cost savings and new and evolvable methods to conduct science. This project will demonstrate how the use of new programming paradigms such as web and grid services can provide three significant benefits to the cost-effective delivery of spaceflight services. They will enable applications to operate more efficiently by being able to utilize pooled resources. They will also permit the reuse of common services to rapidly construct new and more powerful applications. Finally they will permit easy and secure access to services via a combination of grid and portal technology by a distributed user community consisting of NASA operations centers, scientists, the educational community and even the general population as outreach. The approach will be to deploy existing mission support applications such as the Telescience Resource Kit (TReK) and new applications under development, such as the Grid Video Distribution System (GViDS), together with existing grid applications and services such as high-performance computing and visualization services provided by NASA's Information Power Grid (IPG) in the MSFC's Payload Operations Integration Center (POIC) HOSC Annex. Once the initial applications have been moved to the grid, a process will begin to apply the new programming paradigms to integrate them where possible. For example, with GViDS, instead of viewing the Distribution service as an application that must run on a single node, the new approach is to build it such that it can be dispatched across a pool of resources in response to dynamic loads. To make this a reality, reusable services will be critical, such as a brokering service to locate appropriate resource within the pool. This brokering service can then be used by other applications such as the TReK. To expand further, if the GViDS application is constructed using a services

  10. Geometric grid generation

    NASA Technical Reports Server (NTRS)

    Ives, David

    1995-01-01

    This paper presents a highly automated hexahedral grid generator based on extensive geometrical and solid modeling operations, developed in response to a vision of a designer-driven, one-day-turnaround CFD process, which in turn implies a designer-driven, one-hour grid-generation process.

  11. Internet 2 Access Grid.

    ERIC Educational Resources Information Center

    Simco, Greg

    2002-01-01

    Discussion of the Internet 2 Initiative, which is based on collaboration among universities, businesses, and government, focuses on the Access Grid, a Computational Grid that includes interactive multimedia within high-speed networks to provide resources to enable remote collaboration among the research community. (Author/LRW)

  12. Security for grids

    SciTech Connect

    Humphrey, Marty; Thompson, Mary R.; Jackson, Keith R.

    2005-08-14

    Securing a Grid environment presents a distinctive set of challenges. This paper groups the activities that need to be secured into four categories: naming and authentication; secure communication; trust, policy, and authorization; and enforcement of access control. It examines the current state of the art in securing these processes and introduces new technologies that promise to meet the security requirements of Grids more completely.

  13. Campus Grids: Bringing Additional Computational Resources to HEP Researchers

    NASA Astrophysics Data System (ADS)

    Weitzel, Derek; Fraser, Dan; Bockelman, Brian; Swanson, David

    2012-12-01

    It is common at research institutions to maintain multiple clusters that represent different owners or generations of hardware, or that fulfill different needs and policies. Many of these clusters are consistently underutilized while researchers on campus could greatly benefit from these unused capabilities. By leveraging principles from the Open Science Grid it is now possible to utilize these resources by forming a lightweight campus grid. The campus grids framework enables jobs that are submitted to one cluster to overflow, when necessary, to other clusters within the campus using whatever authentication mechanisms are available on campus. This framework is currently being used on several campuses to run HEP and other science jobs. Further, the framework has in some cases been expanded beyond the campus boundary by bridging campus grids into a regional grid, and can even be used to integrate resources from a national cyberinfrastructure such as the Open Science Grid. This paper will highlight 18 months of operational experiences creating campus grids in the US, and the different campus configurations that have successfully utilized the campus grid infrastructure.
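
    The overflow behaviour described here can be caricatured in a few lines: prefer the home cluster, spill to other campus clusters when it is full, and fall back to a national infrastructure last. The Python sketch below is purely illustrative; the cluster names, capacities, and the simple free-slot test are invented and stand in for what a real campus grid delegates to its batch systems.

        # Hypothetical overflow routing: home cluster first, then campus peers,
        # then a national infrastructure such as the Open Science Grid.
        def route_job(job, clusters):
            """clusters: ordered list of dicts with 'name' and 'free_slots'."""
            for c in clusters:
                if c["free_slots"] > 0:
                    c["free_slots"] -= 1
                    return c["name"]
            return "queue-locally"   # nothing free anywhere: wait at the home cluster

        campus = [
            {"name": "hep-cluster", "free_slots": 0},        # home cluster, currently full
            {"name": "engineering-cluster", "free_slots": 2},
            {"name": "osg-overflow", "free_slots": 1000},
        ]
        print([route_job(f"job-{i}", campus) for i in range(4)])
        # -> ['engineering-cluster', 'engineering-cluster', 'osg-overflow', 'osg-overflow']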

  14. The SIM astrometric grid

    NASA Technical Reports Server (NTRS)

    Swartz, R.

    2002-01-01

    The Space Interferometry Mission (SIM) is fundamentally a one-dimensional instrument with a 15-degree field-of-regard. Mission objectives require a global reference grid of thousands of well-understood stars with positions known to 4 microarcseconds which will be used to establish the instrument baseline vector during scientific observations. This accuracy will be achieved by frequently observing a set of stars throughout the mission and performing a global fit of the observations to determine position, proper motion and parallax for each star. Each star will be observed approximately 200 times with about 6.5 stars per single instrument field on the sky. We describe the nature of the reference grid, the candidate objects, and the results of simulations demonstrating grid performance, including estimates of the grid robustness when including effects such as instrument drift and possible contamination of the grid star sample by undetected binaries.

  15. Optimization Of A Computational Grid

    NASA Technical Reports Server (NTRS)

    Pearce, Daniel G.

    1993-01-01

    In an improved method of generating a computational grid, the grid-generation process is decoupled from the definition of the geometry. It is not necessary to redefine the boundary. Instead, continuous boundaries in the physical domain are specified, and then grid points in the computational domain are mapped onto the continuous boundaries.

  16. Decentral Smart Grid Control

    NASA Astrophysics Data System (ADS)

    Schäfer, Benjamin; Matthiae, Moritz; Timme, Marc; Witthaut, Dirk

    2015-01-01

    Stable operation of complex flow and transportation networks requires balanced supply and demand. For the operation of electric power grids—due to their increasing fraction of renewable energy sources—a pressing challenge is to fit the fluctuations in decentralized supply to the distributed and temporally varying demands. To achieve this goal, common smart grid concepts suggest to collect consumer demand data, centrally evaluate them given current supply and send price information back to customers for them to decide about usage. Besides restrictions regarding cyber security, privacy protection and large required investments, it remains unclear how such central smart grid options guarantee overall stability. Here we propose a Decentral Smart Grid Control, where the price is directly linked to the local grid frequency at each customer. The grid frequency provides all necessary information about the current power balance such that it is sufficient to match supply and demand without the need for a centralized IT infrastructure. We analyze the performance and the dynamical stability of the power grid with such a control system. Our results suggest that the proposed Decentral Smart Grid Control is feasible independent of effective measurement delays, if frequencies are averaged over sufficiently large time intervals.
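
    As a heavily simplified, single-bus caricature of the control idea (not the paper's networked oscillator model), the sketch below lets demand respond to the locally measured frequency deviation, which stands in for the price signal; all constants are invented, and the point is only that the frequency-linked response pulls the steady-state deviation down.

        # Single-bus sketch of frequency-linked demand response (illustrative only).
        import numpy as np

        M, D = 10.0, 0.5           # inertia and native damping (arbitrary units)
        P_GEN, P_BASE = 1.2, 1.0   # generation and baseline demand

        def simulate(k, dt=0.1, steps=2000):
            """k: strength of the frequency-linked ("price") demand response."""
            omega = 0.0                        # frequency deviation from nominal
            trace = np.empty(steps)
            for n in range(steps):
                load = P_BASE + k * omega      # high frequency -> low price -> more demand
                omega += dt * (P_GEN - load - D * omega) / M
                trace[n] = omega
            return trace

        no_response = simulate(k=0.0)[-1]   # settles near (P_GEN - P_BASE) / D = 0.4
        with_dsgc = simulate(k=2.0)[-1]     # settles near (P_GEN - P_BASE) / (D + k) = 0.08
        print(no_response, with_dsgc)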

  17. Dynamic Load Balancing for Adaptive Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Saini, Subhash (Technical Monitor)

    1998-01-01

    Dynamic mesh adaptation on unstructured grids is a powerful tool for computing unsteady three-dimensional problems that require grid modifications to efficiently resolve solution features. By locally refining and coarsening the mesh to capture phenomena of interest, such procedures make standard computational methods more cost effective. Highly refined meshes are required to accurately capture shock waves, contact discontinuities, vortices, and shear layers in fluid flow problems. Adaptive meshes have also proved to be useful in several other areas of computational science and engineering like computer vision and graphics, semiconductor device modeling, and structural mechanics. Local mesh adaptation provides the opportunity to obtain solutions that are comparable to those obtained on globally-refined grids but at a much lower cost. Additional information is contained in the original extended abstract.

  18. Visual Methods for Model and Grid Validation

    NASA Technical Reports Server (NTRS)

    Pang, Alex

    1998-01-01

    This joint research interchange proposal allowed us to contribute in two directions that are of interest to NASA. These are: (a) data level comparative visualization of experimental and computational fluid flow, and (b) visualization tools for analysis of adaptively refined Cartesian grids.

  19. Grid Computing Education Support

    SciTech Connect

    Steven Crumb

    2008-01-15

    The GGF Student Scholar program gave GGF the opportunity to bring over sixty qualified graduate and undergraduate students with interests in grid technologies to its three annual events over the three-year program.

  20. Space Development Grid Portal

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2004-01-01

    This viewgraph presentation provides information on the development of a portal to provide secure and distributed grid computing for Payload Operations Integrated Center and Mission Control Center ground services.

  1. Geoscientific Workflows using Grid Computing Infrastructure

    NASA Astrophysics Data System (ADS)

    Fraser, Ryan; Woodcock, Robert; Rankine, Terry

    2010-05-01

    A major benefit of high-performance computing is the vast improvement it offers scientists and researchers in Australia when it comes to exploring large data problems. Thus, high-performance computers and grid computing have become key components in exploration and mining applications in the past few years, supporting anything from running numerical simulations to creating virtual laboratories or providing a mechanism to discover and process geodetic information collected by GPS stations. However, scientists have previously been restricted in their ability to access high-performance computing for these and other uses due to the fragmented nature of the infrastructure. Australia's high-performance centres host a range of different supercomputers from different manufacturers, with differing configurations and different architectures. Each supercomputer is different, so users who want to use more than one successfully need to know the differences between them. CSIRO has found an answer to this dilemma via the AuScope Grid. AuScope Grid is creating an e-Research infrastructure to federate and make nationally distributed datasets and high-performance computing resources interoperable. AuScope Grid is developing tools to manipulate large data volumes and establishing an appropriate governance framework to ensure sustainability. The AuScope Grid comprises distributed data storage hardware, high-bandwidth network links, data management protocols, middleware and software. Major geoscience and geospatial data stores of the government agencies are deploying this technology for use internally and as an external face to their data. Combining this with high-performance compute resources and high-bandwidth networks, the academic community can now tackle science problems that it has wanted to attempt for some time but that were not feasible until now. AuScope has deployed a grid computing platform which standardises access to high performance computers

  2. IDL Grid Web Portal

    NASA Astrophysics Data System (ADS)

    Massimino, P.; Costa, A.

    2008-08-01

    Interactive Data Language (IDL) is software for data analysis, visualization and cross-platform application development. The potential of IDL is well known in the academic scientific world, especially in the astronomical environment, where thousands of procedures are developed using IDL. The typical use of IDL is the interactive mode, but it is also possible to run IDL programs that do not require any interaction with the user by submitting them in batch or background mode. In the interactive mode the user immediately receives images or other data produced during the running phase of the program; in batch or background mode, the user has to wait for the end of the program, sometimes for many hours or days, to obtain the images or data that IDL produces as output: in fact, in a Grid environment it is possible to access or retrieve data only after completion of the program. The work that we present gives flexibility to IDL procedures submitted to the Grid computing infrastructure. For this purpose we have developed an IDL Grid Web Portal to allow the user to access the Grid and to submit IDL programs, granting full job control and access to images and data generated during the running phase, without waiting for completion. We have used PHP technology and have provided the same level of security that the Grid normally offers to its users. In this way, when the user notices that the intermediate program results are not those expected, he can stop the job, change the parameters to better satisfy the computational algorithm and resubmit the program, without consuming CPU time and other Grid resources. The IDL Grid Web Portal allows the user to obtain IDL-generated images, graphics and data tables by using a normal browser. All communication between the user and the Grid resources occurs via the Web, as do the authentication phases. The IDL user does not have to change the program source much because the Portal will automatically introduce the appropriate modification before

  3. Random array grid collimator

    DOEpatents

    Fenimore, E.E.

    1980-08-22

    A hexagonally shaped quasi-random no-two-holes-touching grid collimator. The quasi-random array grid collimator eliminates contamination from small-angle off-axis rays by using a no-two-holes-touching pattern which simultaneously provides for a self-supporting array, increasing throughput by elimination of a substrate. The present invention also provides maximum throughput using hexagonally shaped holes in a hexagonal lattice pattern for diffraction-limited applications. Mosaicking is also disclosed for reducing fabrication effort.

  4. Beyond grid security

    NASA Astrophysics Data System (ADS)

    Hoeft, B.; Epting, U.; Koenig, T.

    2008-07-01

    While many fields relevant to Grid security are already covered by existing working groups, their remit rarely goes beyond the scope of the Grid infrastructure itself. However, security issues pertaining to the internal set-up of compute centres have at least as much impact on Grid security. Thus, this talk will present briefly the EU ISSeG project (Integrated Site Security for Grids). In contrast to groups such as OSCT (Operational Security Coordination Team) and JSPG (Joint Security Policy Group), the purpose of ISSeG is to provide a holistic approach to security for Grid computer centres, from strategic considerations to an implementation plan and its deployment. The generalised methodology of Integrated Site Security (ISS) is based on the knowledge gained during its implementation at several sites as well as through security audits, and this will be briefly discussed. Several examples of ISS implementation tasks at the Forschungszentrum Karlsruhe will be presented, including segregation of the network for administration and maintenance and the implementation of Application Gateways. Furthermore, the web-based ISSeG training material will be introduced. This aims to offer ISS implementation guidance to other Grid installations in order to help avoid common pitfalls.

  5. Using Grid Benchmarks for Dynamic Scheduling of Grid Applications

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert

    2003-01-01

    Navigation or dynamic scheduling of applications on computational grids can be improved through the use of an application-specific characterization of grid resources. Current grid information systems provide a description of the resources but do not contain any application-specific information. We define a GridScape as the dynamic state of the grid resources. We measure the dynamic performance of these resources using the grid benchmarks, and then use the GridScape for automatic assignment of the tasks of a grid application to grid resources. The scalability of the system is achieved by limiting the navigation overhead to a few percent of the application resource requirements. Our task submission and assignment protocol guarantees that the navigation system does not cause grid congestion. On a synthetic data mining application we demonstrate that GridScape-based task assignment reduces the application turnaround time.
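
    The abstract above describes benchmark-driven task assignment only in prose. The Python sketch below shows one simple way such an assignment could work: rank resources by a benchmark-derived performance rate and place each task on the resource with the lowest estimated completion time. The class and field names are illustrative assumptions, not the paper's GridScape interface.

```python
# Minimal sketch (not the paper's GridScape implementation): rank grid
# resources by a benchmark-derived rate and assign tasks greedily so that
# faster resources receive proportionally more work.
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    benchmark_rate: float          # hypothetical "work units per second" from a benchmark
    assigned: list = field(default_factory=list)

def assign_tasks(tasks, resources):
    """Greedy assignment: each task goes to the resource with the lowest
    estimated completion time given its current queue."""
    for task_cost in tasks:
        best = min(resources,
                   key=lambda r: (sum(r.assigned) + task_cost) / r.benchmark_rate)
        best.assigned.append(task_cost)
    return resources

if __name__ == "__main__":
    resources = [Resource("siteA", 120.0), Resource("siteB", 45.0), Resource("siteC", 80.0)]
    assign_tasks([10, 4, 7, 7, 3, 12, 5, 9], resources)
    for r in resources:
        print(r.name, r.assigned, "est. time:", sum(r.assigned) / r.benchmark_rate)
```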

  6. Exploring Hypersonic, Unstructured-Grid Issues through Structured Grids

    NASA Technical Reports Server (NTRS)

    Mazaheri, Ali R.; Kleb, Bill

    2007-01-01

    Pure-tetrahedral unstructured grids have been shown to produce asymmetric heat transfer rates for symmetric problems. Meanwhile, two-dimensional structured grids produce symmetric solutions and, as documented here, introducing a spanwise degree of freedom to these structured grids also yields symmetric solutions. The effects of grid skewness and other perturbations of structured grids are investigated to uncover possible mechanisms behind the unstructured-grid solution asymmetries. By using controlled experiments around a known, good solution, the effects of particular grid pathologies are uncovered. These structured-grid experiments reveal solution degradation similar to that seen on unstructured grids, especially for heat transfer rates. Non-smooth grids within the boundary layer are also shown to produce large local errors in heat flux but do not affect surface pressures.

  7. Near-Body Grid Adaption for Overset Grids

    NASA Technical Reports Server (NTRS)

    Buning, Pieter G.; Pulliam, Thomas H.

    2016-01-01

    A solution adaption capability for curvilinear near-body grids has been implemented in the OVERFLOW overset grid computational fluid dynamics code. The approach follows closely that used for the Cartesian off-body grids, but inserts refined grids in the computational space of original near-body grids. Refined curvilinear grids are generated using parametric cubic interpolation, with one-sided biasing based on curvature and stretching ratio of the original grid. Sensor functions, grid marking, and solution interpolation tasks are implemented in the same fashion as for off-body grids. A goal-oriented procedure, based on largest error first, is included for controlling growth rate and maximum size of the adapted grid system. The adaption process is almost entirely parallelized using MPI, resulting in a capability suitable for viscous, moving body simulations. Two- and three-dimensional examples are presented.
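
    One ingredient mentioned above is the generation of refined curvilinear grid lines by parametric cubic interpolation. The sketch below illustrates only that ingredient with a generic cubic spline: it refines a single grid line by inserting mid-parameter points. The curvature- and stretching-based one-sided biasing used in OVERFLOW is not reproduced, and the function names are generic.

```python
# Illustrative refinement of one curvilinear grid line by parametric cubic
# interpolation (generic sketch, not the OVERFLOW implementation).
import numpy as np
from scipy.interpolate import CubicSpline

s = np.linspace(0.0, 1.0, 9)                              # original parameterization
line = np.column_stack([s, 0.2 * np.sin(2 * np.pi * s)])  # coarse grid line (x, y)
spline = CubicSpline(s, line)                             # vector-valued cubic spline

# Insert refined points at the midpoints of the original parameter intervals.
s_refined = np.sort(np.concatenate([s, 0.5 * (s[:-1] + s[1:])]))
refined_line = spline(s_refined)
print("original points:", len(s), "-> refined points:", len(s_refined))
```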

  8. Condenser Microphone Protective Grid Correction for High Frequency Measurements

    NASA Technical Reports Server (NTRS)

    Lee, Erik; Bennett, Reginald

    2010-01-01

    Use of a protective grid on small diameter microphones can prolong the lifetime of the unit, but the high frequency effects can complicate data interpretation. Analytical methods have been developed to correct for the grid effect at high frequencies. Specifically, the analysis pertains to quantifying the microphone protective grid response characteristics in the acoustic near field of a rocket plume noise source. A frequency response function computation using two microphones will be explained. Experimental and instrumentation setup details will be provided. The resulting frequency response function for a B&K 4944 condenser microphone protective grid will be presented, along with associated uncertainties.
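
    A two-microphone frequency response function is commonly estimated from the cross- and auto-spectral densities of the two channels. The sketch below shows a generic H1-style estimate with SciPy on synthetic signals; it is not the authors' exact procedure, and the sample rate, segment length, and synthetic "grid" filter are assumptions for illustration only.

```python
# Generic two-channel H1 frequency-response-function estimate between a
# reference microphone (no grid) and a test microphone (with protective grid).
import numpy as np
from scipy import signal

fs = 200_000                                       # sample rate [Hz] (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
ref = rng.standard_normal(t.size)                  # reference channel (synthetic)
test = np.convolve(ref, [0.9, 0.15], mode="same")  # grid channel (synthetic filtering)

f, Pxx = signal.welch(ref, fs=fs, nperseg=4096)    # auto-spectrum of reference
_, Pxy = signal.csd(ref, test, fs=fs, nperseg=4096)  # cross-spectrum
H1 = Pxy / Pxx                                     # complex FRF estimate

print("grid correction magnitude near 50 kHz ~",
      abs(H1[np.argmin(np.abs(f - 50_000))]))
```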

  9. Grid generation strategies for turbomachinery configurations

    NASA Astrophysics Data System (ADS)

    Lee, K. D.; Henderson, T. L.

    1991-01-01

    Turbomachinery flow fields involve unique grid generation issues due to their geometrical and physical characteristics. Several strategic approaches are discussed to generate quality grids. The grid quality is further enhanced through blending and adapting. Grid blending smooths the grids locally through averaging and diffusion operators. Grid adaptation redistributes the grid points based on a grid quality assessment. These methods are demonstrated with several examples.

  10. GRIDS: Grid-Scale Rampable Intermittent Dispatchable Storage

    SciTech Connect

    2010-09-01

    GRIDS Project: The 12 projects that comprise ARPA-E’s GRIDS Project, short for “Grid-Scale Rampable Intermittent Dispatchable Storage,” are developing storage technologies that can store renewable energy for use at any location on the grid at an investment cost less than $100 per kilowatt hour. Flexible, large-scale storage would create a stronger and more robust electric grid by enabling renewables to contribute to reliable power generation.

  11. Reference installation for the German grid initiative D-Grid

    NASA Astrophysics Data System (ADS)

    Buehler, W.; Dulov, O.; Garcia, A.; Jejkal, T.; Jrad, F.; Marten, H.; Mol, X.; Nilsen, D.; Schneider, O.

    2010-04-01

    The D-Grid reference installation is a test platform for the German grid initiative. The main task is to create the grid prototype for software and hardware components needed in the D-Grid community. For each grid-related task field different alternative middleware is included. With respect to changing demands from the community, new versions of the reference installation are released every six months.

  12. Arc Length Based Grid Distribution For Surface and Volume Grids

    NASA Technical Reports Server (NTRS)

    Mastin, C. Wayne

    1996-01-01

    Techniques are presented for distributing grid points on parametric surfaces and in volumes according to a specified distribution of arc length. Interpolation techniques are introduced which permit a given distribution of grid points on the edges of a three-dimensional grid block to be propagated through the surface and volume grids. Examples demonstrate how these methods can be used to improve the quality of grids generated by transfinite interpolation.
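
    The core operation described above, placing grid points along a curve according to a specified arc-length distribution, can be illustrated with a short interpolation sketch. This is a generic NumPy version under simple assumptions (a discrete planar curve and a normalized target distribution), not the paper's implementation.

```python
# Sketch of arc-length-based point redistribution along a discrete curve:
# given curve points and a target normalized arc-length distribution,
# interpolate new grid points at the requested arc-length fractions.
import numpy as np

def redistribute_by_arclength(xy, s_target):
    """xy: (N, 2) curve points; s_target: desired normalized arc lengths in [0, 1]."""
    seg = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    s = np.concatenate(([0.0], np.cumsum(seg)))
    s /= s[-1]                                   # normalize cumulative arc length
    x_new = np.interp(s_target, s, xy[:, 0])
    y_new = np.interp(s_target, s, xy[:, 1])
    return np.column_stack([x_new, y_new])

# Example: cluster points toward one end of a quarter circle.
theta = np.linspace(0, np.pi / 2, 200)
curve = np.column_stack([np.cos(theta), np.sin(theta)])
s_target = np.linspace(0, 1, 21) ** 1.5          # stretched distribution
print(redistribute_by_arclength(curve, s_target)[:3])
```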

  13. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web information model, this paper proposes a new information model for a Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services in the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service for the Web (CSW), and draws on the geospatial data metadata standards from ISO 19115, FGDC and the NASA EOS Core System, and the service metadata standards from ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and get it back through data-related services which provide functions such as subsetting, reformatting, reprojection, etc. This work facilitates the sharing and interoperation of geospatial resources in the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatial-enabled. It also lets researchers focus on science, and not on issues with computing capability, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  14. Exploring the Opinions of Pre-Service Science Teachers in Their Experimental Designs Prepared Based on Various Approaches

    ERIC Educational Resources Information Center

    Benzer, Elif

    2015-01-01

    Students working in laboratories in the 21st century are expected to take part as active participants in experiments, coming up with their own designs and projects by developing new ideas and problems rather than implementing procedures prescribed by others. The science teachers that would have the students…

  15. The Experimental Teaching Reform in Biochemistry and Molecular Biology for Undergraduate Students in Peking University Health Science Center

    ERIC Educational Resources Information Center

    Yang, Xiaohan; Sun, Luyang; Zhao, Ying; Yi, Xia; Zhu, Bin; Wang, Pu; Lin, Hong; Ni, Juhua

    2015-01-01

    Since 2010, second-year undergraduate students of an eight-year training program leading to a Doctor of Medicine degree or Doctor of Philosophy degree in Peking University Health Science Center (PKUHSC) have been required to enter the "Innovative talent training project." During that time, the students joined a research lab and…

  16. Recruiting, Preparing, and Retaining High Quality Secondary Mathematics and Science Teachers for Urban Schools: The Cal Teach Experimental Program

    ERIC Educational Resources Information Center

    Newton, Xiaoxia A.; Jang, Heeju; Nunes, Nicci; Stone, Elisa

    2010-01-01

    Recruiting, preparing, and retaining high quality secondary mathematics and science teachers are three of the most critical problems in the nation's urban schools that serve a vast majority of children from socially and economically disadvantaged backgrounds. Although the factors contributing to these problems are complex, one area that has caught…

  17. INTRODUCTION TO THE SOCIAL SCIENCES. AN EXPERIMENTAL INDEPENDENT STUDY COURSE FOR GIFTED HIGH SCHOOL STUDENTS. TEACHER MANUAL.

    ERIC Educational Resources Information Center

    SANDBERG, JOHN H.; AND OTHERS

    "INTRODUCTION TO THE SOCIAL SCIENCES" IS A RECOGNIZED AND ACCREDITED COURSE IN THE SCHOOL CURRICULUM. THOUGH IT IS OFFERED TO SENIORS WHO SCORE IN THE TOP TWO OR THREE PERCENTILE RANKS ON STANDARDIZED TESTS SUCH AS THE STANFORD-BINET INTELLIGENCE SCALE, IT COULD BE DEVELOPED INTO A SEMINAR. MEETING ONCE OR TWICE A WEEK, THE TWO SEMESTER COURSE…

  18. Unstructured Grids on NURBS Surfaces

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1993-01-01

    A simple and efficient computational method is presented for unstructured surface grid generation. This method is built upon an advancing front technique combined with grid projection. The projection technique is based on a Newton-Raphson method. This combined approach has been successfully implemented for structured and unstructured grids. In this paper, the implementation for unstructured grids is discussed.

  19. The Benefits of Grid Networks

    ERIC Educational Resources Information Center

    Tennant, Roy

    2005-01-01

    In the article, the author talks about the benefits of grid networks. In speaking of grid networks the author is referring to both networks of computers and networks of humans connected together in a grid topology. Examples are provided of how grid networks are beneficial today and the ways in which they have been used.

  20. Decentralized Service Allocation in a Broker Overlay Based Grid

    NASA Astrophysics Data System (ADS)

    Azab, Abdulrahman; Meling, Hein

    Grid computing is based on coordinated resource sharing in a dynamic environment of multi-institutional virtual organizations. Data exchange and service allocation are challenging problems in the field of Grid computing, due to the decentralization of Grid systems. Building decentralized Grid systems with efficient resource management and software component mechanisms is needed to achieve the required efficiency and usability of Grid systems. In this work, a decentralized Grid system model is presented in which the system is divided into virtual organizations, each controlled by a broker. An overlay network of brokers is responsible for global resource management and for managing the allocation of services. Experimental results show that the system achieves dependable performance with various loads of services and with broker failures.

  1. Uniformity on the grid via a configuration framework

    SciTech Connect

    Igor V Terekhov et al.

    2003-03-11

    As Grid permeates modern computing, Grid solutions continue to emerge and take shape. Grid development projects continue to provide higher-level services that evolve in functionality and operate with application-level concepts which are often specific to the virtual organizations that use them. Physically, however, grids are composed of sites whose resources are diverse and seldom project readily onto a grid's set of concepts. In practice, this also creates problems for the site administrators who actually instantiate grid services. In this paper, we present a flexible, uniform framework to configure a grid site and its facilities, and otherwise describe the resources and services it offers. We start from a site configuration and instantiate services for resource advertisement, monitoring and data handling; we also apply our framework to hosting environment creation. We use our ideas in the Information Management part of the SAM-Grid project, a grid system which will deliver petabyte-scale data to hundreds of users. Our users are High Energy Physics experimenters who are scattered worldwide across dozens of institutions and always use facilities that are shared with other experiments as well as other grids. Our implementation represents information in the XML format and includes tools written in XQuery and XSLT.
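
    The framework above represents site configuration in XML and processes it with XQuery and XSLT. The toy Python sketch below simply parses a hypothetical XML site description to show the general shape of such a configuration; the element and attribute names are invented for illustration and are not the SAM-Grid schema.

```python
# Toy parse of a hypothetical XML grid-site description (illustrative schema only).
import xml.etree.ElementTree as ET

SITE_XML = """
<site name="example-site">
  <service type="monitoring" endpoint="https://example.org/mon"/>
  <service type="data-handling" endpoint="gsiftp://example.org/data"/>
  <resource type="batch" slots="256"/>
</site>
"""

root = ET.fromstring(SITE_XML)
print("site:", root.get("name"))
for svc in root.findall("service"):
    print("  service", svc.get("type"), "->", svc.get("endpoint"))
for res in root.findall("resource"):
    print("  resource", res.get("type"), "slots:", res.get("slots"))
```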

  2. Unstructured grids on SIMD torus machines

    NASA Technical Reports Server (NTRS)

    Bjorstad, Petter E.; Schreiber, Robert

    1994-01-01

    Unstructured grids lead to unstructured communication on distributed memory parallel computers, a problem that has been considered difficult. Here, we consider adaptive, offline communication routing for a SIMD processor grid. Our approach is empirical. We use large data sets drawn from supercomputing applications instead of an analytic model of communication load. The chief contribution of this paper is an experimental demonstration of the effectiveness of certain routing heuristics. Our routing algorithm is adaptive, nonminimal, and is generally designed to exploit locality. We have a parallel implementation of the router, and we report on its performance.

  3. Robotic Exploration and Science in Pits and Caves: Results from Three Years and Counting of Analog Field Experimentation

    NASA Astrophysics Data System (ADS)

    Wong, U. Y.; Whittaker, W. L.

    2015-10-01

    Robots are poised to access, investigate, and model planetary caves. We present the results of a multi-year campaign to develop robotic technologies for this domain, anchored by the most comprehensive analog field experimentation to date.

  4. Spaceflight Operations Services Grid Prototype

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Mehrotra, Piyush; Lisotta, Anthony

    2004-01-01

    NASA over the years has developed many types of technologies and conducted various types of science, resulting in numerous variations of operations, data and applications. For example, operations range from deep space projects managed by JPL, Saturn and Shuttle operations managed from JSC and KSC, ISS science operations managed from MSFC, and numerous low earth orbit satellites managed from GSFC that are varied and intrinsically different but require many of the same types of services to fulfill their missions. Also, large data sets (databases) of Shuttle flight data, solar system projects and earth observing data exist which, because of their varied and sometimes outdated technologies, have not been fully examined for additional information and knowledge. Many of the applications/systems supporting operational services, e.g. voice, video, telemetry and commanding, are outdated and obsolete. The vast amounts of data are stored in various formats, at various locations, and range over many years. The ability to conduct unified space operations, access disparate data sets and develop systems and services that can provide operational services does not currently exist in any useful form. In addition, adding new services to existing operations is generally expensive and, with the current budget constraints, not feasible on any broad level of implementation. To understand these services a discussion of each one follows. The Spaceflight User-based Services are those services required to conduct space flight operations. Grid Services are those Grid services that will be used to overcome, through middleware software, some or all of the problems that currently exist. In addition, Network Services will be discussed briefly. Network Services are crucial to any type of remedy and are evolving adequately to support any technology currently in development.

  5. GridLAB-D/SG

    Energy Science and Technology Software Center (ESTSC)

    2011-08-30

    GridLAB-D is a new power system simulation tool that provides valuable information to users who design and operate electric power transmission and distribution systems, and to utilities that wish to take advantage of the latest smart grid technology. This special release of GridLAB-D was developed to study the proposed Smart Grid technology that is used by Battelle Memorial Institute in the AEP gridSMART demonstration project in Northeast Columbus, Ohio.

  6. An Approach for Dynamic Grids

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Liou, Meng-Sing; Hindman, Richard G.

    1994-01-01

    An approach is presented for the generation of two-dimensional, structured, dynamic grids. The grid motion may be due to the motion of the boundaries of the computational domain or to the adaptation of the grid to the transient, physical solution. A time-dependent grid is computed through the time integration of the grid speeds which are computed from a system of grid speed equations. The grid speed equations are derived from the time-differentiation of the grid equations so as to ensure that the dynamic grid maintains the desired qualities of the static grid. The grid equations are the Euler-Lagrange equations derived from a variational statement for the grid. The dynamic grid method is demonstrated for a model problem involving boundary motion, an inviscid flow in a converging-diverging nozzle during startup, and a viscous flow over a flat plate with an impinging shock wave. It is shown that the approach is more accurate for transient flows than an approach in which the grid speeds are computed using a finite difference with respect to time of the grid. However, the approach requires significantly more computational effort.

  7. Investigation of proper imaging conditions in the moving grid technique for a reduction of grid line artifacts

    NASA Astrophysics Data System (ADS)

    Park, Chulkyu; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Lee, Minsik; Park, Yeonok; Kang, Yoonseok; Kim, Jinsoo; Chung, Nagkun; Kim, Jinwon; Kim, Jinguk

    2014-11-01

    The most critical problem remaining as an obstacle for the successful use of antiscatter grids in digital X-ray imaging is probably the observation of grid line artifacts such as moiré patterns and shadows of the grid strips themselves in X-ray images, resulting in a risk of misdiagnosis by physicians. In this paper, in order to find a practical solution to the problem of grid line artifacts, we revisited the moving grid technique and investigated its proper imaging conditions. We implemented a simple and useful moving-grid analysis code, iTOM™, and performed systematic simulations for a theoretical analysis. We also performed experiments and compared the results to the simulated ones to demonstrate the effectiveness of the code. According to our simulation and experimental results, the grid line artifacts can be effectively reduced when the grid moves with a large velocity or with specific velocities that reduce the coefficient of variation (CV), even with a small velocity. These velocities are determined by using the related parameters such as the grid pitch, the grid strip width, and the exposure time.
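
    The coefficient-of-variation criterion above can be illustrated with a very small one-dimensional model: treat the grid shadow as a square-wave transmission profile, average it over the exposure while the grid translates, and compute the CV of the resulting intensity; a smaller CV means a weaker residual grid-line artifact. All parameters below are assumed for illustration and are unrelated to the iTOM code.

```python
# Toy 1D model of the moving-grid idea: average a square-wave grid shadow over
# the exposure for a given grid velocity and report the coefficient of
# variation (CV) of the resulting profile. Parameters are illustrative only.
import numpy as np

def exposure_profile(velocity, pitch=1.0, strip_frac=0.2,
                     exposure=1.0, nx=2000, nt=400):
    x = np.linspace(0.0, 20.0 * pitch, nx)
    t = np.linspace(0.0, exposure, nt)
    acc = np.zeros_like(x)
    for ti in t:
        phase = ((x - velocity * ti) % pitch) / pitch
        acc += np.where(phase < strip_frac, 0.0, 1.0)   # 0 under a strip, 1 between strips
    return acc / nt

for v in [0.0, 0.5, 3.0, 10.0]:          # grid velocities in pitches per exposure time
    prof = exposure_profile(v)
    print(f"velocity {v:5.1f}  CV = {prof.std() / prof.mean():.4f}")
```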

  8. Complex Volume Grid Generation Through the Use of Grid Reusability

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    This paper presents a set of surface and volume grid generation techniques which reuse existing surface and volume grids. These methods use combinations of data manipulations to reduce grid generation time, improve grid characteristics, and increase the capabilities of existing domain discretization software. The manipulation techniques utilize physical and computational domains to produce basis functions on which to operate, modifying grid character and smoothing grids using transfinite interpolation, a vector interpolation method, and a parametric re-mapping technique. With these new techniques, inviscid grids can be converted to viscous grids, multiple-zone grid adaption can be performed to improve CFD solver efficiency, and topological changes to improve modeling of flow fields can be done simply and quickly. Examples of these capabilities are illustrated as applied to various configurations.
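
    Transfinite interpolation (TFI), mentioned above as one of the building blocks, can be sketched compactly: the bilinear form blends four boundary curves and subtracts the corner terms. The code below is only this generic TFI building block on synthetic boundaries, not the grid-reuse toolkit described in the paper.

```python
# Minimal bilinear transfinite interpolation (TFI): build an interior 2D grid
# from four boundary curves (corners of the curves must match).
import numpy as np

ni, nj = 21, 11
xi = np.linspace(0, 1, ni)[:, None]      # shape (ni, 1)
eta = np.linspace(0, 1, nj)[None, :]     # shape (1, nj)

def tfi(bottom, top, left, right):
    """bottom/top: (ni, 2) curves; left/right: (nj, 2) curves."""
    b, t = bottom[:, None, :], top[:, None, :]
    l, r = left[None, :, :], right[None, :, :]
    c00, c10, c01, c11 = bottom[0], bottom[-1], top[0], top[-1]
    X = ((1 - eta)[..., None] * b + eta[..., None] * t
         + (1 - xi)[..., None] * l + xi[..., None] * r
         - ((1 - xi) * (1 - eta))[..., None] * c00
         - (xi * (1 - eta))[..., None] * c10
         - ((1 - xi) * eta)[..., None] * c01
         - (xi * eta)[..., None] * c11)
    return X                              # shape (ni, nj, 2)

s, tpar = xi[:, 0], eta[0, :]
bottom = np.column_stack([s, 0.1 * np.sin(np.pi * s)])   # wavy lower boundary
top    = np.column_stack([s, np.ones_like(s)])           # flat upper boundary
left   = np.column_stack([np.zeros_like(tpar), tpar])
right  = np.column_stack([np.ones_like(tpar), tpar])
print("TFI grid shape:", tfi(bottom, top, left, right).shape)
```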

  9. NREL Smart Grid Projects

    SciTech Connect

    Hambrick, J.

    2012-01-01

    Although implementing Smart Grid projects at the distribution level provides many advantages and opportunities for advanced operation and control, a number of significant challenges must be overcome to maintain the high level of safety and reliability that the modern grid must provide. For example, while distributed generation (DG) promises to provide opportunities to increase reliability and efficiency and may provide grid support services such as volt/var control, the presence of DG can impact distribution operation and protection schemes. Additionally, the intermittent nature of many DG energy sources such as photovoltaics (PV) can present a number of challenges to voltage regulation, etc. This presentation provides an overview of a number of Smart Grid projects being performed by the National Renewable Energy Laboratory (NREL) along with utility, industry, and academic partners. These projects include modeling and analysis of high penetration PV scenarios (with and without energy storage), development and testing of interconnection and microgrid equipment, as well as the development and implementation of advanced instrumentation and data acquisition used to analyze the impacts of intermittent renewable resources. Additionally, standards development associated with DG interconnection and analysis as well as Smart Grid interoperability will be discussed.

  10. An Adaptive Unstructured Grid Method by Grid Subdivision, Local Remeshing, and Grid Movement

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    1999-01-01

    An unstructured grid adaptation technique has been developed and successfully applied to several three dimensional inviscid flow test cases. The approach is based on a combination of grid subdivision, local remeshing, and grid movement. For solution adaptive grids, the surface triangulation is locally refined by grid subdivision, and the tetrahedral grid in the field is partially remeshed at locations of dominant flow features. A grid redistribution strategy is employed for geometric adaptation of volume grids to moving or deforming surfaces. The method is automatic and fast and is designed for modular coupling with different solvers. Several steady state test cases with different inviscid flow features were tested for grid/solution adaptation. In all cases, the dominant flow features, such as shocks and vortices, were accurately and efficiently predicted with the present approach. A new and robust method of moving tetrahedral "viscous" grids is also presented and demonstrated on a three-dimensional example.

  11. Two grid iteration with a conjugate gradient fine grid smoother applied to a groundwater flow model

    SciTech Connect

    Hagger, M.J.; Spence, A.; Cliffe, K.A.

    1994-12-31

    This talk is concerned with the efficient solution of Ax = b, where A is a large, sparse, symmetric positive definite matrix arising from a standard finite element discretisation of the groundwater flow problem ∇·(k∇p) = 0. Here k is the coefficient of rock permeability and is highly discontinuous in applications. The discretisation is carried out using the Harwell NAMMU finite element package, using 9-node biquadratic rectangular elements in 2D and 27-node biquadratics in 3D. The aim is to develop a robust technique for the iterative solution of 3D problems based on a regional groundwater flow model of a geological area with sharply varying hydrogeological properties. Numerical experiments with polynomial preconditioned conjugate gradient methods on a 2D groundwater flow model were found to yield very poor results, converging very slowly. In order to utilise the fact that A comes from the discretisation of a PDE, the authors try the two-grid method, which is well analysed in studies of multigrid methods; see for example "Multi-Grid Methods and Applications" by W. Hackbusch. Specifically, they consider two discretisations resulting in stiffness matrices A_N and A_n, of size N and n respectively, where N > n, for both a model problem and the geological model. They perform a number of conjugate gradient steps on the fine grid, i.e. using A_N, followed by an exact coarse grid solve using A_n, and then update the fine grid solution, the exact coarse grid solve being done using a frontal method factorisation of A_n. Note that in the context of the standard two-grid method this is equivalent to using conjugate gradients as a fine grid smoothing step. Experimental results are presented to show the superiority of the two-grid iteration method over the polynomial preconditioned conjugate gradient method.
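
    The two-grid structure described above (a few conjugate gradient sweeps on the fine grid, then an exact coarse-grid correction) can be written down in a few dozen lines for a model problem. The sketch below uses a 1D Poisson matrix rather than the NAMMU groundwater discretisation, and linear-interpolation transfer operators, all of which are assumptions for illustration.

```python
# Minimal two-grid iteration for a 1D Poisson model problem: a few CG sweeps
# on the fine grid act as the smoother, followed by an exact coarse solve.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def poisson1d(n):
    h = 1.0 / (n + 1)
    return sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr") / h**2

def prolongation(n_coarse):
    """Linear interpolation from n_coarse to 2*n_coarse + 1 interior points."""
    n_fine = 2 * n_coarse + 1
    P = sp.lil_matrix((n_fine, n_coarse))
    for j in range(n_coarse):
        P[2 * j, j] = 0.5
        P[2 * j + 1, j] = 1.0
        P[2 * j + 2, j] = 0.5
    return P.tocsr()

n_c = 31
n_f = 2 * n_c + 1
A_f, A_c = poisson1d(n_f), poisson1d(n_c)
P = prolongation(n_c)
R = 0.5 * P.T                                 # full-weighting restriction
b = np.ones(n_f)
x = np.zeros(n_f)

for it in range(10):
    x, _ = spla.cg(A_f, b, x0=x, maxiter=3)   # fine-grid smoothing: a few CG steps
    r = b - A_f @ x                           # fine-grid residual
    e_c = spla.spsolve(A_c.tocsc(), R @ r)    # exact (direct) coarse-grid solve
    x = x + P @ e_c                           # coarse-grid correction
    print(f"iter {it}: residual = {np.linalg.norm(b - A_f @ x):.3e}")
```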

  12. Simulation of Unsteady Flows Using an Unstructured Navier-Stokes Solver on Moving and Stationary Grids

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Vatsa, Veer N.; Atkins, Harold L.

    2005-01-01

    We apply an unsteady Reynolds-averaged Navier-Stokes (URANS) solver for unstructured grids to unsteady flows on moving and stationary grids. Example problems considered are relevant to active flow control and stability and control. Computational results are presented using the Spalart-Allmaras turbulence model and are compared to experimental data. The effects of grid and time-step refinement are examined.

  13. Design of Grid Multiscroll Chaotic Attractors via Transformations

    NASA Astrophysics Data System (ADS)

    Ai, Xingxing; Sun, Kehui; He, Shaobo; Wang, Huihai

    Three transformation approaches for generating grid multiscroll chaotic attractors are presented through theoretical analysis and numerical simulation. Three kinds of grid multiscroll chaotic attractors are generated based on a one-dimensional multiscroll Chua system. The dynamics of the multiscroll chaotic attractors are analyzed by means of equilibrium points, eigenvalues, the largest Lyapunov exponent and complexity. As experimental verification, we implemented the circular grid multiscroll attractor on a DSP platform. The simulation and experimental results are consistent with the theoretical analysis and show that the design approaches are effective.

  14. Grid Data Management and Customer Demands at MeteoSwiss

    NASA Astrophysics Data System (ADS)

    Rigo, G.; Lukasczyk, Ch.

    2010-09-01

    Data grids constitute the required input form for a variety of applications. Customers therefore increasingly expect climate services to provide not only measured data but also grids of these data with the required configurations on an operational basis. MeteoSwiss is currently establishing a production chain for delivering data grids by subscription directly from the data warehouse in order to meet the demand for precipitation data grids by government, business and science customers. The MeteoSwiss data warehouse runs on an Oracle database linked with an ArcGIS Standard edition geodatabase. The grids are produced by Unix-based software written in R called GRIDMCH, which extracts the station data from the data warehouse and stores the files in the file system. Scripts import the netCDF-4 files into the database via an FME interface. Currently, daily and monthly deliveries of daily precipitation grids are available from MeteoSwiss with a spatial resolution of 2.2 km x 2.2 km. The daily delivered grids are preliminary, based on about 100 measuring sites, whilst the grid of the monthly delivery of daily sums is calculated from about 430 stations. Crucial for uptake by the customers is understanding of, and trust in, the new grid product. Even customers who clearly state needs that can be covered by grid products require a certain lead time to develop applications making use of a particular grid. Therefore, early contacts and continuous attendance, as well as flexibility in adjusting the production process to fulfil emerging customer needs, are important during the introduction period. Gridding over complex terrain can lead to temporally elevated uncertainties in certain areas depending on the weather situation and the coverage of measurements. Therefore, careful instructions on quality and use, and the possibility to communicate the uncertainties of gridded data, proved to be essential, especially to the business and science customers who require
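
    As a minimal illustration of the delivery format mentioned above, the sketch below writes a synthetic daily precipitation grid to a netCDF-4 file with the netCDF4 Python package. The dimension names, variable names, and grid size are assumptions for illustration and do not reflect the GRIDMCH or FME conventions used at MeteoSwiss.

```python
# Minimal netCDF-4 write of a synthetic daily precipitation grid
# (requires the netCDF4 package; names and sizes are illustrative).
import numpy as np
from netCDF4 import Dataset

ny, nx = 120, 180                              # illustrative grid size (~2.2 km spacing)
precip = np.random.default_rng(0).gamma(0.4, 5.0, size=(ny, nx)).astype("f4")

with Dataset("precip_daily.nc", "w", format="NETCDF4") as nc:
    nc.createDimension("y", ny)
    nc.createDimension("x", nx)
    var = nc.createVariable("precipitation", "f4", ("y", "x"), zlib=True)
    var.units = "mm/day"
    var.long_name = "daily precipitation sum (synthetic example)"
    var[:] = precip
```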

  15. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.

  16. The Computing Grids

    NASA Astrophysics Data System (ADS)

    Govoni, P.

    2009-12-01

    Since the beginning of the millennium, High Energy Physics research institutions like CERN and INFN have pioneered several projects aimed at exploiting the synergy among computing power, storage and network resources, and creating an infrastructure of distributed computing on a worldwide scale. In the year 2000, after the MONARC project [ http://monarc.web.cern.ch/MONARC/], DataGrid started [ http://eu-datagrid.web.cern.ch/eu-datagrid/], aimed at providing High Energy Physics with the computing power needed for the LHC enterprise. This program evolved into the EU DataGrid project, which implemented the first actual prototype of a Grid middleware running on a testbed environment. The next step consisted of the application to the LHC experiments, with the LCG project [ http://lcg.web.cern.ch/LCG/], in turn followed by the EGEE [ http://www.eu-egee.org/] and EGEE II programs.

  17. Interactive surface grid generation

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1991-01-01

    This paper describes a surface grid generation tool called S3D. It is the result of integrating a robust and widely applicable interpolation technique with the latest in workstation technology. Employing the use of a highly efficient and user-friendly graphical interface, S3D permits real-time interactive analyses of surface geometry data and facilitates the construction of surface grids for a wide range of applications in Computational Fluid Dynamics (CFD). The design objectives are for S3D to be stand-alone and easy to use so that CFD analysts can take a hands-on approach toward most if not all of their surface grid generation needs. Representative examples of S3D applications are presented in describing the various elements involved in the process.

  18. GridPV Toolbox

    Energy Science and Technology Software Center (ESTSC)

    2014-07-15

    Matlab Toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions.

  19. GridPV Toolbox

    SciTech Connect

    Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago; Reno, Matthew; Coogan, Kyle

    2014-07-15

    Matlab Toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulation functions are included to show potential uses of the toolbox functions.

  20. SLGRID: spectral synthesis software in the grid

    NASA Astrophysics Data System (ADS)

    Sabater, J.; Sánchez, S.; Verdes-Montenegro, L.

    2011-11-01

    SLGRID (http://www.e-ciencia.es/wiki/index.php/Slgrid) is a pilot project proposed by the e-Science Initiative of Andalusia (eCA) and supported by the Spanish e-Science Network in the frame of the European Grid Initiative (EGI). The aim of the project was to adapt the spectral synthesis software Starlight (Cid-Fernandes et al. 2005) to the Grid infrastructure. Starlight is used to estimate the underlying stellar populations (their ages and metallicities) from an optical spectrum; hence, it is possible to obtain a clean nebular spectrum that can be used to diagnose the presence of an Active Galactic Nucleus (Sabater et al. 2008, 2009). The typically serial execution of the code for big samples of galaxies made it ideal for integration into the Grid. We obtain an improvement in the computational time of order N, N being the number of nodes available in the Grid. In a real case we obtained our results in 3 hours with SLGRID instead of the 60 days spent using Starlight on a PC. The code has already been ported to the Grid. The first tests were made within the eCA infrastructure and, later, it was tested and improved in collaboration with CETA-CIEMAT. The SLGRID project has recently been renewed. In the future it is planned to adapt the code for the reduction of data from Integral Field Units, where each dataset is composed of hundreds of spectra. Electronic version of the poster at http://www.iaa.es/~jsm/SEA2010
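
    The order-N speedup quoted above comes from the embarrassingly parallel structure of the problem: each spectrum can be fitted independently, so the sample is simply split into one bundle per Grid node. The sketch below shows that splitting step generically; the file names and node count are hypothetical, and this is not the actual SLGRID submission code.

```python
# Generic sketch of the embarrassingly parallel split behind SLGRID-style
# gridification: divide a list of input spectra into independent job bundles.
def split_into_jobs(spectra, n_nodes):
    """Round-robin split of input files into n_nodes job bundles."""
    bundles = [[] for _ in range(n_nodes)]
    for i, spec in enumerate(spectra):
        bundles[i % n_nodes].append(spec)
    return bundles

spectra = [f"galaxy_{i:05d}.txt" for i in range(1000)]   # hypothetical input names
bundles = split_into_jobs(spectra, n_nodes=50)
print(len(bundles), "jobs, about", len(bundles[0]), "spectra each")
# Ideal wall-clock time drops from T_serial to roughly T_serial / n_nodes.
```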

  1. Modelling noise propagation using Grid Resources. Progress within GDI-Grid

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

    Modelling noise propagation using Grid Resources. Progress within GDI-Grid. GDI-Grid (english: SDI-Grid) is a research project funded by the German Ministry for Science and Education (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and identifying the potential of utilizing the superior storage capacities and computational power of grid infrastructures for geospatial applications while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC webservice interfaces for Web Mapping (WMS), Feature access (Web Feature Service), Coverage access (Web Coverage Service) and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by standardization bodies for Grid computing and spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing and a noise propagation simulation. The latter scenario is addressed by the Stapelfeldt Ingenieurgesellschaft mbH located in Dortmund adapting their LimA software to utilize grid resources. Noise mapping of e.g. traffic noise in urban agglomerates and along major trunk roads is a reoccurring demand of the EU Noise Directive. Input data requires road net and traffic, terrain, buildings and noise protection screens as well as population distribution. Noise impact levels are generally calculated in 10 m grid and along relevant building facades. For each receiver position sources within a typical range of 2000 m are split down into small segments, depending on local geometry. For each of the segments propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation
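
    A greatly simplified receiver-level calculation in the spirit of the scenario above sums the contributions of many source segments energetically (in decibels), here with only spherical spreading. Diffraction, screening, and absorption, which the LimA calculation includes, are deliberately omitted, and all numbers are hypothetical.

```python
# Simplified noise-mapping receiver level: incoherent (energetic) summation of
# per-segment levels with 20*log10 distance decay only. Illustrative values.
import numpy as np

def receiver_level(source_levels_db, distances_m, ref_distance_m=1.0):
    """Sum segment contributions energetically after spherical spreading."""
    attenuated = source_levels_db - 20.0 * np.log10(distances_m / ref_distance_m)
    return 10.0 * np.log10(np.sum(10.0 ** (attenuated / 10.0)))

segment_levels = np.full(200, 80.0)                 # dB at 1 m, hypothetical segments
distances = np.linspace(50.0, 2000.0, 200)          # receiver-to-segment distances [m]
print(f"receiver level ~ {receiver_level(segment_levels, distances):.1f} dB")
```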

  2. Essential Grid Workflow Monitoring Elements

    SciTech Connect

    Gunter, Daniel K.; Jackson, Keith R.; Konerding, David E.; Lee,Jason R.; Tierney, Brian L.

    2005-07-01

    Troubleshooting Grid workflows is difficult. A typical workflow involves a large number of components (networks, middleware, hosts, etc.) that can fail. Even when monitoring data from all these components is accessible, it is hard to tell whether failures and anomalies in these components are related to a given workflow. For the Grid to be truly usable, much of this uncertainty must be eliminated. We propose two new Grid monitoring elements, Grid workflow identifiers and consistent component lifecycle events, that will make Grid troubleshooting easier, and thus make Grids more usable, by simplifying the correlation of Grid monitoring data with a particular Grid workflow.
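
    The two elements proposed above can be illustrated with a toy event emitter: every component stamps its lifecycle events with the same workflow identifier so that monitoring records can later be correlated by a single filter. The field names below are illustrative only, not the schema proposed in the paper.

```python
# Toy illustration: lifecycle events stamped with a shared workflow identifier,
# so correlating monitoring data reduces to filtering on that identifier.
import json
import time
import uuid

def lifecycle_event(workflow_id, component, phase):
    return {
        "workflow.id": workflow_id,
        "component": component,
        "phase": phase,                # e.g. "start", "end", "fail"
        "timestamp": time.time(),
    }

wf_id = str(uuid.uuid4())
events = [
    lifecycle_event(wf_id, "data-transfer", "start"),
    lifecycle_event(wf_id, "data-transfer", "end"),
    lifecycle_event(wf_id, "compute-job", "start"),
]
for e in events:
    print(json.dumps(e))
```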

  3. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.
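
    At its simplest, usage accounting of the kind discussed above reduces to collecting per-job usage records from sites and aggregating them per user and per site. The sketch below shows only that generic aggregation step with invented record fields; it is not the distributed accounting model proposed in the paper.

```python
# Minimal grid usage-accounting sketch: aggregate hypothetical per-job records
# by user and by site (illustrative fields, not the paper's record format).
from collections import defaultdict

usage_records = [
    {"site": "siteA", "user": "alice", "cpu_hours": 12.5},
    {"site": "siteB", "user": "alice", "cpu_hours": 3.0},
    {"site": "siteA", "user": "bob",   "cpu_hours": 7.25},
]

per_user = defaultdict(float)
per_site = defaultdict(float)
for rec in usage_records:
    per_user[rec["user"]] += rec["cpu_hours"]
    per_site[rec["site"]] += rec["cpu_hours"]

print("per user:", dict(per_user))
print("per site:", dict(per_site))
```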

  4. A laboratory course for teaching laboratory techniques, experimental design, statistical analysis, and peer review process to undergraduate science students.

    PubMed

    Gliddon, C M; Rosengren, R J

    2012-01-01

    This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. The course used a guided-inquiry-based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and written discussion of results. The laboratory exercises were a guided inquiry based around retinol's ability to potentiate acetaminophen-mediated hepatotoxicity. To induce critical thinking, students were given a choice as to which assay they could use to determine how retinol affected acetaminophen hepatotoxicity. Short summaries were handed in following each assay and formed the basis of the formative assessment. To complete the feedback loop, the graphs and concepts from the short summaries were combined into a manuscript as a summative assessment. To give the students exposure to science communication, the manuscript had to be written in accordance with the submission guidelines for Toxicological Sciences. The course was evaluated by a student questionnaire using a Likert scale, and students' responses were very favorable. While the subject matter was toxicology-centric, the content could be easily modified to suit another subject matter in biochemistry and molecular biology. PMID:23166024

  5. Quasi-optical overmoded waveguide frequency multiplier grid arrays

    NASA Astrophysics Data System (ADS)

    Rosenau, Steven Andrew

    There is a growing need for compact, lightweight, inexpensive high power millimeter wave sources. Frequency multipliers can provide these sources by efficiently converting high power microwave signals to millimeter frequencies. Quasi-optical frequency multiplier grid arrays, comprised of hundreds to thousands of varactor devices and antennas on a single wafer, utilize spatial power combining to significantly increase power handling capability beyond that of a single device. In this dissertation work, theoretical and experimental investigations of frequency multiplier grid arrays have been conducted with a specific focus on overmoded waveguide systems. The principles of frequency multipliers and quasi-optical grid array power combining are presented. Simulation, design and experimental measurement techniques are described for both frequency tripler and doubler grid arrays. During this dissertation work, several quantum barrier varactor frequency tripler grid array systems and Schottky varactor frequency doubler grid array systems were designed, fabricated and tested. A frequency tripler grid array system, containing an innovative integrated output structure, achieved a multiplication efficiency of 3.4% and an output power of 148 mW. The two most efficient frequency doubler grid array systems achieved 11.7% multiplication efficiency and 0.41 W output power.

  6. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1985

    1985-01-01

    Presents 23 experiments, activities, field projects and computer programs in the biological and physical sciences. Instructional procedures, experimental designs, materials, and background information are suggested. Topics include fluid mechanics, electricity, crystals, arthropods, limpets, acid neutralization, and software evaluation. (ML)

  7. APEC Smart Grid Initiative

    SciTech Connect

    Bloyd, Cary N.

    2012-03-01

    This brief paper describes the activities of the Asia Pacific Economic Cooperation (APEC) Smart Grid Initiative (ASGI) which is being led by the U.S. and developed by the APEC Energy Working Group. In the paper, I describe the origin of the initiative and briefly mention the four major elements of the initiative along with existing APEC projects which support it.

  8. Unlocking the smart grid

    SciTech Connect

    Rokach, Joshua Z.

    2010-10-15

    The country has progressed in a relatively short time from rotary dial phones to computers, cell phones, and iPads. With proper planning and orderly policy implementation, the same will happen with the Smart Grid. Here are some suggestions on how to proceed. (author)

  9. NSTAR Smart Grid Pilot

    SciTech Connect

    Rabari, Anil; Fadipe, Oloruntomi

    2014-03-31

    NSTAR Electric & Gas Corporation (“the Company”, or “NSTAR”) developed and implemented a Smart Grid pilot program beginning in 2010 to demonstrate the viability of leveraging existing automated meter reading (“AMR”) deployments to provide much of the Smart Grid functionality of advanced metering infrastructure (“AMI”), but without the large capital investment that AMI rollouts typically entail. In particular, a central objective of the Smart Energy Pilot was to enable residential dynamic pricing (time-of-use “TOU” and critical peak rates and rebates) and two-way direct load control (“DLC”) by continually capturing AMR meter data transmissions and communicating through customer-sited broadband connections in conjunction with a standards-based home area network (“HAN”). The pilot was supported by the U.S. Department of Energy (“DOE”) through the Smart Grid Demonstration program. NSTAR was very pleased not only to receive the funding support from DOE, but also its guidance and support throughout the pilot. NSTAR is also pleased to report to the DOE that it was able to execute and deliver a successful pilot on time and on budget. NSTAR looks for future opportunities to work with the DOE and others in future smart grid projects.

  10. Efficient grid generation

    NASA Technical Reports Server (NTRS)

    Seki, Rycichi

    1989-01-01

    Because the governing equations in fluid dynamics contain partial derivatives and are too difficult in most cases to solve analytically, these derivatives are generally replaced by finite difference terms. These terms involve the solution at nearby states. This procedure discretizes the field into a finite number of states. These states, when plotted, form a grid, or mesh, of points. It is at these states, or field points, that the solution is found. The optimum choice of states, the x, y, z coordinate values, minimizes error and computational time. But the process of finding these states is made more difficult by complex boundaries, and by the need to control step size differences between the states, that is, the need to control the spacing of field points. One solution technique uses a different set of state variables, which define a different coordinate system, to generate the grid more easily. A new method, developed by Dr. Joseph Steger, combines elliptic and hyperbolic partial differential equations into a mapping function between the physical and computational coordinate systems. This system of equations offers more control than either equation provides alone. The Steger algorithm was modified in order to allow bodies with stronger concavities to be used, offering the possibility of generating a single grid about multiple bodies. Work was also done on identifying areas where grid breakdown occurs.
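
    The flavour of elliptic grid generation mentioned above can be conveyed with a much simpler stand-in: relax the interior points of a 2D structured grid toward the average of their neighbours (Laplace smoothing) while holding the boundaries fixed. This is not the Steger elliptic/hyperbolic hybrid scheme, only a minimal illustrative example with assumed boundary shapes.

```python
# Very small elliptic grid-generation sketch: Jacobi-style Laplace smoothing of
# interior grid points with fixed boundaries (simplified stand-in).
import numpy as np

def laplace_smooth(x, y, iterations=200):
    for _ in range(iterations):
        x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
        y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])
    return x, y

# Initial algebraic grid between a wavy lower boundary and a flat upper one.
ni, nj = 41, 21
xi, eta = np.meshgrid(np.linspace(0, 1, ni), np.linspace(0, 1, nj), indexing="ij")
x = xi.copy()
y = eta * 1.0 + (1.0 - eta) * 0.1 * np.sin(2 * np.pi * xi)   # blend the two boundaries
x, y = laplace_smooth(x, y)
print("smoothed grid size:", x.shape)
```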

  11. Grid generation research at OSU

    NASA Technical Reports Server (NTRS)

    Nakamura, S.

    1992-01-01

    In the last two years, effort was concentrated on: (1) surface modeling; (2) surface grid generation; and (3) 3-D flow space grid generation. Surface modeling shares the same objectives as surface modeling in computer aided design (CAD), so software available in CAD can in principle be used for solid modeling. Unfortunately, however, CAD software cannot easily be used in practice for grid generation purposes, because it is not designed to provide an appropriate database for grid generation. Therefore, we started developing generalized surface modeling software from scratch that provides the database for surface grid generation. Generating a surface grid is an important step in generating a 3-D grid for the flow space. To generate a surface grid on a given surface representation, we developed a unique algorithm that works on any non-smooth surface. Once the surface grid is generated, a 3-D space grid can be generated. For this purpose, we also developed a new algorithm, which is a hybrid of the hyperbolic and the elliptic grid generation methods. With this hybrid method, orthogonality of the grid near the solid boundary can be easily achieved without introducing empirical fudge factors. Work to develop 2-D and 3-D grids for turbomachinery blade geometries was performed, and as an extension of this research we are planning to develop an adaptive grid procedure within an interactive grid environment.

  12. One-Dimensional Grid Turbulence

    NASA Astrophysics Data System (ADS)

    Kerstein, Alan R.; Nilsen, Vebjørn

    1998-11-01

    To capture molecular mixing and other small scale phenomena such as chemical reactions and differential diffusion, it is essential to resolve all the length (and time) scales. For large Reynolds number flows this is impossible to do in three-dimensional turbulence simulations with the current and foreseeable future computer technology. To circumvent this problem the one-dimensional turbulence (ODT) model, as the name implies, considers only one spatial dimension in which all the length scales can be resolved even at very large Reynolds numbers. To incorporate the effect of advection on a one-dimensional domain, the evolution of the velocity and scalar profiles is randomly interrupted by a sequence of profile rearrangements representing the effect of turbulent eddies. Results obtained from ODT simulations of grid turbulence with a passive scalar are presented. The decay exponents for the velocity and passive scalar fluctuations, as predicted by ODT, compare favorably with experimental data.
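
    The decay exponents mentioned above are usually extracted from decay data by a power-law fit. The sketch below fits u'^2 ~ (x/M)^(-n) to synthetic data with a least-squares line in log-log space; it illustrates the fitting step only, not the ODT model itself, and the data and exponent are invented.

```python
# Fit a power-law decay exponent n in u'^2 ~ (x/M)^(-n) from synthetic
# grid-turbulence decay data (illustration of the fitting step only).
import numpy as np

x_over_M = np.linspace(20, 200, 40)                  # downstream distance / mesh size
true_n = 1.3
rng = np.random.default_rng(1)
u2 = 2.0 * x_over_M ** (-true_n) * (1 + 0.03 * rng.standard_normal(x_over_M.size))

slope, intercept = np.polyfit(np.log(x_over_M), np.log(u2), 1)
print(f"fitted decay exponent n = {-slope:.2f} (true value {true_n})")
```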

  13. Teachers' personal didactical models and obstacles to professional development: Case-studies with secondary experimental science teachers

    NASA Astrophysics Data System (ADS)

    Wamba Aguado, Ana Maria

    The aim of this thesis has been to elaborate criteria which characterise how teachers teach, as a curriculum component of their professional knowledge, and to infer the obstacles which hinder their desired professional development, in such a way that these can be considered in the design of proposals for teacher training in secondary education. A further objective was to elaborate and validate data analysis instruments. Case studies were carried out on three natural science secondary teachers with more than ten years' experience, enabling the characterisation of the teachers' conceptions of science and science teaching as well as the description of their classroom practice. Data collection instruments included a questionnaire used to structure a semi-structured interview, video recordings of each teacher's classroom intervention corresponding to a teaching unit taught over a two-week period, and all the written material produced for the unit. For the data analysis, a taxonomy of classroom intervention patterns and a progression hypothesis towards desirable professional knowledge were elaborated, from the perspective of a research-in-the-classroom model and according to a system of categories and subcategories which refer to the teachers' conceptions of scientific knowledge, school knowledge, how to teach, and evaluation. With the interview and the questionnaire, a profile of the teachers' expressed conceptions was obtained. The intervention profile was obtained using the classroom recordings, according to the patterns identified and their sequencing, both of which determine the characteristic structures and routines of these teachers. An outcome of these results was the validation of the previously mentioned taxonomy as an instrument of

  14. Grid computing enhances standards-compatible geospatial catalogue service

    NASA Astrophysics Data System (ADS)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkits are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share and
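
    A hedged sketch of how a client might query such a standards-compatible catalogue: it assumes the OWSLib package (not mentioned in the record) and a hypothetical CSW endpoint URL, and issues a simple free-text GetRecords search.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

CSW_URL = "http://example.org/csw"              # hypothetical catalogue endpoint

csw = CatalogueServiceWeb(CSW_URL)              # OGC CSW 2.0.2 client
query = PropertyIsLike("csw:AnyText", "%land cover%")   # free-text constraint
csw.getrecords2(constraints=[query], maxrecords=10)

print(csw.results)                              # matched / returned record counts
for rec_id, rec in csw.records.items():
    print(rec_id, rec.title)                    # Dublin Core title of each hit
```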

  15. Reading to learn experimental practice: The role of text and firsthand experience in the acquisition of an abstract science principle

    NASA Astrophysics Data System (ADS)

    Richmond, Erica Kesin

    2008-10-01

    From the onset of schooling, texts are used as important educational tools. In the primary years, they are integral to learning how to decode and develop fluency. In the later elementary years, they are often essential to the acquisition of academic content. Unfortunately, many children experience difficulties with this process, which is due in large part to their unfamiliarity with the genre of academic texts. The articles presented in this dissertation share an underlying theme of how to develop children's ability to comprehend and learn from academic, and specifically, non-narrative texts. The first article reviews research on the development of non-narrative discourse to elucidate the linguistic precursors to non-narrative text comprehension. The second and third articles draw from an empirical study that investigated the best way to integrate text, manipulation, and first-hand experience for children's acquisition and application of an abstract scientific principle. The scientific principle introduced in the study was the Control of Variables Strategy (CVS), a fundamental idea underlying scientific reasoning and a strategy for designing unconfounded experiments. Eight grade 4 classes participated in the study (N = 129), in one of three conditions: (a) read procedural text and manipulate experimental materials, (b) listen to procedural text and manipulate experimental materials, or (c) read procedural text with no opportunity to manipulate experimental materials. Findings from the study indicate that children who had the opportunity to read and manipulate materials were most effective at applying the strategy to designing and justifying unconfounded experiments, and evaluating written and physical experimental designs; however, there was no effect of instructional condition on a written assessment of evaluating familiar and unfamiliar experimental designs one week after the intervention. These results suggest that the acquisition and application of an abstract

  16. Les apports de l'experimentation assistee par ordinateur (ExAO) en pedagogie par projet en Sciences de la nature au collegial

    NASA Astrophysics Data System (ADS)

    Marcotte, Alice

    The goals of this research were to conceptualize and to produce a test synthesis model for the Sciences program, where the student had to demonstrate his or her competency using the approach Considering New Situations from Acquired Knowledge. The test took the form of a student-structured project utilizing the experimental process: the student's scientific investigation was supported and facilitated by computer-assisted experimentation (CAEx). The model of action was elaborated in developmental research within the school setting, tested in biology, and continued in an interdisciplinary context. Our study focused on the advantages and the constraints of this new learning environment, which modifies laboratories based on traditional instrumentation. The final aim of the research was not to evaluate a type of test synthesis, but to propose and improve this model of test synthesis, based on the experimental process and supported by CAEx. In order to implement the competency approach within an integration activity, we chose a cooperative learning environment contained within the pedagogical project. This didactic environment was inspired by socio-constructivism, which involves students in open scientific problem-solving. Computer-assisted experimentation turned out to be a valuable tool for this environment, facilitating the implementation of the scientific process through increased induction. Confronting a reality that resists and cannot be circumvented changes students' perception of scientific knowledge. They learn to integrate the building of this knowledge, and then to realize the extent of their learning and their training. Students' opinions, which were gathered from questionnaires, reveal that they favorably perceive this type of environment in interaction with their peers and the experimentation. While this new knowledge contributes to CAEx within the pedagogical project, the products of this research included a teaching guide for the test synthesis, a booklet featuring the projects carried out

  17. A comparative analysis of dynamic grids vs. virtual grids using the A3pviGrid framework.

    PubMed

    Shankaranarayanan, Avinas; Amaldas, Christine

    2010-01-01

    With the proliferation of Quad/Multi-core micro-processors in mainstream platforms such as desktops and workstations, a large number of unused CPU cycles can be utilized for running virtual machines (VMs) as dynamic nodes in distributed environments. Grid services and their service-oriented business broker, now termed cloud computing, could deploy image-based virtualization platforms enabling agent-based resource management and dynamic fault management. In this paper we present an efficient way of utilizing heterogeneous virtual machines on idle desktops as an environment for consumption of high performance grid services. Spurious and exponential increases in the size of the datasets are constant concerns in medical and pharmaceutical industries due to the constant discovery and publication of large sequence databases. Traditional algorithms are not designed to handle large data sizes under sudden and dynamic changes in the execution environment as previously discussed. This research was undertaken to compare our previous results with running the same test dataset with that of a virtual Grid platform using virtual machines (Virtualization). The implemented architecture, A3pviGrid, utilizes game theoretic optimization and agent based team formation (Coalition) algorithms to improve upon scalability with respect to team formation. Due to the dynamic nature of distributed systems (as discussed in our previous work) all interactions were made local within a team transparently. This paper is a proof of concept of an experimental mini-Grid test-bed compared to running the platform on local virtual machines on a local test cluster. This was done to give every agent its own execution platform enabling anonymity and better control of the dynamic environmental parameters. We also analyze performance and scalability of BLAST in a multiple virtual node setup and present our findings. This paper is an extension of our previous research on improving the BLAST application framework

  18. A comparative analysis of dynamic grids vs. virtual grids using the A3pviGrid framework

    PubMed Central

    Shankaranarayanan, Avinas; Amaldas, Christine

    2010-01-01

    With the proliferation of Quad/Multi-core micro-processors in mainstream platforms such as desktops and workstations, a large number of unused CPU cycles can be utilized for running virtual machines (VMs) as dynamic nodes in distributed environments. Grid services and their service-oriented business broker, now termed cloud computing, could deploy image-based virtualization platforms enabling agent-based resource management and dynamic fault management. In this paper we present an efficient way of utilizing heterogeneous virtual machines on idle desktops as an environment for consumption of high performance grid services. Spurious and exponential increases in the size of the datasets are constant concerns in medical and pharmaceutical industries due to the constant discovery and publication of large sequence databases. Traditional algorithms are not designed to handle large data sizes under sudden and dynamic changes in the execution environment as previously discussed. This research was undertaken to compare our previous results with running the same test dataset with that of a virtual Grid platform using virtual machines (Virtualization). The implemented architecture, A3pviGrid, utilizes game theoretic optimization and agent based team formation (Coalition) algorithms to improve upon scalability with respect to team formation. Due to the dynamic nature of distributed systems (as discussed in our previous work) all interactions were made local within a team transparently. This paper is a proof of concept of an experimental mini-Grid test-bed compared to running the platform on local virtual machines on a local test cluster. This was done to give every agent its own execution platform enabling anonymity and better control of the dynamic environmental parameters. We also analyze performance and scalability of BLAST in a multiple virtual node setup and present our findings. This paper is an extension of our previous research on improving the BLAST application framework

  19. Behavior of grid-stiffened composite structures under transverse loading

    NASA Astrophysics Data System (ADS)

    Gan, Changsheng

    The energy absorption characteristics and failure modes of grid-stiffened composite plates under transverse load were studied in detail. Several laboratory scale composite grid plates were fabricated by using co-mingled E-glass fiber/polypropylene matrix and carbon/nylon composites in a thermoplastic stamping process. Both experimental and finite element approaches were used to evaluate and understand the role of major failure modes on the performance of damaged grid-stiffened composite plates under transverse load. The load-deflection responses of grid-stiffened composite plates were determined and compared with those of sandwich composite plates of the same size. The failure modes of grid-stiffened composite plates under different load conditions were investigated and used as the basis for FEA models. The intrinsic strength properties of constituent composite materials were measured by using either three point bending or tensile test and were used as input data to the FEA models. Several FEA models including the major failure modes based on the experimental results were built to simulate the damage processes of grid-stiffened composite plates under transverse load. A FORTRAN subroutine was implemented within the ABAQUS code to incorporate the material failure models. Effects of damage on the modal frequencies and loss factors of grid-stiffened composite plates were also investigated experimentally. Experimental and simulation results showed that sandwich composite specimens failed catastrophically with the load dropping sharply at the displacement corresponding to initial and final failure. However, grid-stiffened composite specimens failed in a more gradual and forgiving way in a sequence of relatively small load drops. No catastrophic load drops were observed in the grid structures over the range of displacements investigated here. The SEA values of the grid composite specimens are typically higher than those of the sandwich specimens with the same boundary
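
    As a rough illustration of the kind of ply-level check a user material subroutine can perform (the thesis' own failure models are not reproduced here), the sketch below evaluates a maximum-stress criterion with purely illustrative strength values.

```python
def max_stress_failed(s11, s22, s12, Xt, Xc, Yt, Yc, S):
    """Return True if a ply's in-plane stresses exceed any strength limit.

    s11, s22, s12 : longitudinal, transverse and shear stresses (Pa)
    Xt/Xc, Yt/Yc  : tensile/compressive strengths along and across the fibres
    S             : in-plane shear strength
    """
    return (s11 > Xt or s11 < -Xc or
            s22 > Yt or s22 < -Yc or
            abs(s12) > S)

# illustrative values only (not measured properties from the thesis)
failed = max_stress_failed(s11=420e6, s22=15e6, s12=40e6,
                           Xt=600e6, Xc=450e6, Yt=30e6, Yc=100e6, S=50e6)
```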

  20. A grid spacing control technique for algebraic grid generation methods

    NASA Technical Reports Server (NTRS)

    Smith, R. E.; Kudlinski, R. A.; Everton, E. L.

    1982-01-01

    A technique which controls the spacing of grid points in algebraically defined coordinate transformations is described. The technique is based on the generation of control functions which map a uniformly distributed computational grid onto parametric variables defining the physical grid. The control functions are smoothed cubic splines. Sets of control points are input for each coordinate direction to outline the control functions. Smoothed cubic spline functions are then generated to approximate the input data. The technique works best in an interactive graphics environment where control inputs and grid displays are nearly instantaneous. The technique is illustrated with the two-boundary grid generation algorithm.
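
    A minimal sketch of the control-function idea, assuming SciPy's smoothing spline in place of the original implementation: hypothetical control points define a smoothed cubic spline that maps a uniform computational coordinate onto a clustered parametric distribution.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# control points outlining the control function: uniform computational
# coordinate u in [0, 1] -> parametric variable t in [0, 1] (hypothetical values)
u_ctrl = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
t_ctrl = np.array([0.0, 0.05, 0.2, 0.55, 1.0])     # clusters points near t = 0

ctrl = UnivariateSpline(u_ctrl, t_ctrl, k=3, s=1e-4)  # smoothed cubic spline

u = np.linspace(0.0, 1.0, 41)          # uniform computational grid
t = np.clip(ctrl(u), 0.0, 1.0)         # non-uniform parametric distribution
x = 10.0 * t                           # evaluate the physical curve at t (toy geometry)
```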

  1. Spectral methods on arbitrary grids

    NASA Technical Reports Server (NTRS)

    Carpenter, Mark H.; Gottlieb, David

    1995-01-01

    Stable and spectrally accurate numerical methods are constructed on arbitrary grids for partial differential equations. These new methods are equivalent to conventional spectral methods but do not rely on specific grid distributions. Specifically, we show how to implement Legendre Galerkin, Legendre collocation, and Laguerre Galerkin methodology on arbitrary grids.
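
    The essence of collocation on arbitrary nodes can be sketched with a differentiation matrix built from a Legendre Vandermonde matrix; this is a generic illustration, not the stabilized formulation developed in the paper.

```python
import numpy as np
from numpy.polynomial import legendre as leg

# arbitrary (non-uniform, non-Gauss) nodes in [-1, 1]
x = np.array([-1.0, -0.83, -0.6, -0.41, -0.12, 0.07, 0.23, 0.49, 0.66, 0.8, 0.93, 1.0])
N = len(x)

V = leg.legvander(x, N - 1)                      # Legendre Vandermonde matrix
dV = np.column_stack([leg.legval(x, leg.legder(np.eye(N)[k])) for k in range(N)])
D = np.linalg.solve(V.T, dV.T).T                 # D = dV @ inv(V): collocation derivative

f = np.exp(x)
print(np.max(np.abs(D @ f - np.exp(x))))         # small error on these arbitrary nodes
```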

  2. Ion Engine Grid Gap Measurements

    NASA Technical Reports Server (NTRS)

    Soulas, George C.; Frandina, Michael M.

    2004-01-01

    A simple technique for measuring the grid gap of an ion engine's ion optics during startup and steady-state operation was demonstrated with beam extraction. The grid gap at the center of the ion optics assembly was measured with a long distance microscope that was focused onto an alumina pin that protruded through the center accelerator grid aperture and was mechanically attached to the screen grid. This measurement technique was successfully applied to a 30 cm titanium ion optics assembly mounted onto an NSTAR engineering model ion engine. The grid gap and each grid's movement during startup from room temperature to both full and low power were measured. The grid gaps with and without beam extraction were found to be significantly different. The grid gaps at the ion optics center were both significantly smaller than the cold grid gap and different at the two power levels examined. To avoid issues associated with a small grid gap during thruster startup with titanium ion optics, a simple method is to operate the thruster initially without beam extraction to heat the ion optics. Another possible method is to apply high voltage to the grids prior to igniting the discharge because power deposition to the grids from the plasma is lower with beam extraction than without. Further testing would be required to confirm this approach.

  3. Grid Interaction Technical Team Roadmap

    SciTech Connect

    2013-06-01

    The mission of the Grid Interaction Technical Team (GITT) is to support a transition scenario to large scale grid-connected vehicle charging with transformational technology, proof of concept and information dissemination. The GITT facilitates technical coordination and collaboration between vehicle-grid connectivity and communication activities among U.S. DRIVE government and industry partners.

  4. Towards Hybrid Overset Grid Simulations of the Launch Environment

    NASA Astrophysics Data System (ADS)

    Moini-Yekta, Shayan

    A hybrid overset grid approach has been developed for the design and analysis of launch vehicles and facilities in the launch environment. The motivation for the hybrid grid methodology is to reduce the turn-around time of computational fluid dynamic simulations and improve the ability to handle complex geometry and flow physics. The LAVA (Launch Ascent and Vehicle Aerodynamics) hybrid overset grid scheme consists of two components: an off-body immersed-boundary Cartesian solver with block-structured adaptive mesh refinement and a near-body unstructured body-fitted solver. Two-way coupling is achieved through overset connectivity between the off-body and near-body grids. This work highlights verification using code-to-code comparisons and validation using experimental data for the individual and hybrid solver. The hybrid overset grid methodology is applied to representative unsteady 2D trench and 3D generic rocket test cases.

  5. Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid

    SciTech Connect

    2012-02-08

    GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud’s data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell’s GridControl will focus on 4 elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.

  6. Are clinical trials with mesenchymal stem/progenitor cells too far ahead of the science? Lessons from experimental hematology.

    PubMed

    Prockop, Darwin J; Prockop, Susan E; Bertoncello, Ivan

    2014-12-01

    The cells referred to as mesenchymal stem/progenitor cells (MSCs) are currently being used to treat thousands of patients with diseases of essentially all the organs and tissues of the body. Strikingly positive results have been reported in some patients, but there have been few prospective controlled studies. Also, the reasons for the beneficial effects are frequently unclear. As a result there has been a heated debate as to whether the clinical trials with these new cell therapies are too far ahead of the science. The debate is not easily resolved, but important insights are provided by the 60-year history that was required to develop the first successful stem cell therapy, the transplantation of hematopoietic stem cells. The history indicates that development of a dramatically new therapy usually requires patience and a constant dialogue between basic scientists and physicians carrying out carefully designed clinical trials. It also suggests that the field can be moved forward by establishing better records of how MSCs are prepared, by establishing a large supply of reference MSCs that can be used to validate assays and compare MSCs prepared in different laboratories, and by continuing efforts to establish in vivo assays for the efficacy of MSCs. PMID:25100155

  7. Are Clinical Trials With Mesenchymal Stem/Progenitor Cells too Far Ahead of the Science? Lessons From Experimental Hematology

    PubMed Central

    Prockop, Darwin J; Prockop, Susan E; Bertoncello, Ivan

    2014-01-01

    The cells referred to as mesenchymal stem/progenitor cells (MSCs) are currently being used to treat thousands of patients with diseases of essentially all the organs and tissues of the body. Strikingly positive results have been reported in some patients, but there have been few prospective controlled studies. Also, the reasons for the beneficial effects are frequently unclear. As a result there has been a heated debate as to whether the clinical trials with these new cell therapies are too far ahead of the science. The debate is not easily resolved, but important insights are provided by the 60-year history that was required to develop the first successful stem cell therapy, the transplantation of hematopoietic stem cells. The history indicates that development of a dramatically new therapy usually requires patience and a constant dialogue between basic scientists and physicians carrying out carefully designed clinical trials. It also suggests that the field can be moved forward by establishing better records of how MSCs are prepared, by establishing a large supply of reference MSCs that can be used to validate assays and compare MSCs prepared in different laboratories, and by continuing efforts to establish in vivo assays for the efficacy of MSCs. Stem Cells 2014;32:3055–3061 PMID:25100155

  8. The ion optics of a two grid electron-bombardment thruster

    NASA Technical Reports Server (NTRS)

    Aston, G.; Kaufman, H. R.

    1976-01-01

    A detailed experimental investigation has been performed to determine the ion beam divergence of an electron-bombardment ion thruster as a function of grid geometry changes. The results show that, to a good approximation, each geometrical grid parameter independently affects one aspect of grid set performance. These observations are used to develop a graphical technique for predicting the ion beam divergence of an arbitrary ion source and grid geometry combination. The usefulness of this technique is demonstrated by comparing predicted ion beam divergence of the 30-cm diameter Engineering Model ion thruster with independent experimental determinations. Good agreement is shown between predicted and experimental results.

  9. Smart Grid Demonstration Project

    SciTech Connect

    Miller, Craig; Carroll, Paul; Bell, Abigail

    2015-03-11

    The National Rural Electric Cooperative Association (NRECA) organized the NRECA-U.S. Department of Energy (DOE) Smart Grid Demonstration Project (DE-OE0000222) to install and study a broad range of advanced smart grid technologies in a demonstration that spanned 23 electric cooperatives in 12 states. More than 205,444 pieces of electronic equipment and more than 100,000 minor items (brackets, labels, mounting hardware, fiber optic cable, etc.) were installed to upgrade and enhance the efficiency, reliability, and resiliency of the power networks at the participating co-ops. The objective of this project was to build a path for other electric utilities, and particularly electrical cooperatives, to adopt emerging smart grid technology when it can improve utility operations, thus advancing the co-ops’ familiarity and comfort with such technology. Specifically, the project executed multiple subprojects employing a range of emerging smart grid technologies to test their cost-effectiveness and, where the technology demonstrated value, provided case studies that will enable other electric utilities—particularly electric cooperatives—to use these technologies. NRECA structured the project according to the following three areas: Demonstration of smart grid technology; Advancement of standards to enable the interoperability of components; and Improvement of grid cyber security. We termed these three areas Technology Deployment Study, Interoperability, and Cyber Security. Although the deployment of technology and studying the demonstration projects at co-ops accounted for the largest portion of the project budget by far, we see our accomplishments in each of the areas as critical to advancing the smart grid. All project deliverables have been published. Technology Deployment Study: The deliverable was a set of 11 single-topic technical reports in areas related to the listed technologies. Each of these reports has already been submitted to DOE, distributed to co-ops, and

  10. Gridded electron reversal ionizer

    NASA Technical Reports Server (NTRS)

    Chutjian, Ara (Inventor)

    1993-01-01

    A gridded electron reversal ionizer forms a three-dimensional cloud of zero or near-zero energy electrons in a cavity within a filament structure surrounding a central electrode having holes through which the sample gas, at reduced pressure, enters an elongated reversal volume. The resultant negative ion stream is applied to a mass analyzer. The reduced electron and ion space-charge limitations of this configuration enhance detection sensitivity for materials detected by electron attachment, such as narcotic and explosive vapors. Positive ions may be generated by producing electrons of higher energy, sufficient to ionize the target gas, then pulsing the grid negative to stop the electron flow and pulsing the extraction aperture positive to draw out the positive ions.

  11. 76 FR 1153 - Atlantic Grid Operations A LLC, Atlantic Grid Operations B LLC, Atlantic Grid Operations C LLC...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-07

    ... Energy Regulatory Commission Atlantic Grid Operations A LLC, Atlantic Grid Operations B LLC, Atlantic Grid Operations C LLC, Atlantic Grid Operations D LLC and Atlantic Grid Operations E LLC; Notice of... (Commission) Rules of Practice and Procedure, 18 CFR 385.207, and Order No. 679,\\1\\ Atlantic Grid Operations...

  12. Experimental Investigation of Space Radiation Processing in Lunar Soil Ilmenite: Combining Perspectives from Surface Science and Transmission Electron Microscopy

    NASA Technical Reports Server (NTRS)

    Christoffersen, R.; Keller, L. P.; Rahman, Z.; Baragiola, R.

    2010-01-01

    Energetic ions mostly from the solar wind play a major role in lunar space weathering because they contribute structural and chemical changes to the space-exposed surfaces of lunar regolith grains. In mature mare soils, ilmenite (FeTiO3) grains in the finest size fraction have been shown in transmission electron microscope (TEM) studies to exhibit key differences in their response to space radiation processing relative to silicates [1,2,3]. In ilmenite, solar ion radiation alters host grain outer margins to produce 10-100 nm thick layers that are microstructurally complex, but dominantly crystalline compared to the amorphous radiation-processed rims on silicates [1,2,3]. Spatially well-resolved analytical TEM measurements also show nm-scale compositional and chemical state changes in these layers [1,3]. These include shifts in Fe/Ti ratio from strong surface Fe-enrichment (Fe/Ti >> 1) to Fe depletion (Fe/Ti < 1) at 40-50 nm below the grain surface [1,3]. These compositional changes are not observed in the radiation-processed rims on silicates [4]. Several mechanisms to explain the overall relations in the ilmenite grain rims by radiation processing and/or additional space weathering processes were proposed by [1], and remain under current consideration [3]. A key issue has concerned the ability of ion radiation processing alone to produce some of the deeper-penetrating compositional changes. In order to provide some experimental constraints on these questions, we have performed a combined X-ray photoelectron spectroscopy (XPS) and field-emission scanning transmission electron microscopy (FE-STEM) study of experimentally ion-irradiated ilmenite. A key feature of this work is the combination of analytical techniques sensitive to changes in the irradiated samples at depth scales going from the immediate surface (approx. 5 nm; XPS), to deeper in the grain interior (5-100 nm; FE-STEM).

  13. Grid generation and inviscid flow computation about aircraft geometries

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.

    1989-01-01

    Grid generation and Euler flow about fighter aircraft are described. A fighter aircraft geometry is specified by an area-ruled fuselage with an internal duct, cranked delta wing or strake/wing combinations, canard and/or horizontal tail surfaces, and vertical tail surfaces. The initial step before grid generation and flow computation is the determination of a suitable grid topology. The external grid topology that has been applied is called a dual-block topology, which is a patched C (exp 1) continuous multiple-block system in which inner blocks cover the highly swept part of a cranked wing or strake, the rearward inner part of the wing, and tail components. Outer blocks cover the remainder of the fuselage, the outer part of the wing, and the canards, and extend to the far-field boundaries. The grid generation is based on transfinite interpolation with Lagrangian blending functions. This procedure has been applied to the Langley experimental fighter configuration and a modified F-18 configuration. Supersonic flow between Mach 1.3 and 2.5 and angles of attack between 0 degrees and 10 degrees have been computed with associated Euler solvers based on the finite-volume approach. When coupling geometric details such as boundary layer diverter regions, duct regions with inlets and outlets, or slots with the general external grid, imposing C (exp 1) continuity can be extremely tedious. The approach taken here is to patch blocks together at common interfaces where there is no grid continuity, but enforce conservation in the finite-volume solution. The key to this technique is how to obtain the information required for a conservative interface. The Ramshaw technique, which automates the computation of proportional areas of two overlapping grids on a planar surface and is suitable for coding, was used. Researchers generated internal duct grids for the Langley experimental fighter configuration independent of the external grid topology, with a conservative interface at the inlet and outlet.
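
    A minimal sketch of bilinear transfinite interpolation with linear (Lagrangian) blending functions, assuming four boundary curves with matching corners; the actual dual-block fighter grids involve a far more elaborate topology.

```python
import numpy as np

def tfi_patch(bottom, top, left, right):
    """Bilinear transfinite interpolation of a structured surface patch.

    bottom/top: arrays of shape (ni, 2); left/right: arrays of shape (nj, 2).
    The corner points of the four boundary curves are assumed to match.
    """
    ni, nj = len(bottom), len(left)
    u = np.linspace(0.0, 1.0, ni)[:, None, None]
    v = np.linspace(0.0, 1.0, nj)[None, :, None]
    B, T = bottom[:, None, :], top[:, None, :]
    Lc, R = left[None, :, :], right[None, :, :]
    # linear blending in each direction minus the bilinear corner correction
    return ((1 - v) * B + v * T + (1 - u) * Lc + u * R
            - ((1 - u) * (1 - v) * bottom[0] + u * (1 - v) * bottom[-1]
               + (1 - u) * v * top[0] + u * v * top[-1]))

# example: a patch between a flat bottom edge and a bulged top edge
ui, vj = np.linspace(0, 1, 21), np.linspace(0, 1, 11)
bottom = np.stack([ui, np.zeros_like(ui)], axis=1)
top    = np.stack([ui, 1.0 + 0.2 * np.sin(np.pi * ui)], axis=1)
left   = np.stack([np.zeros_like(vj), vj], axis=1)
right  = np.stack([np.ones_like(vj), vj], axis=1)
grid = tfi_patch(bottom, top, left, right)       # shape (21, 11, 2)
```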

  14. TRMM Gridded Text Products

    NASA Technical Reports Server (NTRS)

    Stocker, Erich Franz

    2007-01-01

    NASA's Tropical Rainfall Measuring Mission (TRMM) has many products that contain instantaneous or gridded rain rates, often among many other parameters. However, because of their completeness, these products can often seem intimidating to users who just want surface rain rates. For example, one of the gridded monthly products contains well over 200 parameters. It is clear that if only rain rates are desired, this many parameters might prove intimidating. In addition, for many good reasons these products are archived and currently distributed in HDF format. This also can be an inhibiting factor in using TRMM rain rates. To provide a simple format and isolate just the rain rates from the many other parameters, the TRMM project created a series of gridded products in ASCII text format. This paper describes the various text rain rate products produced. It provides detailed information about parameters and how they are calculated. It also gives detailed format information. These products are used in a number of applications with the TRMM processing system. The products are produced from the swath instantaneous rain rates and contain information from the three major TRMM instruments: radar, radiometer, and combined. They are simple to use, human readable, and small for downloading.
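
    The record does not reproduce the exact text layout, so the reader below assumes a hypothetical whitespace-delimited "lat lon rain" column format purely to illustrate how such a gridded ASCII product can be loaded onto a regular array.

```python
import numpy as np

def read_gridded_rain(path, dlat=0.5, dlon=0.5):
    """Load a hypothetical whitespace-delimited 'lat lon rain' text product."""
    lat, lon, rain = np.loadtxt(path, unpack=True)
    nlat, nlon = int(round(180.0 / dlat)), int(round(360.0 / dlon))
    i = np.clip(((lat + 90.0) / dlat).astype(int), 0, nlat - 1)
    j = np.clip(((lon + 180.0) / dlon).astype(int), 0, nlon - 1)
    grid = np.full((nlat, nlon), np.nan)
    grid[i, j] = rain                     # one rain-rate value per grid cell assumed
    return grid
```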

  15. Wireless Communications in Smart Grid

    NASA Astrophysics Data System (ADS)

    Bojkovic, Zoran; Bakmaz, Bojan

    Communication networks play a crucial role in smart grid, as the intelligence of this complex system is built based on information exchange across the power grid. Wireless communications and networking are among the most economical ways to build the essential part of the scalable communication infrastructure for smart grid. In particular, wireless networks will be deployed widely in the smart grid for automatic meter reading, remote system and customer site monitoring, as well as equipment fault diagnosing. With an increasing interest from both the academic and industrial communities, this chapter systematically investigates recent advances in wireless communication technology for the smart grid.

  16. Grid generation for turbomachinery problems

    NASA Technical Reports Server (NTRS)

    Steinhoff, J.; Reddy, K. C.

    1986-01-01

    The development of a computer code to generate numerical grids for complex internal flow systems such as the fluid flow inside the space shuttle main engine is outlined. The blending technique for generating a grid for a stator-rotor combination at a particular radial section is examined. The computer programs which generate these grids are listed in the Appendices. These codes are capable of generating grids at different cross sections and thus providing three-dimensional stator-rotor grids for the turbomachinery of the space shuttle main engine.

  17. Experimental evidence shows no fractionation of strontium isotopes ((87)Sr/(86)Sr) among soil, plants, and herbivores: implications for tracking wildlife and forensic science.

    PubMed

    Flockhart, D T Tyler; Kyser, T Kurt; Chipley, Don; Miller, Nathan G; Norris, D Ryan

    2015-01-01

    Strontium isotopes ((87)Sr/(86)Sr) can be useful biological markers for a wide range of forensic science applications, including wildlife tracking. However, one of the main advantages of using (87)Sr/(86)Sr values, that there is no fractionation from geological bedrock sources through the food web, also happens to be a critical assumption that has never been tested experimentally. We test this assumption by measuring (87)Sr/(86)Sr values across three trophic levels in a controlled greenhouse experiment. Adult monarch butterflies were raised on obligate larval host milkweed plants that were, in turn, grown on seven different soil types collected across Canada. We found no significant differences between (87)Sr/(86)Sr values in leachable Sr from soil minerals, organic soil, milkweed leaves, and monarch butterfly wings. Our results suggest that strontium isoscapes developed from (87)Sr/(86)Sr values in bedrock or soil may serve as a reliable biological marker in forensic science for a range of taxa and across large geographic areas. PMID:25789981

  18. Using a Non-Equivalent Groups Quasi Experimental Design to Reduce Internal Validity Threats to Claims Made by Math and Science K-12 Teacher Recruitment Programs

    NASA Astrophysics Data System (ADS)

    Moin, Laura

    2009-10-01

    The American Recovery and Reinvestment Act national policy established in 2009 calls for ``meaningful data'' that demonstrate educational improvements, including the recruitment of high-quality teachers. The scant data available and the low credibility of many K-12 math/science teacher recruitment program evaluations remain the major barriers for the identification of effective recruitment strategies. Our study presents a methodology to better evaluate the impact of recruitment programs on increasing participants' interest in teaching careers. The research capitalizes on the use of several control groups and presents a non-equivalent groups quasi-experimental evaluation design that produces program effect claims with higher internal validity than claims generated by current program evaluations. With this method that compares responses to a teaching career interest question from undergraduates all along a continuum from just attending an information session to participating (or not) in the recruitment program, we were able to compare the effect of the program in increasing participants' interest in teaching careers versus the evolution of the same interest but in the absence of the program. We were also able to make suggestions for program improvement and further research. While our findings may not apply to other K-12 math/science teacher recruitment programs, we believe that our evaluation methodology does and will contribute to conduct stronger program evaluations. In so doing, our evaluation procedure may inform recruitment program designers and policy makers.

  19. 3D Structured Grid Adaptation

    NASA Technical Reports Server (NTRS)

    Banks, D. W.; Hafez, M. M.

    1996-01-01

    Grid adaptation for structured meshes is the art of using information from an existing, but poorly resolved, solution to automatically redistribute the grid points in such a way as to improve the resolution in regions of high error, and thus the quality of the solution. This involves: (1) generating a grid via some standard algorithm, (2) calculating a solution on this grid, (3) adapting the grid to this solution, (4) recalculating the solution on this adapted grid, and (5) repeating steps 3 and 4 to satisfaction. Steps 3 and 4 can be repeated until some 'optimal' grid is converged to, but typically this is not worth the effort and just two or three repeat calculations are necessary. They also may be repeated every 5-10 time steps for unsteady calculations.
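
    A minimal 1-D sketch of step (3), redistributing points by equidistributing a solution-based weight; real 3-D structured adaptation works direction by direction with more elaborate weight functions.

```python
import numpy as np

def adapt_1d(x, f, n=None):
    """Redistribute grid points by equidistributing a solution-based weight."""
    n = len(x) if n is None else n
    w = 1.0 + np.abs(np.gradient(f, x))        # weight is large where gradients are large
    s = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
    s /= s[-1]                                 # normalized error measure along the line
    return np.interp(np.linspace(0.0, 1.0, n), s, x)

x = np.linspace(0.0, 1.0, 41)
f = np.tanh(40.0 * (x - 0.5))                  # steep layer at x = 0.5
x_new = adapt_1d(x, f)                         # points cluster near the layer
```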

  20. Progress in Grid Generation: From Chimera to DRAGON Grids

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Kao, Kai-Hsiung

    1994-01-01

    Hybrid grids, composed of structured and unstructured grids, combine the best features of both. The chimera method is a major stepping stone toward a hybrid grid, from which the present approach has evolved. The chimera grid comprises a set of overlapping structured grids that are independently generated and body-fitted, yielding a high quality grid readily accessible for efficient solution schemes. The chimera method has been shown to be efficient for generating grids about complex geometries and has been demonstrated to deliver accurate aerodynamic prediction of complex flows. While its geometrical flexibility is attractive, interpolation of data in the overlapped regions - which in today's practice in 3D is done in a nonconservative fashion - is not. In the present paper we propose a hybrid grid scheme that maximizes the advantages of the chimera scheme and adapts the strengths of the unstructured grid while at the same time keeping its weaknesses minimal. Like the chimera method, we first divide up the physical domain by a set of structured body-fitted grids which are separately generated and overlaid throughout a complex configuration. To eliminate any pure data manipulation which does not necessarily follow the governing equations, we use unstructured grids only to directly replace the region of the arbitrarily overlapped grids. This new adaptation to the chimera thinking is coined the DRAGON grid. The unstructured grid region sandwiched between the structured grids is limited in size, resulting in only a small increase in memory and computational effort. The DRAGON method has three important advantages: (1) preserving strengths of the chimera grid; (2) eliminating difficulties sometimes encountered in the chimera scheme, such as the orphan points and bad quality of interpolation stencils; and (3) making grid communication in a fully conservative and consistent manner insofar as the governing equations are concerned. To demonstrate its use, the governing equations are

  1. Smart Grid Risk Management

    NASA Astrophysics Data System (ADS)

    Abad Lopez, Carlos Adrian

    Current electricity infrastructure is being stressed from several directions -- high demand, unreliable supply, extreme weather conditions, accidents, among others. Infrastructure planners have, traditionally, focused on only the cost of the system; today, resilience and sustainability are increasingly becoming more important. In this dissertation, we develop computational tools for efficiently managing electricity resources to help create a more reliable and sustainable electrical grid. The tools we present in this work will help electric utilities coordinate demand to allow the smooth and large scale integration of renewable sources of energy into traditional grids, as well as provide infrastructure planners and operators in developing countries a framework for making informed planning and control decisions in the presence of uncertainty. Demand-side management is considered as the most viable solution for maintaining grid stability as generation from intermittent renewable sources increases. Demand-side management, particularly demand response (DR) programs that attempt to alter the energy consumption of customers either by using price-based incentives or up-front power interruption contracts, is more cost-effective and sustainable in addressing short-term supply-demand imbalances when compared with the alternative that involves increasing fossil fuel-based fast spinning reserves. An essential step in compensating participating customers and benchmarking the effectiveness of DR programs is to be able to independently detect the load reduction from observed meter data. Electric utilities implementing automated DR programs through direct load control switches are also interested in detecting the reduction in demand to efficiently pinpoint non-functioning devices to reduce maintenance costs. We develop sparse optimization methods for detecting a small change in the demand for electricity of a customer in response to a price change or signal from the utility
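
    A hedged sketch of the sparse-detection idea only (not the dissertation's formulation): a Lasso fit over a dictionary of candidate step changes picks out the few time indices where a customer's demand shifted, here on synthetic meter data with an assumed-known baseline.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
T = 96                                        # one day of 15-minute meter readings
t = np.arange(T)
baseline = 3.0 + 0.5 * np.sin(2 * np.pi * t / T)   # assumed-known baseline load (kW)
y = baseline + 0.1 * rng.standard_normal(T)
y[60:] -= 0.8                                 # hypothetical DR curtailment starting at t = 60

# dictionary of candidate step changes; the L1 penalty keeps only a few active
X = (t[:, None] >= np.arange(1, T)[None, :]).astype(float)
model = Lasso(alpha=0.05, max_iter=10000).fit(X, y - baseline)
events = np.flatnonzero(np.abs(model.coef_) > 1e-3) + 1
print(events)                                 # detected change times, ideally near 60
```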

  2. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primary for unstructured grid generation systems. Currently, GridTool supports VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool is stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics is based on the GL library. The code has been tested successfully on IRIS workstations running IRIX4.0 and above. The memory is allocated dynamically, therefore, memory size will depend on the complexity of geometry/grid. GridTool data structure is based on a link-list structure which allows the required memory to expand and contract dynamically according to the user's data size and action. Data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time, there is always an active object which is drawn in magenta, or in their highlighted colors as defined by the resource file which will be discussed later.

  3. Design and manufacturing of interlocked composite grids

    NASA Astrophysics Data System (ADS)

    Han, Dongyup

    Composite grid structures made from pultruded unidirectional glass or carbon ribs provide a performance/cost combination unmatched by other composite panels. A new manufacturing method for an ortho-grid using slotted joints and adhesive bonding ("Interlocked Composite Grid" or ICG) has been developed. The high structural performance of the grid is derived from uni-plies and the efficient load transfer mechanism. Pultrusion is one of the cheapest, fastest, and most reliable manufacturing processes for composite sections. Pultruded ribs, along with the simple assembly concept, lead to a low-cost structure. Also, the flexibility in assembly eliminates the size limitation, and large civil composite structures can be built. Two different equivalent stiffness models, the equivalent plate stiffness matrices and the equivalent engineering constants, have been formulated. The former model, more accurate than the equivalent engineering constants, includes the effects of the slots, the internal ribs, and the skins. The latter is used for establishing simple design guidelines. The equivalent stiffness models have been verified with numerical analysis and experimental data. The simplicity and flexibility of the design of an ICG have been demonstrated by sample design problems. Also, an approximate cost estimation rule has been established. ICG beams and panels have been built and tested under static and dynamic flexural loading. Superior mechanical properties, such as high damage tolerance, resilience, and durability, have been demonstrated. The failure mode has been identified.
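
    A minimal sketch of the simpler of the two stiffness models, assuming a plain rule-of-mixtures smearing of the rib cross sections (skins and slot effects ignored); the symbols and numbers are illustrative, not the thesis' formulation.

```python
def smeared_moduli(E_rib, b, s_x, s_y):
    """Equivalent in-plane moduli of an ortho-grid by rule-of-mixtures smearing.

    E_rib : modulus of the pultruded unidirectional rib
    b     : rib width
    s_y   : spacing (along y) between the ribs that run in the x direction
    s_x   : spacing (along x) between the ribs that run in the y direction
    """
    Ex = E_rib * b / s_y      # smeared contribution of the x-direction ribs
    Ey = E_rib * b / s_x      # smeared contribution of the y-direction ribs
    return Ex, Ey

# illustrative numbers only: 40 GPa glass ribs, 5 mm wide, 50 mm pitch both ways
Ex, Ey = smeared_moduli(E_rib=40e9, b=0.005, s_x=0.05, s_y=0.05)
```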

  4. Service Oriented Gridded Atmospheric Radiances (SOAR)

    NASA Astrophysics Data System (ADS)

    Halem, M.; Goldberg, M. D.; Tilmes, C.; Zhou, L.; Shen, S.; Yesha, Y.

    2005-12-01

    responsively meeting diverse user-specified requests in terms of the spatial and temporal compositing of radiance fields. Moreover, the volume of sounder data records produced from current and future instruments varies from GB's to TB's per day, and gridding these sounding data can thin the volume to KB's to MB's per day, making them easier to download to desktops and laptops. This will not only better serve a wider earth science community but also make these capabilities more readily useful to the education community. This presentation will describe the rationale for the project, an overview of the system architecture, a description of the framework for executing the applications on the distributed cluster and present examples of gridded service requests that are currently available. This demonstration project represents a foundation for the development of a distributed web service architecture that will be able to invoke requested services for temperature and moisture retrievals for arbitrary integrated gridded radiance data sets. We plan to extend the framework to accommodate such services for other earth observing instruments as well.
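
    A minimal sketch of the compositing step, assuming swath radiances arrive as flat latitude/longitude/brightness-temperature arrays: observations are averaged into regular grid cells with NumPy.

```python
import numpy as np

def grid_radiances(lat, lon, tb, dlat=1.0, dlon=1.0):
    """Average swath brightness temperatures onto a regular lat/lon grid."""
    nlat, nlon = int(180 / dlat), int(360 / dlon)
    i = np.clip(((lat + 90.0) / dlat).astype(int), 0, nlat - 1)
    j = np.clip(((lon + 180.0) / dlon).astype(int), 0, nlon - 1)
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    np.add.at(total, (i, j), tb)          # accumulate observations per cell
    np.add.at(count, (i, j), 1.0)
    return np.divide(total, count, out=np.full_like(total, np.nan), where=count > 0)
```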

  5. Earth System Grid and EGI interoperability

    NASA Astrophysics Data System (ADS)

    Raciazek, J.; Petitdidier, M.; Gemuend, A.; Schwichtenberg, H.

    2012-04-01

    The Earth Science data centers have developed a data grid called the Earth System Grid Federation (ESGF) to give the scientific community worldwide access to CMIP5 (Coupled Model Inter-comparison Project 5) climate data. The CMIP5 data will make it possible to evaluate the impact of climate change in various environmental and societal areas, such as regional climate, extreme events, agriculture, insurance… The ESGF grid provides services like searching, browsing and downloading of datasets. At the security level, ESGF data access is protected by an authentication mechanism. An ESGF trusted X509 Short-Lived EEC certificate with the correct roles/attributes is required to get access to the data in a non-interactive way (e.g. from a worker node). To access ESGF from EGI (i.e. by earth science applications running on EGI infrastructure), the challenge is the security incompatibility between the two grids: the EGI proxy certificate is not trusted by ESGF, nor does it contain the correct roles/attributes. To solve this problem, we decided to use a Credential Translation Service (CTS) to translate the EGI X509 proxy certificate into the ESGF Short-Lived EEC certificate (the CTS will issue ESGF certificates based on EGI certificate authentication). From the end user perspective, the main steps to use the CTS are: the user binds the two identities (EGI and ESGF) together in the CTS using the CTS web interface (this step has to be done only once) and then requests an ESGF Short-Lived EEC certificate whenever one is needed, using a command-line tool. The implementation of the CTS is on-going. It is based on the open source MyProxy software stack, which is used in many grid infrastructures. On the client side, the "myproxy-logon" command-line tool is used to request the certificate translation. A new option has been added to "myproxy-logon" to select the original certificate (in our case, the EGI one). On the server side, MyProxy server operates in Certificate Authority mode, with a new module

  6. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008 to 2013

    NASA Astrophysics Data System (ADS)

    Ortega, J.; Turnipseed, A.; Guenther, A. B.; Karl, T. G.; Day, D. A.; Gochis, D.; Huffman, J. A.; Prenni, A. J.; Levin, E. J. T.; Kreidenweis, S. M.; DeMott, P. J.; Tobo, Y.; Patton, E. G.; Hodzic, A.; Cui, Y. Y.; Harley, P. C.; Hornbrook, R. S.; Apel, E. C.; Monson, R. K.; Eller, A. S. D.; Greenberg, J. P.; Barth, M. C.; Campuzano-Jost, P.; Palm, B. B.; Jimenez, J. L.; Aiken, A. C.; Dubey, M. K.; Geron, C.; Offenberg, J.; Ryan, M. G.; Fornwalt, P. J.; Pryor, S. C.; Keutsch, F. N.; DiGangi, J. P.; Chan, A. W. H.; Goldstein, A. H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, C. A.; Mauldin, R. L.; Smith, J. N.

    2014-06-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and inter-relationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include - soil property measurements; - hydrological studies; - measurements of high-frequency turbulence parameters; - eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy; - determination of biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry; - aerosol number and mass distributions; - chemical speciation of aerosol particles; - characterization of ice and cloud condensation nuclei; - trace gas measurements; and - model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurements, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these studies are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  7. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008-2013

    NASA Astrophysics Data System (ADS)

    Ortega, J.; Turnipseed, A.; Guenther, A. B.; Karl, T. G.; Day, D. A.; Gochis, D.; Huffman, J. A.; Prenni, A. J.; Levin, E. J. T.; Kreidenweis, S. M.; DeMott, P. J.; Tobo, Y.; Patton, E. G.; Hodzic, A.; Cui, Y.; Harley, P. C.; Hornbrook, R. H.; Apel, E. C.; Monson, R. K.; Eller, A. S. D.; Greenberg, J. P.; Barth, M.; Campuzano-Jost, P.; Palm, B. B.; Jimenez, J. L.; Aiken, A. C.; Dubey, M. K.; Geron, C.; Offenberg, J.; Ryan, M. G.; Fornwalt, P. J.; Pryor, S. C.; Keutsch, F. N.; DiGangi, J. P.; Chan, A. W. H.; Goldstein, A. H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, C. A.; Mauldin, R. L., III; Smith, J. N.

    2014-01-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and inter-relationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air, but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include: - soil property measurements, - hydrological studies, - measurements of high-frequency turbulence parameters, - eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy, - biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry, - aerosol number and mass distributions, - chemical speciation of aerosol particles, - characterization of ice and cloud condensation nuclei, - trace gas measurements, and - model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurements, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  8. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008 to 2013

    SciTech Connect

    Ortega, John; Turnipseed, A.; Guenther, Alex B.; Karl, Thomas G.; Day, D. A.; Gochis, David; Huffman, J. A.; Prenni, Anthony J.; Levin, E. J.; Kreidenweis, Sonia M.; DeMott, Paul J.; Tobo, Y.; Patton, E. G.; Hodzic, Alma; Cui, Y. Y.; Harley, P.; Hornbrook, R. S.; Apel, E. C.; Monson, Russell K.; Eller, A. S.; Greenberg, J. P.; Barth, Mary; Campuzano-Jost, Pedro; Palm, B. B.; Jiminez, J. L.; Aiken, A. C.; Dubey, Manvendra K.; Geron, Chris; Offenberg, J.; Ryan, M. G.; Fornwalt, Paula J.; Pryor, S. C.; Keutsch, Frank N.; DiGangi, J. P.; Chan, A. W.; Goldstein, Allen H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, Chris; Mauldin, R. L.; Smith, James N.

    2014-01-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and interrelationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include – soil property measurements; – hydrological studies; – measurements of high-frequency turbulence parameters; – eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy; – determination of biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry; – aerosol number and mass distributions; – chemical speciation of aerosol particles; – characterization of ice and cloud condensation nuclei; – trace gas measurements; and – model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurements, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these studies are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  9. Ion beamlet steering for two-grid electrostatic thrusters. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Homa, J. M.

    1984-01-01

An experimental study of ion beamlet steering, in which the direction of beamlets emitted from a two-grid aperture system is controlled by relative translation of the grids, is described. The results can be used to design electrostatic accelerating devices for which the direction and focus of emerging beamlets are important. Deflection and divergence angle data are presented for two grid systems as a function of the relative lateral displacement of the holes in these grids. At large displacements, accelerator grid impingements become excessive; this determines the maximum allowable displacement and, as a result, the useful range of beamlet deflection. Beamlet deflection is shown to vary linearly with grid offset angle over this range, and the divergence of the beamlets is found to be unaffected by deflection within it. The grids of a typical dished-grid ion thruster are examined to determine the effects of thermally induced grid distortion and prescribed offsets of grid hole centerlines on the characteristics of the emerging beamlets. The results are used to determine the region on the grid surface where ion beamlet deflections exceed the useful range. Over this region, high accelerator grid impingement currents and rapid grid erosion are predicted.

  10. The pilot way to Grid resources using glideinWMS

    SciTech Connect

Sfiligoi, Igor; Bradley, Daniel C.; Holzman, Burt; Mhashilkar, Parag; Padhi, Sanjay; Wurthwein, Frank; /UC, San Diego

    2010-09-01

Grid computing has become very popular in big and widespread scientific communities with high computing demands, like high energy physics. Computing resources are being distributed over many independent sites with only a thin layer of Grid middleware shared between them. This deployment model has proven to be very convenient for computing resource providers, but it has introduced several problems for the users of the system, the three major ones being the complexity of job scheduling, the non-uniformity of compute resources, and the lack of good job monitoring. Pilot jobs address all of the above problems by creating a virtual private computing pool on top of Grid resources. This paper presents both the general pilot concept and a concrete implementation, called glideinWMS, deployed in the Open Science Grid.
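
    As a rough illustration of the pilot concept described in this abstract (and not of the actual glideinWMS internals), the Python sketch below shows a pilot that is scheduled on a grid worker node, validates the node, and only then pulls user payloads from a central queue until no work remains; the function names and the in-process queue are hypothetical stand-ins for the real middleware.

    ```python
    # Conceptual sketch of a pilot job: the pilot is what the grid site schedules;
    # user payloads are pulled from a central queue only after the pilot has
    # verified that the worker node is usable. Names and the in-process queue are
    # hypothetical placeholders, not the glideinWMS API.
    import os
    import queue
    import subprocess
    import sys


    def node_is_usable(min_disk_gb: float = 5.0) -> bool:
        """Basic sanity checks a pilot might run before accepting work."""
        stat = os.statvfs("/tmp")
        free_gb = stat.f_bavail * stat.f_frsize / 1e9
        return free_gb >= min_disk_gb


    def run_pilot(job_queue: "queue.Queue[list[str]]") -> None:
        if not node_is_usable():
            sys.exit("pilot: node failed validation, exiting without pulling jobs")
        while True:
            try:
                cmd = job_queue.get_nowait()   # pull the next user payload
            except queue.Empty:
                break                          # no more work: pilot terminates
            result = subprocess.run(cmd, capture_output=True, text=True)
            print(f"pilot: ran {cmd!r}, exit code {result.returncode}")


    if __name__ == "__main__":
        q: "queue.Queue[list[str]]" = queue.Queue()
        q.put(["echo", "user job 1"])
        q.put(["echo", "user job 2"])
        run_pilot(q)
    ```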

  11. Grist : grid-based data mining for astronomy

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S.; Miller, Craig D.; Walia, Harshpreet; Williams, Roy; Djorgovski, S. George; Graham, Matthew J.; Mahabal, Ashish; Babu, Jogesh; Berk, Daniel E. Vanden; Nichol, Robert

    2004-01-01

    The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the 'hyperatlas' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.

  12. Thundercloud: Domain specific information security training for the smart grid

    NASA Astrophysics Data System (ADS)

    Stites, Joseph

    In this paper, we describe a cloud-based virtual smart grid test bed: ThunderCloud, which is intended to be used for domain-specific security training applicable to the smart grid environment. The test bed consists of virtual machines connected using a virtual internal network. ThunderCloud is remotely accessible, allowing students to undergo educational exercises online. We also describe a series of practical exercises that we have developed for providing the domain-specific training using ThunderCloud. The training exercises and attacks are designed to be realistic and to reflect known vulnerabilities and attacks reported in the smart grid environment. We were able to use ThunderCloud to offer practical domain-specific security training for smart grid environment to computer science students at little or no cost to the department and no risk to any real networks or systems.

  13. Integrating Grid Services into the Cray XT4 Environment

    SciTech Connect

    NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy

    2009-05-01

The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped-down compute-node operating system without dynamic library support, a shared-root environment, and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.

  14. The BioGRID interaction database: 2015 update.

    PubMed

    Chatr-Aryamontri, Andrew; Breitkreutz, Bobby-Joe; Oughtred, Rose; Boucher, Lorrie; Heinicke, Sven; Chen, Daici; Stark, Chris; Breitkreutz, Ashton; Kolas, Nadine; O'Donnell, Lara; Reguly, Teresa; Nixon, Julie; Ramage, Lindsay; Winter, Andrew; Sellam, Adnane; Chang, Christie; Hirschman, Jodi; Theesfeld, Chandra; Rust, Jennifer; Livstone, Michael S; Dolinski, Kara; Tyers, Mike

    2015-01-01

The Biological General Repository for Interaction Datasets (BioGRID: http://thebiogrid.org) is an open access database that houses genetic and protein interactions curated from the primary biomedical literature for all major model organism species and humans. As of September 2014, the BioGRID contains 749,912 interactions as drawn from 43,149 publications that represent 30 model organisms. This interaction count represents a 50% increase compared to our previous 2013 BioGRID update. BioGRID data are freely distributed through partner model organism databases and meta-databases and are directly downloadable in a variety of formats. In addition to general curation of the published literature for the major model species, BioGRID undertakes themed curation projects in areas of particular relevance for biomedical sciences, such as the ubiquitin-proteasome system and various human disease-associated interaction networks. BioGRID curation is coordinated through an Interaction Management System (IMS) that facilitates the compilation of interaction records through structured evidence codes, phenotype ontologies, and gene annotation. The BioGRID architecture has been improved in order to support a broader range of interaction and post-translational modification types, to allow the representation of more complex multi-gene/protein interactions, to account for cellular phenotypes through structured ontologies, to expedite curation through semi-automated text-mining approaches, and to enhance curation quality control. PMID:25428363
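
    As a small, hedged example of working with BioGRID bulk data, the sketch below tallies interactions by experimental system from a locally downloaded tab-delimited BioGRID release; the file name and the "Experimental System" column header are assumptions about one of the tab formats and should be checked against the release actually used.

    ```python
    # Minimal sketch: tally interactions per experimental system from a
    # tab-delimited BioGRID bulk-download file. The file name and the column
    # header "Experimental System" are assumptions about one of the tab formats;
    # adjust them to match the downloaded release.
    import csv
    from collections import Counter


    def count_experimental_systems(path: str) -> Counter:
        counts: Counter = Counter()
        with open(path, newline="", encoding="utf-8") as handle:
            reader = csv.DictReader(handle, delimiter="\t")
            for row in reader:
                system = row.get("Experimental System", "unknown")
                counts[system] += 1
        return counts


    if __name__ == "__main__":
        # Hypothetical local copy of a BioGRID tab-delimited release.
        for system, n in count_experimental_systems("BIOGRID-ALL.tab2.txt").most_common(10):
            print(f"{system}: {n}")
    ```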

  15. The International Symposium on Grids and Clouds

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 is the decennium anniversary of the ISGC which over the last decade has tracked the convergence, collaboration and innovation of individual researchers across the Asia Pacific region to a coherent community. With the continuous support and dedication from the delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. The emergence of large-scale research infrastructures and instruments that has produced a torrent of electronic data is forcing a generational change in the scientific process and the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure that connects the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics, Biomedicine & Life Sciences, Earth Science, Environmental Changes and Natural Disaster Mitigation, Humanities & Social Sciences, Operations & Management, Middleware & Interoperability, Security and Networking, Infrastructure Clouds & Virtualisation, Business Models & Sustainability, Data Management, Distributed Volunteer & Desktop Grid Computing, High Throughput Computing, and High Performance, Manycore & GPU Computing.

  16. Flow simulation on generalized grids

    SciTech Connect

    Koomullil, R.P.; Soni, B.K.; Huang, Chi Ti

    1996-12-31

A hybrid grid generation methodology and flow simulation on grids having an arbitrary number of sided polygons is presented. A hyperbolic-type marching scheme is used for generating structured grids near the solid boundaries. A local elliptic solver is utilized for smoothing the grid lines and for avoiding grid line crossing. A new method for trimming the overlaid structured grid is presented. Delaunay triangulation is employed to generate an unstructured grid in the regions away from the body. The structured and unstructured grid regions are integrated together to form a single grid for the flow solver. An edge-based data structure is used to store the grid information to ease the handling of general polygons. The integral form of the Navier-Stokes equations makes up the governing equations. A Roe-averaged Riemann solver is utilized to evaluate the numerical flux at cell faces. Higher-order accuracy is achieved by applying Taylor's series expansion to the conserved variables, and the gradient is calculated by using Green's theorem. For the implicit scheme, the sparse matrix resulting from the linearization is solved using the GMRES method. The flux Jacobians are calculated numerically or by an approximate analytic method. Results are presented to validate the current methodology.
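
    The abstract notes that higher-order accuracy is obtained from a Taylor series expansion with the gradient evaluated via Green's theorem. The sketch below shows the standard Green-Gauss form of that gradient reconstruction on a single cell with an arbitrary number of edges; it is a textbook illustration of the technique, not the authors' implementation.

    ```python
    # Sketch of a 2-D Green-Gauss (Green's theorem) gradient reconstruction on a
    # single polygonal cell with an arbitrary number of edges: grad(phi) is
    # approximated by the area-weighted sum of face values times outward normals.
    # Standard textbook form, not the authors' code.
    import numpy as np


    def green_gauss_gradient(vertices: np.ndarray, phi_face: np.ndarray) -> np.ndarray:
        """vertices: (n, 2) polygon vertices in counter-clockwise order.
        phi_face: (n,) face-averaged values of the scalar, one per edge."""
        n = len(vertices)
        area = 0.0
        grad = np.zeros(2)
        for i in range(n):
            p0, p1 = vertices[i], vertices[(i + 1) % n]
            edge = p1 - p0
            normal = np.array([edge[1], -edge[0]])         # outward normal times edge length (CCW polygon)
            area += 0.5 * (p0[0] * p1[1] - p1[0] * p0[1])  # shoelace contribution
            grad += phi_face[i] * normal
        return grad / area


    if __name__ == "__main__":
        square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
        # Face-averaged values of phi(x, y) = 2x + 3y at the edge midpoints.
        faces = np.array([2 * 0.5 + 3 * 0.0, 2 * 1.0 + 3 * 0.5, 2 * 0.5 + 3 * 1.0, 2 * 0.0 + 3 * 0.5])
        print(green_gauss_gradient(square, faces))         # expect approximately [2.0, 3.0]
    ```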

  17. A Direct Experimental Evidence For the New Thermodynamic Boundary in the Supercritical State: Implications for Earth and Planetary Sciences.

    NASA Astrophysics Data System (ADS)

    Bolmatov, D.

    2015-12-01

While scientists have a good theoretical understanding of the heat capacity of both solids and gases, a general theory of the heat capacity of liquids has always remained elusive. Apart from being an awkward hole in our knowledge, heat capacity - the amount of heat needed to change a substance's temperature by a certain amount - is a relevant quantity that it would be nice to be able to predict. I will introduce a phonon-based approach to liquids and supercritical fluids to describe their thermodynamics in terms of sound propagation. I will show that the internal liquid energy has transverse sound propagation gaps and explain their evolution with temperature variations on the P-T diagram. I will explain how this theoretical framework covers the Debye theory of solids, the phonon theory of liquids, and thermodynamic limits such as the Dulong-Petit and ideal gas limits. As a result, the experimental evidence for the new thermodynamic boundary in the supercritical state (the Frenkel line) on the P-T phase diagram will be demonstrated. Then, I will report on inelastic X-ray scattering experiments combined with molecular dynamics simulations on deeply supercritical Ar. The presented results unveil the mechanism and regimes of sound propagation in liquid matter and provide compelling evidence for the adiabatic-to-isothermal longitudinal sound propagation transition. As a result, a universal link will be demonstrated between the positive sound dispersion (PSD) phenomenon and the origin of transverse sound propagation, revealing the viscous-to-elastic crossover in compressed liquids. Both can be considered a universal fingerprint of the dynamic response of a liquid. They can then be used for signal detection and analysis of the dynamic response in deep water and other fluids, which is relevant for describing the thermodynamics of gas giants. The consequences of this finding will be discussed, including a physically justified way to demarcate the

  18. An Experimental Study of a Televised Science Series, Grades 1-4, Comparing the Quality and Sequence of Television and Classroom Questions with a Proposed Strategy of Science Instruction.

    ERIC Educational Resources Information Center

    Beisenherz, Paul Chalmers

    The effectiveness of a televised science series used in the Seattle metropolitan area was investigated, using factorial design to provide a treatment variable representing four degrees of utilization of TV science and non-TV science. Teachers and their intact classes were randomly assigned to one of four treatment groups: TV science only, plus…

  19. Evaluating the Information Power Grid using the NAS Grid Benchmarks

    NASA Technical Reports Server (NTRS)

VanderWijngaart, Rob F.; Frumkin, Michael A.

    2004-01-01

The NAS Grid Benchmarks (NGB) are a collection of synthetic distributed applications designed to rate the performance and functionality of computational grids. We compare several implementations of the NGB to determine the programmability and efficiency of NASA's Information Power Grid (IPG), whose services are mostly based on the Globus Toolkit. We report on the overheads involved in porting existing NGB reference implementations to the IPG. No changes were made to the component tasks of the NGB themselves; the results indicate that the usability of the IPG can still be improved.

  20. Adventures in Computational Grids

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Sometimes one supercomputer is not enough. Or your local supercomputers are busy, or not configured for your job. Or you don't have any supercomputers. You might be trying to simulate worldwide weather changes in real time, requiring more compute power than you could get from any one machine. Or you might be collecting microbiological samples on an island, and need to examine them with a special microscope located on the other side of the continent. These are the times when you need a computational grid.

  1. TASMANIAN Sparse Grids Module

    SciTech Connect

Munster, Drayton; Stoyanov, Miroslav

    2013-09-20

Sparse Grids are the family of methods of choice for multidimensional integration and interpolation in low to moderate numbers of dimensions. The method extends a one-dimensional set of abscissas, weights, and basis functions to multiple dimensions by taking a subset of all possible tensor products. The module provides the ability to create global and local approximations based on polynomials and wavelets. The software has three components: a library, a wrapper for the library that provides a command line interface via text files, and a MATLAB interface via the command line tool.
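
    To make the "subset of all possible tensor products" idea concrete, the sketch below enumerates the classic Smolyak index set, keeping only those tensor products of one-dimensional rules whose levels sum to at most a prescribed total, and compares its size with the full tensor-product set. It is a conceptual illustration only, not the TASMANIAN interface.

    ```python
    # Conceptual sketch of the sparse-grid selection rule: in the classic Smolyak
    # construction, a d-dimensional grid of total level L keeps only those tensor
    # products of 1-D rules whose per-dimension levels sum to at most L.
    # This is not the TASMANIAN API.
    import math
    from itertools import product


    def smolyak_index_set(dim: int, level: int) -> list[tuple[int, ...]]:
        """All multi-indices (l_1, ..., l_dim) with l_i >= 0 and sum(l_i) <= level."""
        return [idx for idx in product(range(level + 1), repeat=dim) if sum(idx) <= level]


    def points_per_level(l: int) -> int:
        """1-D rule sizes of a typical nested family (e.g. Clenshaw-Curtis):
        1 point at level 0, then 2**l + 1 points at level l >= 1."""
        return 1 if l == 0 else 2 ** l + 1


    if __name__ == "__main__":
        dim, level = 4, 3
        subset = smolyak_index_set(dim, level)
        full = (level + 1) ** dim                     # size of the full tensor-product index set
        print(f"kept {len(subset)} of {full} tensor products")
        # Rough point-count comparison (overlaps between nested levels are counted here):
        sparse_pts = sum(math.prod(points_per_level(l) for l in idx) for idx in subset)
        full_pts = points_per_level(level) ** dim
        print(f"points: ~{sparse_pts} (sparse, with overlaps) vs {full_pts} (full tensor grid)")
    ```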

  2. TASMANIAN Sparse Grids Module

    Energy Science and Technology Software Center (ESTSC)

    2013-09-20

Sparse Grids are the family of methods of choice for multidimensional integration and interpolation in low to moderate numbers of dimensions. The method extends a one-dimensional set of abscissas, weights, and basis functions to multiple dimensions by taking a subset of all possible tensor products. The module provides the ability to create global and local approximations based on polynomials and wavelets. The software has three components: a library, a wrapper for the library that provides a command line interface via text files, and a MATLAB interface via the command line tool.

  3. 78 FR 70076 - Large Scale Networking (LSN)-Middleware and Grid Interagency Coordination (MAGIC) Team

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... From the Federal Register Online via the Government Publishing Office NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Middleware and Grid Interagency Coordination (MAGIC) Team AGENCY: The... and responsibility for middleware, Grid, and cloud projects. The MAGIC Team reports to the Large...

  4. The Volume Grid Manipulator (VGM): A Grid Reusability Tool

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    This document is a manual describing how to use the Volume Grid Manipulation (VGM) software. The code is specifically designed to alter or manipulate existing surface and volume structured grids to improve grid quality through the reduction of grid line skewness, removal of negative volumes, and adaption of surface and volume grids to flow field gradients. The software uses a command language to perform all manipulations thereby offering the capability of executing multiple manipulations on a single grid during an execution of the code. The command language can be input to the VGM code by a UNIX style redirected file, or interactively while the code is executing. The manual consists of 14 sections. The first is an introduction to grid manipulation; where it is most applicable and where the strengths of such software can be utilized. The next two sections describe the memory management and the manipulation command language. The following 8 sections describe simple and complex manipulations that can be used in conjunction with one another to smooth, adapt, and reuse existing grids for various computations. These are accompanied by a tutorial section that describes how to use the commands and manipulations to solve actual grid generation problems. The last two sections are a command reference guide and trouble shooting sections to aid in the use of the code as well as describe problems associated with generated scripts for manipulation control.

  5. GRID[subscript C] Renewable Energy Data Streaming into Classrooms

    ERIC Educational Resources Information Center

    DeLuca, V. William; Carpenter, Pam; Lari, Nasim

    2010-01-01

    For years, researchers have shown the value of using real-world data to enhance instruction in mathematics, science, and social studies. In an effort to help develop students' higher-order thinking skills in a data-rich learning environment, Green Research for Incorporating Data in the Classroom (GRID[subscript C]), a National Science…

  6. National Grid Deep Energy Retrofit Pilot Program—Clark Residence

    SciTech Connect

    2010-03-30

In this case study, Building Science Corporation partnered with the local utility company, National Grid, on a deep energy retrofit pilot program for Massachusetts homes. This project involved the renovation of an 18th-century Cape-style building and achieved a super-insulated enclosure (R-35 walls, R-50+ roof, R-20+ foundation), extensive water management improvements, a high-efficiency water heater, and state-of-the-art ventilation.

  7. A Diagnostic Study of Computer Application of Structural Communication Grid

    ERIC Educational Resources Information Center

    Bahar, Mehmet; Aydin, Fatih; Karakirik, Erol

    2009-01-01

In this article, the structural communication grid (SCG), an alternative measurement and evaluation technique, is first summarised, and the design, development, and implementation of a computer-based SCG system are introduced. The system is then tested on a sample of 154 participants consisting of candidate students, science teachers and…

  8. e-Science and its implications.

    PubMed

    Hey, Tony; Trefethen, Anne

    2003-08-15

    After a definition of e-science and the Grid, the paper begins with an overview of the technological context of Grid developments. NASA's Information Power Grid is described as an early example of a 'prototype production Grid'. The discussion of e-science and the Grid is then set in the context of the UK e-Science Programme and is illustrated with reference to some UK e-science projects in science, engineering and medicine. The Open Standards approach to Grid middleware adopted by the community in the Global Grid Forum is described and compared with community-based standardization processes used for the Internet, MPI, Linux and the Web. Some implications of the imminent data deluge that will arise from the new generation of e-science experiments in terms of archiving and curation are then considered. The paper concludes with remarks about social and technological issues posed by Grid-enabled 'collaboratories' in both scientific and commercial contexts. PMID:12952686

  9. A grid quality manipulation system

    NASA Technical Reports Server (NTRS)

    Lu, Ning; Eiseman, Peter R.

    1991-01-01

    A grid quality manipulation system is described. The elements of the system are the measures by which quality is assessed, the computer graphic display of those measures, and the local grid manipulation to provide a response to the viewed quality indication. The display is an overlaid composite where the region is first covered with colors to reflect the values of the quality indicator, the grid is then placed on top of those colors, and finally a control net is placed on top of everything. The net represents the grid in terms of the control point form of algebraic grid generation. As a control point is moved, both the grid and the colored quality measures also move. This is a real time dynamic action so that the consequences of the manipulation are continuously seen.

  10. Prepares Overset Grids for Processing

    Energy Science and Technology Software Center (ESTSC)

    1998-04-22

Many large and complex computational problems require multiple, structured, generically overlapped (overset) grids to obtain numerical solutions in a timely manner. BREAKUP significantly reduces required compute times by preparing overset grids for processing on massively parallel computers. BREAKUP subdivides the original grids for use on a user-specified number of parallel processors. Grid-to-grid and intragrid communications are maintained in the parallel environment via connectivity tables generated by BREAKUP. The subgrids are formed to be statically load balanced and to incur a minimum of communication between the subgrids. When the output of BREAKUP is submitted to an appropriately modified flow solver, subgrid solutions will be updated simultaneously. This contrasts with the much less efficient solution method of updating each original grid sequentially as done in the past.

  11. Colorado Electrical Transmission Grid

    DOE Data Explorer

    Zehner, Richard E.

    2012-02-01

Citation Information: Originator: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder; Originator: Xcel Energy. Publication Date: 2012. Title: Colorado Xcel Energy and Non-Xcel Transmission Network. Edition: First. Publication Information: Publication Place: Earth Science & Observation Center, Cooperative Institute for Research in Environmental Science (CIRES), University of Colorado, Boulder; Publisher: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder. Description: This layer contains the transmission network of Colorado. Spatial Domain: Extent: Top: 4540689.017558 m; Left: 160606.141934 m; Right: 758715.946645 m; Bottom: 4098910.893397 m. Contact Information: Contact Organization: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder; Contact Person: Khalid Hussein; Address: CIRES, Ekeley Building, Earth Science & Observation Center (ESOC), 216 UCB; City: Boulder; State: CO; Postal Code: 80309-0216; Country: USA; Contact Telephone: 303-492-6782. Spatial Reference Information: Coordinate System: Universal Transverse Mercator (UTM), WGS 1984, Zone 13N; False Easting: 500000.00000000; False Northing: 0.00000000; Central Meridian: -105.00000000; Scale Factor: 0.99960000; Latitude of Origin: 0.00000000; Linear Unit: Meter; Datum: World Geodetic System 1984 (WGS 1984); Prime Meridian: Greenwich; Angular Unit: Degree. Digital Form: Format Name: Shapefile.

  12. 77 FR 58416 - Large Scale Networking (LSN); Middleware and Grid Interagency Coordination (MAGIC) Team

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... From the Federal Register Online via the Government Publishing Office NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Middleware and Grid Interagency Coordination (MAGIC) Team AGENCY: The... to the Large Scale Networking (LSN) Coordinating Group (CG). Public Comments: The government...

  13. 78 FR 7464 - Large Scale Networking (LSN)-Middleware And Grid Interagency Coordination (MAGIC) Team

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... From the Federal Register Online via the Government Publishing Office NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Middleware And Grid Interagency Coordination (MAGIC) Team AGENCY: The... Team reports to the Large Scale Networking (LSN) Coordinating Group (CG). Public Comments:...

  14. Smart Grid Enabled EVSE

    SciTech Connect

    None, None

    2014-10-15

    The combined team of GE Global Research, Federal Express, National Renewable Energy Laboratory, and Consolidated Edison has successfully achieved the established goals contained within the Department of Energy’s Smart Grid Capable Electric Vehicle Supply Equipment funding opportunity. The final program product, shown charging two vehicles in Figure 1, reduces by nearly 50% the total installed system cost of the electric vehicle supply equipment (EVSE) as well as enabling a host of new Smart Grid enabled features. These include bi-directional communications, load control, utility message exchange and transaction management information. Using the new charging system, Utilities or energy service providers will now be able to monitor transportation related electrical loads on their distribution networks, send load control commands or preferences to individual systems, and then see measured responses. Installation owners will be able to authorize usage of the stations, monitor operations, and optimally control their electricity consumption. These features and cost reductions have been developed through a total system design solution.

  15. Grid Task Execution

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2007-01-01

    IPG Execution Service is a framework that reliably executes complex jobs on a computational grid, and is part of the IPG service architecture designed to support location-independent computing. The new grid service enables users to describe the platform on which they need a job to run, which allows the service to locate the desired platform, configure it for the required application, and execute the job. After a job is submitted, users can monitor it through periodic notifications, or through queries. Each job consists of a set of tasks that performs actions such as executing applications and managing data. Each task is executed based on a starting condition that is an expression of the states of other tasks. This formulation allows tasks to be executed in parallel, and also allows a user to specify tasks to execute when other tasks succeed, fail, or are canceled. The two core components of the Execution Service are the Task Database, which stores tasks that have been submitted for execution, and the Task Manager, which executes tasks in the proper order, based on the user-specified starting conditions, and avoids overloading local and remote resources while executing tasks.
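
    A minimal sketch of the starting-condition idea described above follows: each task declares a predicate over the states of the other tasks, and a simple manager repeatedly starts whatever has become eligible. The class and function names are hypothetical and do not reflect the IPG Execution Service API.

    ```python
    # Hypothetical sketch of "starting conditions": each task is started once its
    # predicate over the states of other tasks becomes true, which allows
    # run-on-success, run-on-failure, and parallel execution patterns.
    # Not the IPG Execution Service API.
    from dataclasses import dataclass
    from typing import Callable, Dict


    @dataclass
    class Task:
        name: str
        action: Callable[[], bool]                       # returns True on success
        start_when: Callable[[Dict[str, str]], bool]     # predicate over other tasks' states
        state: str = "pending"                           # pending -> succeeded | failed


    def run_tasks(tasks: Dict[str, Task]) -> None:
        progressed = True
        while progressed:
            progressed = False
            states = {name: t.state for name, t in tasks.items()}
            for t in tasks.values():
                if t.state == "pending" and t.start_when(states):
                    t.state = "succeeded" if t.action() else "failed"
                    progressed = True


    if __name__ == "__main__":
        tasks = {
            "stage_in": Task("stage_in", lambda: True, lambda s: True),
            "compute": Task("compute", lambda: True,
                            lambda s: s["stage_in"] == "succeeded"),
            "cleanup": Task("cleanup", lambda: True,
                            # run once compute has finished, whether it succeeded or failed
                            lambda s: s["compute"] in ("succeeded", "failed")),
        }
        run_tasks(tasks)
        print({name: t.state for name, t in tasks.items()})
    ```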

  16. On unstructured grids and solvers

    NASA Technical Reports Server (NTRS)

    Barth, T. J.

    1990-01-01

    The fundamentals and the state-of-the-art technology for unstructured grids and solvers are highlighted. Algorithms and techniques pertinent to mesh generation are discussed. It is shown that grid generation and grid manipulation schemes rely on fast multidimensional searching. Flow solution techniques for the Euler equations, which can be derived from the integral form of the equations are discussed. Sample calculations are also provided.

  17. Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale

    NASA Astrophysics Data System (ADS)

    Barrios, M. I.

    2013-12-01

Hydrological science requires the emergence of a consistent theoretical corpus describing the relationships between dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make the development of multiscale conceptualizations difficult, so an understanding of scaling is a key issue in advancing the science. This work focuses on the use of virtual experiments to address the scaling of vertical infiltration from a physically based model at the point scale to a simplified, physically meaningful modeling approach at the grid-cell scale. Numerical simulations have the advantage over field experimentation of covering a wide range of boundary and initial conditions. The aim of the work was to show the utility of numerical simulations in discovering relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium for teaching the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at the point scale, and a conceptual storage model was employed to simulate the infiltration process at the grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at the point scale. The linkages between point-scale parameters and grid-cell-scale parameters were established by inverse simulations based on the mass balance equation and the averaging of the flow at the point scale. The results show numerical stability issues for particular conditions, reveal the complex nature of the non-linear relationships between the models' parameters at both scales, and indicate that the parameterization of point-scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by the students to identify potential research questions on scale issues.
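
    For reference, the point-scale model named in the abstract is Green-Ampt, whose cumulative infiltration under ponded conditions satisfies F = K_s t + ψΔθ ln(1 + F/(ψΔθ)). The sketch below solves this implicit relation by fixed-point iteration, with illustrative (not site-specific) parameter values.

    ```python
    # Minimal sketch of the point-scale Green-Ampt model referenced above:
    # cumulative infiltration F(t) satisfies F = Ks*t + psi*d_theta*ln(1 + F/(psi*d_theta)),
    # solved here by fixed-point iteration. Parameter values are illustrative only.
    import math


    def green_ampt_cumulative(t_hr: float, ks: float, psi: float, d_theta: float,
                              tol: float = 1e-8, max_iter: int = 200) -> float:
        """Cumulative infiltration F (cm) after t_hr hours of ponded infiltration.
        ks: saturated hydraulic conductivity (cm/h), psi: wetting-front suction (cm),
        d_theta: soil moisture deficit (-)."""
        s = psi * d_theta
        f = max(ks * t_hr, 1e-6)                        # initial guess
        for _ in range(max_iter):
            f_new = ks * t_hr + s * math.log(1.0 + f / s)
            if abs(f_new - f) < tol:
                break
            f = f_new
        return f


    def infiltration_rate(f: float, ks: float, psi: float, d_theta: float) -> float:
        """Infiltration capacity f_p = Ks * (1 + psi*d_theta / F) (cm/h)."""
        return ks * (1.0 + psi * d_theta / f)


    if __name__ == "__main__":
        ks, psi, d_theta = 0.65, 16.7, 0.34             # illustrative silt-loam-like values
        for t in (0.5, 1.0, 2.0):
            F = green_ampt_cumulative(t, ks, psi, d_theta)
            print(f"t = {t:3.1f} h: F = {F:5.2f} cm, "
                  f"rate = {infiltration_rate(F, ks, psi, d_theta):4.2f} cm/h")
    ```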

  18. Grid Generation Techniques Utilizing the Volume Grid Manipulator

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1998-01-01

    This paper presents grid generation techniques available in the Volume Grid Manipulation (VGM) code. The VGM code is designed to manipulate existing line, surface and volume grids to improve the quality of the data. It embodies an easy to read rich language of commands that enables such alterations as topology changes, grid adaption and smoothing. Additionally, the VGM code can be used to construct simplified straight lines, splines, and conic sections which are common curves used in the generation and manipulation of points, lines, surfaces and volumes (i.e., grid data). These simple geometric curves are essential in the construction of domain discretizations for computational fluid dynamic simulations. By comparison to previously established methods of generating these curves interactively, the VGM code provides control of slope continuity and grid point-to-point stretchings as well as quick changes in the controlling parameters. The VGM code offers the capability to couple the generation of these geometries with an extensive manipulation methodology in a scripting language. The scripting language allows parametric studies of a vehicle geometry to be efficiently performed to evaluate favorable trends in the design process. As examples of the powerful capabilities of the VGM code, a wake flow field domain will be appended to an existing X33 Venturestar volume grid; negative volumes resulting from grid expansions to enable flow field capture on a simple geometry, will be corrected; and geometrical changes to a vehicle component of the X33 Venturestar will be shown.

  19. From the grid to the smart grid, topologically

    NASA Astrophysics Data System (ADS)

    Pagani, Giuliano Andrea; Aiello, Marco

    2016-05-01

    In its more visionary acceptation, the smart grid is a model of energy management in which the users are engaged in producing energy as well as consuming it, while having information systems fully aware of the energy demand-response of the network and of dynamically varying prices. A natural question is then: to make the smart grid a reality will the distribution grid have to be upgraded? We assume a positive answer to the question and we consider the lower layers of medium and low voltage to be the most affected by the change. In our previous work, we analyzed samples of the Dutch distribution grid (Pagani and Aiello, 2011) and we considered possible evolutions of these using synthetic topologies modeled after studies of complex systems in other technological domains (Pagani and Aiello, 2014). In this paper, we take an extra important step by defining a methodology for evolving any existing physical power grid to a good smart grid model, thus laying the foundations for a decision support system for utilities and governmental organizations. In doing so, we consider several possible evolution strategies and apply them to the Dutch distribution grid. We show how increasing connectivity is beneficial in realizing more efficient and reliable networks. Our proposal is topological in nature, enhanced with economic considerations of the costs of such evolutions in terms of cabling expenses and economic benefits of evolving the grid.

  20. Topology and grid adaption for high-speed flow computations

    NASA Technical Reports Server (NTRS)

    Abolhassani, Jamshid S.; Tiwari, Surendra N.

    1989-01-01

    This study investigates the effects of grid topology and grid adaptation on numerical solutions of the Navier-Stokes equations. In the first part of this study, a general procedure is presented for computation of high-speed flow over complex three-dimensional configurations. The flow field is simulated on the surface of a Butler wing in a uniform stream. Results are presented for Mach number 3.5 and a Reynolds number of 2,000,000. The O-type and H-type grids have been used for this study, and the results are compared together and with other theoretical and experimental results. The results demonstrate that while the H-type grid is suitable for the leading and trailing edges, a more accurate solution can be obtained for the middle part of the wing with an O-type grid. In the second part of this study, methods of grid adaption are reviewed and a method is developed with the capability of adapting to several variables. This method is based on a variational approach and is an algebraic method. Also, the method has been formulated in such a way that there is no need for any matrix inversion. This method is used in conjunction with the calculation of hypersonic flow over a blunt-nose body. A movie has been produced which shows simultaneously the transient behavior of the solution and the grid adaption.

  1. Density separation of solids in ferrofluids with magnetic grids

    SciTech Connect

    Fay, H.; Quets, J.M.

    1980-04-01

    Nonmagnetic solids in a superparamagnetic ferrofluid are subjected to body forces proportional to the intensity of magnetization of the fluid and the gradient of the magnetic field. An apparent density of the fluid can be defined from the force equations, and since the apparent density can be much larger than the true density, it is possible to levitate or float dense objects. Mixtures of solids with a density greater than the apparent density sink while lower density solids float. In practice it is difficult to create a uniform gradient over a large volume and single gap magnetic separators require very large magnets or have a limited throughput. To overcome that problem, multiple gap magnetic grids have been designed. Such grids consist of planar arrays of parallel bars of alternating polarity, driven by permanent magnets. When immersed in ferrofluid, magnetic grids create nonuniform field gradients and apparent densities in the fluid. However, both analysis and experimental measurements show that the grid acts as a barrier to particles below a critical density, while permitting more dense particles to fall through the grid. Thus, a magnetic grid filter can be used as a high throughput binary separator of solids according to their densities. Such filters can be cascaded for more complex separations. Several magnetic grid filters have been designed, built, and tested. Magnetic measurements qualitatively agree with the theoretical predictions. Experiments with synthetic mixtures have demonstrated that good binary separations can be made.
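
    One commonly quoted form of the apparent-density balance sketched in this abstract is ρ_app = ρ_fluid + M|∇B|/g, with M the fluid magnetization and ∇B the flux-density gradient in the gap; that specific expression is an assumption here rather than a formula taken from the report. The sketch below applies it to the sink/float decision at the grid.

    ```python
    # Sketch of the sink/float criterion behind the magnetic-grid separator.
    # The apparent-density expression rho_app = rho_fluid + M * |dB/dz| / g is a
    # commonly quoted form of the ferrofluid force balance and is an assumption
    # here, not a formula copied from the report.
    G = 9.81  # m/s^2


    def apparent_density(rho_fluid: float, magnetization: float, grad_b: float) -> float:
        """rho_fluid in kg/m^3, magnetization M in A/m, field gradient |dB/dz| in T/m."""
        return rho_fluid + magnetization * grad_b / G


    def sinks_through_grid(rho_particle: float, rho_app: float) -> bool:
        """A nonmagnetic particle falls through the grid only if it is denser than
        the apparent density of the magnetized ferrofluid in the gap."""
        return rho_particle > rho_app


    if __name__ == "__main__":
        rho_app = apparent_density(rho_fluid=1200.0, magnetization=2.0e4, grad_b=2.0)
        print(f"apparent density ~ {rho_app:.0f} kg/m^3")
        for name, rho in [("aluminum", 2700.0), ("copper", 8960.0)]:
            verdict = "sinks through" if sinks_through_grid(rho, rho_app) else "floats on"
            print(f"{name} ({rho:.0f} kg/m^3) {verdict} the grid")
    ```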

  2. NAS Grid Benchmarks: A Tool for Grid Space Exploration

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; VanderWijngaart, Rob F.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We present an approach for benchmarking services provided by computational Grids. It is based on the NAS Parallel Benchmarks (NPB) and is called NAS Grid Benchmark (NGB) in this paper. We present NGB as a data flow graph encapsulating an instance of an NPB code in each graph node, which communicates with other nodes by sending/receiving initialization data. These nodes may be mapped to the same or different Grid machines. Like NPB, NGB will specify several different classes (problem sizes). NGB also specifies the generic Grid services sufficient for running the bench-mark. The implementor has the freedom to choose any specific Grid environment. However, we describe a reference implementation in Java, and present some scenarios for using NGB.
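
    A minimal sketch of the NGB-style data flow graph described above follows: each node wraps an NPB kernel instance and may start only after the nodes feeding it have delivered their data. The particular edges are illustrative, not the official NGB graph definitions.

    ```python
    # Minimal sketch of an NGB-style data flow graph: nodes wrap NPB kernel
    # instances, edges carry initialization data, and a node can only launch once
    # its predecessors have produced their output. The edges below are
    # illustrative, not an actual NGB class definition.
    from graphlib import TopologicalSorter

    # node -> set of nodes whose output it consumes
    dataflow = {
        "BT": {"SP"},        # BT consumes data produced by SP
        "SP": {"MG"},
        "LU": {"MG"},
        "MG": set(),         # entry node: generates the initial data set
    }

    order = list(TopologicalSorter(dataflow).static_order())
    print("launch order:", order)   # e.g. ['MG', 'SP', 'LU', 'BT']
    ```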

  3. Celebrating the 65th anniversary of the Russian Federal Nuclear Center — All-Russian Research Institute of Experimental Physics (Scientific session of the Physical Sciences Division of the Russian Academy of Sciences, 6 October 2010)

    NASA Astrophysics Data System (ADS)

    2011-04-01

    A scientific session of the Physical Sciences Division of the Russian Academy of Sciences (RAS) took place on 6 October 2010 in the Conference Hall of the Lebedev Physical Institute, RAS (FIAN) on the occasion of the 65th anniversary of founding of the Russian Federal Nuclear Center — All-Russian Research Institute of Experimental Physics (RFNC-VNIIEF).The agenda of the session announced on the website www.gpad.ac.ru of the RAS Physical Sciences Division listed the following reports: (1) Ilkaev R I (RFNC-VNIIEF, Sarov, Nizhny Novgorod region). Opening remarks "On the fundamental physics research programs at RFNC-VNIIEF" (2) Mikhailov A L (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Hydrodynamic instabilities in various media"; (3) Trunin R F (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Study of extreme states of metals using shock waves"; (4) Ivanovskii A V (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Explosive magnetic energy generators and their application in research"; (5) Podurets A M (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "X-ray studies of the structure of matter in shock waves"; (6) Garanin S G (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "High-power lasers in studies of the physics of hot, dense plasma and thermonuclear fusion"; (7) Selemir V D (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Physics research in ultrahigh magnetic fields"; (8) Mkhitar'yan L S (RFNC-VNIIEF, Sarov, Nizhny Novgorod region) "Gasdynamic thermonuclear fusion."Articles based on reports 1-7 are published below. An extended version of report 3 written as a review paper will be published in a later issue of Physics-Uspekhi. • Fundamental physics research at the All-Russian Research Institute of Experimental Physics, R I Ilkaev Physics-Uspekhi, 2011, Volume 54, Number 4, Pages 387-392 • Hydrodynamic instabilities, A L Mikhailov, N V Nevmerzhitskii, V A Raevskii Physics-Uspekhi, 2011, Volume 54, Number 4, Pages 392-397 • Extreme states of metals: investigation using shock

  4. Genetic-Annealing Algorithm in Grid Environment for Scheduling Problems

    NASA Astrophysics Data System (ADS)

    Cruz-Chávez, Marco Antonio; Rodríguez-León, Abelardo; Ávila-Melgar, Erika Yesenia; Juárez-Pérez, Fredy; Cruz-Rosales, Martín H.; Rivera-López, Rafael

This paper presents a parallel hybrid evolutionary algorithm executed in a grid environment. The algorithm performs local searches using simulated annealing within a genetic algorithm to solve the job shop scheduling problem. Experimental results obtained with the algorithm on the "Tarantula MiniGrid" are shown. Tarantula was implemented by linking two clusters from different geographic locations in Mexico (Morelos-Veracruz). The technique used to link the two clusters and configure the Tarantula MiniGrid is described, and the effects of latency in communication between the two clusters are discussed. It is shown that the evolutionary algorithm presented is more efficient when working in grid environments because it can carry out greater exploration and exploitation of the solution space.
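
    The hybrid idea, simulated-annealing local search applied to the offspring of a genetic algorithm, can be sketched as below on a toy permutation objective; this stands in for the job shop problem and is not the authors' parallel grid implementation.

    ```python
    # Conceptual sketch of the hybrid described above: a genetic algorithm whose
    # offspring are refined by a short simulated-annealing local search. A toy
    # permutation objective stands in for the job shop scheduling problem.
    import math
    import random


    def cost(perm):                       # toy objective: distance from sorted order
        return sum(abs(v - i) for i, v in enumerate(perm))


    def anneal(perm, t0=5.0, cooling=0.95, steps=200):
        """Simulated-annealing local search on a permutation (swap neighborhood)."""
        cur, best, temp = list(perm), list(perm), t0
        for _ in range(steps):
            i, j = random.sample(range(len(cur)), 2)
            cand = list(cur)
            cand[i], cand[j] = cand[j], cand[i]
            delta = cost(cand) - cost(cur)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                cur = cand
                if cost(cur) < cost(best):
                    best = list(cur)
            temp *= cooling
        return best


    def genetic_annealing(n=12, pop_size=20, generations=30):
        pop = [random.sample(range(n), n) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=cost)
            parents = pop[: pop_size // 2]                         # truncation selection
            children = []
            for _ in range(pop_size - len(parents)):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, n)
                child = a[:cut] + [g for g in b if g not in a[:cut]]  # simplified order crossover
                children.append(anneal(child))                     # SA refines every offspring
            pop = parents + children
        return min(pop, key=cost)


    if __name__ == "__main__":
        best = genetic_annealing()
        print("best permutation:", best, "cost:", cost(best))
    ```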

  5. Grid Erosion Modeling of the NEXT Ion Thruster Optics

    NASA Technical Reports Server (NTRS)

    Ernhoff, Jerold W.; Boyd, Iain D.; Soulas, George (Technical Monitor)

    2003-01-01

    Results from several different computational studies of the NEXT ion thruster optics are presented. A study of the effect of beam voltage on accelerator grid aperture wall erosion shows a non-monotonic, complex behavior. Comparison to experimental performance data indicates improvements in simulation of the accelerator grid current, as well as very good agreement with other quantities. Also examined is the effect of ion optics choice on the thruster life, showing that TAG optics provide better margin against electron backstreaming than NSTAR optics. The model is used to predict the change in performance with increasing accelerator grid voltage, showing that although the current collected on the accel grid downstream face increases, the erosion rate decreases. A study is presented for varying doubly-ionized Xenon current fraction. The results show that performance data is not extremely sensitive to the current fraction.

  6. On the Vortical-Flow Prediction Capability of an Unstructured-Grid Euler Solver

    NASA Technical Reports Server (NTRS)

    Ghaffari, Farhad

    1994-01-01

The results from a concentrated computational effort are presented, with the primary objective of evaluating the vortical-flow-prediction capability of an unstructured-grid Euler solver. Both viscous and inviscid solutions obtained from an established structured-grid method, along with experimental wind-tunnel data, are used as benchmark measures to assess the validity of the unstructured-grid Euler results. Viscous effects on vortical flows are first identified by comparing the viscous and inviscid solutions obtained from the structured-grid method. Computational data analyses are then presented which reveal excellent correlations between the inviscid structured- and unstructured-grid results in terms of off-surface flow structures, surface pressure distribution, and the predicted longitudinal aerodynamic characteristics. The sensitivity of the unstructured-grid inviscid solutions to grid refinement is also discussed, along with an analysis of the convergence and performance characteristics of each method.

  7. Qualitative Science in Experimental Time

    ERIC Educational Resources Information Center

    Eisenhart, Margaret

    2006-01-01

    This article addresses the "state of qualitative inquiry" in the sense of how that inquiry is being positioned in the current construction of a US national policy agenda for "scientifically based" education research. In the author's view, qualitative inquiry is being drowned out in the national agenda despite its ability to provide the kinds of…

  8. TIGER: Turbomachinery interactive grid generation

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.; Shih, Ming-Hsin; Janus, J. Mark

    1992-01-01

A three-dimensional, interactive grid generation code, TIGER, is being developed for analysis of flows around ducted or unducted propellers. TIGER is a customized grid generator that combines new technology with methods from general grid generation codes. The code generates multiple-block, structured grids around multiple blade rows with a hub and shroud for either C-grid or H-grid topologies. The code is intended for use with a Euler/Navier-Stokes solver also being developed, but is general enough for use with other flow solvers. TIGER features a Silicon Graphics interactive graphics environment that displays a pop-up window, a graphics window, and a text window. The geometry is read as a discrete set of points, with options for several industrial standard formats and NASA standard formats. Various splines are available for defining the surface geometries. Grid generation is done either interactively or through a batch mode operation using history files from a previously generated grid. The batch mode operation can be done either with a graphical display of the interactive session or with no graphics so that the code can be run on another computer system. Run time can be significantly reduced by running on a Cray Y-MP.

  9. Grid generation using classical techniques

    NASA Technical Reports Server (NTRS)

    Moretti, G.

    1980-01-01

    A brief historical review of conformal mapping and its applications to problems in fluid mechanics and electromagnetism is presented. The use of conformal mapping as a grid generator is described. The philosophy of the 'closed form' approach and its application to a Neumann problem is discussed. Karman-Trefftz mappings and grids for ablated, three dimensional bodies are also discussed.

  10. Structured and unstructured grid generation.

    PubMed

    Thompson, J F; Weatherill, N P

    1992-01-01

    Current techniques in composite-block-structured grid generation and unstructured grid generation for general 3D geometries are summarized, including both algebraic and elliptic generation procedures for the former and Delaunay tessellations for the latter. Citations of relevant theory are given. Examples of applications for several geometries are included. PMID:1424687

  11. Intelligent automated surface grid generation

    NASA Technical Reports Server (NTRS)

    Yao, Ke-Thia; Gelsey, Andrew

    1995-01-01

The goal of our research is to produce a flexible, general grid generator for automated use by other programs, such as numerical optimizers. The current trend in the gridding field is toward interactive gridding, which more readily taps into the spatial reasoning abilities of the human user through a graphical interface with a mouse. However, a sometimes fruitful approach to generating new designs is to apply an optimizer with shape modification operators to improve an initial design. In order for this approach to be useful, the optimizer must be able to automatically grid and evaluate the candidate designs. This paper describes an intelligent gridder that is capable of analyzing the topology of the spatial domain and predicting approximate physical behaviors based on the geometry of the spatial domain to automatically generate grids for computational fluid dynamics simulators. Typically, gridding programs are given a partitioning of the spatial domain to assist the gridder. Our gridder is capable of performing this partitioning, which enables it to automatically grid spatial domains with a wide range of configurations.

  12. Some Observations on Grid Convergence

    NASA Technical Reports Server (NTRS)

Salas, Manuel D.

    2006-01-01

    It is claimed that current practices in grid convergence studies, particularly in the field of external aerodynamics, are flawed. The necessary conditions to properly establish grid convergence are presented. A theoretical model and a numerical example are used to demonstrate these ideas.

  13. Science Grade 9, Science Curriculum Materials.

    ERIC Educational Resources Information Center

    Rochester City School District, NY.

    This curriculum guide is the third in a series of general science guides modified from the New York State Experimental Syllabus, Science 7-8-9 to meet the needs of students whose interests are in areas other than science. The guide is laboratory-oriented and contains many open ended, pupil activities in five activity blocks: orientation, forces at…

  14. Grid Interoperation with ARC middleware for the CMS experiment

    NASA Astrophysics Data System (ADS)

    Edelmann, Erik; Field, Laurence; Frey, Jaime; Grønager, Michael; Happonen, Kalle; Johansson, Daniel; Kleist, Josva; Klem, Jukka; Koivumäki, Jesper; Lindén, Tomas; Pirinen, Antti; Qing, Di

    2010-04-01

    The Compact Muon Solenoid (CMS) is one of the general purpose experiments at the CERN Large Hadron Collider (LHC). CMS computing relies on different grid infrastructures to provide computational and storage resources. The major grid middleware stacks used for CMS computing are gLite, Open Science Grid (OSG) and ARC (Advanced Resource Connector). Helsinki Institute of Physics (HIP) hosts one of the Tier-2 centers for CMS computing. CMS Tier-2 centers operate software systems for data transfers (PhEDEx), Monte Carlo production (ProdAgent) and data analysis (CRAB). In order to provide the Tier-2 services for CMS, HIP uses tools and components from both ARC and gLite grid middleware stacks. Interoperation between grid systems is a challenging problem and HIP uses two different solutions to provide the needed services. The first solution is based on gLite-ARC grid level interoperability. This allows to use ARC resources in CMS without modifying the CMS application software. The second solution is based on developing specific ARC plugins in CMS software.

  15. DNS of vibrating grid turbulence

    NASA Astrophysics Data System (ADS)

    Khujadze, G.; Oberlack, M.

Direct numerical simulation of the turbulence generated at a grid vibrating normally to itself, using a spectral code [1], is presented. Due to zero mean shear there is no production of turbulence apart from the grid. The action of the grid is mimicked by a forcing function implemented in the middle of the simulation box: $f_i(x_1, x_2) = \frac{n^2 S}{2}\left\{\left|\frac{\delta_{i3}}{4}\cos\left(\frac{2\pi}{M}x_1\right)\cos\left(\frac{2\pi}{M}x_2\right)\right|\sin(nt) + \frac{\beta_i}{4}\right\}$, where M is the mesh size, S/2 is the amplitude or stroke of the grid, and n is the frequency. The β_i are random numbers with uniform distribution. The simulations were performed for the following parameters: x_1, x_2 ∈ [-π, π], x_3 ∈ [-2π, 2π]; Re = nS²/ν = 1000; S/M = 2; numerical grid: 128 × 128 × 256.

  17. Framework for Interactive Parallel Dataset Analysis on the Grid

    SciTech Connect

    Alexander, David A.; Ananthan, Balamurali; Johnson, Tony; Serbo, Victor; /SLAC

    2007-01-10

    We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back to the client, and to construct professional-quality visualizations of the results.

  18. Tools and Techniques for Measuring and Improving Grid Performance

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.

  19. Hybrid Scheduling Model for Independent Grid Tasks

    PubMed Central

    Shanthini, J.; Kalaikumaran, T.; Karthik, S.

    2015-01-01

Grid computing facilitates resource sharing across administrative domains that are geographically distributed. Scheduling in a distributed heterogeneous environment is intrinsically very hard because of the heterogeneous nature of the resource collection. Makespan and tardiness are two different measures of scheduling quality; much previous research has concentrated on reduction of makespan, which measures machine utilization. In this paper, we propose a hybrid scheduling algorithm for scheduling independent grid tasks with the objective of reducing the total weighted tardiness of the grid tasks. Tardiness measures due-date performance, which has a direct impact on the cost of executing jobs. We propose the BG_ATC algorithm, a combination of best gap (BG) search and an Apparent Tardiness Cost (ATC) indexing algorithm, and we implement these two algorithms in two different phases of the scheduling process. In addition, a comparison was made with various benchmark algorithms, and the experimental results show that our algorithm outperforms the benchmarks. PMID:26543897
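
    The ATC component of BG_ATC is the classic Apparent Tardiness Cost dispatching index, I_j(t) = (w_j/p_j) exp(-[d_j - p_j - t]^+ / (k p̄)). The sketch below computes that index and builds a greedy single-machine sequence from it; the task data and the look-ahead parameter k are illustrative, and the best-gap (BG) phase is omitted.

    ```python
    # Sketch of the Apparent Tardiness Cost (ATC) indexing rule: at time t,
    # task j gets priority I_j(t) = (w_j / p_j) * exp(-max(d_j - p_j - t, 0) / (k * p_bar)),
    # where p_bar is the mean processing time and k a look-ahead parameter.
    # Task data and k are illustrative; the best-gap (BG) phase is omitted.
    import math
    from dataclasses import dataclass


    @dataclass
    class GridTask:
        name: str
        p: float   # processing time
        d: float   # due date
        w: float   # tardiness weight


    def atc_index(task: GridTask, t: float, p_bar: float, k: float = 2.0) -> float:
        slack = max(task.d - task.p - t, 0.0)
        return (task.w / task.p) * math.exp(-slack / (k * p_bar))


    def atc_schedule(tasks: list[GridTask]) -> list[str]:
        """Greedy single-machine sequence: repeatedly run the highest-index task."""
        remaining, t, order = list(tasks), 0.0, []
        p_bar = sum(x.p for x in tasks) / len(tasks)
        while remaining:
            nxt = max(remaining, key=lambda x: atc_index(x, t, p_bar))
            order.append(nxt.name)
            t += nxt.p
            remaining.remove(nxt)
        return order


    if __name__ == "__main__":
        jobs = [GridTask("A", 4, 10, 1), GridTask("B", 2, 6, 3), GridTask("C", 5, 8, 2)]
        print("ATC order:", atc_schedule(jobs))
    ```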

  20. Grid Integration Studies: Data Requirements, Greening the Grid

    SciTech Connect

    Katz, Jessica

    2015-06-01

    A grid integration study is an analytical framework used to evaluate a power system with high penetration levels of variable renewable energy (VRE). A grid integration study simulates the operation of the power system under different VRE scenarios, identifying reliability constraints and evaluating the cost of actions to alleviate those constraints. These VRE scenarios establish where, how much, and over what timeframe to build generation and transmission capacity, ideally capturing the spatial diversity benefits of wind and solar resources. The results help build confidence among policymakers, system operators, and investors to move forward with plans to increase the amount of VRE on the grid.

  1. Single grid accelerator for an ion thrustor

    NASA Technical Reports Server (NTRS)

    Margosian, P. M.; Nakanishi, S. (Inventor)

    1973-01-01

    A single grid accelerator system for an ion thrustor is discussed. A layer of dielectric material is interposed between the metal grid and the chamber containing the ionized propellant to protect the grid against sputtering erosion.

  2. Optimizing solar-cell grid geometry

    NASA Technical Reports Server (NTRS)

    Crossley, A. P.

    1969-01-01

    Trade-off analysis and mathematical expressions calculate optimum grid geometry in terms of various cell parameters. Determination of the grid geometry provides proper balance between grid resistance and cell output to optimize the energy conversion process.

  3. SimpleVisGrid: grid services for visualization of diverse biomedical knowledge and molecular systems data.

    PubMed

    Stokes, Todd H; Wang, May D

    2009-01-01

    Biomedical data visualization is a great challenge due to the scale, complexity, and diversity of systems, system component interactions and experimental data. Standards for interoperable data are a good start to addressing these problems, but standardization of visualization technologies is an emerging topic. SimpleVisGrid builds on Cancer Biomedical Informatics Grid (caBIG) common infrastructure for cancer research, and clearly specifies and extends three standard data formats for inputs and outputs to grid services: comma-separated values (CSV), Portable Network Graphics (PNG), and Scalable Vector Graphics (SVG). Four prototype visualizations are available: 2D array data quality visualization, correlation heatmaps between high-dimensional data and associated meta-data, feature landscapes, and biochemical or semantic network graphs. The services and data model are prepared for submission for caBIG Silver-level compatibility review and for integration into automated research workflows. Making these tools available to caBIG developers and ultimately to biomedical researchers can (1) help with biomedical communication, discovery, and decision-making, (2) encourage more research on standardization of visualization formats, and (3) improve the efficiency of large data transfers across the grid. PMID:19964624

  4. SimpleVisGrid: Grid Services for Visualization of Diverse Biomedical Knowledge and Molecular Systems Data

    PubMed Central

    Stokes, Todd H.; Wang, May D.

    2016-01-01

    Biomedical data visualization is a great challenge due to the scale, complexity, and diversity of systems, system component interactions and experimental data. Standards for interoperable data are a good start to addressing these problems, but standardization of visualization technologies is an emerging topic. SimpleVisGrid builds on Cancer Biomedical Informatics Grid (caBIG) common infrastructure for cancer research, and clearly specifies and extends three standard data formats for inputs and outputs to grid services: comma-separated values (CSV), Portable Network Graphics (PNG), and Scalable Vector Graphics (SVG). Four prototype visualizations are available: 2D array data quality visualization, correlation heatmaps between high-dimensional data and associated meta-data, feature landscapes, and biochemical or semantic network graphs. The services and data model are prepared for submission for caBIG Silver-level compatibility review and for integration into automated research workflows. Making these tools available to caBIG developers and ultimately to biomedical researchers can (1) help with biomedical communication, discovery, and decision-making, (2) encourage more research on standardization of visualization formats, and (3) improve the efficiency of large data transfers across the grid. PMID:19964624

  5. Aeroacoustic Simulation of Nose Landing Gear on Adaptive Unstructured Grids With FUN3D

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Khorrami, Mehdi R.; Park, Michael A.; Lockhard, David P.

    2013-01-01

    Numerical simulations have been performed for a partially-dressed, cavity-closed nose landing gear configuration that was tested in NASA Langley's closed-wall Basic Aerodynamic Research Tunnel (BART) and in the University of Florida's open-jet acoustic facility known as the UFAFF. The unstructured-grid flow solver FUN3D, developed at NASA Langley Research Center, is used to compute the unsteady flow field for this configuration. Starting with a coarse grid, a series of successively finer grids were generated using the adaptive gridding methodology available in the FUN3D code. A hybrid Reynolds-averaged Navier-Stokes/large eddy simulation (RANS/LES) turbulence model is used for these computations. Time-averaged and instantaneous solutions obtained on these grids are compared with the measured data. In general, the correlation with the experimental data improves with grid refinement. A similar trend is observed for sound pressure levels obtained by using these CFD solutions as input to a Ffowcs Williams-Hawkings noise propagation code to compute the farfield noise levels. In general, the numerical solutions obtained on adapted grids compare well with the hand-tuned enriched fine grid solutions and experimental data. In addition, the grid adaption strategy discussed here simplifies the grid generation process, and results in improved computational efficiency of CFD simulations.

  6. Random grid fern for visual tracking

    NASA Astrophysics Data System (ADS)

    Cheng, Fei; Liu, Kai; Zhang, Jin; Li, YunSong

    2014-05-01

    Visual tracking is one of the significant research directions in computer vision. Although the standard random ferns tracking method obtains good performance from the random spatial arrangement of its binary tests, the effect of image locality on the ferns' descriptive ability is ignored, which prevents them from describing the object accurately and robustly. This paper proposes a novel spatial arrangement of binary tests that divides the bounding box into grids in order to keep more details of the image for visual tracking. Experimental results show that this method can improve tracking accuracy effectively.
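
    As a rough illustration of the grid-constrained binary tests described above, the Python sketch below divides an image patch into a regular grid and forms each binary test from a comparison between two grid cells, so that local image structure is retained in the fern code. The cell pairing, fern length, and use of cell means are assumptions made for illustration and are not the paper's exact construction.

        # Toy "grid fern": split the patch into an n x n grid and let each binary
        # test compare the mean intensities of two randomly paired grid cells; the
        # resulting bit string indexes the fern.
        import numpy as np

        rng = np.random.default_rng(0)

        def make_tests(n_cells, n_tests):
            """Random pairs of distinct grid-cell indices, one pair per binary test."""
            return [tuple(rng.choice(n_cells * n_cells, size=2, replace=False))
                    for _ in range(n_tests)]

        def grid_fern_code(patch, n_cells, tests):
            h, w = patch.shape
            rows = np.array_split(np.arange(h), n_cells)
            cols = np.array_split(np.arange(w), n_cells)
            means = np.array([[patch[np.ix_(r, c)].mean() for c in cols] for r in rows]).ravel()
            code = 0
            for a, b in tests:                     # each comparison contributes one bit
                code = (code << 1) | int(means[a] > means[b])
            return code

        if __name__ == "__main__":
            patch = rng.integers(0, 256, size=(32, 32)).astype(float)
            tests = make_tests(n_cells=4, n_tests=8)
            print(grid_fern_code(patch, 4, tests))  # integer fern index in [0, 255]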

  7. National Smart Water Grid

    SciTech Connect

    Beaulieu, R A

    2009-07-13

    The United States repeatedly experiences floods along the Midwest's large rivers and droughts in the arid Western States that cause traumatic environmental conditions with huge economic impact. With an integrated approach and solution these problems can be alleviated. Tapping into the Mississippi River and its tributaries, the world's third largest fresh water river system, during flood events will mitigate the damage of flooding and provide a new source of fresh water to the Western States. The trend of increased flooding on the Midwest's large rivers is supported by a growing body of scientific literature. The Colorado River Basin and the western states are experiencing a protracted multi-year drought. Fresh water can be pumped via pipelines from areas of overabundance/flood to areas of drought or high demand. Calculations document that 10 to 60 million acre-feet (maf) of fresh water per flood event can be captured from the Midwest's rivers and pumped via pipelines to the Colorado River and introduced upstream of Lake Powell, Utah, to destinations near Denver, Colorado, and used in areas along the pipelines. Water users of the Colorado River include the cities in southern Nevada, southern California, northern Arizona, Colorado, Utah, Indian Tribes, and Mexico. The proposed start and end points, and routes of the pipelines are documented, including information on rights-of-way necessary for state and federal permits. A National Smart Water Grid™ (NSWG) Project will create thousands of new jobs for construction, operation, and maintenance and save billions in drought and flood damage reparations tax dollars. The socio-economic benefits of NSWG include decreased flooding in the Midwest; increased agriculture, recreation, and tourism; improved national security, transportation, and fishery and wildlife habitats; mitigated regional climate change and global warming such as increased carbon capture; decreased salinity in Colorado River water crossing the US

  8. D. Carlos de Bragança, a Pioneer of Experimental Marine Oceanography: Filling the Gap Between Formal and Informal Science Education

    NASA Astrophysics Data System (ADS)

    Faria, Cláudia; Pereira, Gonçalo; Chagas, Isabel

    2012-06-01

    The activities presented in this paper are part of a wider project that investigates the effects of infusing the history of science into science teaching on students' learning and attitudes. Focused on the work of D. Carlos de Bragança, King of Portugal from 1889 to 1908, and a pioneer oceanographer, the activities are aimed at the secondary Biology curriculum (grade 10, ages 15-16). The proposed activities include a pre-visit orientation task, two workshops performed in a science museum and a follow-up learning task. In class, students have to analyse original historical excerpts of the king's work, in order to discuss and reflect on the nature of science. In the museum, students actively participate in two workshops: biological classification and specimen drawing. All students considered the project relevant for science learning, stating that it was important not only for knowledge acquisition but also for the understanding of the nature of science. As a final remark we stress the importance of creating activities informed by the history of science as a foundation for improving motivation, sustaining effective science teaching and meaningful science learning, and as a vehicle to promote a closer partnership between schools and science museums.

  9. High energy collimating fine grids

    NASA Technical Reports Server (NTRS)

    Arrieta, Victor M.; Tuffias, Robert H.; Laferla, Raffaele

    1995-01-01

    The objective of this project was to demonstrate the fabrication of extremely tight tolerance collimating grids using a high-Z material, specifically tungsten. The approach taken was to fabricate grids by a replication method involving the coating of a silicon grid substrate with tungsten by chemical vapor deposition (CVD). A negative of the desired grid structure was fabricated in silicon using high-precision wafering techniques developed for the semiconductor industry and capable of producing the required tolerances. Using diamond wafering blades, a network of accurately spaced slots was machined into a single-crystal silicon surface. These slots were then filled with tungsten by CVD, via the hydrogen reduction of tungsten hexafluoride. Following tungsten deposition, the silicon negative was etched away to leave the tungsten collimating grid structure. The project was divided into five tasks: (1) identify materials of construction for the replica and final collimating grid structures; (2) identify and implement a micromachining technique for manufacturing the negative collimator replicas (performed by NASA/JPL); (3) develop a CVD technique and processing parameters suitable for the complete tungsten densification of the collimator replicas; (4) develop a chemical etching technique for the removal of the collimator replicas after the tungsten deposition process; and (5) fabricate and deliver tungsten collimating grid specimens.

  10. GridOPTICS Software System

    SciTech Connect

    Akyol, Bora A; Ciraci, PNNL Selim; Gibson, PNNL Tara; Rice, PNNL Mark; Sharma, PNNL Poorva; Yin, PNNL Jian; Allwardt, PNNL Craig; PNNL,

    2014-02-24

    GridOPTICS Software System (GOSS) is a middleware that facilitates creation of new, modular and flexible operational and planning platforms that can meet the challenges of the next generation power grid. GOSS enables Department of Energy, power system utilities, and vendors to build better tools faster. GOSS makes it possible to integrate Future Power Grid Initiative software products/prototypes into existing power grid software systems, including the PNNL PowerNet and EIOC environments. GOSS is designed to allow power grid applications developed for different underlying software platforms installed in different utilities to communicate with ease. This can be done in compliance with existing security and data sharing policies between the utilities. GOSS not only supports one-to-one data transfer between applications, but also a publisher/subscriber scheme. To support the interoperability requirements of future EMS, GOSS is designed for CIM compliance. In addition to this, it supports authentication and authorization capabilities to protect the system from cyber threats. In summary, the contributions of the GOSS middleware are as follows: • A platform to support future EMS development. • A middleware that promotes interoperability between power grid applications. • A distributed architecture that separates data sources from power grid applications. • Support for data exchange with either one-to-one or publisher/subscriber interfaces. • An authentication and authorization scheme for limiting the access to data between utilities.
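
    To make the publisher/subscriber exchange pattern concrete, here is a minimal, self-contained toy broker in Python. It illustrates the pattern only and is not the GOSS API; the topic and message contents are invented for the example.

        # Toy publish/subscribe broker illustrating the data-exchange pattern:
        # applications register callbacks on topics; a publisher pushes a message
        # to a topic and every subscriber's callback is invoked.
        from collections import defaultdict

        class Broker:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, topic, callback):
                self._subscribers[topic].append(callback)

            def publish(self, topic, message):
                for callback in self._subscribers[topic]:
                    callback(message)

        if __name__ == "__main__":
            broker = Broker()
            broker.subscribe("pmu/measurements", lambda m: print("state estimator got", m))
            broker.subscribe("pmu/measurements", lambda m: print("archiver got", m))
            broker.publish("pmu/measurements", {"bus": 12, "voltage_pu": 1.02})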

  11. A Java commodity grid kit.

    SciTech Connect

    von Laszewski, G.; Foster, I.; Gawor, J.; Lane, P.; Mathematics and Computer Science

    2001-07-01

    In this paper we report on the features of the Java Commodity Grid Kit. The Java CoG Kit provides middleware for accessing Grid functionality from the Java framework. Java CoG Kit middleware is general enough to support the design of a variety of advanced Grid applications with quite different user requirements. Access to the Grid is established via Globus protocols, allowing the Java CoG Kit to communicate also with the C Globus reference implementation. Thus, the Java CoG Kit provides Grid developers with the ability to utilize the Grid, as well as numerous additional libraries and frameworks developed by the Java community to enable network, Internet, enterprise, and peer-to-peer computing. A variety of projects have successfully used the client libraries of the Java CoG Kit to access Grids driven by the C Globus software. In this paper we also report on the efforts to develop server-side Java CoG Kit components. As part of this research we have implemented a prototype pure Java resource management system that enables one to run Globus jobs on platforms on which a Java virtual machine is supported, including Windows NT machines.

  12. GridOPTICS Software System

    Energy Science and Technology Software Center (ESTSC)

    2014-02-24

    GridOPTICS Software System (GOSS) is a middleware that facilitates creation of new, modular and flexible operational and planning platforms that can meet the challenges of the next generation power grid. GOSS enables Department of Energy, power system utilities, and vendors to build better tools faster. GOSS makes it possible to integrate Future Power Grid Initiative software products/prototypes into existing power grid software systems, including the PNNL PowerNet and EIOC environments. GOSS is designed to allow power grid applications developed for different underlying software platforms installed in different utilities to communicate with ease. This can be done in compliance with existing security and data sharing policies between the utilities. GOSS not only supports one-to-one data transfer between applications, but also a publisher/subscriber scheme. To support the interoperability requirements of future EMS, GOSS is designed for CIM compliance. In addition to this, it supports authentication and authorization capabilities to protect the system from cyber threats. In summary, the contributions of the GOSS middleware are as follows: • A platform to support future EMS development. • A middleware that promotes interoperability between power grid applications. • A distributed architecture that separates data sources from power grid applications. • Support for data exchange with either one-to-one or publisher/subscriber interfaces. • An authentication and authorization scheme for limiting the access to data between utilities.

  13. Implicit large eddy simulation of a scalar mixing layer in fractal grid turbulence

    NASA Astrophysics Data System (ADS)

    Watanabe, Tomoaki; Sakai, Yasuhiko; Nagata, Kouji; Ito, Yasumasa; Hayase, Toshiyuki

    2016-07-01

    A scalar mixing layer in fractal grid turbulence is simulated by implicit large eddy simulation (ILES) using low-pass filtering as an implicit subgrid-scale model. The square-type fractal grid with three fractal iterations is used for generating turbulence. The streamwise evolutions of the streamwise velocity statistics obtained in the ILES are in good agreement with the experimental results. The ILES results are used for investigating the development of the scalar mixing layer behind the fractal grid. The results show that the vertical development of the scalar mixing layer strongly depends on the spanwise location. Near the fractal grid, the scalar mixing layer rapidly develops just behind the largest grid bars owing to the vertical turbulent transport. The scalar mixing layer near the fractal grid also develops outside the largest grid bars because the scalar is transported between the outside and back of the largest grid bars by the spanwise turbulent transport. In the downstream region, the scalar mixing layer develops more rapidly near the grid centerline by the vertical turbulent transport and by the spanwise one, which transports the scalar between the back of the largest grid bars and both the centerline and outer edge of the fractal grid. Then, the mean scalar profile becomes close to homogeneous in the spanwise direction.

  14. Grid Visualization Tool

    NASA Technical Reports Server (NTRS)

    Chouinard, Caroline; Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steven

    2005-01-01

    The Grid Visualization Tool (GVT) is a computer program for displaying the path of a mobile robotic explorer (rover) on a terrain map. The GVT reads a map-data file in either portable graymap (PGM) or portable pixmap (PPM) format, representing a gray-scale or color map image, respectively. The GVT also accepts input from path-planning and activity-planning software. From these inputs, the GVT generates a map overlaid with one or more rover path(s), waypoints, locations of targets to be explored, and/or target-status information (indicating success or failure in exploring each target). The display can also indicate different types of paths or path segments, such as the path actually traveled versus a planned path or the path traveled to the present position versus planned future movement along a path. The program provides for updating of the display in real time to facilitate visualization of progress. The size of the display and the map scale can be changed as desired by the user. The GVT was written in the C++ language using the Open Graphics Library (OpenGL) software. It has been compiled for both Sun Solaris and Linux operating systems.
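
    As a rough stand-in for the kind of display GVT produces, the Python sketch below writes and re-reads a grayscale terrain map in PGM format and overlays a planned path, the traveled path, and target locations. GVT itself is a C++/OpenGL tool; the file name, coordinates, and plotting library here are illustrative only.

        # Illustrative map-plus-overlay display: a synthetic PGM terrain map is
        # written, re-read, and shown with planned path, traveled path, and targets.
        import numpy as np
        import matplotlib.pyplot as plt
        from PIL import Image

        # Create a placeholder grayscale terrain map and save it in PGM format.
        terrain = (np.random.rand(200, 200) * 255).astype(np.uint8)
        Image.fromarray(terrain, mode="L").save("terrain_map.pgm")

        planned = [(20, 30), (60, 45), (110, 80), (150, 130)]   # (x, y) waypoints
        traveled = planned[:2]                                   # progress so far
        targets = [(110, 80), (150, 130)]

        plt.imshow(Image.open("terrain_map.pgm"), cmap="gray")
        plt.plot(*zip(*planned), "y--", label="planned path")
        plt.plot(*zip(*traveled), "r-", label="traveled path")
        plt.scatter(*zip(*targets), marker="x", label="targets")
        plt.legend()
        plt.show()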

  15. Symbolic Constraint Maintenance Grid

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    Version 3.1 of Symbolic Constraint Maintenance Grid (SCMG) is a software system that provides a general conceptual framework for utilizing pre-existing programming techniques to perform symbolic transformations of data. SCMG also provides a language (and an associated communication method and protocol) for representing constraints on the original non-symbolic data. SCMG provides a facility for exchanging information between numeric and symbolic components without knowing the details of the components themselves. In essence, it integrates symbolic software tools (for diagnosis, prognosis, and planning) with non-artificial-intelligence software. SCMG executes a process of symbolic summarization and monitoring of continuous time series data that are being abstractly represented as symbolic templates of information exchange. This summarization process enables such symbolic-reasoning computing systems as artificial-intelligence planning systems to evaluate the significance and effects of channels of data more efficiently than would otherwise be possible. As a result of the increased efficiency in representation, reasoning software can monitor more channels and is thus able to perform monitoring and control functions more effectively.

  16. National transmission grid study

    SciTech Connect

    Abraham, Spencer

    2003-05-31

    The National Energy Policy Plan directed the U.S. Department of Energy (DOE) to conduct a study to examine the benefits of establishing a national electricity transmission grid and to identify transmission bottlenecks and measures to address them. DOE began by conducting an independent analysis of U.S. electricity markets and identifying transmission system bottlenecks using DOE’s Policy Office Electricity Modeling System (POEMS). DOE’s analysis, presented in Section 2, confirms the central role of the nation’s transmission system in lowering costs to consumers through increased trade. More importantly, DOE’s analysis also confirms the results of previous studies, which show that transmission bottlenecks and related transmission system market practices are adding hundreds of millions of dollars to consumers’ electricity bills each year. A more detailed technical overview of the use of POEMS is provided in Appendix A. DOE led an extensive, open, public input process and heard a wide range of comments and recommendations that have all been considered. More than 150 participants registered for three public workshops held in Detroit, MI (September 24, 2001); Atlanta, GA (September 26, 2001); and Phoenix, AZ (September 28, 2001).

  17. Buildings-to-Grid Technical Opportunities: From the Grid Perspective

    SciTech Connect

    Kropski, Ben; Pratt, Rob

    2014-03-28

    This paper outlines the nature of the power grid, lists challenges and barriers to the implementation of a transactive energy ecosystem, and provides concept solutions to current technological impediments.

  18. Redirecting science

    SciTech Connect

    Aaserud, F.

    1990-01-01

    This book contains the following chapters: Science policy and fund-raising up to 1934; The Copenhagen spirit at work, late 1920s to mid-1930s; The refugee problem, 1933 to 1935; Experimental biology, late 1920s to 1935; and Consolidation of the transition, 1935 to 1940.

  19. Simulation of an Isolated Tiltrotor in Hover with an Unstructured Overset-Grid RANS Solver

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, Elizabeth M.; Biedron, Robert T.

    2009-01-01

    An unstructured overset-grid Reynolds-Averaged Navier-Stokes (RANS) solver, FUN3D, is used to simulate an isolated tiltrotor in hover. An overview of the computational method is presented as well as the details of the overset-grid systems. Steady-state computations within a noninertial reference frame define the performance trends of the rotor across a range of the experimental collective settings. Results are presented to show the effects of off-body grid refinement and blade grid refinement. The computed performance and blade loading trends show good agreement with experimental results and previously published structured overset-grid computations. Off-body flow features indicate a significant improvement in the resolution of the first perpendicular blade vortex interaction with background grid refinement across the collective range. Considering experimental data uncertainty and effects of transition, the prediction of figure of merit on the baseline and refined grid is reasonable at the higher collective range, within 3 percent of the measured values. At the lower collective settings, the computed figure of merit is approximately 6 percent lower than the experimental data. A comparison of steady and unsteady results shows that, with temporal refinement, the dynamic results closely match the steady-state noninertial results, which gives confidence in the accuracy of the dynamic overset-grid approach.

  20. Charting the collision between a seed coat fragment and newly-designed lint cleaner grid bars

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An experiment was run to determine how a seed coat fragment (SCF) reacts after colliding with newly-designed grid bars mounted on a saw-type lint cleaner simulator. A high-speed video camera recorded the action that took place. Ten experimental grid bars were tested. The included angle of the sha...

  1. Skeletal muscle grids for assessing current distributions from defibrillation shocks.

    PubMed

    Schmidt, J; Gatlin, B; Eason, J; Koomullil, G; Pilkington, T

    1992-01-01

    This paper utilizes a structured and an unstructured grid representation of a torso with an anisotropic skeletal muscle to assess current distributions from defibrillation shocks. The results show that a finite-element solution on an unstructured grid of 400,000 elements (60,000 nodes) achieves comparable current distributions with a finite-difference solution on a structured grid that uses approximately the same number of nodes. Moreover, a finite-element solution on a 65,000-element (10,500 nodes) unstructured grid yielded fractional percent current results within 5% of the finer grids. The structured and unstructured grid models are used to investigate recent interpretations of experimental data that concluded that more than 80% of the total defibrillation current is shunted by the anisotropic skeletal muscle thoracic cage. It is concluded that these interpretations, which were based on a one-dimensional resistive network representation of the three-dimensional defibrillation situation, overestimate by 25% the current shunted by the anisotropic thoracic cage. PMID:1424684

  2. ASCR Science Network Requirements

    SciTech Connect

    Dart, Eli; Tierney, Brian

    2009-08-24

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In April 2009 ESnet and the Office of Advanced Scientific Computing Research (ASCR), of the DOE Office of Science, organized a workshop to characterize the networking requirements of the programs funded by ASCR. The ASCR facilities anticipate significant increases in wide area bandwidth utilization, driven largely by the increased capabilities of computational resources and the wide scope of collaboration that is a hallmark of modern science. Many scientists move data sets between facilities for analysis, and in some cases (for example the Earth System Grid and the Open Science Grid), data distribution is an essential component of the use of ASCR facilities by scientists. Due to the projected growth in wide area data transfer needs, the ASCR supercomputer centers all expect to deploy and use 100 Gigabit per second networking technology for wide area connectivity as soon as that deployment is financially feasible. In addition to the network connectivity that ESnet provides, the ESnet Collaboration Services (ECS) are critical to several science communities. ESnet identity and trust services, such as the DOEGrids certificate authority, are widely used both by the supercomputer centers and by collaborations such as Open Science Grid (OSG) and the Earth System Grid (ESG). Ease of use is a key determinant of the scientific utility of network-based services. Therefore, a key enabling aspect for scientists beneficial use of high

  3. 3D laser inspection of fuel assembly grid spacers for nuclear reactors based on diffractive optical elements

    NASA Astrophysics Data System (ADS)

    Finogenov, L. V.; Lemeshko, Yu A.; Zav'yalov, P. S.; Chugui, Yu V.

    2007-06-01

    Ensuring the safety and high operational reliability of nuclear reactors requires 100% inspection of the geometrical parameters of fuel assemblies, which include grid spacers formed as a cellular structure holding the fuel elements. The required grid spacer geometry in the transverse and longitudinal cross sections is extremely important for maintaining the necessary thermal regime. A universal method for 3D grid spacer inspection using a diffractive optical element (DOE), which generates, as structured illumination, a multiple-ring pattern on the inner surface of a grid spacer cell, is investigated. With a small set of DOEs, the full range of produced grids can be inspected. A special objective lens has been developed for imaging the inner surface of a cell. The problems of diffractive element synthesis, projection optics calculation, and alignment methods, as well as calibration of the experimental measuring system, are considered. Image processing algorithms for the different structural elements of the grids (cell, channel hole, outer grid spacer rim) and the experimental results are presented.

  4. Smart Wire Grid: Resisting Expectations

    SciTech Connect

    Ramsay, Stewart; Lowe, DeJim

    2014-03-03

    Smart Wire Grid's DSR technology (Discrete Series Reactor) can be quickly deployed on electrical transmission lines to create intelligent mesh networks capable of quickly rerouting electricity to get power where and when it's needed the most. With their recent ARPA-E funding, Smart Wire Grid has been able to move from prototype and field testing to building out a US manufacturing operation in just under a year.

  5. Smart Wire Grid: Resisting Expectations

    ScienceCinema

    Ramsay, Stewart; Lowe, DeJim

    2014-04-09

    Smart Wire Grid's DSR technology (Discrete Series Reactor) can be quickly deployed on electrical transmission lines to create intelligent mesh networks capable of quickly rerouting electricity to get power where and when it's needed the most. With their recent ARPA-E funding, Smart Wire Grid has been able to move from prototype and field testing to building out a US manufacturing operation in just under a year.

  6. Parallel Power Grid Simulation Toolkit

    SciTech Connect

    Smith, Steve; Kelley, Brian; Banks, Lawrence; Top, Philip; Woodward, Carol

    2015-09-14

    ParGrid is a 'wrapper' that integrates a coupled Power Grid Simulation toolkit consisting of a library to manage the synchronization and communication of independent simulations. The library included in ParGrid, named FSKIT, is intended to support the coupling of multiple continuous and discrete-event parallel simulations. The code is designed using modern object-oriented C++ methods, utilizing C++11 and current Boost libraries to ensure compatibility with multiple operating systems and environments.

  7. Discretization formulas for unstructured grids

    NASA Technical Reports Server (NTRS)

    Baumeister, Kenneth J.

    1988-01-01

    The Galerkin weighted residual technique using linear triangular weight functions is employed to develop finite difference formulas in Cartesian coordinates for the Laplacian operator, first-derivative operators, and the function itself on unstructured triangular grids. The weighted residual coefficients associated with the weak formulation of the Laplacian operator are shown to agree with the Taylor series approach on a global average. In addition, a simple algorithm is presented to determine the Voronoi (finite difference) area of an unstructured grid.
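
    As a rough illustration of the kind of nodal area computation referred to above, the sketch below assigns each node one third of the area of every triangle incident on it, a barycentric-dual approximation commonly used as the finite-difference area on triangular grids. This is an assumed simplification for illustration, not necessarily the specific algorithm presented in the report.

        # Assign each node of an unstructured triangular grid a control (dual) area
        # equal to one third of the area of every incident triangle (barycentric dual).
        import numpy as np

        def nodal_areas(points, triangles):
            """points: (n, 2) array of x,y coordinates; triangles: (m, 3) node indices."""
            areas = np.zeros(len(points))
            for a, b, c in triangles:
                (xa, ya), (xb, yb), (xc, yc) = points[a], points[b], points[c]
                tri_area = 0.5 * abs((xb - xa) * (yc - ya) - (yb - ya) * (xc - xa))
                for node in (a, b, c):
                    areas[node] += tri_area / 3.0
            return areas

        if __name__ == "__main__":
            pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
            tris = np.array([[0, 1, 2], [1, 3, 2]])
            print(nodal_areas(pts, tris))   # the four nodal areas sum to the unit square's area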

  8. Reinventing Batteries for Grid Storage

    ScienceCinema

    Banerjee, Sanjoy

    2013-05-29

    The City University of New York's Energy Institute, with the help of ARPA-E funding, is creating safe, low cost, rechargeable, long lifecycle batteries that could be used as modular distributed storage for the electrical grid. The batteries could be used at the building level or the utility level to offer benefits such as capture of renewable energy, peak shaving and microgridding, for a safer, cheaper, and more secure electrical grid.

  9. Multi baseline Grid Software Correlator

    NASA Astrophysics Data System (ADS)

    Moritaka, Kimura; Nakajima, Junichi; Kondo, Tetsuro

    Software VLBI correlation is regarded as a solution for next-generation VLBI. The flexibility of software correlation allows scientists to perform the appropriate scientific correlations as well as the post-processing. As a first experiment in handling Gbps VLBI data, a multi-baseline Grid correlator has been under development at CRL. By adopting multiple CPUs, SIMD architectures, and Grid computing technology, the software correlator has nearly reached the performance of a hardware correlator.

  10. Reinventing Batteries for Grid Storage

    SciTech Connect

    Banerjee, Sanjoy

    2012-01-01

    The City University of New York's Energy Institute, with the help of ARPA-E funding, is creating safe, low cost, rechargeable, long lifecycle batteries that could be used as modular distributed storage for the electrical grid. The batteries could be used at the building level or the utility level to offer benefits such as capture of renewable energy, peak shaving and microgridding, for a safer, cheaper, and more secure electrical grid.

  11. An Experimental Evaluation of the Effectiveness of the Biological Sciences Curriculum Study Special Materials Approach to Teaching Biology to the Slow Learner.

    ERIC Educational Resources Information Center

    Welford, John Mack

    Students (comparable in intelligence and ability) in slow-learning classes using either "Biological Sciences Curriculum Study (BSCS) Special Materials" or some other slow-learner biology materials, were compared on the basis of scores on the "Nelson Biology Test", the "Biological Sciences; Patterns and Processes Final Examination", and two short…

  12. Improving scientists' interaction with complex computational-visualization environments based on a distributed grid infrastructure.

    PubMed

    Kalawsky, R S; O'Brien, J; Coveney, P V

    2005-08-15

    The grid has the potential to transform collaborative scientific investigations through the use of closely coupled computational and visualization resources, which may be geographically distributed, in order to harness greater power than is available at a single site. Scientific applications to benefit from the grid include visualization, computational science, environmental modelling and medical imaging. Unfortunately, the diversity, scale and location of the required resources can present a dilemma for the scientific worker because of the complexity of the underlying technology. As the scale of the scientific problem under investigation increases so does the nature of the scientist's interaction with the supporting infrastructure. The increased distribution of people and resources within a grid-based environment can make resource sharing and collaborative interaction a critical factor to their success. Unless the technological barriers affecting user accessibility are reduced, there is a danger that the only scientists to benefit will be those with reasonably high levels of computer literacy. This paper examines a number of important human factors of user interaction with the grid and expresses this in the context of the science undertaken by RealityGrid, a project funded by the UK e-Science programme. Critical user interaction issues will also be highlighted by comparing grid computational steering with supervisory control systems for local and remote access to the scientific environment. Finally, implications for future grid developers will be discussed with a particular emphasis on how to improve the scientists' access to what will be an increasingly important resource. PMID:16099754

  13. Sensitivity of 30-cm mercury bombardment ion thruster characteristics to accelerator grid design

    NASA Technical Reports Server (NTRS)

    Rawlin, V. K.

    1978-01-01

    The design of ion optics for bombardment thrusters strongly influences overall performance and lifetime. The operation of a 30-cm thruster with accelerator grid open area fractions ranging from 43 to 24 percent, was evaluated and compared with previously published experimental and theoretical results. Ion optics properties measured included the beam current extraction capability, the minimum accelerator grid voltage to prevent backstreaming, ion beamlet diameter as a function of radial position on the grid and accelerator grid hole diameter, and the high energy, high angle ion beam edge location. Discharge chamber properties evaluated were propellant utilization efficiency, minimum discharge power per beam amp, and minimum discharge voltage.

  14. Sensitivity of 30-cm mercury bombardment ion thruster characteristics to accelerator grid design

    NASA Technical Reports Server (NTRS)

    Rawlin, V. K.

    1978-01-01

    The design of ion optics for bombardment thrusters strongly influences overall performance and lifetime. The operation of a 30 cm thruster with accelerator grid open area fractions ranging from 43 to 24 percent, was evaluated and compared with experimental and theoretical results. Ion optics properties measured included the beam current extraction capability, the minimum accelerator grid voltage to prevent backstreaming, ion beamlet diameter as a function of radial position on the grid and accelerator grid hole diameter, and the high energy, high angle ion beam edge location. Discharge chamber properties evaluated were propellant utilization efficiency, minimum discharge power per beam amp, and minimum discharge voltage.

  15. CDF GlideinWMS usage in Grid computing of high energy physics

    NASA Astrophysics Data System (ADS)

    Zvada, Marian; Benjamin, Doug; Sfiligoi, Igor

    2010-04-01

    Many members of large science collaborations already have specialized grids available to advance their research, driven by the need for more computing resources for data analysis. This need has forced the Collider Detector at Fermilab (CDF) collaboration to move beyond the usage of dedicated resources and start exploiting Grid resources. Nowadays, the CDF experiment increasingly relies on glidein-based computing pools for data reconstruction, Monte Carlo production, and user data analysis, serving over 400 users through the central analysis farm middleware (CAF) on top of the Condor batch system and the CDF Grid infrastructure. Condor is designed as a distributed architecture, and its glidein mechanism of pilot jobs is ideal for abstracting Grid computing by creating a virtual private computing pool. We present the first production use of the generic pilot-based Workload Management System (glideinWMS), an implementation of the pilot mechanism based on the Condor distributed infrastructure. CDF Grid computing uses glideinWMS for data reconstruction on the FNAL campus Grid and for user analysis and Monte Carlo production across the Open Science Grid (OSG). We review this computing model and setup, including the CDF-specific configuration within the glideinWMS system, which provides powerful scalability and makes Grid computing work like a local batch environment, with the ability to handle more than 10,000 running jobs at a time.

  16. Grid-Enabled High Energy Physics Research using a Beowulf Cluster

    NASA Astrophysics Data System (ADS)

    Mahmood, Akhtar

    2005-04-01

    At Edinboro University of Pennsylvania, we have built an 8-node, 25-Gflops Beowulf Cluster with 2.5 TB of disk storage space to carry out grid-enabled, data-intensive high energy physics research for the ATLAS experiment via Grid3. We will describe how we built and configured our Cluster, which we have named the Sphinx Beowulf Cluster. We will describe the results of our cluster benchmark studies and the run-time plots of several parallel application codes. Once fully functional, the Cluster will be part of Grid3 [www.ivdgl.org/grid3]. The current ATLAS simulation grid application models the entire physical process, from the proton-antiproton collisions and the detector's response to the collision debris, through the complete reconstruction of the event from analyses of these responses. The end result is a detailed set of data that simulates the real physical collision event inside a particle detector. The Grid is the new IT infrastructure for 21st-century science -- a new computing paradigm that is poised to transform the practice of large-scale data-intensive research in science and engineering. The Grid will allow scientists worldwide to view and analyze huge amounts of data flowing from the large-scale experiments in High Energy Physics. The Grid is expected to bring together geographically and organizationally dispersed computational resources, such as CPUs, storage systems, communication systems, and data sources.

  17. CDF GlideinWMS usage in grid computing of high energy physics

    SciTech Connect

    Zvada, Marian; Benjamin, Doug; Sfiligoi, Igor; /Fermilab

    2010-01-01

    Many members of large science collaborations already have specialized grids available to advance their research, driven by the need for more computing resources for data analysis. This need has forced the Collider Detector at Fermilab (CDF) collaboration to move beyond the usage of dedicated resources and start exploiting Grid resources. Nowadays, the CDF experiment increasingly relies on glidein-based computing pools for data reconstruction, Monte Carlo production, and user data analysis, serving over 400 users through the central analysis farm middleware (CAF) on top of the Condor batch system and the CDF Grid infrastructure. Condor is designed as a distributed architecture, and its glidein mechanism of pilot jobs is ideal for abstracting Grid computing by creating a virtual private computing pool. We present the first production use of the generic pilot-based Workload Management System (glideinWMS), an implementation of the pilot mechanism based on the Condor distributed infrastructure. CDF Grid computing uses glideinWMS for data reconstruction on the FNAL campus Grid and for user analysis and Monte Carlo production across the Open Science Grid (OSG). We review this computing model and setup, including the CDF-specific configuration within the glideinWMS system, which provides powerful scalability and makes Grid computing work like a local batch environment, with the ability to handle more than 10,000 running jobs at a time.

  18. Grid-Optimization Program for Photovoltaic Cells

    NASA Technical Reports Server (NTRS)

    Daniel, R. E.; Lee, T. S.

    1986-01-01

    The CELLOPT program was developed to assist in designing the grid pattern of current-conducting material on a photovoltaic cell. It analyzes the parasitic resistance losses and the shadow loss associated with a metallized grid pattern on both round and rectangular solar cells. Though it can perform sensitivity studies, it is used primarily to optimize the grid design in terms of bus bars and grid lines by minimizing power loss. CELLOPT is written in APL.
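
    To make the underlying trade-off concrete, the toy model below (not CELLOPT's actual formulation) captures the two competing terms: adding grid lines shortens the lateral current path, so the resistive loss falls roughly with the square of the line count, while the shadowed fraction of the cell grows linearly with it, and the optimum line count minimizes the sum. All parameter values are illustrative.

        # Toy grid-geometry trade-off: resistive loss ~ k / n^2 (shorter lateral
        # current path with more lines), shadow loss ~ n * line_width / cell_width.
        # The lumped coefficient k_resistive is illustrative, not a real cell's value.
        def fractional_loss(n_lines, cell_width_cm=10.0, line_width_cm=0.02, k_resistive=0.5):
            shadow = n_lines * line_width_cm / cell_width_cm
            resistive = k_resistive / n_lines ** 2
            return shadow + resistive

        if __name__ == "__main__":
            best = min(range(1, 201), key=fractional_loss)
            print(best, round(fractional_loss(best), 4))   # line count minimizing total loss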

  19. Grid accounting service: state and future development

    NASA Astrophysics Data System (ADS)

    Levshina, T.; Sehgal, C.; Bockelman, B.; Weitzel, D.; Guru, A.

    2014-06-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and identity of users using resources. The accounting service is important for verifying pledged resource allocation per particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and Holland Computing Center at University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machines provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the direction of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.

  20. Grid accounting service: state and future development

    SciTech Connect

    Levshina, T.; Sehgal, C.; Bockelman, B.; Weitzel, D.; Guru, A.

    2014-01-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and identity of users using resources. The accounting service is important for verifying pledged resource allocation per particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and Holland Computing Center at University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machines provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the direction of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.