Science.gov

Sample records for science grid experimental

  1. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    SciTech Connect

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  2. The Open Science Grid

    SciTech Connect

    Pordes, Ruth; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Wurthwein, Frank; Gardner, Rob; Wilde, Mike; /Chicago U. /Indiana U.

    2007-06-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  3. New Science on the Open Science Grid

    SciTech Connect

    Pordes, Ruth; Altunay, Mine; Avery, Paul; Bejan, Alina; Blackburn, Kent; Blatecky, Alan; Gardner, Rob; Kramer, Bill; Livny, Miron; McGee, John; Potekhin, Maxim; /Fermilab /Florida U. /Chicago U. /Caltech /LBL, Berkeley /Wisconsin U., Madison /Indiana U. /Brookhaven /UC, San Diego

    2008-06-01

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. There are frequently significant sociological and organizational changes required in transformation from the existing to the new. OSG leverages its deliverables to the large scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement and the distributed facility. As a partner to the poster and tutorial at SciDAC 2008, this paper gives both a brief general description and some specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  4. Space-based Science Operations Grid Prototype

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Welch, Clara L.; Redman, Sandra

    2004-01-01

science experimenters. There is an international aspect to the Grid involving the America's Pathway (AMPath) network, the Chilean REUNA Research and Education Network and the University of Chile in Santiago that will further demonstrate how extensively these services can be used. From the user's perspective, the Prototype will provide a single interface and logon to these varied services without the complexity of knowing the wheres and hows of each service. There is a separate and deliberate emphasis on security, which will be addressed by specifically outlining the different approaches and tools used. Grid technology, unlike the Internet, is being designed with security in mind. In addition, we will show the locations, configurations and network paths associated with each service and virtual organization. We will discuss the separate virtual organizations that we define for the varied user communities. These will include certain, as yet undetermined, space-based science functions and/or processes, as well as the specific virtual organizations required for public and educational outreach and for science and engineering collaboration. We will also discuss the Grid Prototype's performance and the potential for further Grid applications in both space-based and ground-based projects and processes. In this paper and presentation we will detail each service and how they are integrated using Grid

  5. Grid for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Petitdidier, Monique; Schwichtenberg, Horst

    2013-04-01

Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change, and new energy sources. The main critical point is that, on one hand, civil society and the public at large ask for certainties, i.e. precise values with small error ranges for short-, medium- and long-term predictions in all domains; on the other hand, Science can mainly answer only in terms of probability of occurrence. To improve the answers and/or decrease the uncertainties, (1) new observational networks have been deployed in order to have better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites. Following the OECD recommendations on the openness of research and public sector data, more and more data are available to academic organisations and SMEs; (2) new algorithms and methodologies have been developed to cope with the huge volumes of data to be processed and assimilated into simulations using new technologies and compute resources. Finally, our total knowledge about the complex Earth system is contained in models and measurements; how we put them together has to be managed cleverly. The technical challenge is to put together databases and computing resources to answer the ES challenges. However, all of these applications are computationally intensive. Different compute solutions are available and depend on the characteristics of the applications. One of them is the Grid, which is especially efficient for independent or embarrassingly parallel jobs related to statistical and parametric studies. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity have been deployed successfully on the Grid. In order to fulfill requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. 
The Grid has permitted via a huge number of runs to
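The kind of workload singled out in this abstract, independent or embarrassingly parallel jobs for statistical and parametric studies, can be sketched as a simple parameter sweep. The model function and parameter names below are hypothetical stand-ins for a real Earth Science code, and a thread pool plays the role of the grid's independent jobs:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def run_model(params):
    # Hypothetical stand-in for one independent model run,
    # e.g. one member of a parametric sensitivity study.
    diffusion, emission = params
    return {"diffusion": diffusion,
            "emission": emission,
            "peak_concentration": emission / (1.0 + diffusion)}

def parameter_sweep(diffusions, emissions):
    # Each run is independent of the others, which is exactly
    # why such sweeps map cleanly onto separate grid jobs.
    jobs = list(product(diffusions, emissions))
    with ThreadPoolExecutor() as pool:
        return list(pool.map(run_model, jobs))

results = parameter_sweep([0.1, 0.5, 1.0], [10.0, 20.0])
```

On a real grid, each element of `jobs` would be submitted as its own job; no communication between runs is needed, so the speedup scales with the number of available sites.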

  6. Enabling Campus Grids with Open Science Grid Technology

    NASA Astrophysics Data System (ADS)

    Weitzel, Derek; Bockelman, Brian; Fraser, Dan; Pordes, Ruth; Swanson, David

    2011-12-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.
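One way such bridging of non-Condor clusters is expressed in HTCondor today is the grid universe's batch type. The submit-file fragment below is an illustrative sketch only, not the Holland Computing Center's actual configuration; the hostname, user, and script are hypothetical:

```
universe      = grid
grid_resource = batch pbs hcc-user@pbs-cluster.example.edu
executable    = analyze.sh
output        = analyze.$(Cluster).out
error         = analyze.$(Cluster).err
log           = analyze.log
queue
```

Jobs submitted this way stay in a Condor-managed queue while actually executing under the remote cluster's native PBS scheduler, which preserves the resource independence the paper lists as a campus-grid requirement.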

  7. Neutron Science TeraGrid Gateway

    NASA Astrophysics Data System (ADS)

    Lynch, Vickie; Chen, Meili; Cobb, John; Kohl, Jim; Miller, Steve; Speirs, David; Vazhkudai, Sudharshan

    2010-11-01

The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as their principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  8. Neutron Science TeraGrid Gateway

    SciTech Connect

    Lynch, Vickie E; Chen, Meili; Cobb, John W; Kohl, James Arthur; Miller, Stephen D; Speirs, David A; Vazhkudai, Sudharshan S

    2010-01-01

The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as their principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  9. Trends in life science grid: from computing grid to knowledge grid

    PubMed Central

    Konagaya, Akihiko

    2006-01-01

Background Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large data handling exceeding the computing capacity of a single institution. Results This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities that share tacit knowledge. Conclusion By extending the concept of grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:17254294

  10. Changing from computing grid to knowledge grid in life-science grid.

    PubMed

    Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy

    2009-09-01

Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large data handling exceeding the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to large amounts of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in a library and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations, for example grid computing. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for forming communities that share tacit knowledge. By extending the concept of grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:19579217

  11. TeraGrid Gateways for Earth Science

    NASA Astrophysics Data System (ADS)

    Wilkins-Diehr, Nancy

    2010-05-01

    The increasingly digital component of science today poses exciting challenges and opportunities for researchers. Whether it's streaming data from sensors to computations, tagging video in the study of language patterns or the use of geographic information systems to anticipate the spread of disease, the challenges are enormous and continue to grow. The existence of advanced cyberinfrastructure (CI) tools or science gateways can significantly increase the productivity of researchers facing the most difficult challenges - in some cases making the impossible possible. The TeraGrid Science Gateways program works to incorporate high end resources through these community-designed interfaces. This talk will present an overview of TeraGrid's gateway program and highlight several gateways in atmospheric science, earth sciences and geography and regional science, geophysics, global atmospheric research, materials research and seismology.

  12. Parallel Grid Manipulations in Earth Science Calculations

    NASA Technical Reports Server (NTRS)

    Sawyer, W.; Lucchesi, R.; daSilva, A.; Takacs, L. L.

    1999-01-01

    sparse interpolation with little data locality between the physical lat-lon grid and a pole rotated computational grid- can be solved efficiently and at the GFlop/s rates needed to solve tomorrow's high resolution earth science models. In the subsequent presentation we will discuss the design and implementation of PILGRIM as well as a number of the problems it is required to solve. Some conclusions will be drawn about the potential performance of the overall earth science models on the supercomputer platforms foreseen for these problems.
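The pole-rotated computational grid mentioned above rests on a standard spherical-coordinate rotation. As a minimal sketch (not PILGRIM's actual code), the latitude of a point in a rotated system can be computed as follows; the interpolation the abstract describes would then resample fields between the physical and rotated grids:

```python
import math

def rotated_latitude(lat, lon, pole_lat, pole_lon):
    """Latitude of (lat, lon) in a pole-rotated coordinate system
    whose north pole sits at geographic (pole_lat, pole_lon);
    all angles are in degrees."""
    phi, lam = math.radians(lat), math.radians(lon)
    phi_p, lam_p = math.radians(pole_lat), math.radians(pole_lon)
    s = (math.sin(phi) * math.sin(phi_p)
         + math.cos(phi) * math.cos(phi_p) * math.cos(lam - lam_p))
    # Clamp against floating-point drift before taking asin.
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))

# Sanity checks: leaving the pole at (90, 0) gives the identity,
# and the point at the new pole's location maps to latitude +90.
```

The sparse, low-locality communication pattern arises because neighboring points on the rotated grid map to scattered points on the physical lat-lon grid, so each processor needs halo data from many remote domains.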

  13. Grids for Dummies: Featuring Earth Science Data Mining Application

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2002-01-01

    This viewgraph presentation discusses the concept and advantages of linking computers together into data grids, an emerging technology for managing information across institutions, and potential users of data grids. The logistics of access to a grid, including the use of the World Wide Web to access grids, and security concerns are also discussed. The potential usefulness of data grids to the earth science community is also discussed, as well as the Global Grid Forum, and other efforts to establish standards for data grids.

  14. The Neutron Science TeraGrid Gateway, a TeraGrid Science Gateway to Support the Spallation Neutron Source

    SciTech Connect

    Cobb, John W; Geist, Al; Kohl, James Arthur; Miller, Stephen D; Peterson, Peter F; Pike, Gregory; Reuter, Michael A; Swain, William; Vazhkudai, Sudharshan S; Vijayakumar, Nithya N

    2006-01-01

The National Science Foundation's (NSF's) Extensible Terascale Facility (ETF), or TeraGrid [1], is entering its operational phase. One ETF science gateway effort is the Neutron Science TeraGrid Gateway (NSTG). The Oak Ridge National Laboratory (ORNL) resource provider effort (ORNL-RP), during construction and now in operations, is bridging a large-scale experimental community and the TeraGrid as a large-scale national cyberinfrastructure. Of particular emphasis is collaboration with the Spallation Neutron Source (SNS) at ORNL. The U.S. Department of Energy's (DOE's) SNS [2] at ORNL will be commissioned in spring of 2006 as the world's brightest source of neutrons. Neutron science users can run experiments; generate datasets; perform data reduction, analysis, and visualization of results; collaborate with remote users; and archive long-term data in repositories with curation services. The ORNL-RP and the SNS data analysis group have spent 18 months developing and exploring user requirements, including the creation of prototypical services such as facility portal, data, and application execution services. We describe results from these efforts and discuss implications for science gateway creation. Finally, we show how these results are incorporated into implementation planning for the NSTG and SNS architectures. The plan is for primarily portal-based user interaction supported by a service-oriented architecture for functional implementation.

  15. Technology for a NASA Space-Based Science Operations Grid

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.

    2003-01-01

This viewgraph presentation outlines a proposal to develop a space-based operations grid in support of space-based science experiments. The development of such a grid would provide a dynamic, secure and scalable architecture based on standards and next-generation reusable software and would enable greater science collaboration and productivity through the use of shared resources and distributed computing. The authors propose developing this concept for use on payload experiments carried aboard the International Space Station. Topics covered include: grid definitions, portals, grid development and coordination, grid technology and potential uses of such a grid.

  16. The Open Science Grid status and architecture

    SciTech Connect

    Pordes, Ruth; Petravick, Don; Kramer, Bill; Olsen, James D.; Livny, Miron; Roy, Gordon A.; Avery, Paul Ralph; Blackburn, Kent; Wenaus, Torre J.; Wuerthwein, Frank K.; Foster, Ian; /Chicago U. /Indiana U.

    2007-09-01

The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the World Wide LHC Computing Grid, on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES) and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated, community-specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  17. The Open Science Grid status and architecture

    NASA Astrophysics Data System (ADS)

    Pordes, R.; Petravick, D.; Kramer, B.; Olson, D.; Livny, M.; Roy, A.; Avery, P.; Blackburn, K.; Wenaus, T.; Würthwein, F.; Foster, I.; Gardner, R.; Wilde, M.; Blatecky, A.; McGee, J.; Quick, R.

    2008-07-01

The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the World Wide LHC Computing Grid, on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES) and several Fermilab Tevatron experiments (CDF, D0, MiniBooNE, etc.). The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated, community-specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  18. European grid services for global earth science

    NASA Astrophysics Data System (ADS)

    Brewer, S.; Sipos, G.

    2012-04-01

    This presentation will provide an overview of the distributed computing services that the European Grid Infrastructure (EGI) offers to the Earth Sciences community and also explain the processes whereby Earth Science users can engage with the infrastructure. One of the main overarching goals for EGI over the coming year is to diversify its user-base. EGI therefore - through the National Grid Initiatives (NGIs) that provide the bulk of resources that make up the infrastructure - offers a number of routes whereby users, either individually or as communities, can make use of its services. At one level there are two approaches to working with EGI: either users can make use of existing resources and contribute to their evolution and configuration; or alternatively they can work with EGI, and hence the NGIs, to incorporate their own resources into the infrastructure to take advantage of EGI's monitoring, networking and managing services. Adopting this approach does not imply a loss of ownership of the resources. Both of these approaches are entirely applicable to the Earth Sciences community. The former because researchers within this field have been involved with EGI (and previously EGEE) as a Heavy User Community and the latter because they have very specific needs, such as incorporating HPC services into their workflows, and these will require multi-skilled interventions to fully provide such services. In addition to the technical support services that EGI has been offering for the last year or so - the applications database, the training marketplace and the Virtual Organisation services - there now exists a dynamic short-term project framework that can be utilised to establish and operate services for Earth Science users. 
During this talk we will present a summary of various on-going projects that will be of interest to Earth Science users with the intention that suggestions for future projects will emerge from the subsequent discussions: • The Federated Cloud Task

  19. AstroGrid-D: Grid technology for astronomical science

    NASA Astrophysics Data System (ADS)

    Enke, Harry; Steinmetz, Matthias; Adorf, Hans-Martin; Beck-Ratzka, Alexander; Breitling, Frank; Brüsemeister, Thomas; Carlson, Arthur; Ensslin, Torsten; Högqvist, Mikael; Nickelt, Iliya; Radke, Thomas; Reinefeld, Alexander; Reiser, Angelika; Scholl, Tobias; Spurzem, Rainer; Steinacker, Jürgen; Voges, Wolfgang; Wambsganß, Joachim; White, Steve

    2011-02-01

We present the status and results of AstroGrid-D, a joint effort of astrophysicists and computer scientists to employ grid technology for scientific applications. AstroGrid-D provides access to a network of distributed machines through a set of commands as well as software interfaces. It allows simple use of compute and storage facilities, and supports scheduling and monitoring of compute tasks and data management. It is based on the Globus Toolkit middleware (GT4). Chapter 1 describes the context which led to the demand for advanced software solutions in astrophysics, and we state the goals of the project. We then present characteristic astrophysical applications that have been implemented on AstroGrid-D in Chapter 2. We describe simulations of different complexity, compute-intensive calculations running on multiple sites (Section 2.1), and advanced applications for specific scientific purposes (Section 2.2), such as a connection to robotic telescopes (Section 2.2.3). These examples show how grid execution improves, e.g., the scientific workflow. Chapter 3 explains the software tools and services that we adapted or newly developed. Section 3.1 focuses on the administrative aspects of the infrastructure, such as managing users and monitoring activity. Section 3.2 characterises the central components of our architecture: the AstroGrid-D information service to collect and store metadata, a file management system, the data management system, and a job manager for automatic submission of compute tasks. We summarise the successfully established infrastructure in Chapter 4, concluding with our future plans to establish AstroGrid-D as a platform of modern e-Astronomy.

  20. Virtual Experiments on the Neutron Science TeraGrid Gateway

    NASA Astrophysics Data System (ADS)

    Lynch, V. E.; Cobb, J. W.; Farhi, E.; Miller, S. D.; Taylor, M.

The TeraGrid's outreach effort to the neutron science community is creating an environment that encourages incorporating advanced cyberinfrastructure into facility operations in a way that leverages those operations to multiply the scientific output of facility users, including many NSF-supported scientists across many disciplines. The Neutron Science TeraGrid Gateway serves as an exploratory incubator for several TeraGrid projects. Virtual neutron scattering experiments from one exploratory project will be highlighted.

  21. Virtual Experiments on the Neutron Science TeraGrid Gateway

    SciTech Connect

    Lynch, Vickie E; Cobb, John W; Farhi, Emmanuel N; Miller, Stephen D; Taylor, M

    2008-01-01

The TeraGrid's outreach effort to the neutron science community is creating an environment that encourages incorporating advanced cyberinfrastructure into facility operations in a way that leverages those operations to multiply the scientific output of facility users, including many NSF-supported scientists across many disciplines. The Neutron Science TeraGrid Gateway serves as an exploratory incubator for several TeraGrid projects. Virtual neutron scattering experiments from one exploratory project will be highlighted.

  22. Public storage for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Levshina, T.; Guru, A.

    2014-06-01

The Open Science Grid infrastructure does not provide efficient means to manage the public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators and VO support personnel is required to allocate or rescind storage space. One of the main requirements for a Public Storage implementation is that it should use the SRM or GridFTP protocols to access the Storage Elements provided by the OSG sites and not put any additional burden on sites. By policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data management workflow includes pre-staging common data on sites before a job's execution, then storing the output data produced by a job on a worker node for subsequent download to a local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System (iRODS) developed at RENCI as a front-end service to the OSG SEs. The current architecture, state of deployment and performance test results will be discussed. We will also provide examples of current usage of the system by beta users.
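The bookkeeping problem this abstract describes, where a job must record where it uploaded its output and the user must later resolve that location for download, can be sketched as a minimal replica catalog. The storage-element name and paths below are hypothetical, and the actual SRM/GridFTP transfer is deliberately out of scope:

```python
class StorageCatalog:
    """Minimal sketch of output-data bookkeeping for opportunistic
    storage; real OSG tooling would sit on SRM/GridFTP transfers
    and a shared catalog service rather than an in-memory dict."""

    def __init__(self):
        self._entries = {}  # logical name -> (storage element, path)

    def register(self, logical_name, storage_element, path):
        # Called by the job on the worker node after a successful upload.
        self._entries[logical_name] = (storage_element, path)

    def locate(self, logical_name):
        # Called later by the user to find the replica for download.
        if logical_name not in self._entries:
            raise KeyError("no replica recorded for " + logical_name)
        return self._entries[logical_name]

catalog = StorageCatalog()
catalog.register("run42/output.root",
                 "se.example.edu",                 # hypothetical SE
                 "/osg/scratch/run42/output.root") # hypothetical path
se, path = catalog.locate("run42/output.root")
```

The design point is simply that the mapping from logical name to physical location must live somewhere persistent and shared, which is the role iRODS plays in the proposed front-end service.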

  23. Migrating Open Science Grid to RPMs*

    NASA Astrophysics Data System (ADS)

    Roy, Alain

    2012-12-01

    We recently completed a significant transition in the Open Science Grid (OSG) in which we moved our software distribution mechanism from the useful but niche system called Pacman to a community-standard native package system, RPM. In this paper we explore some of the lessons learned during this transition as well as our earlier work, lessons that we believe are valuable not only for software distribution and packaging, but also for software engineering in a distributed computing environment where reliability is critical. We discuss the benefits found in moving to a community standard, including the abilities to reuse existing packaging, to donate existing packaging back to the community, and to leverage existing skills in the community. We describe our approach to testing in which we test our software against multiple versions of the OS, including pre-releases of the OS, in order to find surprises before our users do. Finally, we discuss our large-scale evaluation testing and community testing, which are essential for both quality and community acceptance.

  24. Open computing grid for molecular science and engineering.

    PubMed

    Sild, Sulev; Maran, Uko; Lomaka, Andre; Karelson, Mati

    2006-01-01

Grid is an emerging infrastructure for distributed computing that provides secure and scalable mechanisms for discovering and accessing remote software and data resources. Applications built on this infrastructure have great potential for addressing and solving large-scale chemical, pharmaceutical, and materials science problems. The article describes the concept behind grid computing and presents the OpenMolGRID system, an open computing grid for molecular science and engineering. This system provides grid-enabled components, such as a data warehouse for chemical data, software for building QSPR/QSAR models, and molecular engineering tools for generating compounds with predefined chemical properties or biological activities. The article also provides an overview of the availability of chemical applications in the grid. PMID:16711713

  5. Direct experimental determination of Frisch grid inefficiency in ionization chamber

    NASA Astrophysics Data System (ADS)

    Khriachkov, V. A.; Goverdovski, A. A.; Ketlerov, V. V.; Mitrofanov, V. F.; Semenova, N. N.

    1997-07-01

    The present work describes a method for the direct experimental determination of the Frisch grid inefficiency in an ionization chamber. The method is based on analysis of the anode signal recorded by a waveform digitizer. It is shown that the calculated grid-inefficiency value can differ substantially from the measured one.

  6. Pilot job accounting and auditing in Open Science Grid

    SciTech Connect

    Sfiligoi, Igor; Green, Chris; Quinn, Greg; Thain, Greg; /Wisconsin U., Madison

    2008-06-01

    The Grid accounting and auditing mechanisms were designed under the assumption that users would submit their jobs directly to the Grid gatekeepers. However, many groups are starting to use pilot-based systems, where users submit jobs to a centralized queue from which they are subsequently transferred to the Grid resources by the pilot infrastructure. While this approach greatly improves the user experience, it does disrupt the established accounting and auditing procedures. Open Science Grid deploys gLExec on the worker nodes to keep the pilot-related accounting and auditing information and centralizes the accounting collection with GRATIA.

  7. Grid Technology as a Cyber Infrastructure for Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    This paper describes how grids and grid service technologies can be used to develop an infrastructure for the Earth Science community. This cyberinfrastructure would be populated with a hierarchy of services, including discipline-specific services, such as those needed by the Earth Science community, as well as a set of core services needed by most applications. This core would include data-oriented services used for accessing and moving data as well as computer-oriented services used to broker access to resources and control the execution of tasks on the grid. The availability of such an Earth Science cyberinfrastructure would ease the development of Earth Science applications. With such a cyberinfrastructure, application workflows could be created to extract data from one or more of the Earth Science archives and then process it by passing it through various persistent services that are part of the cyberinfrastructure, such as services to perform subsetting, reformatting, data mining, and map projections.

  8. Nuclear test experimental science

    SciTech Connect

    Struble, G.L.; Middleton, C.; Bucciarelli, G.; Carter, J.; Cherniak, J.; Donohue, M.L.; Kirvel, R.D.; MacGregor, P.; Reid, S.

    1989-01-01

    This report discusses research being conducted at Lawrence Livermore Laboratory under the following topics: prompt diagnostics; experimental modeling, design, and analysis; detector development; streak-camera data systems; weapons supporting research.

  9. A Grid Metadata Service for Earth and Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Negro, Alessandro; Aloisio, Giovanni

    2010-05-01

    Critical challenges for climate modeling researchers are strongly connected with increasingly complex simulation models and the huge quantities of datasets they produce. Future trends in climate modeling will only increase computational and storage requirements. For this reason, the ability to transparently access both computational and data resources for large-scale, complex climate simulations must be considered a key requirement for Earth Science and Environmental distributed systems. From the data management perspective, (i) the quantity of data will continuously increase; (ii) data will become more and more distributed and widespread; (iii) data sharing/federation will be a key challenge among sites distributed worldwide; and (iv) a large and heterogeneous community of users will be interested in discovering experimental results, searching metadata, browsing collections of files, comparing different results, displaying output, etc. A key element for data search and discovery, and for managing and accessing huge, distributed amounts of data, is the metadata handling framework. What we propose for the management of distributed datasets is the GRelC service, a data grid solution focused on metadata management. Unlike classical approaches, the proposed data grid solution addresses scalability, transparency, security, efficiency, and interoperability. The GRelC service provides access to metadata stored in different, widespread data sources (relational databases running on top of MySQL, Oracle, DB2, etc., using SQL as the query language, as well as XML databases - XIndice, eXist, and libxml2-based documents - adopting either XPath or XQuery), providing a strong data virtualization layer in a grid environment.
Such a technological solution for distributed metadata management (i) leverages well-known, widely adopted standards (W3C, OASIS, etc.); and (ii) supports role-based management (based on VOMS), which
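    The virtualization layer this record describes, a single query interface dispatched across relational (SQL) and XML (XPath/XQuery) back ends, can be sketched as follows. The classes, fields, and data are invented for illustration and do not reflect GRelC's actual API.

```python
# Toy sketch of a metadata virtualization layer in the spirit of the GRelC
# service: one search interface over heterogeneous back ends.

class SQLBackend:
    """Stands in for a relational source (MySQL, Oracle, DB2, ...)."""
    def __init__(self, rows):
        self.rows = rows  # list of dicts in place of relational rows

    def search(self, key, value):
        return [r for r in self.rows if r.get(key) == value]

class XMLBackend:
    """Stands in for an XML source queried via XPath/XQuery."""
    def __init__(self, docs):
        self.docs = docs  # dicts in place of XPath-addressable documents

    def search(self, key, value):
        return [d for d in self.docs if d.get(key) == value]

class MetadataService:
    """One uniform search over widespread, heterogeneous metadata sources."""
    def __init__(self, backends):
        self.backends = backends

    def search(self, key, value):
        hits = []
        for backend in self.backends:
            hits.extend(backend.search(key, value))
        return hits

svc = MetadataService([
    SQLBackend([{"experiment": "climate-run-1", "site": "lecce"}]),
    XMLBackend([{"experiment": "climate-run-1", "site": "bonn"}]),
])
hits = svc.search("experiment", "climate-run-1")
print(len(hits))  # prints: 2, one hit from each back end
```

    The point of the design is that the consumer of MetadataService never sees whether a hit came from an SQL or an XML source.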

  10. Unlocking the potential of smart grid technologies with behavioral science

    PubMed Central

    Sintov, Nicole D.; Schultz, P. Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings. PMID:25914666

  11. Unlocking the potential of smart grid technologies with behavioral science.

    PubMed

    Sintov, Nicole D; Schultz, P Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings. PMID:25914666

  12. ISS Space-Based Science Operations Grid for the Ground Systems Architecture Workshop (GSAW)

    NASA Technical Reports Server (NTRS)

    Welch, Clara; Bradford, Bob

    2003-01-01

    Contents include the following: What is grid? Benefits of a grid to space-based science operations. Our approach. Scope of prototype grid. The security question. Short-term objectives. Long-term objectives. Space-based services required for operations. The prototype. Prototype service layout. Space-based science grid service components.

  13. Optimal response to attacks on the open science grids.

    SciTech Connect

    Altunay, M.; Leyffer, S.; Linderoth, J. T.; Xie, Z.

    2011-01-01

    Cybersecurity is a growing concern, especially in open grids, where attack propagation is easy because of prevalent collaborations among thousands of users and hundreds of institutions. The collaboration rules that typically govern large science experiments, as well as the social networks of scientists, span institutional security boundaries. A common concern is that the increased openness may allow malicious attackers to spread more readily around the grid. We consider how to respond optimally to attacks in open grid environments. To show how and why attacks spread more readily around the grid, we first discuss how collaborations manifest themselves in the grids and form the collaboration network graph, and how this graph affects the security threat levels of grid participants. We present two mixed-integer programming (MIP) models that find the optimal response to attacks in open grid environments and also calculate the threat level associated with each grid participant. Given an attack scenario, our optimal response model aims to minimize the threat levels at unaffected participants while maximizing uninterrupted scientific production (continuing collaborations). By adjusting collaboration rules (e.g., suspending a collaboration or shutting down a site), the model finds an optimal response that subverts the attack scenario.
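    The trade-off the MIP models formalize, containing the threat while keeping as many collaborations running as possible, can be illustrated with a tiny brute-force version. The collaboration graph, the attacked site, and the scoring weights below are invented; a real instance would be far too large for exhaustive search and would use an MIP solver instead.

```python
from itertools import combinations

# Toy brute-force analogue of the optimal-response trade-off: choose which
# collaborations to suspend so the attack is contained while scientific
# production (active collaborations) is preserved.

edges = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "D")]  # collaborations
attacked = {"A"}

def reachable(sources, active_edges):
    """Sites the threat can reach along still-active collaborations."""
    seen = set(sources)
    grew = True
    while grew:
        grew = False
        for u, v in active_edges:
            if u in seen and v not in seen:
                seen.add(v); grew = True
            if v in seen and u not in seen:
                seen.add(u); grew = True
    return seen

def best_response(edges, attacked, threat_weight=10):
    """Score every suspension set: heavy penalty per threatened clean site,
    small reward per continuing collaboration; return the best set."""
    best_score, best_plan = None, None
    for k in range(len(edges) + 1):
        for suspended in combinations(edges, k):
            active = [e for e in edges if e not in suspended]
            threatened = reachable(attacked, active) - attacked
            score = threat_weight * len(threatened) - len(active)
            if best_score is None or score < best_score:
                best_score, best_plan = score, set(suspended)
    return best_plan

plan = best_response(edges, attacked)
print(sorted(plan))  # prints: [('A', 'B'), ('A', 'D')]
```

    Here the optimum suspends exactly the two collaborations touching the attacked site, isolating it while the B-C and C-D collaborations continue.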

  14. Data Grid tools: enabling science on big distributed data

    NASA Astrophysics Data System (ADS)

    Allcock, Bill; Chervenak, Ann; Foster, Ian; Kesselman, Carl; Livny, Miron

    2005-01-01

    A particularly demanding and important challenge that we face as we attempt to construct the distributed computing machinery required to support SciDAC goals is the efficient, high-performance, reliable, secure, and policy-aware management of large-scale data movement. This problem is fundamental to diverse application domains including experimental physics (high energy physics, nuclear physics, light sources), simulation science (climate, computational chemistry, fusion, astrophysics), and large-scale collaboration. In each case, highly distributed user communities require high-speed access to valuable data, whether for visualization or analysis. The quantities of data involved (terabytes to petabytes), the scale of the demand (hundreds or thousands of users, data-intensive analyses, real-time constraints), and the complexity of the infrastructure that must be managed (networks, tertiary storage systems, network caches, computers, visualization systems) make the problem extremely challenging. Data management tools developed under the auspices of the SciDAC Data Grid Middleware project have become the de facto standard for data management in projects worldwide. Day in and day out, these tools provide the "plumbing" that allows scientists to do more science on an unprecedented scale in production environments.

  15. An Experimental Framework for Executing Applications in Dynamic Grid Environments

    NASA Technical Reports Server (NTRS)

    Huedo, Eduardo; Montero, Ruben S.; Llorente, Ignacio M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Grid opens up opportunities for resource-starved scientists and engineers to harness highly distributed computing resources. A number of Grid middleware projects are currently available to support the simultaneous exploitation of heterogeneous resources distributed across different administrative domains. However, efficient job submission and management remain far from accessible to ordinary scientists and engineers due to the dynamic and complex nature of the Grid. This report describes a new Globus framework that allows easier and more efficient execution of jobs in a 'submit and forget' fashion. Adaptation to dynamic Grid conditions is achieved by supporting automatic application migration following performance degradation, 'better' resource discovery, requirement changes, owner decisions, or remote resource failure. The report also includes experimental results on the behavior of our framework on the TRGP testbed.

  16. Enabling Science and Engineering Applications on the Grid

    SciTech Connect

    Seidel, Ed

    2004-08-25

    The Grid has the potential to fundamentally change the way science and engineering are done. The aggregate power of computing resources connected by networks - of the Grid - exceeds that of any single supercomputer by many orders of magnitude. At the same time, our ability to carry out computations of the scale and level of detail required, for example, to study the Universe or simulate a rocket engine, is severely constrained by available computing power. Hence, such applications should be one of the main driving forces behind the development of Grid computing. I will discuss some large-scale applications, including simulations of colliding black holes, and show how they are driving the development of Grid computing technology. Applications are already being developed that are aware not only of their needs, but also of the resources available to them on the Grid. They will be able to adapt themselves automatically to respond to their changing needs, to spawn off tasks on other resources, and to adapt to the changing characteristics of the Grid, including machine and network loads and availability. I will discuss a number of innovative scenarios for computing on the Grid enabled by such technologies, and demonstrate how close these are to being a reality.

  17. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential to revolutionize the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing at less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists, and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  18. Earth Science applications on Grid -advantages and limitations

    NASA Astrophysics Data System (ADS)

    Petitdidier, M.; Schwichtenberg, H.

    2012-04-01

    Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change, and new energies. Our total knowledge about the complex Earth system is contained in models and measurements, and how we put them together has to be managed cleverly. The technical challenge is to bring together databases and computing resources to answer the ES challenges. The main critical point is that, on one hand, civil society and the public ask for certainties, i.e., precise values with small error ranges, concerning predictions at short, medium, and long term in all domains; on the other hand, science can mainly answer only in terms of probability of occurrence. To improve the answers and/or decrease the uncertainties, (1) new observational networks have been deployed to obtain better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites; and (2) new algorithms and methodologies have been developed using new technologies and compute resources. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate, and biodiversity have been deployed successfully on the Grid. To fulfill the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. The Grid has made it possible to decrease uncertainties by increasing the number of runs used to estimate the probability of occurrence. Some limitations are related to combining databases outside the grid infrastructure with grid compute resources, and to real-time applications that need resource reservation in order to ensure results by a given time. ES scientists, who use different compute resources according to the phase of their application, are used to working in large projects and sharing their results. They need a service-oriented architecture and a platform of

  19. DZero data-intensive computing on the Open Science Grid

    SciTech Connect

    Abbott, B.; Baranovski, A.; Diesburg, M.; Garzoglio, G.; Kurca, T.; Mhashilkar, P.; /Fermilab

    2007-09-01

    High energy physics experiments periodically reprocess data in order to take advantage of improved understanding of the detector and the data processing code. Between February and May 2007, the DZero experiment reprocessed a substantial fraction of its dataset. This consists of half a billion events, corresponding to about 100 TB of data, organized in 300,000 files. The activity utilized resources from sites around the world, including a dozen sites participating in the Open Science Grid consortium (OSG). About 1,500 jobs were run every day across the OSG, consuming and producing hundreds of gigabytes of data. Access to OSG computing and storage resources was coordinated by the SAM-Grid system. This system organized job access to a complex topology of data queues and job scheduling across clusters, using a SAM-Grid to OSG job-forwarding infrastructure. For the first time in the lifetime of the experiment, a data-intensive production activity was managed on a general-purpose grid such as OSG. This paper describes the implications of using OSG, where all resources are granted following an opportunistic model, the challenges of operating a data-intensive activity over such a large computing infrastructure, and the lessons learned throughout the project.

  20. Open Science Grid: Linking Universities and Laboratories In National Cyberinfrastructure

    NASA Astrophysics Data System (ADS)

    Avery, Paul

    2011-10-01

    Open Science Grid is a consortium of researchers from universities and national laboratories that operates a national computing infrastructure serving large-scale scientific and engineering research. While OSG's scale has been primarily driven by the demands of the LHC experiments, it currently serves particle and nuclear physics, gravitational wave searches, digital astronomy, genomic science, weather forecasting, molecular modeling, structural biology and nanoscience. The OSG distributed computing facility links campus and regional computing resources and is a major component of the Worldwide LHC Computing Grid (WLCG) that handles the massive computing and storage needs of experiments at the Large Hadron Collider. This collaborative work has provided a wealth of results, including powerful new software tools and services; a uniform packaging scheme (the Virtual Data Toolkit) that simplifies software deployment across many sites in the US and Europe; integration of complex tools and services in large science applications; multiple education and outreach projects; and new approaches to integrating advanced network infrastructure in scientific computing applications. More importantly, OSG has provided unique collaborative opportunities between researchers in a variety of research disciplines.

  1. Using Computing and Data Grids for Large-Scale Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  2. Experimental results of an iodine plasma in PEGASES gridded thruster

    NASA Astrophysics Data System (ADS)

    Grondein, Pascaline; Aanesland, Ane

    2015-09-01

    In the electric gridded thruster PEGASES, both positive and negative ions are extracted and expelled from an ion-ion plasma. This ion-ion plasma is formed downstream of a localized magnetic field placed a few centimeters from the ionization region, which traps and cools the electrons to allow better attachment to an electronegative gas. For this thruster concept, iodine has emerged as the most attractive option: heavy and diatomic, and therefore good for high thrust, its low ionization threshold and high electronegativity lead to high ion-ion densities at low RF power. After the proof of concept of PEGASES using SF6 as propellant, we present here experimental results for an iodine plasma studied inside the PEGASES thruster. Solid at standard temperature and pressure, iodine is heated until it sublimates and is then injected into the chamber, where the neutral gas is heated and ionized. The whole injection system is heated to avoid deposition on surfaces, and a mass flow controller allows fine control of the neutral gas mass flow. A 3D translation stage inside the vacuum chamber allows volumetric plasma studies using electrostatic probes. The results are also compared with a global model dedicated to iodine as a propellant for electric gridded thrusters. This work was done within the LABEX Plas@par project and received financial state aid managed by the Agence Nationale de la Recherche as part of the programme ``Investissements d'avenir.''

  3. e-Science, caGrid, and Translational Biomedical Research

    PubMed Central

    Saltz, Joel; Kurc, Tahsin; Hastings, Shannon; Langella, Stephen; Oster, Scott; Ervin, David; Sharma, Ashish; Pan, Tony; Gurcan, Metin; Permar, Justin; Ferreira, Renato; Payne, Philip; Catalyurek, Umit; Caserta, Enrico; Leone, Gustavo; Ostrowski, Michael C.; Madduri, Ravi; Foster, Ian; Madhavan, Subhashree; Buetow, Kenneth H.; Shanbhag, Krishnakant; Siegel, Eliot

    2011-01-01

    Translational research projects target a wide variety of diseases, test many different kinds of biomedical hypotheses, and employ a large assortment of experimental methodologies. Diverse data, complex execution environments, and demanding security and reliability requirements make the implementation of these projects extremely challenging and require novel e-Science technologies. PMID:21311723

  4. Physical Science Laboratory Manual, Experimental Version.

    ERIC Educational Resources Information Center

    Cooperative General Science Project, Atlanta, GA.

    Provided are physical science laboratory experiments which have been developed and used as a part of an experimental one year undergraduate course in general science for non-science majors. The experiments cover a limited number of topics representative of the scientific enterprise. Some of the topics are pressure and buoyancy, heat, motion,…

  5. Commissioning the HTCondor-CE for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Cartwright, T.; Frey, J.; Fajardo, E. M.; Lin, B.; Selmeci, M.; Tannenbaum, T.; Zvada, M.

    2015-12-01

    The HTCondor-CE is the next-generation gateway software for the Open Science Grid (OSG). It is responsible for providing a network service that authorizes remote users and provides a resource provisioning service (other well-known gateways include Globus GRAM, CREAM, ARC-CE, and OpenStack's Nova). Based on the venerable HTCondor software, this new CE is simply a highly specialized configuration of HTCondor. It was developed and adopted to provide the OSG with a more flexible, scalable, and easier-to-manage gateway software. Further, the focus of the HTCondor-CE is not job submission (as in GRAM or CREAM) but resource provisioning. This software does not exist in a vacuum: to deploy this gateway across the OSG, we had to integrate it with the CE configuration, deploy a corresponding information service, coordinate with sites, and overhaul our documentation.

  6. Grid infrastructure to support science portals for large scale instruments.

    SciTech Connect

    von Laszewski, G.; Foster, I.

    1999-09-29

    Soon, a new generation of scientific workbenches will be developed as a collaborative effort among various research institutions in the US. These scientific workbenches will be accessed in the Web via portals. Reusable components are needed to build such portals for different scientific disciplines, allowing uniform desktop access to remote resources. Such components will include tools and services enabling easy collaboration, job submission, job monitoring, component discovery, and persistent object storage. Based on experience gained from Grand Challenge applications for large-scale instruments, we demonstrate how Grid infrastructure components can be used to support the implementation of science portals. The availability of these components will simplify the prototype implementation of a common portal architecture.

  7. Who Needs Plants? Science (Experimental).

    ERIC Educational Resources Information Center

    Ropeik, Bernard H.; Kleinman, David Z.

    The basic elective course in introductory botany is designed for secondary students who probably will not continue study in plant science. The objectives of the course are to help the student 1) identify, compare and differentiate types of plants; 2) identify plant cell structures; 3) distinguish between helpful and harmful plants; 4) predict…

  8. SCEC Earthworks: A TeraGrid Science Gateway

    NASA Astrophysics Data System (ADS)

    Francoeur, H.; Muench, J.; Okaya, D.; Maechling, P.; Deelman, E.; Mehta, G.

    2006-12-01

    SCEC Earthworks is a scientific gateway designed to provide community-wide access to the TeraGrid. Earthworks provides its users with a portal-based interface for easily running anelastic wave propagation (AWM) simulations. Using GridSphere and several portlets developed as a collaborative effort with IRIS, Earthworks enables users to run simulations without any knowledge of the underlying workflow technology needed to utilize the TeraGrid. The workflow technology behind Earthworks has been developed as a collaborative effort between SCEC and the Information Sciences Institute (ISI). Earthworks uses a complex software stack to translate abstract workflows defined by the user into a series of jobs that run on a number of computational resources. These computational resources include a combination of servers provided by SCEC, the USC High Performance Computing Center, and NSF TeraGrid supercomputer facilities. Workflows are constructed after input from the user is passed via a Java-based interface to the Earthworks backend, where a DAX (directed acyclic graph in XML) is generated. This DAX describes each step of the workflow, including its inputs, outputs, and arguments, as well as the parent-child relationships between processes. The DAX is then handed off to the Virtual Data System (VDS) and Pegasus, provided by ISI, which translate it from an abstract workflow into a concrete workflow by filling in logical file and application names with their physical paths and locations. This newly created DAG (directed acyclic graph) is handed off to the Condor scheduler. The bottom of the software stack is a Globus installation at each site that provides local transfer and resource-management capabilities. Resources across different sites are transparently managed and tracked by VDS, which allows greater flexibility in running the workflows. After a workflow is completed, products and metadata are registered with integrated data management tools. 
This allows for metadata querying
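    The abstract-to-concrete translation step described in this record (logical application names resolved against a catalog, tasks ordered parents-first for the scheduler) can be sketched roughly as follows. The workflow, catalog entries, and paths are invented, and this is a simplification, not the actual Pegasus/VDS API.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical abstract workflow: task -> parents and a logical app name,
# standing in for the DAX the Earthworks backend generates.
abstract_workflow = {
    "extract": {"parents": [], "app": "subset"},
    "project": {"parents": ["extract"], "app": "mapproj"},
    "mine":    {"parents": ["project"], "app": "datamine"},
}

# Hypothetical transformation catalog: logical app name -> physical path.
transformation_catalog = {
    "subset": "/usr/bin/subset",
    "mapproj": "/usr/bin/mapproj",
    "datamine": "/usr/bin/datamine",
}

def concretize(abstract, catalog):
    """Turn the abstract workflow into an ordered list of concrete jobs:
    parents first, logical app names replaced by physical executables."""
    deps = {name: set(spec["parents"]) for name, spec in abstract.items()}
    order = TopologicalSorter(deps).static_order()
    return [(name, catalog[abstract[name]["app"]]) for name in order]

jobs = concretize(abstract_workflow, transformation_catalog)
print(jobs[0])  # prints: ('extract', '/usr/bin/subset')
```

    The resulting ordered job list corresponds to the concrete DAG that a scheduler such as Condor would then execute.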

  9. Secure Grid Services for Cooperative Work in Medicine and Life Science

    NASA Astrophysics Data System (ADS)

    Weisbecker, Anette; Falkner, Jürgen

    MediGRID provides a grid infrastructure for solving challenging problems in the medical and life sciences by enhancing productivity and enabling location-independent, interdisciplinary collaboration. The use of grid technology has enabled the development of new applications and services for research in the medical and life sciences. In order to enlarge the range of services and reach a broader range of users, sustainable business models are needed. In Services@MediGRID, methods for monitoring, accounting, and billing that fulfill the high security demands of medicine and the life sciences will be developed. The differing requirements of academic and industrial grid customers are also considered in order to establish sustainable business models for grid computing.

  10. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, exercising job scheduling at the compute element (CE), use of the worker-node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and in the coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.

  11. Reputation, Pricing and the E-Science Grid

    NASA Astrophysics Data System (ADS)

    Anandasivam, Arun; Neumann, Dirk

    One of the fundamental aspects of efficient Grid usage is the optimization of resource allocation among the participants. However, this has not yet materialized. Each user is a self-interested participant trying to maximize his utility, where utility is determined not only by the fastest completion time but also by prices. Future revenues are influenced by users' reputations. Reputation mechanisms help to build trust between loosely coupled and geographically distributed participants. Providers need an incentive not to selfishly cancel jobs or privilege their own jobs. In this chapter we first present an offline scheduling mechanism with a fixed price. Jobs are collected by a broker and scheduled to machines. The goal of the broker is to balance the load and maximize revenue in the network. Consumers can submit their jobs according to their preferences, while taking the incentives of the broker into account. This mechanism does not consider reputation. In a second step, a reputation-based pricing mechanism for simple but fair pricing of resources is analyzed. In e-Science, researchers do not appreciate idiosyncratic pricing strategies and policies; their interest lies in doing research in an efficient manner. Consequently, in our mechanism the price is tightly coupled to the reputation of a site, to guarantee fairness of pricing and facilitate price determination. Furthermore, the price is not the only parameter, as completion time plays an important role when deadlines have to be met. We provide a flexible utility and decision model for every participant and analyze the outcome of our reputation-based pricing system via simulation.
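    The coupling this record describes, price tied to reputation so that cancelled jobs depress future revenue, can be sketched in a toy form. The base price, the smoothing rule, and the utility function below are invented for illustration and are not the authors' actual model.

```python
# Toy sketch of reputation-coupled pricing: reputation is an exponentially
# smoothed record of outcomes, and the price is proportional to it.

def price(base_price, reputation):
    """Couple the price directly to reputation (0..1): a well-reputed site
    can charge more; a poorly-reputed one is forced to charge less."""
    return base_price * reputation

def update_reputation(reputation, completed_on_time, alpha=0.1):
    """Exponential smoothing toward the latest outcome, so selfish job
    cancellations depress future prices."""
    outcome = 1.0 if completed_on_time else 0.0
    return (1 - alpha) * reputation + alpha * outcome

def consumer_utility(job_value, p, meets_deadline):
    """A job only has value if its deadline is met; the consumer keeps the
    surplus over the price."""
    return (job_value - p) if meets_deadline else -p

rep = 0.8
rep = update_reputation(rep, completed_on_time=False)  # a cancelled job hurts
p = price(10.0, rep)
print(round(rep, 2), round(p, 2))  # prints: 0.72 7.2
```

    The design choice mirrored here is that a provider cannot set an idiosyncratic price: the price follows mechanically from its observed behavior.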

  12. Grid Analysis and Display System (GrADS): A practical tool for earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III; Doty, Brian E.

    1991-01-01

    Viewgraphs on grid analysis and display system (GrADS): a practical tool for earth science visualization are presented. Topics covered include: GrADS design goals; data sets; and temperature profiles.

  13. Diagramming the path of a seed coat fragment on experimental lint cleaner grid bars

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An experiment was run to determine how a seed coat fragment reacts after colliding with newly-designed grid bars mounted on a lint cleaner simulator. A high-speed video camera recorded the action that took place. Ten experimental grid bars were used in the test. The included angle of the sharp to...

  14. An overview of Grid portal technologies for the development of HMR science gateways

    NASA Astrophysics Data System (ADS)

    D'Agostino, D.

    2012-04-01

    Grid portals and related technologies represent an easy and transparent way for scientists to interact with Distributed Computing Infrastructures (DCIs) such as the Grid and the Cloud. Many toolkits and frameworks are available, both commercial and open source, but there is a lack of best practices, customization methodologies and dedicated high-level service repositories that would allow fast development of specialized scientific gateways in Europe. Starting from the US TeraGrid-XSEDE experience, this contribution analyzes the most interesting portal toolkits and related European projects with the perspective of developing a science gateway for the HMR community within the Distributed Research Infrastructure for Hydrometeorology (DRIHM) project.

  15. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1992-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large, diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high-performance graphics workstations; and interoperable data standards and translators. To this end, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. The Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general-purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  16. Experimenter's laboratory for visualized interactive science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Klemp, Marjorie K.; Lasater, Sally W.; Szczur, Marti R.; Klemp, Joseph B.

    1993-01-01

    The science activities of the 1990's will require the analysis of complex phenomena and large, diverse sets of data. In order to meet these needs, we must take advantage of advanced user interaction techniques: modern user interface tools; visualization capabilities; affordable, high-performance graphics workstations; and interoperable data standards and translators. To this end, we propose to adopt and upgrade several existing tools and systems to create an experimenter's laboratory for visualized interactive science. Intuitive human-computer interaction techniques have already been developed and demonstrated at the University of Colorado. The Transportable Applications Executive (TAE+), developed at GSFC, is a powerful user interface tool for general-purpose applications. A 3D visualization package developed by NCAR provides both color-shaded surface displays and volumetric rendering in either index or true color. The Network Common Data Form (NetCDF) data access library developed by Unidata supports creation, access and sharing of scientific data in a form that is self-describing and network transparent. The combination and enhancement of these packages constitutes a powerful experimenter's laboratory capable of meeting key science needs of the 1990's. This proposal encompasses the work required to build and demonstrate this capability.

  17. The Open Science Grid - Support for Multi-Disciplinary Team Science - the Adolescent Years

    NASA Astrophysics Data System (ADS)

    Bauerdick, Lothar; Ernst, Michael; Fraser, Dan; Livny, Miron; Pordes, Ruth; Sehgal, Chander; Würthwein, Frank; Open Science Grid

    2012-12-01

    As it enters adolescence the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools and deep experience. We discuss examples of these, including: the pilot-job overlay concepts and technologies now in use throughout OSG and delivering 1.4 million CPU hours/day; the role of campus infrastructures, built out from concepts of sharing across multiple local faculty clusters (already put to good use by many of the HEP Tier-2 sites in the US); the work towards the use of clouds and access to high throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communities with general tools derived from the experience with the parochial tools of HEP (integration of Globus Online, prototyping with iRODS, investigations into Wide Area Lustre). We also review our activities and experiences as an HTC Service Provider to the recently awarded NSF XD XSEDE project, the evolution of the US NSF TeraGrid project, and how we are extending the reach of HTC through this activity to the increasingly broad national cyberinfrastructure. We believe that a coordinated view of the HPC and HTC resources in the US will further expand their impact on scientific discovery.

  18. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies indicating that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will take the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  19. Experimental Demonstration of a Self-organized Architecture for Emerging Grid Computing Applications on OBS Testbed

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Hong, Xiaobin; Wu, Jian; Lin, Jintong

    As Grid computing continues to gain popularity in industry and the research community, it also attracts more attention at the customer level. The large number of users and high frequency of job requests in the consumer market make this challenging: current Client/Server (C/S)-based architectures will become infeasible for supporting large-scale Grid applications due to their poor scalability and fault tolerance. In this paper, based on our previous works [1, 2], a novel self-organized architecture is proposed to realize a highly scalable and flexible platform for Grids. Experimental results show that this architecture is suitable and efficient for consumer-oriented Grids.

  20. Experimenter's Laboratory for Visualized Interactive Science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Rodier, Daniel R.; Klemp, Marjorie K.

    1994-01-01

    ELVIS (Experimenter's Laboratory for Visualized Interactive Science) is an interactive visualization environment that enables scientists, students, and educators to visualize and analyze large, complex, and diverse sets of scientific data. It accomplishes this by presenting the data sets as 2-D, 3-D, color, stereo, and graphic images with movable and multiple light sources combined with displays of solid-surface, contours, wire-frame, and transparency. By simultaneously rendering diverse data sets acquired from multiple sources, formats, and resolutions and by interacting with the data through an intuitive, direct-manipulation interface, ELVIS provides an interactive and responsive environment for exploratory data analysis.

  1. AMP: a science-driven web-based application for the TeraGrid

    NASA Astrophysics Data System (ADS)

    Woitaszek, M.; Metcalfe, T.; Shorrock, I.

    The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.

  2. Analysis of the Current Use, Benefit, and Value of the Open Science Grid

    SciTech Connect

    Pordes, R.; /Fermilab

    2009-04-01

    The Open Science Grid usage has ramped up more than 25% in the past twelve months, due both to increased throughput by the core stakeholders (US LHC, LIGO, and Run II) and to increased usage by non-physics communities. It is important to understand the value that collaborative projects such as the OSG contribute to the scientific community, cognizant of the environment of commercial cloud offerings, the evolving and maturing middleware for grid-based distributed computing, and the growing dependence of science and research on computation. We present a first categorization of OSG value and an analysis across several different aspects of the Consortium's goals and activities. Lastly, we present some of the upcoming challenges of the LHC data analysis ramp-up and our ongoing contributions to the Worldwide LHC Computing Grid.

  3. Analysis of the current use, benefit, and value of the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Pordes, R.; Open Science Grid Executive Board, the; Weichel, J.

    2010-04-01

    The Open Science Grid usage has ramped up more than 25% in the past twelve months, due both to increased throughput by the core stakeholders (US LHC, LIGO, and Run II) and to increased usage by non-physics communities. It is important to understand the value that collaborative projects such as the OSG contribute to the scientific community, cognizant of the environment of commercial cloud offerings, the evolving and maturing middleware for grid-based distributed computing, and the growing dependence of science and research on computation. We present a first categorization of OSG value and an analysis across several different aspects of the Consortium's goals and activities. Lastly, we present some of the upcoming challenges of the LHC data analysis ramp-up and our ongoing contributions to the Worldwide LHC Computing Grid.

  4. The Grid Analysis and Display System (GRADS): A practical tool for Earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III

    1992-01-01

    We propose to develop and enhance a workstation based grid analysis and display software system for Earth science data set browsing, sampling and manipulation. The system will be coupled to a super computer in a distributed computing environment for near real-time interaction between scientists and computational results.

  5. The Grid Analysis and Display System (GRADS): A practical tool for Earth science visualization

    NASA Technical Reports Server (NTRS)

    Kinter, James L., III

    1993-01-01

    We propose to develop and enhance a workstation based grid analysis and display software system for Earth science dataset browsing, sampling and manipulation. The system will be coupled to a supercomputer in a distributed computing environment for near real-time interaction between scientists and computational results.

  6. Remote Job Testing for the Neutron Science TeraGrid Gateway

    SciTech Connect

    Lynch, Vickie E; Cobb, John W; Miller, Stephen D; Reuter, Michael A; Smith, Bradford C

    2009-01-01

    Remote job execution gives neutron science facilities access to high performance computing such as the TeraGrid. A scientific community can use community software with a community certificate and account through a common interface of a portal. Results show this approach is successful, but with more testing and problem solving, we expect remote job executions to become more reliable.

  7. Experimental Evaluation of Electric Power Grid Visualization Tools in the EIOC

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin; Dalton, Angela C.

    2009-12-01

    The present study follows an initial human factors evaluation of four electric power grid visualization tools and reports on an empirical evaluation of two of the four tools: Graphical Contingency Analysis, and Phasor State Estimator. The evaluation was conducted within specific experimental studies designed to measure the impact on decision making performance.

  8. The GENIUS Grid Portal and robot certificates: a new tool for e-Science

    PubMed Central

    Barbera, Roberto; Donvito, Giacinto; Falzone, Alberto; La Rocca, Giuseppe; Milanesi, Luciano; Maggi, Giorgio Pietro; Vicario, Saverio

    2009-01-01

    Background Grid technology is the computing model that allows users to share a wide plethora of distributed computational resources regardless of their geographical location. Until now, the strict security policies required to access distributed computing resources have been a rather big limiting factor in broadening the usage of Grids to a wide community of users. Grid security is indeed based on the Public Key Infrastructure (PKI) of X.509 certificates, and the procedure to obtain and manage those certificates is unfortunately not straightforward. A first step towards making Grids more appealing to new users has recently been achieved with the adoption of robot certificates. Methods Robot certificates have recently been introduced to perform automated tasks on Grids on behalf of users. They are extremely useful, for instance, to automate grid service monitoring, data processing production, and distributed data collection systems. Basically, these certificates can be used to identify a person responsible for an unattended service or process acting as client and/or server. Robot certificates can be installed on a smart card and used behind a portal by everyone interested in running the related applications in a Grid environment through a user-friendly graphical interface. In this work, the GENIUS Grid Portal, powered by EnginFrame, has been extended to support the new authentication based on the adoption of these robot certificates. Results The work carried out and reported in this manuscript is particularly relevant for all users who are not familiar with personal digital certificates and the technical aspects of the Grid Security Infrastructure (GSI). The valuable benefits introduced by robot certificates in e-Science can thus be extended to users belonging to several scientific domains, providing an asset in raising Grid awareness among a wide number of potential users. Conclusion The adoption of Grid portals extended with robot certificates, can really

  9. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    SciTech Connect

    Livny, Miron; Shank, James; Ernst, Michael; Blackburn, Kent; Goasguen, Sebastien; Tuts, Michael; Gibbons, Lawrence; Pordes, Ruth; Sliz, Piotr; Deelman, Ewa; Barnett, William; Olson, Doug; McGee, John; Cowles, Robert; Wuerthwein, Frank; Gardner, Robert; Avery, Paul; Wang, Shaowen; Lincoln, David Swanson

    2015-02-11

    Under this SciDAC-2 grant the project's goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and the European Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  10. Fully Automated Single-Zone Elliptic Grid Generation for Mars Science Laboratory (MSL) Aeroshell and Canopy Geometries

    NASA Technical Reports Server (NTRS)

    Kaul, Upender K.

    2008-01-01

    A procedure for generating smooth uniformly clustered single-zone grids using enhanced elliptic grid generation has been demonstrated here for the Mars Science Laboratory (MSL) geometries such as aeroshell and canopy. The procedure obviates the need for generating multizone grids for such geometries, as reported in the literature. This has been possible because the enhanced elliptic grid generator automatically generates clustered grids without manual prescription of decay parameters needed with the conventional approach. In fact, these decay parameters are calculated as decay functions as part of the solution, and they are not constant over a given boundary. Since these decay functions vary over a given boundary, orthogonal grids near any arbitrary boundary can be clustered automatically without having to break up the boundaries and the corresponding interior domains into various zones for grid generation.
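The enhanced generator and its automatically computed decay functions are specific to the cited work, but the underlying elliptic idea can be sketched in its simplest form: solve a Laplace system for the interior grid-point coordinates with boundary points held fixed. This minimal sketch omits the clustering control terms entirely:

```python
import numpy as np

def elliptic_smooth(x, y, iters=500):
    """Jacobi iteration on the Laplace equations for x(xi, eta), y(xi, eta).

    Boundary nodes stay fixed; each interior node relaxes toward the average
    of its four neighbours, yielding a smooth (uncontrolled) elliptic grid.
    """
    x, y = x.copy(), y.copy()
    for _ in range(iters):
        x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] +
                                x[1:-1, 2:] + x[1:-1, :-2])
        y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] +
                                y[1:-1, 2:] + y[1:-1, :-2])
    return x, y
```

On a square domain this relaxation recovers a uniform grid; boundary-orthogonal clustering is precisely what the control (decay) functions of the cited approach add on top of such a Laplace system.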

  11. GENESIS SciFlo: Enabling Multi-Instrument Atmospheric Science Using Grid Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Tang, B.; Manipon, G.; Yunck, T.; Fetzer, E.; Braverman, A.; Dobinson, E.

    2004-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of web services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations will include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we are developing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and the Grid Computing standards (Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable web services and executable operators into a distributed computing flow (operator tree). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out

  12. Space science experimentation automation and support

    NASA Technical Reports Server (NTRS)

    Frainier, Richard J.; Groleau, Nicolas; Shapiro, Jeff C.

    1994-01-01

    This paper outlines recent work done at the NASA Ames Artificial Intelligence Research Laboratory on automation and support of science experiments on the US Space Shuttle in low earth orbit. Three approaches to increasing the science return of these experiments using emerging automation technologies are described: remote control (telescience), science advisors for astronaut operators, and fully autonomous experiments. The capabilities and limitations of these approaches are reviewed.

  13. Ultrasonic Technique for Experimental Investigation of Statistical Characteristics of Grid Generated Turbulence.

    NASA Astrophysics Data System (ADS)

    Andreeva, Tatiana; Durgin, William

    2001-11-01

    This paper focuses on ultrasonic measurements of a grid-generated turbulent flow using the travel-time technique. In the present work, an attempt is made to describe a turbulent flow by means of the statistics of ultrasound wave propagation time, in combination with the Kolmogorov (2/3)-power law. This work has two objectives. The first is to demonstrate the application of the travel-time ultrasonic technique to data acquisition in grid-generated turbulence produced in a wind tunnel. The second is to use the experimental data to verify or refute the analytically obtained expression for travel-time dispersion as a function of velocity fluctuation metrics; the theoretical analysis and derivation of that formula are based on Kolmogorov theory. A series of experiments was conducted at different wind speeds and distances from the grid, giving rise to different values of the dimensional turbulence characteristic coefficient K. Theoretical analysis based on the experimental data reveals a strong dependence of the turbulence characteristic K on the mean wind velocity. Tabulated values of the turbulence characteristic coefficient may be used for further understanding of the effect of turbulence on sound propagation.
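The paper's travel-time formula is not reproduced in this record, but the Kolmogorov two-thirds law it builds on states that, for separations r within the inertial subrange, the second-order longitudinal velocity structure function scales as

```latex
D_{LL}(r) \;=\; \left\langle \left[\, u_L(\mathbf{x}+\mathbf{r}) - u_L(\mathbf{x}) \,\right]^2 \right\rangle \;=\; C_2 \,(\varepsilon r)^{2/3},
```

where \(\varepsilon\) is the mean dissipation rate and \(C_2\) an empirical constant of order unity. Presumably the dimensional coefficient K above absorbs the \(\varepsilon^{2/3}\) factor, which would be consistent with its observed dependence on mean wind velocity.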

  14. Grid Technology as a Cyberinfrastructure for Delivering High-End Services to the Earth and Space Science Community

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    Grid technology consists of middleware that permits distributed computations, data and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and can then utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means of implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Service Description Language, which provides a description of the interface and how other applications can access it. The emerging Semantic Grid work seeks to associate sufficient semantic information with each grid service that applications will be able to automatically select, compose and, if necessary, substitute available equivalent services in order to assemble collections of services most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform applications that have community-wide applicability into persistent grid services made widely available to their respective communities. In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid

  15. ReSS: A Resource Selection Service for the Open Science Grid

    SciTech Connect

    Garzoglio, Gabriele; Levshina, Tanya; Mhashilkar, Parag; Timm, Steve; /Fermilab

    2008-01-01

    The Open Science Grid offers access to hundreds of computing and storage resources via standard Grid interfaces. Before the deployment of an automated resource selection system, users had to submit jobs directly to these resources: they would manually select a resource and specify all relevant attributes in the job description prior to submitting the job. The necessity of human intervention in resource selection and attribute specification hinders automated job management components from accessing OSG resources and is inconvenient for users. The Resource Selection Service (ReSS) project addresses these shortcomings. The system integrates Condor technology, for the core matchmaking service, with the gLite CEMon component, for gathering and publishing resource information in the Glue Schema format. Each of these components communicates over secure protocols via web services interfaces. The system is currently used in production on OSG by the DZero Experiment, the Engagement Virtual Organization, and the Dark Energy. It is also the resource selection service for the Fermilab Campus Grid, FermiGrid. ReSS is considered a lightweight solution to push-based workload management. This paper describes the architecture, performance, and typical usage of the system.
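ReSS itself relies on Condor's ClassAd matchmaker over Glue Schema attributes; as a language-neutral illustration (attribute names simplified, not the actual Glue Schema), the core match-and-rank step amounts to:

```python
def select_resource(job_requirements, resources):
    """Match a job against advertised resources and rank the matches.

    `resources` maps site name -> attribute dict (a simplified stand-in
    for the resource ClassAds that ReSS publishes); `job_requirements`
    is a predicate over those attributes, playing the role of a ClassAd
    Requirements expression.
    """
    matches = {name: attrs for name, attrs in resources.items()
               if job_requirements(attrs)}
    if not matches:
        return None
    # Rank: prefer the site with the most free CPUs (illustrative Rank).
    return max(matches, key=lambda name: matches[name]["free_cpus"])

sites = {
    "ce1.example.org": {"free_cpus": 40, "os": "linux"},
    "ce2.example.org": {"free_cpus": 5,  "os": "linux"},
}
best = select_resource(lambda a: a["os"] == "linux" and a["free_cpus"] > 0,
                       sites)
```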

  16. Thermoplastic Composites Reinforced with Textile Grids: Development of a Manufacturing Chain and Experimental Characterisation

    NASA Astrophysics Data System (ADS)

    Böhm, R.; Hufnagl, E.; Kupfer, R.; Engler, T.; Hausding, J.; Cherif, C.; Hufenbach, W.

    2013-12-01

    A significant improvement in the properties of plastic components can be achieved by introducing flexible multiaxial textile grids as reinforcement. This reinforcing concept is based on the layerwise bonding of biaxially or multiaxially oriented, completely stretched filaments of high-performance fibers (e.g. glass or carbon) with thermoplastic components, using modified warp knitting techniques. Such pre-consolidated grid-like textiles are particularly suitable for use in injection moulding, since the grid geometry is very robust with respect to flow pressure and temperature on the one hand and possesses an adjustable spacing to enable complete filling of the mould cavity on the other. The development of pre-consolidated textile grids and their further processing into composites form the basis for providing tailored parts with a large number of additional integrated functions, such as fibrous sensors or electroconductive fibres. Composites reinforced in this way allow new product groups for promising lightweight structures to be opened up in the future. The article describes the manufacturing process of this new composite class and its variability regarding reinforcement and function integration. An experimentally based study of the mechanical properties is performed: quasi-static and highly dynamic tensile tests have been carried out, as well as impact penetration experiments. The reinforcing potential of the multiaxial grids is demonstrated by evaluating drop-tower experiments on automotive components. It has been shown that the load-adapted reinforcement enables a significant local or global improvement of the properties of plastic components, depending on industrial requirements.

  17. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results

    PubMed Central

    Humada, Ali M.; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M.; Ahmed, Mushtaq N.

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized on a specific test bed. A mathematical model of a small-scale PV system, intended mainly for residential usage, has been developed, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent, IL; the reverse diode saturation current, Io; and the diode ideality factor, n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems under fluctuating climatic conditions. PMID:27035575
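A three-parameter model of this kind is conventionally the single-diode equation with series and shunt resistance neglected; a minimal sketch with illustrative parameter values (not the paper's fitted ones):

```python
import math

def pv_current(v, il, io, n, cells=36, vt=0.02585):
    """Three-parameter single-diode model: I = IL - Io*(exp(V/(n*Ns*Vt)) - 1).

    il: photocurrent [A]; io: reverse saturation current [A];
    n: diode ideality factor; cells: series cells Ns (assumed module size);
    vt: thermal voltage [V] at ~25 degC.
    """
    return il - io * math.expm1(v / (n * cells * vt))

# Illustrative I-V sweep for a small residential module (assumed values).
points = [(v / 10, pv_current(v / 10, il=8.0, io=1e-9, n=1.3))
          for v in range(0, 230, 10)]
```

At V = 0 the model returns the short-circuit current I = IL, and the current falls off exponentially as the voltage approaches open circuit, tracing the familiar I-V characteristic.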

  19. The Montage architecture for grid-enabled science processing of large, distributed datasets

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui

    2004-01-01

    Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive and are stored in distributed archives that are, in most cases, remote from the available computational resources. Therefore, state-of-the-art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.

  20. Experimental optimization of the FireFly 600 photovoltaic off-grid system.

    SciTech Connect

    Boyson, William Earl; Orozco, Ron; Ralph, Mark E.; Brown, Marlene Laura; King, David L.; Hund, Thomas D.

    2003-10-01

    A comprehensive evaluation and experimental optimization of the FireFly{trademark} 600 off-grid photovoltaic system manufactured by Energia Total, Ltd. was conducted at Sandia National Laboratories in May and June of 2001. This evaluation was conducted at the request of the manufacturer and addressed performance of individual system components, overall system functionality and performance, safety concerns, and compliance with applicable codes and standards. A primary goal of the effort was to identify areas for improvement in performance, reliability, and safety. New system test procedures were developed during the effort.

  1. Experimental Observation of a Periodically Oscillating Plasma Sphere in a Gridded Inertial Electrostatic Confinement Device

    SciTech Connect

    Park, J.; Nebel, R.A.; Stange, S.; Murali, S. Krupakar

    2005-07-01

    The periodically oscillating plasma sphere (POPS) [D. C. Barnes and R. A. Nebel, Phys. Plasmas 5, 2498 (1998).] oscillation has been observed in a gridded inertial electrostatic confinement device. In these experiments, ions in the virtual cathode exhibit resonant behavior when driven at the POPS frequency. Excellent agreement between the observed POPS resonance frequency and theoretical predictions has been observed for a wide range of potential well depths and for three different ion species. The results provide the first experimental validation of the POPS concept proposed by Barnes and Nebel [R. A. Nebel and D. C. Barnes, Fusion Technol. 34, 28 (1998).].

  2. Moving off the grid in an experimental, compressively sampled photonic link.

    PubMed

    Nichols, J M; McLaughlin, C V; Bucholtz, F

    2015-07-13

    Perhaps the largest obstacle to practical compressive sampling is the inability to describe the data one seeks to recover accurately and sparsely when signal model parameters are poorly chosen. In such cases the recovery process yields artifacts or, in many cases, fails completely. This work represents the first demonstration of a solution to this so-called "off-grid" problem in an experimental, compressively sampled system. Specifically, we show that an Alternating Convex Search algorithm is able to significantly reduce these data model errors in harmonic signal recovery. PMID:26191864
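The alternating idea can be sketched for a single harmonic: alternate a least-squares amplitude fit with a local refinement of the frequency away from the DFT grid. This is an illustrative reconstruction of the general approach, not the authors' algorithm; the function name and the shrinking-window refinement schedule are invented.

```python
import numpy as np

def recover_tone(x, t, f_init, n_iter=30):
    """Alternate (1) least-squares amplitude for a fixed frequency with
    (2) a shrinking local grid search refining the frequency."""
    f = f_init
    amp = 0.0
    for k in range(n_iter):
        atom = np.exp(2j * np.pi * f * t)
        amp = np.vdot(atom, x) / np.vdot(atom, atom)   # LS amplitude
        # Refine f on an 11-point window that shrinks each iteration.
        cands = f + np.linspace(-0.5, 0.5, 11) / (len(t) * (k + 1))
        errs = [np.linalg.norm(x - amp * np.exp(2j * np.pi * fc * t))
                for fc in cands]
        f = cands[int(np.argmin(errs))]
    return f, amp

# A tone deliberately placed *between* DFT bins ("off the grid").
rng = np.random.default_rng(0)
n = 256
t = np.arange(n)
f_true = 0.1234567                               # cycles/sample, not k/n
x = 0.8 * np.exp(2j * np.pi * f_true * t) + 0.01 * rng.standard_normal(n)
f_grid = np.argmax(np.abs(np.fft.fft(x))) / n    # best on-grid estimate
f_hat, a_hat = recover_tone(x, t, f_grid)
```

Here the on-grid FFT estimate is stuck at the nearest bin, while the alternating refinement recovers the true frequency to well within a bin width.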

  3. Materials Science and Materials Chemistry for Large Scale Electrochemical Energy Storage: From Transportation to Electrical Grid

    SciTech Connect

    Liu, Jun; Zhang, Jiguang; Yang, Zhenguo; Lemmon, John P.; Imhoff, Carl H.; Graff, Gordon L.; Li, Liyu; Hu, Jian Z.; Wang, Chong M.; Xiao, Jie; Xia, Guanguang; Viswanathan, Vilayanur V.; Baskaran, Suresh; Sprenkle, Vincent L.; Li, Xiaolin; Shao, Yuyan; Schwenzer, Birgit

    2013-02-15

    Large-scale electrical energy storage has become more important than ever for reducing fossil energy consumption in transportation and for the widespread deployment of intermittent renewable energy on the electric grid. However, significant challenges remain for its application. Here, the status and challenges are reviewed from the perspective of materials science and materials chemistry in electrochemical energy storage technologies, such as Li-ion batteries, sodium (sulfur and metal halide) batteries, Pb-acid batteries, redox flow batteries, and supercapacitors. Perspectives and approaches are introduced for emerging battery designs and new chemistry combinations to reduce the cost of energy storage devices.

  4. Experimental control requirements for life sciences

    NASA Technical Reports Server (NTRS)

    Berry, W. E.; Sharp, J. C.

    1978-01-01

    The Life Sciences dedicated Spacelab will enable scientists to test hypotheses in various disciplines. Building upon experience gained in mission simulations, orbital flight test experiments, and the first three Spacelab missions, NASA will be able to progressively develop the engineering and management capabilities necessary for the first Life Sciences Spacelab. Development of experiments for these missions will require implementation of life-support systems not previously flown in space. Plant growth chambers, animal holding facilities, aquatic specimen life-support systems, and centrifuge-mounted specimen holding units are examples of systems currently being designed and fabricated for flight.

  5. Visual monitoring of autonomous life sciences experimentation

    NASA Technical Reports Server (NTRS)

    Blank, G. E.; Martin, W. N.

    1987-01-01

    The design and implementation of a computerized visual monitoring system to aid in the monitoring and control of life sciences experiments on board a space station were investigated. A likely multiprocessor design was chosen, a plausible life science experiment to work with was defined, the theoretical issues involved in programming a visual monitoring system for the experiment on the multiprocessor were considered, a system for monitoring the experiment was designed, and simulations of such a system were implemented on a network of Apollo workstations.

  6. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data

  7. RAON experimental facilities for nuclear science

    SciTech Connect

    Kwon, Y. K.; Kim, Y. K.; Komatsubara, T.; Moon, J. Y.; Park, J. S.; Shin, T. S.; Kim, Y. J.

    2014-05-02

    The Rare Isotope Science Project (RISP) was established in December 2011 and has put considerable effort into the design and construction of the accelerator complex facility named “RAON”. RAON is a rare isotope (RI) beam facility that aims to provide various RI beams of proton- and neutron-rich nuclei, as well as a variety of stable ion beams over a wide range of energies up to a few hundred MeV/nucleon, for research in basic science and applications. Proposed research programs for nuclear physics and nuclear astrophysics at RAON include studies of the properties of exotic nuclei, the equation of state of nuclear matter, the origin of the universe, the process of nucleosynthesis, superheavy elements, etc. Various high-performance magnetic spectrometers for nuclear science have been designed: KOBRA (KOrea Broad acceptance Recoil spectrometer and Apparatus), LAMPS (Large Acceptance Multi-Purpose Spectrometer), and ZDS (Zero Degree Spectrometer). The status of these spectrometers for nuclear science is presented, along with a brief report on RAON.

  8. Environmental Science, Grade 9. Experimental Curriculum Bulletin.

    ERIC Educational Resources Information Center

    Bernstein, Leonard, Ed.

    This is the teacher's guide for the required, interdisciplinary, ninth-year environmental science course for the New York City Schools. One hundred twenty lesson plans, divided into nine units, are presented. Areas of study include the living and non-living environment, ecosystems, population, urban ecology, energy and technology, pollution, and…

  9. Geomorphology, Science (Experimental): 5343.09.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    Performance objectives are stated for this secondary school instructional unit concerned with aspects of earth science with emphases on the internal and external forces that bring about changes in the earth's crust. Lists of films and state-adopted and other texts are presented. Included are a course outline summarizing the unit content; numerous…

  10. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetration of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus, even as the cost of solar PV decreases, the cost of integrating solar power will increase as the penetration of solar resources on the electric grid grows. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices, and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive, and markets are still being designed to leverage their full potential and mitigate their limitations (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output), thereby giving the grid advance warning to schedule ancillary generation more accurately, or to curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by grid operators, we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy, and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first area requires atmospheric science and engineering research, while the second requires detailed knowledge of energy markets and power engineering. Motivated by this background, we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting, especially in two areas: (a) numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market; and (b) development of a sky imager to provide short term

  11. Nuclear Test-Experimental Science: Annual report, fiscal year 1988

    SciTech Connect

    Struble, G.L.; Donohue, M.L.; Bucciarelli, G.; Hymer, J.D.; Kirvel, R.D.; Middleton, C.; Prono, J.; Reid, S.; Strack, B.

    1988-01-01

    Fiscal year 1988 has been a significant, rewarding, and exciting period for Lawrence Livermore National Laboratory's nuclear testing program. It was significant in that the Laboratory's new director chose to focus strongly on the program's activities and to commit to a revitalized emphasis on testing and the experimental science that underlies it. It was rewarding in that revolutionary new measurement techniques were fielded on recent important and highly complicated underground nuclear tests with truly incredible results. And it was exciting in that the sophisticated and fundamental problems of weapons science that are now being addressed experimentally are yielding new challenges and understanding in ways that stimulate and reward the brightest and best of scientists. During FY88 the program was reorganized to emphasize our commitment to experimental science. The name of the program was changed to reflect this commitment, becoming the Nuclear Test-Experimental Science (NTES) Program.

  12. Grid-enabled measures: using Science 2.0 to standardize measures and share data.

    PubMed

    Moser, Richard P; Hesse, Bradford W; Shaikh, Abdul R; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry Y; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-05-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment--a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute (NCI) with two overarching goals: (1) promote the use of standardized measures, which are tied to theoretically based constructs; and (2) facilitate the ability to share harmonized data resulting from the use of standardized measures. The first is accomplished by creating an online venue where a virtual community of researchers can collaborate and come to consensus on measures by rating, commenting on, and viewing meta-data about the measures and associated constructs. The second is accomplished by connecting the constructs and measures to an ontological framework with data standards and common data elements such as the NCI Enterprise Vocabulary System (EVS) and the cancer Data Standards Repository (caDSR). This paper describes the Web 2.0 principles on which the GEM database is based, describes its functionality, and discusses some of the important issues involved in creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing. PMID:21521586

  13. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory, and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure, so that stakeholders can control their own resources and fair use of those resources is ensured. The collaborative control room is being developed using the open-source Access Grid software, which enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP, which included tools for run preparation, submission, monitoring, and management. This approach saves user sites the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  14. Space materials science experimental facilities in China

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Jin, Weiqing

    Three typical facilities for materials science research under microgravity in China are introduced in this paper. The multi-task materials processing facility was developed for crystal growth and alloy solidification onboard Chinese Shenzhou spacecraft, and more than ten types of different materials have been processed successfully in space. The in-situ observation facility was designed for mechanism research on oxide single crystals in space, and it has been carried into space onboard both a Chinese recoverable satellite and a Shenzhou spacecraft. The comprehensive materials processing facility has recently been developed for use onboard the future spacelab in China's manned spaceflight project. Both the achievements and the recent progress of materials research hardware in China are also summarized.

  15. Modular experimental platform for science and applications

    NASA Technical Reports Server (NTRS)

    Hill, A. S.

    1984-01-01

    A modularized, standardized spacecraft bus, known as MESA, suitable for a variety of science and applications missions is discussed. The basic bus consists of a simple structural arrangement housing attitude control, telemetry/command, electrical power, propulsion and thermal control subsystems. The general arrangement allows extensive subsystem adaptation to mission needs. Kits provide for the addition of tape recorders, increased power levels and propulsion growth. Both 3-axis and spin stabilized flight proven attitude control subsystems are available. The MESA bus can be launched on Ariane, as a secondary payload for low cost, or on the STS with a PAM-D or other suitable upper stage. Multi-spacecraft launches are possible with either booster. Launch vehicle integration is simple and cost-effective. The low cost of the MESA bus is achieved by the extensive utilization of existing subsystem design concepts and equipment, and efficient program management and test integration techniques.

  16. Experimental investigation of the dynamics of a vibrating grid in superfluid 4He over a range of temperatures and pressures.

    PubMed

    Charalambous, D; Skrbek, L; Hendry, P C; McClintock, P V E; Vinen, W F

    2006-09-01

    In an earlier paper [Nichol, Phys. Rev. E, 70, 056307 (2004)] some of the present authors presented the results of an experimental study of the dynamics of a stretched grid driven into vibration at or near its resonant frequency in isotopically pure superfluid 4He over a range of pressures at a very low temperature, where the density of normal fluid is negligible. In this paper we present the results of a similar study, based on a different grid, but now including the temperature range where the normal fluid density is no longer insignificant. The new grid is very similar to the old one except for a small difference in the character of its surface roughness. In many respects the results at low temperature are similar to those for the old grid. At low amplitudes the results are somewhat history dependent, but in essence there is no damping greater than that in vacuo. At a critical amplitude corresponding to a velocity of about 50 mm s(-1) there is a sudden and large increase in damping, which can be attributed to the generation of new vortex lines. Strange shifts in the resonant frequency at intermediate amplitudes observed with the old grid are no longer seen; they must therefore have been associated with the different surface roughness, or perhaps were due simply to some artifact of the old grid, the details of which we are currently unable to determine. With the new grid we have studied both the damping at low amplitudes due to excitations of the normal fluid and the dependence of the supercritical damping on temperature. We present evidence that in helium at low amplitudes there may be some enhancement in the effective mass of the grid in addition to that associated with potential flow of the helium. In some circumstances small satellite resonances are seen near the main fundamental grid resonance, which are attributed to coupling to some other oscillatory system within the experimental cell. PMID:17025743
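As a generic illustration of how damping appears in resonance measurements like these (not the paper's analysis), the vibrating grid can be idealized as a driven, damped harmonic oscillator whose resonance linewidth encodes the drag. All numbers below are invented for the sketch.

```python
import numpy as np

def response_amplitude(f_drive, f0, gamma, force=1.0, mass=1.0):
    """Steady-state amplitude of a driven, damped harmonic oscillator
    with natural frequency f0 (Hz) and damping rate gamma (rad/s)."""
    w, w0 = 2 * np.pi * f_drive, 2 * np.pi * f0
    return (force / mass) / np.sqrt((w0**2 - w**2)**2 + (gamma * w)**2)

# Sweep the drive frequency across an assumed 1 kHz resonance and read
# off the peak position and the half-power linewidth (~ gamma / 2*pi).
f = np.linspace(900.0, 1100.0, 2001)       # drive frequencies, Hz
amp = response_amplitude(f, f0=1000.0, gamma=200.0)
f_peak = f[np.argmax(amp)]                 # near f0 for light damping
half = amp >= amp.max() / np.sqrt(2)       # half-power band
fwhm = f[half][-1] - f[half][0]            # linewidth in Hz
```

In an experiment like the one described, increased vortex-line damping would show up as a broader, lower resonance peak in exactly this kind of sweep.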

  17. Surface Modeling and Grid Generation of Orbital Sciences X34 Vehicle. Phase 1

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    The surface modeling and grid generation requirements, motivations, and methods used to develop Computational Fluid Dynamic volume grids for the X34-Phase 1 are presented. The requirements set forth by the Aerothermodynamics Branch at the NASA Langley Research Center serve as the basis for the final techniques used in the construction of all volume grids, including grids for parametric studies of the X34. The Integrated Computer Engineering and Manufacturing code for Computational Fluid Dynamics (ICEM/CFD), the Grid Generation code (GRIDGEN), the Three-Dimensional Multi-block Advanced Grid Generation System (3DMAGGS) code, and Volume Grid Manipulator (VGM) code are used to enable the necessary surface modeling, surface grid generation, volume grid generation, and grid alterations, respectively. All volume grids generated for the X34, as outlined in this paper, were used for CFD simulations within the Aerothermodynamics Branch.

  18. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  20. The Structure of Scientific Arguments by Secondary Science Teachers: Comparison of Experimental and Historical Science Topics

    ERIC Educational Resources Information Center

    Gray, Ron; Kang, Nam-Hwa

    2014-01-01

    Just as scientific knowledge is constructed using distinct modes of inquiry (e.g. experimental or historical), arguments constructed during science instruction may vary depending on the mode of inquiry underlying the topic. The purpose of this study was to examine whether and how secondary science teachers construct scientific arguments during…

  1. Environmental Science. An Experimental Programme for Primary Teachers.

    ERIC Educational Resources Information Center

    Linke, R. D.

    An experimental course covering some of the fundamental principles and terminology associated with environmental science and the application of these principles to various contemporary problems is summarized in this report. The course involved a series of lectures together with a program of specific seminar and discussion topics presented by the…

  2. An Illustration of the Experimenter Expectancy Effect in School Science

    ERIC Educational Resources Information Center

    Allen, Michael; Briten, Elizabeth

    2012-01-01

    Two groups of year 6 pupils (age 10-11 years) each experienced science practical lessons that were essentially identical but for one difference: one group (theory-led) were told by the teacher what result they should expect, and the other group (hypothetico-deductive) were not. The theory-led group demonstrated experimental bias, recording results…

  3. The NASA/GSFC Advanced Data Grid: A Prototype for Future Earth Science Ground System Architectures

    NASA Technical Reports Server (NTRS)

    Gasster, Samuel D.; Lee, Craig; Davis, Brooks; Clark, Matt; AuYeung, Mike; Wilson, John R.; Ladwig, Debra M.

    2003-01-01

    Contents include the following: background and motivation; grid computing concepts; advanced data grid (ADG) prototype development; ADG requirements and operations concept; ADG architecture; ADG implementation; ADG test plan; ADG schedule; and summary and status.

  4. The Virtual Kidney: an eScience interface and Grid portal.

    PubMed

    Harris, Peter J; Buyya, Rajkumar; Chu, Xingchen; Kobialka, Tom; Kazmierczak, Ed; Moss, Robert; Appelbe, William; Hunter, Peter J; Thomas, S Randall

    2009-06-13

    The Virtual Kidney uses a web interface and distributed computing to provide experimental scientists and analysts with access to computational simulations and knowledge databases hosted in geographically separated laboratories. Users can explore a variety of complex models without requiring the specific programming environment in which applications have been developed. This initiative exploits high-bandwidth communication networks for collaborative research and for shared access to knowledge resources. The Virtual Kidney has been developed within a specialist community of renal scientists but is transferable to other areas of research requiring interaction between published literature and databases, theoretical models and simulations and the formulation of effective experimental designs. A web-based three-dimensional interface provides access to experimental data, a parameter database and mathematical models. A multi-scale kidney reconstruction includes blood vessels and serially sectioned nephrons. Selection of structures provides links to the database, returning parameter values and extracts from the literature. Models are run locally or remotely with a Grid resource broker managing scheduling, monitoring and visualization of simulation results and application, credential and resource allocation. Simulation results are viewed graphically or as scaled colour gradients on the Virtual Kidney structures, allowing visual and quantitative appreciation of the effects of simulated parameter changes. PMID:19414450

  5. Practical use of a framework for network science experimentation

    NASA Astrophysics Data System (ADS)

    Toth, Andrew; Bergamaschi, Flavio

    2014-06-01

    In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, called the International Technology Alliance (ITA) in Network and Information Sciences, to address fundamental issues in network and information sciences that will enhance decision making for coalition operations, enable the rapid, secure formation of ad hoc teams in coalition environments, and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through collaboration between ARL and IBM UK to characterize and define a software stack and tooling that has become the reference framework for network science experimentation in support of validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA), in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components, the Apollo Fact-Finder tool and the Medusa Crowd Sensing application, the limitations identified, and how they will be addressed in future work.

  6. SciFlo: Semantically-Enabled Grid Workflow for Collaborative Science

    NASA Astrophysics Data System (ADS)

    Yunck, T.; Wilson, B. D.; Raskin, R.; Manipon, G.

    2005-12-01

SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and Grid computing standards (WS-* standards and the Globus Alliance toolkits), and enables scientists to do multi-instrument Earth science by assembling reusable SOAP services, native executables, local command-line scripts, and Python code into a distributed computing flow (a graph of operators). SciFlo's XML dataflow documents can be a mixture of concrete operators (fully bound operations) and abstract template operators (late binding via semantic lookup). All data objects and operators can be both simply typed (simple and complex types in XML schema) and semantically typed using controlled vocabularies (linked to OWL ontologies such as SWEET). By exploiting ontology-enhanced search and inference, one can discover (and automatically invoke) Web Services and operators that have been semantically labeled as performing the desired transformation, and adapt a particular invocation to the proper interface (number, types, and meaning of inputs and outputs). The SciFlo client and server engines optimize the execution of such distributed dataflows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out an HTML form or directly authoring the underlying XML dataflow document, and results are returned directly to the scientist's desktop. A Visual Programming tool is also being developed, but it is not required. Once an analysis has been specified for a granule or day of data, it can easily be repeated with different control parameters and over months or years of data. SciFlo uses and preserves semantics, and also generates and infers new semantic annotations. 
Specifically, the SciFlo engine uses semantic metadata to
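The late-binding idea described in the abstract can be illustrated with a small sketch (not SciFlo's actual API): abstract operators carry a semantic label from a controlled vocabulary, and a registry resolves each label to a concrete operator at execution time. The vocabulary terms and operators below are invented for illustration.

```python
# Hypothetical registry mapping semantic labels (controlled vocabulary)
# to concrete operators. These names do not come from SciFlo or SWEET.
OPERATOR_REGISTRY = {
    "swe:UnitConversion.K_to_C": lambda grid: [v - 273.15 for v in grid],
    "swe:TemperatureRegridding": lambda grid: [round(v, 1) for v in grid],
}

def resolve(semantic_label):
    """Late binding: find a concrete operator for an abstract template."""
    try:
        return OPERATOR_REGISTRY[semantic_label]
    except KeyError:
        raise LookupError(f"no operator registered for {semantic_label!r}")

def run_flow(data, labels):
    """Execute a linear dataflow; each abstract step is bound at run time."""
    for label in labels:
        data = resolve(label)(data)
    return data

# Two abstract steps, bound and executed in sequence.
result = run_flow([300.15, 285.65],
                  ["swe:UnitConversion.K_to_C", "swe:TemperatureRegridding"])
```

In the real system the graph is an XML dataflow document and the lookup is ontology-backed; the dictionary here only stands in for that semantic discovery step.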

  7. Experimental Study of Two Phase Flow Behavior Past BWR Spacer Grids

    SciTech Connect

    Ratnayake, Ruwan K.; Hochreiter, L.E.; Ivanov, K.N.; Cimbala, J.M.

    2002-07-01

Performance of the best-estimate codes used in the nuclear industry can be significantly improved by reducing the empiricism embedded in their constitutive models. Spacer grids have been found to have an important impact on the maximum allowable critical heat flux within the fuel assembly of a nuclear reactor core. Therefore, incorporation of suitable spacer grid models can improve the critical heat flux prediction capability of best-estimate codes. Realistic modeling of the entrainment behavior of spacer grids requires understanding the different mechanisms involved. Since visual information pertaining to the entrainment behavior of spacer grids cannot be obtained from operating nuclear reactors, experiments have to be designed and conducted for this specific purpose. Most of the spacer grid experiments available in the literature have been designed to obtain quantitative data for developing or modifying empirical formulations for heat transfer, critical heat flux, or pressure drop. Very few experiments have been designed to provide fundamental information that can be used to understand spacer grid effects and the phenomena involved in two-phase flow. Air-water experiments were conducted to obtain visual information on the two-phase flow behavior both upstream and downstream of Boiling Water Reactor (BWR) spacer grids. The test section was designed and constructed using prototypic dimensions such as the channel cross-section, rod diameter, and other spacer grid configurations of a typical BWR fuel assembly. The test section models the flow behavior in two adjacent subchannels in the BWR core. A portion of a prototypic BWR spacer grid spanning two adjacent channels was used with industrial mild steel rods to represent the channel internals. Symmetry was preserved so that the channel walls could effectively be considered the channel boundaries. 
Thin films were established on the rod surfaces

  8. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-12-01

Modern long-baseline neutrino experiments like the NOvA experiment at Fermilab require large-scale, compute-intensive simulations of their neutrino beam fluxes and of the backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large-scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large-scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources.

  9. OASIS: a data and software distribution service for Open Science Grid

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; De Stefano, J.; Hover, J.; Quick, R.; Teige, S.

    2014-06-01

The Open Science Grid encourages the concept of software portability: a user's scientific application should be able to run at as many sites as possible. It is necessary to provide a mechanism for OSG Virtual Organizations to install software at sites. Since its initial release, the OSG Compute Element has provided an application software installation directory to Virtual Organizations, where they can create their own sub-directory, install software into that sub-directory, and have the directory shared on the worker nodes at that site. The current model has shortcomings with regard to permissions, policies, versioning, and the lack of a unified, collective procedure or toolset for deploying software across all sites. Therefore, a new mechanism for distributing data and software is desirable. The architecture for the OSG Application Software Installation Service (OASIS) is a server-client model: the software and data are installed only once in a single place, and are automatically distributed to all client sites simultaneously. Central file distribution offers other advantages, including server-side authentication and authorization, activity records, quota management, data validation and inspection, and well-defined versioning and deletion policies. The architecture, as well as a complete analysis of the current implementation, will be described in this paper.

  10. Systematic control of experimental inconsistency in combinatorial materials science.

    PubMed

    Sharma, Asish Kumar; Kulshreshtha, Chandramouli; Sohn, Keemin; Sohn, Kee-Sun

    2009-01-01

We developed a method to systematically control experimental inconsistency, which is one of the most troublesome and difficult problems in high-throughput combinatorial experiments. The topic of experimental inconsistency is never addressed, even though all scientists in the field of combinatorial materials science face this very serious problem. Experimental inconsistency and material property were selected as dual objective functions that were simultaneously optimized. Specifically, in an attempt to search for promising phosphors with high reproducibility, photoluminescence (PL) intensity was maximized and experimental inconsistency was minimized by employing a multiobjective evolutionary optimization-assisted combinatorial materials search (MOEO combinatorial materials search) strategy. A tetravalent manganese-doped alkali earth germanium/titanium oxide system was used as a model system to be screened using the MOEO combinatorial materials search. As a result of MOEO reiteration, we identified a halide-detached deep red phosphor with improved PL intensity and reliable reproducibility. PMID:19061418
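The dual-objective selection step described above can be sketched as Pareto filtering: keep the candidate compositions that are non-dominated when maximizing PL intensity and minimizing experimental inconsistency. This is a generic multiobjective-optimization illustration, not the paper's actual algorithm, and the candidate values are invented.

```python
# Invented candidate compositions with two objectives:
# maximize "pl" (PL intensity), minimize "inconsistency".
candidates = [
    {"name": "A", "pl": 0.90, "inconsistency": 0.30},
    {"name": "B", "pl": 0.85, "inconsistency": 0.05},
    {"name": "C", "pl": 0.70, "inconsistency": 0.20},  # dominated by B
    {"name": "D", "pl": 0.95, "inconsistency": 0.40},
]

def dominates(a, b):
    """a dominates b: no worse on both objectives, strictly better on one."""
    no_worse = a["pl"] >= b["pl"] and a["inconsistency"] <= b["inconsistency"]
    better = a["pl"] > b["pl"] or a["inconsistency"] < b["inconsistency"]
    return no_worse and better

# The Pareto front: candidates no other candidate dominates.
pareto_front = [c for c in candidates
                if not any(dominates(o, c) for o in candidates if o is not c)]
```

In an evolutionary loop such a front would seed the next generation of compositions to synthesize and screen.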

  11. ISOGA: Integrated Services Optical Grid Architecture for Emerging E-Science Collaborative Applications

    SciTech Connect

    Oliver Yu

    2008-11-28

This final report describes the accomplishments of the ISOGA (Integrated Services Optical Grid Architecture) project. ISOGA enables efficient deployment of existing and emerging collaborative grid applications with increasingly diverse multimedia communication requirements over a wide-area multi-domain optical network grid, and provides collaborative scientists with fast retrieval and seamless browsing of distributed scientific multimedia datasets over a wide-area optical network grid. The project focuses on research and development in the following areas: polymorphic optical network control planes to enable multiple switching and communication services simultaneously; an intelligent optical grid user-network interface to enable user-centric network control and monitoring; and a seamless optical grid dataset browsing interface to enable fast retrieval of local/remote datasets for visualization and manipulation.

  12. Development of experimental systems for material sciences under microgravity

    NASA Technical Reports Server (NTRS)

    Tanii, Jun; Obi, Shinzo; Kamimiyata, Yotsuo; Ajimine, Akio

    1988-01-01

    As part of the Space Experiment Program of the Society of Japanese Aerospace Companies, three experimental systems (G452, G453, G454) have been developed for materials science studies under microgravity by the NEC Corporation. These systems are to be flown as Get Away Special payloads for studying the feasibility of producing new materials. Together with the experimental modules carrying the hardware specific to the experiment, the three systems all comprise standard subsystems consisting of a power supply, sequence controller, temperature controller, data recorder, and video recorder.

  13. Minimum Learning Essentials: Science. Chemistry, Earth Science, Biology, Physics, General Science. Experimental Edition 0/4.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.

    This guide presents the "minimum teaching essentials" published by the New York City Board of Education, for science education in grades 9-12. Covered are: biology, physics, earth science, and chemistry. Work study skills for all subjects are given with content areas, performance objectives, and suggested classroom activities. (APM)

  14. Experimental Physical Sciences Vistas: MaRIE (draft)

    SciTech Connect

    Shlachter, Jack

    2010-09-08

    To achieve breakthrough scientific discoveries in the 21st century, a convergence and integration of world-leading experimental facilities and capabilities with theory, modeling, and simulation is necessary. In this issue of Experimental Physical Sciences Vistas, I am excited to present our plans for Los Alamos National Laboratory's future flagship experimental facility, MaRIE (Matter-Radiation Interactions in Extremes). MaRIE is a facility that will provide transformational understanding of matter in extreme conditions required to reduce or resolve key weapons performance uncertainties, develop the materials needed for advanced energy systems, and transform our ability to create materials by design. Our unique role in materials science starting with the Manhattan Project has positioned us well to develop a contemporary materials strategy pushing the frontiers of controlled functionality - the design and tailoring of a material for the unique demands of a specific application. Controlled functionality requires improvement in understanding of the structure and properties of materials in order to synthesize and process materials with unique characteristics. In the nuclear weapons program today, improving data and models to increase confidence in the stockpile can take years from concept to new knowledge. Our goal with MaRIE is to accelerate this process by enhancing predictive capability - the ability to compute a priori the observables of an experiment or test and pertinent confidence intervals using verified and validated simulation tools. It is a science-based approach that includes the use of advanced experimental tools, theoretical models, and multi-physics codes, simultaneously dealing with multiple aspects of physical operation of a system that are needed to develop an increasingly mature predictive capability. This same approach is needed to accelerate improvements to other systems such as nuclear reactors. MaRIE will be valuable to many national security

  15. Animal experimentation in forensic sciences: How far have we come?

    PubMed

    Cattaneo, C; Maderna, E; Rendinelli, A; Gibelli, D

    2015-09-01

In the third millennium, where ethical, ethological, and cultural evolution seem to be leading more and more towards an inter-species society, the issue of animal experimentation is a moral dilemma. Speaking from a self-interested human perspective, avoiding all animal testing where human disease and therapy are concerned may be very difficult or even impossible; such testing may not be so easily justifiable when suffering, or killing, of non-human animals is inflicted for forensic research. In order to verify how forensic scientists are evolving on this ethical issue, we undertook a systematic review of the current literature. We investigated the frequency of animal experimentation in forensic studies in the past 15 years and trends in publication in the main forensic science journals. Types of species, lesions inflicted, manner of sedation or anesthesia, and euthanasia were examined in a total of 404 articles reviewed, among which 279 (69.1%) concerned studies involving animals sacrificed exclusively for the sake of the experiment. Killing still frequently includes painful methods such as blunt trauma, electrocution, mechanical asphyxia, hypothermia, and even exsanguination; of all these animals, apparently only 60.8% were anesthetized. The most recent call for a severe reduction, if not a total halt, to the use of animals in forensic sciences was made by Bernard Knight in 1992. In fact, the principle of reduction and replacement, frequently respected in clinical research, must be considered the basis for forensic science research requiring animals. PMID:26216717

  16. Teaching science problem solving: An overview of experimental work

    NASA Astrophysics Data System (ADS)

    Taconis, R.; Ferguson-Hessler, M. G. M.; Broekkamp, H.

    2001-04-01

    The traditional approach to teaching science problem solving is having the students work individually on a large number of problems. This approach has long been overtaken by research suggesting and testing other methods, which are expected to be more effective. To get an overview of the characteristics of good and innovative problem-solving teaching strategies, we performed an analysis of a number of articles published between 1985 and 1995 in high-standard international journals, describing experimental research into the effectiveness of a wide variety of teaching strategies for science problem solving. To characterize the teaching strategies found, we used a model of the capacities needed for effective science problem solving, composed of a knowledge base and a skills base. The relations between the cognitive capacities required by the experimental or control treatments and those of the model were specified and used as independent variables. Other independent variables were learning conditions such as feedback and group work. As a dependent variable we used standardized learning effects. We identified 22 articles describing 40 experiments that met the standards we deemed necessary for a meta-analysis. These experiments were analyzed both with quantitative (correlational) methods and with a systematic qualitative method. A few of the independent variables were found to characterize effective strategies for teaching science problem solving. Effective treatments all gave attention to the structure and function (the schemata) of the knowledge base, whereas attention to knowledge of strategy and the practice of problem solving turned out to have little effect. As for learning conditions, both providing the learners with guidelines and criteria they can use in judging their own problem-solving process and products, and providing immediate feedback to them were found to be important prerequisites for the acquisition of problem-solving skills. 
Group work did not lead to
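The "standardized learning effects" used as the dependent variable in such meta-analyses are typically effect sizes of the Cohen's d form: the treatment-control mean difference divided by a pooled standard deviation. A minimal sketch, with invented test scores (the paper's data are not reproduced here):

```python
import math

def cohens_d(treatment, control):
    """Standardized effect size: mean difference over pooled SD."""
    nt, nc = len(treatment), len(control)
    mt = sum(treatment) / nt
    mc = sum(control) / nc
    # Sample variances (n - 1 denominator).
    vt = sum((x - mt) ** 2 for x in treatment) / (nt - 1)
    vc = sum((x - mc) ** 2 for x in control) / (nc - 1)
    pooled_sd = math.sqrt(((nt - 1) * vt + (nc - 1) * vc) / (nt + nc - 2))
    return (mt - mc) / pooled_sd

# Hypothetical post-test scores for a treatment and a control group.
d = cohens_d([72, 78, 75, 81], [65, 70, 68, 73])
```

Expressing every experiment's outcome on this common scale is what lets effects from 40 heterogeneous experiments be compared and correlated with treatment characteristics.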

  17. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

    PubMed Central

    Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.

    2015-01-01

With the increasingly rapid growth of data in the life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  18. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  19. Boundary condition identification for a grid model by experimental and numerical dynamic analysis

    NASA Astrophysics Data System (ADS)

    Mao, Qiang; Devitis, John; Mazzotti, Matteo; Bartoli, Ivan; Moon, Franklin; Sjoblom, Kurt; Aktan, Emin

    2015-04-01

There is a growing need to characterize unknown foundations and assess substructures in existing bridges. This is becoming an important issue for the serviceability and safety of bridges, as well as for the possibility of partially reusing existing infrastructure. Within this broader context, this paper investigates the possibility of identifying, locating, and quantifying changes in boundary conditions by leveraging a simply supported grid structure with a composite deck. Multi-reference impact tests were performed on the grid model, and one supporting bearing was modified by replacing a steel cylindrical roller with a roller of compliant material. Impact-based modal analysis provides global modal parameters, such as damped natural frequencies, mode shapes, and the flexibility matrix, that are used as indicators of boundary condition changes. An updating process combining a hybrid optimization algorithm and the finite element software suite ABAQUS is presented in this paper. The updated ABAQUS model of the grid, which simulates the supporting bearing with springs, is used to detect and quantify the change in the boundary conditions.
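The model-updating idea can be reduced to a toy sketch: represent the bearing as a spring of stiffness k, predict a natural frequency with a single-degree-of-freedom surrogate f = sqrt(k/m)/(2*pi), and search for the k that best reproduces the identified frequency. This is not the paper's ABAQUS/hybrid-optimization workflow, and all numbers are invented.

```python
import math

def natural_frequency(k, m):
    """Undamped SDOF natural frequency in Hz: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(k / m) / (2 * math.pi)

def update_stiffness(f_measured, m, k_grid):
    """Pick the spring stiffness whose predicted frequency best matches."""
    return min(k_grid, key=lambda k: abs(natural_frequency(k, m) - f_measured))

mass = 500.0        # kg, hypothetical effective modal mass
f_meas = 12.0       # Hz, hypothetical frequency identified from impact tests
k_candidates = [1e6 * i for i in range(1, 51)]   # N/m, coarse search grid
k_best = update_stiffness(f_meas, mass, k_candidates)
```

In the actual study the "model" is a full finite element run and the search is a hybrid optimization over spring parameters, but the loop structure (predict, compare to measured modal data, update) is the same.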

  20. Effects of mesh style and grid convergence on particle deposition in bifurcating airway models with comparisons to experimental data.

    PubMed

    Longest, P Worth; Vinchurkar, Samir

    2007-04-01

    A number of research studies have employed a wide variety of mesh styles and levels of grid convergence to assess velocity fields and particle deposition patterns in models of branching biological systems. Generating structured meshes based on hexahedral elements requires significant time and effort; however, these meshes are often associated with high quality solutions. Unstructured meshes that employ tetrahedral elements can be constructed much faster but may increase levels of numerical diffusion, especially in tubular flow systems with a primary flow direction. The objective of this study is to better establish the effects of mesh generation techniques and grid convergence on velocity fields and particle deposition patterns in bifurcating respiratory models. In order to achieve this objective, four widely used mesh styles including structured hexahedral, unstructured tetrahedral, flow adaptive tetrahedral, and hybrid grids have been considered for two respiratory airway configurations. Initial particle conditions tested are based on the inlet velocity profile or the local inlet mass flow rate. Accuracy of the simulations has been assessed by comparisons to experimental in vitro data available in the literature for the steady-state velocity field in a single bifurcation model as well as the local particle deposition fraction in a double bifurcation model. Quantitative grid convergence was assessed based on a grid convergence index (GCI), which accounts for the degree of grid refinement. The hexahedral mesh was observed to have GCI values that were an order of magnitude below the unstructured tetrahedral mesh values for all resolutions considered. Moreover, the hexahedral mesh style provided GCI values of approximately 1% and reduced run times by a factor of 3. Based on comparisons to empirical data, it was shown that inlet particle seedings should be consistent with the local inlet mass flow rate. Furthermore, the mesh style was found to have an observable
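The grid convergence index (GCI) reported above has a standard form (Roache's): a safety factor times the relative change between grid solutions, divided by r^p - 1 for refinement ratio r and observed order p. The sketch below uses that common form with invented values; the safety factor and order are assumptions, not taken from the paper.

```python
def gci(f_fine, f_coarse, r, p, fs=1.25):
    """Grid convergence index for the fine-grid solution (Roache's form).

    f_fine, f_coarse : solution values on the fine and coarse grids
    r                : grid refinement ratio
    p                : observed order of accuracy
    fs               : safety factor (1.25 is customary for 3-grid studies)
    """
    rel_err = abs((f_coarse - f_fine) / f_fine)
    return fs * rel_err / (r ** p - 1)

# Hypothetical example: a deposition fraction changes by 2% between grids
# refined by r = 2 with second-order convergence.
value = gci(f_fine=0.50, f_coarse=0.51, r=2, p=2)
```

A GCI near 1%, as reported for the hexahedral meshes, indicates that further refinement would change the solution very little; the order-of-magnitude-larger tetrahedral values signal unresolved numerical diffusion.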

  1. Experimental demonstration of OpenFlow-based control plane for elastic lightpath provisioning in Flexi-Grid optical networks.

    PubMed

    Zhang, Jiawei; Zhang, Jie; Zhao, Yongli; Yang, Hui; Yu, Xiaosong; Wang, Lei; Fu, Xihua

    2013-01-28

Due to its prominent capabilities for network virtualization and programmability, OpenFlow is widely regarded as a promising control plane technology for packet-switched IP networks as well as wavelength-switched optical networks. For the purpose of applying software-programmable features to future optical networks, we propose an OpenFlow-based control plane for Flexi-Grid optical networks. Experimental results demonstrate the feasibility of dynamic lightpath establishment and adjustment via an extended OpenFlow protocol. Wireshark captures of the signaling procedure are presented. Additionally, the overall latency, including signaling and hardware, for lightpath setup and adjustment is also reported. PMID:23389119

  2. Science of Geological Carbon Sequestration: Integration of Experimentation and Simulation.

    SciTech Connect

    Zhang, D.; Hall, M. L.; Higdon, D.; Hollis, W. K.; Kaszuba, J.; Lichtner, P.; Pawar, R.; Zhao, Y.; Chen, S.; Grigg, R.

    2003-08-04

This LDRD-DR will develop and enhance the science and technology needed to safely and effectively sequester carbon dioxide (CO[sub 2]) in geologic formations for the long term. There is consensus in the scientific community that increased levels of greenhouse gases such as CO[sub 2] are adversely affecting the global environment as evidenced by recent trends in global warming and dramatic changes in weather patterns. Geologic sequestration represents an immediately available, low-cost option for mitigating the global environmental impact of CO[sub 2] by removing large amounts of the gas from the atmosphere. The main limitation of this approach is the limited knowledge of the fundamental science that governs the physical and chemical behavior of (supercritical) CO[sub 2] during and after injection into the host geologic environment. Key scientific issues revolve around determining the ultimate fate of injected CO[sub 2], which is governed by permeability/porosity relations in the multi-phase CO[sub 2]-brine(-oil) systems as well as the reactivity and integrity of the host rock. We propose a combined experimental and theoretical investigation to determine key parameters and incorporate them into coupled microscopic and macroscopic numerical CO[sub 2] flow and reaction models. This problem provides an excellent opportunity to utilize unique LANL resources including the Supercritical Fluids Facility (SCRUB) for dynamic (flow-through) studies of supercritical CO[sub 2] (scCO[sub 2]); LANSCE for microscale investigation of pore structure and reaction products; and hydrothermal reaction laboratories for long-term flow and reaction studies. These facilities will allow us to obtain crucial experimental data that could not be easily obtained at any other research facility in the world. The experimental data will be used to develop and validate coupled flow and reaction models that build on existing state-of-the-art modeling capabilities in EES, T and D Divisions. 
Carbon

  3. Open Science Grid (OSG) Ticket Synchronization: Keeping Your Home Field Advantage In A Distributed Environment

    NASA Astrophysics Data System (ADS)

    Gross, Kyle; Hayashi, Soichi; Teige, Scott; Quick, Robert

    2012-12-01

Large distributed computing collaborations, such as the Worldwide LHC Computing Grid (WLCG), face many issues when it comes to providing a working grid environment for their users. One of these is exchanging tickets between the various ticketing systems in use by grid collaborations. Ticket systems such as Footprints, RT, Remedy, and ServiceNow all have different schemas that must be addressed in order to provide a reliable exchange of information between support entities and users in different grid environments. To combat this problem, OSG Operations has created a ticket synchronization interface called GOC-TX that relies on web services instead of the error-prone email parsing methods of the past. Synchronizing tickets between different ticketing systems allows any user or support entity to work on a ticket in their home environment, thus providing a familiar and comfortable place to provide updates without having to learn another ticketing system. The interface is generic enough that it can be customized for nearly any ticketing system with a web-service interface with only minor changes, which allows us to be flexible and to rapidly bring new ticket synchronization online. Synchronization can be triggered by different methods, including mail, a web services interface, and active messaging. GOC-TX currently interfaces with Global Grid User Support (GGUS) for WLCG, Remedy at Brookhaven National Lab (BNL), and Request Tracker (RT) at the Virtual Data Toolkit (VDT). Work is progressing on the Fermi National Accelerator Laboratory (FNAL) ServiceNow synchronization. This paper will explain the problems faced by OSG and how they led OSG to create and implement this ticket synchronization system, along with the technical details that allow synchronization to be performed at a production level.
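The core of such a synchronizer is schema translation: each system's fields and status values must be mapped onto the other's before an update is exchanged over the web-service interface. A minimal sketch of that mapping step follows; the field and status names are invented and do not reflect the actual GGUS, RT, Remedy, or ServiceNow schemas.

```python
# Hypothetical source-to-destination field mapping between two ticket systems.
FIELD_MAP = {
    "ticket_id": "external_reference",
    "short_description": "subject",
    "assigned_group": "queue",
}

# Hypothetical status vocabularies rarely line up one-to-one.
STATUS_MAP = {"assigned": "open", "work in progress": "open",
              "solved": "resolved"}

def translate(ticket):
    """Translate a ticket dict from the source schema to the destination."""
    out = {dst: ticket[src] for src, dst in FIELD_MAP.items() if src in ticket}
    if "status" in ticket:
        # Fall back to "open" for statuses the destination does not model.
        out["state"] = STATUS_MAP.get(ticket["status"], "open")
    return out

msg = translate({"ticket_id": "GGUS-12345", "short_description": "CE down",
                 "status": "assigned"})
```

A production synchronizer additionally needs loop suppression (so a mirrored update is not echoed back) and an audit trail, which is part of what makes GOC-TX more robust than email parsing.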

  4. Analyzing Sustainable Energy Opportunities for a Small Scale Off-Grid Facility: A Case Study at Experimental Lakes Area (ELA), Ontario

    NASA Astrophysics Data System (ADS)

    Duggirala, Bhanu

This thesis explored opportunities to reduce energy demand and the feasibility of renewable energy at an off-grid science "community" called the Experimental Lakes Area (ELA) in Ontario. Being off-grid, ELA is completely dependent on diesel and propane fuel supply for all its electrical and heating needs, which makes ELA vulnerable to fluctuating fuel prices. As a result, ELA emits a large amount of greenhouse gases (GHG) for its size. Energy efficiency and renewable energy technologies can reduce energy consumption and consequently energy cost, as well as GHG emissions. Energy efficiency was very important to ELA due to the elevated fuel costs at this remote location. Minor upgrades to lighting, equipment, and the building envelope were able to reduce energy costs and reduce load. Efficient energy-saving measures were recommended that save on operating and maintenance costs, namely changing to LED lights, replacing old equipment such as refrigerators, and downsizing the ice makers. This resulted in a 4.8% load reduction and subsequently reduced the initial capital cost by $27,000 for biomass, by $49,500 for wind power, and by $136,500 for solar power. Many alternative energy sources show promise for reducing diesel and propane consumption at ELA, including wind energy, solar heating, and biomass. A biomass-based CHP system using the existing diesel generators as back-up has the shortest payback period of the technologies modeled: 4.1 years at $0.80 per liter of diesel. As the diesel price approaches $2.00 per liter, the payback period falls to 0.9 years, with generation costs at 50% of present generation costs. Biomass has been successfully tried and tested in many off-grid communities, particularly in small-scale off-grid settings in North America and internationally. Also, the site-specific solar and wind data show that ELA has the potential to harvest renewable resources and produce heat and power at competitive
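The payback arithmetic behind those figures is simple: a system's payback period is its capital cost divided by the annual fuel cost it displaces, so payback shortens as the diesel price rises. The sketch below uses invented capital-cost and consumption figures, not the thesis's actual inputs.

```python
def payback_years(capital_cost, litres_displaced_per_year, diesel_price):
    """Simple payback: capital cost over annual avoided fuel cost."""
    annual_savings = litres_displaced_per_year * diesel_price
    return capital_cost / annual_savings

capital = 200_000.0   # $, hypothetical biomass CHP installed cost
litres = 60_000.0     # L/yr of diesel displaced (hypothetical)

low = payback_years(capital, litres, 0.80)   # payback at $0.80/L
high = payback_years(capital, litres, 2.00)  # payback at $2.00/L
```

With these invented inputs the payback drops from about 4.2 to about 1.7 years as diesel goes from $0.80 to $2.00 per liter, illustrating (not reproducing) the fuel-price sensitivity the thesis reports.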

  5. Experimental Evaluation of Load Rejection Over-Voltage from Grid-Tied Solar Inverters

    SciTech Connect

    Nelson, Austin; Hoke, Anderson; Chakraborty, Sudipta; Ropp, Michael; Chebahtah, Justin; Wang, Trudie; Zimmerly, Brian

    2015-06-14

    This paper investigates the impact of load rejection over-voltage (LRO) from commercially available grid-tied photovoltaic (PV) inverters. LRO can occur when a breaker opens and the power output from a distributed energy resource (DER) exceeds the load. Simplified models of current-controlled inverters can over-predict LRO magnitudes, thus it is useful to quantify the effect through laboratory testing. The load rejection event was replicated using a hardware testbed at the National Renewable Energy Laboratory (NREL), and a set of commercially available PV inverters was tested to quantify the impact of LRO for a range of generation-to-load ratios. The magnitude and duration of the over-voltage events are reported in this paper along with a discussion of characteristic inverter output behavior. The results for the inverters under test showed that maximum over-voltage magnitudes were less than 200% of nominal voltage, and much lower in many test cases. These research results are important because utilities that interconnect inverter-based DER need to understand their characteristics under abnormal grid conditions.
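The reason simplified models over-predict LRO can be seen in one line of algebra: an ideal current-source inverter feeding an islanded resistive load yields V/Vnom = Pgen/Pload, i.e. a voltage that grows without bound with the generation-to-load ratio, whereas the tested hardware stayed under 200% of nominal. A sketch of that naive model, with illustrative numbers only:

```python
def ideal_lro_voltage(v_nominal, p_generation, p_load):
    """Naive LRO estimate: constant-current injection into a resistive load.

    Pre-event inverter current I = Pgen / Vnom; islanded load resistance
    R = Vnom**2 / Pload; so V = I * R = Vnom * (Pgen / Pload).
    """
    return v_nominal * (p_generation / p_load)

v_nom = 240.0   # V, hypothetical nominal voltage
# Generation-to-load ratio of 3 in this illustrative case.
predicted = ideal_lro_voltage(v_nom, p_generation=3000.0, p_load=1000.0)
```

Here the naive model predicts 300% of nominal; the hardware tests in the paper are what bound the real over-voltage well below such estimates.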

  6. Pre-Service Teachers' Use of Improvised and Virtual Laboratory Experimentation in Science Teaching

    ERIC Educational Resources Information Center

    Bhukuvhani, Crispen; Kusure, Lovemore; Munodawafa, Violet; Sana, Abel; Gwizangwe, Isaac

    2010-01-01

    This research surveyed 11 purposely sampled Bindura University of Science Education (Zimbabwe) Bachelor of Science Education Honours Part III pre-service science teachers' use of improvised and virtual laboratory experimentation in science teaching. A self-designed four-point Likert scale twenty-item questionnaire was used. SPSS Version 10 was…

  7. EverVIEW: a visualization platform for hydrologic and Earth science gridded data

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig

    2015-01-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.

  8. EverVIEW: A visualization platform for hydrologic and Earth science gridded data

    NASA Astrophysics Data System (ADS)

    Romañach, Stephanie S.; McKelvy, Mark; Suir, Kevin; Conzelmann, Craig

    2015-03-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.

  9. The Distinction between Experimental and Historical Sciences as a Framework for Improving Classroom Inquiry

    ERIC Educational Resources Information Center

    Gray, Ron

    2014-01-01

    Inquiry experiences in secondary science classrooms are heavily weighted toward experimentation. We know, however, that many fields of science (e.g., evolutionary biology, cosmology, and paleontology), while they may utilize experiments, are not justified by experimental methodologies. With the focus on experimentation in schools, these fields of…

  10. Experimental demonstration of an OpenFlow based software-defined optical network employing packet, fixed and flexible DWDM grid technologies on an international multi-domain testbed.

    PubMed

    Channegowda, M; Nejabati, R; Rashidi Fard, M; Peng, S; Amaya, N; Zervas, G; Simeonidou, D; Vilalta, R; Casellas, R; Martínez, R; Muñoz, R; Liu, L; Tsuritani, T; Morita, I; Autenrieth, A; Elbers, J P; Kostecki, P; Kaczmarek, P

    2013-03-11

    Software defined networking (SDN) and flexible grid optical transport technology are two key technologies that allow network operators to customize their infrastructure based on application requirements, thereby minimizing the extra capital and operational costs required for hosting new applications. In this paper, we report for the first time on the design, implementation and demonstration of a novel OpenFlow based SDN unified control plane allowing seamless operation across heterogeneous state-of-the-art optical and packet transport domains. We verify and experimentally evaluate OpenFlow protocol extensions for flexible DWDM grid transport technology along with its integration with fixed DWDM grid and layer-2 packet switching. PMID:23482120
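The fixed and flexible DWDM grids contrasted above differ in their channel arithmetic: per ITU-T G.694.1, flexible-grid centre frequencies sit on a 6.25 GHz granularity around 193.1 THz, with slot widths in multiples of 12.5 GHz. A small sketch of that mapping follows (the paper's OpenFlow protocol extensions themselves are not reproduced here):

```python
# Flexible-grid channel arithmetic per ITU-T G.694.1: centre frequencies
# on a 6.25 GHz granularity around the 193.1 THz anchor, slot widths in
# multiples of 12.5 GHz (vs. the fixed grid's coarser 50/100 GHz spacing).
ANCHOR_THZ = 193.1
CENTER_STEP_GHZ = 6.25
WIDTH_STEP_GHZ = 12.5

def flexgrid_slot(n, m):
    """Return (centre frequency in THz, slot width in GHz) for the
    integer grid parameters n and m defined by the recommendation."""
    centre = ANCHOR_THZ + n * CENTER_STEP_GHZ / 1000.0
    return centre, m * WIDTH_STEP_GHZ

print(flexgrid_slot(8, 3))   # n=8 -> 193.15 THz centre; m=3 -> 37.5 GHz slot
```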

  11. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation, and Research

    NASA Astrophysics Data System (ADS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-06-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing-1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.

  12. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation and Research

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-01-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing-1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.

  13. Students' epistemologies about experimental physics: Validating the Colorado Learning Attitudes about Science Survey for experimental physics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-06-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder and elsewhere, we developed the Colorado Learning Attitudes about Science Survey for experimental physics (E-CLASS). Previous work with this assessment has included establishing the accuracy and clarity of the instrument through student interviews and preliminary testing. Several years of data collection at multiple institutions has resulted in a growing national data set of student responses. Here, we report on results of the analysis of these data to investigate the statistical validity and reliability of the E-CLASS as a measure of students' epistemologies for a broad student population. We find that the E-CLASS demonstrates an acceptable level of both validity and reliability on measures of item and test discrimination, test-retest reliability, partial-sample reliability, internal consistency, concurrent validity, and convergent validity. We also examine students' responses using principal component analysis and find that, as expected, the E-CLASS does not exhibit strong factors (a.k.a. categories).
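One of the reliability measures named in the abstract above, internal consistency, is conventionally reported as Cronbach's alpha. The sketch below is a self-contained illustration of that statistic, not the E-CLASS analysis code, and the sample responses are invented:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for rows of per-respondent item scores:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))."""
    k = len(scores[0])

    def var(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Invented 4-respondent, 3-item Likert data; highly correlated items
# drive alpha toward 1, uncorrelated items push it toward 0.
responses = [[1, 2, 1], [3, 3, 2], [4, 5, 4], [5, 5, 5]]
print(round(cronbach_alpha(responses), 3))
```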

  14. Experimental Study of Homogeneous Isotropic Slowly-Decaying Turbulence in Giant Grid-Wind Tunnel Set Up

    NASA Astrophysics Data System (ADS)

    Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration

    2014-11-01

    We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow at very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely big so that, even with the large separation of scales, the smallest scales would be well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8 m diameter test section) was chosen as a limit of the biggest large scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly-decaying homogeneous isotropic turbulence in a large region of the test section, with minimal structural risk. An international team of researchers collected hot wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot tube anemometry, cold wire thermometry and high-speed particle tracking data of this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next 2 years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is formed by an international team of scientists to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.
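The millimetre-scale Kolmogorov lengths quoted above follow from standard isotropic-turbulence relations. In the sketch below, the dissipation rate is an illustrative assumption, not a measured value from the ESWIRP experiment:

```python
def kolmogorov_scale(nu, eps):
    """Kolmogorov length eta = (nu**3 / eps)**0.25 (SI units)."""
    return (nu ** 3 / eps) ** 0.25

def taylor_reynolds(u_rms, nu, eps):
    """Taylor-microscale Reynolds number Re_lambda = u' * lam / nu,
    with lam = u' * sqrt(15 * nu / eps) for isotropic turbulence."""
    lam = u_rms * (15.0 * nu / eps) ** 0.5
    return u_rms * lam / nu

nu = 1.5e-5   # kinematic viscosity of air, m^2/s
eps = 1.0e-3  # assumed mean dissipation rate, m^2/s^3 (illustrative)
print(kolmogorov_scale(nu, eps))  # eta in metres; ~1.4e-3 for these values
```

A very low dissipation rate behind a huge grid is what pushes the Kolmogorov scale up to millimetres, which is exactly why a facility of this size makes the smallest scales instrument-resolvable.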

  15. SEE-GRID eInfrastructure for Regional eScience

    NASA Astrophysics Data System (ADS)

    Prnjat, Ognjen; Balaz, Antun; Vudragovic, Dusan; Liabotis, Ioannis; Sener, Cevat; Marovic, Branko; Kozlovszky, Miklos; Neagu, Gabriel

    In the past 6 years, a number of targeted initiatives, funded by the European Commission via its information society and RTD programmes and by Greek infrastructure development actions, have articulated successful regional development actions in South-East Europe that can serve as a role model for other international developments. The SEEREN (South-East European Research and Education Networking initiative) project, through its two phases, established the SEE segment of the pan-European GÉANT network and successfully connected the research and scientific communities in the region. Currently, the SEE-LIGHT project is working towards establishing a dark-fiber backbone that will interconnect most national Research and Education networks in the region. On the distributed computing and storage provisioning (i.e. Grid) plane, the SEE-GRID (South-East European GRID e-Infrastructure Development) project, similarly through its two phases, has established a strong human network in the area of scientific computing, has set up a powerful regional Grid infrastructure, and has attracted a number of applications from different fields from countries throughout South-East Europe. The current SEE-GRID-SCI project, ending in April 2010, empowers the regional user communities from the fields of meteorology, seismology and environmental protection in the common use and sharing of the regional e-Infrastructure. Current technical initiatives in formulation are focusing on a set of coordinated actions in the area of HPC and on application fields making use of HPC initiatives. Finally, the current SEERA-EI project brings together policy makers - programme managers from 10 countries in the region. The project aims to establish a communication platform between programme managers, pave the way towards a common e-Infrastructure strategy and vision, and implement concrete actions for common funding of electronic infrastructures at the regional level. The regional vision on establishing an e

  16. Grid oscillators

    NASA Technical Reports Server (NTRS)

    Popovic, Zorana B.; Kim, Moonil; Rutledge, David B.

    1988-01-01

    Loading a two-dimensional grid with active devices offers a means of combining the power of solid-state oscillators in the microwave and millimeter-wave range. The grid structure allows a large number of negative resistance devices to be combined. This approach is attractive because the active devices do not require an external locking signal, and the combining is done in free space. In addition, the loaded grid is a planar structure amenable to monolithic integration. Measurements on a 25-MESFET grid at 9.7 GHz show power-combining and frequency-locking without an external locking signal, with an ERP of 37 W. Experimental far-field patterns agree with theoretical results obtained using reciprocity.

  17. Experimental Comparison of Inquiry and Direct Instruction in Science

    ERIC Educational Resources Information Center

    Cobern, William W.; Schuster, David; Adams, Betty; Applegate, Brooks; Skjold, Brandy; Undreiu, Adriana; Loving, Cathleen C.; Gobert, Janice D.

    2010-01-01

    There are continuing educational and political debates about "inquiry" versus "direct" teaching of science. Traditional science instruction has been largely direct but in the US, recent national and state science education standards advocate inquiry throughout K-12 education. While inquiry-based instruction has the advantage of modelling aspects…

  18. New Source Code: Spelman Women Transforming the Grid of Science and Technology

    NASA Astrophysics Data System (ADS)

    Okonkwo, Holly

    From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece in understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explore the narrative experiences of members of the Spelman campus community and immerse myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse of diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time, beyond subjectivities and beyond the deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place and its institutional mission, and they are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.

  19. Data Grids: a new computational infrastructure for data-intensive science.

    PubMed

    Avery, Paul

    2002-06-15

    Twenty-first-century scientific and engineering enterprises are increasingly characterized by their geographic dispersion and their reliance on large data archives. These characteristics bring with them unique challenges. First, the increasing size and complexity of modern data collections require significant investments in information technologies to store, retrieve and analyse them. Second, the increased distribution of people and resources in these projects has made resource sharing and collaboration across significant geographic and organizational boundaries critical to their success. In this paper I explore how computing infrastructures based on Data Grids offer data-intensive enterprises a comprehensive, scalable framework for collaboration and resource sharing. A detailed example of a Data Grid framework is presented for a Large Hadron Collider experiment, where a hierarchical set of laboratory and university resources comprising petaflops of processing power and a multi-petabyte data archive must be efficiently used by a global collaboration. The experience gained with these new information systems, providing transparent managed access to massive distributed data collections, will be applicable to large-scale, data-intensive problems in a wide spectrum of scientific and engineering disciplines, and eventually in industry and commerce. Such systems will be needed in the coming decades as a central element of our information-based society. PMID:12804274

  20. Experimental Comparison of Inquiry and Direct Instruction in Science

    ERIC Educational Resources Information Center

    Cobern, William; Schuster, David; Adams, Betty

    2010-01-01

    It is evident that "experientially-based" instruction and "active student engagement" are advantageous for effective science learning. However, "hands-on" and "minds-on" aspects can occur in both inquiry and direct science instruction, and convincing comparative evidence for the superiority of either mode remains rare. Thus, the pertinent question…

  1. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    SciTech Connect

    Jablonowski, Christiane

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, such as the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, such as tropical cyclones. Six research themes were chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling.
The results of this research project

  2. Vanguard: A New Science Mission For Experimental Astrobiology

    NASA Astrophysics Data System (ADS)

    Ellery, A.; Wynn-Williams, D.; Edwards, H.; Dickensheets, D.; Welch, C.; Curley, A.

    As an alternative to technically and financially problematic sample-return missions, a rover-mounted laser Raman spectrometer sensitive to biomolecules and their mineral substrata is promising in the search for evidence of former life on Mars. We present a new remote in situ analysis package being designed for experimental astrobiology on terrestrial-type planetary surfaces. The science is based on the hypothesis that if life arose on Mars, the selective pressure of solar radiation would have led to the evolution of pigmented systems to harness the energy of sunlight and to protect cells from concurrent UV stress. Microbial communities would therefore have become stratified by the light gradient, and our remote system would penetrate the near-subsurface profile in a vertical transect of horizontal strata in ancient sediments (such as palaeolake beds). The system will include an extensive array of robotic support to translocate and deploy Raman spectrometer detectors beneath the surface of Mars; it will comprise a base-station lander to support communications and a robotic micro-rover to permit well-separated triplicate profiles made by three ground-penetrating moles mounted in a vertical configuration. Each mole will deploy a tether carrying fibre-optic cables coupling the Raman spectrometer onboard the rover and the side-scanning sensor head on the mole. The complete system has been named Vanguard, and it represents a close collaboration between a space robotics engineer (Ellery), an astrobiologist (Wynn-Williams), a molecular spectroscopist (Edwards), an opto-electronic technologist (Dickensheets), a spacecraft engineer (Welch) and a robotic vision specialist (Curley). The autonomy requirement for the Vanguard instrument requires that significant scientific competence be imparted to the instrument through an expert system, to ensure that quick-look analysis is performed onboard in real time as the mole penetrates beneath the surface.
Onboard

  3. Experimental Investigation of the Behavior of Sub-Grid Scale Motions in Turbulent Shear Flow

    NASA Technical Reports Server (NTRS)

    Cantwell, Brian

    1992-01-01

    Experiments have been carried out on a vertical jet of helium issuing into a co-flow of air at a fixed exit velocity ratio of 2.0. At all the experimental conditions studied, the flow exhibits a strong self excited periodicity. The natural frequency behavior of the jet, the underlying fine-scale flow structure, and the transition to turbulence have been studied over a wide range of flow conditions. The experiments were conducted in a variable pressure facility which made it possible to vary the Reynolds number and Richardson number independently. A stroboscopic schlieren system was used for flow visualization and single-component Laser Doppler Anemometry was used to measure the axial component of velocity. The flow exhibits several interesting features. The presence of co-flow eliminates the random meandering typical of buoyant plumes in a quiescent environment and the periodicity of the helium jet under high Richardson number conditions is striking. Under these conditions transition to turbulence consists of a rapid but highly structured and repeatable breakdown and intermingling of jet and freestream fluid. At Ri = 1.6 the three-dimensional structure of the flow is seen to repeat from cycle to cycle. The point of transition moves closer to the jet exit as either the Reynolds number or the Richardson number increases. The wavelength of the longitudinal instability increases with Richardson number. At low Richardson numbers, the natural frequency scales on an inertial time scale. At high Richardson number the natural frequency scales on a buoyancy time scale. The transition from one flow regime to another occurs over a narrow range of Richardson numbers from 0.7 to 1. A buoyancy Strouhal number is used to correlate the high Richardson number frequency behavior.
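The Richardson number that organizes the flow regimes described above compares buoyancy forcing to inertial forcing at the nozzle scale. One common jet definition is sketched below; the exact definition used in the study, and the property values here, are assumptions for illustration only:

```python
def richardson(g, rho_ambient, rho_jet, d, u):
    """One common jet Richardson number:
    Ri = g * (rho_a - rho_j) / rho_j * d / u**2,
    i.e. buoyancy over inertia based on exit diameter d and velocity u."""
    return g * (rho_ambient - rho_jet) / rho_jet * d / u ** 2

# Helium into air is strongly buoyant (rho_He << rho_air), so modest exit
# velocities already land in the Ri ~ 0.7-1 transition band noted in the
# abstract. Illustrative values: 1 cm nozzle, 0.8 m/s exit velocity.
print(richardson(9.81, 1.20, 0.166, 0.01, 0.8))  # -> ~0.95
```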

  4. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  5. Experimenter Confirmation Bias and the Correction of Science Misconceptions

    ERIC Educational Resources Information Center

    Allen, Michael; Coole, Hilary

    2012-01-01

    This paper describes a randomised educational experiment (n = 47) that examined two different teaching methods and compared their effectiveness at correcting one science misconception using a sample of trainee primary school teachers. The treatment was designed to promote engagement with the scientific concept by eliciting emotional responses from…

  6. Geometric and Applied Optics, Science (Experimental): 5318.04.

    ERIC Educational Resources Information Center

    Sanderson, Robert C.

    This unit of instruction presents a laboratory-oriented course which relates the sources and behaviors of light to man's control and uses of light. Successful completion of Algebra I and Plane Geometry is strongly recommended as indicators of success. The course is recommended if the student plans further studies in science, optical technology, or…

  7. Early Adolescence: Using Consumer Science to Develop Experimental Techniques.

    ERIC Educational Resources Information Center

    Padilla, Michael

    1981-01-01

    Describes several consumer science activities useful for introducing process skills for the middle/junior high school student. Activities described include testing laundry detergent effectiveness for stain removal, comparison of quantities in fast foods, and various activities concerning tests of product claims. (DS)

  8. Learning Political Science with Prediction Markets: An Experimental Study

    ERIC Educational Resources Information Center

    Ellis, Cali Mortenson; Sami, Rahul

    2012-01-01

    Prediction markets are designed to aggregate the information of many individuals to forecast future events. These markets provide participants with an incentive to seek information and a forum for interaction, making markets a promising tool to motivate student learning. We carried out a quasi-experiment in an introductory political science class…

  9. Accounting for reciprocal host-microbiome interactions in experimental science.

    PubMed

    Stappenbeck, Thaddeus S; Virgin, Herbert W

    2016-06-01

    Mammals are defined by their metagenome, a combination of host and microbiome genes. This knowledge presents opportunities to further basic biology with translation to human diseases. However, the now-documented influence of the metagenome on experimental results and the reproducibility of in vivo mammalian models present new challenges. Here we provide the scientific basis for calling on all investigators, editors and funding agencies to embrace changes that will enhance reproducible and interpretable experiments by accounting for metagenomic effects. Implementation of new reporting and experimental design principles will improve experimental work, speed discovery and translation, and properly use substantial investments in biomedical research. PMID:27279212

  10. Grid Architecture 2

    SciTech Connect

    Taft, Jeffrey D.

    2016-01-01

    The report describes work done on Grid Architecture under the auspices of the Department of Energy Office of Electricity Delivery and Energy Reliability in 2015. As described in the first Grid Architecture report, the primary purpose of this work is to provide stakeholder insight about grid issues so as to enable superior decision making on their part. Doing this requires the creation of various work products, including oft-times complex diagrams, analyses, and explanations. This report provides architectural insights into several important grid topics and also describes work done to advance the science of Grid Architecture as well.

  11. FermiGrid

    SciTech Connect

    Yocum, D.R.; Berman, E.; Canal, P.; Chadwick, K.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Sharma, N.; Timm, S.; /Fermilab

    2007-05-01

    As one of the founding members of the Open Science Grid Consortium (OSG), Fermilab enables coherent access to its production resources through the Grid infrastructure system called FermiGrid. This system successfully provides for centrally managed grid services, opportunistic resource access, development of OSG Interfaces for Fermilab, and an interface to the Fermilab dCache system. FermiGrid supports virtual organizations (VOs) including high energy physics experiments (USCMS, MINOS, D0, CDF, ILC), astrophysics experiments (SDSS, Auger, DES), biology experiments (GADU, Nanohub) and educational activities.

  12. The DOE SunShot Initiative: Science and Technology to enable Solar Electricity at Grid Parity

    NASA Astrophysics Data System (ADS)

    Ramesh, Ramamoorthy

    2012-02-01

    The SunShot Initiative's mission is to develop solar energy technologies through a collaborative national push to make solar Photovoltaic (PV) and Concentrated Solar Power (CSP) energy technologies cost-competitive with fossil fuel based energy by reducing the cost of solar energy systems by approximately 75 percent before 2020. Reducing the total installed cost for utility-scale solar electricity to roughly 6 cents per kilowatt-hour (about $1 per watt) without subsidies will result in rapid, large-scale adoption of solar electricity across the United States and the world. Achieving this goal will require significant reductions and technological innovations in all PV system components, namely modules, power electronics, and balance of systems (BOS), which includes all other components and costs required for a fully installed system including permitting and inspection costs. This investment will re-establish American technological and market leadership, improve the nation's energy security, strengthen U.S. economic competitiveness and catalyze domestic economic growth in the global clean energy race. SunShot is a cooperative program across DOE, involving the Office of Science, the Office of Energy Efficiency and Renewable Energy and ARPA-E.
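The link between roughly $1 per watt installed and roughly 6 cents per kilowatt-hour can be checked with a back-of-envelope levelised-cost calculation. The discount rate, lifetime, O&M cost and capacity factor below are illustrative assumptions, not SunShot programme inputs:

```python
def crf(rate, years):
    """Capital recovery factor: annualises an up-front cost."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoe_usd_per_kwh(capex_per_w, cap_factor, rate=0.07, years=25,
                     om_per_kw_yr=20.0):
    """Rough unsubsidised levelised cost of energy for a PV system,
    in $/kWh. All default parameter values are assumptions."""
    annual_kwh_per_kw = 8760 * cap_factor
    annual_cost = capex_per_w * 1000 * crf(rate, years) + om_per_kw_yr
    return annual_cost / annual_kwh_per_kw

# $1/W installed at a 20% capacity factor lands near $0.06/kWh under
# these assumptions, consistent with the target quoted in the abstract.
print(round(lcoe_usd_per_kwh(1.0, 0.20), 3))
```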

  13. AGILE/GRID Science Alert Monitoring System: The Workflow and the Crab Flare Case

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Conforti, V.; Parmiggiani, N.

    2013-10-01

    During the first five years of the AGILE mission we have observed many gamma-ray transients of Galactic and extragalactic origin. A fast reaction to unexpected transient events is a crucial part of the AGILE monitoring program, because the follow-up of astrophysical transients is a key point for this space mission. We present the workflow and the software developed by the AGILE Team to perform the automatic analysis for the detection of gamma-ray transients. In addition, an App for iPhone will be released enabling the Team to access the monitoring system through mobile phones. In September 2010 the science alert monitoring system presented in this paper recorded a transient phenomenon from the Crab Nebula, generating an automated alert sent via email and SMS two hours after the end of an AGILE satellite orbit, i.e. two hours after the Crab flare itself; for this discovery AGILE won the 2012 Bruno Rossi Prize. The alert system is designed for maximum speed, and in this case, as in many others, AGILE has demonstrated that the reaction speed of the monitoring system is crucial to the scientific return of the mission.

  14. An Experimental Clinical Science Fellowship in Cardiovascular-Renal

    ERIC Educational Resources Information Center

    Chasis, Herbert; Campbell, Charles I.

    1974-01-01

    Describes the New York Heart Association's experimental program aimed at evaluating a method of developing physicians disciplined by research and competent both as teachers and in the care of patients (clinical scientists). (Author)

  15. Experimental evaluation of fiber-interspaced antiscatter grids for large patient imaging with digital x-ray systems

    NASA Astrophysics Data System (ADS)

    Fetterly, Kenneth A.; Schueler, Beth A.

    2007-08-01

    Radiographic imaging of large patients is compromised by x-ray scatter. Optimization of digital x-ray imaging systems used for projection radiography requires the best possible antiscatter grid. The performance of antiscatter grids used in conjunction with digital x-ray imaging systems can be characterized by the signal-to-noise ratio (SNR) improvement factor (KSNR). The SNR improvement factor of several linear, focused antiscatter grids was determined from measurements of the primary and scatter transmission fractions of the grids and of the inherent scatter-to-primary ratio (SPR) of the x-ray beam and scatter phantom. The inherent SPR and scatter transmission fractions were measured using a graduated lead beam-stop method. The KSNR of eight grids with line rates (N) in the range 40 to 80 cm-1 and ratios (r) in the range 8:1 to 15:1 was measured. All of the grids had fiber interspace material and carbon-fiber covers. The scatter phantom was Solid Water® with thickness 10 to 50 cm, and a 30 × 30 cm2 field of view was used. All measurements were acquired with a 104 kVp x-ray beam. The SPR of the non-grid imaging condition ranged from 2.55 for the 10 cm phantom to 25.9 for the 50 cm phantom. The scatter transmission fractions ranged from a low of 0.083 for the N50 r15 grid to a high of 0.22 for the N40 r8 grid, and the primary transmission fractions ranged from a low of 0.69 for the N80 r15 grid to 0.76 for the N40 r8 grid. The SNR improvement factors ranged from 1.2 for the 10 cm phantom and N40 r8 grid to 2.09 for the 50 cm phantom and the best-performing N50 r15, N44 r15 and N40 r14 grids.
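    The SNR improvement factor described above follows from the grid's primary and scatter transmission fractions (Tp, Ts) and the non-grid scatter-to-primary ratio, via the standard relation K_SNR = Tp·sqrt(1 + SPR)/sqrt(Tp + Ts·SPR). A minimal sketch; the particular Tp/Ts pairing below is an assumption for illustration:

```python
import math

def snr_improvement(tp, ts, spr):
    """K_SNR = Tp * sqrt(1 + SPR) / sqrt(Tp + Ts * SPR).

    tp  -- primary transmission fraction of the grid
    ts  -- scatter transmission fraction of the grid
    spr -- scatter-to-primary ratio of the non-grid beam/phantom
    """
    return tp * math.sqrt(1.0 + spr) / math.sqrt(tp + ts * spr)

# Values quoted in the abstract for the N40 r8 grid and the 10 cm phantom:
print(round(snr_improvement(0.76, 0.22, 2.55), 2))  # → 1.25, close to the reported 1.2
```

    A perfectly transparent "grid" (Tp = Ts = 1) gives K_SNR = 1, i.e. no improvement, which is a quick sanity check on the formula.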

  16. An Experimental Evaluation of the Effects of ESCP and General Science on the Development of Interdisciplinary Science Concepts by Ninth Grade Students.

    ERIC Educational Resources Information Center

    Coleman, Esther Montague

    This study was an experimental evaluation of achievement in understanding interdisciplinary science concepts by ninth grade students enrolled in two different integrated science courses. The experimental group used "Investigating the Earth", the textbook/laboratory program, developed by the Earth Science Curriculum Project (ESCP) staff. The…

  17. Space Science Education: An Experimental Study. Report of the Study Commission on Space Science Education.

    ERIC Educational Resources Information Center

    Vick, Raymond

    The implications of space science terminology and concepts for elementary science teaching are explored. Twenty-two concepts were identified which elementary and junior high school teachers were invited to introduce in their teaching. Booklets explaining the concepts were distributed together with report forms for teacher feedback. The numbers of…

  18. Life Science Research and Drug Discovery at the Turn of the 21st Century: The Experience of SwissBioGrid

    PubMed Central

    den Besten, Matthijs; Thomas, Arthur J; Schroeder, Ralph

    2009-01-01

    Background It is often said that the life sciences are transforming into an information science. As laboratory experiments are starting to yield ever increasing amounts of data and the capacity to deal with those data is catching up, an increasing share of scientific activity is seen to be taking place outside the laboratories, sifting through the data and modelling “in-silico” the processes observed “in-vitro.” The transformation of the life sciences and similar developments in other disciplines have inspired a variety of initiatives around the world to create technical infrastructure to support the new scientific practices that are emerging. The e-Science programme in the United Kingdom and the NSF Office for Cyberinfrastructure are examples of these. In Switzerland there have been no such national initiatives. Yet, this has not prevented scientists from exploring the development of similar types of computing infrastructures. In 2004, a group of researchers in Switzerland established a project, SwissBioGrid, to explore whether Grid computing technologies could be successfully deployed within the life sciences. This paper presents their experiences as a case study of how the life sciences are currently operating as an information science and presents the lessons learned about how existing institutional and technical arrangements facilitate or impede this operation. Results SwissBioGrid was established to provide computational support to two pilot projects: one for proteomics data analysis, and the other for high-throughput molecular docking (“virtual screening”) to find new drugs for neglected diseases (specifically, for dengue fever). The proteomics project was an example of a large-scale data management problem, applying many different analysis algorithms to Terabyte-sized datasets from mass spectrometry, involving comparisons with many different reference databases; the virtual screening project was more a purely computational problem, modelling the

  19. The Beliefs and Behaviors of Pupils in an Experimental School: The Science Lab.

    ERIC Educational Resources Information Center

    Lancy, David F.

    This booklet, the second in a series, reports on the results of a year-long research project conducted in an experimental school associated with the Learning Research and Development Center, University of Pittsburgh. Specifically, this is a report of findings pertaining to one major setting in the experimental school, the science lab. The science…

  20. Is Physicality an Important Aspect of Learning through Science Experimentation among Kindergarten Students?

    ERIC Educational Resources Information Center

    Zacharia, Zacharias C.; Loizou, Eleni; Papaevripidou, Marios

    2012-01-01

    The purpose of this study was to investigate whether physicality (actual and active touch of concrete material), as such, is a necessity for science experimentation learning at the kindergarten level. We compared the effects of student experimentation with Physical Manipulatives (PM) and Virtual Manipulatives (VM) on kindergarten students'…

  1. Physics design of a 100 keV acceleration grid system for the diagnostic neutral beam for international tokamak experimental reactor.

    PubMed

    Singh, M J; De Esch, H P L

    2010-01-01

    This paper describes the physics design of a 100 keV, 60 A H(-) accelerator for the diagnostic neutral beam (DNB) for the international tokamak experimental reactor (ITER). The accelerator is a three-grid system comprising 1280 apertures, grouped into 16 beam groups of 80 apertures each. Several computer codes have been used to optimize the design, which follows the same philosophy as the ITER Design Description Document (DDD) 5.3 and the 1 MeV heating and current drive beam line [R. Hemsworth, H. Decamps, J. Graceffa, B. Schunke, M. Tanaka, M. Dremel, A. Tanga, H. P. L. De Esch, F. Geli, J. Milnes, T. Inoue, D. Marcuzzi, P. Sonato, and P. Zaccaria, Nucl. Fusion 49, 045006 (2009)]. The aperture shapes, intergrid distances, and the extractor voltage have been optimized to minimize the beamlet divergence. To suppress the acceleration of coextracted electrons, permanent magnets have been incorporated in the extraction grid, downstream of the cooling water channels. The electron power loads on the extractor and grounded grids have been calculated assuming one coextracted electron per ion. The beamlet divergence is calculated to be 4 mrad. At present the design of the filter field for the RF-based ion sources for ITER is not fixed; therefore, several configurations have been considered and their effect on the transmission of electrons and beams through the accelerator has been studied. The OPERA-3D code has been used to estimate the aperture-offset steering constants of the grounded grid and the extraction grid, the space-charge interaction between the beamlets, and the kerb design required to compensate for this interaction. All beamlets in the DNB must be focused to a single point in the duct, 20.665 m from the grounded grid, and the required geometrical aimings and aperture offsets have been calculated. PMID:20113091

  2. Physics design of a 100 keV acceleration grid system for the diagnostic neutral beam for international tokamak experimental reactor

    SciTech Connect

    Singh, M. J.; De Esch, H. P. L.

    2010-01-15

    This paper describes the physics design of a 100 keV, 60 A H{sup -} accelerator for the diagnostic neutral beam (DNB) for the international tokamak experimental reactor (ITER). The accelerator is a three-grid system comprising 1280 apertures, grouped into 16 beam groups of 80 apertures each. Several computer codes have been used to optimize the design, which follows the same philosophy as the ITER Design Description Document (DDD) 5.3 and the 1 MeV heating and current drive beam line [R. Hemsworth, H. Decamps, J. Graceffa, B. Schunke, M. Tanaka, M. Dremel, A. Tanga, H. P. L. De Esch, F. Geli, J. Milnes, T. Inoue, D. Marcuzzi, P. Sonato, and P. Zaccaria, Nucl. Fusion 49, 045006 (2009)]. The aperture shapes, intergrid distances, and the extractor voltage have been optimized to minimize the beamlet divergence. To suppress the acceleration of coextracted electrons, permanent magnets have been incorporated in the extraction grid, downstream of the cooling water channels. The electron power loads on the extractor and grounded grids have been calculated assuming one coextracted electron per ion. The beamlet divergence is calculated to be 4 mrad. At present the design of the filter field for the RF-based ion sources for ITER is not fixed; therefore, several configurations have been considered and their effect on the transmission of electrons and beams through the accelerator has been studied. The OPERA-3D code has been used to estimate the aperture-offset steering constants of the grounded grid and the extraction grid, the space-charge interaction between the beamlets, and the kerb design required to compensate for this interaction. All beamlets in the DNB must be focused to a single point in the duct, 20.665 m from the grounded grid, and the required geometrical aimings and aperture offsets have been calculated.
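    In the small-angle limit, the focusing requirement quoted in these records (all beamlets aimed at a point 20.665 m from the grounded grid) reduces to an aperture offset proportional to each beamlet's transverse position. A sketch under an assumed linear steering law, with a hypothetical steering constant; the real constants come from the OPERA-3D modelling described in the abstract:

```python
def required_offset_mm(r_mm, focus_m=20.665, k_mrad_per_mm=1.0):
    """Aperture offset steering a beamlet at transverse position r_mm (mm)
    onto the focal point focus_m (m) downstream.

    k_mrad_per_mm is a hypothetical steering constant (deflection angle per
    unit aperture offset); it is NOT a value from the paper.
    """
    angle_mrad = r_mm / focus_m        # small-angle aiming requirement, in mrad
    return angle_mrad / k_mrad_per_mm  # offset in mm

# A beamlet 250 mm off the focal axis:
print(round(required_offset_mm(250.0), 2))  # → 12.1 mm at k = 1 mrad/mm
```

    The proportionality means the offset pattern across the grid is simply a scaled copy of the aperture layout, which is why a single steering constant per grid suffices.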

  3. Experimenter Confirmation Bias and the Correction of Science Misconceptions

    NASA Astrophysics Data System (ADS)

    Allen, Michael; Coole, Hilary

    2012-06-01

    This paper describes a randomised educational experiment (n = 47) that examined two different teaching methods and compared their effectiveness at correcting one science misconception using a sample of trainee primary school teachers. The treatment was designed to promote engagement with the scientific concept by eliciting emotional responses from learners, triggered by their own confirmation biases. The treatment group showed superior learning gains relative to the control group at post-test immediately after the lesson, although the benefits had dissipated after 6 weeks. Findings are discussed with reference to the conceptual change paradigm and to the importance of feeling emotion during a learning experience, with implications for teaching pedagogies to adults that have previously been shown to be successful with children.

  4. An 11-year global gridded aerosol optical thickness reanalysis (v1.0) for atmospheric and climate sciences

    NASA Astrophysics Data System (ADS)

    Lynch, Peng; Reid, Jeffrey S.; Westphal, Douglas L.; Zhang, Jianglong; Hogan, Timothy F.; Hyer, Edward J.; Curtis, Cynthia A.; Hegg, Dean A.; Shi, Yingxi; Campbell, James R.; Rubin, Juli I.; Sessions, Walter R.; Turk, F. Joseph; Walker, Annette L.

    2016-04-01

    While stand-alone satellite and model aerosol products see wide utilization, there is a significant need in numerous atmospheric and climate applications for a fused product on a regular grid. Aerosol data assimilation is an operational reality at numerous centers, and, like meteorological reanalyses, aerosol reanalyses will see significant use in the near future. Here we present a standardized 2003-2013 global 1 × 1° and 6-hourly modal aerosol optical thickness (AOT) reanalysis product. This data set can be applied to basic and applied Earth system science studies of significant aerosol events, aerosol impacts on numerical weather prediction, and electro-optical propagation and sensor performance, among other uses. This paper describes the science of how to develop and score an aerosol reanalysis product. This reanalysis utilizes a modified Navy Aerosol Analysis and Prediction System (NAAPS) at its core and assimilates quality-controlled retrievals of AOT from the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua and the Multi-angle Imaging SpectroRadiometer (MISR) on Terra. The aerosol source functions, including dust and smoke, were regionally tuned to obtain the best match between the model fine- and coarse-mode AOTs and the Aerosol Robotic Network (AERONET) AOTs. Other model processes, including deposition, were tuned to minimize the AOT difference between the model and satellite AOTs. Aerosol wet deposition in the tropics is driven with satellite-retrieved precipitation, rather than the model field. The final reanalyzed fine- and coarse-mode AOT at 550 nm is shown to have good agreement with AERONET observations, with global mean root mean square error around 0.1 for both fine- and coarse-mode AOTs. This paper includes a discussion of issues particular to aerosol reanalyses that make them distinct from standard meteorological reanalyses, considerations for extending such a reanalysis outside of the NASA A-Train era, and examples of how
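    The skill score quoted above (global mean RMSE near 0.1 against AERONET) is a plain root-mean-square error over collocated model/AERONET AOT pairs. A minimal sketch with made-up values:

```python
import math

def rmse(model_aot, aeronet_aot):
    """Root-mean-square error between collocated model and AERONET AOTs."""
    diffs = [(m - a) ** 2 for m, a in zip(model_aot, aeronet_aot)]
    return math.sqrt(sum(diffs) / len(diffs))

# Illustrative (made-up) collocated 550 nm AOT values:
model   = [0.12, 0.35, 0.08, 0.50]
aeronet = [0.10, 0.30, 0.15, 0.45]
print(round(rmse(model, aeronet), 3))  # → 0.051
```

    In the reanalysis itself this score is computed separately for the fine and coarse modes, so a single global number summarizes each mode's agreement with the ground truth.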

  5. Nonlethal suppression: from basic science to operationally relevant experimentation

    NASA Astrophysics Data System (ADS)

    Servatius, Richard J.; Beck, Kevin D.

    2006-05-01

    Use of force justification, second nature to law enforcement personnel, is increasingly considered by military personnel, especially in military operations on urban terrain (MOUT) scenarios. In these situations, military and civilian law enforcement objectives are similar: exert control over individuals and groups with minimum force. Although the list of potential devices and systems grows, empirical demonstrations of effectiveness are lacking. Here, a position is presented regarding approaches to the experimental analysis of nonlethal (a.k.a. less-than-lethal or less-lethal) technologies and solutions. Appreciation of the concept of suppression and its attendant behavioral variables will advance the development of nonlethal weapons and systems (NLW&S).

  6. Students' Epistemologies about Experimental Physics: Validating the Colorado Learning Attitudes about Science Survey for Experimental Physics

    ERIC Educational Resources Information Center

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-01-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder…

  7. Hybrid Grid Generation Using NW Grid

    SciTech Connect

    Jones-Oliveira, Janet B.; Oliveira, Joseph S.; Trease, Lynn L.; Trease, Harold E.; B.K. Soni, J. Hauser, J.F. Thompson, P.R. Eiseman

    2000-09-01

    We describe the development and use of a hybrid n-dimensional grid generation system called NWGRID. The Applied Mathematics Group at Pacific Northwest National Laboratory (PNNL) is developing this tool to support the Laboratory's computational science efforts in chemistry, biology, engineering and environmental (subsurface and atmospheric) modeling. NWGRID is a grid generation system designed for multi-scale, multi-material, multi-physics, time-dependent, 3-D, hybrid grids that are either statically adapted or evolved in time. NWGRID's capabilities include static and dynamic grids, hybrid grids, management of colliding surfaces, and grid optimization [using reconnection, smoothing, and adaptive mesh refinement (AMR) algorithms]. NWGRID's data structure can manage an arbitrary number of grid objects, each with an arbitrary number of grid attributes. NWGRID uses surface geometry to build volumes through combinations of Boolean operators and order relations. Point distributions can be input, generated using ray-shooting techniques, or defined point by point. Connectivity matrices are then generated automatically for all variations of hybrid grids.
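    One way to picture the "Boolean operators on surface geometry" mentioned above is with signed distance functions (negative inside a solid), where union, intersection and difference become min/max combinations. This is a hedged illustration of the general technique, not NWGRID's actual representation:

```python
import math

# Signed distance to a sphere: negative inside, positive outside.
def sphere(cx, cy, cz, r):
    return lambda x, y, z: math.dist((x, y, z), (cx, cy, cz)) - r

def union(a, b):     return lambda x, y, z: min(a(x, y, z), b(x, y, z))
def intersect(a, b): return lambda x, y, z: max(a(x, y, z), b(x, y, z))
def subtract(a, b):  return lambda x, y, z: max(a(x, y, z), -b(x, y, z))

# A unit ball with a smaller ball carved out of its right side:
solid = subtract(sphere(0, 0, 0, 1.0), sphere(1.0, 0, 0, 0.5))
print(solid(0, 0, 0) < 0)    # True: the centre survives the subtraction
print(solid(0.9, 0, 0) < 0)  # False: this point lies in the carved pocket
```

    An inside/outside test like this is all a mesher needs to decide which candidate grid points belong to the volume being gridded.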

  8. The Art and Science of Experimentation in Quantum Physics

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2010-05-01

    Taking its historical point of departure in Heisenberg's work, this article offers a view of quantum mechanics as, arguably, the first truly experimental and truly mathematical physical theory, that is, a theory concerned with experimenting with nature and mathematics alike. It is truly experimental because it is not, as in classical physics, merely the independent behavior of the system considered, in other words, what happens in any event, that we track, but what kind of experiments we perform that defines what happens. By the same token, the theory is also truly mathematical because, at least in the interpretation adopted here, its mathematical formalism does not stand in the service of a mathematical description of (quantum) physical processes in space and time in the way the formalism of classical physics does, but is only used to predict the outcomes of relevant experiments. It also follows that quantum theories experiment more freely with mathematics itself, since we invent predictive mathematical schemes, rather than proceed by refining mathematically our phenomenal representations of nature, which process constrains us in classical mechanics.

  9. A New Virtual and Remote Experimental Environment for Teaching and Learning Science

    NASA Astrophysics Data System (ADS)

    Lustigova, Zdena; Lustig, Frantisek

    This paper describes how a scientifically exact, problem-solving-oriented remote and virtual science experimental environment might help to build a new strategy for science education. The main features are: remote observation and control of real-world phenomena, their processing and evaluation, and verification of hypotheses, combined with the development of critical thinking; sophisticated tools for searching, classifying and storing relevant information; and a collaborative environment supporting argumentative writing and teamwork, public presentations and the defense of achieved results, all in real presence, in telepresence, or in a combination of both. Only then can a real understanding of generalized science laws and their consequences be developed. This science learning and teaching environment, called ROL (Remote and Open Laboratory), has been developed and used by Charles University in Prague since 1996; it is offered to science students in both formal and informal learning and, since 2003, also to science teachers within their professional development studies.

  10. Considerations for Life Science experimentation on the Space Shuttle.

    PubMed

    Souza, K A; Davies, P; Rossberg Walker, K

    1992-10-01

    The conduct of Life Science experiments aboard the Shuttle Spacelab presents unaccustomed challenges to scientists. Not only is one confronted with the challenge of conducting an experiment in the unique microgravity environment of an orbiting spacecraft, but there are also the challenges of conducting experiments remotely, using equipment, techniques, chemicals, and materials that may differ from those standard in one's own laboratory. Then there is the question of "controls." How does one study the effects of altered gravitational fields on biological systems and control for other variables like vibration, acceleration, noise, temperature, humidity, and the logistics of specimen transport? Typically, the scientist new to space research has neither considered all of these potential problems nor has the data at hand with which to tackle them. This paper explores some of these issues and provides pertinent data from recent Space Shuttle flights that will assist the new as well as the experienced scientist in dealing with the challenges of conducting research under spaceflight conditions. PMID:11537654

  11. Social Science and Neuroscience beyond Interdisciplinarity: Experimental Entanglements

    PubMed Central

    Callard, Felicity

    2015-01-01

    This article is an account of the dynamics of interaction across the social sciences and neurosciences. Against an arid rhetoric of ‘interdisciplinarity’, it calls for a more expansive imaginary of what experiment – as practice and ethos – might offer in this space. Arguing that opportunities for collaboration between social scientists and neuroscientists need to be taken seriously, the article situates itself against existing conceptualizations of these dynamics, grouping them under three rubrics: ‘critique’, ‘ebullience’ and ‘interaction’. Despite their differences, each insists on a distinction between sociocultural and neurobiological knowledge, or does not show how a more entangled field might be realized. The article links this absence to the ‘regime of the inter-’, an ethic of interdisciplinarity that guides interaction between disciplines on the understanding of their pre-existing separateness. The argument of the paper is thus twofold: (1) that, contra the ‘regime of the inter-’, it is no longer practicable to maintain a hygienic separation between sociocultural webs and neurobiological architecture; (2) that the cognitive neuroscientific experiment, as a space of epistemological and ontological excess, offers an opportunity to researchers, from all disciplines, to explore and register this realization. PMID:25972621

  12. Considerations for Life Science experimentation on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Souza, K. A.; Davies, P.; Rossberg Walker, K.

    1992-01-01

    The conduct of Life Science experiments aboard the Shuttle Spacelab presents unaccustomed challenges to scientists. Not only is one confronted with the challenge of conducting an experiment in the unique microgravity environment of an orbiting spacecraft, but there are also the challenges of conducting experiments remotely, using equipment, techniques, chemicals, and materials that may differ from those standard in one's own laboratory. Then there is the question of "controls." How does one study the effects of altered gravitational fields on biological systems and control for other variables like vibration, acceleration, noise, temperature, humidity, and the logistics of specimen transport? Typically, the scientist new to space research has neither considered all of these potential problems nor has the data at hand with which to tackle them. This paper explores some of these issues and provides pertinent data from recent Space Shuttle flights that will assist the new as well as the experienced scientist in dealing with the challenges of conducting research under spaceflight conditions.

  13. Science and society: different bioethical approaches towards animal experimentation.

    PubMed

    Brom, Frans W A

    2002-01-01

    respect their integrity. By weighing these prima facie duties, the moral problem of animal experimentation exists in finding which duty actually has to be considered as the decisive duty. It will be argued that these three views, even though they will all justify animal experimentation to some extent, will do so in practice under different conditions. Many current conflicts regarding the use of animals for research may be better understood in light of the conflict between the three bioethical perspectives provided by these views. PMID:12098014

  14. Challenges facing production grids

    SciTech Connect

    Pordes, Ruth; /Fermilab

    2007-06-01

    Today's global communities of users expect quality of service from distributed Grid systems equivalent to that of their local data centers. This must be coupled with ubiquitous access to the ensemble of processing and storage resources across multiple Grid infrastructures. We are still facing significant challenges in meeting these expectations, especially in the underlying security, a sustainable and successful economic model, and smoothing the boundaries between administrative and technical domains. Using the Open Science Grid as an example, I examine the status and challenges of Grids operating in production today.

  15. The Earth System Grid Federation (ESGF): Climate Science Infrastructure for Large-scale Data Management and Dissemination

    NASA Astrophysics Data System (ADS)

    Williams, D. N.

    2015-12-01

    Progress in understanding and predicting climate change requires advanced tools to securely store, manage, access, process, analyze, and visualize enormous and distributed data sets. Only then can climate researchers understand the effects of climate change across all scales and use this information to inform policy decisions. With the advent of major international climate modeling intercomparisons, a need emerged within the climate-change research community to develop efficient, community-based tools to obtain relevant meteorological and other observational data, develop custom computational models, and export analysis tools for climate-change simulations. While many nascent efforts to fill these gaps appeared, they were not integrated and therefore did not benefit from collaborative development. Sharing huge data sets was difficult, and the lack of data standards prevented the merger of output data from different modeling groups. Thus began one of the largest-ever collaborative data efforts in climate science, resulting in the Earth System Grid Federation (ESGF), which is now used to disseminate model, observational, and reanalysis data for research assessed by the Intergovernmental Panel on Climate Change (IPCC). Today, ESGF is an open-source petabyte-level data storage and dissemination operational code-base that manages secure resources essential for climate change study. It is designed to remain robust even as data volumes grow exponentially. The internationally distributed, peer-to-peer ESGF "data cloud" archive represents the culmination of an effort that began in the late 1990s. ESGF portals are gateways to scientific data collections hosted at sites around the globe that allow the user to register and potentially access the entire ESGF network of data and services. The growing international interest in ESGF development efforts has attracted many others who want to make their data more widely available and easy to use. For example, the World Climate

  16. "Exploratory experimentation" as a probe into the relation between historiography and philosophy of science.

    PubMed

    Schickore, Jutta

    2016-02-01

    This essay utilizes the concept "exploratory experimentation" as a probe into the relation between historiography and philosophy of science. The essay traces the emergence of the historiographical concept "exploratory experimentation" in the late 1990s. The reconstruction of the early discussions about exploratory experimentation shows that the introduction of the concept had unintended consequences: Initially designed to debunk philosophical ideas about theory testing, the concept "exploratory experimentation" quickly exposed the poverty of our conceptual tools for the analysis of experimental practice. Looking back at a number of detailed analyses of experimental research, we can now appreciate that the concept of exploratory experimentation is too vague and too elusive to fill the desideratum whose existence it revealed. PMID:26774065

  17. Spline for blade grids design

    NASA Astrophysics Data System (ADS)

    Korshunov, Andrei; Shershnev, Vladimir; Korshunova, Ksenia

    2015-08-01

    Methods of designing the blade grids of power machines, such as equal-thickness shapes built on a mean-line arc or methods based on a target stress distribution, were developed long ago, are well described, and are still in use. Science and technology have moved far beyond that time, and the laboriousness of experimental research, which involved unique equipment, calls for the development of new, robust and flexible design methods that determine the optimal geometry of the flow passage. This investigation provides a simple and universal method of designing blades which, in comparison to the currently used methods, requires significantly less input data but still provides accurate results. The described method is purely analytical for both the concave and convex sides of the blade, and therefore allows the curve behavior to be described at any point along the flow path. Compared with the blade grid designs currently used in industry, the geometric parameters of designs constructed with this method show a maximum deviation below 0.4%.
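    As an illustration of the kind of analytic parametric curve the abstract argues for (one whose behaviour is defined at every point of the blade side), here is a cubic Bézier evaluated by de Casteljau's algorithm; both the curve type and the control points are hypothetical stand-ins, not the paper's method:

```python
def de_casteljau(pts, t):
    """Evaluate a Bezier curve with control points pts at parameter t in [0, 1]."""
    pts = [tuple(p) for p in pts]
    while len(pts) > 1:
        # Repeated linear interpolation between consecutive control points.
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Hypothetical mean (camber) line of a blade, chord normalized to 1:
camber = [(0.0, 0.0), (0.3, 0.2), (0.7, 0.25), (1.0, 0.1)]
print(de_casteljau(camber, 0.0))  # → (0.0, 0.0): curve starts at the first control point
```

    Because the curve is polynomial in t, tangents and curvature along the flow path follow analytically, which is the property the abstract's method exploits.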

  18. Analysis and experimental verification of new power flow control for grid-connected inverter with LCL filter in microgrid.

    PubMed

    Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang

    2014-01-01

    Microgrid is an effective way to integrate distributed energy resources into the utility network. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control mitigates these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method. PMID:24672304
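    The steady-state power flow that such a controller regulates follows the classic relation for an inverter coupled to the grid through a predominantly inductive reactance X: P = V1·V2·sin(δ)/X and Q = (V1² − V1·V2·cos(δ))/X. A sketch with illustrative values, not parameters from the paper:

```python
import math

def power_flow(v_inv, v_grid, delta_rad, x_ohm):
    """Single-phase sending-end P (W) and Q (var) across reactance x_ohm,
    for inverter voltage v_inv, grid voltage v_grid and power angle delta."""
    p = v_inv * v_grid * math.sin(delta_rad) / x_ohm
    q = (v_inv ** 2 - v_inv * v_grid * math.cos(delta_rad)) / x_ohm
    return p, q

# Small power angle: active power dominates reactive power, which is why
# droop-style controllers pair P with the angle and Q with the voltage magnitude.
p, q = power_flow(230.0, 230.0, math.radians(2.0), 1.5)
print(round(p), round(q))  # → 1231 21
```

    Linearizing these two equations around an operating point is what yields the small-signal model whose damping and transient response the paper analyzes.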

  19. Views of the STS-5 Science Press briefing with Student Experimenters

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Views of the STS-5 Science Press briefing with Student Experimenters. Photos include Michelle Issel of Wallingford, Connecticut showing her student experiment dealing with the formation of crystals in a weightless environment (37862); Aaron Gillette of Winter Haven, Florida displaying his student experiment dealing with the growth of Porifera in zero gravity (37863).

  20. The Role of the Scientific Discovery Narrative in Middle School Science Education: An Experimental Study

    ERIC Educational Resources Information Center

    Arya, Diana J.; Maul, Andrew

    2012-01-01

    In an experimental study (N = 209), the authors compared the effects of exposure to typical middle-school written science content when presented in the context of the scientific discovery narrative and when presented in a more traditional nonnarrative format on 7th and 8th grade students in the United States. The development of texts was…

  1. Factors Influencing Students' Choice(s) of Experimental Science Subjects within the International Baccalaureate Diploma Programme

    ERIC Educational Resources Information Center

    James, Kieran

    2007-01-01

    This article outlines a study conducted in Finland and Portugal into the reasons why International Baccalaureate (IB) Diploma Programme (DP) students choose particular Experimental Science (Group 4) subjects. Its findings suggest that interest, enjoyment, university course and career requirements have most influence on students' choices.…

  2. Opening Possibilities in Experimental Science and Its History: Critical Explorations with Pendulums and Singing Tubes

    ERIC Educational Resources Information Center

    Cavicchi, Elizabeth

    2008-01-01

    A teacher and a college student explore experimental science and its history by reading historical texts, and responding with replications and experiments of their own. A curriculum of ever-widening possibilities evolves in their ongoing interactions with each other, history, and such materials as pendulums, flame, and resonant singing tubes.…

  3. Mathematics and Experimental Sciences in the FRG-Upper Secondary Schools. Occasional Paper 40.

    ERIC Educational Resources Information Center

    Steiner, Hans-Georg

    The mathematics and experimental science courses in the programs of the upper secondary school in the Federal Republic of Germany (FRG) are discussed. The paper addresses: (1) the two "secondary levels" within the FRG school system, indicating that the Secondary I-Level (SI) comprises grades 5 through 9 or 10 while the Secondary II-Level (SII)…

  4. General Science, Ninth Grade: Theme III and Theme IV. Student Laboratory Manual. Experimental.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.

    This document is the student laboratory manual that was designed to accompany some of the experimental activities found in the teacher's guide to this general science course for ninth graders. It contains laboratory worksheets for lessons on such topics as: (1) soil; (2) hazardous waste; (3) wildlife refuges; (4) the water cycle; (5) water…

  5. The resisted rise of randomisation in experimental design: British agricultural science, c.1910-1930.

    PubMed

    Berry, Dominic

    2015-09-01

    The most conspicuous form of agricultural experiment is the field trial, and within the history of such trials, the arrival of the randomised controlled trial (RCT) is considered revolutionary. Originating with R.A. Fisher within British agricultural science in the 1920s and 1930s, the RCT has since become one of the most prodigiously used experimental techniques throughout the natural and social sciences. Philosophers of science have already scrutinised the epistemological uniqueness of RCTs, undermining their status as the 'gold standard' in experimental design. The present paper introduces a historical case study from the origins of the RCT, uncovering the initially cool reception given to this method by agricultural scientists at the University of Cambridge and the (Cambridge-based) National Institute of Agricultural Botany. Rather than giving further attention to the RCT, the paper focuses instead on a competitor method - the half-drill strip - which both predated the RCT and remained in wide use for at least a decade beyond the latter's arrival. In telling this history, John Pickstone's Ways of Knowing is adopted as the most flexible and productive way to write the history of science, particularly when sciences and scientists have to work across a number of different kinds of place. It is shown that those who resisted the RCT did so in order to preserve epistemic and social goals that randomisation would have otherwise run a tractor through. PMID:26205200

  6. A Blend of Romanism and Germanism: Experimental Science Instruction in Belgian State Secondary Education, 1880-1914

    ERIC Educational Resources Information Center

    Onghena, Sofie

    2013-01-01

    A case study of secondary experimental science instruction in Belgium demonstrates the importance of cross-national communication in the study of science education. Belgian secondary science education in the years 1880-1914 had a clear internationalist dimension. French and German influences turn out to have been essential, stimulated by the fact…

  7. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information-based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users. The first is the scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  8. Probeware in 8th Grade Science: A Quasi-Experimental Study on Attitude and Achievement

    NASA Astrophysics Data System (ADS)

    Moyer, John F., III

    The use of probeware in the delivery of science instruction has become quite widespread over the past few decades. The current emphasis on Science, Technology, Engineering, and Mathematics (STEM) education, especially in the case of underrepresented populations, seems to have accelerated the inclusion of probeware in the curriculum. This quasi-experimental study sought to examine the effects of a direct replacement of traditional science tools with computer-based probeware on student achievement and student attitude toward science. Data analysis was conducted for large comparison groups and then for the target STEM groups: African American, low socioeconomic status, and female students. Student achievement was measured by the Energy Concept Inventory and student attitude was measured by the Attitude Toward Science Inventory. The results showed that probeware did not have a significant effect on student achievement for almost all comparison groups. Analysis of student attitude toward science revealed that the use of probeware significantly affected overall student attitude as well as student attitude in several disaggregated subscales of attitude. These findings hold for both the comparison groups and the target STEM groups. Limitations of the study and suggestions for future research are presented.

  9. Experimental Characterization of a Grid-Loss Event on a 2.5-MW Dynamometer Using Advanced Operational Modal Analysis: Preprint

    SciTech Connect

    Helsen, J.; Weijtjens, W.; Guo, Y.; Keller, J.; McNiff, B.; Devriendt, C.; Guillaume, P.

    2015-02-01

    This paper experimentally investigates a worst-case grid-loss event conducted on the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) drivetrain mounted on the 2.5-MW NREL dynamic nacelle test rig. The GRC drivetrain has a directly grid-coupled, fixed-speed asynchronous generator. The main goal is to assess the dynamic content driving this particular event, in particular at the high-speed stage of the GRC gearbox. In addition to external accelerometers, high-frequency sampled measurements of strain gauges were used to assess torque fluctuations and bending moments both at the nacelle main shaft and the gearbox high-speed shaft (HSS) through the entire duration of the event. Modal analysis was conducted using a polyreference Least Squares Complex Frequency-domain (pLSCF) modal identification estimator. The event driving the torsional resonance was identified. Moreover, the pLSCF estimator identified the main drivetrain resonances based on a combination of acceleration and strain measurements. Without external action during the grid-loss event, a mode shape characterized by counter-phase rotation of the rotor and generator rotor, determined by the drivetrain flexibility and rotor inertias, was the main driver of the event. This behavior resulted in significant torque oscillations with large-amplitude negative torque periods. Based on tooth strain measurements of the HSS pinion, this work showed that at each zero-crossing, the teeth lost contact and came into contact with the backside flank. In addition, dynamic nontorque loads between the gearbox and generator at the HSS played an important role, as indicated by strain gauge measurements.

  10. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems.
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  11. A quasi-experimental quantitative study of the effect of IB on science performance

    NASA Astrophysics Data System (ADS)

    Healer, Margaret Irene

    The purpose of this quasi-experimental quantitative research study was to investigate the effect of participation in the International Baccalaureate (IB) program on science performance. The findings of the 2x3 mixed ANOVA and eta-squared analysis indicated a significant difference in science CSAP mean scores between the treatment group (IB students, n = 50) and the control group (non-IB students, n = 50) at the 5th through 10th grade levels. The analysis of data concluded that although scores declined between the 5th, 8th, and 10th grades for both IB and non-IB students, a statistical difference was indicated at each level between the two groups in the area of science performance as measured by the CSAP assessment. Educational leaders can use the findings of this study to maximize student science achievement. Further research is recommended through a mixed study to determine the effectiveness of participation in the IB Program, and a study of the specificity of pedagogical strategies used with science performance, with a larger sample size of IB and non-IB students longitudinally.

  12. Grid Work

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Pointwise Inc.'s Gridgen software is a system for the generation of 3D (three-dimensional) multiple-block, structured grids. Gridgen is a visually oriented, graphics-based interactive code used to decompose a 3D domain into blocks, distribute grid points on curves, initialize and refine grid points on surfaces, and initialize volume grid points. Gridgen is available to U.S. citizens and American-owned companies by license.

  13. Data Grid Implementations

    SciTech Connect

    Moore, Reagan W.; Studham, Ronald S.; Rajasekar, Arcot; Watson, Chip; Stockinger, Heinz; Kunszt, Peter; Charlie Catlett and Ian Foster

    2002-02-27

    Data grids link distributed, heterogeneous storage resources into a coherent data management system. From a user perspective, the data grid provides a uniform name space across the underlying storage systems, while supporting retrieval and storage of files. In the high energy physics community, at least six data grids have been implemented for the storage and distribution of experimental data. Data grids are also being used to support projects as diverse as digital libraries (National Library of Medicine Visible Embryo project), federation of multiple astronomy sky surveys (NSF National Virtual Observatory project), and integration of distributed data sets (Long Term Ecological Reserve). Data grids also form the core interoperability mechanisms for creating persistent archives, in which data collections are migrated to new technologies over time. The ability to provide a uniform name space across multiple administration domains is becoming a critical component of national-scale, collaborative projects.

  14. Experimental performance evaluation of software defined networking (SDN) based data communication networks for large scale flexi-grid optical networks.

    PubMed

    Zhao, Yongli; He, Ruiying; Chen, Haoran; Zhang, Jie; Ji, Yuefeng; Zheng, Haomian; Lin, Yi; Wang, Xinbo

    2014-04-21

    Software defined networking (SDN) has become a focus in the current information and communication technology area because of its flexibility and programmability. It has been introduced into various network scenarios, such as datacenter networks, carrier networks, and wireless networks. The optical transport network is also regarded as an important application scenario for SDN, which is adopted as the enabling technology of data communication networks (DCN) instead of generalized multi-protocol label switching (GMPLS). However, the practical performance of SDN-based DCN for large-scale optical networks, which is very important for technology selection in future optical network deployment, has not been evaluated up to now. In this paper we have built a large-scale flexi-grid optical network testbed with 1000 virtual optical transport nodes to evaluate the performance of SDN-based DCN, including network scalability, DCN bandwidth limitation, and restoration time. A series of network performance parameters, including blocking probability, bandwidth utilization, average lightpath provisioning time, and failure restoration time, have been demonstrated under various network environments, such as different traffic loads and different DCN bandwidths. The demonstration in this work can be taken as a proof for future network deployment. PMID:24787842
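    Among the metrics listed, blocking probability has a standard analytic baseline in the Erlang B formula; the sketch below is that textbook reference model, not the testbed's evaluation method:

```python
def erlang_b(offered_traffic, channels):
    """Erlang B blocking probability via the numerically stable
    recursion B(0) = 1, B(n) = A*B(n-1) / (n + A*B(n-1)), where A is
    the offered traffic in erlangs and n the number of channels
    (e.g. wavelength slots on a link). Textbook baseline only."""
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_traffic * b / (n + offered_traffic * b)
    return b
```

    With 2 erlangs offered to 2 channels this yields a blocking probability of 0.4; real flexi-grid blocking additionally depends on spectrum contiguity and continuity constraints, which this simple model ignores.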

  15. MAGNETIC GRID

    DOEpatents

    Post, R.F.

    1960-08-01

    An electronic grid is designed employing magnetic forces for controlling the passage of charged particles. The grid is particularly applicable to use in gas-filled tubes such as ignitrons, thyratrons, etc., since the magnetic grid action is impartial to the polarity of the charged particles and, accordingly, the sheath effects encountered with electrostatic grids are not present. The grid comprises a conductor having sections spaced apart and extending in substantially opposite directions in the same plane, the ends of the conductor being adapted for connection to a current source.

  16. Highly transparent low resistance Ga doped ZnO/Cu grid double layers prepared at room temperature

    NASA Astrophysics Data System (ADS)

    Jang, Cholho; Zhizhen, Ye; Jianguo, Lü

    2015-12-01

    Ga doped ZnO (GZO)/Cu grid double layer structures were prepared at room temperature (RT). We have studied the electrical and optical characteristics of the GZO/Cu grid double layer as a function of the Cu grid spacing distance. The optical transmittance and sheet resistance of the GZO/Cu grid double layer are higher than those of the GZO/Cu film double layer regardless of the Cu grid spacing distance, and both increase as the Cu grid spacing distance increases. The calculated transmittance and sheet resistance of the GZO/Cu grid double layer follow the experimentally observed values well. For the GZO/Cu grid double layer with a Cu grid spacing distance of 1 mm, the highest figure of merit (ΦTC = 6.19 × 10^-3 Ω^-1) was obtained. In this case, the transmittance, resistivity and filling factor (FF) of the GZO/Cu grid double layer are 83.74%, 1.10 × 10^-4 Ω·cm and 0.173, respectively. Project supported by the Key Project of the National Natural Science Foundation of China (No. 91333203), the Program for Innovative Research Team in University of Ministry of Education of China (No. IRT13037), the National Natural Science Foundation of China (No. 51172204), and the Zhejiang Provincial Department of Science and Technology of China (No. 2010R50020).
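    The quoted figure of merit is consistent with Haacke's definition, Phi_TC = T^10 / R_s. A quick check in Python; the sheet resistance of ~27.4 Ω/sq is back-computed here for illustration, since the record reports resistivity rather than sheet resistance:

```python
def haacke_fom(transmittance, sheet_resistance):
    """Haacke figure of merit Phi_TC = T**10 / R_s for a transparent
    conductor (transmittance as a fraction, sheet resistance in ohm/sq)."""
    return transmittance ** 10 / sheet_resistance

# T = 83.74% with an inferred (not reported) sheet resistance of
# ~27.4 ohm/sq gives a figure of merit near the reported 6.19e-3 1/ohm.
fom = haacke_fom(0.8374, 27.4)
```

    The T^10 weighting strongly rewards transparency, which is why a sparser Cu grid (higher transmittance, higher sheet resistance) can still maximize the figure of merit.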

  18. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    SciTech Connect

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  19. Can Jurors Recognize Missing Control Groups, Confounds, and Experimenter Bias in Psychological Science?

    PubMed Central

    McAuliff, Bradley D.; Kovera, Margaret Bull; Nunez, Gabriel

    2010-01-01

    This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed. PMID:18587635

  20. A Blend of Romanism and Germanism: Experimental Science Instruction in Belgian State Secondary Education, 1880-1914

    NASA Astrophysics Data System (ADS)

    Onghena, Sofie

    2013-04-01

    A case study of secondary experimental science instruction in Belgium demonstrates the importance of cross-national communication in the study of science education. Belgian secondary science education in the years 1880-1914 had a clear internationalist dimension. French and German influences turn out to have been essential, stimulated by the fact that Belgium, as a result of its geographical position, considered itself the centre of scientific relations between France and Germany, and was actually strengthened by its linguistic and cultural dualism in this regard. This pursuit of internationalist nationalism also affected the configuration of chemistry and physics as experimental courses at Belgian Royal State Schools, although the years preceding WWI are usually characterized as a period of rising nationalism in science, with countries such as Germany and France as prominent actors. To what extent did France and Germany influence Belgian debates on science education, science teachers' training, the use of textbooks, and the installation of school laboratories and teaching collections?

  1. The Grid

    SciTech Connect

    White, Vicky

    2003-05-21

    By now almost everyone has heard of 'The Grid', or 'Grid Computing' as it should more properly be described. There are frequent articles in both the popular and scientific press talking about 'The Grid' or about some specific Grid project. The Run II experiments, US-CMS, BTeV, the Sloan Digital Sky Survey and the Lattice QCD folks are all incorporating aspects of Grid Computing in their plans, and the Fermilab Computing Division is supporting and encouraging these efforts. Why are we doing this, and what does it have to do with running a physics experiment or getting scientific results? I will explore some of these questions and try to give an overview, not so much of the technical aspects of Grid Computing, but rather of what the phenomenon means for our field.

  2. Current Grid operation and future role of the Grid

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    2012-12-01

    Grid-like technologies and approaches have become an integral part of HEP experiments. Some other scientific communities also use similar technologies for data-intensive computations. The distinct feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the greater the burden on operations, and the other way around. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. Reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from a situation where Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world where the Grid will ultimately reach the status of a generic public computing and storage service provider, and permanent national and international Grid infrastructures will be established. How far we will be able to advance along this path depends on us. 
If no standardisation and convergence efforts will take place

  3. The Frequency of Hands-On Experimentation and Student Attitudes toward Science: A Statistically Significant Relation (2005-51-Ornstein)

    ERIC Educational Resources Information Center

    Ornstein, Avi

    2006-01-01

    Attitudinal data tested hypotheses that students have more positive attitudes toward science when teachers regularly emphasize hands-on laboratory activities and when students more frequently experience higher levels of experimentation or inquiry. The first predicted that students would have more positive attitudes toward science in classrooms…

  4. Evidence of Experimental Bias in the Life Sciences: Why We Need Blind Data Recording.

    PubMed

    Holman, Luke; Head, Megan L; Lanfear, Robert; Jennions, Michael D

    2015-07-01

    Observer bias and other "experimenter effects" occur when researchers' expectations influence study outcome. These biases are strongest when researchers expect a particular result, are measuring subjective variables, and have an incentive to produce data that confirm predictions. To minimize bias, it is good practice to work "blind," meaning that experimenters are unaware of the identity or treatment group of their subjects while conducting research. Here, using text mining and a literature review, we find evidence that blind protocols are uncommon in the life sciences and that nonblind studies tend to report higher effect sizes and more significant p-values. We discuss methods to minimize bias and urge researchers, editors, and peer reviewers to keep blind protocols in mind. PMID:26154287

  5. Evidence of Experimental Bias in the Life Sciences: Why We Need Blind Data Recording

    PubMed Central

    Lanfear, Robert; Jennions, Michael D.

    2015-01-01

    Observer bias and other “experimenter effects” occur when researchers’ expectations influence study outcome. These biases are strongest when researchers expect a particular result, are measuring subjective variables, and have an incentive to produce data that confirm predictions. To minimize bias, it is good practice to work “blind,” meaning that experimenters are unaware of the identity or treatment group of their subjects while conducting research. Here, using text mining and a literature review, we find evidence that blind protocols are uncommon in the life sciences and that nonblind studies tend to report higher effect sizes and more significant p-values. We discuss methods to minimize bias and urge researchers, editors, and peer reviewers to keep blind protocols in mind. PMID:26154287

  6. The DESY Grid Centre

    NASA Astrophysics Data System (ADS)

    Haupt, A.; Gellrich, A.; Kemp, Y.; Leffhalm, K.; Ozerov, D.; Wegner, P.

    2012-12-01

    DESY is one of the world-wide leading centres for research with particle accelerators, synchrotron light and astroparticles. DESY participates in the LHC as a Tier-2 centre, supports on-going analyses of HERA data, is a leading partner for the ILC, and runs the National Analysis Facility (NAF) for LHC and ILC within the framework of the Helmholtz Alliance "Physics at the Terascale". For research with synchrotron light, major new facilities are operated and built (FLASH, PETRA-III, and XFEL). DESY furthermore acts as Data-Tier1 centre for the neutrino detector IceCube. Established within the EGI project, DESY operates a grid infrastructure which supports a number of Virtual Organizations (VOs), including ATLAS, CMS, and LHCb. Furthermore, DESY hosts a number of HEP and non-HEP VOs, such as the HERA experiments and ILC, as well as photon science communities. Support for the new astroparticle physics VOs IceCube and CTA is currently being set up. As the global structure of the grid offers huge resources which are well suited to batch-like computing, DESY has set up the National Analysis Facility (NAF), which complements the grid to allow German HEP users to perform efficient data analysis. The grid infrastructure and the NAF use the same physics data, which are distributed via the grid. We call the conjunction of grid and NAF the DESY Grid Centre. In this contribution to CHEP2012 we discuss in depth the conceptual and operational aspects of our multi-VO and multi-community Grid Centre and present the system setup. We focus in particular on the interplay of Grid and NAF and present operational experiences.

  7. Implementing Production Grids

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Ziobarth, John (Technical Monitor)

    2002-01-01

    We have presented the essence of the experience gained in building two production Grids, and provided some of the global context for this work. As the reader might imagine, there were a lot of false starts, refinements to the approaches and to the software, and several substantial integration projects (SRB and Condor integrated with Globus) to get where we are today. However, the point of this paper is to make it substantially easier for others to get to the point where the Information Power Grid (IPG) and the DOE Science Grids are today. This is what is needed in order to move us toward the vision of a common cyber infrastructure for science. The author would also like to remind readers that this paper primarily represents the actual experiences that resulted from specific architectural and software choices during the design and implementation of these two Grids. The choices made were dictated by the criteria laid out in section 1. There is a lot more Grid software available today than there was four years ago, and several of these packages are being integrated into IPG and the DOE Grids. However, the foundation choices of Globus, SRB, and Condor would not be significantly different today than they were four years ago. Nonetheless, if the GGF is successful in its work - and we have every reason to believe that it will be - then in a few years we will see that the 28 functions provided by these packages will be defined in terms of protocols and APIs, and there will be several robust implementations available for each of the basic components, especially the Grid Common Services. The impact of the emerging Web Grid Services work is not yet clear. It will likely have a substantial impact on building higher-level services; however, it is the opinion of the author that this will in no way obviate the need for the Grid Common Services. These are the foundation of Grids, and the focus of almost all of the operational and persistent infrastructure aspects of Grids.

  8. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results, and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper investigates this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every introductory statistics course for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand, the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research, and by the end of the 1950s they had become established in psychology, sociology, education, chemistry, medicine, engineering, economics and quality control, to mention just a few of the disciplines which adopted them. PMID:25311906
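Fisher's analysis of variance partitions the total variation in the data into between-treatment and within-treatment components and compares them through an F ratio. A minimal sketch of that arithmetic (our illustration, not taken from the paper):

```python
def one_way_anova_f(groups):
    """Compute the F statistic for a one-way analysis of variance:
    the ratio of between-group to within-group mean squares."""
    k = len(groups)                              # number of treatments
    n = sum(len(g) for g in groups)              # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # within-group sum of squares: spread of observations around their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within
```

A large F indicates that treatment means differ by more than the within-treatment noise would suggest.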

  9. Fibonacci Grids

    NASA Technical Reports Server (NTRS)

    Swinbank, Richard; Purser, James

    2006-01-01

    Recent years have seen a resurgence of interest in a variety of non-standard computational grids for global numerical prediction. The motivation has been to reduce problems associated with the converging meridians and the polar singularities of conventional regular latitude-longitude grids. A further impetus has come from the adoption of massively parallel computers, for which it is necessary to distribute work equitably across the processors; this is more practicable for some non-standard grids. Desirable attributes of a grid for high-order spatial finite differencing are: (i) geometrical regularity; (ii) a homogeneous and approximately isotropic spatial resolution; (iii) a low proportion of grid points where the numerical procedures require special customization (such as near coordinate singularities or grid edges). One family of grid arrangements which, to our knowledge, has never before been applied to numerical weather prediction, but which appears to offer several technical advantages, is what we shall refer to as "Fibonacci grids". They can be thought of as mathematically ideal generalizations of the patterns occurring naturally in the spiral arrangements of seeds and fruit found in sunflower heads and pineapples (to give two of the many botanical examples). These grids possess virtually uniform and highly isotropic resolution, with an equal area for each grid point. There are only two compact singular regions on a sphere that require customized numerics. We demonstrate the practicality of these grids in shallow water simulations, and discuss the prospects for efficiently using these frameworks in three-dimensional semi-implicit and semi-Lagrangian weather prediction or climate models.
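The sunflower-seed construction described above can be sketched in a few lines: points sit in equal-area latitude bands while the longitude advances by the golden angle at each step. This is an illustrative reconstruction, not the authors' code; the function name and sampling choices are ours:

```python
import math

def fibonacci_sphere_grid(n):
    """Generate n near-uniform points on the unit sphere using the
    golden-angle (Fibonacci) lattice construction."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))  # ~2.39996 rad
    points = []
    for i in range(n):
        z = 1.0 - (2.0 * i + 1.0) / n        # equal-area latitude bands
        r = math.sqrt(max(0.0, 1.0 - z * z)) # radius of the latitude circle
        lon = golden_angle * i               # longitude spirals by the golden angle
        points.append((r * math.cos(lon), r * math.sin(lon), z))
    return points
```

Each point owns an equal-area cell, which is the property the abstract highlights for homogeneous, isotropic resolution.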

  10. Spaceflight Operations Services Grid (SOSG) Prototype Implementation and Feasibility Study

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.; Lisotta, Anthony J.; Redman, Sandra

    2004-01-01

    Science Operations Services Grid is focusing on building a prototype grid-based environment that incorporates existing and new spaceflight services to enable current and future NASA programs with cost savings and new and evolvable methods to conduct science in a distributed environment. The Science Operations Services Grid (SOSG) will provide a distributed environment for widely disparate organizations to conduct their systems and processes in a more efficient and cost-effective manner. These organizations include those that: 1) engage in space-based science and operations, 2) develop space-based systems and processes, and 3) conduct scientific research, bringing together disparate scientific disciplines like geology and oceanography to create new information. In addition, educational outreach will be significantly enhanced by providing to schools the same tools used by NASA, with the ability of the schools to actively participate on many levels in the science generated by NASA from space and on the ground. The services range from voice, video and telemetry processing and display to data mining, high-level processing and visualization tools, all accessible from a single portal. In this environment, users would not require high-end systems or processes at their home locations to use these services. Also, the user would need to know minimal details about the applications in order to utilize the services. In addition, security at all levels is an underlying goal of the project. The Science Operations Services Grid will focus on four tools that are currently used by the ISS Payload community, along with nine more that are new to the community. Under the prototype, four Grid virtual organizations (VOs) will be developed to represent four types of users. They are a Payload (experimenters) VO, a Flight Controllers VO, an Engineering and Science Collaborators VO and an Education and Public Outreach VO.
The User-based services will be implemented to replicate the operational voice

  11. LDCM Grid Prototype (LGP)

    NASA Technical Reports Server (NTRS)

    Weinstein, Beth; Lubelczyk, Jeff

    2006-01-01

    The LGP successfully demonstrated that grid technology could be used to create a collaboration among research scientists, their science development machines, and distributed data to create a science production system in a nationally distributed environment. Grid technology provides a low cost and effective method of enabling production of science products by the science community. To demonstrate this, the LGP partnered with NASA GSFC scientists and used their existing science algorithms to generate virtual Landsat-like data products using distributed data resources. LGP created 48 output composite scenes with 4 input scenes each, for a total of 192 scenes processed in parallel. The demonstration took 12 hours, which beat the requirement by almost 50 percent, well within the LDCM requirement to process 250 scenes per day. The LGP project also showed the successful use of workflow tools to automate the processing. Investing in this technology has led to funding for a ROSES ACCESS proposal. The proposal intends to enable an expert science user to produce products from a number of similar distributed instrument data sets using the Land Cover Change Community-based Processing and Analysis System (LC-ComPS) Toolbox. The LC-ComPS Toolbox is a collection of science algorithms that enable the generation of data with ground resolution on the order of Landsat-class instruments.

  12. Solar Fridges and Personal Power Grids: How Berkeley Lab is Fighting Global Poverty (LBNL Science at the Theater)

    SciTech Connect

    Buluswar, Shashi; Gadgil, Ashok

    2012-11-26

    At this November 26, 2012 Science at the Theater, scientists discussed the recently launched LBNL Institute for Globally Transformative Technologies (LIGTT) at Berkeley Lab. LIGTT has an ambitious mandate to discover and develop breakthrough technologies for combating global poverty. It was created with the belief that solutions will require more advanced R&D and a deep understanding of market needs in the developing world. Berkeley Lab's Ashok Gadgil, Shashi Buluswar and seven other LIGTT scientists discussed what it takes to develop technologies that will impact millions of people. These include: 1) Fuel-efficient stoves for clean cooking: Our scientists are improving the Berkeley Darfur Stove, a high-efficiency stove used by over 20,000 households in Darfur; 2) The ultra-low-energy refrigerator: A lightweight, low-energy refrigerator that can be mounted on a bike so crops can survive the trip from the farm to the market; 3) The solar OB suitcase: A low-cost package of the five most critical biomedical devices for maternal and neonatal clinics; 4) UV Waterworks: A device for quickly, safely and inexpensively disinfecting water of harmful microorganisms.

  13. Using R in experimental design with BIBD: An application in health sciences

    NASA Astrophysics Data System (ADS)

    Oliveira, Teresa A.; Francisco, Carla; Oliveira, Amílcar; Ferreira, Agostinho

    2016-06-01

    Considering the implementation of an experimental design, in any field, the experimenter must pay particular attention and look for the best strategies in the following steps: planning the design selection, conducting the experiments, collecting the observed data, and proceeding to the analysis and interpretation of results. The focus is on providing both a deep understanding of the problem under research and a powerful experimental process at a reduced cost. Mainly because they make it possible to separate variation sources, the importance of experimental designs in the health sciences has long been recognized. Particular attention has been devoted to block designs, and more precisely to balanced incomplete block designs; here the relevance stems from the fact that these designs allow testing simultaneously a number of treatments larger than the block size. Our example refers to a possible study of inter-rater reliability in Parkinson's disease, using the UPDRS (Unified Parkinson's Disease Rating Scale) to test whether there are significant differences between the specialists who evaluate the patients' performances. Statistical studies on this disease were described, for example, in Richards et al. (1994), where the authors investigate the inter-rater reliability of the Unified Parkinson's Disease Rating Scale motor examination. We consider a simulation of a practical situation in which the patients were observed by different specialists and the UPDRS assessment of the impact of Parkinson's disease on patients was recorded. Assigning treatments to the subjects following a particular BIBD(9,24,8,3,2) structure, we illustrate that BIB designs can be used as a powerful tool to solve emerging problems in this area. Since a structure with repeated blocks allows some block contrasts to have minimum variance, see Oliveira et al. (2006), the design with cardinality 12 was selected for the example. R software was used for the computations.
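A BIBD(v, b, r, k, λ) must satisfy the counting identities vr = bk (each of the v treatments appears in r of the b blocks of size k) and λ(v-1) = r(k-1). The BIBD(9,24,8,3,2) used in the example passes both: 9·8 = 24·3 = 72 and 2·8 = 8·2 = 16. A quick check of these identities, sketched in Python rather than the R used by the authors:

```python
def is_valid_bibd(v, b, r, k, lam):
    """Check the necessary arithmetic identities for a balanced
    incomplete block design BIBD(v, b, r, k, lambda)."""
    # each treatment-block incidence is counted two ways: v*r == b*k;
    # each treatment pair co-occurrence is counted two ways: lam*(v-1) == r*(k-1)
    return v * r == b * k and lam * (v - 1) == r * (k - 1)
```

These conditions are necessary but not sufficient for a design to exist; they are the first sanity check before constructing the blocks.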

  14. Experimental Design and Bioinformatics Analysis for the Application of Metagenomics in Environmental Sciences and Biotechnology.

    PubMed

    Ju, Feng; Zhang, Tong

    2015-11-01

    Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications, and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insight into microbial community structure and function. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation. PMID:26451629

  15. Striped ratio grids for scatter estimation

    NASA Astrophysics Data System (ADS)

    Hsieh, Scott S.; Wang, Adam S.; Star-Lack, Josh

    2016-03-01

    Striped ratio grids are a new concept for scatter management in cone-beam CT. These grids are a modification of conventional anti-scatter grids and consist of stripes which alternate between high grid ratio and low grid ratio. Such a grid is related to existing hardware concepts for scatter estimation such as blocker-based methods or primary modulation, but rather than modulating the primary, the striped ratio grid modulates the scatter. The transitions between adjacent stripes can be used to estimate and subtract the remaining scatter. However, these transitions could be contaminated by variation in the primary radiation. We describe a simple nonlinear image processing algorithm to estimate scatter, and proceed to validate the striped ratio grid on experimental data of a pelvic phantom. The striped ratio grid is emulated by combining data from two scans with different grids. Preliminary results are encouraging and show a significant reduction of scatter artifact.
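One idealized way to read the transition-based estimation: if adjacent stripes transmit the primary beam (approximately) equally but transmit different known fractions of the scatter, then the two stripe signals form a linear system in the primary and scatter components. The sketch below solves that system; the two-parameter model and all names are our simplification for illustration, not the authors' algorithm:

```python
def estimate_scatter(m_high, m_low, t_high, t_low):
    """Solve the idealized two-stripe model
        m_high = p + t_high * s
        m_low  = p + t_low  * s
    for the primary signal p and the (grid-free) scatter s, given the
    assumed scatter transmission fractions t_high < t_low of the
    high- and low-ratio stripes."""
    s = (m_low - m_high) / (t_low - t_high)  # scatter from the stripe difference
    p = m_high - t_high * s                  # primary from either measurement
    return p, s
```

In practice the paper notes that the stripe transitions are also contaminated by primary variation, which is why a nonlinear image-processing step is needed on real data.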

  16. FermiGrid - experience and future plans

    SciTech Connect

    Chadwick, K.; Berman, E.; Canal, P.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Timm, S.; Yocum, D.; /Fermilab

    2007-09-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. In order to better serve this community, Fermilab has placed its production computer resources in a Campus Grid infrastructure called 'FermiGrid'. The FermiGrid infrastructure allows the large experiments at Fermilab to have priority access to their own resources, enables sharing of these resources in an opportunistic fashion, and supports movement of work (jobs, data) between the Campus Grid and national Grids such as the Open Science Grid and the WLCG. FermiGrid resources support multiple Virtual Organizations (VOs), including VOs from the Open Science Grid (OSG), EGEE and the Worldwide LHC Computing Grid Collaboration (WLCG). Fermilab also makes leading contributions to the Open Science Grid in the areas of accounting, batch computing, grid security, job management, resource selection, site infrastructure, storage management, and VO services. Through the FermiGrid interfaces, authenticated and authorized VOs and individuals may access our core grid services, the 10,000+ Fermilab-resident CPUs, near-petabyte (including CMS) online disk pools and the multi-petabyte Fermilab Mass Storage System. These core grid services include a site-wide Globus gatekeeper, VO management services for several VOs, Fermilab site authorization services, grid user mapping services, as well as job accounting and monitoring, resource selection and data movement services. Access to these services is via standard and well-supported grid interfaces. We report on the user experience of using the FermiGrid campus infrastructure interfaced to a national cyberinfrastructure: the successes and the problems.

  17. Smart Grid Integration Laboratory

    SciTech Connect

    Troxell, Wade

    2011-12-22

    The initial federal funding for the Colorado State University Smart Grid Integration Laboratory is through a Congressionally Directed Project (CDP), DE-OE0000070 Smart Grid Integration Laboratory. The original program requested funding in three one-year increments for staff acquisition, curriculum development, and instrumentation, all of which will benefit the Laboratory. This report focuses on the initial phase of staff acquisition, which was directed and administered by DOE NETL/West Virginia under Project Officer Tom George. Using this CDP funding, we have developed the leadership and intellectual capacity for the SGIC. This was accomplished by investing in (hiring) a core team of Smart Grid systems engineering faculty focused on education, research, and innovation for a secure and smart grid infrastructure. The Smart Grid Integration Laboratory will be housed with the separately funded Integrid Laboratory as part of CSU's overall Smart Grid Integration Center (SGIC). The period of performance of this grant was 10/1/2009 to 9/30/2011, which included one no-cost extension due to time delays in faculty hiring. The Smart Grid Integration Laboratory's focus is to build foundations that help graduate and undergraduate students acquire systems engineering knowledge, to conduct innovative research, and to team externally with smart grid organizations. The results of the separately funded Smart Grid Workforce Education Workshop (May 2009), sponsored by the City of Fort Collins, Northern Colorado Clean Energy Cluster, Colorado State University Continuing Education, Spirae, and Siemens, have been used to guide the hiring of faculty and the program curriculum and education plan. This project develops faculty leaders with the intellectual capacity to inspire their students to become leaders who substantially contribute to the development and maintenance of Smart Grid infrastructure through topics such as: (1) Distributed energy systems modeling and control; (2) Energy and power conversion; (3) Simulation of

  18. NASA's Participation in the National Computational Grid

    NASA Technical Reports Server (NTRS)

    Feiereisen, William J.; Zornetzer, Steve F. (Technical Monitor)

    1998-01-01

    Over the last several years it has become evident that the character of NASA's supercomputing needs has changed. One of the major missions of the agency is to support the design and manufacture of aero- and space-vehicles with technologies that will significantly reduce their cost. It is becoming clear that improvements in the process of aerospace design and manufacturing will require a high-performance information infrastructure that allows geographically dispersed teams to draw upon resources that are broader than traditional supercomputing. A computational grid draws together our information resources into one system. We can foresee the time when a Grid will allow engineers and scientists to use the tools of supercomputers, databases and online experimental devices in a virtual environment to collaborate with distant colleagues. The concept of a computational grid has been spoken of for many years, but several events in recent times are conspiring to allow us to actually build one. In late 1997 the National Science Foundation initiated the Partnerships for Advanced Computational Infrastructure (PACI), which is built around the idea of distributed high-performance computing. The Alliance, led by the National Computational Science Alliance (NCSA), and the National Partnership for Advanced Computational Infrastructure (NPACI), led by the San Diego Supercomputing Center, have been instrumental in drawing together the "Grid Community" to identify the technology bottlenecks and propose a research agenda to address them. During the same period NASA has begun to reformulate parts of two major high-performance computing research programs to concentrate on distributed high-performance computing, and has banded together with the PACI centers to address the research agenda in common.

  19. Changes in Critical Thinking Skills Following a Course on Science and Pseudoscience: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    McLean, Carmen P.; Miller, Nathan A.

    2010-01-01

    We assessed changes in paranormal beliefs and general critical thinking skills among students (n = 23) enrolled in an experimental course designed to teach distinguishing science from pseudoscience and a comparison group of students (n = 30) in an advanced research methods course. On average, both courses were successful in reducing paranormal…

  20. Heritage Education: Exploring the Conceptions of Teachers and Administrators from the Perspective of Experimental and Social Science Teaching

    ERIC Educational Resources Information Center

    Perez, Roque Jimenez; Lopez, Jose Maria Cuenca; Listan, D. Mario Ferreras

    2010-01-01

    This paper describes a research project into heritage education. Taking an interdisciplinary perspective from within the field of Experimental and Social Science Education, it presents an analysis of teachers' and administrators' conceptions of heritage, its teaching and its dissemination in Spain. A statistical description is provided of the…

  1. Experimental investigations of the nonlinear dynamics of a complex space-charge configuration inside and around a grid cathode with hole

    NASA Astrophysics Data System (ADS)

    Teodorescu-Soare, C. T.; Dimitriu, D. G.; Ionita, C.; Schrittwieser, R. W.

    2016-03-01

    When a metallic grid with a small hole is biased negatively below a critical value of the applied potential, a complex space-charge structure appears inside and around the grid cathode. The static current-voltage characteristic of the discharge shows one or two current jumps (the number of current jumps depending on the working gas pressure), one of them being of hysteretic type. Electrical probe measurements show a positive potential inside the grid cathode with respect to the potential applied to it. This is interpreted as being due to the hollow cathode effect. Thus, an inner fireball appears around the virtual anode inside the grid cathode. At more negative potentials, the electrons inside the cathode gain sufficient energy to penetrate the inner sheath near the cathode, passing through the hole and giving rise to a second fireball-like structure located outside the cathode. This second structure interacts with the negative glow of the discharge. The recorded time series of the discharge current oscillations reveal strongly nonlinear dynamics of the complex space-charge structure: as the negative potential applied to the grid cathode is changed, the structure passes through different dynamic states involving chaos, quasi-periodicity, intermittency and period-doubling bifurcations, appearing as a competition between different routes to chaos.

  2. Apollo-Soyuz pamphlet no. 9: General science. [experimental design in Astronomy, Biology, Geophysics, Aeronomy and Materials science

    NASA Technical Reports Server (NTRS)

    Page, L. W.; From, T. P.

    1977-01-01

    The objectives and planning activities for the Apollo-Soyuz mission are summarized. Aspects of the space flight considered include the docking module and launch configurations, spacecraft orbits, and weightlessness. The 28 NASA experiments conducted onboard the spacecraft are summarized. The contributions of the mission to the fields of astronomy, geoscience, biology, and materials sciences resulting from the experiments are explored.

  3. Grid-Enabled Measures

    PubMed Central

    Moser, Richard P.; Hesse, Bradford W.; Shaikh, Abdul R.; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-01-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment, a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized by the National Cancer Institute as an instantiation of Science 2.0 principles, with two overarching goals: (1) promote the use of standardized measures, which are tied to theoretically based constructs; and (2) facilitate the ability to share harmonized data resulting from the use of standardized measures. This is done by creating an online venue, connected to the Cancer Biomedical Informatics Grid (caBIG®), where a virtual community of researchers can collaborate and come to consensus on measures by rating, commenting on and viewing metadata about the measures and associated constructs. This paper describes the Web 2.0 principles on which the GEM database is based, describes its functionality, and discusses some of the important issues involved in creating the GEM database, such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories) for data sharing. PMID:21521586

  4. Experimental Methods to Evaluate Science Utility Relative to the Decadal Survey

    NASA Technical Reports Server (NTRS)

    Widergren, Cynthia

    2012-01-01

    The driving factor for competed missions is the science that a mission plans to perform once it has reached its target body. These science goals are derived from the science recommended by the most current Decadal Survey. This work focuses on the science goals in previous Venus mission proposals with respect to the 2013 Decadal Survey. By looking at how the goals compare to the survey, and at how much confidence NASA has in a mission's ability to accomplish those goals, a method was created to assess the science return utility of each mission. This method can be used as a tool for future Venus mission formulation and serves as a starting point for the future development of science utility assessment tools.

  5. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure for the Black Sea Catchment region. Geospatial technologies offer very specialized functionality for Earth Science oriented applications, while Grid-oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of the interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues they introduce (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all these problems becomes an important aspect. The Grid promotes and facilitates secure interoperation of heterogeneous distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a security level for communication and transfer of messages based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment, and focuses on the description and implementation of the most promising one.
In these use cases we give special attention to issues such as: the relations between computational grid and

  6. LAPS Grid generation and adaptation

    NASA Astrophysics Data System (ADS)

    Pagliantini, Cecilia; Delzanno, Gia Luca; Guo, Zehua; Srinivasan, Bhuvana; Tang, Xianzhu; Chacon, Luis

    2011-10-01

    LAPS uses a common-data framework in which a general purpose grid generation and adaptation package in toroidal and simply connected domains is implemented. The initial focus is on implementing the Winslow/Laplace-Beltrami method for generating non-overlapping block structured grids. This is to be followed by a grid adaptation scheme based on the Monge-Kantorovich optimal transport method [Delzanno et al., J. Comput. Phys., 227 (2008), 9841-9864], which equidistributes an application-specified error. As an initial set of applications, we will lay out grids for an axisymmetric mirror, a field reversed configuration, and an entire poloidal cross section of a tokamak plasma reconstructed from a CMOD experimental shot. These grids will then be used for computing the plasma equilibrium and transport in accompanying presentations. A key issue for Monge-Kantorovich grid optimization is the choice of error or monitor function for equidistribution. We will compare the Operator Recovery Error Source Detector (ORESD) [Lapenta, Int. J. Num. Meth. Eng., 59 (2004), 2065-2087], the Tau method, and a strategy based on grid coarsening [Zhang et al., AIAA J., 39 (2001), 1706-1715] to find an ``optimal'' grid. Work supported by DOE OFES.
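
    The equidistribution idea at the heart of the Monge-Kantorovich adaptation above can be illustrated with a minimal 1D sketch (illustrative code, not the LAPS implementation; the function names and the monitor function are assumptions): place the grid points so that each cell carries an equal share of the integral of the monitor (error) function.

```python
import numpy as np

def equidistribute(monitor, n_cells, n_fine=2000):
    """Place n_cells+1 points on [0, 1] so that each cell carries an
    equal share of the integral of the monitor (error) function."""
    x = np.linspace(0.0, 1.0, n_fine)
    m = monitor(x)
    # Cumulative integral of the monitor (trapezoidal rule), normalized.
    M = np.concatenate(([0.0], np.cumsum(0.5 * (m[1:] + m[:-1]) * np.diff(x))))
    M /= M[-1]
    # Invert M(x): find x_i with M(x_i) = i / n_cells.
    return np.interp(np.linspace(0.0, 1.0, n_cells + 1), M, x)

# Example: the monitor peaks at x = 0.5, so cells cluster there.
grid = equidistribute(lambda x: 1.0 + 50.0 * np.exp(-200.0 * (x - 0.5) ** 2), 10)
```

    In higher dimensions the same principle holds, but the mapping must also stay free of grid tangling, which is what the optimal-transport formulation guarantees.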

  7. An infrastructure for the integration of geoscience instruments and sensors on the Grid

    NASA Astrophysics Data System (ADS)

    Pugliese, R.; Prica, M.; Kourousias, G.; Del Linz, A.; Curri, A.

    2009-04-01

    The Grid, as a computing paradigm, has long held the attention of both academia and industry [1]. The distributed and expandable nature of its general architecture results in scalability and more efficient utilisation of computing infrastructures. The scientific community, including that of the geosciences, often handles problems with very high requirements in data processing, transfer, and storage [2,3]. This has raised interest in Grid technologies, but these are often viewed solely as an access gateway to HPC. Suitable Grid infrastructures could provide the geoscience community with additional benefits, such as sharing, remote access, and control of scientific systems. These systems can be scientific instruments, sensors, robots, cameras, and any other device used in the geosciences. A practical, general, and feasible solution for Grid-enabling such devices requires non-intrusive extensions to core parts of the current Grid architecture. We propose an extended version of an architecture [4] that can serve as the solution to the problem: the Grid Instrument Element (IE) [5]. It is an addition to the existing core Grid parts, the Computing Element (CE) and the Storage Element (SE), which serve the purposes their names suggest. The IE and its related technologies have been developed in the EU project on the Deployment of Remote Instrumentation Infrastructure (DORII). In DORII, partners from various scientific communities, including those of earthquake, environmental, and experimental science, have adopted the Instrument Element technology to integrate their devices into the Grid. The Oceanographic and coastal observation and modelling Mediterranean Ocean Observing Network (OGS), a DORII partner, is in the process of deploying the above-mentioned Grid technologies on two types of observational modules: Argo profiling floats and a novel Autonomous Underwater Vehicle (AUV

  8. Understanding Scientific Methodology in the Historical and Experimental Sciences via Language Analysis

    ERIC Educational Resources Information Center

    Dodick, Jeff; Argamon, Shlomo; Chase, Paul

    2009-01-01

    A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually "do" science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields.…

  9. Animal Science Technology. An Experimental Developmental Program. Volume II, Curriculum Course Outlines.

    ERIC Educational Resources Information Center

    Brant, Herman G.

    This volume, the second of a two part evaluation report, is devoted exclusively to the presentation of detailed course outlines representing an Animal Science Technology curriculum. Arranged in 6 terms of study (2 academic years), outlines are included on such topics as: (1) Introductory Animal Science, (2) General Microbiology, (3) Zoonoses, (4)…

  10. An Experimental Examination of Quick Writing in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Benedek-Wood, Elizabeth; Mason, Linda H.; Wood, Philip H.; Hoffman, Katie E.; McGuire, Ashley

    2014-01-01

    A staggered A-B design study was used to evaluate the effects of Self-Regulated Strategy Development (SRSD) instruction for quick writing in middle school science across four classrooms. A sixth-grade science teacher delivered all student writing assessments and SRSD instruction for informative quick writing. Results indicated that…

  11. Correlated Curriculum Program: An Experimental Program. Science Level 1 (9A, 9B, 10A).

    ERIC Educational Resources Information Center

    Loebl, Stanley, Ed.; And Others

    The unit plans in Correlated Science 1 are intended to be of use to the teacher in both lesson and team planning. The course in science was designed for optimum correlation with the work done in business, health, and industrial careers. Behavioral objectives, class routines, time allotments, student evaluation, and the design of the manual are…

  12. Your World and Welcome To It, Science (Experimental): 5314.03.

    ERIC Educational Resources Information Center

    Kleinman, David Z.

    Presented is a beginning course in biology with emphasis on ecology for students with limited interest and few experiences in science. These students most likely will not take many more science courses. Included are the basic ecological concepts of communities, population, societies and the effects humans have on the environment. Like all other…

  13. Grid artifact reduction for direct digital radiography detectors based on rotated stationary grids with homomorphic filtering

    SciTech Connect

    Kim, Dong Sik; Lee, Sanggyun

    2013-06-15

    Purpose: Grid artifacts are caused when using the antiscatter grid in obtaining digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted especially for the direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for the grid artifact reduction based on the band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
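
    The multiplicative image model behind the homomorphic approach can be sketched in a few lines (a simplified, hypothetical sketch, not the authors' algorithm; the grid frequency, filter bandwidth, and synthetic test image are assumptions): taking the logarithm turns the multiplicative grid pattern into an additive one, which a narrow band-stop (notch) filter at the grid frequency can then suppress before exponentiating back.

```python
import numpy as np

def homomorphic_grid_suppress(image, grid_freq, bw=0.01):
    """Suppress a row-direction grid pattern under the multiplicative
    model observed = tissue * grid: the log turns the pattern additive,
    a narrow band-stop (notch) filter removes it, and exp restores the
    intensity domain."""
    log_img = np.log(np.clip(image, 1e-6, None))   # multiplicative -> additive
    F = np.fft.fft2(log_img)
    fy = np.fft.fftfreq(image.shape[0])[:, None]   # row frequencies, cycles/pixel
    notch = np.abs(np.abs(fy) - grid_freq) < bw    # band around +/- grid_freq
    F *= 1.0 - notch * np.ones(image.shape[1])     # zero the stop band
    return np.exp(np.real(np.fft.ifft2(F)))

# Example: flat "tissue" modulated by a grid at 0.25 cycles/pixel.
rows = np.arange(128)[:, None]
raw = 100.0 * (1.0 + 0.2 * np.sin(2.0 * np.pi * 0.25 * rows)) * np.ones((128, 128))
clean = homomorphic_grid_suppress(raw, grid_freq=0.25)
```

    Rotating the grid relative to the sampling direction, as the paper does, moves the grid peaks away from the image's dominant frequencies so the notch removes less diagnostic content.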

  14. From Ions to Wires to the Grid: The Transformational Science of LANL Research in High-Tc Superconducting Tapes and Electric Power Applications

    ScienceCinema

    Marken, Ken [Superconductivity Technology Center, Los Alamos, New Mexico, United States

    2010-01-08

    The Department of Energy (DOE) Office of Electricity Delivery and Energy Reliability (OE) has been tasked to lead national efforts to modernize the electric grid, enhance security and reliability of the energy infrastructure, and facilitate recovery from disruptions to energy supplies. LANL has pioneered the development of coated conductors, high-temperature superconducting (HTS) tapes, which permit dramatically greater current densities than conventional copper cable and enable new technologies to secure the national electric grid. Sustained world-class research from concept, demonstration, transfer, and ongoing industrial support has moved this idea from the laboratory to the commercial marketplace.

  15. The National Grid Project: A system overview

    NASA Technical Reports Server (NTRS)

    Gaither, Adam; Gaither, Kelly; Jean, Brian; Remotigue, Michael; Whitmire, John; Soni, Bharat; Thompson, Joe; Dannenhoffer, John; Weatherill, Nigel

    1995-01-01

    The National Grid Project (NGP) is a comprehensive numerical grid generation software system that is being developed at the National Science Foundation (NSF) Engineering Research Center (ERC) for Computational Field Simulation (CFS) at Mississippi State University (MSU). NGP is supported by a coalition of U.S. industries and federal laboratories. The objective of the NGP is to significantly decrease the amount of time it takes to generate a numerical grid for complex geometries and to increase the quality of these grids to enable computational field simulations for applications in industry. A geometric configuration can be discretized into grids (or meshes) that have two fundamental forms: structured and unstructured. Structured grids are formed by intersecting curvilinear coordinate lines and are composed of quadrilateral (2D) and hexahedral (3D) logically rectangular cells. The connectivity of a structured grid provides for trivial identification of neighboring points by incrementing coordinate indices. Unstructured grids are composed of cells of any shape (commonly triangles, quadrilaterals, tetrahedra and hexahedra), but do not have trivial identification of neighbors by incrementing an index. For unstructured grids, a set of points and an associated connectivity table is generated to define unstructured cell shapes and neighboring points. Hybrid grids are a combination of structured grids and unstructured grids. Chimera (overset) grids are intersecting or overlapping structured grids. The NGP system currently provides a user interface that integrates both 2D and 3D structured and unstructured grid generation, a solid modeling topology data management system, an internal Computer Aided Design (CAD) system based on Non-Uniform Rational B-Splines (NURBS), a journaling language, and a grid/solution visualization system.
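
    The contrast the abstract draws between structured and unstructured connectivity can be made concrete with a small sketch (illustrative code, not part of NGP): on a structured grid, neighbors come from incrementing the (i, j) indices, while on an unstructured grid they must be found by searching the connectivity table.

```python
# Structured grid: neighbors come from incrementing (i, j) indices.
def structured_neighbors(i, j, ni, nj):
    """4-connected neighbors of cell (i, j) on an ni x nj rectangular grid."""
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in cand if 0 <= a < ni and 0 <= b < nj]

# Unstructured grid: a connectivity table maps each cell to its points;
# neighbors are found by searching for cells that share an edge.
def cell_edges(points):
    """Edges of a polygonal cell as unordered point pairs."""
    return {frozenset(e) for e in zip(points, points[1:] + points[:1])}

def unstructured_neighbors(cell, cells):
    """Indices of cells sharing an edge with `cell` in a cell-to-point table."""
    mine = cell_edges(cells[cell])
    return [c for c, pts in enumerate(cells)
            if c != cell and mine & cell_edges(pts)]

# Two triangles sharing the edge (1, 2):
tris = [[0, 1, 2], [1, 3, 2]]
```

    The linear scan shown here is the naive form; practical unstructured codes precompute an edge-to-cell map so that neighbor lookup is constant time, at the cost of storing the extra table.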

  16. Grids: The Top Ten Questions

    DOE PAGESBeta

    Schopf, Jennifer M.; Nitzberg, Bill

    2002-01-01

    The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science points of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.

  17. Experimental setup and the system performance for single-grid-based phase-contrast x-ray imaging (PCXI) with a microfocus x-ray tube

    NASA Astrophysics Data System (ADS)

    Lim, Hyunwoo; Park, Yeonok; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Park, Chulkyu; Woo, Taeho; Lee, Minsik; Kim, Jinsoo; Chung, Nagkun; Kim, Jinwon; Kim, Jinguk

    2015-08-01

    In this work, we investigated a simplified approach to phase-contrast x-ray imaging (PCXI) using a single antiscatter grid and a microfocus x-ray tube, which has the potential to open the way to more widespread use of PCXI in related application areas. We established a table-top setup for PCXI studies of biological and non-biological samples and investigated the system performance. The PCXI system consists of a focused-linear grid with a strip density of 200 lines/in. (JPI Healthcare Corp.), a microfocus x-ray tube with a focal spot size of about 5 μm (Hamamatsu, L7910), and a high-resolution CMOS imaging detector with a pixel size of 48 μm (Rad-icon Imaging Corp., Shad-o-Box 2048). Using our prototype system, we successfully obtained attenuation, scattering, and differential phase-contrast x-ray images of improved visibility from the raw images of several samples at x-ray tube conditions of 50 kVp and 6 mAs. Our initial results indicate that the single-grid-based approach is a useful method for PCXI, with great simplicity and minimal requirements on setup alignment.

  18. Arguing for Experimental "Facts" in Science: A Study of Research Article Results Sections in Biochemistry.

    ERIC Educational Resources Information Center

    Thompson, Dorothea K.

    1993-01-01

    Claims that the contextual nature of "results" sections in scientific articles remains largely unexplored. Examines scientific publications by biochemists. Identifies six rhetorical moves common to such articles. Demonstrates the rhetorical nature of science writing. (HB)

  19. Experimental stations as a tool to teach soil science at the University of Valencia

    NASA Astrophysics Data System (ADS)

    Cerdà, Artemi

    2010-05-01

    This paper shows the strategies used at the University of Valencia (Department of Geography, Soil Erosion and Degradation Research Group) to teach soil science in the Geography and Environmental Science degrees. The use of the Montesa and El Teularet research stations contributes to a better knowledge of soil science among the students, as they can see the measurements carried out in the field. Students visit the stations and contribute to measurements and sampling every season. The use of meteorological stations, erosion plots, soil moisture and soil temperature probes, and sampling gives the students the chance to ground the theoretical approach they are used to. This presentation will show how the students evolve and how their knowledge of soil science improves.

  20. Pedagogical experimentations about participating science, in a european class, in France.

    NASA Astrophysics Data System (ADS)

    Burgio, Marion

    2015-04-01

    In France, a European class is one in which a subject is taught in a foreign language, for example science in English. In my European class, during a seven-week session, I led group work activities about different participating science actions. Groups were composed of three or four 16-year-old students. Each group chose one type of participating science activity among: - Leading a videoconference with an IODP mission on board the JOIDES Resolution. - Being part of a "science songs community" with Tom McFadden. They divided the work: some studied the websites and contacted the actors to present the pedagogical or scientific background of their subject; others had a concrete production, such as the organization of a videoconference with the JOIDES Resolution or the creation of a pedagogical song about geology. I will present some results of their work and explain the students' motivation linked to this active learning method.

  1. An Experimental Science Program with the Open Classroom Approach Based on the Philadelphia Primary Science Guide. Part I, Primary Science Unit and Part II, Primary Ecology Unit.

    ERIC Educational Resources Information Center

    Quinn, Jeanette; Carty, Elaine

    Reported is a project designed to correlate six units of study from the Philadelphia Elementary Science Guide and to incorporate them in such a way as to reduce the suggested 32 weeks of teaching time, for the individual units, to 10 weeks for all 6 units. This was necessitated by an interruption of the school year by a teachers' strike. Three…

  2. "They Sweat for Science": The Harvard Fatigue Laboratory and Self-Experimentation in American Exercise Physiology.

    PubMed

    Johnson, Andi

    2015-08-01

    In many scientific fields, the practice of self-experimentation waned over the course of the twentieth century. For exercise physiologists working today, however, the practice of self-experimentation is alive and well. This paper considers the role of the Harvard Fatigue Laboratory and its scientific director, D. Bruce Dill, in legitimizing the practice of self-experimentation in exercise physiology. Descriptions of self-experimentation are drawn from papers published by members of the Harvard Fatigue Lab. Attention is paid to the ethical and practical justifications for self-experimentation in both the lab and the field. Born out of the practical, immediate demands of fatigue protocols, self-experimentation performed the long-term, epistemological function of uniting physiological data across time and space, enabling researchers to contribute to a general human biology program. PMID:25139499

  3. GridMan: A grid manipulation system

    NASA Technical Reports Server (NTRS)

    Eiseman, Peter R.; Wang, Zhu

    1992-01-01

    GridMan is an interactive grid manipulation system. It operates on grids to produce new grids which conform to user demands. The input grids are not constrained to come from any particular source. They may be generated by algebraic methods, elliptic methods, hyperbolic methods, parabolic methods, or some combination of methods. The methods are included in the various available structured grid generation codes. These codes perform the basic assembly function for the various elements of the initial grid. For block structured grids, the assembly can be quite complex due to a large number of block corners, edges, and faces for which various connections and orientations must be properly identified. The grid generation codes are distinguished among themselves by their balance between interactive and automatic actions and by their modest variations in control. The basic form of GridMan provides a much more substantial level of grid control and will take its input from any of the structured grid generation codes. The communication link to the outside codes is a data file which contains the grid or a section of the grid.

  4. A Case-Based Approach Improves Science Students' Experimental Variable Identification Skills

    ERIC Educational Resources Information Center

    Grunwald, Sandra; Hartman, Andrew

    2010-01-01

    Incorporation of experimental case studies into the laboratory curriculum increases students' abilities to identify experimental variables that affect the outcome of an experiment. Here the authors describe how such case studies were incorporated using an online course management system into a biochemistry laboratory curriculum and the assessment…

  5. Application of the Shockley-Ramo theorem on the grid inefficiency of Frisch grid ionization chambers

    NASA Astrophysics Data System (ADS)

    Göök, A.; Hambsch, F.-J.; Oberstedt, A.; Oberstedt, S.

    2012-02-01

    The concept of grid inefficiency in Frisch grid ionization chambers and its influence on the anode pulse shape is explained in terms of the Shockley-Ramo theorem for induced charges. The grid inefficiency correction is deduced from numerically calculated weighting potentials. A method to determine the correction factor experimentally is also presented. Experimental and calculated values of the correction factor are shown to be in good agreement.
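
    The Shockley-Ramo reasoning in the abstract can be sketched numerically. This is a minimal 1D sketch with an assumed piecewise-linear anode weighting potential (the paper's actual correction uses numerically computed weighting potentials, and all numbers here are illustrative): the signal charge for a carrier moving from x0 to x1 is q times the change in weighting potential, and the grid-inefficiency factor sigma sets the residual induction in the cathode-grid region.

```python
def anode_weighting_potential(x, d_cg, d_ga, sigma):
    """Illustrative anode weighting potential of a gridded chamber:
    cathode at x = 0, grid at x = d_cg, anode at x = d_cg + d_ga.
    sigma is the grid-inefficiency factor (0 = perfect shielding)."""
    if x <= d_cg:                                      # cathode-grid region
        return sigma * x / d_cg
    return sigma + (1.0 - sigma) * (x - d_cg) / d_ga   # grid-anode region

def induced_charge(q, x0, x1, d_cg, d_ga, sigma):
    """Shockley-Ramo signal charge: Q = q * [phi_w(x1) - phi_w(x0)]."""
    return q * (anode_weighting_potential(x1, d_cg, d_ga, sigma)
                - anode_weighting_potential(x0, d_cg, d_ga, sigma))

e = 1.602e-19            # elementary charge, C
d_cg, d_ga = 4.0, 0.5    # gap widths, cm (illustrative)
sigma = 0.03             # illustrative grid inefficiency

# Electron drifting from the cathode to the grid: an ideal grid (sigma = 0)
# would induce nothing on the anode during this leg; a real grid leaks a
# fraction sigma of the charge into the anode signal.
q_leg = induced_charge(-e, 0.0, d_cg, d_cg, d_ga, sigma)
# Drifting the full way to the anode always induces the full charge.
q_full = induced_charge(-e, 0.0, d_cg + d_ga, d_cg, d_ga, sigma)
```

    This position-dependent leakage in the cathode-grid region is precisely what the correction factor compensates for.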

  6. Science.

    ERIC Educational Resources Information Center

    Roach, Linda E., Ed.

    This document contains the following papers on science instruction and technology: "A 3-D Journey in Space: A New Visual Cognitive Adventure" (Yoav Yair, Rachel Mintz, and Shai Litvak); "Using Collaborative Inquiry and Interactive Technologies in an Environmental Science Project for Middle School Teachers: A Description and Analysis" (Patricia…

  7. Qualitative Quantitative and Experimental Concept Possession, Criteria for Identifying Conceptual Change in Science Education

    ERIC Educational Resources Information Center

    Lappi, Otto

    2013-01-01

    Students sometimes misunderstand or misinterpret scientific content because of persistent misconceptions that need to be overcome by science education--a learning process typically called conceptual change. The acquisition of scientific content matter thus requires a transformation of the initial knowledge-state of a common-sense picture of the…

  8. Mathematics Through Science, Part III: An Experimental Approach to Functions. Teacher's Commentary. Revised Edition.

    ERIC Educational Resources Information Center

    Bolduc, Elroy J., Jr.; And Others

    The purpose of this project is to teach learning and understanding of mathematics at the ninth grade level through the use of science experiments. This part of the program contains significant amounts of material normally found in a beginning algebra class. The material should be found useful for classes in general mathematics as a preparation for…

  9. Getting "What Works" Working: Building Blocks for the Integration of Experimental and Improvement Science

    ERIC Educational Resources Information Center

    Peterson, Amelia

    2016-01-01

    As a systemic approach to improving educational practice through research, "What Works" has come under repeated challenge from alternative approaches, most recently that of improvement science. While "What Works" remains a dominant paradigm for centralized knowledge-building efforts, there is need to understand why this…

  10. A Guide to Establishing a Science/Mathematics Research Program in High School. Experimental.

    ERIC Educational Resources Information Center

    Goodman, Harvey; And Others

    This guide has been designed to help teachers, supervisors, and administrators set up a science or mathematics research program which should provide students with a set of basic "tools" for use in problem solving situations. The guide is organized into 17 chapters. The first 15 chapters focus on: organizing a research program; recruiting students;…