Science.gov

Sample records for science grid experimental

  1. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    SciTech Connect

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  2. The open science grid

    SciTech Connect

    Pordes, R.; /Fermilab

    2004-12-01

    The U.S. LHC Tier-1 and Tier-2 laboratories and universities are developing production Grids to support LHC applications running across a worldwide Grid computing system. Together with partners in computer science, physics grid projects and active experiments, we will build a common national production grid infrastructure which is open in its architecture, implementation and use. The Open Science Grid (OSG) model builds upon the successful approach of last year's joint Grid2003 project. The Grid3 shared infrastructure has for over eight months provided significant computational resources and throughput to a range of applications, including ATLAS and CMS data challenges, SDSS, LIGO, and biology analyses, and computer science demonstrators and experiments. To move towards LHC-scale data management, access and analysis capabilities, we must increase the scale, services, and sustainability of the current infrastructure by an order of magnitude or more. Thus, we must achieve a significant upgrade in its functionalities and technologies. The initial OSG partners will build upon a fully usable, sustainable and robust grid. Initial partners include the US LHC collaborations, DOE and NSF laboratories and universities, and the Trillium Grid projects. The approach is to federate with other application communities in the U.S. to build a shared infrastructure open to other sciences and capable of being modified and improved to respond to the needs of other applications, including the CDF, D0, BaBar, and RHIC experiments. We describe the application-driven, engineered services of the OSG, short term plans and status, and the roadmap for a consortium, its partnerships and national focus.

  3. Reliable multicast for the Grid: a case study in experimental computer science.

    PubMed

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael

    2005-08-15

    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.
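
    The abstract takes the basic IP multicast group model as its starting point. As a minimal illustration of that model only (not of the reliable-multicast protocols evaluated in the paper), the sketch below joins a multicast group and receives datagrams with Python's standard socket API; the group address and port are arbitrary examples.

    ```python
    # Minimal IP multicast receiver: join a group and print incoming datagrams.
    # Reliable multicast layers loss detection, retransmission/FEC and congestion
    # control on top of this basic group-delivery service.
    import socket
    import struct

    GROUP = "239.1.2.3"   # example administratively scoped multicast address
    PORT = 5000           # example port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Ask the kernel to join the multicast group on the default interface.
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    while True:
        data, sender = sock.recvfrom(65535)
        print(f"received {len(data)} bytes from {sender}")
    ```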

  4. The Open Science Grid

    SciTech Connect

    Pordes, Ruth; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Wurthwein, Frank; Gardner, Rob; Wilde, Mike; /Chicago U. /Indiana U.

    2007-06-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  5. New Science on the Open Science Grid

    SciTech Connect

    Pordes, Ruth; Altunay, Mine; Avery, Paul; Bejan, Alina; Blackburn, Kent; Blatecky, Alan; Gardner, Rob; Kramer, Bill; Livny, Miron; McGee, John; Potekhin, Maxim; /Fermilab /Florida U. /Chicago U. /Caltech /LBL, Berkeley /Wisconsin U., Madison /Indiana U. /Brookhaven /UC, San Diego

    2008-06-01

    The Open Science Grid (OSG) includes work to enable new science, new scientists, and new modalities in support of computationally based research. There are frequently significant sociological and organizational changes required in the transformation from the existing to the new. OSG leverages its deliverables to the large-scale physics experiment member communities to benefit new communities at all scales through activities in education, engagement and the distributed facility. As a partner to the poster and tutorial at SciDAC 2008, this paper gives both a brief general description and some specific examples of new science enabled on the OSG. More information is available at the OSG web site: www.opensciencegrid.org.

  6. Space-based Science Operations Grid Prototype

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Welch, Clara L.; Redman, Sandra

    2004-01-01

    science experimenters. There is an international aspect to the Grid involving the America's Pathway (AMPath) network, the Chilean REUNA Research and Education Network and the University of Chile in Santiago that will further demonstrate how extensively these services can be used. From the user's perspective, the Prototype will provide a single interface and logon to these varied services without the complexity of knowing the where's and how's of each service. There is a separate and deliberate emphasis on security. Security will be addressed by specifically outlining the different approaches and tools used. Grid technology, unlike the Internet, is being designed with security in mind. In addition, we will show the locations, configurations and network paths associated with each service and virtual organization. We will discuss the separate virtual organizations that we define for the varied user communities. These will include certain, as yet undetermined, space-based science functions and/or processes and will include specific virtual organizations required for public and educational outreach and science and engineering collaboration. We will also discuss the Grid Prototype performance and the potential for further Grid applications, both space-based and ground-based projects and processes. In this paper and presentation we will detail each service and how they are integrated using Grid

  7. Enabling Campus Grids with Open Science Grid Technology

    NASA Astrophysics Data System (ADS)

    Weitzel, Derek; Bockelman, Brian; Fraser, Dan; Pordes, Ruth; Swanson, David

    2011-12-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to Campus Grids since many of the requirements are the same, even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill the requirements of a campus grid. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features which make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.
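
    The paper names job submission as one of the five campus-grid requirements and describes a Condor-managed pool. As a small, hedged illustration of ordinary Condor job submission (it assumes a local HTCondor installation; the executable and file names are made up, and the bridging daemon described in the paper is not shown), a job might be handed to the pool like this:

    ```python
    # Sketch: write a minimal HTCondor submit description and hand it to the
    # local scheduler. Assumes HTCondor's condor_submit command is installed.
    import subprocess
    import textwrap

    submit_description = textwrap.dedent("""\
        universe     = vanilla
        executable   = analyze.sh
        arguments    = input.dat
        output       = analyze.out
        error        = analyze.err
        log          = analyze.log
        request_cpus = 1
        queue
    """)

    with open("analyze.sub", "w") as f:
        f.write(submit_description)

    # Submit the job to the Condor-managed campus pool.
    subprocess.run(["condor_submit", "analyze.sub"], check=True)
    ```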

  8. Grid for Earth Science Applications

    NASA Astrophysics Data System (ADS)

    Petitdidier, Monique; Schwichtenberg, Horst

    2013-04-01

    Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change, and new energy sources. The main critical point is that, on one hand, civil society and the public ask for certainties, i.e. precise values with small error ranges for predictions at short, medium and long term in all domains; on the other hand, science can mainly answer only in terms of probability of occurrence. To improve the answers and/or decrease the uncertainties, (1) new observational networks have been deployed in order to obtain better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites. Following the OECD recommendations on the openness of research and public sector data, more and more data are available to academic organisations and SMEs; (2) new algorithms and methodologies have been developed, using new technologies and compute resources, to cope with the huge volumes of data to be processed and assimilated into simulations. Finally, our total knowledge about the complex Earth system is contained in models and measurements, and how we put them together has to be managed cleverly. The technical challenge is to put together databases and computing resources to answer the Earth Science challenges. However, these applications are computationally very intensive. Different compute solutions are available, depending on the characteristics of the application. One of them is the Grid, which is especially efficient for independent or embarrassingly parallel jobs related to statistical and parametric studies. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity have been deployed successfully on the Grid. In order to fulfill the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. The Grid has made it possible, via a huge number of runs, to decrease the uncertainties.

  9. Neutron Science TeraGrid Gateway

    NASA Astrophysics Data System (ADS)

    Lynch, Vickie; Chen, Meili; Cobb, John; Kohl, Jim; Miller, Steve; Speirs, David; Vazhkudai, Sudharshan

    2010-11-01

    The unique contributions of the Neutron Science TeraGrid Gateway (NSTG) are the connection of national user facility instrument data sources to the integrated cyberinfrastructure of the National Science Foundation TeraGrid and the development of a neutron science gateway that allows neutron scientists to use TeraGrid resources to analyze their data, including comparison of experiment with simulation. The NSTG is working in close collaboration with the Spallation Neutron Source (SNS) at Oak Ridge as its principal facility partner. The SNS is a next-generation neutron source. It has completed construction at a cost of $1.4 billion and is ramping up operations. The SNS will provide an order of magnitude greater flux than any previous facility in the world and will be available to all of the nation's scientists, independent of funding source, on a peer-reviewed merit basis. With this new capability, the neutron science community is facing orders of magnitude larger data sets and is at a critical point for data analysis and simulation. There is a recognized need for new ways to manage and analyze data to optimize both beam time and scientific output. The TeraGrid is providing new capabilities in the gateway for simulations using McStas and a fitting service on distributed TeraGrid resources to improve turnaround. NSTG staff are also exploring replicating experimental data in archival storage. As part of the SNS partnership, the NSTG provides access to gateway support, cyberinfrastructure outreach, community development, and user support for the neutron science community. This community includes not only SNS staff and users but extends to all the major worldwide neutron scattering centers.

  10. Trends in life science grid: from computing grid to knowledge grid

    PubMed Central

    Konagaya, Akihiko

    2006-01-01

    Background: Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large-scale data handling that exceeds the computing capacity of a single institution. Results: This survey reviews the latest grid technologies from the viewpoints of the computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life science problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation to share tacit knowledge within a community. Conclusion: By extending the concept of the grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community. PMID:17254294

  11. Parallel Grid Manipulations in Earth Science Calculations

    NASA Technical Reports Server (NTRS)

    Sawyer, W.; Lucchesi, R.; daSilva, A.; Takacs, L. L.

    1999-01-01

    sparse interpolation with little data locality between the physical lat-lon grid and a pole rotated computational grid- can be solved efficiently and at the GFlop/s rates needed to solve tomorrow's high resolution earth science models. In the subsequent presentation we will discuss the design and implementation of PILGRIM as well as a number of the problems it is required to solve. Some conclusions will be drawn about the potential performance of the overall earth science models on the supercomputer platforms foreseen for these problems.

  12. Grids for Dummies: Featuring Earth Science Data Mining Application

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2002-01-01

    This viewgraph presentation discusses the concept and advantages of linking computers together into data grids, an emerging technology for managing information across institutions, and potential users of data grids. The logistics of access to a grid, including the use of the World Wide Web to access grids, and security concerns are also discussed. The potential usefulness of data grids to the earth science community is also discussed, as well as the Global Grid Forum, and other efforts to establish standards for data grids.

  13. Technology for a NASA Space-Based Science Operations Grid

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.

    2003-01-01

    This viewgraph presentation provides an overview of a proposal to develop a space-based operations grid in support of space-based science experiments. The development of such a grid would provide a dynamic, secure and scalable architecture based on standards and next-generation reusable software and would enable greater science collaboration and productivity through the use of shared resources and distributed computing. The authors propose developing this concept for use on payload experiments carried aboard the International Space Station. Topics covered include: grid definitions, portals, grid development and coordination, grid technology and potential uses of such a grid.

  14. The Open Science Grid status and architecture

    SciTech Connect

    Pordes, Ruth; Petravick, Don; Kramer, Bill; Olsen, James D.; Livny, Miron; Roy, Gordon A.; Avery, Paul Ralph; Blackburn, Kent; Wenaus, Torre J.; Wuerthwein, Frank K.; Foster, Ian; /Chicago U. /Indiana U.

    2007-09-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. The OSG project[1] is funded by the National Science Foundation and the Department of Energy Scientific Discovery through Advanced Computing program. The OSG project provides specific activities for the operation and evolution of the common infrastructure. The US ATLAS and US CMS collaborations contribute to and depend on OSG as the US infrastructure contributing to the Worldwide LHC Computing Grid on which the LHC experiments distribute and analyze their data. Other stakeholders include the STAR RHIC experiment, the Laser Interferometer Gravitational-Wave Observatory (LIGO), the Dark Energy Survey (DES) and several Fermilab Tevatron experiments: CDF, D0, MiniBooNE, etc. The OSG implementation architecture brings a pragmatic approach to enabling vertically integrated, community-specific distributed systems over a common horizontal set of shared resources and services. More information can be found at the OSG web site: www.opensciencegrid.org.

  15. European grid services for global earth science

    NASA Astrophysics Data System (ADS)

    Brewer, S.; Sipos, G.

    2012-04-01

    This presentation will provide an overview of the distributed computing services that the European Grid Infrastructure (EGI) offers to the Earth Sciences community and also explain the processes whereby Earth Science users can engage with the infrastructure. One of the main overarching goals for EGI over the coming year is to diversify its user-base. EGI therefore - through the National Grid Initiatives (NGIs) that provide the bulk of resources that make up the infrastructure - offers a number of routes whereby users, either individually or as communities, can make use of its services. At one level there are two approaches to working with EGI: either users can make use of existing resources and contribute to their evolution and configuration; or alternatively they can work with EGI, and hence the NGIs, to incorporate their own resources into the infrastructure to take advantage of EGI's monitoring, networking and managing services. Adopting this approach does not imply a loss of ownership of the resources. Both of these approaches are entirely applicable to the Earth Sciences community. The former because researchers within this field have been involved with EGI (and previously EGEE) as a Heavy User Community and the latter because they have very specific needs, such as incorporating HPC services into their workflows, and these will require multi-skilled interventions to fully provide such services. In addition to the technical support services that EGI has been offering for the last year or so - the applications database, the training marketplace and the Virtual Organisation services - there now exists a dynamic short-term project framework that can be utilised to establish and operate services for Earth Science users. During this talk we will present a summary of various on-going projects that will be of interest to Earth Science users with the intention that suggestions for future projects will emerge from the subsequent discussions: • The Federated Cloud Task

  16. Service engineering for grid services in medicine and life science.

    PubMed

    Weisbecker, Anette; Falkner, Jürgen

    2009-01-01

    Clearly defined services with appropriate business models are necessary in order to exploit the benefit of grid computing for industrial and academic users in medicine and life sciences. In the project Services@MediGRID the service engineering approach is used to develop those clearly defined grid services and to provide sustainable business models for their usage.

  17. AstroGrid-D: Grid technology for astronomical science

    NASA Astrophysics Data System (ADS)

    Enke, Harry; Steinmetz, Matthias; Adorf, Hans-Martin; Beck-Ratzka, Alexander; Breitling, Frank; Brüsemeister, Thomas; Carlson, Arthur; Ensslin, Torsten; Högqvist, Mikael; Nickelt, Iliya; Radke, Thomas; Reinefeld, Alexander; Reiser, Angelika; Scholl, Tobias; Spurzem, Rainer; Steinacker, Jürgen; Voges, Wolfgang; Wambsganß, Joachim; White, Steve

    2011-02-01

    We present the status and results of AstroGrid-D, a joint effort of astrophysicists and computer scientists to employ grid technology for scientific applications. AstroGrid-D provides access to a network of distributed machines with a set of commands as well as software interfaces. It allows simple use of computer and storage facilities and makes it possible to schedule or monitor compute tasks and data management. It is based on the Globus Toolkit middleware (GT4). Chapter 1 describes the context which led to the demand for advanced software solutions in Astrophysics, and we state the goals of the project. We then present characteristic astrophysical applications that have been implemented on AstroGrid-D in Chapter 2. We describe simulations of different complexity, compute-intensive calculations running on multiple sites (Section 2.1), and advanced applications for specific scientific purposes (Section 2.2), such as a connection to robotic telescopes (Section 2.2.3). We show from these examples how grid execution improves, e.g., the scientific workflow. Chapter 3 explains the software tools and services that we adapted or newly developed. Section 3.1 is focused on the administrative aspects of the infrastructure, to manage users and monitor activity. Section 3.2 characterises the central components of our architecture: the AstroGrid-D information service to collect and store metadata, a file management system, the data management system, and a job manager for automatic submission of compute tasks. We summarise the successfully established infrastructure in Chapter 4, concluding with our future plans to establish AstroGrid-D as a platform of modern e-Astronomy.

  18. Public storage for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Levshina, T.; Guru, A.

    2014-06-01

    The Open Science Grid infrastructure doesn't provide efficient means to manage public storage offered by participating sites. A Virtual Organization that relies on opportunistic storage has difficulties finding appropriate storage, verifying its availability, and monitoring its utilization. The involvement of the production manager, site administrators and VO support personnel is required to allocate or rescind storage space. One of the main requirements for Public Storage implementation is that it should use SRM or GridFTP protocols to access the Storage Elements provided by the OSG sites and not put any additional burden on sites. By policy, no new services related to Public Storage can be installed and run on OSG sites. Opportunistic users also have difficulties in accessing the OSG Storage Elements during the execution of jobs. A typical user's data management workflow includes pre-staging common data on sites before a job's execution, and then storing the output data produced by a job on a worker node for subsequent download to the user's local institution. When the amount of data is significant, the only means to temporarily store the data is to upload it to one of the Storage Elements. In order to do that, a user's job should be aware of the storage location, availability, and free space. After a successful data upload, users must somehow keep track of the data's location for future access. In this presentation we propose solutions for storage management and data handling issues in the OSG. We are investigating the feasibility of using the integrated Rule-Oriented Data System developed at RENCI as a front-end service to the OSG SEs. The current architecture, state of deployment and performance test results will be discussed. We will also provide examples of current usage of the system by beta-users.
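
    The abstract walks through a typical opportunistic data-handling pattern: stage input to a site's Storage Element, have the job write its output there over SRM or GridFTP, and keep track of where the data ended up for later retrieval. The sketch below mirrors that bookkeeping under stated assumptions: globus-url-copy is the standard GridFTP client, but the endpoint URL, catalogue file and paths are hypothetical examples, not OSG services.

    ```python
    # User-side bookkeeping sketch: copy a job's output to a Storage Element
    # over GridFTP and record where it was put. Endpoints and paths are
    # hypothetical; a real workflow would also check quota and availability.
    import json
    import subprocess
    from datetime import datetime, timezone

    CATALOG = "replica_catalog.json"   # hypothetical local record of uploads

    def upload_to_se(local_path: str, se_url: str) -> None:
        """Copy a local file to the Storage Element using the GridFTP client."""
        subprocess.run(["globus-url-copy", f"file://{local_path}", se_url], check=True)

    def record_location(logical_name: str, se_url: str) -> None:
        """Remember where the data was stored so it can be retrieved later."""
        try:
            with open(CATALOG) as f:
                catalog = json.load(f)
        except FileNotFoundError:
            catalog = {}
        catalog[logical_name] = {"url": se_url,
                                 "stored": datetime.now(timezone.utc).isoformat()}
        with open(CATALOG, "w") as f:
            json.dump(catalog, f, indent=2)

    # Example: stash one output file and note its location.
    se_destination = "gsiftp://se.example.org/osg/store/user/alice/run42/output.root"
    upload_to_se("/tmp/output.root", se_destination)
    record_location("run42/output.root", se_destination)
    ```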

  19. Pilot job accounting and auditing in Open Science Grid

    SciTech Connect

    Sfiligoi, Igor; Green, Chris; Quinn, Greg; Thain, Greg; /Wisconsin U., Madison

    2008-06-01

    The Grid accounting and auditing mechanisms were designed under the assumption that users would submit their jobs directly to the Grid gatekeepers. However, many groups are starting to use pilot-based systems, where users submit jobs to a centralized queue and the jobs are subsequently transferred to the Grid resources by the pilot infrastructure. While this approach greatly improves the user experience, it does disrupt the established accounting and auditing procedures. Open Science Grid deploys gLExec on the worker nodes to keep the pilot-related accounting and auditing information and centralizes the accounting collection with GRATIA.

  20. Grid Technology as a Cyber Infrastructure for Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    This paper describes how grids and grid service technologies can be used to develop an infrastructure for the Earth Science community. This cyberinfrastructure would be populated with a hierarchy of services, including discipline-specific services such as those needed by the Earth Science community, as well as a set of core services that are needed by most applications. This core would include data-oriented services used for accessing and moving data as well as computer-oriented services used to broker access to resources and control the execution of tasks on the grid. The availability of such an Earth Science cyberinfrastructure would ease the development of Earth Science applications. With such a cyberinfrastructure, application workflows could be created to extract data from one or more of the Earth Science archives and then process it by passing it through various persistent services that are part of the persistent cyberinfrastructure, such as services to perform subsetting, reformatting, data mining and map projections.
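
    The paragraph describes workflows that pull data from an archive and pass it through persistent services such as subsetting, reformatting, data mining and map projection. The sketch below is a purely illustrative composition of such a pipeline; every function is a hypothetical stand-in for a grid service, not part of any real toolkit.

    ```python
    # Illustrative Earth Science workflow: chain independent data services.
    # Each function is a placeholder for a persistent service on the grid.
    from typing import Callable, Dict, List

    def subset(granule: Dict) -> Dict:
        """Clip the granule to a region of interest (placeholder)."""
        return {**granule, "region": "(-10, 35) to (5, 45)"}

    def reformat(granule: Dict) -> Dict:
        """Convert to the format the next service expects (placeholder)."""
        return {**granule, "format": "netCDF"}

    def mine(granule: Dict) -> Dict:
        """Run a data-mining step such as feature detection (placeholder)."""
        return {**granule, "features": ["cyclone_candidate"]}

    def project(granule: Dict) -> Dict:
        """Apply a map projection for visualization (placeholder)."""
        return {**granule, "projection": "Lambert conformal conic"}

    def run_workflow(granule: Dict, steps: List[Callable[[Dict], Dict]]) -> Dict:
        for step in steps:
            granule = step(granule)
        return granule

    result = run_workflow({"id": "archive/granule-001"},
                          [subset, reformat, mine, project])
    print(result)
    ```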

  1. Learning Experimentation through Science Fairs

    ERIC Educational Resources Information Center

    Paul, Jürgen; Lederman, Norman G.; Groß, Jorge

    2016-01-01

    Experiments are essential for both doing science and learning science. The aim of the German youth science fair, "Jugend forscht," is to encourage scientific thinking and inquiry methods such as experimentation. Based on 57 interviews with participants of the competition, this study summarises students' conceptions and steps of learning…

  2. ISS Space-Based Science Operations Grid for the Ground Systems Architecture Workshop (GSAW)

    NASA Technical Reports Server (NTRS)

    Welch, Clara; Bradford, Bob

    2003-01-01

    Contents include the following: What is grid? Benefits of a grid to space-based science operations. Our approach. Scope of prototype grid. The security question. Short term objectives. Long term objectives. Space-based services required for operations. The prototype. Scope of prototype grid. Prototype service layout. Space-based science grid service components.

  3. Unlocking the potential of smart grid technologies with behavioral science

    PubMed Central

    Sintov, Nicole D.; Schultz, P. Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings. PMID:25914666

  4. Unlocking the potential of smart grid technologies with behavioral science.

    PubMed

    Sintov, Nicole D; Schultz, P Wesley

    2015-01-01

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this paper, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings.

  5. Unlocking the potential of smart grid technologies with behavioral science

    SciTech Connect

    Sintov, Nicole D.; Schultz, P. Wesley

    2015-04-09

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this study, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings.

  6. Unlocking the potential of smart grid technologies with behavioral science

    DOE PAGES

    Sintov, Nicole D.; Schultz, P. Wesley

    2015-04-09

    Smart grid systems aim to provide a more stable and adaptable electricity infrastructure, and to maximize energy efficiency. Grid-linked technologies vary widely in form and function, but generally share common potentials: to reduce energy consumption via efficiency and/or curtailment, to shift use to off-peak times of day, and to enable distributed storage and generation options. Although end users are central players in these systems, they are sometimes not central considerations in technology or program design, and in some cases, their motivations for participating in such systems are not fully appreciated. Behavioral science can be instrumental in engaging end-users and maximizing the impact of smart grid technologies. In this study, we present emerging technologies made possible by a smart grid infrastructure, and for each we highlight ways in which behavioral science can be applied to enhance their impact on energy savings.

  7. Optimal response to attacks on the open science grids.

    SciTech Connect

    Altunay, M.; Leyffer, S.; Linderoth, J. T.; Xie, Z.

    2011-01-01

    Cybersecurity is a growing concern, especially in open grids, where attack propagation is easy because of prevalent collaborations among thousands of users and hundreds of institutions. The collaboration rules that typically govern large science experiments as well as social networks of scientists span across the institutional security boundaries. A common concern is that the increased openness may allow malicious attackers to spread more readily around the grid. We consider how to optimally respond to attacks in open grid environments. To show how and why attacks spread more readily around the grid, we first discuss how collaborations manifest themselves in the grids and form the collaboration network graph, and how this collaboration network graph affects the security threat levels of grid participants. We present two mixed-integer program (MIP) models to find the optimal response to attacks in open grid environments, and also calculate the threat level associated with each grid participant. Given an attack scenario, our optimal response model aims to minimize the threat levels at unaffected participants while maximizing the uninterrupted scientific production (continuing collaborations). By adopting some of the collaboration rules (e.g., suspending a collaboration or shutting down a site), the model finds optimal response to subvert an attack scenario.
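
    The paper's MIP formulations are not reproduced in the abstract. As a minimal sketch, under assumed notation, of how the stated trade-off (low threat at unaffected sites versus uninterrupted collaborations) can be written down, one possibility is the following; the authors' actual models may differ substantially.

    ```latex
    % Sketch only: G=(V,E) is the collaboration graph, A \subseteq V the attacked
    % sites, y_{uv}=1 if collaboration {u,v} is suspended, t_v the threat level.
    \begin{align*}
    \min_{y,\,t}\quad & \sum_{v \in V \setminus A} t_v \;+\; \lambda \sum_{\{u,v\} \in E} y_{uv} \\
    \text{s.t.}\quad & t_v = 1 \quad \text{for } v \in A, \\
    & t_v \ge t_u - y_{uv}, \quad t_u \ge t_v - y_{uv} \quad \text{for } \{u,v\} \in E, \\
    & 0 \le t_v \le 1, \qquad y_{uv} \in \{0,1\}.
    \end{align*}
    ```

    In this sketch, threat propagates along any collaboration that is not suspended, and the penalty term weighted by lambda discourages suspending collaborations, i.e. it rewards keeping scientific production uninterrupted.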

  8. Nuclear test experimental science

    SciTech Connect

    Struble, G.L.; Middleton, C.; Bucciarelli, G.; Carter, J.; Cherniak, J.; Donohue, M.L.; Kirvel, R.D.; MacGregor, P.; Reid, S.

    1989-01-01

    This report discusses research being conducted at Lawrence Livermore Laboratory under the following topics: prompt diagnostics; experimental modeling, design, and analysis; detector development; streak-camera data systems; weapons supporting research.

  9. An Experimental Framework for Executing Applications in Dynamic Grid Environments

    NASA Technical Reports Server (NTRS)

    Huedo, Eduardo; Montero, Ruben S.; Llorente, Ignacio M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Grid opens up opportunities for resource-starved scientists and engineers to harness highly distributed computing resources. A number of Grid middleware projects are currently available to support the simultaneous exploitation of heterogeneous resources distributed in different administrative domains. However, efficient job submission and management continue being far from accessible to ordinary scientists and engineers due to the dynamic and complex nature of the Grid. This report describes a new Globus framework that allows an easier and more efficient execution of jobs in a 'submit and forget' fashion. Adaptation to dynamic Grid conditions is achieved by supporting automatic application migration following performance degradation, 'better' resource discovery, requirement change, owner decision or remote resource failure. The report also includes experimental results of the behavior of our framework on the TRGP testbed.

  10. Enabling Science and Engineering Applications on the Grid

    SciTech Connect

    Seidel, Ed

    2004-08-25

    The Grid has the potential to fundamentally change the way science and engineering are done. Aggregate power of computing resources connected by networks - of the Grid - exceeds that of any single supercomputer by many orders of magnitude. At the same time, our ability to carry out computations of the scale and level of detail required, for example, to study the Universe, or simulate a rocket engine, are severely constrained by available computing power. Hence, such applications should be one of the main driving forces behind the development of Grid computing. I will discuss some large scale applications, including simulations of colliding black holes, and show how they are driving the development of Grid computing technology. Applications are already being developed that are not only aware of their needs, but also of the resources available to them on the Grid. They will be able to adapt themselves automatically to respond to their changing needs, to spawn off tasks on other resources, and to adapt to the changing characteristics of the Grid including machine and network loads and availability. I will discuss a number of innovative scenarios for computing on the Grid enabled by such technologies, and demonstrate how close these are to being a reality.

  11. Combining the GRID with Cloud for Earth Science Computing

    NASA Astrophysics Data System (ADS)

    Mishin, Dmitry; Levchenko, Oleg; Groudnev, Andrei; Zhizhin, Mikhail

    2010-05-01

    Cloud computing is a new economic model for using large cluster computing resources that were previously managed by the GRID. Reusing the existing GRID infrastructure gives an opportunity to combine Cloud and GRID technologies on the same hardware and to provide GRID users with functionality for running high-performance computing tasks inside virtual machines. In this case the Cloud works "above" the GRID, sharing computing power and utilizing unused processor time. We manage virtual machines with the Eucalyptus elastic cloud and use the Torque system from the gLite infrastructure to spread Cloud jobs across GRID computing nodes, scaling the parallel computing tasks on virtual machines created by the elastic cloud. For this purpose we have added new types of tasks to the standard GRID task list: to run a virtual node and to run a job on a virtual node. This makes it possible to seamlessly upscale the Cloud with new tasks when needed and to shrink it when the tasks are completed. Using GRID components for managing the size of a virtual cloud simplifies building a billing system to charge Cloud users for the processor time, disk space and outbound traffic consumed. Earth Science computing problems that can be solved with the elastic Cloud include repetitive tasks of downloading, converting and storing large arrays of data in a database (e.g. weather forecasts); creating a pyramid of lower-resolution images from a very large one for fast distributed browsing; and processing and analyzing large distributed amounts of data by running Earth Science numerical models.
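
    The two task types added to the standard GRID task list (start a virtual node; run a job on a virtual node) are the core of the scheme described above. The following Python sketch illustrates that dispatch logic only; the class and method names are hypothetical placeholders, and the real system would call Eucalyptus and Torque where the print statements are.

    ```python
    # Toy illustration of the two extra task types described in the abstract.
    # The handlers are placeholders for Eucalyptus (VM control) and Torque
    # (batch execution) calls in the real system.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class VirtualCloud:
        nodes: List[str] = field(default_factory=list)

        def start_virtual_node(self, node_id: str) -> None:
            """Grow the elastic cloud by one VM (placeholder for a cloud request)."""
            print(f"requesting VM {node_id} from the cloud controller")
            self.nodes.append(node_id)

        def run_job_on_virtual_node(self, node_id: str, command: str) -> None:
            """Run a job inside an already started VM (placeholder for a batch job)."""
            if node_id not in self.nodes:
                raise ValueError(f"virtual node {node_id} has not been started")
            print(f"submitting '{command}' to {node_id}")

        def shrink(self) -> None:
            """Release all VMs when their tasks finish, returning capacity to the grid."""
            print(f"terminating {len(self.nodes)} virtual nodes")
            self.nodes.clear()

    # Example: upscale, run an Earth Science task, then shrink the cloud.
    cloud = VirtualCloud()
    cloud.start_virtual_node("vm-001")
    cloud.run_job_on_virtual_node("vm-001", "process_weather_forecast.sh")
    cloud.shrink()
    ```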

  12. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential to revolutionize the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing for less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  13. Earth Science applications on Grid -advantages and limitations

    NASA Astrophysics Data System (ADS)

    Petitdidier, M.; Schwichtenberg, H.

    2012-04-01

    Civil society at large has addressed many strong requirements to the Earth Science community, related in particular to natural and industrial risks, climate change, and new energy sources. Our total knowledge about the complex Earth system is contained in models and measurements, and how we put them together has to be managed cleverly. The technical challenge is to put together databases and computing resources to answer the Earth Science challenges. The main critical point is that, on one hand, civil society and the public ask for certainties, i.e. precise values with small error ranges for predictions at short, medium and long term in all domains; on the other hand, science can mainly answer only in terms of probability of occurrence. To improve the answers and/or decrease the uncertainties, (1) new observational networks have been deployed to obtain better geographical coverage, and more accurate measurements have been carried out in key locations and aboard satellites; (2) new algorithms and methodologies have been developed using new technologies and compute resources. Numerous applications in atmospheric chemistry, meteorology, seismology, hydrology, pollution, climate and biodiversity have been deployed successfully on the Grid. In order to fulfill the requirements of risk management, several prototype applications have been deployed using OGC (Open Geospatial Consortium) components with Grid middleware. The Grid has made it possible to decrease uncertainties in the probability of occurrence via a larger number of runs. Some limitations are related to the combination of databases (outside the grid infrastructure) with grid compute resources, and to real-time applications that need resource reservation in order to ensure results at a given time. ES scientists, who use different compute resources according to the phase of their application, are used to working in large projects and sharing their results. They need a service-oriented architecture and a platform of

  14. DZero data-intensive computing on the Open Science Grid

    SciTech Connect

    Abbott, B.; Baranovski, A.; Diesburg, M.; Garzoglio, G.; Kurca, T.; Mhashilkar, P.; /Fermilab

    2007-09-01

    High energy physics experiments periodically reprocess data in order to take advantage of improved understanding of the detector and the data processing code. Between February and May 2007, the DZero experiment reprocessed a substantial fraction of its dataset. This consists of half a billion events, corresponding to about 100 TB of data, organized in 300,000 files. The activity utilized resources from sites around the world, including a dozen sites participating in the Open Science Grid consortium (OSG). About 1,500 jobs were run every day across the OSG, consuming and producing hundreds of gigabytes of data. Access to OSG computing and storage resources was coordinated by the SAM-Grid system. This system organized job access to a complex topology of data queues and job scheduling to clusters, using a SAM-Grid to OSG job forwarding infrastructure. For the first time in the lifetime of the experiment, a data intensive production activity was managed on a general purpose grid, such as OSG. This paper describes the implications of using OSG, where all resources are granted following an opportunistic model, the challenges of operating a data intensive activity over such a large computing infrastructure, and the lessons learned throughout the project.

  15. Using Computing and Data Grids for Large-Scale Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  16. Experimental results of an iodine plasma in PEGASES gridded thruster

    NASA Astrophysics Data System (ADS)

    Grondein, Pascaline; Aanesland, Ane

    2015-09-01

    In the electric gridded thruster PEGASES, both positive and negative ions are expelled after extraction from an ion-ion plasma. This ion-ion plasma is formed downstream of a localized magnetic field placed a few centimeters from the ionization region, which traps and cools down the electrons to allow better attachment to an electronegative gas. For this thruster concept, iodine has emerged as the most attractive option. Heavy and diatomic, and therefore good for high thrust, it also has a low ionization threshold and high electronegativity, which lead to high ion-ion densities and low RF power. After the proof of concept of PEGASES using SF6 as propellant, we present here experimental results for an iodine plasma studied inside the PEGASES thruster. Solid at standard temperature and pressure, iodine is heated so that it sublimates and is then injected into the chamber, where the neutral gas is heated and ionized. The whole injection system is heated to avoid deposition on surfaces, and a mass flow controller allows fine control of the neutral gas mass flow. A 3D translation stage inside the vacuum chamber allows volumetric plasma studies using electrostatic probes. The results are also compared with the global model dedicated to iodine as a propellant for electric gridded thrusters. This work has been done within the LABEX Plas@par project and received financial state aid managed by the Agence Nationale de la Recherche as part of the programme "Investissements d'avenir."

  17. e-Science, caGrid, and Translational Biomedical Research

    PubMed Central

    Saltz, Joel; Kurc, Tahsin; Hastings, Shannon; Langella, Stephen; Oster, Scott; Ervin, David; Sharma, Ashish; Pan, Tony; Gurcan, Metin; Permar, Justin; Ferreira, Renato; Payne, Philip; Catalyurek, Umit; Caserta, Enrico; Leone, Gustavo; Ostrowski, Michael C.; Madduri, Ravi; Foster, Ian; Madhavan, Subhashree; Buetow, Kenneth H.; Shanbhag, Krishnakant; Siegel, Eliot

    2011-01-01

    Translational research projects target a wide variety of diseases, test many different kinds of biomedical hypotheses, and employ a large assortment of experimental methodologies. Diverse data, complex execution environments, and demanding security and reliability requirements make the implementation of these projects extremely challenging and require novel e-Science technologies. PMID:21311723

  18. Grid infrastructure to support science portals for large scale instruments.

    SciTech Connect

    von Laszewski, G.; Foster, I.

    1999-09-29

    Soon, a new generation of scientific workbenches will be developed as a collaborative effort among various research institutions in the US. These scientific workbenches will be accessed in the Web via portals. Reusable components are needed to build such portals for different scientific disciplines, allowing uniform desktop access to remote resources. Such components will include tools and services enabling easy collaboration, job submission, job monitoring, component discovery, and persistent object storage. Based on experience gained from Grand Challenge applications for large-scale instruments, we demonstrate how Grid infrastructure components can be used to support the implementation of science portals. The availability of these components will simplify the prototype implementation of a common portal architecture.

  19. Commissioning the HTCondor-CE for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Cartwright, T.; Frey, J.; Fajardo, E. M.; Lin, B.; Selmeci, M.; Tannenbaum, T.; Zvada, M.

    2015-12-01

    The HTCondor-CE is the next-generation gateway software for the Open Science Grid (OSG). It is responsible for providing a network service that authorizes remote users and provides a resource provisioning service (other well-known gateways include Globus GRAM, CREAM, ARC-CE, and OpenStack's Nova). Based on the venerable HTCondor software, this new CE is simply a highly specialized configuration of HTCondor. It was developed and adopted to provide the OSG with more flexible, scalable, and easier-to-manage gateway software. Further, the focus of the HTCondor-CE is not job submission (as in GRAM or CREAM) but resource provisioning. This software does not exist in a vacuum: to deploy this gateway across the OSG, we had to integrate it with the CE configuration, deploy a corresponding information service, coordinate with sites, and overhaul our documentation.

  20. Application of an Unstructured Grid Navier-Stokes Solver to a Generic Helicopter Body: Comparison of Unstructured Grid Results with Structured Grid Results and Experimental Results

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.

    1999-01-01

    An unstructured-grid Navier-Stokes solver was used to predict the surface pressure distribution, the off-body flow field, the surface flow pattern, and integrated lift and drag coefficients on the ROBIN configuration (a generic helicopter) without a rotor at four angles of attack. The results are compared to those predicted by two structured-grid Navier-Stokes solvers and to experimental surface pressure distributions. The surface pressure distributions from the unstructured-grid Navier-Stokes solver are in good agreement with the results from the structured-grid Navier-Stokes solvers. Agreement with the experimental pressure coefficients is good over the forward portion of the body. However, agreement is poor on the lower portion of the mid-section of the body. Comparison of the predicted surface flow patterns showed similar regions of separated flow. Predicted lift and drag coefficients were in fair agreement with each other.

  1. SCEC Earthworks: A TeraGrid Science Gateway

    NASA Astrophysics Data System (ADS)

    Francoeur, H.; Muench, J.; Okaya, D.; Maechling, P.; Deelman, E.; Mehta, G.

    2006-12-01

    SCEC Earthworks is a scientific gateway designed to provide community-wide access to the TeraGrid. Earthworks provides its users with a portal-based interface for easily running anelastic wave propagation (AWM) simulations. Using GridSphere and several portlets developed as a collaborative effort with IRIS, Earthworks enables users to run simulations without any knowledge of the underlying workflow technology needed to utilize the TeraGrid. The workflow technology behind Earthworks has been developed as a collaborative effort between SCEC and the Information Sciences Institute (ISI). Earthworks uses a complex software stack to translate abstract workflows defined by the user into a series of jobs that run on a number of computational resources. These computational resources include a combination of servers provided by SCEC, the USC High Performance Computing Center and NSF TeraGrid supercomputer facilities. Workflows are constructed after input from the user is passed via a Java-based interface to the Earthworks backend, where a DAX (directed acyclic graph in XML) is generated. This DAX describes each step of the workflow, including its inputs, outputs, and arguments, as well as the parent-child relationships between each process. The DAX is then handed off to the Virtual Data System (VDS) and Pegasus provided by ISI, which translate it from an abstract workflow to a concrete workflow by filling in logical file and application names with their physical path and location. This newly created DAG (directed acyclic graph) is handed off to the Condor scheduler. The bottom part of the software stack is a Globus installation at each site that provides local transfer and resource management capabilities. Resources across different sites are transparently managed and tracked by VDS, which allows greater flexibility in running the workflows. After a workflow is completed, products and metadata are registered with integrated data management tools. This allows for metadata querying
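
    The abstract describes the planning step at the heart of the Earthworks stack: an abstract workflow (the DAX, listing each job's inputs, outputs and parent-child relations) is turned into a concrete DAG by resolving logical file and application names to physical locations. The short sketch below mirrors that mapping under stated assumptions; the data structures and the tiny catalogues are illustrative inventions, not the Pegasus/VDS formats.

    ```python
    # Toy abstract-to-concrete workflow planning: logical names in each job are
    # resolved to physical paths via small catalogues. Real planners (Pegasus,
    # VDS) use their own catalogue formats; everything here is illustrative.
    from dataclasses import dataclass
    from typing import Dict, List

    # Hypothetical catalogues mapping logical names to physical locations.
    REPLICA_CATALOG = {"velocity_model.dat": "gsiftp://hpc.example.edu/data/cvm.bin"}
    TRANSFORMATION_CATALOG = {"awm_simulate": "/usr/local/bin/awm"}

    @dataclass
    class AbstractJob:
        job_id: str
        transformation: str        # logical application name
        inputs: List[str]          # logical file names
        parents: List[str]         # job ids this job depends on

    def concretize(job: AbstractJob) -> Dict:
        """Replace logical names with physical paths (the planning step)."""
        return {
            "id": job.job_id,
            "executable": TRANSFORMATION_CATALOG[job.transformation],
            "inputs": [REPLICA_CATALOG[name] for name in job.inputs],
            "parents": job.parents,
        }

    workflow = [AbstractJob("sim-0001", "awm_simulate", ["velocity_model.dat"], [])]
    concrete_dag = [concretize(job) for job in workflow]
    print(concrete_dag)  # ready to hand to a DAG scheduler such as Condor DAGMan
    ```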

  2. Physical Science Laboratory Manual, Experimental Version.

    ERIC Educational Resources Information Center

    Cooperative General Science Project, Atlanta, GA.

    Provided are physical science laboratory experiments which have been developed and used as a part of an experimental one year undergraduate course in general science for non-science majors. The experiments cover a limited number of topics representative of the scientific enterprise. Some of the topics are pressure and buoyancy, heat, motion,…

  3. Reputation, Princing and the E-Science Grid

    NASA Astrophysics Data System (ADS)

    Anandasivam, Arun; Neumann, Dirk

    One of the fundamental aspects of efficient Grid usage is the optimization of resource allocation among the participants. However, this has not yet materialized. Each user is a self-interested participant trying to maximize his utility, where the utility is determined not only by the fastest completion time but also by the prices. Future revenues are influenced by users' reputation. Reputation mechanisms help to build trust between loosely coupled and geographically distributed participants. Providers need an incentive to reduce the selfish cancellation of jobs and the privileging of their own jobs. In this chapter we first present an offline scheduling mechanism with a fixed price. Jobs are collected by a broker and scheduled to machines. The goal of the broker is to balance the load and to maximize the revenue in the network. Consumers can submit their jobs according to their preferences, while taking the incentives of the broker into account. This mechanism does not consider reputation. In a second step, a reputation-based pricing mechanism for simple but fair pricing of resources is analyzed. In e-Science, researchers do not appreciate idiosyncratic pricing strategies and policies. Their interest lies in doing research in an efficient manner. Consequently, in our mechanism the price is tightly coupled to the reputation of a site to guarantee fairness of pricing and facilitate price determination. Furthermore, the price is not the only parameter, as completion time plays an important role when deadlines have to be met. We provide a flexible utility and decision model for every participant and analyze the outcome of our reputation-based pricing system via simulation.
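
    The chapter couples a site's price to its reputation and evaluates consumer utility in terms of both price and completion time relative to a deadline. A minimal sketch of such a pricing rule and utility, under assumed functional forms (the chapter's actual model may differ), is shown below.

    ```python
    # Sketch of a reputation-coupled price and a consumer utility that values
    # both cost and meeting a deadline. All functional forms are assumptions.
    def price(base_price: float, reputation: float) -> float:
        """Price grows with reputation (reputation assumed to lie in [0, 1])."""
        return base_price * (0.5 + reputation)

    def utility(job_value: float, paid: float, completion_time: float,
                deadline: float, lateness_penalty: float = 10.0) -> float:
        """Value of the finished job minus price, penalizing missed deadlines."""
        late = max(0.0, completion_time - deadline)
        return job_value - paid - lateness_penalty * late

    # Example: compare a cheap low-reputation site with a pricier reliable one.
    for reputation, finish in [(0.3, 14.0), (0.9, 9.0)]:
        p = price(base_price=2.0, reputation=reputation)
        u = utility(job_value=20.0, paid=p, completion_time=finish, deadline=10.0)
        print(f"reputation={reputation:.1f}  price={p:.2f}  utility={u:.2f}")
    ```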

  4. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and in the coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.
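
    The validation pattern described above (inject synthetic jobs of varying complexity, record success and failure per site, and report regularly) is summarized in the sketch below. The job types, site names and the submit() stub are hypothetical; the actual system submits through PanDA.

    ```python
    # Bookkeeping pattern for testbed validation: submit synthetic jobs of
    # varying complexity and tally success/failure per site and workflow.
    # submit() is a stand-in for the PanDA-based submission used in the paper.
    import random
    from collections import Counter

    SITES = ["itb_site_A", "itb_site_B"]                # hypothetical testbed sites
    WORKFLOWS = ["cpu_only", "stage_in", "stage_out"]   # generic synthetic job types

    def submit(site: str, workflow: str) -> bool:
        """Pretend to run one synthetic job and report whether it succeeded."""
        return random.random() > 0.1   # ~90% success in this toy model

    def validate(jobs_per_combination: int = 20) -> Counter:
        results = Counter()
        for site in SITES:
            for workflow in WORKFLOWS:
                for _ in range(jobs_per_combination):
                    outcome = "success" if submit(site, workflow) else "failure"
                    results[(site, workflow, outcome)] += 1
        return results

    for key, count in sorted(validate().items()):
        print(key, count)
    ```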

  5. Science Process Instrument. Experimental Edition.

    ERIC Educational Resources Information Center

    American Association for the Advancement of Science, Washington, DC. Commission on Science Education.

    This instrument contains activities by which one can determine a child's intellectual development in: (1) observing, (2) classifying, (3) measuring, (4) using numbers, (5) using space/time relationships, (6) inferring, and (7) communicating and predicting. The seven sections of the instrument correspond to those processes defined in Science - A…

  6. Who Needs Plants? Science (Experimental).

    ERIC Educational Resources Information Center

    Ropeik, Bernard H.; Kleinman, David Z.

    The basic elective course in introductory botany is designed for secondary students who probably will not continue study in plant science. The objectives of the course are to help the student 1) identify, compare and differentiate types of plants; 2) identify plant cell structures; 3) distinguish between helpful and harmful plants; 4) predict…

  7. Taguchi Experimental Design for Cleaning PWAs with Ball Grid Arrays

    NASA Technical Reports Server (NTRS)

    Bonner, J. K.; Mehta, A.; Walton, S.

    1997-01-01

    Ball grid arrays (BGAs), and other area array packages, are becoming more prominent as a way to increase component pin count while avoiding the manufacturing difficulties inherent in processing quad flat packs (QFPs)... Cleaning printed wiring assemblies (PWAs) with BGA components mounted on the surface is problematic... Currently, a low flash point semi-aqueous material, in conjunction with a batch cleaning unit, is being used to clean PWAs. The approach taken at JPL was to investigate the use of (1) semi-aqueous materials having a high flash point and (2) aqueous cleaning involving a saponifier.

  8. An overview of Grid portal technologies for the development of HMR science gateways

    NASA Astrophysics Data System (ADS)

    D'Agostino, D.

    2012-04-01

    Grid portals and related technologies represent an easy and transparent way for scientists to interact with Distributed Computing Infrastructures (DCIs) such as the Grid and the Cloud. Many toolkits and frameworks are available, both commercial and open source, but there is a lack of best practices, customization methodologies, and dedicated high-level service repositories that would allow fast development of specialized scientific gateways in Europe. Starting from the US TeraGrid-XSEDE experience, this contribution analyzes the most interesting portal toolkits and related European projects with the perspective of developing a science gateway for the HMR community within the Distributed Research Infrastructure for Hydrometeorology (DRIHM) project.

  9. The Low Down on e-Science and Grids for Biology

    PubMed Central

    2001-01-01

    The Grid is touted as a next generation Internet/Web, designed primarily to support e-Science. I hope to shed some light on what the Grid is, its purpose, and its potential impact on scientific practice in biology. The key message is that biologists are already primarily working in a manner that the Grid is intended to support. However, to ensure that the Grid’s good intentions are appropriate and fulfilled in practice, biologists must become engaged in the process of its development. PMID:18628864

  10. The Open Science Grid - Support for Multi-Disciplinary Team Science - the Adolescent Years

    NASA Astrophysics Data System (ADS)

    Bauerdick, Lothar; Ernst, Michael; Fraser, Dan; Livny, Miron; Pordes, Ruth; Sehgal, Chander; Würthwein, Frank; Open Science Grid

    2012-12-01

    As it enters adolescence the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools and deep experience. We discuss examples of these including: the pilot-job overlay concepts and technologies now in use throughout OSG and delivering 1.4 million CPU hours/day; the role of campus infrastructures, built out from concepts of sharing across multiple local faculty clusters (already put to good use by many of the HEP Tier-2 sites in the US); the work towards the use of clouds and access to high-throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communities with general tools derived from the experience of the more parochial tools in HEP (integration of Globus Online, prototyping with iRODS, investigations into wide-area Lustre). We will also review our activities and experiences as HTC Service Provider to the recently awarded NSF XD XSEDE project, the evolution of the US NSF TeraGrid project, and how we are extending the reach of HTC through this activity to the increasingly broad national cyberinfrastructure. We believe that a coordinated view of the HPC and HTC resources in the US will further expand their impact on scientific discovery.

  11. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  12. Experimental Demonstration of a Self-organized Architecture for Emerging Grid Computing Applications on OBS Testbed

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Hong, Xiaobin; Wu, Jian; Lin, Jintong

    As Grid computing continues to gain popularity in industry and the research community, it also attracts more attention at the customer level. The large number of users and the high frequency of job requests in the consumer market make this challenging. Clearly, the current Client/Server (C/S)-based architectures will become unfeasible for supporting large-scale Grid applications due to their poor scalability and poor fault tolerance. In this paper, based on our previous works [1, 2], a novel self-organized architecture that realizes a highly scalable and flexible platform for Grids is proposed. Experimental results show that this architecture is suitable and efficient for consumer-oriented Grids.

  13. Analysis of the Current Use, Benefit, and Value of the Open Science Grid

    SciTech Connect

    Pordes, R.; /Fermilab

    2009-04-01

    Open Science Grid usage has ramped up more than 25% in the past twelve months, due both to increased throughput by the core stakeholders - US LHC, LIGO and Run II - and to increased usage by non-physics communities. It is important to understand the value that collaborative projects such as the OSG contribute to the scientific community. This assessment needs to be cognizant of the environment of commercial cloud offerings, the evolving and maturing middleware for grid-based distributed computing, and the growing dependence of science and research on computation. We present a first categorization of OSG value and an analysis across several different aspects of the Consortium's goals and activities. Lastly, we present some of the upcoming challenges of the LHC data analysis ramp-up and our ongoing contributions to the Worldwide LHC Computing Grid.

  14. Remote Job Testing for the Neutron Science TeraGrid Gateway

    SciTech Connect

    Lynch, Vickie E; Cobb, John W; Miller, Stephen D; Reuter, Michael A; Smith, Bradford C

    2009-01-01

    Remote job execution gives neutron science facilities access to high performance computing such as the TeraGrid. A scientific community can use community software with a community certificate and account through a common interface of a portal. Results show this approach is successful, but with more testing and problem solving, we expect remote job executions to become more reliable.

  15. Experimental Evaluation of Electric Power Grid Visualization Tools in the EIOC

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin; Dalton, Angela C.

    2009-12-01

    The present study follows an initial human factors evaluation of four electric power grid visualization tools and reports on an empirical evaluation of two of the four tools: Graphical Contingency Analysis, and Phasor State Estimator. The evaluation was conducted within specific experimental studies designed to measure the impact on decision making performance.

  16. Snowmass Energy Frontier Simulations using the Open Science Grid (A Snowmass 2013 whitepaper)

    SciTech Connect

    Avetisyan, Aram; Bhattacharya, Saptaparna; Narain, Meenakshi; Padhi, Sanjay; Hirschauer, Jim; Levshina, Tanya; McBride, Patricia; Sehgal, Chander; Slyz, Marko; Rynge, Mats; Malik, Sudhir; Stupak, III, John

    2013-08-04

    Snowmass is a US long-term planning study for the high-energy physics community by the American Physical Society's Division of Particles and Fields. For its simulation studies, opportunistic resources are harnessed using the Open Science Grid infrastructure. Late-binding grid technology, GlideinWMS, was used for distributed scheduling of the simulation jobs across many sites, mainly in the US. The pilot infrastructure also uses the Parrot mechanism to dynamically access CvmFS in order to ensure a homogeneous environment across the nodes. This report presents the resource usage and the storage model used for simulating the large-statistics Standard Model backgrounds needed for Snowmass Energy Frontier studies.

  17. Experimenter's Laboratory for Visualized Interactive Science

    NASA Technical Reports Server (NTRS)

    Hansen, Elaine R.; Rodier, Daniel R.; Klemp, Marjorie K.

    1994-01-01

    ELVIS (Experimenter's Laboratory for Visualized Interactive Science) is an interactive visualization environment that enables scientists, students, and educators to visualize and analyze large, complex, and diverse sets of scientific data. It accomplishes this by presenting the data sets as 2-D, 3-D, color, stereo, and graphic images with movable and multiple light sources combined with displays of solid-surface, contours, wire-frame, and transparency. By simultaneously rendering diverse data sets acquired from multiple sources, formats, and resolutions and by interacting with the data through an intuitive, direct-manipulation interface, ELVIS provides an interactive and responsive environment for exploratory data analysis.

  18. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    SciTech Connect

    Livny, Miron; Shank, James; Ernst, Michael; Blackburn, Kent; Goasguen, Sebastien; Tuts, Michael; Gibbons, Lawrence; Pordes, Ruth; Sliz, Piotr; Deelman, Ewa; Barnett, William; Olson, Doug; McGee, John; Cowles, Robert; Wuerthwein, Frank; Gardner, Robert; Avery, Paul; Wang, Shaowen; Lincoln, David Swanson

    2015-02-11

    Under this SciDAC-2 grant the project's goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and the Enabling Grids for E-sciencE (EGEE) project in Europe, as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  19. Fully Automated Single-Zone Elliptic Grid Generation for Mars Science Laboratory (MSL) Aeroshell and Canopy Geometries

    NASA Technical Reports Server (NTRS)

    kaul, Upender K.

    2008-01-01

    A procedure for generating smooth uniformly clustered single-zone grids using enhanced elliptic grid generation has been demonstrated here for the Mars Science Laboratory (MSL) geometries such as aeroshell and canopy. The procedure obviates the need for generating multizone grids for such geometries, as reported in the literature. This has been possible because the enhanced elliptic grid generator automatically generates clustered grids without manual prescription of decay parameters needed with the conventional approach. In fact, these decay parameters are calculated as decay functions as part of the solution, and they are not constant over a given boundary. Since these decay functions vary over a given boundary, orthogonal grids near any arbitrary boundary can be clustered automatically without having to break up the boundaries and the corresponding interior domains into various zones for grid generation.
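
    The enhanced generator described above computes its clustering decay functions automatically as part of the solution; the Python sketch below shows only a conventional Winslow-type elliptic smoothing step (no source terms and no automatic clustering), as a rough illustration of the kind of equations an elliptic grid generator iterates on. It is an assumption-laden toy, not the MSL tool, and the sample boundary grid is illustrative.

        import numpy as np

        def winslow_smooth(x: np.ndarray, y: np.ndarray, iterations: int = 200) -> None:
            """Smooth interior grid points (x, y) in place by iterating the Winslow equations.

            Boundary points are held fixed; clustering control terms are deliberately omitted.
            """
            for _ in range(iterations):
                # Central differences in the computational (i, j) directions.
                x_xi  = (x[2:, 1:-1] - x[:-2, 1:-1]) / 2.0
                y_xi  = (y[2:, 1:-1] - y[:-2, 1:-1]) / 2.0
                x_eta = (x[1:-1, 2:] - x[1:-1, :-2]) / 2.0
                y_eta = (y[1:-1, 2:] - y[1:-1, :-2]) / 2.0

                alpha = x_eta**2 + y_eta**2
                beta  = x_xi * x_eta + y_xi * y_eta
                gamma = x_xi**2 + y_xi**2

                for f in (x, y):
                    f_cross = (f[2:, 2:] - f[2:, :-2] - f[:-2, 2:] + f[:-2, :-2]) / 4.0
                    f[1:-1, 1:-1] = (alpha * (f[2:, 1:-1] + f[:-2, 1:-1])
                                     + gamma * (f[1:-1, 2:] + f[1:-1, :-2])
                                     - 2.0 * beta * f_cross) / (2.0 * (alpha + gamma))

        if __name__ == "__main__":
            # Start from an intentionally skewed algebraic grid on the unit square.
            xi, eta = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21), indexing="ij")
            x, y = xi + 0.1 * np.sin(np.pi * eta), eta.copy()
            winslow_smooth(x, y)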

  20. GENESIS SciFlo: Enabling Multi-Instrument Atmospheric Science Using Grid Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Tang, B.; Manipon, G.; Yunck, T.; Fetzer, E.; Braverman, A.; Dobinson, E.

    2004-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of web services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations will include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we are developing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo is a system for Scientific Knowledge Creation on the Grid using a Semantically-Enabled Dataflow Execution Environment. SciFlo leverages Simple Object Access Protocol (SOAP) Web Services and the Grid Computing standards (Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable web services and executable operators into a distributed computing flow (operator tree). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The scientist injects a distributed computation into the Grid by simply filling out

  1. Grid Technology as a Cyberinfrastructure for Delivering High-End Services to the Earth and Space Science Community

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas H.

    2004-01-01

    Grid technology consists of middleware that permits distributed computations, data and sensors to be seamlessly integrated into a secure, single-sign-on processing environment. In this environment, a user has to identify and authenticate himself once to the grid middleware, and then can utilize any of the distributed resources to which he has been granted access. Grid technology allows resources that exist in enterprises that are under different administrative control to be securely integrated into a single processing environment. The grid community has adopted commercial web services technology as a means for implementing persistent, re-usable grid services that sit on top of the basic distributed processing environment that grids provide. These grid services can then form building blocks for even more complex grid services. Each grid service is characterized using the Web Services Description Language, which provides a description of the interface and how other applications can access it. The emerging Semantic Grid work seeks to associate sufficient semantic information with each grid service such that applications will be able to automatically select, compose and, if necessary, substitute available equivalent services in order to assemble collections of services that are most appropriate for a particular application. Grid technology has been used to provide limited support to various Earth and space science applications. Looking to the future, this emerging grid service technology can provide a cyberinfrastructure for both the Earth and space science communities. Groups within these communities could transform those applications that have community-wide applicability into persistent grid services that are made widely available to their respective communities. In concert with grid-enabled data archives, users could easily create complex workflows that extract desired data from one or more archives and process it through an appropriate set of widely distributed grid

  2. ReSS: A Resource Selection Service for the Open Science Grid

    SciTech Connect

    Garzoglio, Gabriele; Levshina, Tanya; Mhashilkar, Parag; Timm, Steve; /Fermilab

    2008-01-01

    The Open Science Grid offers access to hundreds of computing and storage resources via standard Grid interfaces. Before the deployment of an automated resource selection system, users had to submit jobs directly to these resources. They would manually select a resource and specify all relevant attributes in the job description prior to submitting the job. The necessity of human intervention in resource selection and attribute specification hinders automated job management components from accessing OSG resources and is inconvenient for the users. The Resource Selection Service (ReSS) project addresses these shortcomings. The system integrates Condor technology, for the core matchmaking service, with the gLite CEMon component, for gathering and publishing resource information in the Glue Schema format. Each one of these components communicates over secure protocols via web services interfaces. The system is currently used in production on OSG by the DZero Experiment, the Engagement Virtual Organization, and the Dark Energy Survey. It is also the resource selection service for the Fermilab Campus Grid, FermiGrid. ReSS is considered a lightweight solution to push-based workload management. This paper describes the architecture, performance, and typical usage of the system.
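
    The core idea of matching job requirements against published resource attributes can be illustrated with the small, hypothetical Python sketch below. It mimics ClassAd-style matchmaking with plain dictionaries and predicates; it does not reproduce the actual Condor, CEMon, or ReSS interfaces, and all attribute names, site names, and VO names are assumptions.

        # Illustrative matchmaking in the spirit of ClassAds: each resource publishes a
        # flat attribute dictionary, and a job expresses its requirements as a predicate.

        RESOURCES = [
            {"name": "site_a_ce", "arch": "x86_64", "free_slots": 120, "memory_mb": 2048,
             "supported_vos": {"dzero", "engage"}},
            {"name": "site_b_ce", "arch": "x86_64", "free_slots": 4, "memory_mb": 8192,
             "supported_vos": {"des"}},
        ]

        def job_requirements(resource: dict) -> bool:
            """Requirements expression for a hypothetical DZero job."""
            return (resource["arch"] == "x86_64"
                    and resource["memory_mb"] >= 1024
                    and "dzero" in resource["supported_vos"])

        def rank(resource: dict) -> float:
            """Prefer resources with more free slots (a stand-in for a real rank expression)."""
            return resource["free_slots"]

        def match(resources, requirements, rank_fn):
            candidates = [r for r in resources if requirements(r)]
            return max(candidates, key=rank_fn) if candidates else None

        if __name__ == "__main__":
            best = match(RESOURCES, job_requirements, rank)
            print("selected:", best["name"] if best else "no match")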

  3. Harmonisation of Grid and Geospatial Services Standards in the Earth and Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Cox, S. J.; Woolf, A.; Wyborn, L.; Woodcock, R.; Atkinson, R.; Esterle, J.

    2005-05-01

    Many investigations in the geosciences require consideration of datasets for which the points of truth are distributed. Furthermore, resolution of many geoscientific problems requires use of computational resources that are only available remotely. However, finding and accessing these resources is often unsystematic, and may be very difficult. Even where data centres and computational services exist, they use different interfaces and online access mechanisms, and there is little coordination at cataloguing. Developments in Grid computing and Web Services are providing the means to share computational and informational resources across organisations and boundaries. The approach is to place standard service interfaces in front of the resources. We report on a set of interrelated initiatives in Australia and UK to establish a common framework in the earth and environmental sciences based on Grid and Web Services. A UK project, the 'NERC DataGrid', is developing software that will be used by various data centres to create a 'virtual environmental data Grid'. The first trial will involve the British Atmospheric Data Centre and British Oceanographic Data Centre. For the first time, users will be able to easily find and access both atmospheric and ocean data from these centres in exactly the same way. A community of researchers in Australia (SEEGrid - the Solid Earth and Environment Grid) is concentrating in two areas: (i) open interfaces to data and processing, leveraging Open Geospatial Consortium web service interface standards, and ISO/TC 211 geographic information standards, and (ii) simulations of earth processes using computing resources available over open Grid interfaces. Trials have involved simultaneous access through standard interfaces to heterogeneous measurement archives, and configuration of process simulations independent of computation engine. A particular focus is the importance of standardisation of information content at the semantic level, enabling

  4. The Montage architecture for grid-enabled science processing of large, distributed datasets

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui

    2004-01-01

    Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available computational resources. Therefore, state-of-the-art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.

  5. Thermoplastic Composites Reinforced with Textile Grids: Development of a Manufacturing Chain and Experimental Characterisation

    NASA Astrophysics Data System (ADS)

    Böhm, R.; Hufnagl, E.; Kupfer, R.; Engler, T.; Hausding, J.; Cherif, C.; Hufenbach, W.

    2013-12-01

    A significant improvement in the properties of plastic components can be achieved by introducing flexible multiaxial textile grids as reinforcement. This reinforcing concept is based on the layerwise bonding of biaxially or multiaxially oriented, completely stretched filaments of high-performance fibers, e.g. glass or carbon, and thermoplastic components, using modified warp knitting techniques. Such pre-consolidated grid-like textiles are particularly suitable for use in injection moulding, since the grid geometry is very robust with respect to flow pressure and temperature on the one hand and possesses an adjustable spacing to enable complete filling of the mould cavity on the other hand. The development of pre-consolidated textile grids and their further processing into composites form the basis for providing tailored parts with a large number of additional integrated functions like fibrous sensors or electroconductive fibres. Composites reinforced in this way open up new product groups for promising lightweight structures in the future. The article describes the manufacturing process of this new composite class and its variability regarding reinforcement and function integration. An experimentally based study of the mechanical properties is performed. For this purpose, quasi-static and highly dynamic tensile tests have been carried out, as well as impact penetration experiments. The reinforcing potential of the multiaxial grids is demonstrated by means of evaluating drop tower experiments on automotive components. It has been shown that the load-adapted reinforcement enables a significant local or global improvement of the properties of plastic components depending on industrial requirements.

  6. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results

    PubMed Central

    Humada, Ali M.; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M.; Ahmed, Mushtaq N.

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed, mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent, IL, the reverse diode saturation current, Io, and the diode ideality factor, n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems under fluctuating climatic conditions. PMID:27035575

  7. Photovoltaic Grid-Connected Modeling and Characterization Based on Experimental Results.

    PubMed

    Humada, Ali M; Hojabri, Mojgan; Sulaiman, Mohd Herwan Bin; Hamada, Hussein M; Ahmed, Mushtaq N

    2016-01-01

    A grid-connected photovoltaic (PV) system operating under fluctuating weather conditions has been modeled and characterized based on a specific test bed. A mathematical model of a small-scale PV system has been developed, mainly for residential usage, and the potential results have been simulated. The proposed PV model is based on three PV parameters: the photocurrent, IL, the reverse diode saturation current, Io, and the diode ideality factor, n. The accuracy of the proposed model and its parameters was evaluated against different benchmarks. The results showed that the proposed model fits the experimental results, including the I-V characteristic curve, with higher accuracy than the other models. The results of this study can be considered valuable for the installation of grid-connected PV systems under fluctuating climatic conditions.
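
    The three-parameter model named in these abstracts (photocurrent IL, reverse diode saturation current Io, ideality factor n) corresponds to the ideal single-diode equation I = IL - Io*(exp(V/(n*Vt)) - 1). The Python sketch below evaluates an I-V curve from that equation; the module parameters and cell count are illustrative assumptions, not the values fitted in the paper.

        import numpy as np

        # Physical constants
        K_BOLTZMANN = 1.380649e-23    # J/K
        Q_ELECTRON = 1.602176634e-19  # C

        def iv_curve(i_l: float, i_o: float, n: float, n_cells: int,
                     temperature_k: float = 298.15, points: int = 200):
            """Ideal single-diode model: I = IL - Io*(exp(V/(n*Ns*Vt)) - 1).

            i_l: photocurrent [A], i_o: reverse saturation current [A],
            n: diode ideality factor, n_cells: series-connected cells in the module.
            """
            v_t = K_BOLTZMANN * temperature_k / Q_ELECTRON        # thermal voltage per cell
            v_oc = n * n_cells * v_t * np.log(i_l / i_o + 1.0)    # open-circuit voltage
            v = np.linspace(0.0, v_oc, points)
            i = i_l - i_o * (np.exp(v / (n * n_cells * v_t)) - 1.0)
            return v, i

        if __name__ == "__main__":
            # Illustrative module: 8.5 A photocurrent, 1e-9 A saturation current, 60 cells.
            v, i = iv_curve(i_l=8.5, i_o=1e-9, n=1.3, n_cells=60)
            p = v * i
            print(f"Voc = {v[-1]:.1f} V, Isc = {i[0]:.1f} A, Pmax = {p.max():.1f} W")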

  8. Sealife: a semantic grid browser for the life sciences applied to the study of infectious diseases.

    PubMed

    Schroeder, Michael; Burger, Albert; Kostkova, Patty; Stevens, Robert; Habermann, Bianca; Dieng-Kuntz, Rose

    2006-01-01

    The objective of Sealife is the conception and realisation of a semantic Grid browser for the life sciences, which will link the existing Web to the currently emerging eScience infrastructure. The SeaLife Browser will allow users to automatically link a host of Web servers and Web/Grid services to the Web content they are visiting. This will be accomplished using eScience's growing number of Web/Grid services and its XML-based standards and ontologies. The browser will identify terms in the pages being browsed through the background knowledge held in ontologies. Through the use of Semantic Hyperlinks, which link identified ontology terms to servers and services, the SeaLife Browser will offer a new dimension of context-based information integration. In this paper, we give an overview of the different components of the browser and their interplay. This SeaLife Browser will be demonstrated within three application scenarios in evidence-based medicine, literature & patent mining, and molecular biology, all relating to the study of infectious diseases. The three applications vertically integrate the molecule/cell, the tissue/organ and the patient/population level by covering the analysis of high-throughput screening data for endocytosis (the molecular entry pathway into the cell), the expression of proteins in the spatial context of tissue and organs, and a high-level library on infectious diseases designed for clinicians and their patients. For more information see http://www.biote.ctu-dresden.de/sealife.

  9. Materials Science and Materials Chemistry for Large Scale Electrochemical Energy Storage: From Transportation to Electrical Grid

    SciTech Connect

    Liu, Jun; Zhang, Jiguang; Yang, Zhenguo; Lemmon, John P.; Imhoff, Carl H.; Graff, Gordon L.; Li, Liyu; Hu, Jian Z.; Wang, Chong M.; Xiao, Jie; Xia, Guanguang; Viswanathan, Vilayanur V.; Baskaran, Suresh; Sprenkle, Vincent L.; Li, Xiaolin; Shao, Yuyan; Schwenzer, Birgit

    2013-02-15

    Large-scale electrical energy storage has become more important than ever for reducing fossil energy consumption in transportation and for the widespread deployment of intermittent renewable energy in the electric grid. However, significant challenges exist for its application. Here, the status and challenges are reviewed from the perspective of materials science and materials chemistry in electrochemical energy storage technologies, such as Li-ion batteries, sodium (sulfur and metal halide) batteries, Pb-acid batteries, redox flow batteries, and supercapacitors. Perspectives and approaches are introduced for emerging battery designs and new chemistry combinations to reduce the cost of energy storage devices.

  10. Collaborative Science Using Web Services and the SciFlo Grid Dataflow Engine

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Xing, Z.; Yunck, T.

    2006-12-01

    The General Earth Science Investigation Suite (GENESIS) project is a NASA-sponsored partnership between the Jet Propulsion Laboratory, academia, and NASA data centers to develop a new suite of Web Services tools to facilitate multi-sensor investigations in Earth System Science. The goal of GENESIS is to enable large-scale, multi-instrument atmospheric science using combined datasets from the AIRS, MODIS, MISR, and GPS sensors. Investigations include cross-comparison of spaceborne climate sensors, cloud spectral analysis, study of upper troposphere-stratosphere water transport, study of the aerosol indirect cloud effect, and global climate model validation. The challenges are to bring together very large datasets, reformat and understand the individual instrument retrievals, co-register or re-grid the retrieved physical parameters, perform computationally-intensive data fusion and data mining operations, and accumulate complex statistics over months to years of data. To meet these challenges, we have developed a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data access, subsetting, registration, mining, fusion, compression, and advanced statistical analysis. SciFlo leverages remote Web Services, called via Simple Object Access Protocol (SOAP) or REST (one-line) URLs, and the Grid Computing standards (WS-* & Globus Alliance toolkits), and enables scientists to do multi-instrument Earth Science by assembling reusable Web Services and native executables into a distributed computing flow (tree of operators). The SciFlo client & server engines optimize the execution of such distributed data flows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. In particular, SciFlo exploits the wealth of datasets accessible by OpenGIS Consortium (OGC) Web Mapping Servers & Web Coverage Servers (WMS/WCS), and by Open Data
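
    To convey what "assembling reusable operators into a distributed computing flow (tree of operators)" might look like, here is a small, hypothetical Python sketch of a dataflow tree evaluated locally. The operator names, the fake retrieval values, and the execution strategy are illustrative assumptions; this is not the SciFlo API.

        from dataclasses import dataclass, field
        from typing import Any, Callable

        @dataclass
        class OperatorNode:
            """One node in a dataflow tree: an operator applied to the results of its children."""
            name: str
            func: Callable[..., Any]
            children: list = field(default_factory=list)

            def execute(self) -> Any:
                inputs = [child.execute() for child in self.children]
                print(f"running operator: {self.name}")
                return self.func(*inputs)

        # Illustrative leaf operators standing in for remote data-access services.
        fetch_airs  = OperatorNode("fetch_airs",  lambda: [300.1, 299.8, 301.2])   # fake AIRS retrievals
        fetch_modis = OperatorNode("fetch_modis", lambda: [299.9, 300.0, 300.7])   # fake MODIS retrievals

        # Interior operators: co-registration (here a trivial pairing) and a statistic.
        coregister = OperatorNode("coregister", lambda a, b: list(zip(a, b)),
                                  children=[fetch_airs, fetch_modis])
        mean_diff = OperatorNode("mean_difference",
                                 lambda pairs: sum(x - y for x, y in pairs) / len(pairs),
                                 children=[coregister])

        if __name__ == "__main__":
            print("mean AIRS-MODIS difference:", mean_diff.execute())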

  11. Experimental optimization of the FireFly 600 photovoltaic off-grid system.

    SciTech Connect

    Boyson, William Earl; Orozco, Ron; Ralph, Mark E.; Brown, Marlene Laura; King, David L.; Hund, Thomas D.

    2003-10-01

    A comprehensive evaluation and experimental optimization of the FireFly™ 600 off-grid photovoltaic system manufactured by Energia Total, Ltd. was conducted at Sandia National Laboratories in May and June of 2001. This evaluation was conducted at the request of the manufacturer and addressed performance of individual system components, overall system functionality and performance, safety concerns, and compliance with applicable codes and standards. A primary goal of the effort was to identify areas for improvement in performance, reliability, and safety. New system test procedures were developed during the effort.

  12. Integrating Solar Power onto the Electric Grid - Bridging the Gap between Atmospheric Science, Engineering and Economics

    NASA Astrophysics Data System (ADS)

    Ghonima, M. S.; Yang, H.; Zhong, X.; Ozge, B.; Sahu, D. K.; Kim, C. K.; Babacan, O.; Hanna, R.; Kurtz, B.; Mejia, F. A.; Nguyen, A.; Urquhart, B.; Chow, C. W.; Mathiesen, P.; Bosch, J.; Wang, G.

    2015-12-01

    One of the main obstacles to high penetrations of solar power is the variable nature of solar power generation. To mitigate variability, grid operators have to schedule additional reliability resources, at considerable expense, to ensure that load requirements are met by generation. Thus despite the cost of solar PV decreasing, the cost of integrating solar power will increase as penetration of solar resources onto the electric grid increases. There are three principal tools currently available to mitigate variability impacts: (i) flexible generation, (ii) storage, either virtual (demand response) or physical devices and (iii) solar forecasting. Storage devices are a powerful tool capable of ensuring smooth power output from renewable resources. However, the high cost of storage is prohibitive and markets are still being designed to leverage their full potential and mitigate their limitation (e.g. empty storage). Solar forecasting provides valuable information on the daily net load profile and upcoming ramps (increasing or decreasing solar power output) thereby providing the grid advance warning to schedule ancillary generation more accurately, or curtail solar power output. In order to develop solar forecasting as a tool that can be utilized by the grid operators we identified two focus areas: (i) develop solar forecast technology and improve solar forecast accuracy and (ii) develop forecasts that can be incorporated within existing grid planning and operation infrastructure. The first issue required atmospheric science and engineering research, while the second required detailed knowledge of energy markets, and power engineering. Motivated by this background we will emphasize area (i) in this talk and provide an overview of recent advancements in solar forecasting especially in two areas: (a) Numerical modeling tools for coastal stratocumulus to improve scheduling in the day-ahead California energy market. (b) Development of a sky imager to provide short term

  13. Experimental and Investigative Science: Support from the Pupil Researcher Initiative.

    ERIC Educational Resources Information Center

    Harrison, Bill; Mannion, Ken

    1996-01-01

    Describes the Pupil Researcher Initiative, a group of educators attempting to provide science teachers with resources aimed at supporting experimental and investigative science as defined in England's GCSE syllabi. Units were developed to reflect the different ways that real science research is carried out. (MKR)

  14. Space science experimentation automation and support

    NASA Technical Reports Server (NTRS)

    Frainier, Richard J.; Groleau, Nicolas; Shapiro, Jeff C.

    1994-01-01

    This paper outlines recent work done at the NASA Ames Artificial Intelligence Research Laboratory on automation and support of science experiments on the US Space Shuttle in low earth orbit. Three approaches to increasing the science return of these experiments using emerging automation technologies are described: remote control (telescience), science advisors for astronaut operators, and fully autonomous experiments. The capabilities and limitations of these approaches are reviewed.

  15. Experimental Investigation of Statistical Moments of Travel Time in Grid-Generated Turbulence

    NASA Astrophysics Data System (ADS)

    Durgin, William; Meleschi, Shangari; Andreeva, Tatiana

    2003-11-01

    An experimental technique for investigating the behavior of acoustic waves propagating through a turbulent medium is discussed. The present study utilizes the ultrasonic travel-time technique to diagnose grid-generated turbulence. The statistics of the travel-time variations of ultrasonic wave propagation along a path are used to determine some metrics of the turbulence. The experimental investigation is performed under well-controlled laboratory conditions using a data acquisition and control system featuring a high-speed analog-to-digital conversion card that enables fine resolution of ultrasonic signals. Experimental data confirm numerical and theoretical predictions of a nonlinear increase of the first-order travel-time variance with propagation distance. This behavior seems to be closely related to the occurrence of first caustics [Kulkarny and White; Blanc-Benon et al., 1991, 1995]. With increased turbulent intensity, the distance at which the first caustic occurs decreases. Numerically the phenomenon was explored by Blanc-Benon and Juvé [1990], Juvé et al. [1991], and Karweit et al. [1991]; the probability of the appearance of caustics in a random field was treated theoretically by Kulkarny and White [1982], Klyatskin [1993], and Blanc-Benon et al. [1995]. The current work seeks to illustrate the correspondence between flow parameters and the first appearance of non-linearity in the travel-time variance. Expansion of the experimental controls to include flow temperature references and rigid transducer supports adds to the integrity of the determined travel times.

  16. Grid-enabled measures: using Science 2.0 to standardize measures and share data.

    PubMed

    Moser, Richard P; Hesse, Bradford W; Shaikh, Abdul R; Courtney, Paul; Morgan, Glen; Augustson, Erik; Kobrin, Sarah; Levin, Kerry Y; Helba, Cynthia; Garner, David; Dunn, Marsha; Coa, Kisha

    2011-05-01

    Scientists are taking advantage of the Internet and collaborative web technology to accelerate discovery in a massively connected, participative environment--a phenomenon referred to by some as Science 2.0. As a new way of doing science, this phenomenon has the potential to push science forward in a more efficient manner than was previously possible. The Grid-Enabled Measures (GEM) database has been conceptualized as an instantiation of Science 2.0 principles by the National Cancer Institute (NCI) with two overarching goals: (1) promote the use of standardized measures, which are tied to theoretically based constructs; and (2) facilitate the ability to share harmonized data resulting from the use of standardized measures. The first is accomplished by creating an online venue where a virtual community of researchers can collaborate together and come to consensus on measures by rating, commenting on, and viewing meta-data about the measures and associated constructs. The second is accomplished by connecting the constructs and measures to an ontological framework with data standards and common data elements such as the NCI Enterprise Vocabulary System (EVS) and the cancer Data Standards Repository (caDSR). This paper will describe the web 2.0 principles on which the GEM database is based, describe its functionality, and discuss some of the important issues involved with creating the GEM database such as the role of mutually agreed-on ontologies (i.e., knowledge categories and the relationships among these categories--for data sharing).

  17. Surface Modeling and Grid Generation of Orbital Sciences X34 Vehicle. Phase 1

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    The surface modeling and grid generation requirements, motivations, and methods used to develop Computational Fluid Dynamic volume grids for the X34-Phase 1 are presented. The requirements set forth by the Aerothermodynamics Branch at the NASA Langley Research Center serve as the basis for the final techniques used in the construction of all volume grids, including grids for parametric studies of the X34. The Integrated Computer Engineering and Manufacturing code for Computational Fluid Dynamics (ICEM/CFD), the Grid Generation code (GRIDGEN), the Three-Dimensional Multi-block Advanced Grid Generation System (3DMAGGS) code, and Volume Grid Manipulator (VGM) code are used to enable the necessary surface modeling, surface grid generation, volume grid generation, and grid alterations, respectively. All volume grids generated for the X34, as outlined in this paper, were used for CFD simulations within the Aerothermodynamics Branch.

  18. Visual monitoring of autonomous life sciences experimentation

    NASA Technical Reports Server (NTRS)

    Blank, G. E.; Martin, W. N.

    1987-01-01

    The design and implementation of a computerized visual monitoring system to aid in the monitoring and control of life sciences experiments on board a space station were investigated. A likely multiprocessor design was chosen, a plausible life science experiment to work with was defined, the theoretical issues involved in programming a visual monitoring system for the experiment on the multiprocessor were considered, a system for monitoring the experiment was designed, and simulations of such a system were implemented on a network of Apollo workstations.

  19. Experimental investigation of the dynamics of a vibrating grid in superfluid 4He over a range of temperatures and pressures.

    PubMed

    Charalambous, D; Skrbek, L; Hendry, P C; McClintock, P V E; Vinen, W F

    2006-09-01

    In an earlier paper [Nichol, Phys. Rev. E 70, 056307 (2004)] some of the present authors presented the results of an experimental study of the dynamics of a stretched grid driven into vibration at or near its resonant frequency in isotopically pure superfluid 4He over a range of pressures at a very low temperature, where the density of normal fluid is negligible. In this paper we present the results of a similar study, based on a different grid, but now including the temperature range where the normal fluid density is no longer insignificant. The new grid is very similar to the old one except for a small difference in the character of its surface roughness. In many respects the results at low temperature are similar to those for the old grid. At low amplitudes the results are somewhat history dependent, but in essence there is no damping greater than that in vacuo. At a critical amplitude, corresponding to a velocity of about 50 mm s(-1), there is a sudden and large increase in damping, which can be attributed to the generation of new vortex lines. Strange shifts in the resonant frequency at intermediate amplitudes observed with the old grid are no longer seen; they must therefore have been associated with the different surface roughness, or perhaps were due simply to some artifact of the old grid, the details of which we are currently unable to determine. With the new grid we have studied both the damping at low amplitudes due to excitations of the normal fluid and the dependence of the supercritical damping on temperature. We present evidence that in helium at low amplitudes there may be some enhancement in the effective mass of the grid in addition to that associated with potential flow of the helium. In some circumstances small satellite resonances are seen near the main fundamental grid resonance; these are attributed to coupling to some other oscillatory system within the experimental cell.

  20. RAON experimental facilities for nuclear science

    SciTech Connect

    Kwon, Y. K.; Kim, Y. K.; Komatsubara, T.; Moon, J. Y.; Park, J. S.; Shin, T. S.; Kim, Y. J.

    2014-05-02

    The Rare Isotope Science Project (RISP) was established in December 2011 and has put considerable effort into the design and construction of the accelerator complex facility named “RAON”. RAON is a rare isotope (RI) beam facility that aims to provide various RI beams of proton- and neutron-rich nuclei as well as a variety of stable ion beams over a wide range of energies, up to a few hundred MeV/nucleon, for research in basic science and applications. Proposed research programs for nuclear physics and nuclear astrophysics at RAON include studies of the properties of exotic nuclei, the equation of state of nuclear matter, the origin of the universe, the process of nucleosynthesis, super-heavy elements, etc. Various high-performance magnetic spectrometers for nuclear science have been designed: KOBRA (KOrea Broad acceptance Recoil spectrometer and Apparatus), LAMPS (Large Acceptance Multi-Purpose Spectrometer), and ZDS (Zero Degree Spectrometer). The status of these spectrometers for nuclear science will be presented, with a brief report on RAON.

  1. Geomorphology, Science (Experimental): 5343.09.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    Performance objectives are stated for this secondary school instructional unit concerned with aspects of earth science with emphases on the internal and external forces that bring about changes in the earth's crust. Lists of films and state-adopted and other texts are presented. Included are a course outline summarizing the unit content; numerous…

  2. Environmental Science, Grade 9. Experimental Curriculum Bulletin.

    ERIC Educational Resources Information Center

    Bernstein, Leonard, Ed.

    This is the teacher's guide for the required, interdisciplinary, ninth-year environmental science course for the New York City Schools. One hundred twenty lesson plans, divided into nine units, are presented. Areas of study include the living and non-living environment, ecosystems, population, urban ecology, energy and technology, pollution, and…

  3. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid

    PubMed Central

    Poehlman, William L.; Rynge, Mats; Branton, Chris; Balamurugan, D.; Feltus, Frank A.

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments. PMID:27499617

  4. Scalability of network facing services used in the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Sfiligoi, I.; Pi, H.; Würthwein, F.; Theissen, C.; Dost, J. M.

    2011-12-01

    The Open Science Grid relies on several network facing services to deliver resources to its users. The major services are the Compute Elements, Storage Elements, Workload Management Systems and Information Systems. All of these services are exposed to traffic coming from all over the world in an unmanaged way, so it is very important to know how they behave at different levels of load. In this paper we present the methodology and the results of scalability and reliability tests performed by OSG on some of the above services. The major services being tested are the Condor batch system, the GT2, GRAM5 and CREAM CEs, and the BeStMan SRM SE.
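
    A minimal Python sketch of the kind of load ramp used in such scalability tests is shown below: it issues increasing numbers of concurrent requests against a service endpoint and records success rate and median latency. The endpoint URL, the probe, and the ramp schedule are illustrative assumptions; the real OSG tests exercised the actual CE/SE and batch-system client tools rather than a plain HTTP probe.

        import concurrent.futures
        import statistics
        import time
        import urllib.request

        SERVICE_URL = "https://example.org/service/status"   # placeholder endpoint

        def probe(url: str, timeout: float = 30.0):
            """Issue one request and return (succeeded, latency_seconds)."""
            start = time.perf_counter()
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    ok = 200 <= resp.status < 300
            except Exception:
                ok = False
            return ok, time.perf_counter() - start

        def ramp(url: str, levels=(1, 10, 50, 100)):
            for concurrency in levels:
                with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
                    results = list(pool.map(lambda _: probe(url), range(concurrency)))
                successes = [lat for ok, lat in results if ok]
                rate = len(successes) / concurrency
                median = statistics.median(successes) if successes else float("nan")
                print(f"concurrency={concurrency:4d} success_rate={rate:.2f} median_latency={median:.2f}s")

        if __name__ == "__main__":
            ramp(SERVICE_URL)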

  5. OSG-GEM: Gene Expression Matrix Construction Using the Open Science Grid.

    PubMed

    Poehlman, William L; Rynge, Mats; Branton, Chris; Balamurugan, D; Feltus, Frank A

    2016-01-01

    High-throughput DNA sequencing technology has revolutionized the study of gene expression while introducing significant computational challenges for biologists. These computational challenges include access to sufficient computer hardware and functional data processing workflows. Both these challenges are addressed with our scalable, open-source Pegasus workflow for processing high-throughput DNA sequence datasets into a gene expression matrix (GEM) using computational resources available to U.S.-based researchers on the Open Science Grid (OSG). We describe the usage of the workflow (OSG-GEM), discuss workflow design, inspect performance data, and assess accuracy in mapping paired-end sequencing reads to a reference genome. A target OSG-GEM user is proficient with the Linux command line and possesses basic bioinformatics experience. The user may run this workflow directly on the OSG or adapt it to novel computing environments.

  6. Nuclear Test-Experimental Science: Annual report, fiscal year 1988

    SciTech Connect

    Struble, G.L.; Donohue, M.L.; Bucciarelli, G.; Hymer, J.D.; Kirvel, R.D.; Middleton, C.; Prono, J.; Reid, S.; Strack, B.

    1988-01-01

    Fiscal year 1988 has been a significant, rewarding, and exciting period for Lawrence Livermore National Laboratory's nuclear testing program. It was significant in that the Laboratory's new director chose to focus strongly on the program's activities and to commit to a revitalized emphasis on testing and the experimental science that underlies it. It was rewarding in that revolutionary new measurement techniques were fielded on recent important and highly complicated underground nuclear tests with truly incredible results. And it was exciting in that the sophisticated and fundamental problems of weapons science that are now being addressed experimentally are yielding new challenges and understanding in ways that stimulate and reward the brightest and best of scientists. During FY88 the program was reorganized to emphasize our commitment to experimental science. The name of the program was changed to reflect this commitment, becoming the Nuclear Test-Experimental Science (NTES) Program.

  7. Recent developments in the experimental identification of the dynamics of a highly flexible grid

    NASA Technical Reports Server (NTRS)

    Montgomery, Raymond C.; Lazarus, Terri

    1987-01-01

    Control effectiveness tests of reaction wheel actuators attached to a highly flexible grid are reported. Analytic determination of actuator control effectiveness is accomplished with finite-element modeling. Experimental determination is done with a least-square parameter identification algorithm that identifies the control coefficients of the second-order difference equation model of each vibration mode. The algorithm assumes a model with frequency and damping predetermined from free-decay tests for each mode. Accounting for the difference in forced and resonant frequency was necessary to produce control effectiveness estimates that are in reasonable agreement with the analytic predictions. The average error for control effectiveness coefficients greater than 5/sq sec was 6.384 percent.

  8. [Experimentation with women: science fiction or reality?].

    PubMed

    Villar Amigó, Vicente M

    2008-01-01

    Many people will not have heard about the experimentation that has been, and continues to be, carried out on women, because much of the media makes no mention of the matter. Just a few examples that could be mentioned are experimentation with the contraceptive pill, forced sterilization, egg donation, surrogate motherhood, kidney and other organ donation, and unnecessary therapy and surgery. In a few cases such experimentation could well be termed exploitation of women, with all kinds of excuses or humanitarian reasons, and sometimes communitarian purposes and even reasons concerning possible benefits for the whole of society, being mentioned. The present work aims to stimulate reflection about some types of research which can only be regarded as exploitation of women.

  9. Roles and applications of biomedical ontologies in experimental animal science.

    PubMed

    Masuya, Hiroshi

    2012-01-01

    A huge amount of experimental data from past studies has played a vital role in the development of new knowledge and technologies in biomedical science. The importance of computational technologies for the reuse of data, data integration, and knowledge discoveries has also increased, providing means of processing large amounts of data. In recent years, information technologies related to "ontologies" have played more significant roles in the standardization, integration, and knowledge representation of biomedical information. This review paper outlines the history of data integration in biomedical science and its recent trends in relation to the field of experimental animal science.

  10. A Laboratory That Reveals Indirect Experimental Methods of Science

    ERIC Educational Resources Information Center

    Schmidt, Chris; And Others

    1975-01-01

    Describes a physics laboratory experiment for nonscience majors intended to illustrate indirect experimental methods of sciences treating objects too small for sensory observation. The student explores and attempts to identify an object in a closed container, using provided experimental tools and small openings in the opaque lid. (MLH)

  11. The NASA/GSFC Advanced Data Grid: A Prototype for Future Earth Science Ground System Architectures

    NASA Technical Reports Server (NTRS)

    Gasster, Samuel D.; Lee, Craig; Davis, Brooks; Clark, Matt; AuYeung, Mike; Wilson, John R.; Ladwig, Debra M.

    2003-01-01

    Contents include the following: Background and motivation. Grid computing concepts. Advanced data grid (ADG) prototype development. ADG requirements and operations concept. ADG architecture. ADG implementation. ADG test plan. ADG schedule. Summary and status.

  12. Experimental comparison of inquiry and direct instruction in science

    NASA Astrophysics Data System (ADS)

    Cobern, William W.; Schuster, David; Adams, Betty; Applegate, Brooks; Skjold, Brandy; Undreiu, Adriana; Loving, Cathleen C.; Gobert, Janice D.

    2010-04-01

    There are continuing educational and political debates about 'inquiry' versus 'direct' teaching of science. Traditional science instruction has been largely direct but in the US, recent national and state science education standards advocate inquiry throughout K-12 education. While inquiry-based instruction has the advantage of modelling aspects of the nature of real scientific inquiry, there is little unconfounded comparative research into the effectiveness and efficiency of the two instructional modes for developing science conceptual understanding. This research undertook a controlled experimental study comparing the efficacy of carefully designed inquiry instruction and equally carefully designed direct instruction in realistic science classroom situations at the middle school grades. The research design addressed common threats to validity. We report on the nature of the instructional units in each mode, research design, methods, classroom implementations, monitoring, assessments, analysis and project findings.

  13. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-12-01

    Modern long baseline neutrino experiments, like the NOvA experiment at Fermilab, require large-scale, compute-intensive simulations of their neutrino beam fluxes and backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large-scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources.

  14. The Structure of Scientific Arguments by Secondary Science Teachers: Comparison of Experimental and Historical Science Topics

    ERIC Educational Resources Information Center

    Gray, Ron; Kang, Nam-Hwa

    2014-01-01

    Just as scientific knowledge is constructed using distinct modes of inquiry (e.g. experimental or historical), arguments constructed during science instruction may vary depending on the mode of inquiry underlying the topic. The purpose of this study was to examine whether and how secondary science teachers construct scientific arguments during…

  15. ISOGA: Integrated Services Optical Grid Architecture for Emerging E-Science Collaborative Applications

    SciTech Connect

    Oliver Yu

    2008-11-28

    This final report describes the accomplishments in the ISOGA (Integrated Services Optical Grid Architecture) project. ISOGA enables efficient deployment of existing and emerging collaborative grid applications with increasingly diverse multimedia communication requirements over a wide-area multi-domain optical network grid, and provides collaborating scientists with fast retrieval and seamless browsing of distributed scientific multimedia datasets over a wide-area optical network grid. The project focuses on research and development in the following areas: the polymorphic optical network control planes to enable multiple switching and communication services simultaneously; the intelligent optical grid user-network interface to enable user-centric network control and monitoring; and the seamless optical grid dataset browsing interface to enable fast retrieval of local/remote datasets for visualization and manipulation.

  16. Experimental Study of Two Phase Flow Behavior Past BWR Spacer Grids

    SciTech Connect

    Ratnayake, Ruwan K.; Hochreiter, L.E.; Ivanov, K.N.; Cimbala, J.M.

    2002-07-01

    Performance of best estimate codes used in the nuclear industry can be significantly improved by reducing the empiricism embedded in their constitutive models. Spacer grids have been found to have an important impact on the maximum allowable Critical Heat Flux within the fuel assembly of a nuclear reactor core. Therefore, incorporation of suitable spacer grid models can improve the critical heat flux prediction capability of best estimate codes. Realistic modeling of the entrainment behavior of spacer grids requires understanding the different mechanisms that are involved. Since visual information pertaining to the entrainment behavior of spacer grids cannot possibly be obtained from operating nuclear reactors, experiments have to be designed and conducted for this specific purpose. Most of the spacer grid experiments available in the literature have been designed with a view to obtaining quantitative data for the purpose of developing or modifying empirical formulations for heat transfer, critical heat flux or pressure drop. Very few experiments have been designed to provide fundamental information which can be used to understand spacer grid effects and phenomena involved in two-phase flow. Air-water experiments were conducted to obtain visual information on the two-phase flow behavior both upstream and downstream of Boiling Water Reactor (BWR) spacer grids. The test section was designed and constructed using prototypic dimensions such as the channel cross-section, rod diameter and other spacer grid configurations of a typical BWR fuel assembly. The test section models the flow behavior in two adjacent subchannels in the BWR core. A portion of a prototypic BWR spacer grid accounting for two adjacent channels was used with industrial mild steel rods for the purpose of representing the channel internals. Symmetry was preserved in this practice, so that the channel walls could effectively be considered as the channel boundaries. Thin films were established on the rod surfaces

  17. The Experimenter Expectancy Effect: An Inevitable Component of School Science?

    ERIC Educational Resources Information Center

    Allen, Michael

    2015-01-01

    A medium-scale quantitative study (n = 90) found that 10-11-year-old pupils dealt with theory and evidence in notably different ways, depending on how the same science practical task was delivered. Under the auspices of a 2×2 part-randomised and part-quasi experimental design, pupils were asked to complete a brief, apparently simple task involving…

  18. An Illustration of the Experimenter Expectancy Effect in School Science

    ERIC Educational Resources Information Center

    Allen, Michael; Briten, Elizabeth

    2012-01-01

    Two groups of year 6 pupils (age 10-11 years) each experienced science practical lessons that were essentially identical but for one difference: one group (theory-led) were told by the teacher what result they should expect, and the other group (hypothetico-deductive) were not. The theory-led group demonstrated experimental bias, recording results…

  19. Environmental Science. An Experimental Programme for Primary Teachers.

    ERIC Educational Resources Information Center

    Linke, R. D.

    An experimental course covering some of the fundamental principles and terminology associated with environmental science and the application of these principles to various contemporary problems is summarized in this report. The course involved a series of lectures together with a program of specific seminar and discussion topics presented by the…

  20. The influence of science popularizers on the public's view of religion and science: An experimental assessment.

    PubMed

    Scheitle, Christopher P; Ecklund, Elaine Howard

    2015-06-08

    Research suggests that public figures can play an influential role in forming public opinion; yet, little research has experimentally tested the efficacy of public figures on the cognitive formation of boundaries. Using an experiment embedded within a nationally representative survey, we examine how two science popularizers, Francis Collins and Richard Dawkins, influence perceptions regarding the boundaries between religion and science. We find that learning of Dawkins does not influence people's perceptions of the religion-science relationship, while learning of Collins shifts respondents toward a collaborative view of religion and science. Findings suggest that figures with unexpected views might be more effective in changing conceptual boundaries.

  1. Practical use of a framework for network science experimentation

    NASA Astrophysics Data System (ADS)

    Toth, Andrew; Bergamaschi, Flavio

    2014-06-01

    In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, called the International Technology Alliance (ITA) in Network and Information Sciences, to address fundamental issues concerning network and information sciences that will enhance decision making for coalition operations, enable rapid, secure formation of ad hoc teams in coalition environments, and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through collaboration between ARL and IBM UK to characterize and define a software stack and tooling that has become the reference framework for network science experimentation in support of the validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA), in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components, the Apollo Fact-Finder tool and the Medusa Crowd Sensing application, the limitations identified, and how they shall be addressed in future work.

  2. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  3. Development of experimental systems for material sciences under microgravity

    NASA Technical Reports Server (NTRS)

    Tanii, Jun; Obi, Shinzo; Kamimiyata, Yotsuo; Ajimine, Akio

    1988-01-01

    As part of the Space Experiment Program of the Society of Japanese Aerospace Companies, three experimental systems (G452, G453, G454) have been developed for materials science studies under microgravity by the NEC Corporation. These systems are to be flown as Get Away Special payloads for studying the feasibility of producing new materials. Together with the experimental modules carrying the hardware specific to the experiment, the three systems all comprise standard subsystems consisting of a power supply, sequence controller, temperature controller, data recorder, and video recorder.

  4. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis.

    PubMed

    Duarte, Afonso M S; Psomopoulos, Fotis E; Blanchet, Christophe; Bonvin, Alexandre M J J; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C; de Lucas, Jesus M; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community.

  5. Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

    PubMed Central

    Duarte, Afonso M. S.; Psomopoulos, Fotis E.; Blanchet, Christophe; Bonvin, Alexandre M. J. J.; Corpas, Manuel; Franc, Alain; Jimenez, Rafael C.; de Lucas, Jesus M.; Nyrönen, Tommi; Sipos, Gergely; Suhr, Stephanie B.

    2015-01-01

    With the increasingly rapid growth of data in life sciences we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we aim to provide the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community. PMID:26157454

  6. Experimental Physical Sciences Vistas: MaRIE (draft)

    SciTech Connect

    Shlachter, Jack

    2010-09-08

    To achieve breakthrough scientific discoveries in the 21st century, a convergence and integration of world-leading experimental facilities and capabilities with theory, modeling, and simulation is necessary. In this issue of Experimental Physical Sciences Vistas, I am excited to present our plans for Los Alamos National Laboratory's future flagship experimental facility, MaRIE (Matter-Radiation Interactions in Extremes). MaRIE is a facility that will provide transformational understanding of matter in extreme conditions required to reduce or resolve key weapons performance uncertainties, develop the materials needed for advanced energy systems, and transform our ability to create materials by design. Our unique role in materials science starting with the Manhattan Project has positioned us well to develop a contemporary materials strategy pushing the frontiers of controlled functionality - the design and tailoring of a material for the unique demands of a specific application. Controlled functionality requires improvement in understanding of the structure and properties of materials in order to synthesize and process materials with unique characteristics. In the nuclear weapons program today, improving data and models to increase confidence in the stockpile can take years from concept to new knowledge. Our goal with MaRIE is to accelerate this process by enhancing predictive capability - the ability to compute a priori the observables of an experiment or test and pertinent confidence intervals using verified and validated simulation tools. It is a science-based approach that includes the use of advanced experimental tools, theoretical models, and multi-physics codes, simultaneously dealing with multiple aspects of physical operation of a system that are needed to develop an increasingly mature predictive capability. This same approach is needed to accelerate improvements to other systems such as nuclear reactors. MaRIE will be valuable to many national security

  7. Animal experimentation in forensic sciences: How far have we come?

    PubMed

    Cattaneo, C; Maderna, E; Rendinelli, A; Gibelli, D

    2015-09-01

    In the third millennium, where ethical, ethological and cultural evolution seem to be leading more and more towards an inter-species society, the issue of animal experimentation is a moral dilemma. Speaking from a self-interested human perspective, avoiding all animal testing where human disease and therapy are concerned may be very difficult or even impossible; such testing may not be so easily justifiable when suffering, or killing, of non-human animals is inflicted for forensic research. In order to verify how forensic scientists are evolving on this ethical issue, we undertook a systematic review of the current literature. We investigated the frequency of animal experimentation in forensic studies in the past 15 years and trends in publication in the main forensic science journals. Types of species, lesions inflicted, manner of sedation or anesthesia and euthanasia were examined in a total of 404 articles reviewed, among which 279 (69.1%) concerned studies involving animals sacrificed exclusively for the sake of the experiment. Killing still frequently includes painful methods such as blunt trauma, electrocution, mechanical asphyxia, hypothermia, and even exsanguination; of all these animals, apparently only 60.8% were anesthetized. The most recent call for a severe reduction if not a total halt to the use of animals in forensic sciences was made by Bernard Knight in 1992. In fact the principle of reduction and replacement, frequently respected in clinical research, must be considered the basis for forensic science research needing animals.

  8. Boundary condition identification for a grid model by experimental and numerical dynamic analysis

    NASA Astrophysics Data System (ADS)

    Mao, Qiang; Devitis, John; Mazzotti, Matteo; Bartoli, Ivan; Moon, Franklin; Sjoblom, Kurt; Aktan, Emin

    2015-04-01

    There is a growing need to characterize unknown foundations and assess substructures in existing bridges. It is becoming an important issue for the serviceability and safety of bridges as well as for the possibility of partial reuse of existing infrastructures. Within this broader context, this paper investigates the possibility of identifying, locating and quantifying changes of boundary conditions by leveraging a simply supported grid structure with a composite deck. Multi-reference impact tests are performed on the grid model, and one supporting bearing is modified by replacing a steel cylindrical roller with a roller of compliant material. Impact-based modal analysis provides global modal parameters such as damped natural frequencies, mode shapes and the flexibility matrix, which are used as indicators of boundary condition changes. An updating process combining a hybrid optimization algorithm and the finite element software suite ABAQUS is presented in this paper. The updated ABAQUS model of the grid, which simulates the supporting bearing with springs, is used to detect and quantify the change of the boundary conditions.
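
    The updating idea described above can be illustrated with a much smaller stand-in for the paper's ABAQUS-plus-hybrid-optimization workflow: tune an unknown support stiffness until the natural frequencies of a simple model match "measured" ones. The Python sketch below uses a toy two-degree-of-freedom system; the masses, stiffnesses and target frequencies are illustrative assumptions, not values from the study.

      # Minimal model-updating sketch: identify a boundary spring stiffness by
      # matching simulated natural frequencies to measured ones. All numbers are
      # assumed for illustration; the paper's actual model is an ABAQUS grid model.
      import numpy as np
      from scipy.linalg import eigh
      from scipy.optimize import minimize_scalar

      m1, m2 = 100.0, 80.0      # lumped masses [kg] (assumed)
      k_deck = 5.0e6            # stiffness coupling the two DOFs [N/m] (assumed)

      def natural_frequencies(k_support):
          """Undamped natural frequencies [Hz] of a toy 2-DOF model whose
          boundary is represented by a spring of stiffness k_support."""
          K = np.array([[k_deck + k_support, -k_deck],
                        [-k_deck,             k_deck]])
          M = np.diag([m1, m2])
          lam = eigh(K, M, eigvals_only=True)      # generalized eigenvalues, (rad/s)^2
          return np.sqrt(lam) / (2.0 * np.pi)

      f_measured = np.array([18.0, 52.0])          # "measured" modal frequencies [Hz] (assumed)

      def objective(k_support):
          return np.sum((natural_frequencies(k_support) - f_measured) ** 2)

      res = minimize_scalar(objective, bounds=(1e4, 1e8), method="bounded")
      print(f"identified support stiffness ~ {res.x:.3e} N/m")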

  9. Teaching science problem solving: An overview of experimental work

    NASA Astrophysics Data System (ADS)

    Taconis, R.; Ferguson-Hessler, M. G. M.; Broekkamp, H.

    2001-04-01

    The traditional approach to teaching science problem solving is having the students work individually on a large number of problems. This approach has long been overtaken by research suggesting and testing other methods, which are expected to be more effective. To get an overview of the characteristics of good and innovative problem-solving teaching strategies, we performed an analysis of a number of articles published between 1985 and 1995 in high-standard international journals, describing experimental research into the effectiveness of a wide variety of teaching strategies for science problem solving. To characterize the teaching strategies found, we used a model of the capacities needed for effective science problem solving, composed of a knowledge base and a skills base. The relations between the cognitive capacities required by the experimental or control treatments and those of the model were specified and used as independent variables. Other independent variables were learning conditions such as feedback and group work. As a dependent variable we used standardized learning effects. We identified 22 articles describing 40 experiments that met the standards we deemed necessary for a meta-analysis. These experiments were analyzed both with quantitative (correlational) methods and with a systematic qualitative method. A few of the independent variables were found to characterize effective strategies for teaching science problem solving. Effective treatments all gave attention to the structure and function (the schemata) of the knowledge base, whereas attention to knowledge of strategy and the practice of problem solving turned out to have little effect. As for learning conditions, both providing the learners with guidelines and criteria they can use in judging their own problem-solving process and products, and providing immediate feedback to them were found to be important prerequisites for the acquisition of problem-solving skills. Group work did not lead to

  10. Analyzing Sustainable Energy Opportunities for a Small Scale Off-Grid Facility: A Case Study at Experimental Lakes Area (ELA), Ontario

    NASA Astrophysics Data System (ADS)

    Duggirala, Bhanu

    This thesis explored opportunities to reduce energy demand and the feasibility of renewable energy at an off-grid science "community" called the Experimental Lakes Area (ELA) in Ontario. Being off-grid, ELA is completely dependent on diesel and propane fuel supply for all its electrical and heating needs, which makes ELA vulnerable to fluctuating fuel prices. As a result ELA emits a large amount of greenhouse gases (GHG) for its size. Energy efficiency and renewable energy technologies can reduce energy consumption and consequently energy cost, as well as GHG. Energy efficiency was very important to ELA due to the elevated fuel costs at this remote location. Minor upgrades to lighting, equipment and building envelope were able to reduce energy costs and reduce load. Efficient energy saving measures were recommended that save on operating and maintenance costs, namely, changing to LED lights, replacing old equipment like refrigerators and downsizing of ice makers. This resulted in a 4.8% load reduction and subsequently reduced the initial capital cost by $27,000 for biomass, by $49,500 for wind power and by $136,500 for solar power. Many alternative energies show promise as potential energy sources to reduce the diesel and propane consumption at ELA, including wind energy, solar heating and biomass. A biomass-based CHP system using the existing diesel generators as back-up has the shortest payback period of the technologies modeled. The biomass-based CHP system has a payback period of 4.1 years at $0.80 per liter of diesel; as the diesel price approaches $2.00 per liter, the payback period reduces to 0.9 years, at 50% of the present generation cost. Biomass has been successfully tried and tested in many off-grid communities, particularly in small-scale off-grid settings in North America and internationally. Also, the site-specific solar and wind data show that ELA has potential to harvest renewable resources and produce heat and power at competitive

  11. EverVIEW: a visualization platform for hydrologic and Earth science gridded data

    USGS Publications Warehouse

    Romañach, Stephanie S.; McKelvy, James M.; Suir, Kevin J.; Conzelmann, Craig

    2015-01-01

    The EverVIEW Data Viewer is a cross-platform desktop application that combines and builds upon multiple open source libraries to help users to explore spatially-explicit gridded data stored in Network Common Data Form (NetCDF). Datasets are displayed across multiple side-by-side geographic or tabular displays, showing colorized overlays on an Earth globe or grid cell values, respectively. Time-series datasets can be animated to see how water surface elevation changes through time or how habitat suitability for a particular species might change over time under a given scenario. Initially targeted toward Florida's Everglades restoration planning, EverVIEW has been flexible enough to address the varied needs of large-scale planning beyond Florida, and is currently being used in biological planning efforts nationally and internationally.
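
    As a concrete illustration of the kind of gridded NetCDF input EverVIEW is built around, the short Python sketch below opens a file, pulls one grid cell's time series and a spatial mean using the netCDF4 library. The file name and the variable and dimension names are hypothetical placeholders, not the actual Everglades model output format.

      # Inspect a gridded NetCDF time series (hypothetical file and variable names).
      from netCDF4 import Dataset
      import numpy as np

      with Dataset("everglades_scenario.nc") as ds:
          print(list(ds.dimensions), list(ds.variables))        # discover the layout
          elev = ds.variables["water_surface_elevation"]        # assumed time x lat x lon grid
          times = ds.variables["time"][:]

          # One grid cell through time -- the kind of series EverVIEW animates.
          series = elev[:, 120, 85]
          print(f"{len(times)} time steps, cell range {series.min():.2f}..{series.max():.2f}")

          # Spatial mean at the final time step, ignoring masked (no-data) cells.
          print("domain mean at final step:", float(np.ma.mean(elev[-1, :, :])))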

  12. PhosphoGRID: a database of experimentally verified in vivo protein phosphorylation sites from the budding yeast Saccharomyces cerevisiae.

    PubMed

    Stark, Chris; Su, Ting-Cheng; Breitkreutz, Ashton; Lourenco, Pedro; Dahabieh, Matthew; Breitkreutz, Bobby-Joe; Tyers, Mike; Sadowski, Ivan

    2010-01-01

    Protein phosphorylation plays a central role in cellular regulation. Recent proteomics strategies for identifying phosphopeptides have been developed using the model organism Saccharomyces cerevisiae, and consequently, when combined with studies of individual gene products, the number of reported specific phosphorylation sites for this organism has expanded enormously. In order to systematically document and integrate these various data types, we have developed a database of experimentally verified in vivo phosphorylation sites curated from the S. cerevisiae primary literature. PhosphoGRID (www.phosphogrid.org) records the positions of over 5000 specific phosphorylated residues on 1495 gene products. Nearly 900 phosphorylated residues are reported from detailed studies of individual proteins; these in vivo phosphorylation sites are documented by a hierarchy of experimental evidence codes. Where available for specific sites, we have also noted the relevant protein kinases and/or phosphatases, the specific condition(s) under which phosphorylation occurs, and the effect(s) that phosphorylation has on protein function. The unique features of PhosphoGRID that assign both function and specific physiological conditions to each phosphorylated residue will provide a valuable benchmark for proteome-level studies and will facilitate bioinformatic analysis of cellular signal transduction networks. Database URL: http://phosphogrid.org/
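
    To show the kind of per-residue records such a resource curates, the Python sketch below tallies phosphorylation sites per gene product from a flat tab-delimited export. The file name and column names ("gene", "kinases") are hypothetical and do not reflect PhosphoGRID's actual download format.

      # Count curated phosphosites per protein from a hypothetical TSV export.
      import csv
      from collections import Counter

      sites_per_protein = Counter()
      cdc28_substrates = set()

      with open("phosphosites.tsv", newline="") as fh:
          for row in csv.DictReader(fh, delimiter="\t"):
              sites_per_protein[row["gene"]] += 1
              if "CDC28" in row.get("kinases", ""):          # assumed column of kinase names
                  cdc28_substrates.add(row["gene"])

      print("most heavily phosphorylated gene products:", sites_per_protein.most_common(5))
      print("putative Cdc28 substrates:", len(cdc28_substrates))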

  13. Experimental Evaluation of Load Rejection Over-Voltage from Grid-Tied Solar Inverters

    SciTech Connect

    Nelson, Austin; Hoke, Anderson; Chakraborty, Sudipta; Ropp, Michael; Chebahtah, Justin; Wang, Trudie; Zimmerly, Brian

    2015-06-14

    This paper investigates the impact of load rejection over-voltage (LRO) from commercially available grid-tied photovoltaic (PV) inverters. LRO can occur when a breaker opens and the power output from a distributed energy resource (DER) exceeds the load. Simplified models of current-controlled inverters can over-predict LRO magnitudes, thus it is useful to quantify the effect through laboratory testing. The load rejection event was replicated using a hardware testbed at the National Renewable Energy Laboratory (NREL), and a set of commercially available PV inverters was tested to quantify the impact of LRO for a range of generation-to-load ratios. The magnitude and duration of the over-voltage events are reported in this paper along with a discussion of characteristic inverter output behavior. The results for the inverters under test showed that maximum over-voltage magnitudes were less than 200% of nominal voltage, and much lower in many test cases. These research results are important because utilities that interconnect inverter-based DER need to understand their characteristics under abnormal grid conditions.
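
    The two quantities reported for each test, peak over-voltage magnitude and the time spent above a threshold, reduce to simple operations on the recorded voltage envelope. The Python sketch below applies them to a synthetic waveform; the nominal voltage, sampling rate, 110% threshold and the waveform itself are assumptions for illustration, not NREL test data.

      # Reduce a (synthetic) load-rejection voltage envelope to peak over-voltage
      # and duration above a threshold. All parameters are assumed for illustration.
      import numpy as np

      V_NOMINAL = 240.0                  # nominal RMS voltage [V] (assumed)
      FS = 10_000                        # sampling rate [Hz] (assumed)

      t = np.arange(0.0, 0.5, 1.0 / FS)
      # Synthetic envelope: breaker opens at t = 0.1 s, voltage swells to ~150% and decays.
      v_rms = V_NOMINAL * (1.0 + 0.5 * np.exp(-(t - 0.1) / 0.05) * (t >= 0.1))

      peak_pct = 100.0 * v_rms.max() / V_NOMINAL
      above = v_rms > 1.1 * V_NOMINAL    # 110% threshold (assumed)
      duration_ms = 1000.0 * np.count_nonzero(above) / FS

      print(f"peak over-voltage: {peak_pct:.1f}% of nominal")
      print(f"time above 110% of nominal: {duration_ms:.1f} ms")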

  14. Numerical and experimental investigation of an airfoil with load control in the wake of an active grid

    NASA Astrophysics Data System (ADS)

    Fischer, A.; Lutz, T.; Kramer, E.; Cordes, U.; Hufnagel, K.; Tropea, C.; Kampers, G.; Hölling, M.; Peinke, J.

    2016-09-01

    A new passive load reduction system, using coupled leading and trailing edge flaps, was developed at TU Darmstadt and investigated experimentally and numerically. The experiments were performed in the wind tunnel of the University of Oldenburg, where sinusoidal inflow conditions, representing for example the tower blockage effect, were created by means of an active grid. The numerical investigations were performed at the University of Stuttgart, using a quasi two-dimensional setup and a block structured CFD solver. In the present paper, a brief description of the experimental setup is given, whereas the numerical setup, in particular the realisation of the wind tunnel conditions, is presented in more detail. Moreover, a comparison between the measured and simulated loads for an airfoil with and without adaptive camber concept is discussed.

  15. Grid oscillators

    NASA Technical Reports Server (NTRS)

    Popovic, Zorana B.; Kim, Moonil; Rutledge, David B.

    1988-01-01

    Loading a two-dimensional grid with active devices offers a means of combining the power of solid-state oscillators in the microwave and millimeter-wave range. The grid structure allows a large number of negative resistance devices to be combined. This approach is attractive because the active devices do not require an external locking signal, and the combining is done in free space. In addition, the loaded grid is a planar structure amenable to monolithic integration. Measurements on a 25-MESFET grid at 9.7 GHz show power-combining and frequency-locking without an external locking signal, with an ERP of 37 W. Experimental far-field patterns agree with theoretical results obtained using reciprocity.

  16. Experimental Investigation of Bearing Slip in a Wind Turbine Gearbox During a Transient Grid Loss Event

    SciTech Connect

    Helsen, Jan; Guo, Yi; Keller, Jonathan; Guillaume, Patrick

    2016-12-01

    This work investigates the behaviour of the high-speed stage of a wind turbine gearbox during a transient grid loss event. Dynamometer testing on a full-scale wind turbine nacelle is used. A combination of external and internal gearbox measurements is analysed. Particular focus is on the characterization of the high-speed shaft tapered roller bearing slip behaviour. This slipping behaviour is linked to dynamic events by many researchers and described as a potential bearing failure initiator; however, only limited full-scale dynamic testing is documented. Strain gauge bridges in grooves along the circumference of the outer ring are used to characterize the bearing behaviour in detail. It is shown that during the transient event the high-speed shaft experiences a combined torsional and bending deformation. These unfavourable loading conditions induce roller slip in the bearings during the torque reversals, indicating the potential of the applied load case to exceed the preload of the tapered roller bearing.

  17. SEE-GRID eInfrastructure for Regional eScience

    NASA Astrophysics Data System (ADS)

    Prnjat, Ognjen; Balaz, Antun; Vudragovic, Dusan; Liabotis, Ioannis; Sener, Cevat; Marovic, Branko; Kozlovszky, Miklos; Neagu, Gabriel

    In the past 6 years, a number of targeted initiatives, funded by the European Commission via its information society and RTD programmes and Greek infrastructure development actions, have articulated successful regional development actions in South East Europe that can be used as a role model for other international developments. The SEEREN (South-East European Research and Education Networking initiative) project, through its two phases, established the SEE segment of the pan-European GÉANT network and successfully connected the research and scientific communities in the region. Currently, the SEE-LIGHT project is working towards establishing a dark-fiber backbone that will interconnect most national Research and Education networks in the region. On the distributed computing and storage provisioning (i.e. Grid) plane, the SEE-GRID (South-East European GRID e-Infrastructure Development) project, similarly through its two phases, has established a strong human network in the area of scientific computing, has set up a powerful regional Grid infrastructure, and has attracted a number of applications from different fields from countries throughout South-East Europe. The current SEEGRID-SCI project, ending in April 2010, empowers the regional user communities from the fields of meteorology, seismology and environmental protection in common use and sharing of the regional e-Infrastructure. Current technical initiatives in formulation are focusing on a set of coordinated actions in the area of HPC and application fields making use of HPC initiatives. Finally, the current SEERA-EI project brings together policy makers, i.e. programme managers, from 10 countries in the region. The project aims to establish a communication platform between programme managers, pave the way towards a common e-Infrastructure strategy and vision, and implement concrete actions for common funding of electronic infrastructures on the regional level. The regional vision on establishing an e

  18. Scientific Grid computing.

    PubMed

    Coveney, Peter V

    2005-08-15

    We introduce a definition of Grid computing which is adhered to throughout this Theme Issue. We compare the evolution of the World Wide Web with current aspirations for Grid computing and indicate areas that need further research and development before a generally usable Grid infrastructure becomes available. We discuss work that has been done in order to make scientific Grid computing a viable proposition, including the building of Grids, middleware developments, computational steering and visualization. We review science that has been enabled by contemporary computational Grids, and associated progress made through the widening availability of high performance computing.

  19. Distributed geant4 simulation in medical and space science applications using DIANE framework and the GRID

    NASA Astrophysics Data System (ADS)

    Mościcki, Jakub T.; Guatelli, Susanna; Mantero, Alfonso; Pia, M. G.

    2003-09-01

    Distributed computing is one of the most important trends in IT which has recently gained significance for large-scale scientific applications. Distributed Analysis Environment (DIANE) [1] is an R&D study, focusing on semi-interactive parallel and remote data analysis and simulation, which has been conducted at CERN. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error recovery policies, automatic book-keeping of distributed jobs and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations such as load-balancing services (LSF, PBS, GRID Resource Broker, Condor) and security services (GSI, Kerberos, openssh). A number of distributed Geant4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals, to globally-distributed simulations of astrophysics experiments using the European Data Grid middleware. This paper describes the general concepts behind the DIANE framework and results of the first tests with distributed Geant4 simulations.

  20. New Source Code: Spelman Women Transforming the Grid of Science and Technology

    NASA Astrophysics Data System (ADS)

    Okonkwo, Holly

    From a seminary for newly freedwomen in the 19th century "Deep South" of the United States to a "Model Institution for Excellence" in undergraduate science, technology, engineering, and math education, the narrative of Spelman College is a critical piece to understanding the overall history and socially constructed nature of science and higher education in the U.S. Making a place for science at Spelman College disrupts and redefines the presumed and acceptable roles of African American women in science and their social, political and economic engagements in U.S. society as a whole. Over the course of 16 months, I explore the narrative experiences of members of the Spelman campus community and immerse myself in the environment to experience becoming a member of a scientific community that asserts a place for women of African descent in science and technology and perceives this positionality as positive, powerful and the locus of agency. My intention is to offer this research as an in-depth ethnographic presentation of intentional science learning, knowledge production and practice as lived experiences at the multiple intersections of the constructs of race, gender, positionality and U.S. science itself. In this research, I am motivated to move the contemporary discourse of diversifying science, technology, engineering and mathematics fields in the U.S. academy beyond the chronicling of women of African descent as statistical rarities over time, as subjectivities and the deficit frameworks that theoretically encapsulate their narratives. The findings of this research demonstrate that Spelman students, staff and alumni are themselves the cultural capital that validates Spelman's identity as a place and its institutional mission, and are at the core of the institutional success of the college. It is a personal mission as much as it is an institutional mission, which is precisely what makes it powerful.

  1. Experimental Study of Homogeneous Isotropic Slowly-Decaying Turbulence in Giant Grid-Wind Tunnel Set Up

    NASA Astrophysics Data System (ADS)

    Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration

    2014-11-01

    We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow with very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely big so that, even with the large separation of scales, the smallest scales would be well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8 m diameter test section) was chosen as a limit of the biggest large scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly-decaying homogeneous isotropic turbulence in a large region of the test section, with minimal structural risk. An international team of researchers collected hot-wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot-tube anemometry, cold-wire thermometry and high-speed particle tracking data of this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next 2 years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is formed by an international team of scientists to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.
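
    For readers unfamiliar with the scales quoted above, the standard isotropic-turbulence estimates connecting them are compact enough to show directly. In the Python sketch below, the rms velocity fluctuation and dissipation rate are illustrative values chosen only so the results land near the ranges quoted in the abstract; they are not ESWIRP measurements.

      # Taylor microscale, Taylor-scale Reynolds number and Kolmogorov length from
      # assumed isotropic-turbulence quantities (illustrative values only).
      import math

      u_rms = 0.093      # rms velocity fluctuation [m/s] (assumed)
      eps = 2.1e-4       # dissipation rate of turbulent kinetic energy [m^2/s^3] (assumed)
      nu = 1.5e-5        # kinematic viscosity of air [m^2/s]

      taylor_lambda = math.sqrt(15.0 * nu * u_rms**2 / eps)   # Taylor microscale [m]
      re_lambda = u_rms * taylor_lambda / nu                  # Taylor-scale Reynolds number
      eta = (nu**3 / eps) ** 0.25                             # Kolmogorov length [m]

      print(f"lambda = {taylor_lambda * 1e3:.0f} mm, Re_lambda = {re_lambda:.0f}, eta = {eta * 1e3:.1f} mm")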

  2. Pre-Service Teachers' Use of Improvised and Virtual Laboratory Experimentation in Science Teaching

    ERIC Educational Resources Information Center

    Bhukuvhani, Crispen; Kusure, Lovemore; Munodawafa, Violet; Sana, Abel; Gwizangwe, Isaac

    2010-01-01

    This research surveyed 11 purposely sampled Bindura University of Science Education (Zimbabwe) Bachelor of Science Education Honours Part III pre-service science teachers' use of improvised and virtual laboratory experimentation in science teaching. A self-designed four-point Likert scale twenty-item questionnaire was used. SPSS Version 10 was…

  3. GOC-TX: A Reliable Ticket Synchronization Application for the Open Science Grid

    NASA Astrophysics Data System (ADS)

    Hayashi, Soichi; Gopu, Arvind; Quick, Robert

    2011-12-01

    One of the major operational issues faced by large multi-institutional collaborations is permitting its users and support staff to use their native ticket tracking environment while also exchanging these tickets with collaborators. After several failed attempts at email-parser based ticket exchanges, the OSG Operations Group has designed a comprehensive ticket synchronizing application. The GOC-TX application uses web-service interfaces offered by various commercial, open source and other homegrown ticketing systems, to synchronize tickets between two or more of these systems. GOC-TX operates independently from any ticketing system. It can be triggered by one ticketing system via email, active messaging, or a web-services call to check for current sync-status, pull applicable recent updates since prior synchronizations to the source ticket, and apply the updates to a destination ticket. The currently deployed production version of GOC-TX is able to synchronize tickets between the Numara Footprints ticketing system used by the OSG and the following systems: European Grid Initiative's system Global Grid User Support (GGUS) and the Request Tracker (RT) system used by Brookhaven. Additional interfaces to the BMC Remedy system used by Fermilab, and to other instances of RT used by other OSG partners, are expected to be completed in summer 2010. A fully configurable open source version is expected to be made available by early autumn 2010. This paper will cover the structure of the GOC-TX application, its evolution, and the problems encountered by OSG Operations group with ticket exchange within the OSG Collaboration.
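
    Stripped of the system-specific adapters, the core of such a synchronizer is a small pull-and-apply loop. The Python sketch below shows that pattern against hypothetical REST endpoints; the URLs, field names and payloads are invented for illustration and are not the actual Footprints, GGUS or RT web-service interfaces that GOC-TX uses.

      # Generic ticket-sync pass: pull updates made to a source ticket since the
      # last sync and post them to a destination ticket. Endpoints are hypothetical.
      from datetime import datetime, timezone
      import requests

      SOURCE_API = "https://source.example.org/api/tickets"   # hypothetical
      DEST_API = "https://dest.example.org/api/tickets"       # hypothetical

      def sync_ticket(source_id, dest_id, last_sync):
          """One synchronization pass; returns the new last-sync timestamp."""
          resp = requests.get(f"{SOURCE_API}/{source_id}/updates",
                              params={"since": last_sync.isoformat()}, timeout=30)
          resp.raise_for_status()
          for update in resp.json():                          # assumed: list of update dicts
              requests.post(f"{DEST_API}/{dest_id}/comments",
                            json={"body": update["text"],
                                  "author": update["author"],
                                  "origin": "sync"},
                            timeout=30).raise_for_status()
          return datetime.now(timezone.utc)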

  4. Rain, Rocks, Rockets and Reactions, Science (Experimental): 5334.04.

    ERIC Educational Resources Information Center

    Barringer, Tom

    This unit of instruction was designed for terminal science students whose interest and background in science are extremely limited. The course is presented through activity-centered study, and consists of selected topics in physical science including household chemistry, weather, geology and space science. The booklet lists the relevant…

  5. The Distinction between Experimental and Historical Sciences as a Framework for Improving Classroom Inquiry

    ERIC Educational Resources Information Center

    Gray, Ron

    2014-01-01

    Inquiry experiences in secondary science classrooms are heavily weighted toward experimentation. We know, however, that many fields of science (e.g., evolutionary biology, cosmology, and paleontology), while they may utilize experiments, are not justified by experimental methodologies. With the focus on experimentation in schools, these fields of…

  6. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation, and Research

    NASA Astrophysics Data System (ADS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-06-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing -1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.
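
    A hedged Python sketch of how such a benchmark scene might be loaded and a crude cloud mask derived is given below; the file name, band assignment and brightness threshold are hypothetical and are not IMAGESEER's actual layout or its cloud-detection benchmark algorithm.

      # Open a (hypothetical) GeoTIFF scene and flag bright pixels as potential cloud.
      import numpy as np
      import rasterio

      with rasterio.open("landsat7_mountains_scene.tif") as src:    # hypothetical file
          print(src.count, "bands,", src.width, "x", src.height, "pixels, CRS:", src.crs)
          blue = src.read(1).astype(np.float32)                     # band 1 assumed to be blue

      # Very naive cloud mask: the brightest 5% of pixels in the blue band.
      cloud_mask = blue > np.percentile(blue, 95)
      print(f"flagged {cloud_mask.mean() * 100:.1f}% of pixels as potential cloud")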

  7. NASA IMAGESEER: NASA IMAGEs for Science, Education, Experimentation and Research

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Grubb, Thomas G.; Milner, Barbara C.

    2012-01-01

    A number of web-accessible databases, including medical, military or other image data, offer universities and other users the ability to teach or research new Image Processing techniques on relevant and well-documented data. However, NASA images have traditionally been difficult for researchers to find, are often only available in hard-to-use formats, and do not always provide sufficient context and background for a non-NASA Scientist user to understand their content. The new IMAGESEER (IMAGEs for Science, Education, Experimentation and Research) database seeks to address these issues. Through a graphically-rich web site for browsing and downloading all of the selected datasets, benchmarks, and tutorials, IMAGESEER provides a widely accessible database of NASA-centric, easy to read, image data for teaching or validating new Image Processing algorithms. As such, IMAGESEER fosters collaboration between NASA and research organizations while simultaneously encouraging development of new and enhanced Image Processing algorithms. The first prototype includes a representative sampling of NASA multispectral and hyperspectral images from several Earth Science instruments, along with a few small tutorials. Image processing techniques are currently represented with cloud detection, image registration, and map cover/classification. For each technique, corresponding data are selected from four different geographic regions, i.e., mountains, urban, water coastal, and agriculture areas. Satellite images have been collected from several instruments - Landsat-5 and -7 Thematic Mappers, Earth Observing-1 (EO-1) Advanced Land Imager (ALI) and Hyperion, and the Moderate Resolution Imaging Spectroradiometer (MODIS). After geo-registration, these images are available in simple common formats such as GeoTIFF and raw formats, along with associated benchmark data.

  8. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    SciTech Connect

    Jablonowski, Christiane

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have lain on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  9. Dancing on the Grid: using e-Science tools to extend choreographic research.

    PubMed

    Bailey, Helen; Bachler, Michelle; Buckingham Shum, Simon; Le Blanc, Anja; Popat, Sita; Rowley, Andrew; Turner, Martin

    2009-07-13

    This paper considers the role and impact of new and emerging e-Science tools on practice-led research in dance. Specifically, it draws on findings from the e-Dance project. This 2-year project brings together an interdisciplinary team combining research aspects of choreography, next generation of videoconferencing and human-computer interaction analysis incorporating hypermedia and nonlinear annotations for recording and documentation.

  10. Students' epistemologies about experimental physics: Validating the Colorado Learning Attitudes about Science Survey for experimental physics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-06-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder and elsewhere, we developed the Colorado Learning Attitudes about Science Survey for experimental physics (E-CLASS). Previous work with this assessment has included establishing the accuracy and clarity of the instrument through student interviews and preliminary testing. Several years of data collection at multiple institutions has resulted in a growing national data set of student responses. Here, we report on results of the analysis of these data to investigate the statistical validity and reliability of the E-CLASS as a measure of students' epistemologies for a broad student population. We find that the E-CLASS demonstrates an acceptable level of both validity and reliability on measures of item and test discrimination, test-retest reliability, partial-sample reliability, internal consistency, concurrent validity, and convergent validity. We also examine students' responses using principal component analysis and find that, as expected, the E-CLASS does not exhibit strong factors (a.k.a. categories).
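
    One of the reliability measures named above, internal consistency, is conventionally summarized with Cronbach's alpha, which the short Python sketch below computes on a students-by-items score matrix. The data are synthetic (a shared latent "attitude" plus noise) and stand in for coded E-CLASS responses; a coherent instrument typically yields alpha above roughly 0.7.

      # Cronbach's alpha for a (synthetic) students x items score matrix.
      import numpy as np

      rng = np.random.default_rng(0)
      n_students, n_items = 300, 30
      attitude = rng.normal(size=(n_students, 1))                     # latent trait
      scores = attitude + rng.normal(scale=1.0, size=(n_students, n_items))

      def cronbach_alpha(x):
          k = x.shape[1]
          item_vars = x.var(axis=0, ddof=1)
          total_var = x.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

      print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")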

  11. Experimental Comparison of Inquiry and Direct Instruction in Science

    ERIC Educational Resources Information Center

    Cobern, William W.; Schuster, David; Adams, Betty; Applegate, Brooks; Skjold, Brandy; Undreiu, Adriana; Loving, Cathleen C.; Gobert, Janice D.

    2010-01-01

    There are continuing educational and political debates about "inquiry" versus "direct" teaching of science. Traditional science instruction has been largely direct but in the US, recent national and state science education standards advocate inquiry throughout K-12 education. While inquiry-based instruction has the advantage of modelling aspects…

  12. FermiGrid

    SciTech Connect

    Yocum, D.R.; Berman, E.; Canal, P.; Chadwick, K.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Sharma, N.; Timm, S.; /Fermilab

    2007-05-01

    As one of the founding members of the Open Science Grid Consortium (OSG), Fermilab enables coherent access to its production resources through the Grid infrastructure system called FermiGrid. This system successfully provides for centrally managed grid services, opportunistic resource access, development of OSG Interfaces for Fermilab, and an interface to the Fermilab dCache system. FermiGrid supports virtual organizations (VOs) including high energy physics experiments (USCMS, MINOS, D0, CDF, ILC), astrophysics experiments (SDSS, Auger, DES), biology experiments (GADU, Nanohub) and educational activities.

  13. Grid Architecture 2

    SciTech Connect

    Taft, Jeffrey D.

    2016-01-01

    The report describes work done on Grid Architecture under the auspices of the Department of Energy Office of Electricity Delivery and Energy Reliability in 2015. As described in the first Grid Architecture report, the primary purpose of this work is to provide stakeholder insight about grid issues so as to enable superior decision making on their part. Doing this requires the creation of various work products, including oft-times complex diagrams, analyses, and explanations. This report provides architectural insights into several important grid topics and also describes work done to advance the science of Grid Architecture.

  14. Experimental Investigation of the Behavior of Sub-Grid Scale Motions in Turbulent Shear Flow

    NASA Technical Reports Server (NTRS)

    Cantwell, Brian

    1992-01-01

    Experiments have been carried out on a vertical jet of helium issuing into a co-flow of air at a fixed exit velocity ratio of 2.0. At all the experimental conditions studied, the flow exhibits a strong self excited periodicity. The natural frequency behavior of the jet, the underlying fine-scale flow structure, and the transition to turbulence have been studied over a wide range of flow conditions. The experiments were conducted in a variable pressure facility which made it possible to vary the Reynolds number and Richardson number independently. A stroboscopic schlieren system was used for flow visualization and single-component Laser Doppler Anemometry was used to measure the axial component of velocity. The flow exhibits several interesting features. The presence of co-flow eliminates the random meandering typical of buoyant plumes in a quiescent environment and the periodicity of the helium jet under high Richardson number conditions is striking. Under these conditions transition to turbulence consists of a rapid but highly structured and repeatable breakdown and intermingling of jet and freestream fluid. At Ri = 1.6 the three-dimensional structure of the flow is seen to repeat from cycle to cycle. The point of transition moves closer to the jet exit as either the Reynolds number or the Richardson number increases. The wavelength of the longitudinal instability increases with Richardson number. At low Richardson numbers, the natural frequency scales on an inertial time scale. At high Richardson number the natural frequency scales on a buoyancy time scale. The transition from one flow regime to another occurs over a narrow range of Richardson numbers from 0.7 to 1. A buoyancy Strouhal number is used to correlate the high Richardson number frequency behavior.
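
    The two dimensionless groups that organize these observations, a densimetric Richardson number and a Strouhal number built on the jet exit scales, are shown in the Python sketch below. The definitions are the common textbook forms and the numerical values are assumptions for illustration; the paper's exact definitions and operating conditions may differ.

      # Densimetric Richardson and Strouhal numbers for a buoyant helium jet
      # (illustrative values and common textbook definitions).
      g = 9.81           # gravitational acceleration [m/s^2]
      rho_air = 1.20     # ambient air density [kg/m^3] (assumed)
      rho_he = 0.166     # helium density at the same conditions [kg/m^3] (assumed)
      D = 0.01           # jet exit diameter [m] (assumed)
      U_jet = 0.5        # jet exit velocity [m/s] (assumed)
      f_osc = 12.0       # observed oscillation frequency [Hz] (assumed)

      Ri = g * (rho_air - rho_he) * D / (rho_he * U_jet**2)
      St = f_osc * D / U_jet

      print(f"Ri = {Ri:.2f}, St = {St:.2f}")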

  15. Real-World Experimentation Comparing Time-Sharing and Batch Processing in Teaching Computer Science,

    DTIC Science & Technology

    effectiveness of time-sharing and batch processing in teaching computer science . The experimental design was centered on direct, ’real world’ comparison...ALGOL). The experimental sample involved all introductory computer science courses with a total population of 415 cadets. The results generally

  16. The DOE SunShot Initiative: Science and Technology to enable Solar Electricity at Grid Parity

    NASA Astrophysics Data System (ADS)

    Ramesh, Ramamoorthy

    2012-02-01

    The SunShot Initiative's mission is to develop solar energy technologies through a collaborative national push to make solar Photovoltaic (PV) and Concentrated Solar Power (CSP) energy technologies cost-competitive with fossil-fuel-based energy by reducing the cost of solar energy systems by ~75 percent before 2020. Reducing the total installed cost for utility-scale solar electricity to roughly 6 cents per kilowatt-hour ($1/Watt) without subsidies will result in rapid, large-scale adoption of solar electricity across the United States and the world. Achieving this goal will require significant reductions and technological innovations in all PV system components, namely modules, power electronics, and balance of systems (BOS), which includes all other components and costs required for a fully installed system, including permitting and inspection costs. This investment will re-establish American technological and market leadership, improve the nation's energy security, strengthen U.S. economic competitiveness and catalyze domestic economic growth in the global clean energy race. SunShot is a cooperative program across DOE, involving the Office of Science, the Office of Energy Efficiency and Renewable Energy and ARPA-E.

  17. AGILE/GRID Science Alert Monitoring System: The Workflow and the Crab Flare Case

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Conforti, V.; Parmiggiani, N.

    2013-10-01

    During the first five years of the AGILE mission we have observed many gamma-ray transients of Galactic and extragalactic origin. A fast reaction to unexpected transient events is a crucial part of the AGILE monitoring program, because the follow-up of astrophysical transients is a key point for this space mission. We present the workflow and the software developed by the AGILE Team to perform the automatic analysis for the detection of gamma-ray transients. In addition, an iPhone app will be released, enabling the Team to access the monitoring system through mobile phones. In 2010 September the science alert monitoring system presented in this paper recorded a transient phenomenon from the Crab Nebula, generating an automated alert sent via email and SMS two hours after the end of an AGILE satellite orbit, i.e. two hours after the Crab flare itself: for this discovery AGILE won the 2012 Bruno Rossi prize. The design of this alert system is optimized for maximum speed, and in this, as in many other cases, AGILE has demonstrated that the reaction speed of the monitoring system is crucial for the scientific return of the mission.
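
    The abstract describes the alert workflow only at a high level. The fragment below is a hypothetical illustration of such an end-of-orbit alert loop; the significance threshold, addresses, and the run_detection_analysis helper are invented for the example and are not the AGILE Team's code.

        # Hypothetical sketch of an end-of-orbit science-alert loop (not AGILE flight software).
        import smtplib
        from email.message import EmailMessage

        SIGMA_THRESHOLD = 5.0                          # assumed detection-significance cut
        RECIPIENTS = ["duty-scientist@example.org"]    # placeholder addresses

        def run_detection_analysis(orbit_data):
            """Placeholder for the automated transient search over one orbit of data."""
            return [{"source": "Crab", "sigma": 6.2}]

        def send_alert(candidate):
            msg = EmailMessage()
            msg["Subject"] = f"Science alert: {candidate['source']} at {candidate['sigma']:.1f} sigma"
            msg["From"] = "alert-monitor@example.org"
            msg["To"] = ", ".join(RECIPIENTS)
            msg.set_content(f"Automated detection: {candidate}")
            with smtplib.SMTP("localhost") as smtp:    # assumes a local mail relay
                smtp.send_message(msg)

        def process_orbit(orbit_data):
            # Called once per downlinked orbit; alerts only on significant candidates.
            for candidate in run_detection_analysis(orbit_data):
                if candidate["sigma"] >= SIGMA_THRESHOLD:
                    send_alert(candidate)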

  18. Experimental Comparison of Inquiry and Direct Instruction in Science

    ERIC Educational Resources Information Center

    Cobern, William; Schuster, David; Adams, Betty

    2010-01-01

    It is evident that "experientially-based" instruction and "active student engagement" are advantageous for effective science learning. However, "hands-on" and "minds-on" aspects can occur in both inquiry and direct science instruction, and convincing comparative evidence for the superiority of either mode…

  19. Vanguard: A New Science Mission For Experimental Astrobiology

    NASA Astrophysics Data System (ADS)

    Ellery, A.; Wynn-Williams, D.; Edwards, H.; Dickensheets, D.; Welch, C.; Curley, A.

    As an alternative to technically and financially problematic sample return missions, a rover-mounted laser Raman spectrometer sensitive to biomolecules and their mineral substrata is a promising approach in the search for evidence of former life on Mars. We present a new remote in situ analysis package being designed for experimental astrobiology on terrestrial-type planetary surfaces. The science is based on the hypothesis that if life arose on Mars, the selective pressure of solar radiation would have led to the evolution of pigmented systems to harness the energy of sunlight and to protect cells from concurrent UV stress. Microbial communities would have therefore become stratified by the light gradient, and our remote system would penetrate the near-subsurface profile in a vertical transect of horizontal strata in ancient sediments (such as palaeolake beds). The system will include an extensive array of robotic support to translocate and deploy Raman spectrometer detectors beneath the surface of Mars; it will comprise a base station lander to support communications, a robotic micro-rover to permit well-separated triplicate profiles made by three ground-penetrating moles mounted in a vertical configuration. Each mole will deploy a tether carrying fibre optic cables coupling the Raman spectrometer onboard the rover to the side-scanning sensor head on the mole. The complete system has been named Vanguard, and it represents a close collaboration between a space robotics engineer (Ellery), an astrobiologist (Wynn-Williams), a molecular spectroscopist (Edwards), an opto-electronic technologist (Dickensheets), a spacecraft engineer (Welch) and a robotic vision specialist (Curley). The autonomy requirement for the Vanguard instrument demands that significant scientific competence is imparted to the instrument through an expert system to ensure that quick-look analysis is performed onboard in real-time as the mole penetrates beneath the surface. Onboard

  20. Central Computer Science Concepts to Research-Based Teacher Training in Computer Science: An Experimental Study

    ERIC Educational Resources Information Center

    Zendler, Andreas; Klaudt, Dieter

    2012-01-01

    The significance of computer science for economics and society is undisputed. In particular, computer science is acknowledged to play a key role in schools (e.g., by opening multiple career paths). The provision of effective computer science education in schools is dependent on teachers who are able to properly represent the discipline and whose…

  1. An 11-year global gridded aerosol optical thickness reanalysis (v1.0) for atmospheric and climate sciences

    NASA Astrophysics Data System (ADS)

    Lynch, Peng; Reid, Jeffrey S.; Westphal, Douglas L.; Zhang, Jianglong; Hogan, Timothy F.; Hyer, Edward J.; Curtis, Cynthia A.; Hegg, Dean A.; Shi, Yingxi; Campbell, James R.; Rubin, Juli I.; Sessions, Walter R.; Turk, F. Joseph; Walker, Annette L.

    2016-04-01

    While stand alone satellite and model aerosol products see wide utilization, there is a significant need in numerous atmospheric and climate applications for a fused product on a regular grid. Aerosol data assimilation is an operational reality at numerous centers, and like meteorological reanalyses, aerosol reanalyses will see significant use in the near future. Here we present a standardized 2003-2013 global 1 × 1° and 6-hourly modal aerosol optical thickness (AOT) reanalysis product. This data set can be applied to basic and applied Earth system science studies of significant aerosol events, aerosol impacts on numerical weather prediction, and electro-optical propagation and sensor performance, among other uses. This paper describes the science of how to develop and score an aerosol reanalysis product. This reanalysis utilizes a modified Navy Aerosol Analysis and Prediction System (NAAPS) at its core and assimilates quality controlled retrievals of AOT from the Moderate Resolution Imaging Spectroradiometer (MODIS) on Terra and Aqua and the Multi-angle Imaging SpectroRadiometer (MISR) on Terra. The aerosol source functions, including dust and smoke, were regionally tuned to obtain the best match between the model fine- and coarse-mode AOTs and the Aerosol Robotic Network (AERONET) AOTs. Other model processes, including deposition, were tuned to minimize the AOT difference between the model and satellite AOT. Aerosol wet deposition in the tropics is driven with satellite-retrieved precipitation, rather than the model field. The final reanalyzed fine- and coarse-mode AOT at 550 nm is shown to have good agreement with AERONET observations, with global mean root mean square error around 0.1 for both fine- and coarse-mode AOTs. This paper includes a discussion of issues particular to aerosol reanalyses that make them distinct from standard meteorological reanalyses, considerations for extending such a reanalysis outside of the NASA A-Train era, and examples of how
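
    For reference, the skill metric quoted above is presumably the usual root mean square error between reanalysis and AERONET AOT at 550 nm; the standard definition (not reproduced from the record itself) is

        \mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(\tau_{550,i}^{\mathrm{reanalysis}} - \tau_{550,i}^{\mathrm{AERONET}}\right)^{2}}

    evaluated separately for the fine- and coarse-mode AOTs.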

  2. Physics design of a 100 keV acceleration grid system for the diagnostic neutral beam for international tokamak experimental reactor.

    PubMed

    Singh, M J; De Esch, H P L

    2010-01-01

    This paper describes the physics design of a 100 keV, 60 A H(-) accelerator for the diagnostic neutral beam (DNB) for international tokamak experimental reactor (ITER). The accelerator is a three-grid system comprising 1280 apertures, grouped in 16 groups with 80 apertures per beam group. Several computer codes have been used to optimize the design which follows the same philosophy as the ITER Design Description Document (DDD) 5.3 and the 1 MeV heating and current drive beam line [R. Hemsworth, H. Decamps, J. Graceffa, B. Schunke, M. Tanaka, M. Dremel, A. Tanga, H. P. L. De Esch, F. Geli, J. Milnes, T. Inoue, D. Marcuzzi, P. Sonato, and P. Zaccaria, Nucl. Fusion 49, 045006 (2009)]. The aperture shapes, intergrid distances, and the extractor voltage have been optimized to minimize the beamlet divergence. To suppress the acceleration of coextracted electrons, permanent magnets have been incorporated in the extraction grid, downstream of the cooling water channels. The electron power loads on the extractor and the grounded grids have been calculated assuming 1 coextracted electron per ion. The beamlet divergence is calculated to be 4 mrad. At present the design for the filter field of the RF based ion sources for ITER is not fixed; therefore, a few configurations of the same have been considered. Their effect on the transmission of the electrons and beams through the accelerator has been studied. The OPERA-3D code has been used to estimate the aperture offset steering constant of the grounded grid and the extraction grid, the space charge interaction between the beamlets and the kerb design required to compensate for this interaction. All beamlets in the DNB must be focused to a single point in the duct, 20.665 m from the grounded grid, and the required geometrical aimings and aperture offsets have been calculated.

  3. Physics design of a 100 keV acceleration grid system for the diagnostic neutral beam for international tokamak experimental reactor

    NASA Astrophysics Data System (ADS)

    Singh, M. J.; De Esch, H. P. L.

    2010-01-01

    This paper describes the physics design of a 100 keV, 60 A H- accelerator for the diagnostic neutral beam (DNB) for international tokamak experimental reactor (ITER). The accelerator is a three-grid system comprising 1280 apertures, grouped in 16 groups with 80 apertures per beam group. Several computer codes have been used to optimize the design which follows the same philosophy as the ITER Design Description Document (DDD) 5.3 and the 1 MeV heating and current drive beam line [R. Hemsworth, H. Decamps, J. Graceffa, B. Schunke, M. Tanaka, M. Dremel, A. Tanga, H. P. L. De Esch, F. Geli, J. Milnes, T. Inoue, D. Marcuzzi, P. Sonato, and P. Zaccaria, Nucl. Fusion 49, 045006 (2009)]. The aperture shapes, intergrid distances, and the extractor voltage have been optimized to minimize the beamlet divergence. To suppress the acceleration of coextracted electrons, permanent magnets have been incorporated in the extraction grid, downstream of the cooling water channels. The electron power loads on the extractor and the grounded grids have been calculated assuming 1 coextracted electron per ion. The beamlet divergence is calculated to be 4 mrad. At present the design for the filter field of the RF based ion sources for ITER is not fixed; therefore, a few configurations of the same have been considered. Their effect on the transmission of the electrons and beams through the accelerator has been studied. The OPERA-3D code has been used to estimate the aperture offset steering constant of the grounded grid and the extraction grid, the space charge interaction between the beamlets and the kerb design required to compensate for this interaction. All beamlets in the DNB must be focused to a single point in the duct, 20.665 m from the grounded grid, and the required geometrical aimings and aperture offsets have been calculated.

  4. Challenges facing production grids

    SciTech Connect

    Pordes, Ruth; /Fermilab

    2007-06-01

    Today's global communities of users expect quality of service from distributed Grid systems equivalent to that of their local data centers. This must be coupled to ubiquitous access to the ensemble of processing and storage resources across multiple Grid infrastructures. We are still facing significant challenges in meeting these expectations, especially in the underlying security, a sustainable and successful economic model, and smoothing the boundaries between administrative and technical domains. Using the Open Science Grid as an example, I examine the status and challenges of Grids operating in production today.

  5. Learning Political Science with Prediction Markets: An Experimental Study

    ERIC Educational Resources Information Center

    Ellis, Cali Mortenson; Sami, Rahul

    2012-01-01

    Prediction markets are designed to aggregate the information of many individuals to forecast future events. These markets provide participants with an incentive to seek information and a forum for interaction, making markets a promising tool to motivate student learning. We carried out a quasi-experiment in an introductory political science class…

  6. Early Adolescence: Using Consumer Science to Develop Experimental Techniques.

    ERIC Educational Resources Information Center

    Padilla, Michael

    1981-01-01

    Describes several consumer science activities useful for introducing process skills for the middle/junior high school student. Activities described include testing laundry detergent effectiveness for stain removal, comparison of quantities in fast foods, and various activities concerning tests of product claims. (DS)

  7. Accounting for reciprocal host-microbiome interactions in experimental science.

    PubMed

    Stappenbeck, Thaddeus S; Virgin, Herbert W

    2016-06-09

    Mammals are defined by their metagenome, a combination of host and microbiome genes. This knowledge presents opportunities to further basic biology with translation to human diseases. However, the now-documented influence of the metagenome on experimental results and the reproducibility of in vivo mammalian models present new challenges. Here we provide the scientific basis for calling on all investigators, editors and funding agencies to embrace changes that will enhance reproducible and interpretable experiments by accounting for metagenomic effects. Implementation of new reporting and experimental design principles will improve experimental work, speed discovery and translation, and properly use substantial investments in biomedical research.

  8. Experimental systems overview of the Rare Isotope Science Project in Korea

    NASA Astrophysics Data System (ADS)

    Tshoo, K.; Kim, Y. K.; Kwon, Y. K.; Woo, H. J.; Kim, G. D.; Kim, Y. J.; Kang, B. H.; Park, S. J.; Park, Y.-H.; Yoon, J. W.; Kim, J. C.; Lee, J. H.; Seo, C. S.; Hwang, W.; Yun, C. C.; Jeon, D.; Kim, S. K.

    2013-12-01

    The Rare Isotope Science Project (RISP) was launched by the Institute for Basic Science (IBS) in December 2011 in Korea. The project aims to construct a new accelerator complex consisting of the Isotope Separation On-Line (ISOL) and In-Flight Fragment (IF) facilities for rare isotope science. The scientific programs and the experimental systems of RISP are briefly introduced with an overview of the complex.

  9. Experimental Evaluation of PV Inverter Anti-Islanding with Grid Support Functions in Multi-Inverter Island Scenarios

    SciTech Connect

    Hoke, Anderson; Nelson, Austin; Miller, Brian; Chakraborty, Sudipta; Bell, Frances; McCarty, Michael

    2016-07-01

    As PV and other DER systems are connected to the grid at increased penetration levels, island detection may become more challenging for two reasons: 1.) In islands containing many DERs, active inverter-based anti-islanding methods may have more difficulty detecting islands because each individual inverter's efforts to detect the island may be interfered with by the other inverters in the island. 2.) The increasing numbers of DERs are leading to new requirements that DERs ride through grid disturbances and even actively try to regulate grid voltage and frequency back towards nominal operating conditions. These new grid support requirements may directly or indirectly interfere with anti-islanding controls. This report describes a series of tests designed to examine the impacts of both grid support functions and multi-inverter islands on anti-islanding effectiveness. Crucially, the multi-inverter anti-islanding tests described in this report examine scenarios with multiple inverters connected to multiple different points on the grid. While this so-called 'solar subdivision' scenario has been examined to some extent through simulation, this is the first known work to test it using hardware inverters. This was accomplished through the use of power hardware-in-the-loop (PHIL) simulation, which allows the hardware inverters to be connected to a real-time transient simulation of an electric power system that can be easily reconfigured to test various distribution circuit scenarios. The anti-islanding test design was a modified version of the unintentional islanding test in IEEE Standard 1547.1, which creates a balanced, resonant island with the intent of creating a highly challenging condition for island detection. Three common, commercially available single-phase PV inverters from three different manufacturers were tested. The first part of this work examined each inverter individually using a series of pure hardware resistive-inductive-capacitive (RLC) resonant load
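
    The IEEE 1547.1-style unintentional-islanding test mentioned above uses a parallel RLC load tuned to the grid frequency, with a quality factor near 1.0 and real power matched to the inverter output. The sketch below sizes such a load from those relations; the inverter rating and nominal voltage are illustrative assumptions, not values from the report.

        # Sizing a parallel RLC resonant island load for an IEEE 1547.1-style test.
        # Inverter rating and nominal voltage below are example values only.
        import math

        def rlc_island_load(p_watts, v_rms, f0=60.0, qf=1.0):
            """Return (R, L, C) for a parallel RLC load resonant at f0 with quality factor qf."""
            r = v_rms ** 2 / p_watts      # resistor absorbs the matched real power
            w0 = 2.0 * math.pi * f0
            l = r / (w0 * qf)             # from qf = R * sqrt(C / L) and w0 = 1 / sqrt(L * C)
            c = qf / (w0 * r)
            return r, l, c

        r, l, c = rlc_island_load(p_watts=5000.0, v_rms=240.0)   # hypothetical 5 kW inverter
        print(f"R = {r:.2f} ohm, L = {l * 1e3:.2f} mH, C = {c * 1e6:.1f} uF")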

  10. The First Language in Science Class: A Quasi-Experimental Study in Late French Immersion

    ERIC Educational Resources Information Center

    Turnbull, Miles; Cormier, Marianne; Bourque, Jimmy

    2011-01-01

    This article reports analysis of data collected from a quasi-experimental study in 2 Canadian late French immersion science classes. We examine if, how, and when the first language (L1) is used when students in the first years of their second language learning talk about complex science concepts. We compare differences in groups following a…

  11. Improving plant bioaccumulation science through consistent reporting of experimental data.

    PubMed

    Fantke, Peter; Arnot, Jon A; Doucette, William J

    2016-10-01

    Experimental data and models for plant bioaccumulation of organic contaminants play a crucial role for assessing the potential human and ecological risks associated with chemical use. Plants are receptor organisms and direct or indirect vectors for chemical exposures to all other organisms. As new experimental data are generated they are used to improve our understanding of plant-chemical interactions that in turn allows for the development of better scientific knowledge and conceptual and predictive models. The interrelationship between experimental data and model development is an ongoing, never-ending process needed to advance our ability to provide reliable quality information that can be used in various contexts including regulatory risk assessment. However, relatively few standard experimental protocols for generating plant bioaccumulation data are currently available and because of inconsistent data collection and reporting requirements, the information generated is often less useful than it could be for direct applications in chemical assessments and for model development and refinement. We review existing testing guidelines, common data reporting practices, and provide recommendations for revising testing guidelines and reporting requirements to improve bioaccumulation knowledge and models. This analysis provides a list of experimental parameters that will help to develop high quality datasets and support modeling tools for assessing bioaccumulation of organic chemicals in plants and ultimately addressing uncertainty in ecological and human health risk assessments.

  12. Safe Grid

    NASA Technical Reports Server (NTRS)

    Chow, Edward T.; Stewart, Helen; Korsmeyer, David (Technical Monitor)

    2003-01-01

    The biggest users of GRID technologies came from the science and technology communities. These consist of government, industry and academia (national and international). The NASA GRID is moving into a higher technology readiness level (TRL) today; and as a joint effort among these leaders within government, academia, and industry, the NASA GRID plans to extend availability to enable scientists and engineers across these geographical boundaries to collaborate to solve important problems facing the world in the 21st century. In order to enable NASA programs and missions to use IPG resources for program and mission design, the IPG capabilities need to be accessible from inside the NASA center networks. However, because different NASA centers maintain different security domains, the GRID penetration across different firewalls is a concern for center security people. This is the reason why some IPG resources have been separated from the NASA center network. Also, because of the center network security and ITAR concerns, the NASA IPG resource owner may not have full control over who can access remotely from outside the NASA center. In order to obtain organizational approval for secured remote access, the IPG infrastructure needs to be adapted to work with the NASA business process. Improvements need to be made before the IPG can be used for NASA program and mission development. The Secured Advanced Federated Environment (SAFE) technology is designed to provide federated security across NASA center and NASA partner security domains. Instead of one giant center firewall, which can be difficult to modify for different GRID applications, the SAFE "micro security domain" approach provides a large number of professionally managed "micro firewalls" that can allow NASA centers to accept remote IPG access without the worry of damaging other center resources. The SAFE policy-driven capability-based federated security mechanism can enable joint organizational and resource owner approved remote

  13. The Earth System Grid Federation (ESGF): Climate Science Infrastructure for Large-scale Data Management and Dissemination

    NASA Astrophysics Data System (ADS)

    Williams, D. N.

    2015-12-01

    Progress in understanding and predicting climate change requires advanced tools to securely store, manage, access, process, analyze, and visualize enormous and distributed data sets. Only then can climate researchers understand the effects of climate change across all scales and use this information to inform policy decisions. With the advent of major international climate modeling intercomparisons, a need emerged within the climate-change research community to develop efficient, community-based tools to obtain relevant meteorological and other observational data, develop custom computational models, and export analysis tools for climate-change simulations. While many nascent efforts to fill these gaps appeared, they were not integrated and therefore did not benefit from collaborative development. Sharing huge data sets was difficult, and the lack of data standards prevented the merger of output data from different modeling groups. Thus began one of the largest-ever collaborative data efforts in climate science, resulting in the Earth System Grid Federation (ESGF), which is now used to disseminate model, observational, and reanalysis data for research assessed by the Intergovernmental Panel on Climate Change (IPCC). Today, ESGF is an open-source petabyte-level data storage and dissemination operational code-base that manages secure resources essential for climate change study. It is designed to remain robust even as data volumes grow exponentially. The internationally distributed, peer-to-peer ESGF "data cloud" archive represents the culmination of an effort that began in the late 1990s. ESGF portals are gateways to scientific data collections hosted at sites around the globe that allow the user to register and potentially access the entire ESGF network of data and services. The growing international interest in ESGF development efforts has attracted many others who want to make their data more widely available and easy to use. For example, the World Climate

  14. Is Physicality an Important Aspect of Learning through Science Experimentation among Kindergarten Students?

    ERIC Educational Resources Information Center

    Zacharia, Zacharias C.; Loizou, Eleni; Papaevripidou, Marios

    2012-01-01

    The purpose of this study was to investigate whether physicality (actual and active touch of concrete material), as such, is a necessity for science experimentation learning at the kindergarten level. We compared the effects of student experimentation with Physical Manipulatives (PM) and Virtual Manipulatives (VM) on kindergarten students'…

  15. The Beliefs and Behaviors of Pupils in an Experimental School: The Science Lab.

    ERIC Educational Resources Information Center

    Lancy, David F.

    This booklet, the second in a series, reports on the results of a year-long research project conducted in an experimental school associated with the Learning Research and Development Center, University of Pittsburgh. Specifically, this is a report of findings pertaining to one major setting in the experimental school, the science lab. The science…

  16. Experimenter Confirmation Bias and the Correction of Science Misconceptions

    NASA Astrophysics Data System (ADS)

    Allen, Michael; Coole, Hilary

    2012-06-01

    This paper describes a randomised educational experiment ( n = 47) that examined two different teaching methods and compared their effectiveness at correcting one science misconception using a sample of trainee primary school teachers. The treatment was designed to promote engagement with the scientific concept by eliciting emotional responses from learners that were triggered by their own confirmation biases. The treatment group showed superior learning gains to control at post-test immediately after the lesson, although benefits had dissipated after 6 weeks. Findings are discussed with reference to the conceptual change paradigm and to the importance of feeling emotion during a learning experience, having implications for the teaching of pedagogies to adults that have been previously shown to be successful with children.

  17. A New Virtual and Remote Experimental Environment for Teaching and Learning Science

    NASA Astrophysics Data System (ADS)

    Lustigova, Zdena; Lustig, Frantisek

    This paper describes how a scientifically exact and problem-solving-oriented remote and virtual science experimental environment might help to build a new strategy for science education. The main features are: the remote observation and control of real-world phenomena, their processing and evaluation, and verification of hypotheses combined with the development of critical thinking, supported by sophisticated tools for searching, classifying and storing relevant information and by a collaborative environment supporting argumentative writing and teamwork, public presentations and defense of achieved results, all either in real presence, in telepresence or in a combination of both. Only then can a real understanding of generalized science laws and their consequences be developed. This science learning and teaching environment (called ROL - Remote and Open Laboratory) has been developed and used by Charles University in Prague since 1996, offered to science students in both formal and informal learning, and also to science teachers within their professional development studies since 2003.

  18. Nonlethal suppression: from basic science to operationally relevant experimentation

    NASA Astrophysics Data System (ADS)

    Servatius, Richard J.; Beck, Kevin D.

    2006-05-01

    Use of force justification, second nature to law enforcement personnel, is increasingly considered by military personnel especially in military operations on urban terrain (MOUT) scenarios. In these situations, military and civilian law enforcement objectives are similar: exert control over individuals and groups with minimum force. Although the list of potential devices and systems grow, empirical demonstrations of effectiveness are lacking. Here, a position is presented regarding approaches to experimental analysis of nonlethal (a.k.a., less-than-lethal and less lethal) technologies and solutions. Appreciation of the concepts of suppression and its attendant behavioral variables will advance the development of nonlethal weapons and systems (NLW&S).

  19. Students' Epistemologies about Experimental Physics: Validating the Colorado Learning Attitudes about Science Survey for Experimental Physics

    ERIC Educational Resources Information Center

    Wilcox, Bethany R.; Lewandowski, H. J.

    2016-01-01

    Student learning in instructional physics labs represents a growing area of research that includes investigations of students' beliefs and expectations about the nature of experimental physics. To directly probe students' epistemologies about experimental physics and support broader lab transformation efforts at the University of Colorado Boulder…

  20. Social Science and Neuroscience beyond Interdisciplinarity: Experimental Entanglements.

    PubMed

    Fitzgerald, Des; Callard, Felicity

    2015-01-01

    This article is an account of the dynamics of interaction across the social sciences and neurosciences. Against an arid rhetoric of 'interdisciplinarity', it calls for a more expansive imaginary of what experiment - as practice and ethos - might offer in this space. Arguing that opportunities for collaboration between social scientists and neuroscientists need to be taken seriously, the article situates itself against existing conceptualizations of these dynamics, grouping them under three rubrics: 'critique', 'ebullience' and 'interaction'. Despite their differences, each insists on a distinction between sociocultural and neurobiological knowledge, or does not show how a more entangled field might be realized. The article links this absence to the 'regime of the inter-', an ethic of interdisciplinarity that guides interaction between disciplines on the understanding of their pre-existing separateness. The argument of the paper is thus twofold: (1) that, contra the 'regime of the inter-', it is no longer practicable to maintain a hygienic separation between sociocultural webs and neurobiological architecture; (2) that the cognitive neuroscientific experiment, as a space of epistemological and ontological excess, offers an opportunity to researchers, from all disciplines, to explore and register this realization.

  1. Considerations for Life Science experimentation on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Souza, K. A.; Davies, P.; Rossberg Walker, K.

    1992-01-01

    The conduct of Life Science experiments aboard the Shuttle Spacelab presents unaccustomed challenges to scientists. Not only is one confronted with the challenge of conducting an experiment in the unique microgravity environment of an orbiting spacecraft, but there are also the challenges of conducting experiments remotely, using equipment, techniques, chemicals, and materials that may differ from those normally used in one's own laboratory. Then there is the question of "controls." How does one study the effects of altered gravitational fields on biological systems and control for other variables like vibration, acceleration, noise, temperature, humidity, and the logistics of specimen transport? Typically, the scientist new to space research has neither considered all of these potential problems nor has the data at hand with which to tackle them. This paper will explore some of these issues and provide pertinent data from recent Space Shuttle flights that will assist the new as well as the experienced scientist in dealing with the challenges of conducting research under spaceflight conditions.

  2. Social Science and Neuroscience beyond Interdisciplinarity: Experimental Entanglements

    PubMed Central

    Callard, Felicity

    2015-01-01

    This article is an account of the dynamics of interaction across the social sciences and neurosciences. Against an arid rhetoric of ‘interdisciplinarity’, it calls for a more expansive imaginary of what experiment – as practice and ethos – might offer in this space. Arguing that opportunities for collaboration between social scientists and neuroscientists need to be taken seriously, the article situates itself against existing conceptualizations of these dynamics, grouping them under three rubrics: ‘critique’, ‘ebullience’ and ‘interaction’. Despite their differences, each insists on a distinction between sociocultural and neurobiological knowledge, or does not show how a more entangled field might be realized. The article links this absence to the ‘regime of the inter-’, an ethic of interdisciplinarity that guides interaction between disciplines on the understanding of their pre-existing separateness. The argument of the paper is thus twofold: (1) that, contra the ‘regime of the inter-’, it is no longer practicable to maintain a hygienic separation between sociocultural webs and neurobiological architecture; (2) that the cognitive neuroscientific experiment, as a space of epistemological and ontological excess, offers an opportunity to researchers, from all disciplines, to explore and register this realization. PMID:25972621

  3. Analysis and Experimental Verification of New Power Flow Control for Grid-Connected Inverter with LCL Filter in Microgrid

    PubMed Central

    Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang

    2014-01-01

    A microgrid is an effective way to integrate distributed energy resources into the utility network. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method. PMID:24672304
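
    The abstract does not reproduce the relations being regulated. For an inverter voltage V_inv connected to a grid voltage V_g through a predominantly inductive (LCL) coupling reactance X with power angle delta, the familiar textbook power transfer expressions (standard forms, not taken from the paper) are

        P = \frac{V_{\mathrm{inv}} V_{\mathrm{g}}}{X}\,\sin\delta, \qquad
        Q = \frac{V_{\mathrm{inv}} V_{\mathrm{g}}\cos\delta - V_{\mathrm{g}}^{2}}{X}

    with Q here measured at the grid terminal.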

  4. Analysis and experimental verification of new power flow control for grid-connected inverter with LCL filter in microgrid.

    PubMed

    Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang

    2014-01-01

    A microgrid is an effective way to integrate distributed energy resources into the utility network. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method.

  5. An Experimental Comparison of Case Histories with Conventional Materials in Teaching a College General Education Course in Science.

    ERIC Educational Resources Information Center

    Peterson, Ronald G.

    Experimentally evaluated were the merits of a case history and a current reading materials approach to a college general education science course with regard to facts and generalizations, methods of science, and scientific attitudes examination scores. In the experimental treatment, the nature of science and scientific research and other course…

  6. Logical Experimental Design and Execution in the Biomedical Sciences.

    PubMed

    Holder, Daniel J; Marino, Michael J

    2017-03-17

    Lack of reproducibility has been highlighted as a significant problem in biomedical research. The present unit is devoted to describing ways to help ensure that research findings can be replicated by others, with a focus on the design and execution of laboratory experiments. Essential components for this include clearly defining the question being asked, using available information or information from pilot studies to aid in the design of the experiment, and choosing manipulations under a logical framework based on Mill's "methods of knowing" to build confidence in putative causal links. Final experimental design requires systematic attention to detail, including the choice of controls, sample selection, blinding to avoid bias, and the use of power analysis to determine the sample size. Execution of the experiment is done with care to ensure that the independent variables are controlled and the measurements of the dependent variables are accurate. While there are always differences among laboratories with respect to technical expertise, equipment, and suppliers, execution of the steps itemized in this unit will ensure well-designed and well-executed experiments to answer any question in biomedical research. © 2017 by John Wiley & Sons, Inc.
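
    As a concrete illustration of the power-analysis step mentioned above, the sketch below sizes a two-group comparison; the effect size, significance level, and target power are illustrative choices rather than values from the unit.

        # Example a priori power analysis for a two-sample t-test (illustrative numbers).
        from statsmodels.stats.power import TTestIndPower

        n_per_group = TTestIndPower().solve_power(
            effect_size=0.5,            # assumed standardized (Cohen's d) effect size
            alpha=0.05,                 # two-sided significance level
            power=0.80,                 # desired probability of detecting the effect
            alternative="two-sided",
        )
        print(f"~{n_per_group:.0f} subjects per group")   # about 64 per group for these inputs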

  7. "Exploratory experimentation" as a probe into the relation between historiography and philosophy of science.

    PubMed

    Schickore, Jutta

    2016-02-01

    This essay utilizes the concept "exploratory experimentation" as a probe into the relation between historiography and philosophy of science. The essay traces the emergence of the historiographical concept "exploratory experimentation" in the late 1990s. The reconstruction of the early discussions about exploratory experimentation shows that the introduction of the concept had unintended consequences: Initially designed to debunk philosophical ideas about theory testing, the concept "exploratory experimentation" quickly exposed the poverty of our conceptual tools for the analysis of experimental practice. Looking back at a number of detailed analyses of experimental research, we can now appreciate that the concept of exploratory experimentation is too vague and too elusive to fill the desideratum whose existence it revealed.

  8. Experimental evaluation of prefiltering for 56 Gbaud DP-QPSK signal transmission in 75 GHz WDM grid

    NASA Astrophysics Data System (ADS)

    Borkowski, Robert; de Carvalho, Luis Henrique H.; Silva, Edson Porto da; Diniz, Júlio César M.; Zibar, Darko; de Oliveira, Júlio César R. F.; Tafur Monroy, Idelfonso

    2014-01-01

    We investigate optical prefiltering for 56 Gbaud (224 Gbit/s) electrical time-division multiplexed (ETDM) dual polarization (DP) quaternary phase shift keying (QPSK) transmission. Different transmitter-side optical filter shapes are tested and their bandwidths are varied. Comparison of studied filter shapes shows an advantage of a pre-emphasis filter. Subsequently, we perform a fiber transmission of the 56 Gbaud DP QPSK signal filtered with the 65 GHz pre-emphasis filter to fit the 75 GHz transmission grid. Bit error rate (BER) of the signal remains below forward error correction (FEC) limit after 300 km of fiber propagation.
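
    The aggregate bit rate quoted above follows directly from the modulation format; as a quick check (a standard relation, not taken from the paper),

        R_b = R_s \log_2(M)\, N_{\mathrm{pol}} = 56~\mathrm{Gbaud} \times 2~\tfrac{\mathrm{bit}}{\mathrm{symbol}} \times 2 = 224~\mathrm{Gbit/s},
        \qquad \frac{224~\mathrm{Gbit/s}}{75~\mathrm{GHz}} \approx 3~\mathrm{bit\,s^{-1}\,Hz^{-1}}

    where M = 4 for QPSK and N_pol = 2 for dual polarization, giving the gross spectral efficiency in the 75 GHz grid before FEC overhead is removed.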

  9. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASN's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: The scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user if the tool designer: The computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  10. Real-Time Smart Grids Control for Preventing Cascading Failures and Blackout using Neural Networks: Experimental Approach for N-1-1 Contingency

    NASA Astrophysics Data System (ADS)

    Zarrabian, Sina; Belkacemi, Rabie; Babalola, Adeniyi A.

    2016-12-01

    In this paper, a novel intelligent control is proposed based on Artificial Neural Networks (ANN) to mitigate cascading failure (CF) and prevent blackout in smart grid systems after N-1-1 contingency condition in real-time. The fundamental contribution of this research is to deploy the machine learning concept for preventing blackout at early stages of its occurrence and to make smart grids more resilient, reliable, and robust. The proposed method provides the best action selection strategy for adaptive adjustment of generators' output power through frequency control. This method is able to relieve congestion of transmission lines and prevent consecutive transmission line outage after N-1-1 contingency condition. The proposed ANN-based control approach is tested on an experimental 100 kW test system developed by the authors to test intelligent systems. Additionally, the proposed approach is validated on the large-scale IEEE 118-bus power system by simulation studies. Experimental results show that the ANN approach is very promising and provides accurate and robust control by preventing blackout. The technique is compared to a heuristic multi-agent system (MAS) approach based on communication interchanges. The ANN approach showed more accurate and robust response than the MAS algorithm.
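
    The paper's exact network architecture and feature set are not given in the abstract. The fragment below is only a generic illustration of the idea of training a neural network to map post-contingency measurements to generator power adjustments; the features, labels, network size, and synthetic training data are all assumptions.

        # Generic illustration of learning a corrective-control mapping with a small MLP.
        # Features, labels, and architecture are placeholders, not the paper's design.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        # Hypothetical training set: rows are post-contingency snapshots
        # (frequency deviation, line loadings, ...); targets are generator setpoint changes.
        X_train = rng.normal(size=(500, 4))
        y_train = rng.normal(size=(500, 2))

        policy = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
        policy.fit(X_train, y_train)

        snapshot = rng.normal(size=(1, 4))       # measurements after an N-1-1 event
        print(policy.predict(snapshot))          # suggested output-power adjustments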

  11. Grid Work

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Pointwise Inc.'s Gridgen software is a system for the generation of 3D (three-dimensional), multiple-block, structured grids. Gridgen is a visually oriented, graphics-based interactive code used to decompose a 3D domain into blocks, distribute grid points on curves, initialize and refine grid points on surfaces, and initialize volume grid points. Gridgen is available to U.S. citizens and American-owned companies by license.
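
    As a toy illustration of one of the operations described above (distributing grid points along a curve with clustering), hyperbolic-tangent stretching is a common approach; the standalone sketch below is generic and is not Gridgen code.

        # Toy example: distribute n grid points on [0, 1] with tanh clustering at both ends.
        import numpy as np

        def tanh_clustered_points(n, beta=2.0):
            """Return n points in [0, 1] clustered toward the endpoints; beta sets the strength."""
            s = np.linspace(-1.0, 1.0, n)
            return 0.5 * (np.tanh(beta * s) / np.tanh(beta) + 1.0)

        print(tanh_clustered_points(9))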

  12. Views of the STS-5 Science Press briefing with Student Experimenters

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Views of the STS-5 Science Press briefing with Student Experimenters. Photos include Michelle Issel of Wallingford, Connecticut, showing her student experiment dealing with the formation of crystals in a weightless environment (37862); Aaron Gillette of Winter Haven, Florida, displaying his student experiment dealing with the growth of Porifera in zero gravity (37863).

  13. Opening Possibilities in Experimental Science and Its History: Critical Explorations with Pendulums and Singing Tubes

    ERIC Educational Resources Information Center

    Cavicchi, Elizabeth

    2008-01-01

    A teacher and a college student explore experimental science and its history by reading historical texts, and responding with replications and experiments of their own. A curriculum of ever-widening possibilities evolves in their ongoing interactions with each other, history, and such materials as pendulums, flame, and resonant singing tubes.…

  14. The Role of the Scientific Discovery Narrative in Middle School Science Education: An Experimental Study

    ERIC Educational Resources Information Center

    Arya, Diana J.; Maul, Andrew

    2012-01-01

    In an experimental study (N = 209), the authors compared the effects of exposure to typical middle-school written science content when presented in the context of the scientific discovery narrative and when presented in a more traditional nonnarrative format on 7th and 8th grade students in the United States. The development of texts was…

  15. General Science, Ninth Grade: Theme I and Theme II. Student Laboratory Manual. Experimental.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.

    This ninth grade student manual was developed to be used in conjunction with some of the experimental science activities described in the teacher's guide. It contains laboratory worksheets for: (1) measurement; (2) basic energy concepts; (3) heat energy; (4) light; (5) sound; (6) electricity; and (7) present and future energy resources. Additional…

  16. A Genre Analysis of English and Spanish Research Paper Abstracts in Experimental Social Sciences.

    ERIC Educational Resources Information Center

    Martin, Pedro Martin

    2003-01-01

    Investigated the extent to which there is rhetorical variation between the research article abstracts written in English for international journals and those written in Spanish and published in Spanish journals in the area of experimental social sciences. Rhetorical variables found across the two languages may be explained by the different…

  17. General Science, Ninth Grade: Theme III and Theme IV. Student Laboratory Manual. Experimental.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Curriculum and Instruction.

    This document is the student laboratory manual that was designed to accompany some of the experimental activities found in the teacher's guide to this general science course for ninth graders. It contains laboratory worksheets for lessons on such topics as: (1) soil; (2) hazardous waste; (3) wildlife refuges; (4) the water cycle; (5) water…

  18. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3

  19. MAGNETIC GRID

    DOEpatents

    Post, R.F.

    1960-08-01

    An electronic grid is designed employing magnetic forces for controlling the passage of charged particles. The grid is particularly applicable to use in gas-filled tubes such as ignitrons, thyratrons, etc., since the magnetic grid action is impartial to the polarity of the charged particles and, accordingly, the sheath effects encountered with electrostatic grids are not present. The grid comprises a conductor having sections spaced apart and extending in substantially opposite directions in the same plane, the ends of the conductor being adapted for connection to a current source.
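
    To make the polarity remark concrete: the magnitudes of the magnetic force on a particle and of its radius of gyration depend only on |q|, so particles of either sign are turned away from a straight path through the grid plane (only the sense of the deflection differs). These are the standard relations, not text from the patent:

        |\mathbf{F}| = |q|\,v_{\perp} B, \qquad r_g = \frac{m\,v_{\perp}}{|q|\,B}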

  20. Experimental Characterization of a Grid-Loss Event on a 2.5-MW Dynamometer Using Advanced Operational Modal Analysis: Preprint

    SciTech Connect

    Helsen, J.; Weijtjens, W.; Guo, Y.; Keller, J.; McNiff, B.; Devriendt, C.; Guillaume, P.

    2015-02-01

    This paper experimentally investigates a worst-case grid-loss event conducted on the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) drivetrain mounted on the 2.5MW NREL dynamic nacelle test-rig. The GRC drivetrain has a directly grid-coupled, fixed-speed asynchronous generator. The main goal is to assess the dynamic content of the high-speed stage of the GRC gearbox during this particular event. In addition to external accelerometers, high-frequency sampled measurements of strain gauges were used to assess torque fluctuations and bending moments both at the nacelle main shaft and gearbox high-speed shaft (HSS) through the entire duration of the event. Modal analysis was conducted using a polyreference Least Squares Complex Frequency-domain (pLSCF) modal identification estimator. The event driving the torsional resonance was identified. Moreover, the pLSCF estimator identified main drivetrain resonances based on a combination of acceleration and strain measurements. Without external action during the grid-loss event, a mode shape characterized by counter-phase rotation of the rotor and generator rotor, determined by the drivetrain flexibility and rotor inertias, was the main driver of the event. This behavior resulted in significant torque oscillations with large-amplitude negative torque periods. Based on tooth strain measurements of the HSS pinion, this work showed that at each zero-crossing, the teeth lost contact and came into contact with the backside flank. In addition, dynamic nontorque loads between the gearbox and generator at the HSS played an important role, as indicated by strain gauge measurements.
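
    The pLSCF estimator used in the paper is a full operational modal analysis method. As a much simpler stand-in, the sketch below only illustrates locating dominant resonance frequencies in a measured acceleration or strain record from its power spectral density; the sampling rate, synthetic signal, and peak threshold are placeholders.

        # Simplified resonance-peak picking from a vibration record (illustrative only;
        # the paper itself uses the pLSCF operational modal analysis estimator).
        import numpy as np
        from scipy.signal import welch, find_peaks

        fs = 2000.0                               # assumed sampling rate, Hz
        t = np.arange(0, 20.0, 1.0 / fs)
        # Placeholder signal: a decaying torsional oscillation plus measurement noise.
        x = np.exp(-0.2 * t) * np.sin(2 * np.pi * 4.5 * t) + 0.05 * np.random.randn(t.size)

        freqs, psd = welch(x, fs=fs, nperseg=4096)
        peaks, _ = find_peaks(psd, height=psd.max() * 0.1)
        print("dominant frequencies (Hz):", freqs[peaks])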

  1. A Blend of Romanism and Germanism: Experimental Science Instruction in Belgian State Secondary Education, 1880-1914

    ERIC Educational Resources Information Center

    Onghena, Sofie

    2013-01-01

    A case study of secondary experimental science instruction in Belgium demonstrates the importance of cross-national communication in the study of science education. Belgian secondary science education in the years 1880-1914 had a clear internationalist dimension. French and German influences turn out to have been essential, stimulated by the fact…

  2. Very high temperature chemistry: Science justification for containerless experimentation in space

    NASA Technical Reports Server (NTRS)

    Hofmeister, William H.; Nordine, Paul

    1990-01-01

    A summary is presented of the justification for application of containerless processing in space to high temperature science. Low earth orbit offers a gravitational environment that allows samples to be positioned in an experimental apparatus by very small forces. Well controlled experiments become possible on reactive materials at high temperatures in a reasonably quiescent state and without container contamination. This provides an opportunity to advance the science of high temperature chemistry that can only be realized with a commitment by NASA to provide advanced facilities for in-space containerless study of materials at very high temperature.

  3. Report on Workshop on Research in Experimental Computer Science Held in Palo Alto, California on 16-18 October 1991

    DTIC Science & Technology

    1992-06-01

    This report describes a workshop that was concerned with how to improve research in experimental computer science. The overall goal of the workshop...was to identify problems and issues in experimental computer science and to propose solutions. The workshop was sponsored by the Office of Naval

  4. Probeware in 8th Grade Science: A Quasi-Experimental Study on Attitude and Achievement

    NASA Astrophysics Data System (ADS)

    Moyer, John F., III

    The use of probeware in the delivery of science instruction has become quite widespread over the past few decades. The current emphasis on Science, Technology, Engineering, and Mathematics (STEM) education, especially in the case of underrepresented populations, seems to have accelerated the inclusion of probeware into curriculum. This quasi-experimental study sought to examine the effects of a direct replacement of traditional science tools with computer-based probeware on student achievement and student attitude toward science. Data analysis was conducted for large comparison groups and then for target STEM groups of African-American, low socioeconomic status, and female. Student achievement was measured by the Energy Concept Inventory and student attitude was measured by the Attitude Toward Science Inventory. The results showed that probeware did not have a significant effect on student achievement for almost all comparison groups. Analysis of student attitude toward science revealed that the use of probeware significantly affected overall student attitude as well as student attitude in several disaggregated subscales of attitude. These findings hold for both the comparison groups and the target STEM groups. Limitations of the study and suggestions for future research are presented.

  5. Current Grid operation and future role of the Grid

    NASA Astrophysics Data System (ADS)

    Smirnova, O.

    2012-12-01

    Grid-like technologies and approaches have become an integral part of HEP experiments. Some other scientific communities also use similar technologies for data-intensive computations. The distinct feature of Grid computing is the ability to federate heterogeneous resources of different ownership into a seamless infrastructure, accessible via a single log-on. Like other infrastructures of a similar nature, Grid functioning requires not only a technologically sound basis, but also reliable operation procedures, monitoring and accounting. The two aspects, technological and operational, are closely related: the weaker the technology, the greater the burden on operations, and vice versa. As of today, Grid technologies are still evolving: at CERN alone, every LHC experiment uses its own Grid-like system. This inevitably creates a heavy load on operations. Infrastructure maintenance, monitoring and incident response are done on several levels, from local system administrators to large international organisations, involving massive human effort worldwide. The necessity to commit substantial resources is one of the obstacles faced by smaller research communities when moving computing to the Grid. Moreover, most current Grid solutions were developed under significant influence of HEP use cases, and thus need additional effort to adapt them to other applications. Reluctance of many non-HEP researchers to use the Grid negatively affects the outlook for national Grid organisations, which strive to provide multi-science services. We started from a situation where Grid organisations were fused with HEP laboratories and national HEP research programmes; we hope to move towards a world where the Grid will ultimately reach the status of a generic public computing and storage service provider and permanent national and international Grid infrastructures will be established. How far we will be able to advance along this path depends on us. If no standardisation and convergence efforts take place

  6. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) A comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  7. Implementing Production Grids

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Ziobarth, John (Technical Monitor)

    2002-01-01

    We have presented the essence of experience gained in building two production Grids, and provided some of the global context for this work. As the reader might imagine, there were a lot of false starts, refinements to the approaches and to the software, and several substantial integration projects (SRB and Condor integrated with Globus) to get where we are today. However, the point of this paper is to try to make it substantially easier for others to get to the point where the Information Power Grid (IPG) and the DOE Science Grids are today. This is what is needed in order to move us toward the vision of a common cyber infrastructure for science. The author would also like to remind readers that this paper primarily represents the actual experiences that resulted from specific architectural and software choices during the design and implementation of these two Grids. The choices made were dictated by the criteria laid out in section 1. There is a lot more Grid software available today than there was four years ago, and several of these packages are being integrated into IPG and the DOE Grids. However, the foundation choices of Globus, SRB, and Condor would not be significantly different today than they were four years ago. Nonetheless, if the GGF is successful in its work - and we have every reason to believe that it will be - then in a few years we will see that the 28 functions provided by these packages will be defined in terms of protocols and MIS, and there will be several robust implementations available for each of the basic components, especially the Grid Common Services. The impact of the emerging Web Grid Services work is not yet clear. It will likely have a substantial impact on building higher level services; however, it is the opinion of the author that this will in no way obviate the need for the Grid Common Services. These are the foundation of Grids, and the focus of almost all of the operational and persistent infrastructure aspects of Grids.

  8. Overture: The grid classes

    SciTech Connect

    Brislawn, K.; Brown, D.; Chesshire, G.; Henshaw, W.

    1997-01-01

    Overture is a library containing classes for grids, overlapping grid generation and the discretization and solution of PDEs on overlapping grids. This document describes the Overture grid classes, including classes for single grids and classes for collections of grids.

  9. The influence of the International Baccalaureate experimental science program format on classroom learning environment and student attitudes toward the subject of science

    NASA Astrophysics Data System (ADS)

    Raiford, Lisa Renee

    This study examined the classroom learning environments and science attitudes of students in three IB Higher Level science classrooms. The study sample consisted of 82 twelfth-grade IB science students and three IB Higher Level science teachers. Qualitative and quantitative research methods were used to generate assertions toward the development of a grounded theory on accelerated curriculum effects on the classroom learning environment. The four research questions that guided this investigation are: (1) What are the shared characteristics of IB Higher Level experimental science instructors? (2) What instructional methods do instructors use to implement the IB Higher Level experimental science program and why do the instructors use these methods? (3) What are the students' perceptions about the classroom learning environment in IB Higher Level experimental science courses? (4) Does a relationship exist between student perceptions of the classroom learning environment and student attitudes toward the subject of science in IB Higher Level experimental science courses? The qualitative data sources were field notes from classroom observations, teacher interview transcripts, and relevant documents. These data sources were analyzed by constant comparison analysis. Assertions were generated about the educational and professional qualifications of the Higher Level science instructors and the teaching methods used to implement the IB science curriculum. Quantitative data sources consisted of student responses to the Preferred and Actual Forms of the Individualized Classroom Environment Questionnaire (ICEQ) and the Attitude Towards Science in School Assessment (ATSSA). Student responses to the Preferred and Actual Forms of the ICEQ were analyzed with paired t-tests and one-way analyses of variance to determine the students' perceptions about the science classroom environment. Correlation tests were used to examine the relationship between learning environment dimensions and

  10. Fibonacci Grids

    NASA Technical Reports Server (NTRS)

    Swinbank, Richard; Purser, James

    2006-01-01

    Recent years have seen a resurgence of interest in a variety of non-standard computational grids for global numerical prediction. The motivation has been to reduce problems associated with the converging meridians and the polar singularities of conventional regular latitude-longitude grids. A further impetus has come from the adoption of massively parallel computers, for which it is necessary to distribute work equitably across the processors; this is more practicable for some non-standard grids. Desirable attributes of a grid for high-order spatial finite differencing are: (i) geometrical regularity; (ii) a homogeneous and approximately isotropic spatial resolution; (iii) a low proportion of grid points where the numerical procedures require special customization (such as near coordinate singularities or grid edges). One family of grid arrangements which, to our knowledge, has never before been applied to numerical weather prediction, but which appears to offer several technical advantages, is what we shall refer to as "Fibonacci grids". They can be thought of as mathematically ideal generalizations of the patterns occurring naturally in the spiral arrangements of seeds and fruit found in sunflower heads and pineapples (to give two of the many botanical examples). These grids possess virtually uniform and highly isotropic resolution, with an equal area for each grid point. There are only two compact singular regions on a sphere that require customized numerics. We demonstrate the practicality of these grids in shallow water simulations, and discuss the prospects for efficiently using these frameworks in three-dimensional semi-implicit and semi-Lagrangian weather prediction or climate models.
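
    The golden-angle construction behind such grids is simple to sketch. The following Python snippet is an illustrative sketch only (the point count and function name are generic choices, not the authors' operational code); it places n near-uniformly distributed points on the unit sphere using the spherical Fibonacci lattice, each point carrying roughly equal area:

    import numpy as np

    def fibonacci_sphere(n):
        """Return an (n, 3) array of near-uniform unit-sphere points built
        from the golden-angle (Fibonacci) spiral."""
        golden_angle = np.pi * (3.0 - np.sqrt(5.0))   # about 2.39996 rad
        i = np.arange(n)
        z = 1.0 - 2.0 * (i + 0.5) / n                 # equal-area latitude bands
        r = np.sqrt(1.0 - z * z)                      # radius of each latitude circle
        theta = golden_angle * i                      # longitude advances by the golden angle
        return np.column_stack((r * np.cos(theta), r * np.sin(theta), z))

    points = fibonacci_sphere(4096)                   # e.g. a 4096-point global grid
    print(points.shape)                               # (4096, 3)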

  11. Grids = Structure.

    ERIC Educational Resources Information Center

    Barrington, Linda; Carter, Jacky

    2003-01-01

    Proposes that narrow columns provide a flexible system of organization for designers. Notes that grids serve the content on the pages, help to develop a layout that will clearly direct the reader to information, and prevent visual monotony. Concludes that when grid layouts are used, school publications look as good as professional ones. (PM)

  12. Game-XP: Action Games as Experimental Paradigms for Cognitive Science.

    PubMed

    Gray, Wayne D

    2017-03-13

    Why games? How could anyone consider action games an experimental paradigm for Cognitive Science? In 1973, as one of three strategies he proposed for advancing Cognitive Science, Allen Newell exhorted us to "accept a single complex task and do all of it." More specifically, he told us that rather than taking an "experimental psychology as usual approach," we should "focus on a series of experimental and theoretical studies around a single complex task" so as to demonstrate that our theories of human cognition were powerful enough to explain "a genuine slab of human behavior" with the studies fitting into a detailed theoretical picture. Action games represent the type of experimental paradigm that Newell was advocating and the current state of programming expertise and laboratory equipment, along with the emergence of Big Data and naturally occurring datasets, provide the technologies and data needed to realize his vision. Action games enable us to escape from our field's regrettable focus on novice performance to develop theories that account for the full range of expertise through a twin focus on expertise sampling (across individuals) and longitudinal studies (within individuals) of simple and complex tasks.

  13. Spaceflight Operations Services Grid (SOSG) Prototype Implementation and Feasibility Study

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.; Lisotta, Anthony J.; Redman, Sandra

    2004-01-01

    Science Operations Services Grid is focusing on building a prototype grid-based environment that incorporates existing and new spaceflight services to enable current and future NASA programs with cost savings and new and evolvable methods to conduct science in a distributed environment. The Science Operations Services Grid (SOSG) will provide a distributed environment for widely disparate organizations to conduct their systems and processes in a more efficient and cost effective manner. These organizations include those that: 1) engage in space-based science and operations, 2) develop space-based systems and processes, and 3) conduct scientific research, bringing together disparate scientific disciplines like geology and oceanography to create new information. In addition, educational outreach will be significantly enhanced by providing schools with the same tools used by NASA, along with the ability to actively participate on many levels in the science generated by NASA from space and on the ground. The services range from voice, video and telemetry processing and display to data mining, high level processing and visualization tools, all accessible from a single portal. In this environment, users would not require high end systems or processes at their home locations to use these services. Also, the user would need to know minimal details about the applications in order to utilize the services. In addition, security at all levels is an underlying goal of the project. The Science Operations Services Grid will focus on four tools that are currently used by the ISS Payload community along with nine more that are new to the community. Under the prototype, four Grid virtual organizations (VOs) will be developed to represent four types of users. They are a Payload (experimenters) VO, a Flight Controllers VO, an Engineering and Science Collaborators VO and an Education and Public Outreach VO. The User-based services will be implemented to replicate the operational voice

  14. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    SciTech Connect

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  15. FermiGrid - experience and future plans

    SciTech Connect

    Chadwick, K.; Berman, E.; Canal, P.; Hesselroth, T.; Garzoglio, G.; Levshina, T.; Sergeev, V.; Sfiligoi, I.; Timm, S.; Yocum, D.; /Fermilab

    2007-09-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. In order to better serve this community, Fermilab has placed its production computer resources in a Campus Grid infrastructure called 'FermiGrid'. The FermiGrid infrastructure allows the large experiments at Fermilab to have priority access to their own resources, enables sharing of these resources in an opportunistic fashion, and movement of work (jobs, data) between the Campus Grid and National Grids such as Open Science Grid and the WLCG. FermiGrid resources support multiple Virtual Organizations (VOs), including VOs from the Open Science Grid (OSG), EGEE and the Worldwide LHC Computing Grid Collaboration (WLCG). Fermilab also makes leading contributions to the Open Science Grid in the areas of accounting, batch computing, grid security, job management, resource selection, site infrastructure, storage management, and VO services. Through the FermiGrid interfaces, authenticated and authorized VOs and individuals may access our core grid services, the 10,000+ Fermilab resident CPUs, near-petabyte (including CMS) online disk pools and the multi-petabyte Fermilab Mass Storage System. These core grid services include a site wide Globus gatekeeper, VO management services for several VOs, Fermilab site authorization services, grid user mapping services, as well as job accounting and monitoring, resource selection and data movement services. Access to these services is via standard and well-supported grid interfaces. We will report on the user experience of using the FermiGrid campus infrastructure interfaced to a national cyberinfrastructure--the successes and the problems.

  16. Building a Science of Animal Minds: Lloyd Morgan, Experimentation, and Morgan's Canon.

    PubMed

    Fitzpatrick, Simon; Goodrich, Grant

    2016-07-25

    Conwy Lloyd Morgan (1852-1936) is widely regarded as the father of modern comparative psychology. Yet, Morgan initially had significant doubts about whether a genuine science of comparative psychology was even possible, only later becoming more optimistic about our ability to make reliable inferences about the mental capacities of non-human animals. There has been a fair amount of disagreement amongst scholars of Morgan's work about the nature, timing, and causes of this shift in Morgan's thinking. We argue that Morgan underwent two quite different shifts of attitude towards the proper practice of comparative psychology. The first was a qualified acceptance of the Romanesian approach to comparative psychology that he had initially criticized. The second was a shift away from Romanes' reliance on systematizing anecdotal evidence of animal intelligence towards an experimental approach, focused on studying the development of behaviour. We emphasize the role of Morgan's evolving epistemological views in bringing about the first shift - in particular, his philosophy of science. We emphasize the role of an intriguing but overlooked figure in the history of comparative psychology in explaining the second shift, T. Mann Jones, whose correspondence with Morgan provided an important catalyst for Morgan's experimental turn, particularly the special focus on development. We also shed light on the intended function of Morgan's Canon, the methodological principle for which Morgan is now mostly known. The Canon can only be properly understood by seeing it in the context of Morgan's own unique experimental vision for comparative psychology.

  17. Can jurors recognize missing control groups, confounds, and experimenter bias in psychological science?

    PubMed

    McAuliff, Bradley D; Kovera, Margaret Bull; Nunez, Gabriel

    2009-06-01

    This study examined the ability of jury-eligible community members (N = 248) to detect internal validity threats in psychological science presented during a trial. Participants read a case summary in which an expert testified about a study that varied in internal validity (valid, missing control group, confound, and experimenter bias) and ecological validity (high, low). Ratings of expert evidence quality and expert credibility were higher for the valid versus missing control group versions only. Internal validity did not influence verdict or ratings of plaintiff credibility and no differences emerged as a function of ecological validity. Expert evidence quality, expert credibility, and plaintiff credibility were positively correlated with verdict. Implications for the scientific reasoning literature and for trials containing psychological science are discussed.

  18. A Blend of Romanism and Germanism: Experimental Science Instruction in Belgian State Secondary Education, 1880-1914

    NASA Astrophysics Data System (ADS)

    Onghena, Sofie

    2013-04-01

    A case study of secondary experimental science instruction in Belgium demonstrates the importance of cross-national communication in the study of science education. Belgian secondary science education in the years 1880-1914 had a clear internationalist dimension. French and German influences turn out to have been essential, stimulated by the fact that Belgium, as a result of its geographical position, considered itself the centre of scientific relations between France and Germany, and saw itself as actually strengthened in this regard by its linguistic and cultural dualism. This pursuit of internationalist nationalism also affected the configuration of chemistry and physics as experimental courses at Belgian Royal State Schools, although the years preceding WWI are usually characterized as a period of rising nationalism in science, with countries such as Germany and France as prominent actors. To what extent did France and Germany influence Belgian debates on science education, science teachers' training, the use of textbooks, and the installation of school laboratories and teaching collections?

  19. Solar Fridges and Personal Power Grids: How Berkeley Lab is Fighting Global Poverty (LBNL Science at the Theater)

    SciTech Connect

    Buluswar, Shashi; Gadgil, Ashok

    2012-11-26

    At this November 26, 2012 Science at the Theater, scientists discussed the recently launched LBNL Institute for Globally Transformative Technologies (LIGTT) at Berkeley Lab. LIGTT is an ambitious mandate to discover and develop breakthrough technologies for combating global poverty. It was created with the belief that solutions will require more advanced R&D and a deep understanding of market needs in the developing world. Berkeley Lab's Ashok Gadgil, Shashi Buluswar and seven other LIGTT scientists discussed what it takes to develop technologies that will impact millions of people. These include: 1) Fuel efficient stoves for clean cooking: Our scientists are improving the Berkeley Darfur Stove, a high efficiency stove used by over 20,000 households in Darfur; 2) The ultra-low energy refrigerator: A lightweight, low-energy refrigerator that can be mounted on a bike so crops can survive the trip from the farm to the market; 3) The solar OB suitcase: A low-cost package of the five most critical biomedical devices for maternal and neonatal clinics; 4) UV Waterworks: A device for quickly, safely and inexpensively disinfecting water of harmful microorganisms.

  20. Evidence of Experimental Bias in the Life Sciences: Why We Need Blind Data Recording

    PubMed Central

    Lanfear, Robert; Jennions, Michael D.

    2015-01-01

    Observer bias and other “experimenter effects” occur when researchers’ expectations influence study outcome. These biases are strongest when researchers expect a particular result, are measuring subjective variables, and have an incentive to produce data that confirm predictions. To minimize bias, it is good practice to work “blind,” meaning that experimenters are unaware of the identity or treatment group of their subjects while conducting research. Here, using text mining and a literature review, we find evidence that blind protocols are uncommon in the life sciences and that nonblind studies tend to report higher effect sizes and more significant p-values. We discuss methods to minimize bias and urge researchers, editors, and peer reviewers to keep blind protocols in mind. PMID:26154287

  1. The emergence of modern statistics in agricultural science: analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919-1933.

    PubMed

    Parolini, Giuditta

    2015-01-01

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher's methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians' tools and expertise into the station research programme. Fisher's statistical methods did not remain confined within agricultural research and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics, quality control, just to mention a few of the disciplines which adopted them.

  2. Smart Grid Integration Laboratory

    SciTech Connect

    Troxell, Wade

    2011-12-22

    The initial federal funding for the Colorado State University Smart Grid Integration Laboratory is through a Congressionally Directed Project (CDP), DE-OE0000070 Smart Grid Integration Laboratory. The original program requested funding in three one-year increments for staff acquisition, curriculum development, and instrumentation, all of which benefit the Laboratory. This report focuses on the initial phase of staff acquisition, which was directed and administered by DOE NETL/West Virginia under Project Officer Tom George. Using this CDP funding, we have developed the leadership and intellectual capacity for the SGIC. This was accomplished by hiring a core team of Smart Grid systems engineering faculty focused on education, research, and innovation for a secure and smart grid infrastructure. The Smart Grid Integration Laboratory will be housed with the separately funded Integrid Laboratory as part of CSU's overall Smart Grid Integration Center (SGIC). The period of performance of this grant was 10/1/2009 to 9/30/2011, which included one no-cost extension due to delays in faculty hiring. The Smart Grid Integration Laboratory's focus is to build foundations that help graduate and undergraduate students acquire systems engineering knowledge; conduct innovative research; and team externally with smart grid organizations. The results of the separately funded Smart Grid Workforce Education Workshop (May 2009), sponsored by the City of Fort Collins, Northern Colorado Clean Energy Cluster, Colorado State University Continuing Education, Spirae, and Siemens, have been used to guide the hiring of faculty and the program curriculum and education plan. This project develops faculty leaders with the intellectual capacity to inspire students to become leaders who substantially contribute to the development and maintenance of Smart Grid infrastructure through topics such as: (1) Distributed energy systems modeling and control; (2) Energy and power conversion; (3) Simulation of

  3. Experimental Design and Bioinformatics Analysis for the Application of Metagenomics in Environmental Sciences and Biotechnology.

    PubMed

    Ju, Feng; Zhang, Tong

    2015-11-03

    Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights of microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.

  4. Using R in experimental design with BIBD: An application in health sciences

    NASA Astrophysics Data System (ADS)

    Oliveira, Teresa A.; Francisco, Carla; Oliveira, Amílcar; Ferreira, Agostinho

    2016-06-01

    When implementing an experimental design in any field, the experimenter must pay particular attention to, and look for the best strategies in, the following steps: planning and selecting the design, conducting the experiments, collecting the observed data, and analysing and interpreting the results. The focus is on providing both a deep understanding of the problem under research and a powerful experimental process at reduced cost. Mainly because it allows sources of variation to be separated, experimental design has long been strongly recommended in the health sciences. Particular attention has been devoted to block designs, and more precisely to Balanced Incomplete Block Designs; their relevance stems from the fact that these designs allow more treatments to be tested simultaneously than fit in a single block. Our example refers to a possible study of inter-rater reliability in Parkinson's disease, using the UPDRS (Unified Parkinson's Disease Rating Scale) to test whether there are significant differences between the specialists who evaluate the patients' performances. Statistical studies on this disease were described, for example, in Richards et al. (1994), where the authors investigate the inter-rater reliability of the Unified Parkinson's Disease Rating Scale motor examination. We consider a simulation of a practical situation in which the patients were observed by different specialists and the UPDRS assessment of the impact of Parkinson's disease on patients was recorded. Assigning treatments to the subjects following a particular BIBD(9,24,8,3,2) structure, we illustrate that BIB Designs can be used as a powerful tool to solve emerging problems in this area. Since a structure with repeated blocks allows some block contrasts to have minimum variance (see Oliveira et al., 2006), the design with cardinality 12 was selected for the example. R software was used for computations.
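
    As a small, self-contained illustration of the design structure used above, the standard necessary conditions for a balanced incomplete block design can be checked directly. The sketch below is not the authors' R code (they report using R for the computations); it is a hypothetical Python helper that verifies the parameter relations for the BIBD(9,24,8,3,2) cited in the abstract:

    def bibd_parameters_consistent(v, b, r, k, lam):
        """Check the standard necessary conditions for a balanced incomplete
        block design with v treatments, b blocks, r replicates per treatment,
        block size k, and pairwise concurrence parameter lambda."""
        return (b * k == v * r                     # total plot count, counted two ways
                and lam * (v - 1) == r * (k - 1)   # pairwise concurrence balance
                and b >= v)                        # Fisher's inequality

    # Parameters of the BIBD(9, 24, 8, 3, 2) used in the UPDRS example.
    print(bibd_parameters_consistent(v=9, b=24, r=8, k=3, lam=2))   # True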

  5. An Experimental and Theoretical Approach to Optimize a Three-Dimensional Clinostat for Life Science Experiments

    NASA Astrophysics Data System (ADS)

    Kim, Sun Myong; Kim, Hyunju; Yang, Dongmin; Park, Jihyung; Park, Rackhyun; Namkoong, Sim; Lee, Jin I.; Choi, Inho; Kim, Han-Sung; Kim, Hyoungsoon; Park, Junsoo

    2016-12-01

    Gravity affects all biological systems, and various types of platforms have been developed to mimic microgravity on the Earth's surface. A three-dimensional clinostat (3D clinostat) has been constructed to reduce the directionality of gravitation. In this report, we attempted to optimize a 3D clinostat for a life science experiment. Since a 3D clinostat is equipped with two motors, we fixed the angular velocity of one (primary) motor and varied it for the other (secondary) motor. In this condition, each motor ran constantly and continuously in one direction during the experiment. We monitored the direction of the normal vector using a 3D acceleration sensor, and also performed a computer simulation for comparison with the experimental data. To determine the optimal revolution for our life science experiment (i.e., a revolution yielding the strongest effects), we examined the promoter activity of two genes that were reported to be affected by microgravity. We found that the ratio of velocity of 4:1.8 (0.55) was optimal for our biological system. Our results indicate that changes of the revolutions of a 3D clinostat have a direct impact on the result and furthermore that the revolutions of the two motors have to be separately adjusted in order to guarantee an optimal simulation of microgravity.

  6. An Experimental and Theoretical Approach to Optimize a Three-Dimensional Clinostat for Life Science Experiments

    NASA Astrophysics Data System (ADS)

    Kim, Sun Myong; Kim, Hyunju; Yang, Dongmin; Park, Jihyung; Park, Rackhyun; Namkoong, Sim; Lee, Jin I.; Choi, Inho; Kim, Han-Sung; Kim, Hyoungsoon; Park, Junsoo

    2017-02-01

    Gravity affects all biological systems, and various types of platforms have been developed to mimic microgravity on the Earth's surface. A three-dimensional clinostat (3D clinostat) has been constructed to reduce the directionality of gravitation. In this report, we attempted to optimize a 3D clinostat for a life science experiment. Since a 3D clinostat is equipped with two motors, we fixed the angular velocity of one (primary) motor and varied it for the other (secondary) motor. In this condition, each motor ran constantly and continuously in one direction during the experiment. We monitored the direction of the normal vector using a 3D acceleration sensor, and also performed a computer simulation for comparison with the experimental data. To determine the optimal revolution for our life science experiment (i.e., a revolution yielding the strongest effects), we examined the promoter activity of two genes that were reported to be affected by microgravity. We found that the ratio of velocity of 4:1.8 (0.55) was optimal for our biological system. Our results indicate that changes of the revolutions of a 3D clinostat have a direct impact on the result and furthermore that the revolutions of the two motors have to be separately adjusted in order to guarantee an optimal simulation of microgravity.
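
    To make the idea of reducing the directionality of gravitation concrete, the sketch below simulates the unit gravity vector as seen in the rotating sample frame of a two-motor 3D clinostat and reports the magnitude of its time average. This is a simplified kinematic model under assumed axis conventions (outer motor about the lab x-axis, inner motor about the rotated y-axis), not the apparatus or analysis code of the study above; the 4 and 1.8 rad/s speeds are used only because the abstract quotes a 4:1.8 ratio:

    import numpy as np

    def rot_x(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

    def rot_y(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

    def mean_gravity_in_sample_frame(w_outer, w_inner, t_end=600.0, dt=0.01):
        """Average the unit gravity vector expressed in the rotating sample frame.
        The outer frame spins about the lab x-axis at w_outer (rad/s) and the
        inner frame about its own y-axis at w_inner (rad/s)."""
        g_lab = np.array([0.0, 0.0, -1.0])
        times = np.arange(0.0, t_end, dt)
        g_sum = np.zeros(3)
        for t in times:
            # Sample-to-lab orientation: inner rotation first, then outer rotation.
            R = rot_x(w_outer * t) @ rot_y(w_inner * t)
            g_sum += R.T @ g_lab                    # gravity as seen by the sample
        return np.linalg.norm(g_sum / len(times))

    # Example with the 4 : 1.8 angular-velocity ratio quoted in the abstract.
    print(mean_gravity_in_sample_frame(w_outer=4.0, w_inner=1.8))

    A small residual magnitude indicates that the gravity vector is swept nearly uniformly over the sphere, which is the quantity a clinostat run tries to minimise.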

  7. Helping parents to motivate adolescents in mathematics and science: an experimental test of a utility-value intervention.

    PubMed

    Harackiewicz, Judith M; Rozek, Christopher S; Hulleman, Chris S; Hyde, Janet S

    2012-08-01

    The pipeline toward careers in science, technology, engineering, and mathematics (STEM) begins to leak in high school, when some students choose not to take advanced mathematics and science courses. We conducted a field experiment testing whether a theory-based intervention that was designed to help parents convey the importance of mathematics and science courses to their high school-aged children would lead them to take more mathematics and science courses in high school. The three-part intervention consisted of two brochures mailed to parents and a Web site, all highlighting the usefulness of STEM courses. This relatively simple intervention led students whose parents were in the experimental group to take, on average, nearly one semester more of science and mathematics in the last 2 years of high school, compared with the control group. Parents are an untapped resource for increasing STEM motivation in adolescents, and the results demonstrate that motivational theory can be applied to this important pipeline problem.

  8. OGC and Grid Interoperability in enviroGRIDS Project

    NASA Astrophysics Data System (ADS)

    Gorgan, Dorian; Rodila, Denisa; Bacu, Victor; Giuliani, Gregory; Ray, Nicolas

    2010-05-01

    EnviroGRIDS (Black Sea Catchment Observation and Assessment System supporting Sustainable Development) [1] is a 4-year FP7 project aiming to address the subjects of ecologically unsustainable development and inadequate resource management. The project develops a Spatial Data Infrastructure of the Black Sea Catchment region. Geospatial technologies offer highly specialized functionality for Earth Science oriented applications, while Grid oriented technology is able to support distributed and parallel processing. One challenge of the enviroGRIDS project is the interoperability between geospatial and Grid infrastructures, providing the basic and extended features of both technologies. Geospatial interoperability technology has been promoted as a way of dealing with large volumes of geospatial data in distributed environments through the development of interoperable Web service specifications proposed by the Open Geospatial Consortium (OGC), with applications spread across multiple fields but especially in Earth observation research. Due to the huge volumes of data available in the geospatial domain and the additional issues this introduces (data management, secure data transfer, data distribution and data computation), the need for an infrastructure capable of managing all these problems becomes important. The Grid promotes and facilitates the secure interoperation of heterogeneous, distributed geospatial data within a distributed environment, supports the creation and management of large distributed computational jobs, and assures a level of security for communication and message transfer based on certificates. This presentation analyses and discusses the most significant use cases for enabling OGC Web services interoperability with the Grid environment and focuses on the description and implementation of the most promising one. In these use cases we give special attention to issues such as: the relations between computational grid and

  9. LAPS Grid generation and adaptation

    NASA Astrophysics Data System (ADS)

    Pagliantini, Cecilia; Delzanno, Gia Luca; Guo, Zehua; Srinivasan, Bhuvana; Tang, Xianzhu; Chacon, Luis

    2011-10-01

    LAPS uses a common-data framework in which a general purpose grid generation and adaptation package in toroidal and simply connected domains is implemented. The initial focus is on implementing the Winslow/Laplace-Beltrami method for generating non-overlapping block structured grids. This is to be followed by a grid adaptation scheme based on the Monge-Kantorovich optimal transport method [Delzanno et al., J. Comput. Phys., 227 (2008), 9841-9864], which equidistributes an application-specified error. As an initial set of applications, we will lay out grids for an axisymmetric mirror, a field reversed configuration, and an entire poloidal cross section of a tokamak plasma reconstructed from a CMOD experimental shot. These grids will then be used for computing the plasma equilibrium and transport in accompanying presentations. A key issue for Monge-Kantorovich grid optimization is the choice of error or monitor function for equidistribution. We will compare the Operator Recovery Error Source Detector (ORESD) [Lapenta, Int. J. Num. Meth. Eng., 59 (2004), 2065-2087], the Tau method and a strategy based on grid coarsening [Zhang et al., AIAA J., 39 (2001), 1706-1715] to find an "optimal" grid. Work supported by DOE OFES.
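
    As a minimal illustration of the elliptic-smoothing family to which the Winslow/Laplace-Beltrami generator belongs, the sketch below relaxes the interior nodes of a structured 2-D grid by Jacobi iteration of the discrete Laplace equations, holding the boundary fixed. It is a simplified Laplace smoother written under generic assumptions for illustration, not the LAPS package itself:

    import numpy as np

    def laplace_smooth_grid(x, y, iterations=200):
        """Relax interior nodes of a structured 2-D grid (arrays x, y of shape
        (ni, nj)) by Jacobi iteration of x_xx + x_ee = 0 and y_xx + y_ee = 0 in
        the logical coordinates; boundary nodes are held fixed."""
        x, y = x.copy(), y.copy()
        for _ in range(iterations):
            x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
            y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])
        return x, y

    # Example: perturb the interior of a Cartesian grid, then smooth it back.
    ni, nj = 21, 21
    xi, eta = np.meshgrid(np.linspace(0, 1, ni), np.linspace(0, 1, nj), indexing="ij")
    rng = np.random.default_rng(0)
    x0 = xi + 0.02 * rng.standard_normal(xi.shape)
    y0 = eta + 0.02 * rng.standard_normal(eta.shape)
    x0[0, :], x0[-1, :], x0[:, 0], x0[:, -1] = xi[0, :], xi[-1, :], xi[:, 0], xi[:, -1]
    y0[0, :], y0[-1, :], y0[:, 0], y0[:, -1] = eta[0, :], eta[-1, :], eta[:, 0], eta[:, -1]
    xs, ys = laplace_smooth_grid(x0, y0)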

  10. An infrastructure for the integration of geoscience instruments and sensors on the Grid

    NASA Astrophysics Data System (ADS)

    Pugliese, R.; Prica, M.; Kourousias, G.; Del Linz, A.; Curri, A.

    2009-04-01

    The Grid, as a computing paradigm, has long held the attention of both academia and industry [1]. The distributed and expandable nature of its general architecture results in scalability and more efficient utilisation of computing infrastructures. The scientific community, including that of the geosciences, often handles problems with very high requirements in data processing, transfer, and storage [2,3]. This has raised interest in Grid technologies, but these are often viewed solely as an access gateway to HPC. Suitable Grid infrastructures could provide the geoscience community with additional benefits such as sharing, remote access, and control of scientific systems. These systems can be scientific instruments, sensors, robots, cameras, and any other device used in the geosciences. A practical, general, and feasible solution for Grid-enabling such devices requires non-intrusive extensions to core parts of the current Grid architecture. We propose an extended version of an architecture [4] that can serve as the solution to the problem. The solution we propose is called the Grid Instrument Element (IE) [5]. It is an addition to the existing core Grid parts, the Computing Element (CE) and the Storage Element (SE), which serve the purposes their names suggest. The IE we refer to, and the related technologies, have been developed in the EU project on the Deployment of Remote Instrumentation Infrastructure (DORII1). In DORII, partners from various scientific communities, including those of earthquake, environmental, and experimental science, have adopted the Instrument Element technology in order to integrate their devices into the Grid. The Oceanographic and coastal observation and modelling Mediterranean Ocean Observing Network (OGS2), a DORII partner, is in the process of deploying the above-mentioned Grid technologies on two types of observational modules: Argo profiling floats and a novel Autonomous Underwater Vehicle (AUV

  11. Grid flexibility and patching techniques

    NASA Technical Reports Server (NTRS)

    Keith, T. G.; Smith, L. W.; Yung, C. N.; Barthelson, S. H.; Dewitt, K. J.

    1984-01-01

    The numerical determination of combustor flowfields is of great value to the combustor designer. A priori knowledge of the flow behavior can speed the combustor design process and reduce the number of experimental test rigs required to arrive at an optimal design. Even 2-D steady incompressible isothermal flow predictions are of use; many codes of this kind are available, each employing different techniques to surmount the difficulties arising from the nonlinearity of the governing equations and from typically irregular combustor geometries. Mapping techniques (algebraic and elliptic PDE) and adaptive grid methods (both multi-grid and grid embedding), as applied to axisymmetric combustors, are discussed.

  12. Grid artifact reduction for direct digital radiography detectors based on rotated stationary grids with homomorphic filtering

    SciTech Connect

    Kim, Dong Sik; Lee, Sanggyun

    2013-06-15

    Purpose: Grid artifacts are caused when using the antiscatter grid in obtaining digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted especially for the direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for the grid artifact reduction based on the band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
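
    A conceptual sketch of the homomorphic band-stop idea described above is given below. It is an illustrative toy implementation under a multiplicative grid model, not the authors' algorithm: the image, grid frequency, and band width are synthetic assumptions chosen only for demonstration.

    import numpy as np

    def homomorphic_grid_suppression(img, grid_freq, bandwidth=0.01):
        """Suppress a multiplicative grid pattern: take the log to turn the
        multiplicative model into an additive one, notch out the row-frequency
        band around the grid frequency, then exponentiate back."""
        log_img = np.log(np.clip(img, 1e-6, None))
        spec = np.fft.fftshift(np.fft.fft2(log_img))
        fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0]))[:, None]   # cycles/pixel along rows
        # Band-stop mask removing row-frequency components near +/- grid_freq.
        mask = 1.0 - (np.abs(np.abs(fy) - grid_freq) < bandwidth).astype(float)
        filtered = np.fft.ifft2(np.fft.ifftshift(spec * mask)).real
        return np.exp(filtered)

    # Synthetic example: smooth "anatomy" modulated by a horizontal grid pattern.
    ny, nx = 256, 256
    yy, xx = np.mgrid[0:ny, 0:nx]
    anatomy = 0.5 + 0.3 * np.exp(-((xx - 128) ** 2 + (yy - 128) ** 2) / 4000.0)
    grid_freq = 0.2                                    # grid lines, cycles per pixel
    grid_pattern = 1.0 + 0.2 * np.cos(2 * np.pi * grid_freq * yy)
    corrected = homomorphic_grid_suppression(anatomy * grid_pattern, grid_freq)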

  13. The National Grid Project: A system overview

    NASA Technical Reports Server (NTRS)

    Gaither, Adam; Gaither, Kelly; Jean, Brian; Remotigue, Michael; Whitmire, John; Soni, Bharat; Thompson, Joe; Dannenhoffer,, John; Weatherill, Nigel

    1995-01-01

    The National Grid Project (NGP) is a comprehensive numerical grid generation software system that is being developed at the National Science Foundation (NSF) Engineering Research Center (ERC) for Computational Field Simulation (CFS) at Mississippi State University (MSU). NGP is supported by a coalition of U.S. industries and federal laboratories. The objective of the NGP is to significantly decrease the amount of time it takes to generate a numerical grid for complex geometries and to increase the quality of these grids to enable computational field simulations for applications in industry. A geometric configuration can be discretized into grids (or meshes) that have two fundamental forms: structured and unstructured. Structured grids are formed by intersecting curvilinear coordinate lines and are composed of quadrilateral (2D) and hexahedral (3D) logically rectangular cells. The connectivity of a structured grid provides for trivial identification of neighboring points by incrementing coordinate indices. Unstructured grids are composed of cells of any shape (commonly triangles, quadrilaterals, tetrahedra and hexahedra), but do not have trivial identification of neighbors by incrementing an index. For unstructured grids, a set of points and an associated connectivity table is generated to define unstructured cell shapes and neighboring points. Hybrid grids are a combination of structured grids and unstructured grids. Chimera (overset) grids are intersecting or overlapping structured grids. The NGP system currently provides a user interface that integrates both 2D and 3D structured and unstructured grid generation, a solid modeling topology data management system, an internal Computer Aided Design (CAD) system based on Non-Uniform Rational B-Splines (NURBS), a journaling language, and a grid/solution visualization system.
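
    The distinction the abstract draws between structured and unstructured connectivity can be shown in a few lines. The sketch below is a generic illustration, not NGP code: in a structured grid a neighbour is found by incrementing an index, whereas an unstructured grid needs an explicit connectivity table.

    import numpy as np

    # Structured 2-D grid: neighbours are implicit in the (i, j) indexing.
    ni, nj = 4, 3
    x, y = np.meshgrid(np.linspace(0.0, 1.0, ni), np.linspace(0.0, 1.0, nj), indexing="ij")
    i, j = 2, 1
    east_neighbour = (x[i + 1, j], y[i + 1, j])        # found by incrementing an index

    # Unstructured triangular grid: the same points need an explicit table.
    points = np.column_stack((x.ravel(), y.ravel()))   # flatten to a point list

    def node(ii, jj):                                  # structured index -> point id
        return ii * nj + jj

    triangles = []                                     # connectivity table (point ids per cell)
    for ii in range(ni - 1):
        for jj in range(nj - 1):
            triangles.append((node(ii, jj), node(ii + 1, jj), node(ii + 1, jj + 1)))
            triangles.append((node(ii, jj), node(ii + 1, jj + 1), node(ii, jj + 1)))

    # Neighbour queries now require searching the table, e.g. cells touching a node.
    cells_at_node = [t for t in triangles if node(2, 1) in t]
    print(east_neighbour, len(triangles), len(cells_at_node))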

  14. Datums, Ellipsoids, Grids, and Grid Reference Systems

    DTIC Science & Technology

    1992-01-01

    Excerpts from DMA TM 8358.1, covering the Tunisie Grid, Sud Algerie Grid, Sud Maroc Grid, and Sud Tunisie Grid; guidance on giving references on the Sud Algerie and Sud Tunisie Grids (including the placement of grid reference boxes when they cannot be accommodated in the map margin); and Chapter 7, Grids on Maps at 1:250,000 and 1:500,000 Scale.

  15. Grids: The Top Ten Questions

    DOE PAGES

    Schopf, Jennifer M.; Nitzberg, Bill

    2002-01-01

    The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science point of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.

  16. Changes in Critical Thinking Skills Following a Course on Science and Pseudoscience: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    McLean, Carmen P.; Miller, Nathan A.

    2010-01-01

    We assessed changes in paranormal beliefs and general critical thinking skills among students (n = 23) enrolled in an experimental course designed to teach distinguishing science from pseudoscience and a comparison group of students (n = 30) in an advanced research methods course. On average, both courses were successful in reducing paranormal…

  17. Heritage Education: Exploring the Conceptions of Teachers and Administrators from the Perspective of Experimental and Social Science Teaching

    ERIC Educational Resources Information Center

    Perez, Roque Jimenez; Lopez, Jose Maria Cuenca; Listan, D. Mario Ferreras

    2010-01-01

    This paper describes a research project into heritage education. Taking an interdisciplinary perspective from within the field of Experimental and Social Science Education, it presents an analysis of teachers' and administrators' conceptions of heritage, its teaching and its dissemination in Spain. A statistical description is provided of the…

  18. Apollo-Soyuz pamphlet no. 9: General science. [experimental design in Astronomy, Biology, Geophysics, Aeronomy and Materials science

    NASA Technical Reports Server (NTRS)

    Page, L. W.; From, T. P.

    1977-01-01

    The objectives and planning activities for the Apollo-Soyuz mission are summarized. Aspects of the space flight considered include the docking module and launch configurations, spacecraft orbits, and weightlessness. The 28 NASA experiments conducted onboard the spacecraft are summarized. The contributions of the mission to the fields of astronomy, geoscience, biology, and materials sciences resulting from the experiments are explored.

  19. 'Mind genomics': the experimental, inductive science of the ordinary, and its application to aspects of food and feeding.

    PubMed

    Moskowitz, Howard R

    2012-11-05

    The paper introduces the empirical science of 'mind genomics', whose objective is to understand the dimensions of ordinary, everyday experience, identify mind-set segments of people who value different aspects of that everyday experience, and then assign a new person to a mind-set by a statistically appropriate procedure. By studying different experiences using experimental design of ideas, 'mind genomics' constructs an empirical, inductive science of perception and experience, layer by layer. The ultimate objective of 'mind genomics' is a large-scale science of experience created using induction, with the science based upon emergent commonalities across many different types of daily experience. The particular topic investigated in the paper is the experience of healthful snacks, what makes a person 'want' them, and the dollar value of different sensory aspects of the healthful snack.

  20. From Ions to Wires to the Grid: The Transformational Science of LANL Research in High-Tc Superconducting Tapes and Electric Power Applications

    ScienceCinema

    Marken, Ken [Superconductivity Technology Center, Los Alamos, New Mexico, United States]

    2016-07-12

    The Department of Energy (DOE) Office of Electricity Delivery and Energy Reliability (OE) has been tasked to lead national efforts to modernize the electric grid, enhance security and reliability of the energy infrastructure, and facilitate recovery from disruptions to energy supplies. LANL has pioneered the development of coated conductors – high-temperature superconducting (HTS) tapes – which permit dramatically greater current densities than conventional copper cable, and enable new technologies to secure the national electric grid. Sustained world-class research from concept, demonstration, transfer, and ongoing industrial support has moved this idea from the laboratory to the commercial marketplace.

  1. Science, suffrage, and experimentation: Mary Putnam Jacobi and the controversy over vivisection in late nineteenth-century America.

    PubMed

    Bittel, Carla Jean

    2005-01-01

    This article examines the medical activism of the New York physician Mary Putnam Jacobi (1842-1906), to illustrate the problems of gender and science at the center of the vivisection debate in late nineteenth-century America. In the post-Civil War era, individuals both inside and outside the medical community considered vivisection to be a controversial practice. Physicians divided over the value of live animal experimentation, while reformers and activists campaigned against it. Jacobi stepped into the center of the controversy and tried to use her public defense of experimentation to the advantage of women in the medical profession. Her advocacy of vivisection was part of her broader effort to reform medical education, especially at women's institutions. It was also a political strategy aimed at associating women with scientific practices to advance a women's rights agenda. Her work demonstrates how debates over women in medicine and science in medicine, suffrage, and experimentation overlapped at a critical moment of historical transition.

  2. GridMan: A grid manipulation system

    NASA Technical Reports Server (NTRS)

    Eiseman, Peter R.; Wang, Zhu

    1992-01-01

    GridMan is an interactive grid manipulation system. It operates on grids to produce new grids which conform to user demands. The input grids are not constrained to come from any particular source. They may be generated by algebraic methods, elliptic methods, hyperbolic methods, parabolic methods, or some combination of methods. The methods are included in the various available structured grid generation codes. These codes perform the basic assembly function for the various elements of the initial grid. For block structured grids, the assembly can be quite complex due to a large number of block corners, edges, and faces for which various connections and orientations must be properly identified. The grid generation codes are distinguished among themselves by their balance between interactive and automatic actions and by their modest variations in control. The basic form of GridMan provides a much more substantial level of grid control and will take its input from any of the structured grid generation codes. The communication link to the outside codes is a data file which contains the grid or a section of the grid.

  3. Experimental setup and the system performance for single-grid-based phase-contrast x-ray imaging (PCXI) with a microfocus x-ray tube

    NASA Astrophysics Data System (ADS)

    Lim, Hyunwoo; Park, Yeonok; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Park, Chulkyu; Woo, Taeho; Lee, Minsik; Kim, Jinsoo; Chung, Nagkun; Kim, Jinwon; Kim, Jinguk

    2015-08-01

    In this work, we investigated a simplified approach to phase-contrast x-ray imaging (PCXI) using a single antiscatter grid and a microfocus x-ray tube, which has the potential to open the way to more widespread use of PCXI in related application areas. We established a table-top setup for PCXI studies of biological and non-biological samples and investigated the system performance. The PCXI system consists of a focused-linear grid having a strip density of 200 lines/in. (JPI Healthcare Corp.), a microfocus x-ray tube having a focal spot size of about 5 μm (Hamamatsu, L7910), and a high-resolution CMOS imaging detector having a pixel size of 48 μm (Rad-icon Imaging Corp., Shad-o-Box 2048). Using our prototype system, we successfully obtained attenuation, scattering, and differential phase-contrast x-ray images of improved visibility from the raw images of several samples at x-ray tube conditions of 50 kVp and 6 mAs. Our initial results indicate that the single-grid-based approach seems to be a useful method for PCXI, with great simplicity and minimal requirements on setup alignment.
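
    The abstract does not spell out the reconstruction, but single-grid PCXI data are commonly processed by Fourier harmonic analysis of the grid modulation. The sketch below is a generic 1-D illustration written under that assumption, not the authors' processing chain; the grid frequency, object profile, and window width are synthetic choices. It separates an attenuation-like signal and a visibility (scatter-related) signal from a grid-modulated detector row and a reference row:

    import numpy as np

    def grid_harmonics(row, grid_freq, window=0.05):
        """Return the magnitudes of the low-frequency (zeroth) and grid-frequency
        (first) harmonic envelopes of a 1-D detector row modulated by a linear grid."""
        spec = np.fft.fft(row)
        f = np.fft.fftfreq(row.size)
        h0 = np.fft.ifft(np.where(np.abs(f) < window, spec, 0.0))
        h1 = np.fft.ifft(np.where(np.abs(f - grid_freq) < window, spec, 0.0))  # +f0 sideband only
        return np.abs(h0), np.abs(h1)

    # Reference row (grid only) and sample row (grid plus an absorbing, scattering object).
    n = 2048
    x = np.arange(n)
    grid_freq = 0.1                                      # grid lines, cycles per pixel
    inside = (x > 800) & (x < 1200)
    transmission = np.where(inside, 0.6, 1.0)            # object absorbs 40 percent
    visibility_loss = np.where(inside, 0.7, 1.0)         # scatter reduces fringe visibility
    reference = 1.0 + 0.5 * np.cos(2 * np.pi * grid_freq * x)
    sample = transmission * (1.0 + 0.5 * visibility_loss * np.cos(2 * np.pi * grid_freq * x))

    a0_r, a1_r = grid_harmonics(reference, grid_freq)
    a0_s, a1_s = grid_harmonics(sample, grid_freq)
    attenuation = -np.log(a0_s / a0_r)                   # absorption-like signal
    dark_field = (a1_s / a0_s) / (a1_r / a0_r)           # relative visibility (scatter) signal
    # A differential-phase signal would come from the phase of the complex first harmonic.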

  4. Experimental Methods to Evaluate Science Utility Relative to the Decadal Survey

    NASA Technical Reports Server (NTRS)

    Widergren, Cynthia

    2012-01-01

    The driving factor for competed missions is the science they plan to perform once they have reached their target bodies. These science goals are derived from the science recommended by the most current Decadal Survey. This work focuses on science goals in previous Venus mission proposals with respect to the 2013 Decadal Survey. By looking at how the goals compare to the survey and how much confidence NASA has in each mission's ability to accomplish these goals, a method was created to assess the science return utility of each mission. This method can be used as a tool for future Venus mission formulation and serves as a starting point for the future development of science utility assessment tools.

  5. 75 FR 7526 - Consumer Interface With the Smart Grid

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-19

    ... electric vehicles. The Smart Grid will help to provide consumers with the information, automation, and... advanced electric grid. In many instances, smart meters will have the capability to communicate near-real... TECHNOLOGY POLICY Consumer Interface With the Smart Grid AGENCY: Office of Science and Technology...

  6. Energy Systems Integration: Demonstrating Distributed Grid-Edge Control Hierarchy

    SciTech Connect

    2017-01-01

    Overview fact sheet about the OMNETRIC Group Integrated Network Testbed for Energy Grid Research and Technology Experimentation (INTEGRATE) project at the Energy Systems Integration Facility. INTEGRATE is part of the U.S. Department of Energy's Grid Modernization Initiative.

  7. An Introduction to Grid Computing Using EGEE

    NASA Astrophysics Data System (ADS)

    Walsh, John; Coghlan, Brian; Childs, Stephen

    Grid is an evolving and maturing architecture based on several well-established services, including, amongst others, distributed computing, role and group management, distributed data management, and Public Key Encryption systems. Currently the largest scientific grid infrastructure is Enabling Grids for E-sciencE (EGEE), comprising approximately 250 sites, ~50,000 CPUs and tens of petabytes of storage. Moreover, EGEE covers a large variety of scientific disciplines, including astrophysics. The scope of this work is to provide the keen astrophysicist with an introductory overview of the motivations for using the Grid and of the core production EGEE services and their supporting software and middleware (known as gLite). We present an overview of the available set of commands, tools and portals as used within these Grid communities. In addition, we present the current scheme for supporting MPI programs on these Grids.

  8. Animal Science Technology. An Experimental Developmental Program. Volume II, Curriculum Course Outlines.

    ERIC Educational Resources Information Center

    Brant, Herman G.

    This volume, the second of a two part evaluation report, is devoted exclusively to the presentation of detailed course outlines representing an Animal Science Technology curriculum. Arranged in 6 terms of study (2 academic years), outlines are included on such topics as: (1) Introductory Animal Science, (2) General Microbiology, (3) Zoonoses, (4)…

  9. Correlated Curriculum Program: An Experimental Program. Science Level 1 (9A, 9B, 10A).

    ERIC Educational Resources Information Center

    Loebl, Stanley, Ed.; And Others

    The unit plans in Correlated Science 1 are intended to be of use to the teacher in both lesson and team planning. The course in science was designed for optimum correlation with the work done in business, health, and industrial careers. Behavioral objectives, class routines, time allotments, student evaluation, and the design of the manual are…

  10. Understanding Scientific Methodology in the Historical and Experimental Sciences via Language Analysis

    ERIC Educational Resources Information Center

    Dodick, Jeff; Argamon, Shlomo; Chase, Paul

    2009-01-01

    A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually "do" science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields.…

  11. Your World and Welcome To It, Science (Experimental): 5314.03.

    ERIC Educational Resources Information Center

    Kleinman, David Z.

    Presented is a beginning course in biology with emphasis on ecology for students with limited interest and few experiences in science. These students most likely will not take many more science courses. Included are the basic ecological concepts of communities, population, societies and the effects humans have on the environment. Like all other…

  12. An Experimental Examination of Quick Writing in the Middle School Science Classroom

    ERIC Educational Resources Information Center

    Benedek-Wood, Elizabeth; Mason, Linda H.; Wood, Philip H.; Hoffman, Katie E.; McGuire, Ashley

    2014-01-01

    A staggered A-B design study was used to evaluate the effects of Self- Regulated Strategy Development (SRSD) instruction for quick writing in middle school science across four classrooms. A sixth-grade science teacher delivered all students' writing assessment and SRSD instruction for informative quick writing. Results indicated that performance…

  13. Animal Science Technology. An Experimental Developmental Program. Volume I, Report of the Developmental Program.

    ERIC Educational Resources Information Center

    Brant, Herman G.; And Others

    In 1961, administrative personnel at Delhi College in New York observed that formal training programs for animal science technicians were virtually nonexistant. Response to this apparent need resulted in the initiation of perhaps the first 2-year Animal Science Technology Program in the nation. This two-volume report is the result of an extensive…

  14. Data Grid Management Systems

    NASA Technical Reports Server (NTRS)

    Moore, Reagan W.; Jagatheesan, Arun; Rajasekar, Arcot; Wan, Michael; Schroeder, Wayne

    2004-01-01

    The "Grid" is an emerging infrastructure for coordinating access across autonomous organizations to distributed, heterogeneous computation and data resources. Data grids are being built around the world as the next generation data handling systems for sharing, publishing, and preserving data residing on storage systems located in multiple administrative domains. A data grid provides logical namespaces for users, digital entities and storage resources to create persistent identifiers for controlling access, enabling discovery, and managing wide area latencies. This paper introduces data grids and describes data grid use cases. The relevance of data grids to digital libraries and persistent archives is demonstrated, and research issues in data grids and grid dataflow management systems are discussed.

  15. Friends and Foes of Theory Construction in Psychological Science: Vague Dichotomies, Unified Theories of Cognition, and the New Experimentalism.

    PubMed

    Garcia-Marques, Leonel; Ferreira, Mário B

    2011-03-01

    Newell (1973) criticized the use of vague theoretical dichotomies to account for narrowly defined empirical phenomena. Many of the problems raised by Newell persist today. We argue that these problems derive not from any peculiarity of psychological science but from the hindrances inherent to empirical theory testing. To show the contemporary relevance of these problems, we present two modern illustrations of the encumbrances faced by dichotomy-based research, we review some attempts to rely on nonempirical criteria to overcome the empirical impediments in theory testing, and we bring the question of theoretical mimicry to bear on these problems. Next, we discuss an alternative to theoretical dichotomies: the Unified Theories of Cognition (Newell, 1990). Finally, we introduce the "new experimentalism" approach in philosophy of science (Mayo, 1996), which provides a new perspective on theory construction in psychological science. We conclude with suggestions on how this new perspective can be implemented.

  16. Pedagogical experimentations about participating science, in a european class, in France.

    NASA Astrophysics Data System (ADS)

    Burgio, Marion

    2015-04-01

    A European class is, in France, a class in which a subject is taught in a foreign language, for example science in English. In my European class, during a seven-week session, I led group-work activities about different participatory science actions, with groups of three or four 16-year-old students. Each group chose one type of participatory science activity from among the following: leading a videoconference with an IODP mission on board the JOIDES Resolution, or being part of a "science songs community" with Tom McFadden. The students divided the work: some studied the websites and contacted the actors to present the pedagogical or scientific background of their subject, while others produced something concrete, such as the organization of a videoconference with the JOIDES Resolution or the creation of a pedagogical song about geology. I will present some results of their work and explain the students' motivation linked to this active learning method.

  17. Experimental stations as a tool to teach soil science at the University of Valencia

    NASA Astrophysics Data System (ADS)

    Cerdà, Artemi

    2010-05-01

    This paper shows the strategies used at the University of Valencia (Department of Geography, Soil Erosion and Degradation Research Group) to teach soil science in the Geography and Environmental Science degrees. The use of the Montesa and El Teularet research stations gives the students a better knowledge of soil science, as they can see the measurements being carried out in the field. Students visit the stations and contribute to measurements and sampling every season. The use of meteorological stations, erosion plots, soil moisture and soil temperature probes, and sampling gives the students the chance to understand the theoretical approach they are usually taught. This presentation will show how the students evolve and how their knowledge of soil science improves.

  18. Promises and pitfalls of Web-based experimentation in the advance of replicable psychological science: A reply to Plant (2015).

    PubMed

    van Steenbergen, Henk; Bocanegra, Bruno R

    2016-12-01

    In a recent letter, Plant (2015) reminded us that proper calibration of our laboratory experiments is important for the progress of psychological science. Therefore, carefully controlled laboratory studies are argued to be preferred over Web-based experimentation, in which timing is usually more imprecise. Here we argue that there are many situations in which the timing of Web-based experimentation is acceptable and that online experimentation provides a very useful and promising complementary toolbox to available lab-based approaches. We discuss examples in which stimulus calibration or calibration against response criteria is necessary and situations in which this is not critical. We also discuss how online labor markets, such as Amazon's Mechanical Turk, allow researchers to acquire data in more diverse populations and to test theories along more psychological dimensions. Recent methodological advances that have produced more accurate browser-based stimulus presentation are also discussed. In our view, online experimentation is one of the most promising avenues to advance replicable psychological science in the near future.

  19. Science.

    ERIC Educational Resources Information Center

    Roach, Linda E., Ed.

    This document contains the following papers on science instruction and technology: "A 3-D Journey in Space: A New Visual Cognitive Adventure" (Yoav Yair, Rachel Mintz, and Shai Litvak); "Using Collaborative Inquiry and Interactive Technologies in an Environmental Science Project for Middle School Teachers: A Description and…

  20. Science to the people! (and experimental politics): searching for the roots of participatory discourse in science and technology in the 1970s in France.

    PubMed

    Quet, Mathieu

    2014-08-01

    The current conception of political participation in governmental institutions is deeply marked by the notions of deliberation and precaution. This normative conception of participatory politics neglects, backgrounds or disqualifies other participatory practices, in so far as they are not connected to deliberation and precaution. However, participation has not always been defined in such a restricted way: the current conception of participation is a product of the 1980s and 1990s. In this paper, the meaning ascribed to the notion of participation in the 1970s in France is explored through the study of discourses produced in three fields: the Science Policy Division of the OECD, the French radical science movement, and the emerging STS academic field. As is shown, some of the bases of the current notion of participation originate in the 1970s. Nevertheless, it is argued that in these years, the notion of participation has more to do with experimentation than with deliberation and precaution. Therefore, the conception of participation in the 1970s differs greatly from the current one. Methodologically, this paper combines tools offered by the social history of science and the French school of discourse analysis.

  1. Using newly-designed lint cleaner grid bars to remove seed coat fragments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An experiment was conducted to remove seed coat fragments at the saw-type lint cleaner using newly-designed grid bars. The test consisted of five experimental grid bar designs and one control. The experimental grid bars had angles from the sharp toe of the grid bar (or the angle from vertical) of ...

  2. "They Sweat for Science": The Harvard Fatigue Laboratory and Self-Experimentation in American Exercise Physiology.

    PubMed

    Johnson, Andi

    2015-08-01

    In many scientific fields, the practice of self-experimentation waned over the course of the twentieth century. For exercise physiologists working today, however, the practice of self-experimentation is alive and well. This paper considers the role of the Harvard Fatigue Laboratory and its scientific director, D. Bruce Dill, in legitimizing the practice of self-experimentation in exercise physiology. Descriptions of self-experimentation are drawn from papers published by members of the Harvard Fatigue Lab. Attention is paid to the ethical and practical justifications for self-experimentation in both the lab and the field. Born out of the practical, immediate demands of fatigue protocols, self-experimentation performed the long-term, epistemological function of uniting physiological data across time and space, enabling researchers to contribute to a general human biology program.

  3. Novel boron-10-based detectors for neutron scattering science. Helium-3-free detectors for large- and small-area applications: The Multi-Grid and the Multi-Blade prototypes

    NASA Astrophysics Data System (ADS)

    Piscitelli, Francesco

    2015-02-01

    Neutron scattering science is steadily increasing its instrumental power, and most of the neutron sources in the world are pushing the development of their technologies to improve performance. Neutron scattering development is also pushed by the European Spallation Source (ESS) in Sweden, a neutron facility which has just started construction. Concerning small-area detectors (~1 m2), the 3He technology, which is today cutting edge, is reaching fundamental limits in its development. Counting-rate capability, spatial resolution and cost effectiveness are only a few examples of the features that must be improved to fulfill the new requirements. On the other hand, 3He technology could still satisfy the detector requirements for large-area applications (~50 m2); however, because of the present 3He shortage that the world is experiencing, this is no longer practical. The recent detector advances (the Multi-Grid and the Multi-Blade prototypes) developed in the framework of the collaboration between the Institut Laue-Langevin (ILL) and ESS are presented in this paper. In particular, two novel 10B-based detectors are described: one for large-area applications (the Multi-Grid prototype) and one for application in neutron reflectometry (small-area applications, the Multi-Blade prototype).

  4. Short communication: On recognizing the proper experimental unit in animal studies in the dairy sciences.

    PubMed

    Bello, Nora M; Kramer, Matthew; Tempelman, Robert J; Stroup, Walter W; St-Pierre, Normand R; Craig, Bruce A; Young, Linda J; Gbur, Edward E

    2016-11-01

    Sound design of experiments combined with proper implementation of appropriate statistical methods for data analysis are critical for producing meaningful scientific results that are both replicable and reproducible. This communication addresses specific aspects of design and analysis of experiments relevant to the dairy sciences and, in so doing, responds to recent concerns raised in a letter to the editor of the Journal of Dairy Science regarding journal policy for research publications on pen-based animal studies. We further elaborate on points raised, rectify interpretation of important concepts, and show how aspects of statistical inference and elicitation of research conclusions are affected.

  5. Qualitative Quantitative and Experimental Concept Possession, Criteria for Identifying Conceptual Change in Science Education

    ERIC Educational Resources Information Center

    Lappi, Otto

    2013-01-01

    Students sometimes misunderstand or misinterpret scientific content because of persistent misconceptions that need to be overcome by science education--a learning process typically called conceptual change. The acquisition of scientific content matter thus requires a transformation of the initial knowledge-state of a common-sense picture of the…

  6. Getting "What Works" Working: Building Blocks for the Integration of Experimental and Improvement Science

    ERIC Educational Resources Information Center

    Peterson, Amelia

    2016-01-01

    As a systemic approach to improving educational practice through research, "What Works" has come under repeated challenge from alternative approaches, most recently that of improvement science. While "What Works" remains a dominant paradigm for centralized knowledge-building efforts, there is need to understand why this…

  7. Mathematics Through Science, Part III: An Experimental Approach to Functions. Teacher's Commentary. Revised Edition.

    ERIC Educational Resources Information Center

    Bolduc, Elroy J., Jr.; And Others

    The purpose of this project is to teach learning and understanding of mathematics at the ninth grade level through the use of science experiments. This part of the program contains significant amounts of material normally found in a beginning algebra class. The material should be found useful for classes in general mathematics as a preparation for…

  8. Critical need for family-based, quasi-experimental designs in integrating genetic and social science research.

    PubMed

    D'Onofrio, Brian M; Lahey, Benjamin B; Turkheimer, Eric; Lichtenstein, Paul

    2013-10-01

    Researchers have identified environmental risks that predict subsequent psychological and medical problems. Based on these correlational findings, researchers have developed and tested complex developmental models and have examined biological moderating factors (e.g., gene-environment interactions). In this context, we stress the critical need for researchers to use family-based, quasi-experimental designs when trying to integrate genetic and social science research involving environmental variables because these designs rigorously examine causal inferences by testing competing hypotheses. We argue that sibling comparison, offspring of twins or siblings, in vitro fertilization designs, and other genetically informed approaches play a unique role in bridging gaps between basic biological and social science research. We use studies on maternal smoking during pregnancy to exemplify these principles.

  9. Critical Need for Family-Based, Quasi-Experimental Designs in Integrating Genetic and Social Science Research

    PubMed Central

    Lahey, Benjamin B.; Turkheimer, Eric; Lichtenstein, Paul

    2013-01-01

    Researchers have identified environmental risks that predict subsequent psychological and medical problems. Based on these correlational findings, researchers have developed and tested complex developmental models and have examined biological moderating factors (e.g., gene–environment interactions). In this context, we stress the critical need for researchers to use family-based, quasi-experimental designs when trying to integrate genetic and social science research involving environmental variables because these designs rigorously examine causal inferences by testing competing hypotheses. We argue that sibling comparison, offspring of twins or siblings, in vitro fertilization designs, and other genetically informed approaches play a unique role in bridging gaps between basic biological and social science research. We use studies on maternal smoking during pregnancy to exemplify these principles. PMID:23927516

  10. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    SciTech Connect

    Box, Dennis

    2014-01-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGS) is well supported and made easier through the use of input flags and parameters.

  11. FIFE-Jobsub: a grid submission system for intensity frontier experiments at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis

    2014-06-01

    The Fermilab Intensity Frontier Experiments use an integrated submission system known as FIFE-jobsub, part of the FIFE (Fabric for Frontier Experiments) initiative, to submit batch jobs to the Open Science Grid. FIFE-jobsub eases the burden on experimenters by integrating data transfer and site selection details in an easy to use and well-documented format. FIFE-jobsub automates tedious details of maintaining grid proxies for the lifetime of the grid job. Data transfer is handled using the Intensity Frontier Data Handling Client (IFDHC) [1] tool suite, which facilitates selecting the appropriate data transfer method from many possibilities while protecting shared resources from overload. Chaining of job dependencies into Directed Acyclic Graphs (Condor DAGS) is well supported and made easier through the use of input flags and parameters.

  12. Parallel grid population

    DOEpatents

    Wald, Ingo; Ize, Santiago

    2015-07-28

    Parallel population of a grid with a plurality of objects using a plurality of processors. One example embodiment is a method for parallel population of a grid with a plurality of objects using a plurality of processors. The method includes a first act of dividing a grid into n distinct grid portions, where n is the number of processors available for populating the grid. The method also includes acts of dividing a plurality of objects into n distinct sets of objects, assigning a distinct set of objects to each processor such that each processor determines by which distinct grid portion(s) each object in its distinct set of objects is at least partially bounded, and assigning a distinct grid portion to each processor such that each processor populates its distinct grid portion with any objects that were previously determined to be at least partially bounded by its distinct grid portion.
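    The abstract above describes a two-phase scheme: objects are first examined in parallel to find which grid portions bound them, and the portions are then populated in parallel. The sketch below illustrates that scheme with 1-D intervals standing in for objects; it is an illustration of the idea under simplifying assumptions, not the patented implementation.

      # Illustration of the two-phase parallel population scheme, assuming
      # 1-D interval "objects" and a unit domain split into equal portions.
      from multiprocessing import Pool

      N_PORTIONS = 4                      # one grid portion per processor
      WIDTH = 1.0 / N_PORTIONS

      def portions_for_object(obj):
          """Phase 1: which grid portions does object (lo, hi) at least partly touch?"""
          idx, (lo, hi) = obj
          first = int(lo / WIDTH)
          last = min(int(hi / WIDTH), N_PORTIONS - 1)
          return [(p, idx) for p in range(first, last + 1)]

      def populate_portion(args):
          """Phase 2: each processor fills its own portion with its assigned objects."""
          portion, object_ids = args
          return portion, sorted(object_ids)

      if __name__ == "__main__":
          objects = list(enumerate([(0.05, 0.10), (0.20, 0.60), (0.45, 0.55), (0.70, 0.99)]))
          with Pool(N_PORTIONS) as pool:
              hits = pool.map(portions_for_object, objects)       # parallel phase 1
              assignment = {p: [] for p in range(N_PORTIONS)}
              for pairs in hits:
                  for p, idx in pairs:
                      assignment[p].append(idx)
              populated = dict(pool.map(populate_portion, assignment.items()))  # parallel phase 2
          print(populated)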

  13. SCE: Grid Environment for Scientific Computing

    NASA Astrophysics Data System (ADS)

    Xiao, Haili; Wu, Hong; Chi, Xuebin

    Over the last few years Grid computing has evolved into an innovative technology and gained increased commercial adoption. However, existing Grids do not have enough users for sustainable development in the long term. This paper proposes several suggestions to address this problem on the basis of long-term experience and careful analysis. The Scientific Computing Environment (SCE) in the Chinese Academy of Sciences is introduced as a completely new model and a feasible solution to this problem.

  14. Power Grid Defense Against Malicious Cascading Failure

    DTIC Science & Technology

    2014-05-01

    Power Grid Defense Against Malicious Cascading Failure. Paulo Shakarian, Dept. EECS and Network Science Center, U.S. Military Academy, West Point, NY. ...adversary looking to disrupt a power grid may look to target certain substations and sources of power generation to initiate a cascading failure that... graph and introduce the cascading failure game in which both the defender and attacker choose a subset of power stations such as to minimize (max

  15. Modeling of the charge-state separation at ITEP experimental facility for material science based on a Bernas ion source

    SciTech Connect

    Barminova, H. Y.; Saratovskyh, M. S.

    2016-02-15

    An experiment automation system is to be developed for the experimental facility for material science at ITEP, which is based on a Bernas ion source. The program CAMFT is assumed to be incorporated into the experiment automation program. CAMFT was developed to simulate the motion of intense charged-particle bunches in external magnetic fields of arbitrary geometry by means of an accurate solution of the particle equations of motion. The program allows bunch intensities of up to 10^10 ppb to be considered. Preliminary calculations were performed on the ITEP supercomputer. The results of the simulation of the beam pre-acceleration and the following turn in the magnetic field are presented for different initial conditions.

  16. Distributed data mining on grids: services, tools, and applications.

    PubMed

    Cannataro, Mario; Congiusta, Antonio; Pugliese, Andrea; Talia, Domenico; Trunfio, Paolo

    2004-12-01

    Data mining algorithms are widely used today for the analysis of large corporate and scientific datasets stored in databases and data archives. Industry, science, and commerce fields often need to analyze very large datasets maintained over geographically distributed sites by using the computational power of distributed and parallel systems. The grid can play a significant role in providing an effective computational support for distributed knowledge discovery applications. For the development of data mining applications on grids we designed a system called Knowledge Grid. This paper describes the Knowledge Grid framework and presents the toolset provided by the Knowledge Grid for implementing distributed knowledge discovery. The paper discusses how to design and implement data mining applications by using the Knowledge Grid tools starting from searching grid resources, composing software and data components, and executing the resulting data mining process on a grid. Some performance results are also discussed.

  17. The word is the deed: the ideology of the research paper in experimental science.

    PubMed

    Rangachari, P K

    1994-12-01

    The research publication epitomizes the practice of contemporary science. This article emphasizes the underlying ideological basis and comments on the educational implications, particularly for graduate students. An attitudinal shift in the acquisitions of knowledge led to Henry Oldenburg's "invention" of the research article in the 17th century. Science was seen to be an open, cooperative activity, incremental in nature, with contributors building on previous work and submitting their work to scrutiny. Brief papers replaced weighty tomes. Subtle changes over the next century led to the current format. Ethnographic and textual analyses have shown that scientific facts are not revealed but constructed and that the research paper is carefully crafted to serve its twin functions, to inform and to persuade. Manufactured knowledge must be communicated and certified to preserve the communal nature of the investigative enterprise. Publication in a recognized forum fulfills that need. The word IS the deed.

  18. A Moving Grid Capability for NPARC

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    1998-01-01

    Version 3.1 of the NPARC computational fluid dynamics flow solver introduces a capability to solve unsteady flow on moving multi-block, structured grids with nominally second-order time accuracy. The grid motion is due to segments of the boundary grid that translate and rotate in a rigid-body manner or deform. The grid is regenerated at each time step to accommodate the boundary grid motion. The flow equations and computational models sense the moving grid through the grid velocities, which are computed from a time-difference of the grids at two consecutive time levels. For three-dimensional flow domains, it is assumed that the grid retains a planar character with respect to one coordinate. The application and accuracy of NPARC v3.1 are demonstrated for flow about a flying wedge, a rotating flap, a collapsing bump in a duct, and the unstart/restart flow in a variable-geometry inlet. The results compare well with analytic and experimental results.
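    The velocity computation mentioned above (a time-difference of the grids at two consecutive time levels) can be written in a few lines; the sketch below is an illustration of that first-order difference, not an excerpt from NPARC.

      # Grid-point velocities from grids at time levels n and n+1 (illustrative).
      import numpy as np

      def grid_velocity(x_old, x_new, dt):
          """First-order node velocities of a moving structured grid."""
          return (x_new - x_old) / dt

      # Example: a small 2-D block whose nodes translate 0.01 in x over dt = 1e-3.
      ni, nj = 5, 4
      x_old = np.stack(np.meshgrid(np.linspace(0, 1, ni), np.linspace(0, 1, nj),
                                   indexing="ij"), axis=-1)
      x_new = x_old + np.array([0.01, 0.0])
      print(grid_velocity(x_old, x_new, dt=1e-3)[0, 0])   # -> [10.  0.]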

  19. GNARE: an environment for Grid-based high-throughput genome analysis.

    SciTech Connect

    Sulakhe, D.; Rodriguez, A.; D'Souza, M.; Wilde, M.; Nefedova, V.; Foster, I.; Maltsev, N.; Mathematics and Computer Science; Univ. of Chicago

    2005-01-01

    Recent progress in genomics and experimental biology has brought exponential growth of the biological information available for computational analysis in public genomics databases. However, applying the potentially enormous scientific value of this information to the understanding of biological systems requires computing and data storage technology of an unprecedented scale. The grid, with its aggregated and distributed computational and storage infrastructure, offers an ideal platform for high-throughput bioinformatics analysis. To leverage this we have developed the Genome Analysis Research Environment (GNARE) - a scalable computational system for the high-throughput analysis of genomes, which provides an integrated database and computational backend for data-driven bioinformatics applications. GNARE efficiently automates the major steps of genome analysis, including acquisition of data from multiple genomic databases; data analysis by a diverse set of bioinformatics tools; and storage of results and annotations. High-throughput computations in GNARE are performed using distributed heterogeneous grid computing resources such as Grid2003, TeraGrid, and the DOE Science Grid. Multi-step genome analysis workflows involving massive data processing, the use of application-specific tools and algorithms, and updating of an integrated database to provide interactive Web access to results are all expressed and controlled by a 'virtual data' model which transparently maps computational workflows to distributed grid resources. This paper describes how Grid technologies such as Globus, Condor, and the GriPhyN virtual data system were applied in the development of GNARE. It focuses on our approach to Grid resource allocation and to the use of GNARE as a computational framework for the development of bioinformatics applications.

  20. Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses

    NASA Astrophysics Data System (ADS)

    Wieman, Carl

    2015-09-01

    Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and unsatisfying, and "cookbook." Here, both to help understand the reason for such discrepant views and to aid in the design of instructional lab courses, I compare the mental tasks or types of thinking ("cognitive task analysis") associated with a physicist doing tabletop experimental research with the cognitive tasks of students in an introductory physics instructional lab involving traditional verification/confirmation exercises.

  1. Dynamic Power Grid Simulation

    SciTech Connect

    Top, Philip; Woodward, Carol; Smith, Steve; Banks, Lawrence; Kelley, Brian

    2015-09-14

    GridDyn is part of a power grid simulation toolkit. The code is designed using modern object-oriented C++ methods, utilizing C++11 and recent Boost libraries to ensure compatibility with multiple operating systems and environments.

  2. Method of grid generation

    DOEpatents

    Barnette, Daniel W.

    2002-01-01

    The present invention provides a method of grid generation that uses the geometry of the problem space and the governing relations to generate a grid. The method can generate a grid with minimized discretization errors, and with minimal user interaction. The method of the present invention comprises assigning grid cell locations so that, when the governing relations are discretized using the grid, at least some of the discretization errors are substantially zero. Conventional grid generation is driven by the problem space geometry; grid generation according to the present invention is driven by problem space geometry and by governing relations. The present invention accordingly can provide two significant benefits: more efficient and accurate modeling since discretization errors are minimized, and reduced cost grid generation since less human interaction is required.
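    As a loosely related illustration of placing grid points where they reduce error (this is not the patented method, which ties point placement directly to the discretized governing relations), the sketch below adapts a 1-D grid by equidistributing a weight function, clustering nodes where the weight, here an estimated solution curvature, is large.

      # 1-D grid adaptation by equidistribution of a weight function
      # (an illustration of error-driven point placement, not the patented method).
      import numpy as np

      def equidistribute(weight, n_points, n_samples=2001):
          """Place n_points nodes on [0, 1] so that each cell carries equal weight."""
          s = np.linspace(0.0, 1.0, n_samples)
          w = weight(s)
          cumulative = np.concatenate(
              ([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(s))))
          targets = np.linspace(0.0, cumulative[-1], n_points)
          return np.interp(targets, cumulative, s)

      # Weight ~ |u''| + 1 for u(x) = tanh(20(x - 0.5)): nodes cluster in the sharp layer.
      def curvature(x):
          k = 20.0
          return np.abs(-2 * k**2 * np.tanh(k * (x - 0.5)) / np.cosh(k * (x - 0.5))**2) + 1.0

      print(np.round(equidistribute(curvature, n_points=21), 3))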

  3. System Development of an Experimental Rocket for a Launch Campaign Organized by The Association of Planete Sciences, France

    NASA Astrophysics Data System (ADS)

    Sasaki, Minoru; Nakano, Noriaki; Ohmayu, Satoru; Ogushi, Naoki

    This paper presents the system development of an experimental rocket for a launch campaign organized by the Association of Planete Sciences in France (http://www.planete-sciences.org). A two-stage experimental rocket was developed by 'Space Club Gifu' and the principal author's laboratory at Gifu University. It incorporates GPS, acceleration and pressure sensors as well as two cameras, one omni-directional. The goals of our experiment are as follows: 1. Constant video monitoring of motor combustion and activity during launch and flight. 2. Acquisition of accelerometer, pressure and GPS data for comparison with simulated results. 3. Development of a new mechanism for stage separation in order to build a future vehicle with two rocket motors. 4. The launch and return of a quasi-satellite to a pre-selected location using GPS data. The rocket was launched successfully at La Courtine, France, on 1 August 2007, but unfortunately the first stage could not be recovered; it was lost along with the video footage of the rocket motor burn. However, the second stage and the quasi-satellite were safely recovered. This project provides excellent training for engineering students in the fundamentals of engineering design and manufacturing.

  4. Review. Establishing an experimental science of culture: animal social diffusion experiments.

    PubMed

    Whiten, Andrew; Mesoudi, Alex

    2008-11-12

    A growing set of observational studies documenting putative cultural variations in wild animal populations has been complemented by experimental studies that can more rigorously distinguish between social and individual learning. However, these experiments typically examine only what one animal learns from another. Since the spread of culture is inherently a group-level phenomenon, greater validity can be achieved through 'diffusion experiments', in which founder behaviours are experimentally manipulated and their spread across multiple individuals tested. Here we review the existing corpus of 33 such studies in fishes, birds, rodents and primates and offer the first systematic analysis of the diversity of experimental designs that have arisen. We distinguish three main transmission designs and seven different experimental/control approaches, generating an array with 21 possible cells, 15 of which are currently represented by published studies. Most but not all of the adequately controlled diffusion experiments have provided robust evidence for cultural transmission in at least some taxa, with transmission spreading across populations of up to 24 individuals and along chains of up to 14 transmission events. We survey the achievements of this work, its prospects for the future and its relationship to diffusion studies with humans discussed in this theme issue and elsewhere.

  5. Virtual and Physical Experimentation in Inquiry-Based Science Labs: Attitudes, Performance and Access

    ERIC Educational Resources Information Center

    Pyatt, Kevin; Sims, Rod

    2012-01-01

    This study investigated the learning dimensions that occur in physical and virtual inquiry-based lab investigations, in first-year secondary chemistry classes. This study took place over a 2 year period and utilized an experimental crossover design which consisted of two separate trials of laboratory investigation. Assessment data and attitudinal…

  6. Comparative Cognitive Task Analyses of Experimental Science and Instructional Laboratory Courses

    ERIC Educational Resources Information Center

    Wieman, Carl

    2015-01-01

    Undergraduate instructional labs in physics generate intense opinions. Their advocates are passionate as to their importance for teaching physics as an experimental activity and providing "hands-on" learning experiences, while their detractors (often but not entirely students) offer harsh criticisms that they are pointless, confusing and…

  7. Parallel unstructured grid generation

    NASA Technical Reports Server (NTRS)

    Loehner, Rainald; Camberos, Jose; Merriam, Marshal

    1991-01-01

    A parallel unstructured grid generation algorithm is presented and implemented on the Hypercube. Different processor hierarchies are discussed, and the appropriate hierarchies for mesh generation and mesh smoothing are selected. A domain-splitting algorithm for unstructured grids which tries to minimize the surface-to-volume ratio of each subdomain is described. This splitting algorithm is employed both for grid generation and grid smoothing. Results obtained on the Hypercube demonstrate the effectiveness of the algorithms developed.
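    One common way to split a domain while keeping the surface-to-volume ratio of each subdomain low is recursive coordinate bisection, sketched below; this is offered as a generic illustration and is not necessarily the splitting algorithm used in the paper above.

      # Recursive coordinate bisection: split a point set into compact,
      # roughly equal subdomains (generic illustration, not the paper's algorithm).
      import numpy as np

      def recursive_bisection(points, n_parts):
          if n_parts == 1:
              return [points]
          # Cut along the coordinate direction with the largest extent.
          axis = np.argmax(points.max(axis=0) - points.min(axis=0))
          order = np.argsort(points[:, axis])
          left_parts = n_parts // 2
          split = len(points) * left_parts // n_parts
          left, right = points[order[:split]], points[order[split:]]
          return (recursive_bisection(left, left_parts)
                  + recursive_bisection(right, n_parts - left_parts))

      rng = np.random.default_rng(0)
      vertices = rng.random((1000, 2))          # e.g. mesh vertex coordinates
      parts = recursive_bisection(vertices, 8)
      print([len(p) for p in parts])            # eight roughly equal subdomains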

  8. IPG Power Grid Overview

    NASA Technical Reports Server (NTRS)

    Hinke, Thomas

    2003-01-01

    This presentation will describe what is meant by grids and then cover the current state of the IPG. This will include an overview of the middleware that is key to the operation of the grid. The presentation will then describe some of the future directions that are planned for the IPG. Finally the presentation will conclude with a brief overview of the Global Grid Forum, which is a key activity that will contribute to the successful availability of grid components.

  9. Collar grids for intersecting geometric components within the Chimera overlapped grid scheme

    NASA Technical Reports Server (NTRS)

    Parks, Steven J.; Buning, Pieter G.; Chan, William M.; Steger, Joseph L.

    1991-01-01

    A method for overcoming problems with using the Chimera overset grid scheme in the region of intersecting geometry components is presented. A 'collar grid' resolves the intersection region and provides communication between the component grids. This approach is validated by comparing computed and experimental data for a flow about a wing/body configuration. Application of the collar grid scheme to the Orbiter fuselage and vertical tail intersection in a computation of the full Space Shuttle launch vehicle demonstrates its usefulness for simulation of flow about complex aerospace vehicles.

  10. AstroGrid-PL

    NASA Astrophysics Data System (ADS)

    Stachowski, Greg; Kundera, Tomasz; Ciecielag, Paweł; AstroGridPL Team

    2016-06-01

    We summarise the achievements of the AstroGrid-PL project, which aims to provide grid computing infrastructure, distributed storage and Virtual Observatory services to the Polish astronomical community. It was developed from 2011 to 2015 as a domain grid component within the larger PLGrid Plus project for scientific computing in Poland.

  11. GridKit

    SciTech Connect

    Peles, Slaven

    2016-11-06

    GridKit is a software development kit for interfacing power systems and power grid application software with high performance computing (HPC) libraries developed at national laboratories and in academia. It is also intended as an interoperability layer between different numerical libraries. GridKit is not a standalone application, but comes with a suite of test examples illustrating possible usage.

  12. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  13. e-Science and its implications.

    PubMed

    Hey, Tony; Trefethen, Anne

    2003-08-15

    After a definition of e-science and the Grid, the paper begins with an overview of the technological context of Grid developments. NASA's Information Power Grid is described as an early example of a 'prototype production Grid'. The discussion of e-science and the Grid is then set in the context of the UK e-Science Programme and is illustrated with reference to some UK e-science projects in science, engineering and medicine. The Open Standards approach to Grid middleware adopted by the community in the Global Grid Forum is described and compared with community-based standardization processes used for the Internet, MPI, Linux and the Web. Some implications of the imminent data deluge that will arise from the new generation of e-science experiments in terms of archiving and curation are then considered. The paper concludes with remarks about social and technological issues posed by Grid-enabled 'collaboratories' in both scientific and commercial contexts.

  14. Overview: Homogeneous nucleation from the vapor phase—The experimental science

    NASA Astrophysics Data System (ADS)

    Wyslouzil, Barbara E.; Wölk, Judith

    2016-12-01

    Homogeneous nucleation from the vapor phase has been a well-defined area of research for ˜120 yr. In this paper, we present an overview of the key experimental and theoretical developments that have made it possible to address some of the fundamental questions first delineated and investigated in C. T. R. Wilson's pioneering paper of 1897 [C. T. R. Wilson, Philos. Trans. R. Soc., A 189, 265-307 (1897)]. We review the principles behind the standard experimental techniques currently used to measure isothermal nucleation rates, and discuss the molecular level information that can be extracted from these measurements. We then highlight recent approaches that interrogate the vapor and intermediate clusters leading to particle formation, more directly.

  15. An Experimental Approach to Determine the Flight Dynamics of NASA’s Mars Science Lab Capsule

    DTIC Science & Technology

    2014-01-01

    Figure 25: 178-mm HARP gun and experimental setup. ...and ARL worked together to develop an instrumented model to be fired from the 178-mm-diameter (7-in) High Altitude Research Project (HARP) gun. The ...maximum diameter was 178 mm with the sabot included, but the actual subscale model diameter was 171 mm. The 178-mm gun was built for HARP, which

  16. Experimental education of Astronomy across the seedbeds of investigation in sciences

    NASA Astrophysics Data System (ADS)

    Taborda, E.

    2009-05-01

    In Colombia, the geographic situation helps us when carrying out academic work in astronomical observation, because it offers the opportunity to see almost the entire night sky of both the northern and southern hemispheres in a single night. This makes our work as educators easier and allows astronomy and its related sciences to help students learn and socialize in fundamental areas such as mathematics, physics, chemistry, biology, art, technology, geography and history, among others. In our presentation we will show the results of three years of this descriptive study, carried out with primary and high school students. We need financial help to support attendance at this event.

  17. Large-Scale Experimental Planetary Science Meets Planetary Defense: Deorbiting an Asteroidal Satellite

    NASA Technical Reports Server (NTRS)

    Cintala, M. J.; Durda, D. D.; Housen, K. R.

    2005-01-01

    Other than remote-sensing and spacecraft-derived data, the only information that exists regarding the physical and chemical properties of asteroids is that inferred through calculations, numerical simulations, extrapolation of experiments, and meteorite studies. Our understanding of the dynamics of accretion of planetesimals, collisional disruption of asteroids, and the macroscopic, shock-induced modification of the surfaces of such small objects is also, for the most part, founded on similar inferences. While considerable strides have been made in improving the state of asteroid science, too many unknowns remain to assert that we understand the parameters necessary for the more practical problem of deflecting an asteroid or asteroid pair on an Earth-intersecting trajectory. Many of these deficiencies could be reduced or eliminated by intentionally deorbiting an asteroidal satellite and monitoring the resulting collision between it and the primary asteroid, a capability that is well within the limitations of current technology.

  18. DOE SciDAC’s Earth System Grid Center for Enabling Technologies Final Report for University of Southern California Information Sciences Institute

    SciTech Connect

    Chervenak, Ann Louise

    2013-12-19

    The mission of the Earth System Grid Federation (ESGF) is to provide the worldwide climate-research community with access to the data, information, model codes, analysis tools, and intercomparison capabilities required to make sense of enormous climate data sets. Its specific goals are to (1) provide an easy-to-use and secure web-based data access environment for data sets; (2) add value to individual data sets by presenting them in the context of other data sets and tools for comparative analysis; (3) address the specific requirements of participating organizations with respect to bandwidth, access restrictions, and replication; (4) ensure that the data are readily accessible through the analysis and visualization tools used by the climate research community; and (5) transfer infrastructure advances to other domain areas. For the ESGF, the U.S. Department of Energy’s (DOE’s) Earth System Grid Center for Enabling Technologies (ESG-CET) team has led international development and delivered a production environment for managing and accessing ultra-scale climate data. This production environment includes multiple national and international climate projects (such as the Community Earth System Model and the Coupled Model Intercomparison Project), ocean model data (such as the Parallel Ocean Program), observation data (Atmospheric Radiation Measurement Best Estimate, Carbon Dioxide Information and Analysis Center, Atmospheric Infrared Sounder, etc.), and analysis and visualization tools, all serving a diverse user community. These data holdings and services are distributed across multiple ESG-CET sites (such as ANL, LANL, LBNL/NERSC, LLNL/PCMDI, NCAR, and ORNL) and at unfunded partner sites, such as the Australian National University National Computational Infrastructure, the British Atmospheric Data Centre, the National Oceanic and Atmospheric Administration Geophysical Fluid Dynamics Laboratory, the Max Planck Institute for Meteorology, the German Climate Computing

  19. Toilets and the Smart Grid: A role for history and art in communicating assessed science for Earth—The Operators' Manual

    NASA Astrophysics Data System (ADS)

    Alley, R. B.; Haines-Stiles, G.; Akuginow, E.

    2010-12-01

    Assessed science consistently shows that an economically efficient response to global warming would begin now, with the likelihood of side benefits including increased employment, security, and environmental quality. This result has been obtained consistently for many years, yet societal responses over this time have fallen well short of the economically efficient path, suggesting that society is being strongly influenced by additional considerations. First-hand experience indicates that many people, including many policy-makers, “know” global-warming “science” that did not come from the scientific assessment bodies or their participating scientists. Instead, this supposedly supporting science was provided by opponents of actions to deal with global warming, and was designed to be inaccurate and easily defeated (e.g., “All of global warming theory rests on the correlation between CO2 and temperature”, or “…rests on the hockey stick.”) A useful discussion of possible wise responses to the problem is difficult when so much that many people “know” just isn’t so. The inaccurate information has been presented very effectively, but we believe that accurate information can be presented even more effectively, honestly showing the costs and benefits of efficient response while explicitly addressing the widespread misconceptions. The history of previous environmental issues offers one path forward, with denial preceding solutions in such diverse cases as the San Francisco earthquake and toilets in Edinburgh. We will provide first-hand reports from preparation of an NSF Informal Science Education-funded project, Earth—The Operators’ Manual.

  20. National power grid simulation capability : need and issues

    SciTech Connect

    Petri, Mark C.

    2009-06-02

    On December 9 and 10, 2008, the Department of Homeland Security (DHS) Science and Technology Directorate sponsored a national workshop at Argonne National Laboratory to explore the need for a comprehensive modeling and simulation capability for the national electric power grid system. The workshop brought together leading electric power grid experts from federal agencies, the national laboratories, and academia to discuss the current state of power grid science and engineering and to assess if important challenges are being met. The workshop helped delineate gaps between grid needs and current capabilities and identify issues that must be addressed if a solution is to be implemented. This report is a result of the workshop and highlights power grid modeling and simulation needs, the barriers that must be overcome to address them, and the benefits of a national power grid simulation capability.

  1. Research Exemption/Experimental Use in the European Union: Patents Do Not Block the Progress of Science

    PubMed Central

    Jaenichen, Hans-Rainer; Pitz, Johann

    2015-01-01

    In the public debate about patents, specifically in the area of biotechnology, the position has been taken that patents block the progress of science. As we demonstrate in this review, this is not the case in the European Union (EU). The national patent acts of the EU member states define research and experimental use exemptions from patent infringement that allow sufficient room for research activities to promote innovation. This review provides a comparative overview of the legal requirements and the extent and limitations of experimental use exemptions, including the so-called Bolar provision, in Germany, the United Kingdom, France, Spain, Italy, and The Netherlands. The legal framework in the respective countries is illustrated with reference to practical examples concerning tests on patent-protected genetic targets and antibodies. Specific questions concerning the use of patent-protected research tools, the outsourcing of research activities, and the use of preparatory and supplying acts for experimental purposes that are necessary for conducting experiments are covered. PMID:25377145

  2. Research exemption/experimental use in the European Union: patents do not block the progress of science.

    PubMed

    Jaenichen, Hans-Rainer; Pitz, Johann

    2014-11-06

    In the public debate about patents, specifically in the area of biotechnology, the position has been taken that patents block the progress of science. As we demonstrate in this review, this is not the case in the European Union (EU). The national patent acts of the EU member states define research and experimental use exemptions from patent infringement that allow sufficient room for research activities to promote innovation. This review provides a comparative overview of the legal requirements and the extent and limitations of experimental use exemptions, including the so-called Bolar provision, in Germany, the United Kingdom, France, Spain, Italy, and The Netherlands. The legal framework in the respective countries is illustrated with reference to practical examples concerning tests on patent-protected genetic targets and antibodies. Specific questions concerning the use of patent-protected research tools, the outsourcing of research activities, and the use of preparatory and supplying acts for experimental purposes that are necessary for conducting experiments are covered.

  3. Which grids are Hamiltonian

    SciTech Connect

    Hedetniemi, S. M.; Hedetniemi, S. T.; Slater, P. J.

    1980-01-01

    A complete grid G_{m,n} is a graph having m x n vertices that are connected to form a rectangular lattice in the plane, i.e., all edges of G_{m,n} connect vertices along horizontal or vertical lines. A grid is a subgraph of a complete grid. As an illustration, complete grids describe the basic pattern of streets in most cities. This paper examines the existence of Hamiltonian cycles in complete grids and in complete grids with one or two vertices removed. It is determined, for most values of m, n greater than or equal to 1, which grids G_{m,n} - {u} and G_{m,n} - {u,v} are Hamiltonian. 12 figures. (RWR)
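    For small m and n the question can be settled by brute force. The backtracking sketch below (illustrative only, exponential in m*n) searches for a Hamiltonian cycle in the complete grid G_{m,n}; because grid graphs are bipartite, such a cycle can exist only when m*n is even.

      # Brute-force Hamiltonian-cycle test for the complete grid G_{m,n}
      # (illustrative; only practical for small grids).
      def grid_has_hamiltonian_cycle(m, n):
          total = m * n
          if total < 4:
              return False
          start = (0, 0)
          visited = {start}

          def neighbors(v):
              i, j = v
              for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  ni, nj = i + di, j + dj
                  if 0 <= ni < m and 0 <= nj < n:
                      yield (ni, nj)

          def extend(v, count):
              if count == total:
                  return start in neighbors(v)      # close the cycle back to the start
              for w in neighbors(v):
                  if w not in visited:
                      visited.add(w)
                      if extend(w, count + 1):
                          return True
                      visited.remove(w)
              return False

          return extend(start, 1)

      for m in range(2, 5):
          for n in range(2, 5):
              print(m, n, grid_has_hamiltonian_cycle(m, n))   # True exactly when m*n is even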

  4. The role of experimental science in ICF -- examples from X-ray diagnostics and targets

    NASA Astrophysics Data System (ADS)

    Kilkenny, J. D.

    2016-10-01

    The U.S. Inertial Confinement Fusion (ICF) Program evolved from the Nuclear Test Program, in which restricted shot opportunities limited experimentalists' ability to develop sophisticated experimental techniques. In contrast, the ICF program in the US was able to increase shot availability on its large facilities and to develop sophisticated targets and diagnostics to measure and understand the properties of the high-energy-density plasmas (HEDP) formed. Illustrative aspects of this evolution at Lawrence Livermore National Laboratory (LLNL), with examples of the development of diagnostics and target fabrication, are described.

  5. Understanding The Smart Grid

    SciTech Connect

    2007-11-15

    The report provides an overview of what the Smart Grid is and what is being done to define and implement it. The electric industry is preparing to undergo a transition from a centralized, producer-controlled network to a decentralized, user-interactive one. Not only will the technology involved in the electric grid change, but the entire business model of the industry will change too. A major objective of the report is to identify the changes that the Smart Grid will bring about so that industry participants can be prepared to face them. A concise overview of the development of the Smart Grid is provided. It presents an understanding of what the Smart Grid is, what new business opportunities or risks might come about due to its introduction, and what activities are already taking place regarding defining or implementing the Smart Grid. This report will be of interest to the utility industry, energy service providers, aggregators, and regulators. It will also be of interest to home/building automation vendors, information technology vendors, academics, consultants, and analysts. The scope of the report includes an overview of the Smart Grid which identifies the main components of the Smart Grid, describes its characteristics, and describes how the Smart Grid differs from the current electric grid. The overview also identifies the key concepts involved in the transition to the Smart Grid and explains why a Smart Grid is needed by identifying the deficiencies of the current grid and the need for new investment. The report also looks at the impact of the Smart Grid, identifying other industries which have gone through a similar transition, identifying the overall benefits of the Smart Grid, and discussing the impact of the Smart Grid on industry participants. Furthermore, the report looks at current activities to implement the Smart Grid including utility projects, industry collaborations, and government initiatives. Finally, the report takes a look at key technology

  6. Experimental pain processing in individuals with cognitive impairment: current state of the science.

    PubMed

    Defrin, Ruth; Amanzio, Martina; de Tommaso, Marina; Dimova, Violeta; Filipovic, Sasa; Finn, David P; Gimenez-Llort, Lydia; Invitto, Sara; Jensen-Dahm, Christina; Lautenbacher, Stefan; Oosterman, Joukje M; Petrini, Laura; Pick, Chaim G; Pickering, Gisele; Vase, Lene; Kunz, Miriam

    2015-08-01

    Cognitive impairment (CI) can develop during the course of ageing and is a feature of many neurological and neurodegenerative diseases. Many individuals with CI have substantial, sustained, and complex health care needs, which frequently include pain. However, individuals with CI can have difficulty communicating the features of their pain to others, which in turn presents a significant challenge for effective diagnosis and treatment of their pain. Herein, we review the literature on responsivity of individuals with CI to experimental pain stimuli. We discuss pain responding across a large number of neurological and neurodegenerative disorders in which CI is typically present. Overall, the existing data suggest that pain processing is altered in most individuals with CI compared with cognitively intact matched controls. The precise nature of these alterations varies with the type of CI (or associated clinical condition) and may also depend on the type of pain stimulation used and the type of pain responses assessed. Nevertheless, it is clear that regardless of the etiology of CI, patients do feel noxious stimuli, with more evidence for hypersensitivity than hyposensitivity to these stimuli compared with cognitively unimpaired individuals. Our current understanding of the neurobiological mechanisms underpinning these alterations is limited but may be enhanced through the use of animal models of CI, which also exhibit alterations in nociceptive responding. Further research using additional behavioural indices of pain is warranted. Increased understanding of altered experimental pain processing in CI will facilitate the development of improved diagnostic and therapeutic approaches for pain in individuals with CI.

  7. Virtual and Physical Experimentation in Inquiry-Based Science Labs: Attitudes, Performance and Access

    NASA Astrophysics Data System (ADS)

    Pyatt, Kevin; Sims, Rod

    2012-02-01

    This study investigated the learning dimensions that occur in physical and virtual inquiry-based lab investigations in first-year secondary chemistry classes. The study took place over a 2-year period and utilized an experimental crossover design consisting of two separate trials of laboratory investigation. Assessment data and attitudinal data were gathered and analyzed to measure the instructional value of physical and virtual lab experiences in terms of student performance and attitudes. Statistical tests for differences of means were conducted on the assessment data. Student attitudes towards virtual experiences in comparison to physical lab experiences were measured using a newly created Virtual and Physical Experimentation Questionnaire (VPEQ). The VPEQ was developed specifically for this study and included new scales of Usefulness of Lab and Equipment Usability, which measured attitudinal dimensions in virtual and physical lab experiences. A factor analysis was conducted for the questionnaire data, and reliability of the scales and internal consistency of items within scales were calculated. The new scales were statistically valid and reliable. The instructional value of physical and virtual lab experiences was comparable in terms of student performance. Students showed preference towards the virtual medium in their lab experiences. Students showed positive attitudes towards physical and virtual experiences, and demonstrated a preference towards inquiry-based experiences, physical or virtual. Students found virtual experiences to have higher equipment usability as well as a higher degree of open-endedness. With regard to student access to inquiry-based lab experiences, virtual and online alternatives were viewed favorably by students.

  8. Navigation in Grid Space with the NAS Grid Benchmarks

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We present a navigational tool for computational grids. The navigational process is based on measuring the grid characteristics with the NAS Grid Benchmarks (NGB) and using the measurements to assign tasks of a grid application to the grid machines. The tool allows the user to explore the grid space and to navigate the execution of a grid application to minimize its turnaround time. We introduce the notion of gridscape as a user view of the grid and show how it can be measured by NGB. Then we demonstrate how the gridscape can be used with two different schedulers to navigate a grid application through a rudimentary grid.
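    The abstract does not spell out how the gridscape measurements drive task placement; as a hedged sketch (all names here are hypothetical, not part of the NGB tooling), one simple scheduler of this kind assigns each task to the machine whose benchmark-predicted finish time is earliest:

```python
def assign_tasks(tasks, machines, gridscape):
    """Greedy placement: gridscape[m][t] is the benchmark-derived estimate of
    how long task t takes on machine m. Each task goes to the machine with the
    earliest predicted finish time, approximating minimum turnaround."""
    finish = {m: 0.0 for m in machines}
    plan = {}
    for t in tasks:
        best = min(machines, key=lambda m: finish[m] + gridscape[m][t])
        plan[t] = best
        finish[best] += gridscape[best][t]
    return plan
```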

  9. Large-eddy simulation of wind turbine wake interactions on locally refined Cartesian grids

    NASA Astrophysics Data System (ADS)

    Angelidis, Dionysios; Sotiropoulos, Fotis

    2014-11-01

    Performing high-fidelity numerical simulations of turbulent flow in wind farms remains a challenging issue mainly because of the large computational resources required to accurately simulate the turbine wakes and turbine/turbine interactions. The discretization of the governing equations on structured grids for mesoscale calculations may not be the most efficient approach for resolving the large disparity of spatial scales. A 3D Cartesian grid refinement method enabling the efficient coupling of the Actuator Line Model (ALM) with locally refined unstructured Cartesian grids adapted to accurately resolve tip vortices and multi-turbine interactions, is presented. Second order schemes are employed for the discretization of the incompressible Navier-Stokes equations in a hybrid staggered/non-staggered formulation coupled with a fractional step method that ensures the satisfaction of local mass conservation to machine zero. The current approach enables multi-resolution LES of turbulent flow in multi-turbine wind farms. The numerical simulations are in good agreement with experimental measurements and are able to resolve the rich dynamics of turbine wakes on grids containing only a small fraction of the grid nodes that would be required in simulations without local mesh refinement. This material is based upon work supported by the Department of Energy under Award Number DE-EE0005482 and the National Science Foundation under Award number NSF PFI:BIC 1318201.

  10. Grid enabled Service Support Environment - SSE Grid

    NASA Astrophysics Data System (ADS)

    Goor, Erwin; Paepen, Martine

    2010-05-01

    The SSEGrid project is an ESA/ESRIN project which started in 2009 and is executed by two Belgian companies, Spacebel and VITO, and one Dutch company, Dutch Space. The main project objectives are the introduction of a Grid-based processing on demand infrastructure at the Image Processing Centre for earth observation products at VITO and the inclusion of Grid processing services in the Service Support Environment (SSE) at ESRIN. The Grid-based processing on demand infrastructure is meant to support a Grid processing on demand model for Principal Investigators (PI) and allow the design and execution of multi-sensor applications with geographically spread data while minimising the transfer of huge volumes of data. In the first scenario, 'support a Grid processing on demand model for Principal Investigators', we aim to provide processing power close to the EO-data at the processing and archiving centres. We will allow a PI (non-Grid expert user) to upload his own algorithm, as a process, and his own auxiliary data from the SSE Portal and use them in an earth observation workflow on the SSEGrid Infrastructure. The PI can design and submit workflows using his own processes, processes made available by VITO/ESRIN and possibly processes from other users that are available on the Grid. These activities must be user-friendly and must not require detailed knowledge about the underlying Grid middleware. In the second scenario we aim to design, implement and demonstrate a methodology to set up an earth observation processing facility, which uses large volumes of data from various geographically spread sensors. The aim is to provide solutions for problems that we face today, like wasting bandwidth by copying large volumes of data to one location. We will avoid this by processing the data where they are. The multi-mission Grid-based processing on demand infrastructure will allow developing and executing complex and massive multi-sensor data (re-)processing applications more

  11. Science wars—How much risk should soldiers be exposed to in military experimentation?

    PubMed Central

    Savulescu, Julian

    2015-01-01

    With the threat of biological war becoming a more and more distinct possibility, there is a growing need for vaccines and cures for diseases. As warfare moves from the battlefield to the laboratory, the military must adapt its tactics in order to preserve national security. At the moment, soldiers consent to the risk associated with combat, but with the changing nature of war, the need may arise for soldiers to put themselves at risk not only through combat, but also through scientific experimentation, in order to produce vaccines or cures and ultimately maintain national security. By allowing soldiers to trade risk on the battlefield with risk in the laboratory, deeper research can be made into diseases and biological agents, and this would therefore lessen the threat of biological war or terrorism. PMID:27774185

  12. Experimental studies in fluid mechanics and materials science using acoustic levitation

    NASA Technical Reports Server (NTRS)

    Trinh, E. H.; Robey, J.; Arce, A.; Gaspar, M.

    1987-01-01

    Ground-based and short-duration low gravity experiments have been carried out with the use of ultrasonic levitators to study the dynamics of freely suspended liquid drops under the influence of predominantly capillary and acoustic radiation forces. Some of the effects of the levitating field on the shape as well as the fluid flow fields within the drop have been determined. The development and refinement of measurement techniques using levitated drops with size on the order of 2 mm in diameter have yielded methods having direct application to experiments in microgravity. In addition, containerless melting, undercooling, and freezing of organic materials as well as low melting metals have provided experimental data and observations on the application of acoustic positioning techniques to materials studies.

  13. Political science. Reverse-engineering censorship in China: randomized experimentation and participant observation.

    PubMed

    King, Gary; Pan, Jennifer; Roberts, Margaret E

    2014-08-22

    Existing research on the extensive Chinese censorship organization uses observational methods with well-known limitations. We conducted the first large-scale experimental study of censorship by creating accounts on numerous social media sites, randomly submitting different texts, and observing from a worldwide network of computers which texts were censored and which were not. We also supplemented interviews with confidential sources by creating our own social media site, contracting with Chinese firms to install the same censoring technologies as existing sites, and--with their software, documentation, and even customer support--reverse-engineering how it all works. Our results offer rigorous support for the recent hypothesis that criticisms of the state, its leaders, and their policies are published, whereas posts about real-world events with collective action potential are censored.

  14. Corrosion chemistry closing comments: opportunities in corrosion science facilitated by operando experimental characterization combined with multi-scale computational modelling.

    PubMed

    Scully, John R

    2015-01-01

    Recent advances in characterization tools, computational capabilities, and theories have created opportunities for advancement in understanding of solid-fluid interfaces at the nanoscale in corroding metallic systems. The Faraday Discussion on Corrosion Chemistry in 2015 highlighted some of the current needs, gaps and opportunities in corrosion science. Themes were organized into several hierarchical categories that provide an organizational framework for corrosion. Opportunities to develop fundamental physical and chemical data which will enable further progress in thermodynamic and kinetic modelling of corrosion were discussed. These will enable new and better understanding of unit processes that govern corrosion at the nanoscale. Additional topics discussed included scales, films and oxides, fluid-surface and molecular-surface interactions, selected topics in corrosion science and engineering as well as corrosion control. Corrosion science and engineering topics included complex alloy dissolution, local corrosion, and modelling of specific corrosion processes that are made up of collections of temporally and spatially varying unit processes such as oxidation, ion transport, and competitive adsorption. Corrosion control and mitigation topics covered some new insights on coatings and inhibitors. Further advances in operando or in situ experimental characterization strategies at the nanoscale combined with computational modelling will enhance progress in the field, especially if coupling across length and time scales can be achieved incorporating the various phenomena encountered in corrosion. Readers are encouraged to not only to use this ad hoc organizational scheme to guide their immersion into the current opportunities in corrosion chemistry, but also to find value in the information presented in their own ways.

  15. Securing smart grid technology

    NASA Astrophysics Data System (ADS)

    Chaitanya Krishna, E.; Kosaleswara Reddy, T.; Reddy, M. YogaTeja; Reddy G. M., Sreerama; Madhusudhan, E.; AlMuhteb, Sulaiman

    2013-03-01

    In developing countries, electrical energy is very important for all-round development, since savings of thousands of dollars can be invested in other sectors. The existing hierarchical, centrally controlled grid of the 20th century is not sufficient for the growing demand for power. To produce and deliver effective power supply for industry and people, we need smarter electrical grids that address the challenges of the existing power grid. The smart grid can be considered a modern electric power grid infrastructure for enhanced efficiency and reliability through automated control, high-power converters, a modern communications infrastructure along with modern IT services, sensing and metering technologies, and modern energy management techniques based on the optimization of demand, energy and network availability, and so on. The main objective of this paper is to provide a contemporary look at the current state of the art in smart grid communications, as well as critical issues in smart grid technologies, primarily in terms of information and communication technology (ICT) issues such as security and efficiency at the communications layer. In this paper we propose a new model for security in smart grid technology that contains a Security Module (SM) along with DEM, which will enhance security in the grid. It is expected that this paper will provide a better understanding of the technologies, potential advantages and research challenges of the smart grid and provoke interest among the research community to further explore this promising research area.

  16. Multi-scale Laboratory Experimentation in Hydrologic Sciences- Challenges and Opportunities.

    NASA Astrophysics Data System (ADS)

    Illangasekare, T. H.

    2015-12-01

    Problems of water sustainability to meet the increasing needs of a growing world population, further exacerbated by climate change, will continually challenge hydrologists and other earth and environmental scientists. Significant theoretical, modeling, and computational advances, and technology developments for improved observations, monitoring, and characterization that have taken place during the last several decades have helped to meet some of these challenges. In parallel, field and laboratory studies for conceptualization, hypothesis testing, and model improvements have continued to advance hydrologic sciences. However, the data to study some of the problems in hydrology cannot always be obtained from field studies where many factors contribute to the uncertainty of measurements and parameter estimates. The primary thesis of this talk is that laboratory experiments conducted at multiple test scales will play an important role by providing new insights into complex processes and accurate data for model improvement, leading to increased accuracy and reliability of predictions. However, performing such controlled experiments poses many challenges such as acquiring data at different observational scales, capturing relevant features of geologic heterogeneity, mimicking field specific pressure and temperature dependent phase interactions in the laboratory, and simulating climate drivers, among others. Focusing on the subsurface and using examples from multiphase systems, coastal aquifer salinization, and land/atmospheric interactions, I will show how to design and implement theory-driven experiments to address some of these challenges. I will make the case that addressing problems in hydrology requires continuous interaction among laboratory and field studies and modeling. It is imperative that hydrologists work at the disciplinary interfaces related to earth, water, energy, and the environment to address current and emerging problems that are of global importance.

  17. Public judgment on science expenditure in the national budget of Japan: An experimental approach to examining the effects of unpacking science.

    PubMed

    Yokoyama, Hiromi M; Nakayachi, Kazuya

    2014-07-01

    How does the public assess an appropriate financial allocation to science promotion? This article empirically examined the subadditivity effect in the judgment of budgetary allocation. Results of the first experiment showed that the ratio of the national budget allocated for science promotion by participants increased when science was decomposed into more specific categories compared to when it was presented as "science promotion" alone. Consistent with these findings, results of the second experiment showed that the allotment ratio to science promotion decreased when the number of other expenditure items increased. Meanwhile, the third experiment revealed that in the case of a budgetary cutback, the total amount taken from science promotion greatly increased when science was decomposed into subcategories. The subadditivity effect and increase in the total allotment ratio by unpacking science promotion was confirmed by these three experiments not only on budgetary allocation but also on budgetary cutback.

  18. DICOM image communication in globus-based medical grids.

    PubMed

    Vossberg, Michal; Tolxdorff, Thomas; Krefting, Dagmar

    2008-03-01

    Grid computing, the collaboration of distributed resources across institutional borders, is an emerging technology to meet the rising demand on computing power and storage capacity in fields such as high-energy physics, climate modeling, or more recently, life sciences. A secure, reliable, and highly efficient data transport plays an integral role in such grid environments and even more so in medical grids. Unfortunately, many grid middleware distributions, such as the well-known Globus Toolkit, lack the integration of the world-wide medical image communication standard Digital Imaging and Communication in Medicine (DICOM). Currently, the DICOM protocol first needs to be converted to the file transfer protocol (FTP) that is offered by the grid middleware. This effectively reduces most of the advantages and security an integrated network of DICOM devices offers. In this paper, a solution is proposed that adapts the DICOM protocol to the Globus grid security infrastructure and utilizes routers to transparently route traffic to and from DICOM systems. Thus, all legacy DICOM devices can be seamlessly integrated into the grid without modifications. A prototype of the grid routers with the most important DICOM functionality has been developed and successfully tested in the MediGRID test bed, the German grid project for life sciences.

  19. Grid Application for the BaBar Experiment

    SciTech Connect

    Khan, A.; Wilson, F.; /Rutherford

    2006-08-14

    This paper discusses the use of e-Science Grid in providing computational resources for modern international High Energy Physics (HEP) experiments. We investigate the suitability of the current generation of Grid software to provide the necessary resources to perform large-scale simulation of the experiment and analysis of data in the context of multinational collaboration.

  20. ITIL and Grid services at GridKa

    NASA Astrophysics Data System (ADS)

    Marten, H.; Koenig, T.

    2010-04-01

    The Steinbuch Centre for Computing (SCC) is a new organizational unit of the Karlsruhe Institute of Technology (KIT). Founded in February 2008 as a merger of the previous Institute for Scientific Computing of Forschungszentrum Karlsruhe and the Computing Centre of the Technical University Karlsruhe, SCC provides a broad spectrum of IT services for 8,000 employees and 18,000 students and carries out research and development in key areas of information technology under the same roof. SCC is also known to host the German WLCG [1] Tier-1 centre GridKa. In order to accompany the merging of the two existing computing centres located at a distance of about 10 km and to provide common first-class services for science, SCC has selected the IT service management according to the industrial quasi-standard "IT Infrastructure Library (ITIL)" [3] as a strategic element. The paper discusses the implementation of a few ITIL key components from the perspective of a Scientific Computing Centre using examples of Grid services at GridKa.

  1. Grid computing and biomolecular simulation.

    PubMed

    Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W

    2005-08-15

    Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.
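    As a brief illustration of the Replica Exchange methodology mentioned above (the standard textbook criterion, not code from this project), the decision to swap configurations between two replicas held at different temperatures uses a Metropolis test:

```python
import math
import random

def exchange_accepted(beta_i, beta_j, energy_i, energy_j):
    """Standard replica-exchange Metropolis criterion: accept the swap of
    configurations between replicas at inverse temperatures beta_i and beta_j
    with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return delta >= 0.0 or random.random() < math.exp(delta)
```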

  2. Spatial Data Infrastructures and Grid Computing: the GDI-Grid project

    NASA Astrophysics Data System (ADS)

    Padberg, A.; Kiehle, C.

    2009-04-01

    Distribution of spatial data through standards-compliant spatial data infrastructures (SDI) is a fairly common practice nowadays. The Open Geospatial Consortium (OGC) offers a broad range of implementation specifications for accessing and presenting spatial data. In December 2007 the OGC published the Web Processing Service (WPS) specification for extending the capabilities of an SDI to include the processing of distributed data. By utilizing a WPS it is possible to shift the workload from the client to the server. Furthermore it is possible to create automated workflows that include data processing without the need for user interaction or manual computation of data via a desktop GIS. When complex processes are offered or large amounts of data are processed by a WPS, the computational power of the server might not suffice. Especially when such processes are invoked by a multitude of users, the server might not be able to provide the desired performance. In this case, Grid Computing is one way to provide the required computational power by accessing great quantities of worker nodes in an existing Grid infrastructure through a Grid middleware. Due to their respective origins, the paradigms of SDIs and Grid infrastructures differ significantly in several important matters. For instance, security is handled differently in the scope of OWS and Grid Computing. While the OGC does not yet specify a comprehensive security concept, strict security rules are a top priority in Grid Computing, where providers need a certain degree of control over their resources and users want to process sensitive data on external resources. To create an SDI that is able to benefit from the computational power and the vast storage capacities of a Grid infrastructure it is necessary to overcome all the conceptual differences between OWS and Grid Computing. GDI-Grid (English: SDI-Grid) is a project funded by the German Ministry for Science and Education. It aims at bridging the aforementioned gaps between

  3. Impingement-Current-Erosion Characteristics of Accelerator Grids on Two-Grid Ion Thrusters

    NASA Technical Reports Server (NTRS)

    Barker, Timothy

    1996-01-01

    Accelerator grid sputter erosion resulting from charge-exchange-ion impingement is considered to be a primary cause of failure for electrostatic ion thrusters. An experimental method was developed and implemented to measure erosion characteristics of ion-thruster accel-grids for two-grid systems as a function of beam current, accel-grid potential, and facility background pressure. Intricate accelerator grid erosion patterns, which are typically produced in a short time (a few hours), are shown. Accelerator grid volumetric and depth-erosion rates are calculated from these erosion patterns and reported for each of the parameters investigated. A simple theoretical volumetric erosion model yields results that are compared to experimental findings. Results from the model and experiments agree to within 10%, thereby verifying the testing technique. In general, the local distribution of erosion is concentrated in pits between three adjacent holes and trenches that join pits. The shapes of the pits and trenches are shown to be dependent upon operating conditions. Increases in beam current and the accel-grid voltage magnitude lead to deeper pits and trenches. Competing effects cause complex changes in depth-erosion rates as background pressure is increased. Shape factors that describe pits and trenches (i.e., the ratio of the average erosion width to the maximum possible width) are also affected in relatively complex ways by changes in beam current, accel-grid voltage magnitude, and background pressure. In all cases, however, gross volumetric erosion rates agree with theoretical predictions.

  4. Ambiguities in the grid-inefficiency correction for Frisch-Grid Ionization Chambers

    NASA Astrophysics Data System (ADS)

    Al-Adili, A.; Hambsch, F.-J.; Bencardino, R.; Oberstedt, S.; Pomp, S.

    2012-05-01

    Ionization chambers with Frisch grids have been very successfully applied to neutron-induced fission-fragment studies during the past 20 years. They are radiation resistant and can easily be adapted to the experimental conditions. The use of Frisch grids has the advantage of removing the angular dependency from the charge induced on the anode plate. However, due to the Grid Inefficiency (GI) in shielding the charges, the anode signal remains slightly angular dependent. The correction for the GI is, however, essential to determine the correct energy of the ionizing particles. GI corrections can amount to a few percent of the anode signal. Presently, two contradicting correction methods are considered in the literature. The first method adds the angular-dependent part of the signal to the signal pulse height; the second method subtracts the former from the latter. Both additive and subtractive approaches were investigated in an experiment in which a Twin Frisch-Grid Ionization Chamber (TFGIC) was employed to detect the spontaneous fission fragments (FF) emitted by a 252Cf source. Two parallel-wire grids with different wire spacing (1 and 2 mm, respectively) were used individually on the same chamber side. All the other experimental conditions were unchanged. The 2 mm grid featured more than double the GI of the 1 mm grid. The induced charge on the anode in both measurements was compared, before and after GI correction. Before GI correction, the 2 mm grid resulted in a lower pulse-height distribution than the 1 mm grid. After applying both GI corrections to both measurements, only the additive approach led to consistent, grid-independent pulse-height distributions. The application of the subtractive correction, on the contrary, led to inconsistent, grid-dependent results. It is also shown that the impact of either of the correction methods on the FF mass distributions of 235U(nth, f) is small.

  5. Understanding the Grid

    SciTech Connect

    2016-01-14

    The electric power grid has been rightly celebrated as the single most important engineering feat of the 20th century. The grid powers our homes, offices, hospitals, and schools; and, increasingly, it powers our favorite devices from smartphones to HDTVs. With those and other modern innovations and challenges, our grid will need to evolve. Grid modernization efforts will help the grid make full use of today’s advanced technologies and serve our needs in the 21st century. While the vast majority of upgrades are implemented by private sector energy companies that own and operate the grid, DOE has been investing in technologies that are revolutionizing the way we generate, store and transmit power.

  6. Enhanced Elliptic Grid Generation

    NASA Technical Reports Server (NTRS)

    Kaul, Upender K.

    2007-01-01

    An enhanced method of elliptic grid generation has been invented. Whereas prior methods require user input of certain grid parameters, this method provides for these parameters to be determined automatically. "Elliptic grid generation" signifies generation of generalized curvilinear coordinate grids through solution of elliptic partial differential equations (PDEs). Usually, such grids are fitted to bounding bodies and used in numerical solution of other PDEs like those of fluid flow, heat flow, and electromagnetics. Such a grid is smooth and has continuous first and second derivatives (and possibly also continuous higher-order derivatives), grid lines are appropriately stretched or clustered, and grid lines are orthogonal or nearly so over most of the grid domain. The source terms in the grid-generating PDEs (hereafter called "defining" PDEs) make it possible for the grid to satisfy requirements for clustering and orthogonality properties in the vicinity of specific surfaces in three dimensions or in the vicinity of specific lines in two dimensions. The grid parameters in question are decay parameters that appear in the source terms of the inhomogeneous defining PDEs. The decay parameters are characteristic lengths in exponential- decay factors that express how the influences of the boundaries decrease with distance from the boundaries. These terms govern the rates at which distance between adjacent grid lines change with distance from nearby boundaries. Heretofore, users have arbitrarily specified decay parameters. However, the characteristic lengths are coupled with the strengths of the source terms, such that arbitrary specification could lead to conflicts among parameter values. Moreover, the manual insertion of decay parameters is cumbersome for static grids and infeasible for dynamically changing grids. In the present method, manual insertion and user specification of decay parameters are neither required nor allowed. Instead, the decay parameters are
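    For orientation, a commonly used form of such defining PDEs and their exponential-decay source terms (a generic textbook form, not necessarily the exact formulation of the invention) is, in two dimensions,

```latex
\xi_{xx} + \xi_{yy} = P(\xi,\eta), \qquad \eta_{xx} + \eta_{yy} = Q(\xi,\eta),
\quad\text{with, e.g.,}\quad
P(\xi,\eta) = -\sum_i a_i\,\operatorname{sgn}(\xi-\xi_i)\,e^{-c_i\,|\xi-\xi_i|}
              -\sum_j b_j\,\operatorname{sgn}(\xi-\xi_j)\,
               e^{-d_j\sqrt{(\xi-\xi_j)^2+(\eta-\eta_j)^2}} ,
```

    where the c_i and d_j are the decay parameters in question: they set how quickly the influence of the attraction lines and points fades with distance, and in the enhanced method they are determined automatically rather than supplied by the user.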

  7. A grid amplifier

    NASA Technical Reports Server (NTRS)

    Kim, Moonil; Weikle, Robert M., II; Hacker, Jonathan B.; Delisio, Michael P.; Rutledge, David B.; Rosenberg, James J.; Smith, R. P.

    1991-01-01

    A 50-MESFET grid amplifier is reported that has a gain of 11 dB at 3.3 GHz. The grid isolates the input from the output by using vertical polarization for the input beam and horizontal polarization for the transmitted output beam. The grid unit cell is a two-MESFET differential amplifier. A simple calibration procedure allows the gain to be calculated from a relative power measurement. This grid is a hybrid circuit, but the structure is suitable for fabrication as a monolithic wafer-scale integrated circuit, particularly at millimeter wavelengths.

  8. Unstructured surface grid generation

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1993-01-01

    Viewgraphs on unstructured surface grid generation are presented. Topics covered include: requirements for curves, surfaces, solids, and text; surface approximation; triangulation; advancing; projection; mapping; and parametric curves.

  9. Grid generation for a complex aircraft configuration

    NASA Technical Reports Server (NTRS)

    Bruns, Jim

    1992-01-01

    The procedure used to create a grid around the F/A-18 fighter aircraft is presented. This work was done for the NASA High Alpha Technology Program. As part of this program, LeRC is numerically and experimentally investigating the flow in the F/A-18 inlet duct at high angles of attack. A grid was needed which could be used to calculate both the external and internal flow around the F/A-18. The grid had to be compatible with the computational fluid dynamics (CFD) codes PARC3D and CFL3D. The programs used to create this grid were I3GVIRGO and GRIDGEN. A surface definition used to create the grid was obtained from McDonnell Aircraft Company (MCAIR) and was composed of numerous files each containing a point definition of a portion of the aircraft. These files were read into the geometry manipulation program I3GVIRGO, where they were modified and grouped into smaller GRIDGEN database files. Next, the block outlines and boundary conditions were specified in the GRIDBLOCK program. The GRIDGEN2D program was used to create the surface grid on the block faces, and GRIDGEN3D was used to create the full 3-D grid.

  10. Systems Manual for the Experimental Literature Collection and Reference Retrieval System of the Center for the Information Sciences. Experimental Retrieval Systems Studies, Report Number 2.

    ERIC Educational Resources Information Center

    Anderson, Ronald R.; Taylor, Robert S.

    The manual describes and documents the retrieval system in terms of its tape and disk file programs and its search programs as used by the Lehigh Center for the Information Sciences for selected current literature of the information sciences, about 2500 document references. The system is presently on-line via teletype and conversion is in process…

  11. Spaceflight Operations Services Grid (SOSG)

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Thigpen, William W.

    2004-01-01

    environment that incorporates existing and new spaceflight services into a standards-based framework providing current and future NASA programs with cost savings and new and evolvable methods to conduct science. This project will demonstrate how the use of new programming paradigms such as web and grid services can provide three significant benefits to the cost-effective delivery of spaceflight services. They will enable applications to operate more efficiently by being able to utilize pooled resources. They will also permit the reuse of common services to rapidly construct new and more powerful applications. Finally, they will permit easy and secure access to services via a combination of grid and portal technology by a distributed user community consisting of NASA operations centers, scientists, the educational community and even the general population as outreach. The approach will be to deploy existing mission support applications such as the Telescience Resource Kit (TReK) and new applications under development, such as the Grid Video Distribution System (GViDS), together with existing grid applications and services such as high-performance computing and visualization services provided by NASA's Information Power Grid (IPG) in the MSFC's Payload Operations Integration Center (POIC) HOSC Annex. Once the initial applications have been moved to the grid, a process will begin to apply the new programming paradigms to integrate them where possible. For example, with GViDS, instead of viewing the distribution service as an application that must run on a single node, the new approach is to build it such that it can be dispatched across a pool of resources in response to dynamic loads. To make this a reality, reusable services will be critical, such as a brokering service to locate appropriate resources within the pool. This brokering service can then be used by other applications such as TReK. To expand further, if the GViDS application is constructed using a services

  12. Security for grids

    SciTech Connect

    Humphrey, Marty; Thompson, Mary R.; Jackson, Keith R.

    2005-08-14

    Securing a Grid environment presents a distinctive set of challenges. This paper groups the activities that need to be secured into four categories: naming and authentication; secure communication; trust, policy, and authorization; and enforcement of access control. It examines the current state of the art in securing these processes and introduces new technologies that promise to meet the security requirements of Grids more completely.

  13. Internet 2 Access Grid.

    ERIC Educational Resources Information Center

    Simco, Greg

    2002-01-01

    Discussion of the Internet 2 Initiative, which is based on collaboration among universities, businesses, and government, focuses on the Access Grid, a Computational Grid that includes interactive multimedia within high-speed networks to provide resources to enable remote collaboration among the research community. (Author/LRW)

  14. Geometric grid generation

    NASA Technical Reports Server (NTRS)

    Ives, David

    1995-01-01

    This paper presents a highly automated hexahedral grid generator based on extensive geometrical and solid modeling operations, developed in response to a vision of a designer-driven, one-day-turnaround CFD process, which implies a designer-driven, one-hour grid-generation process.

  15. FURSMASA: a new approach to rapid scoring functions that uses a MD-averaged potential energy grid and a solvent-accessible surface area term with parameters GA fit to experimental data.

    PubMed

    Pearlman, David A; Rao, B Govinda; Charifson, Paul

    2008-05-15

    We demonstrate a new approach to the development of scoring functions through the formulation and parameterization of a new function, which can be used both for rapidly ranking the binding of ligands to proteins and for estimating relative aqueous molecular solubilities. The intent of this work is to introduce a new paradigm for creation of scoring functions, wherein we impose the following criteria upon the function: (1) simple; (2) intuitive; (3) requires no postparameterization tweaking; (4) can be applied (without reparameterization) to multiple target systems; and (5) can be rapidly evaluated for any potential ligand. Following these criteria, a new function, FURSMASA (function for rapid scoring using an MD-averaged grid and the accessible surface area), has been developed. Three novel features of the function include: (1) use of an MD-averaged potential energy grid for ligand-protein interactions, rather than a simple static grid; (2) inclusion of a term that depends on changes in the solvent-accessible surface area on an atomic (not molecular) basis; and (3) use of the recently derived predictive index (PI) target when optimizing the function, which focuses the function on its intended purpose of relative ranking. A genetic algorithm is used to optimize the function against test data sets that include ligands for the following proteins: IMPDH, p38, gyrase B, HIV-1, and TACE, as well as the Syracuse Research solubility database. We find that the function is predictive, and can simultaneously fit all the test data sets with cross-validated predictive indices ranging from 0.68 to 0.82. As a test of the ability of this function to predict binding for systems not in the training set, the resulting fitted FURSMASA function is then applied to 23 ligands of the COX-2 enzyme. Comparing the results for COX-2 against those obtained using a variety of well-known rapid scoring functions demonstrates that FURSMASA outperforms all of them in terms of the PI and
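    The predictive index (PI) used as the optimization target weights each pair of ligands by how different their experimental values are and by whether the predicted scores rank that pair correctly. A sketch of the commonly cited form (written from the published definition, not from the FURSMASA code) is:

```python
def predictive_index(predicted, experimental):
    """PI ranges from -1 (every pair ranked wrongly) to +1 (every pair ranked
    correctly), with pairs weighted by their experimental difference."""
    num = den = 0.0
    n = len(predicted)
    for i in range(n):
        for j in range(i + 1, n):
            w = abs(experimental[j] - experimental[i])
            dp = predicted[j] - predicted[i]
            de = experimental[j] - experimental[i]
            c = 0.0 if dp == 0 else (1.0 if de * dp > 0 else -1.0)
            num += w * c
            den += w
    return num / den if den else 0.0
```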

  16. Optimization Of A Computational Grid

    NASA Technical Reports Server (NTRS)

    Pearce, Daniel G.

    1993-01-01

    In an improved method of generating a computational grid, the grid-generation process is decoupled from the definition of the geometry. It is not necessary to redefine the boundary. Instead, continuous boundaries in the physical domain are specified, and grid points in the computational domain are then mapped onto those continuous boundaries.

  17. The experimental teaching reform in biochemistry and molecular biology for undergraduate students in Peking University Health Science Center.

    PubMed

    Yang, Xiaohan; Sun, Luyang; Zhao, Ying; Yi, Xia; Zhu, Bin; Wang, Pu; Lin, Hong; Ni, Juhua

    2015-01-01

    Since 2010, second-year undergraduate students of an eight-year training program leading to a Doctor of Medicine degree or Doctor of Philosophy degree at Peking University Health Science Center (PKUHSC) have been required to enter the "Innovative talent training project." During that time, the students joined a research lab and participated in some original research work. There is a critical educational need to prepare these students for the increasing accessibility of research experience. The redesigned experimental curriculum of biochemistry and molecular biology was developed to fulfill such a requirement, which keeps two original biochemistry experiments (Gel filtration and Enzyme kinetics) and adds a new two-experiment component called "Analysis of anti-tumor drug induced apoptosis." The additional component, also known as the "project-oriented experiment" or the "comprehensive experiment," consists of Western blotting and a DNA laddering assay to assess the effects of etoposide (VP16) on the apoptosis signaling pathways. This reformed laboratory teaching system aims to enhance the participating students' overall understanding of important biological research techniques and the instrumentation involved, and to foster a better understanding of the research process, all within a classroom setting. Student feedback indicated that the updated curriculum helped them improve their operational and self-learning capability, and helped to increase their understanding of theoretical knowledge and actual research processes, which laid the groundwork for their future research work.

  18. Using Arts Integration to Make Science Learning Memorable in the Upper Elementary Grades: A Quasi-Experimental Study

    ERIC Educational Resources Information Center

    Graham, Nicholas James; Brouillette, Liane

    2016-01-01

    The Next Generation Science Standards (NGSS) have brought a stronger emphasis on engineering into K-12 STEM (science, technology, engineering and mathematics) instruction. Introducing the design process used in engineering into science classrooms stimulated a dialogue among some educators about adding the arts to the mix. This led to proposals for…

  19. Network Science Experimentation Vision

    DTIC Science & Technology

    2015-09-01

    is referred to here as a multi-genre composite network. Given that the term "network" is used in a multiplicity of ways in a variety of contexts... expertise, models, and tools in multiple domains. These areas of expertise include, but are not limited to, the following: networks and network... composite networks are there to support multiple missions. While this report focuses on experiments that involve a single mission, extending them to

  20. Decentral Smart Grid Control

    NASA Astrophysics Data System (ADS)

    Schäfer, Benjamin; Matthiae, Moritz; Timme, Marc; Witthaut, Dirk

    2015-01-01

    Stable operation of complex flow and transportation networks requires balanced supply and demand. For the operation of electric power grids—due to their increasing fraction of renewable energy sources—a pressing challenge is to fit the fluctuations in decentralized supply to the distributed and temporally varying demands. To achieve this goal, common smart grid concepts suggest collecting consumer demand data, evaluating them centrally given the current supply, and sending price information back to customers for them to decide about usage. Besides restrictions regarding cyber security, privacy protection and large required investments, it remains unclear how such central smart grid options guarantee overall stability. Here we propose a Decentral Smart Grid Control, where the price is directly linked to the local grid frequency at each customer. The grid frequency provides all necessary information about the current power balance such that it is sufficient to match supply and demand without the need for a centralized IT infrastructure. We analyze the performance and the dynamical stability of the power grid with such a control system. Our results suggest that the proposed Decentral Smart Grid Control is feasible independent of effective measurement delays, if frequencies are averaged over sufficiently large time intervals.
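    As a hedged illustration of the idea (illustrative symbols, not the paper's own notation), a local price that tracks the locally measured frequency deviation, together with a demand that responds to that price, can be written as

```latex
p_j(t) = p_0 - c\,\bigl(\omega_j(t) - \Omega\bigr), \qquad
P_j^{\mathrm{dem}}(t) = P_j^{0} - \kappa\,\bigl(p_j(t) - p_0\bigr)
                      = P_j^{0} + \kappa c\,\bigl(\omega_j(t) - \Omega\bigr),
```

    where Ω is the nominal grid frequency, ω_j(t) is the frequency measured locally at customer j, and c, κ > 0. When generation falls short, the local frequency drops below Ω, the local price rises, and consumption is curtailed, all without centralized data collection.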

  1. Grid Connected Functionality

    DOE Data Explorer

    Baker, Kyri; Jin, Xin; Vaidynathan, Deepthi; Jones, Wesley; Christensen, Dane; Sparn, Bethany; Woods, Jason; Sorensen, Harry; Lunacek, Monte

    2016-08-04

    Dataset demonstrating the potential benefits that residential buildings can provide for frequency regulation services in the electric power grid. In a hardware-in-the-loop (HIL) implementation, simulated homes along with a physical laboratory home are coordinated via a grid aggregator, and it is shown that their aggregate response has the potential to follow the regulation signal on a timescale of seconds. Connected (communication-enabled) devices in the National Renewable Energy Laboratory's (NREL's) Energy Systems Integration Facility (ESIF) received demand response (DR) requests from a grid aggregator, and the devices responded accordingly to meet the signal while satisfying user comfort bounds and physical hardware limitations.

  2. Energy Systems Integration: Demonstrating the Grid Benefits of Connected Devices

    SciTech Connect

    2017-01-01

    Overview fact sheet about the Electric Power Research Institute (EPRI) and the University of Delaware Integrated Network Testbed for Energy Grid Research and Technology Experimentation (INTEGRATE) project at the Energy Systems Integration Facility. INTEGRATE is part of the U.S. Department of Energy's Grid Modernization Initiative.

  3. Surface grid generation for multi-block structured grids

    NASA Astrophysics Data System (ADS)

    Spekreijse, S. P.; Boerstoel, J. W.; Kuyvenhoven, J. L.; van der Marel, M. J.

    A new grid generation technique for the computation of a structured grid on a generally curved surface in 3D is discussed. The starting assumption is that the parameterization of the surface exists, i.e. a smooth geometrical shape function exists which maps the parametric space (the unit square) one-to-one on the surface. The grid generation system computes a grid on the surface with as boundary conditions the following data specified along the four edges of the surface: (1) the position of the boundary grid points, (2) the grid line slopes at the boundary grid points, (3) the first grid cell lengths at the boundary grid points. The fourth-order elliptic biharmonic equations are used to compute the two families of grid lines in the parametric space. After that, each grid point in the parametric space is found as the intersection point between two individual grid lines, one from each family. The grid points on the surface are finally found by mapping the grid points in the parametric space on the surface via the geometrical shape function. Results are shown for an O-type 2D Euler grid, a C-type 2D Navier-Stokes grid and on some curved surfaces in 3D space.
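    In compact form (a sketch of the structure described above, not the paper's exact formulation), each family of grid lines is obtained from a biharmonic problem on the unit square, whose fourth order admits two boundary conditions per edge (the prescribed grid-point positions plus the slope and first-cell-length data), and the surface grid follows from the given shape function:

```latex
\nabla^4 \phi \;=\; \phi_{uuuu} + 2\,\phi_{uuvv} + \phi_{vvvv} \;=\; 0
\quad \text{on } [0,1]^2,
\qquad \mathbf{x}_{ij} = \mathbf{s}(u_{ij}, v_{ij}),
```

    where s is the one-to-one geometrical shape function from the parametric space to the surface and (u_ij, v_ij) is the intersection of the i-th and j-th grid lines of the two families.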

  4. Grid Computing Education Support

    SciTech Connect

    Steven Crumb

    2008-01-15

    The GGF Student Scholar program gave GGF the opportunity to bring more than sixty qualified graduate and undergraduate students with interests in grid technologies to its three annual events over the three-year program.

  5. Space Development Grid Portal

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2004-01-01

    This viewgraph presentation provides information on the development of a portal to provide secure and distributed grid computing for Payload Operations Integrated Center and Mission Control Center ground services.

  6. Random array grid collimator

    DOEpatents

    Fenimore, E.E.

    1980-08-22

    A hexagonally shaped, quasi-random, no-two-holes-touching grid collimator. The quasi-random array grid collimator eliminates contamination from small-angle off-axis rays by using a no-two-holes-touching pattern which simultaneously provides for a self-supporting array, increasing throughput by elimination of a substrate. The present invention also provides maximum throughput using hexagonally shaped holes in a hexagonal lattice pattern for diffraction-limited applications. Mosaicking is also disclosed for reducing fabrication effort.

  7. Consideration of Experimental Approaches in the Physical and Biological Sciences in Designing Long-Term Watershed Studies in Forested Landscapes

    NASA Astrophysics Data System (ADS)

    Stallard, R. F.

    2011-12-01

    The importance of biological processes in controlling weathering, erosion, stream-water composition, soil formation, and overall landscape development is generally accepted. The U.S. Geological Survey (USGS) Water, Energy, and Biogeochemical Budgets (WEBB) Project in eastern Puerto Rico and Panama and the Smithsonian Tropical Research Institute (STRI) Panama Canal Watershed Experiment (PCWE) are landscape-scale studies based in the humid tropics where the warm temperatures, moist conditions, and luxuriant vegetation promote especially rapid biological and chemical processes - photosynthesis, respiration, decay, and chemical weathering. In both studies features of small-watershed, large-watershed, and landscape-scale-biology experiments are blended to satisfy the research needs of the physical and biological sciences. The WEBB Project has successfully synthesized its first fifteen years of data, and has addressed the influence of land cover, geologic, topographic, and hydrologic variability, including huge storms on a wide range of hydrologic, physical, and biogeochemical processes. The ongoing PCWE should provide a similar synthesis of a moderate-sized humid tropical watershed. The PCWE and the Agua Salud Project (ASP) within the PCWE are now addressing the role of land cover (mature forests, pasture, invasive-grass dominated, secondary succession, native species plantation, and teak) at scales ranging from small watersheds to the whole Panama Canal watershed. Biologists have participated in the experimental design at both watershed scales, and small (0.1 ha) to large (50 ha) forest-dynamic plots have a central role in interfacing between physical scientists and biologists. In these plots, repeated, high-resolution mapping of all woody plants greater than 1-cm diameter provides a description of population changes through time presumably reflecting individual life histories, interactions with other organisms and the influence of landscape processes and climate

  8. Visual Methods for Model and Grid Validation

    NASA Technical Reports Server (NTRS)

    Pang, Alex

    1998-01-01

    This joint research interchange proposal allowed us to contribute in two directions that are of interest to NASA. These are: (a) data level comparative visualization of experimental and computational fluid flow, and (b) visualization tools for analysis of adaptively refined Cartesian grids.

  9. Near-Body Grid Adaption for Overset Grids

    NASA Technical Reports Server (NTRS)

    Buning, Pieter G.; Pulliam, Thomas H.

    2016-01-01

    A solution adaption capability for curvilinear near-body grids has been implemented in the OVERFLOW overset grid computational fluid dynamics code. The approach follows closely that used for the Cartesian off-body grids, but inserts refined grids in the computational space of original near-body grids. Refined curvilinear grids are generated using parametric cubic interpolation, with one-sided biasing based on curvature and stretching ratio of the original grid. Sensor functions, grid marking, and solution interpolation tasks are implemented in the same fashion as for off-body grids. A goal-oriented procedure, based on largest error first, is included for controlling growth rate and maximum size of the adapted grid system. The adaption process is almost entirely parallelized using MPI, resulting in a capability suitable for viscous, moving body simulations. Two- and three-dimensional examples are presented.
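    The goal-oriented, largest-error-first control of grid growth can be pictured with a small sketch (hypothetical names, not the OVERFLOW implementation): candidate refinement regions are taken in order of decreasing sensor error until a point budget for the adapted grid system is exhausted:

```python
def select_refinements(candidates, max_new_points):
    """candidates: list of (error_estimate, n_points) for regions marked by the
    sensor functions. Returns the regions to refine, largest error first,
    without exceeding the allowed growth of the grid system."""
    chosen, budget = [], max_new_points
    for err, npts in sorted(candidates, key=lambda c: c[0], reverse=True):
        if npts <= budget:
            chosen.append((err, npts))
            budget -= npts
    return chosen
```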

  10. Exploring Hypersonic, Unstructured-Grid Issues through Structured Grids

    NASA Technical Reports Server (NTRS)

    Mazaheri, Ali R.; Kleb, Bill

    2007-01-01

    Pure-tetrahedral unstructured grids have been shown to produce asymmetric heat transfer rates for symmetric problems. Meanwhile, two-dimensional structured grids produce symmetric solutions and, as documented here, introducing a spanwise degree of freedom to these structured grids also yields symmetric solutions. The effects of grid skewness and other perturbations of structured grids are investigated to uncover possible mechanisms behind the unstructured-grid solution asymmetries. By using controlled experiments around a known, good solution, the effects of particular grid pathologies are uncovered. These structured-grid experiments reveal that similar solution degradation occurs as for unstructured grids, especially for heat transfer rates. Non-smooth grids within the boundary layer are also shown to produce large local errors in heat flux but do not affect surface pressures.

  11. Using Grid Benchmarks for Dynamic Scheduling of Grid Applications

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert

    2003-01-01

    Navigation or dynamic scheduling of applications on computational grids can be improved through the use of an application-specific characterization of grid resources. Current grid information systems provide a description of the resources, but do not contain any application-specific information. We define a GridScape as the dynamic state of the grid resources. We measure the dynamic performance of these resources using the grid benchmarks. Then we use the GridScape for automatic assignment of the tasks of a grid application to grid resources. The scalability of the system is achieved by limiting the navigation overhead to a few percent of the application resource requirements. Our task submission and assignment protocol guarantees that the navigation system does not cause grid congestion. On a synthetic data mining application we demonstrate that GridScape-based task assignment reduces the application turnaround time.

  12. Beyond grid security

    NASA Astrophysics Data System (ADS)

    Hoeft, B.; Epting, U.; Koenig, T.

    2008-07-01

    While many fields relevant to Grid security are already covered by existing working groups, their remit rarely goes beyond the scope of the Grid infrastructure itself. However, security issues pertaining to the internal set-up of compute centres have at least as much impact on Grid security. Thus, this talk will present briefly the EU ISSeG project (Integrated Site Security for Grids). In contrast to groups such as OSCT (Operational Security Coordination Team) and JSPG (Joint Security Policy Group), the purpose of ISSeG is to provide a holistic approach to security for Grid computer centres, from strategic considerations to an implementation plan and its deployment. The generalised methodology of Integrated Site Security (ISS) is based on the knowledge gained during its implementation at several sites as well as through security audits, and this will be briefly discussed. Several examples of ISS implementation tasks at the Forschungszentrum Karlsruhe will be presented, including segregation of the network for administration and maintenance and the implementation of Application Gateways. Furthermore, the web-based ISSeG training material will be introduced. This aims to offer ISS implementation guidance to other Grid installations in order to help avoid common pitfalls.

  13. Grid generation strategies for turbomachinery configurations

    NASA Technical Reports Server (NTRS)

    Lee, K. D.; Henderson, T. L.

    1991-01-01

    Turbomachinery flow fields involve unique grid generation issues due to their geometrical and physical characteristics. Several strategic approaches are discussed to generate quality grids. The grid quality is further enhanced through blending and adapting. Grid blending smooths the grids locally through averaging and diffusion operators. Grid adaptation redistributes the grid points based on a grid quality assessment. These methods are demonstrated with several examples.
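
    Grid blending through local averaging, as mentioned above, can be sketched as a Laplacian-style smoothing pass over the interior nodes of a structured 2-D grid. This is a generic illustration under assumed array shapes, not the specific blending and diffusion operators used by the authors.

    ```python
    # Minimal sketch of grid blending by neighbour averaging (assumes x, y are
    # 2-D arrays of node coordinates with at least 3 nodes in each direction).
    import numpy as np

    def blend_grid(x, y, iterations=10, weight=0.5):
        """Relax interior nodes toward the average of their four neighbours."""
        x, y = x.copy(), y.copy()
        for _ in range(iterations):
            for a in (x, y):
                avg = 0.25 * (a[:-2, 1:-1] + a[2:, 1:-1] + a[1:-1, :-2] + a[1:-1, 2:])
                a[1:-1, 1:-1] = (1 - weight) * a[1:-1, 1:-1] + weight * avg
        return x, y
    ```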

  14. Condenser Microphone Protective Grid Correction for High Frequency Measurements

    NASA Technical Reports Server (NTRS)

    Lee, Erik; Bennett, Reginald

    2010-01-01

    Use of a protective grid on small-diameter microphones can prolong the lifetime of the unit, but the high-frequency effects can complicate data interpretation. Analytical methods have been developed to correct for the grid effect at high frequencies. Specifically, the analysis pertains to quantifying the microphone protective grid response characteristics in the acoustic near field of a rocket plume noise source. A frequency response function computation using two microphones will be explained. Experimental and instrumentation setup details will be provided. The resulting frequency response function for a B&K 4944 condenser microphone protective grid will be presented, along with associated uncertainties.
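
    A two-microphone frequency response function is commonly estimated with the H1 formulation (cross-spectrum divided by the reference auto-spectrum). The sketch below shows that computation on synthetic signals; the sampling rate, window length and signal model are assumptions, not details of the rocket-plume setup described above.

    ```python
    # H1 frequency response function between a reference microphone (no grid)
    # and a response microphone (with protective grid) -- generic sketch.
    import numpy as np
    from scipy import signal

    def frf_h1(reference, response, fs, nperseg=4096):
        f, p_xy = signal.csd(reference, response, fs=fs, nperseg=nperseg)
        _, p_xx = signal.welch(reference, fs=fs, nperseg=nperseg)
        return f, p_xy / p_xx   # complex FRF; its magnitude gives the grid correction

    # Synthetic example: broadband noise plus a small delay and measurement noise.
    fs = 200_000
    rng = np.random.default_rng(0)
    x = rng.standard_normal(fs)
    y = np.roll(x, 3) + 0.01 * rng.standard_normal(fs)
    f, H = frf_h1(x, y, fs)
    ```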

  15. Arc Length Based Grid Distribution For Surface and Volume Grids

    NASA Technical Reports Server (NTRS)

    Mastin, C. Wayne

    1996-01-01

    Techniques are presented for distributing grid points on parametric surfaces and in volumes according to a specified distribution of arc length. Interpolation techniques are introduced which permit a given distribution of grid points on the edges of a three-dimensional grid block to be propagated through the surface and volume grids. Examples demonstrate how these methods can be used to improve the quality of grids generated by transfinite interpolation.
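
    The basic edge operation, redistributing points along a curve according to a specified arc-length distribution, can be sketched as follows; a uniform target spacing is assumed purely for illustration, whereas the paper propagates arbitrary edge distributions into surface and volume grids.

    ```python
    # Redistribute n points along a discrete curve by arc length (generic sketch).
    import numpy as np

    def distribute_by_arclength(points, n):
        pts = np.asarray(points, dtype=float)
        seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
        s = np.concatenate(([0.0], np.cumsum(seg)))   # cumulative arc length
        target = np.linspace(0.0, s[-1], n)           # uniform arc-length spacing
        return np.column_stack([np.interp(target, s, pts[:, k])
                                for k in range(pts.shape[1])])

    # Example: 25 evenly spaced points along a quarter circle sampled coarsely.
    t = np.linspace(0.0, np.pi / 2, 10)
    curve = np.column_stack([np.cos(t), np.sin(t)])
    new_pts = distribute_by_arclength(curve, 25)
    ```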

  16. Grid Enabled Geospatial Catalogue Web Service

    NASA Technical Reports Server (NTRS)

    Chen, Ai-Jun; Di, Li-Ping; Wei, Ya-Xing; Liu, Yang; Bui, Yu-Qi; Hu, Chau-Min; Mehrotra, Piyush

    2004-01-01

    Geospatial Catalogue Web Service is a vital service for sharing and interoperating with volumes of distributed heterogeneous geospatial resources, such as data, services, applications, and their replicas over the web. Based on Grid technology and the Open Geospatial Consortium (OGC) Catalogue Service - Web Information Model, this paper proposes a new information model for Geospatial Catalogue Web Service, named GCWS, which securely provides Grid-based publishing, managing and querying of geospatial data and services, and transparent access to replica data and related services in the Grid environment. This information model integrates the information model of the Grid Replica Location Service (RLS)/Monitoring & Discovery Service (MDS) with the information model of the OGC Catalogue Service for the Web (CSW), and draws on the geospatial data metadata standards ISO 19115, FGDC and NASA EOS Core System, as well as the service metadata standard ISO 19119, to extend itself for expressing geospatial resources. Using GCWS, any valid geospatial user who belongs to an authorized Virtual Organization (VO) can securely publish and manage geospatial resources, and in particular query on-demand data in the virtual community and retrieve it through data-related services which provide functions such as subsetting, reformatting and reprojection. This work facilitates the sharing and interoperation of geospatial resources in the Grid environment, making geospatial resources Grid-enabled and Grid technologies geospatially enabled. It also lets researchers focus on science rather than on issues of computing capacity, data location, processing and management. GCWS is also a key component for workflow-based virtual geospatial data production.

  17. The Benefits of Grid Networks

    ERIC Educational Resources Information Center

    Tennant, Roy

    2005-01-01

    In the article, the author talks about the benefits of grid networks. In speaking of grid networks the author is referring to both networks of computers and networks of humans connected together in a grid topology. Examples are provided of how grid networks are beneficial today and the ways in which they have been used.

  18. Uniformity on the grid via a configuration framework

    SciTech Connect

    Igor V Terekhov et al.

    2003-03-11

    As the Grid permeates modern computing, Grid solutions continue to emerge and take shape. Grid development projects continue to provide higher-level services that evolve in functionality and operate with application-level concepts which are often specific to the virtual organizations that use them. Physically, however, grids are comprised of sites whose resources are diverse and seldom project readily onto a grid's set of concepts. In practice, this also creates problems for the site administrators who actually instantiate grid services. In this paper, we present a flexible, uniform framework to configure a grid site and its facilities, and otherwise describe the resources and services it offers. We start from a site configuration and instantiate services for resource advertisement, monitoring and data handling; we also apply our framework to hosting-environment creation. We use our ideas in the Information Management part of the SAM-Grid project, a grid system which will deliver petabyte-scale data to hundreds of users. Our users are High Energy Physics experimenters who are scattered worldwide across dozens of institutions and always use facilities that are shared with other experiments as well as other grids. Our implementation represents information in XML and includes tools written in XQuery and XSLT.
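
    The flavour of deriving service information from a site configuration can be shown with a toy XML document and a small parser. The schema and element names below are hypothetical and are not the SAM-Grid configuration format; the project's own tools are written in XQuery and XSLT.

    ```python
    # Toy example: turn a (hypothetical) site-configuration document into a
    # resource advertisement -- a sketch of the idea, not the SAM-Grid schema.
    import xml.etree.ElementTree as ET

    SITE_CONFIG = """
    <site name="example-site">
      <cluster name="farm1" cpus="256" batch="condor"/>
      <storage path="/cache/example" capacity_tb="120"/>
    </site>
    """

    def advertise(config_xml):
        root = ET.fromstring(config_xml)
        return {
            "site": root.get("name"),
            "clusters": [dict(c.attrib) for c in root.findall("cluster")],
            "storage": [dict(s.attrib) for s in root.findall("storage")],
        }

    print(advertise(SITE_CONFIG))
    ```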

  19. D. Carlos de Braganca, a Pioneer of Experimental Marine Oceanography: Filling the Gap between Formal and Informal Science Education

    ERIC Educational Resources Information Center

    Faria, Claudia; Pereira, Goncalo; Chagas, Isabel

    2012-01-01

    The activities presented in this paper are part of a wider project that investigates the effects of infusing the history of science in science teaching, toward students' learning and attitude. Focused on the work of D. Carlos de Braganca, King of Portugal from 1889 to 1908, and a pioneer oceanographer, the activities are addressed at the secondary…

  20. Unstructured grids on SIMD torus machines

    NASA Technical Reports Server (NTRS)

    Bjorstad, Petter E.; Schreiber, Robert

    1994-01-01

    Unstructured grids lead to unstructured communication on distributed memory parallel computers, a problem that has been considered difficult. Here, we consider adaptive, offline communication routing for a SIMD processor grid. Our approach is empirical. We use large data sets drawn from supercomputing applications instead of an analytic model of communication load. The chief contribution of this paper is an experimental demonstration of the effectiveness of certain routing heuristics. Our routing algorithm is adaptive, nonminimal, and is generally designed to exploit locality. We have a parallel implementation of the router, and we report on its performance.

  1. Lightweight Grid Shell Pavilion - Design, Manufacture and Erection of Full Scale Grid Shell Prototypes

    NASA Astrophysics Data System (ADS)

    Vaněk, Aleš

    2016-12-01

    The main goal of the author's research is to design and construct grid shell structures, which are subsequently realized as experimental structures in full scale. These structures should provide a place suitable for various events as well as a friendly, pleasant and relaxing free-time space. In considering how such a structure should look and which materials and structure types are suitable, many kinds of lightweight structures were examined. The most logical solution is to create a grid shell structure combined with a single-layer membrane that fulfills all aspects of an elegant, remarkable lightweight structure, using some original details and workflow advancements. These grid shell projects should demonstrate another possibility for building and thinking about unconventional structures and provoke a deeper interest in these unique structures. The goal of this project was to create a feasible design of a grid shell structure and to build the structures while coming to understand the core of such an interesting phenomenon.

  2. Development of experimental platform for high energy density sciences using high-intensity optical lasers at the SACLA x-ray free electron laser facility

    NASA Astrophysics Data System (ADS)

    Yabuuchi, Toshinori; Yabashi, Makina; Inubushi, Yuichi; Kon, Akira; Togashi, Tadashi; Tomizawa, Hiromitsu

    2016-10-01

    Combinations of high-intensity optical lasers and x-ray free electron lasers (XFELs) open new frontiers in high energy density (HED) sciences. An experimental platform equipped with high-power Ti:Sapphire laser systems is under commissioning for HED sciences at the XFEL facility SACLA. The Ti:Sapphire laser system is designed to deliver two laser beams, each with a maximum power of 500 TW, to the sample chamber. A hard x-ray beamline of SACLA is also transported to the chamber, with beam focusing down to a few microns using sets of compound refractive lenses. The second optical laser pulse, or the energetic particles and photons generated by the laser pulse, can provide additional flexibility for HED-related pump-probe experiments, which have generally been performed using a single optical laser and the XFEL. The development status and future perspectives of the experimental platform will be presented.

  3. Spaceflight Operations Services Grid Prototype

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Mehrotra, Piyush; Lisotta, Anthony

    2004-01-01

    NASA over the years has developed many types of technologies and conducted various types of science, resulting in numerous variations of operations, data and applications. For example, operations range from deep space projects managed by JPL, Saturn and Shuttle operations managed from JSC and KSC, ISS science operations managed from MSFC, and numerous low Earth orbit satellites managed from GSFC that are varied and intrinsically different yet require many of the same types of services to fulfill their missions. Also, large data sets (databases) of Shuttle flight data, solar system projects and Earth observing data exist which, because of their varied and sometimes outdated technologies, are not and have not been fully examined for additional information and knowledge. Many of the applications and systems supporting operational services, e.g. voice, video, telemetry and commanding, are outdated and obsolete. The vast amounts of data are located in various formats, at various locations, and range over many years. The ability to conduct unified space operations, access disparate data sets and develop systems and services that can provide operational services does not currently exist in any useful form. In addition, adding new services to existing operations is generally expensive and, with the current budget constraints, not feasible on any broad level of implementation. To understand these services, a discussion of each one follows. The Spaceflight User-based Services are those services required to conduct space flight operations. Grid Services are those services that will be used to overcome, through middleware software, some or all of the problems that currently exist. In addition, Network Services will be discussed briefly. Network Services are crucial to any type of remedy and are evolving adequately to support any technology currently in development.

  4. Complex Volume Grid Generation Through the Use of Grid Reusability

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    This paper presents a set of surface and volume grid generation techniques which reuse existing surface and volume grids. These methods use combinations of data manipulations to reduce grid generation time, improve grid characteristics, and increase the capabilities of existing domain discretization software. The manipulation techniques utilize physical and computational domains to produce basis functions on which to operate, modify grid character, and smooth grids using transfinite interpolation, a vector interpolation method, and a parametric re-mapping technique. With these new techniques, inviscid grids can be converted to viscous grids, multiple-zone grid adaption can be performed to improve CFD solver efficiency, and topological changes to improve modeling of flow fields can be made simply and quickly. Examples of these capabilities are illustrated as applied to various configurations.

  5. NREL Smart Grid Projects

    SciTech Connect

    Hambrick, J.

    2012-01-01

    Although implementing Smart Grid projects at the distribution level provides many advantages and opportunities for advanced operation and control, a number of significant challenges must be overcome to maintain the high level of safety and reliability that the modern grid must provide. For example, while distributed generation (DG) promises to provide opportunities to increase reliability and efficiency and may provide grid support services such as volt/var control, the presence of DG can impact distribution operation and protection schemes. Additionally, the intermittent nature of many DG energy sources such as photovoltaics (PV) can present a number of challenges to voltage regulation, etc. This presentation provides an overview of a number of Smart Grid projects being performed by the National Renewable Energy Laboratory (NREL) along with utility, industry, and academic partners. These projects include modeling and analysis of high-penetration PV scenarios (with and without energy storage), development and testing of interconnection and microgrid equipment, as well as the development and implementation of advanced instrumentation and data acquisition used to analyze the impacts of intermittent renewable resources. Additionally, standards development associated with DG interconnection and analysis as well as Smart Grid interoperability will be discussed.

  6. An Adaptive Unstructured Grid Method by Grid Subdivision, Local Remeshing, and Grid Movement

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    1999-01-01

    An unstructured grid adaptation technique has been developed and successfully applied to several three dimensional inviscid flow test cases. The approach is based on a combination of grid subdivision, local remeshing, and grid movement. For solution adaptive grids, the surface triangulation is locally refined by grid subdivision, and the tetrahedral grid in the field is partially remeshed at locations of dominant flow features. A grid redistribution strategy is employed for geometric adaptation of volume grids to moving or deforming surfaces. The method is automatic and fast and is designed for modular coupling with different solvers. Several steady state test cases with different inviscid flow features were tested for grid/solution adaptation. In all cases, the dominant flow features, such as shocks and vortices, were accurately and efficiently predicted with the present approach. A new and robust method of moving tetrahedral "viscous" grids is also presented and demonstrated on a three-dimensional example.

  7. Two grid iteration with a conjugate gradient fine grid smoother applied to a groundwater flow model

    SciTech Connect

    Hagger, M.J.; Spence, A.; Cliffe, K.A.

    1994-12-31

    This talk is concerned with the efficient solution of Ax = b, where A is a large, sparse, symmetric positive definite matrix arising from a standard finite element discretisation of the groundwater flow problem ∇·(k∇p) = 0. Here k is the coefficient of rock permeability in applications and is highly discontinuous. The discretisation is carried out using the Harwell NAMMU finite element package, using 9-node biquadratic rectangular elements for 2D and 27-node biquadratics for 3D. The aim is to develop a robust technique for iterative solutions of 3D problems based on a regional groundwater flow model of a geological area with sharply varying hydrogeological properties. Numerical experiments with polynomial preconditioned conjugate gradient methods on a 2D groundwater flow model were found to yield very poor results, converging very slowly. In order to utilise the fact that A comes from the discretisation of a PDE, the authors try the two-grid method as analysed in studies of multigrid methods; see, for example, "Multi-Grid Methods and Applications" by W. Hackbusch. Specifically, they consider two discretisations resulting in stiffness matrices A_N and A_n, of size N and n respectively, where N > n, for both a model problem and the geological model. They perform a number of conjugate gradient steps on the fine grid, i.e. using A_N, followed by an exact coarse grid solve, using A_n, and then update the fine grid solution, the exact coarse grid solve being done using a frontal method factorisation of A_n. Note that in the context of the standard two-grid method this is equivalent to using conjugate gradients as a fine grid smoothing step. Experimental results are presented to show the superiority of the two-grid iteration method over the polynomial preconditioned conjugate gradient method.
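
    A minimal sketch of the two-grid idea with conjugate gradients as the fine-grid smoother is given below, using a 1-D Poisson model problem and a Galerkin coarse operator for simplicity; the work described above treats 2-D and 3-D groundwater flow with discontinuous permeability and an exact frontal-method coarse solve.

    ```python
    # Two-grid cycle: a few CG steps as fine-grid smoother, exact coarse solve,
    # prolongated correction (1-D Poisson model problem for illustration).
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    def poisson1d(n):
        return sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")

    def two_grid_step(A_f, A_c, P, b, x, smoothing_steps=3):
        x, _ = spla.cg(A_f, b, x0=x, maxiter=smoothing_steps)  # truncated CG = smoother
        r = b - A_f @ x                                        # fine-grid residual
        e_c = spla.spsolve(A_c, P.T @ r)                       # exact coarse solve
        return x + P @ e_c                                     # prolongate and correct

    n_c = 31
    n_f = 2 * n_c + 1
    P = sp.lil_matrix((n_f, n_c))          # linear-interpolation prolongation
    for j in range(n_c):
        P[2 * j, j], P[2 * j + 1, j], P[2 * j + 2, j] = 0.5, 1.0, 0.5
    P = P.tocsr()

    A_f = poisson1d(n_f)
    A_c = (P.T @ A_f @ P).tocsc()          # Galerkin coarse-grid operator
    b, x = np.ones(n_f), np.zeros(n_f)
    for _ in range(10):
        x = two_grid_step(A_f, A_c, P, b, x)
    print(np.linalg.norm(b - A_f @ x))
    ```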

  8. Grid Data Management and Customer Demands at MeteoSwiss

    NASA Astrophysics Data System (ADS)

    Rigo, G.; Lukasczyk, Ch.

    2010-09-01

    Data grids constitute the required input form for a variety of applications. Customers therefore increasingly expect climate services to provide not only measured data but also grids of these data in the required configurations on an operational basis. MeteoSwiss is currently establishing a production chain for delivering data grids by subscription directly from the data warehouse in order to meet the demand for precipitation data grids from government, business and science customers. The MeteoSwiss data warehouse runs on an Oracle database linked with an ArcGIS Standard edition geodatabase. The grids are produced by Unix-based software written in R called GRIDMCH, which extracts the station data from the data warehouse and stores the files in the file system. By scripts, the netCDF-4 files are imported via an FME interface into the database. Currently, daily and monthly deliveries of daily precipitation grids are available from MeteoSwiss with a spatial resolution of 2.2 km x 2.2 km. The daily delivered grids are a preliminary product based on about 100 measuring sites, whilst the grid of the monthly delivery of daily sums is calculated from about 430 stations. Crucial for uptake by the customers is understanding of, and trust in, the new grid product. Even when clearly stating needs which can be covered by grid products, customers require a certain lead time to develop applications making use of a particular grid. Therefore, early contacts and continuous attendance, as well as flexibility in adjusting the production process to fulfill emerging customer needs, are important during the introduction period. Gridding over complex terrain can lead to temporarily elevated uncertainties in certain areas depending on the weather situation and the coverage of measurements. Therefore, careful instructions on quality and use, and the ability to communicate the uncertainties of gridded data, proved to be essential, especially to the business and science customers who require

  9. The PacCAF Grid portal for the CDF experiment

    NASA Astrophysics Data System (ADS)

    Hou, Suen

    Distributed computing for the CDF experiment has been developed and is evolving towards shared resources on the computing Grid. Dedicated CAFs (CDF Analysis Farms) were constructed on Condor pools with a suite of services for user authentication, software distribution, and network connection to worker nodes. With the Condor Glide-in mechanism, the CAFs are extended to use dynamic worker pools collected from the Grid. The PacCAF (Pacific CAF) is the Glide CAF built in this way to provide a single-point portal to LCG (LHC Computing Grid) and OSG (Open Science Grid) sites in the Pacific Asia region. We discuss the implementation and service as a late-binding solution towards Grid computing.

  10. RealityGrid: an integrated approach to middleware through ICENI.

    PubMed

    Cohen, Jeremy; McGough, A Stephen; Darlington, John; Furmento, Nathalie; Kong, Gary; Mayer, Anthony

    2005-08-15

    The advancement of modelling and simulation within complex scientific applications is currently constrained by the rate at which knowledge can be extracted from the data produced. As Grid computing evolves, new means of increasing the efficiency of data analysis are being explored. RealityGrid aims to enable more efficient use of scientific computing resources within the condensed matter, materials and biological science communities. The Imperial College e-Science Networked Infrastructure (ICENI) Grid middleware provides an end-to-end pipeline that simplifies the stages of computation, simulation and collaboration. The intention of this work is to allow all scientists to have access to these features without the need for heroic efforts that have been associated with this sort of work in the past. Scientists can utilise advanced scheduling mechanisms to ensure efficient planning of computations, visualize and interactively steer simulations and securely collaborate with colleagues via the Access Grid through a single integrated middleware application.

  11. Simulation of Unsteady Flows Using an Unstructured Navier-Stokes Solver on Moving and Stationary Grids

    NASA Technical Reports Server (NTRS)

    Biedron, Robert T.; Vatsa, Veer N.; Atkins, Harold L.

    2005-01-01

    We apply an unsteady Reynolds-averaged Navier-Stokes (URANS) solver for unstructured grids to unsteady flows on moving and stationary grids. Example problems considered are relevant to active flow control and stability and control. Computational results are presented using the Spalart-Allmaras turbulence model and are compared to experimental data. The effects of grid and time-step refinement are examined.

  12. Grid infrastructures for developing mammography CAD systems.

    PubMed

    Ramos-Pollan, Raul; Franco, Jose M; Sevilla, Jorge; Guevara-Lopez, Miguel A; de Posada, Naimy Gonzalez; Loureiro, Joanna; Ramos, Isabel

    2010-01-01

    This paper presents a set of technologies developed to exploit Grid infrastructures for breast cancer CAD, including (1) federated repositories of mammography images and clinical data over Grid storage, (2) a workstation for mammography image analysis and diagnosis, and (3) a framework for data analysis and training of machine learning classifiers over Grid computing power, specially tuned for medical image based data. An experimental mammography digital repository of approximately 300 mammograms from the MIAS database was created, and classifiers were built achieving a 0.85 average area under the ROC curve on a dataset of 100 selected mammograms with representative pathological lesions and normal cases. Similar results were achieved with classifiers built for the UCI Breast Cancer Wisconsin dataset (699 feature vectors). These technologies are now being validated in a real medical environment at the Faculty of Medicine of Porto University, after a process of integrating the tools within the clinicians' workflows and IT systems.

  13. Fusion Data Grid Service

    NASA Astrophysics Data System (ADS)

    Shasharina, Svetlana; Wang, Nanbor

    2004-11-01

    Simulations and experiments in the fusion and plasma physics community generate large datasets at remote sites. Visualization and analysis of these datasets are difficult because of the incompatibility among the various data formats adopted by simulation, experiment, and analysis tools, and the large sizes of the analyzed data. Grid and Web Services technologies are capable of providing solutions for such heterogeneous settings, but need to be customized to the field-specific needs and merged with the distributed technologies currently used by the community. This paper describes how we are addressing these issues in the Fusion Grid Service under development. We also present performance results of relevant data transfer mechanisms, including binary SOAP, DIME, GridFTP, MDSplus and CORBA. We describe the status of data converters (between HDF5 and MDSplus data types), developed in collaboration with MIT (J. Stillerman). Finally, we analyze bottlenecks of the MDSplus data transfer mechanism (work performed in collaboration with General Atomics, D. Schissel and M. Qian).

  14. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high data rate instruments, and an exploratory grid environment.

  15. Research on the comparison of extension mechanism of cellular automaton based on hexagon grid and rectangular grid

    NASA Astrophysics Data System (ADS)

    Zhai, Xiaofang; Zhu, Xinyan; Xiao, Zhifeng; Weng, Jie

    2009-10-01

    Historically, a cellular automaton (CA) is a discrete dynamical mathematical structure defined on a spatial grid. Research on cellular automata systems (CAS) has focused on rule sets and initial conditions and has not discussed adjacency. Thus, the main focus of our study is the effect of adjacency on CA behavior. This paper compares rectangular grids with hexagonal grids in terms of their characteristics, strengths and weaknesses. These choices have great influence on modeling effects and other applications, including the role of the nearest neighborhood in experimental design. Our research shows that rectangular and hexagonal grids have different characteristics; they are adapted to distinct purposes, and the regular rectangular or square grid is used more often than the hexagonal grid, but their relative merits have not been widely discussed. The rectangular grid is generally preferred because of its symmetry, especially in orthogonal co-ordinate systems, and because of the frequent use of rasters in Geographic Information Systems (GIS). However, for complex terrain and uncertain, multidirectional regions, we prefer hexagonal grids and methods to facilitate and simplify the problem. Hexagonal grids can overcome directional warp and have some unique characteristics. For example, hexagonal grids have a simpler and more symmetric nearest neighborhood, which avoids the ambiguities of rectangular grids. Movement paths or connectivity and the most compact arrangement of pixels give hexagonal grids a clear advantage in modeling and analysis. The selection of an appropriate grid should be based on the requirements and objectives of the application. We use rectangular and hexagonal grids respectively for developing a city model. At the same time, we make use of remote sensing images to acquire the 2002 and 2005 land state of Wuhan. On the basis of the city land state in 2002, we use CA to simulate a reasonable form of the city in 2005. Hereby, these results provide a proof of
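
    The neighbourhood difference discussed above can be made concrete with a small sketch: a hexagonal cell in axial coordinates has six equidistant neighbours, whereas a square cell's Moore neighbourhood mixes four edge neighbours with four more distant diagonal ones. The axial-coordinate convention is an illustrative assumption, not the scheme used in the paper.

    ```python
    # Neighbourhoods on hexagonal (axial coordinates) and rectangular grids.
    HEX_DIRECTIONS = [(+1, 0), (+1, -1), (0, -1), (-1, 0), (-1, +1), (0, +1)]

    def hex_neighbours(q, r):
        """Six equidistant neighbours of a hexagonal cell."""
        return [(q + dq, r + dr) for dq, dr in HEX_DIRECTIONS]

    def moore_neighbours(x, y):
        """Eight neighbours of a square cell; the diagonal ones lie farther
        away, which is one source of directional bias."""
        return [(x + dx, y + dy)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)]

    print(len(hex_neighbours(0, 0)), len(moore_neighbours(0, 0)))  # 6 8
    ```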

  16. GridPV Toolbox

    SciTech Connect

    Broderick, Robert; Quiroz, Jimmy; Grijalva, Santiago; Reno, Matthew; Coogan, Kyle

    2014-07-15

    Matlab Toolbox for simulating the impact of solar energy on the distribution grid. The majority of the functions are useful for interfacing OpenDSS and MATLAB, and they are of generic use for commanding OpenDSS from MATLAB and retrieving GridPV Toolbox information from simulations. A set of functions is also included for modeling PV plant output and setting up the PV plant in the OpenDSS simulation. The toolbox contains functions for modeling the OpenDSS distribution feeder on satellite images with GPS coordinates. Finally, example simulations functions are included to show potential uses of the toolbox functions.

  17. SLGRID: spectral synthesis software in the grid

    NASA Astrophysics Data System (ADS)

    Sabater, J.; Sánchez, S.; Verdes-Montenegro, L.

    2011-11-01

    SLGRID (http://www.e-ciencia.es/wiki/index.php/Slgrid) is a pilot project proposed by the e-Science Initiative of Andalusia (eCA) and supported by the Spanish e-Science Network in the frame of the European Grid Initiative (EGI). The aim of the project was to adapt the spectral synthesis software Starlight (Cid-Fernandes et al. 2005) to the Grid infrastructure. Starlight is used to estimate the underlying stellar populations (their ages and metallicities) from an optical spectrum; hence, it is possible to obtain a clean nebular spectrum that can be used for diagnosing the presence of an Active Galactic Nucleus (Sabater et al. 2008, 2009). The typically serial execution of the code for big samples of galaxies made it ideal for integration into the Grid. We obtain an improvement in computational time of order N, N being the number of nodes available in the Grid. In a real case we obtained our results in 3 hours with SLGRID instead of the 60 days spent using Starlight on a PC. The code has already been ported to the Grid. The first tests were made within the e-CA infrastructure, and it was later tested and improved with the collaboration of CETA-CIEMAT. The SLGRID project has recently been renewed. In the future it is planned to adapt the code for the reduction of data from Integral Field Units, where each dataset is composed of hundreds of spectra. Electronic version of the poster at http://www.iaa.es/~jsm/SEA2010
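
    The order-N improvement follows from the fact that each spectrum is fitted independently, so the workload is embarrassingly parallel across Grid nodes. The sketch below illustrates the idea with a local worker pool and a placeholder fit function; it is not the SLGRID submission machinery.

    ```python
    # Embarrassingly parallel fits: N workers cut the wall time roughly by N.
    from multiprocessing import Pool

    def fit_spectrum(spectrum_id):
        # Placeholder for one serial Starlight-like fit of a single spectrum.
        return spectrum_id, sum(i * i for i in range(10_000))

    if __name__ == "__main__":
        spectra = list(range(1000))
        with Pool(processes=8) as pool:     # N = 8 workers on this machine
            results = pool.map(fit_spectrum, spectra)
    ```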

  18. Modelling noise propagation using Grid Resources. Progress within GDI-Grid

    NASA Astrophysics Data System (ADS)

    Kiehle, Christian; Mayer, Christian; Padberg, Alexander; Stapelfeld, Hartmut

    2010-05-01

    GDI-Grid (English: SDI-Grid) is a research project funded by the German Federal Ministry of Education and Research (BMBF). It aims at bridging the gaps between OGC Web Services (OWS) and Grid infrastructures and at identifying the potential of utilizing the superior storage capacities and computational power of Grid infrastructures for geospatial applications while keeping the well-known service interfaces specified by the OGC. The project considers all major OGC web service interfaces for web mapping (WMS), feature access (Web Feature Service), coverage access (Web Coverage Service) and processing (Web Processing Service). The major challenge within GDI-Grid is the harmonization of diverging standards as defined by standardization bodies for Grid computing and spatial information exchange. The project started in 2007 and will continue until June 2010. The concept for the gridification of OWS developed by lat/lon GmbH and the Department of Geography of the University of Bonn is applied to three real-world scenarios in order to check its practicability: a flood simulation, a scenario for emergency routing and a noise propagation simulation. The latter scenario is addressed by the Stapelfeldt Ingenieurgesellschaft mbH, located in Dortmund, which is adapting its LimA software to utilize Grid resources. Noise mapping of e.g. traffic noise in urban agglomerations and along major trunk roads is a recurring demand of the EU Noise Directive. Input data comprise the road network and traffic, terrain, buildings and noise protection screens, as well as the population distribution. Noise impact levels are generally calculated on a 10 m grid and along relevant building facades. For each receiver position, sources within a typical range of 2000 m are split into small segments, depending on local geometry. For each of the segments, propagation analysis includes diffraction effects caused by all obstacles on the path of sound propagation

  19. Ion Accelerator With Negatively Biased Decelerator Grid

    NASA Technical Reports Server (NTRS)

    Brophy, John R.

    1994-01-01

    Three-grid ion accelerator in which accelerator grid is biased at negative potential and decelerator grid downstream of accelerator grid biased at smaller negative potential. This grid and bias arrangement reduces frequency of impacts, upon accelerator grid, of charge-exchange ions produced downstream in collisions between accelerated ions and atoms and molecules of background gas. Sputter erosion of accelerator grid reduced.

  20. Essential Grid Workflow Monitoring Elements

    SciTech Connect

    Gunter, Daniel K.; Jackson, Keith R.; Konerding, David E.; Lee,Jason R.; Tierney, Brian L.

    2005-07-01

    Troubleshooting Grid workflows is difficult. A typical workflow involves a large number of components (networks, middleware, hosts, etc.) that can fail. Even when monitoring data from all these components is accessible, it is hard to tell whether failures and anomalies in these components are related to a given workflow. For the Grid to be truly usable, much of this uncertainty must be eliminated. We propose two new Grid monitoring elements, Grid workflow identifiers and consistent component lifecycle events, that will make Grid troubleshooting easier, and thus make Grids more usable, by simplifying the correlation of Grid monitoring data with a particular Grid workflow.

  1. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources together into a general computational utility 'grid', analogous to the national electrical power grid infrastructure. The purpose of the Computational Grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.

  2. Unlocking the smart grid

    SciTech Connect

    Rokach, Joshua Z.

    2010-10-15

    The country has progressed in a relatively short time from rotary dial phones to computers, cell phones, and iPads. With proper planning and orderly policy implementation, the same will happen with the Smart Grid. Here are some suggestions on how to proceed. (author)

  3. NSTAR Smart Grid Pilot

    SciTech Connect

    Rabari, Anil; Fadipe, Oloruntomi

    2014-03-31

    NSTAR Electric & Gas Corporation (“the Company”, or “NSTAR”) developed and implemented a Smart Grid pilot program beginning in 2010 to demonstrate the viability of leveraging existing automated meter reading (“AMR”) deployments to provide much of the Smart Grid functionality of advanced metering infrastructure (“AMI”), but without the large capital investment that AMI rollouts typically entail. In particular, a central objective of the Smart Energy Pilot was to enable residential dynamic pricing (time-of-use “TOU” and critical peak rates and rebates) and two-way direct load control (“DLC”) by continually capturing AMR meter data transmissions and communicating through customer-sited broadband connections in conjunction with a standards-based home area network (“HAN”). The pilot was supported by the U.S. Department of Energy (“DOE”) through the Smart Grid Demonstration program. NSTAR was very pleased to receive not only funding support from the DOE, but also its guidance and support throughout the pilot. NSTAR is also pleased to report to the DOE that it was able to execute and deliver a successful pilot on time and on budget. NSTAR looks for future opportunities to work with the DOE and others in future smart grid projects.

  4. Grid computing enhances standards-compatible geospatial catalogue service

    NASA Astrophysics Data System (ADS)

    Chen, Aijun; Di, Liping; Bai, Yuqi; Wei, Yaxing; Liu, Yang

    2010-04-01

    A catalogue service facilitates sharing, discovery, retrieval, management of, and access to large volumes of distributed geospatial resources, for example data, services, applications, and their replicas on the Internet. Grid computing provides an infrastructure for effective use of computing, storage, and other resources available online. The Open Geospatial Consortium has proposed a catalogue service specification and a series of profiles for promoting the interoperability of geospatial resources. By referring to the profile of the catalogue service for Web, an innovative information model of a catalogue service is proposed to offer Grid-enabled registry, management, retrieval of and access to geospatial resources and their replicas. This information model extends the e-business registry information model by adopting several geospatial data and service metadata standards—the International Organization for Standardization (ISO)'s 19115/19119 standards and the US Federal Geographic Data Committee (FGDC) and US National Aeronautics and Space Administration (NASA) metadata standards for describing and indexing geospatial resources. In order to select the optimal geospatial resources and their replicas managed by the Grid, the Grid data management service and information service from the Globus Toolkits are closely integrated with the extended catalogue information model. Based on this new model, a catalogue service is implemented first as a Web service. Then, the catalogue service is further developed as a Grid service conforming to Grid service specifications. The catalogue service can be deployed in both the Web and Grid environments and accessed by standard Web services or authorized Grid services, respectively. The catalogue service has been implemented at the George Mason University/Center for Spatial Information Science and Systems (GMU/CSISS), managing more than 17 TB of geospatial data and geospatial Grid services. This service makes it easy to share and

  5. The surveillance error grid.

    PubMed

    Klonoff, David C; Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B; Kirkman, M Sue; Kovatchev, Boris

    2014-07-01

    Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, have developed a new error grid, called the surveillance error grid (SEG) as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor. The impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph per its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), Parkes error grid (PEG), and SEG. SEG action boundaries were consistent across scenarios, regardless of whether the patient was type 1 or type 2 or using insulin or not. No significant differences were noted between responses of adult/pediatric or 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to

  6. Can Clouds replace Grids? Will Clouds replace Grids?

    NASA Astrophysics Data System (ADS)

    Shiers, J. D.

    2010-04-01

    The world's largest scientific machine - comprising dual 27 km circular proton accelerators cooled to 1.9 K and located some 100 m underground - currently relies on major production Grid infrastructures for the offline computing needs of the 4 main experiments that will take data at this facility. After many years of sometimes difficult preparation the computing service has been declared "open" and ready to meet the challenges that will come shortly when the machine restarts in 2009. But the service is not without its problems: reliability - as seen by the experiments, as opposed to that measured by the official tools - still needs to be significantly improved. Prolonged downtimes or degradations of major services or even complete sites are still too common, and the operational and coordination effort to keep the overall service running is probably not sustainable at this level. Recently "Cloud Computing" - in terms of pay-per-use fabric provisioning - has emerged as a potentially viable alternative, but with rather different strengths and no doubt weaknesses too. Based on the concrete needs of the LHC experiments - where the total data volume that will be acquired over the full lifetime of the project, including the additional data copies that are required by the Computing Models of the experiments, approaches 1 Exabyte - we analyze the pros and cons of Grids versus Clouds. This analysis covers not only technical issues - such as those related to demanding database and data management needs - but also sociological aspects, which cannot be ignored either in terms of funding or in the wider context of the essential but often overlooked role of science in society, education and economy.

  7. A VO-Driven Astronomical Data Grid in China

    NASA Astrophysics Data System (ADS)

    Cui, C.; He, B.; Yang, Y.; Zhao, Y.

    2010-12-01

    With the implementation of many ambitious observation projects, including LAMOST, FAST, and the Antarctic observatory at Dome A, observational astronomy in China is stepping into a brand new era with an emerging data avalanche. In the era of e-Science, both these cutting-edge projects and traditional astronomy research need much more powerful data management, sharing and interoperability. Based on the data-grid concept and taking advantage of IVOA interoperability technologies, China-VO is developing a VO-driven astronomical data grid environment to enable multi-wavelength science and large database science. In the paper, the latest progress and data flow of LAMOST, the architecture of the data grid, and its support for the VO are discussed.

  8. Equil: A Global Grid System

    NASA Astrophysics Data System (ADS)

    Hahn, Sebastian; Reimer, Christioph; Paulik, Christoph; Wagner, Wolfgang

    2016-08-01

    Geophysical parameters derived from space-borne Earth observation systems are either assigned to discrete points on a fixed Earth grid (e.g. a regular lon/lat grid) or located on orbital point nodes with a customized arrangement, often in line with the instrument's measurement geometry. The driving factors in the choice and structure of a spatial reference system (i.e. the grid) are typically spatial resolution, instrument geometry, measurement technique or application. In this study we propose a global grid system, the so-called Equil grid, and demonstrate its realization and structure. An exemplary Equil grid with a base sampling distance of 12.5 km is compared against two other grids commonly used in the domain of remote sensing of soil moisture. The simple, nearly equidistant grid design makes it interesting for a wide range of other geophysical parameters as well.

  9. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  10. A comparative analysis of dynamic grids vs. virtual grids using the A3pviGrid framework.

    PubMed

    Shankaranarayanan, Avinas; Amaldas, Christine

    2010-11-01

    With the proliferation of quad/multi-core microprocessors in mainstream platforms such as desktops and workstations, a large number of unused CPU cycles can be utilized for running virtual machines (VMs) as dynamic nodes in distributed environments. Grid services and their service-oriented business broker, now termed cloud computing, could deploy image-based virtualization platforms enabling agent-based resource management and dynamic fault management. In this paper we present an efficient way of utilizing heterogeneous virtual machines on idle desktops as an environment for consumption of high performance grid services. Spurious and exponential increases in the size of datasets are constant concerns in the medical and pharmaceutical industries due to the constant discovery and publication of large sequence databases. Traditional algorithms are not designed to handle large data sizes under sudden and dynamic changes in the execution environment, as previously discussed. This research was undertaken to compare our previous results with running the same test dataset on a virtual Grid platform using virtual machines (virtualization). The implemented architecture, A3pviGrid, utilizes game-theoretic optimization and agent-based team formation (coalition) algorithms to improve scalability with respect to team formation. Due to the dynamic nature of distributed systems (as discussed in our previous work), all interactions were made local within a team, transparently. This paper is a proof of concept of an experimental mini-Grid test-bed compared to running the platform on local virtual machines on a local test cluster. This was done to give every agent its own execution platform, enabling anonymity and better control of the dynamic environmental parameters. We also analyze the performance and scalability of BLAST in a multiple virtual node setup and present our findings. This paper is an extension of our previous research on improving the BLAST application framework

  11. Photofabricated Wire-Grid Polarizers

    NASA Technical Reports Server (NTRS)

    Siegel, Peter H.; Dengler, Robert J.

    1992-01-01

    Freestanding metallic grids for use as polarizers for electromagnetic radiation at millimeter and submillimeter wavelengths made by simple modification of designs of freestanding square- and nearly-square cell metallic grids, according to proposal. Cross wires provide mechanical support, but distance between cross wires made greater than one wavelength so cross wires have little effect on polarizing characteristics of grid. Possible to fabricate grids commercially for frequencies up to several terahertz.

  12. Applications of algebraic grid generation

    NASA Technical Reports Server (NTRS)

    Eiseman, Peter R.; Smith, Robert E.

    1990-01-01

    Techniques and applications of algebraic grid generation are described. The techniques are univariate interpolations and transfinite assemblies of univariate interpolations. Because algebraic grid generation is computationally efficient, the use of interactive graphics in conjunction with the techniques is advocated. A flexible approach, which works extremely well in an interactive environment, called the control point form of algebraic grid generation is described. The applications discussed are three-dimensional grids constructed about airplane and submarine configurations.
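
    A transfinite assembly of univariate (linear) interpolations can be sketched as classical bilinear transfinite interpolation, which fills the interior of a grid from four boundary curves. The boundary curves in the example are illustrative assumptions.

    ```python
    # Bilinear transfinite interpolation from four boundary curves (sketch).
    import numpy as np

    def tfi_2d(bottom, top, left, right):
        """Boundary curves as (n, 2) and (m, 2) arrays with matching corners;
        returns an (m, n, 2) array of grid-node coordinates."""
        n, m = bottom.shape[0], left.shape[0]
        xi = np.linspace(0.0, 1.0, n)[None, :, None]
        eta = np.linspace(0.0, 1.0, m)[:, None, None]
        return ((1 - eta) * bottom[None, :, :] + eta * top[None, :, :]
                + (1 - xi) * left[:, None, :] + xi * right[:, None, :]
                - ((1 - xi) * (1 - eta) * bottom[0] + xi * (1 - eta) * bottom[-1]
                   + (1 - xi) * eta * top[0] + xi * eta * top[-1]))

    # Example: a unit square whose top boundary bulges upward.
    n, m = 21, 11
    s, t = np.linspace(0, 1, n), np.linspace(0, 1, m)
    bottom = np.column_stack([s, np.zeros(n)])
    top = np.column_stack([s, 1.0 + 0.1 * np.sin(np.pi * s)])
    left = np.column_stack([np.zeros(m), t])
    right = np.column_stack([np.ones(m), t])
    grid = tfi_2d(bottom, top, left, right)
    ```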

  13. A grid spacing control technique for algebraic grid generation methods

    NASA Technical Reports Server (NTRS)

    Smith, R. E.; Kudlinski, R. A.; Everton, E. L.

    1982-01-01

    A technique which controls the spacing of grid points in algebraically defined coordinate transformations is described. The technique is based on the generation of control functions which map a uniformly distributed computational grid onto parametric variables defining the physical grid. The control functions are smoothed cubic splines. Sets of control points are input for each coordinate direction to outline the control functions. Smoothed cubic spline functions are then generated to approximate the input data. The technique works best in an interactive graphics environment where control inputs and grid displays are nearly instantaneous. The technique is illustrated with the two-boundary grid generation algorithm.
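
    The idea of a spline control function that maps the uniform computational coordinate onto the parametric variable of the physical grid can be sketched as follows. A monotone PCHIP interpolant is substituted here to keep the mapping one-to-one; the technique described above uses smoothed cubic splines fitted to the input control points.

    ```python
    # Spacing control via a monotone spline control function (sketch).
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    # Control points: uniform computational coordinate u -> parametric coordinate s.
    u_ctrl = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    s_ctrl = np.array([0.0, 0.05, 0.20, 0.55, 1.0])   # clusters points near s = 0

    control = PchipInterpolator(u_ctrl, s_ctrl)

    u = np.linspace(0.0, 1.0, 41)   # uniform computational grid
    s = control(u)                  # redistributed parametric values
    x = 10.0 * s                    # applied to a physical curve of length 10
    ```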

  14. Exploring the Opinions of Pre-Service Science Teachers in Their Experimental Designs Prepared Based on Various Approaches

    ERIC Educational Resources Information Center

    Benzer, Elif

    2015-01-01

    Students working in laboratories in the 21st century are expected to take part as active participants in experiments, coming up with their own designs and projects by developing new ideas and problems rather than implementing the ones told and ordered by others during these experiments. The science teachers that would have the students…

  15. The Experimental Teaching Reform in Biochemistry and Molecular Biology for Undergraduate Students in Peking University Health Science Center

    ERIC Educational Resources Information Center

    Yang, Xiaohan; Sun, Luyang; Zhao, Ying; Yi, Xia; Zhu, Bin; Wang, Pu; Lin, Hong; Ni, Juhua

    2015-01-01

    Since 2010, second-year undergraduate students of an eight-year training program leading to a Doctor of Medicine degree or Doctor of Philosophy degree in Peking University Health Science Center (PKUHSC) have been required to enter the "Innovative talent training project." During that time, the students joined a research lab and…

  16. Grid Interaction Technical Team Roadmap

    SciTech Connect

    2013-06-01

    The mission of the Grid Interaction Technical Team (GITT) is to support a transition scenario to large scale grid-connected vehicle charging with transformational technology, proof of concept and information dissemination. The GITT facilitates technical coordination and collaboration between vehicle-grid connectivity and communication activities among U.S. DRIVE government and industry partners.

  17. Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid

    SciTech Connect

    2012-02-08

    GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud’s data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell’s GridControl will focus on 4 elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.

  18. Gridded electron reversal ionizer

    NASA Technical Reports Server (NTRS)

    Chutjian, Ara (Inventor)

    1993-01-01

    A gridded electron reversal ionizer forms a three-dimensional cloud of zero or near-zero energy electrons in a cavity within a filament structure surrounding a central electrode having holes through which the sample gas, at reduced pressure, enters an elongated reversal volume. The resultant negative ion stream is applied to a mass analyzer. The reduced electron and ion space-charge limitations of this configuration enhance detection sensitivity for material to be detected by electron attachment, such as narcotic and explosive vapors. Positive ions may be generated by producing electrons of higher energy, sufficient to ionize the target gas, pulsing the grid negative to stop the electron flow, and pulsing the extraction aperture positive to draw out the positive ions.

  19. Shuttle computational grid generation

    NASA Technical Reports Server (NTRS)

    Ing, Chang

    1987-01-01

    The well-known Karman-Trefftz conformal transformation, consisting of repeated applications of the same basic formula, was found to be quite successful for body, wing, and wing-body cross sections. This grid generation technique is extended to cross sections of more complex forms and is also made more automatic. Computer programs were written for the selection of hinge points on cross sections with angular shapes, the Karman-Trefftz transformation of arbitrary shapes, and the special transformation of hinge points on the imaginary axis. A feasibility study is performed for the future application of conformal mapping grid generation to complex three-dimensional configurations. Examples such as an Orbiter vehicle section and a few others were used.
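
    The basic formula applied repeatedly is the Karman-Trefftz conformal map, sketched below for points on a circle passing through one of its singular points. The offset circle and trailing-edge angle are illustrative choices, not the Shuttle hinge-point parameters.

    ```python
    # Karman-Trefftz conformal transformation (reduces to the Joukowski map
    # when the trailing-edge angle is zero, i.e. n = 2).
    import numpy as np

    def karman_trefftz(zeta, a=1.0, trailing_edge_angle=np.radians(10.0)):
        n = 2.0 - trailing_edge_angle / np.pi
        w = ((zeta - a) / (zeta + a)) ** n
        return n * a * (1 + w) / (1 - w)

    # Circle through zeta = a, offset to give thickness and camber.
    theta = np.linspace(0.0, 2.0 * np.pi, 201)
    center = -0.1 + 0.1j
    radius = abs(1.0 - center)
    z = karman_trefftz(center + radius * np.exp(1j * theta))
    ```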

  20. Smart Grid Demonstration Project

    SciTech Connect

    Miller, Craig; Carroll, Paul; Bell, Abigail

    2015-03-11

    The National Rural Electric Cooperative Association (NRECA) organized the NRECA-U.S. Department of Energy (DOE) Smart Grid Demonstration Project (DE-OE0000222) to install and study a broad range of advanced smart grid technologies in a demonstration that spanned 23 electric cooperatives in 12 states. More than 205,444 pieces of electronic equipment and more than 100,000 minor items (brackets, labels, mounting hardware, fiber optic cable, etc.) were installed to upgrade and enhance the efficiency, reliability, and resiliency of the power networks at the participating co-ops. The objective of this project was to build a path for other electric utilities, and particularly electrical cooperatives, to adopt emerging smart grid technology when it can improve utility operations, thus advancing the co-ops’ familiarity and comfort with such technology. Specifically, the project executed multiple subprojects employing a range of emerging smart grid technologies to test their cost-effectiveness and, where the technology demonstrated value, provided case studies that will enable other electric utilities, particularly electric cooperatives, to use these technologies. NRECA structured the project according to the following three areas: Demonstration of smart grid technology; Advancement of standards to enable the interoperability of components; and Improvement of grid cyber security. We termed these three areas Technology Deployment Study, Interoperability, and Cyber Security. Although the deployment of technology and the study of the demonstration projects at co-ops accounted for the largest portion of the project budget by far, we see our accomplishments in each of the areas as critical to advancing the smart grid. All project deliverables have been published. Technology Deployment Study: The deliverable was a set of 11 single-topic technical reports in areas related to the listed technologies. Each of these reports has already been submitted to DOE, distributed to co-ops, and

  1. Grid generation and inviscid flow computation about aircraft geometries

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.

    1989-01-01

    Grid generation and Euler flow about fighter aircraft are described. A fighter aircraft geometry is specified by an area ruled fuselage with an internal duct, cranked delta wing or strake/wing combinations, canard and/or horizontal tail surfaces, and vertical tail surfaces. The initial step before grid generation and flow computation is the determination of a suitable grid topology. The external grid topology that has been applied is called a dual-block topology, which is a patched C^1-continuous multiple-block system where inner blocks cover the highly-swept part of a cranked wing or strake, the rearward inner part of the wing, and tail components. Outer blocks cover the remainder of the fuselage, the outer part of the wing, and canards, and extend to the far-field boundaries. The grid generation is based on transfinite interpolation with Lagrangian blending functions. This procedure has been applied to the Langley experimental fighter configuration and a modified F-18 configuration. Supersonic flow between Mach 1.3 and 2.5 and angles of attack between 0 degrees and 10 degrees have been computed with associated Euler solvers based on the finite-volume approach. When coupling geometric details such as boundary layer diverter regions, duct regions with inlets and outlets, or slots with the general external grid, imposing C^1 continuity can be extremely tedious. The approach taken here is to patch blocks together at common interfaces where there is no grid continuity, but to enforce conservation in the finite-volume solution. The key to this technique is how to obtain the information required for a conservative interface. The Ramshaw technique, which automates the computation of proportional areas of two overlapping grids on a planar surface and is suitable for coding, was used. Researchers generated internal duct grids for the Langley experimental fighter configuration independent of the external grid topology, with a conservative interface at the inlet and outlet.
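
    The grid generation above rests on transfinite interpolation with Lagrangian blending functions. The snippet below is a minimal two-dimensional version of that idea (a Coons patch with linear blending); it is not the dual-block, C^1-patched 3D system of the paper, and the boundary-curve names and quarter-annulus example are illustrative assumptions.

      import numpy as np

      def tfi_2d(bottom, top, left, right):
          """Transfinite interpolation (Coons patch) with linear Lagrangian blending.

          bottom, top : arrays of shape (ni, 2), boundary curves in the xi-direction
          left, right : arrays of shape (nj, 2), boundary curves in the eta-direction
          The four curves must share corner points.  Returns a grid of shape (ni, nj, 2).
          """
          ni, nj = bottom.shape[0], left.shape[0]
          xi = np.linspace(0.0, 1.0, ni)[:, None, None]    # linear blending functions
          eta = np.linspace(0.0, 1.0, nj)[None, :, None]

          u = (1 - eta) * bottom[:, None, :] + eta * top[:, None, :]
          v = (1 - xi) * left[None, :, :] + xi * right[None, :, :]
          uv = ((1 - xi) * (1 - eta) * bottom[0] + xi * (1 - eta) * bottom[-1]
                + (1 - xi) * eta * top[0] + xi * eta * top[-1])
          return u + v - uv                                 # boundaries are reproduced exactly

      # Example: a quarter annulus blended between an inner arc and an outer arc.
      t = np.linspace(0.0, np.pi / 2, 41)
      inner = np.column_stack([np.cos(t), np.sin(t)])       # eta = 0 boundary
      outer = np.column_stack([2 * np.cos(t), 2 * np.sin(t)])  # eta = 1 boundary
      s = np.linspace(1.0, 2.0, 21)
      left = np.column_stack([s, np.zeros_like(s)])         # xi = 0 boundary
      right = np.column_stack([np.zeros_like(s), s])        # xi = 1 boundary
      grid = tfi_2d(inner, outer, left, right)              # shape (41, 21, 2)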

  2. Wireless Communications in Smart Grid

    NASA Astrophysics Data System (ADS)

    Bojkovic, Zoran; Bakmaz, Bojan

    Communication networks play a crucial role in the smart grid, as the intelligence of this complex system is built on information exchange across the power grid. Wireless communications and networking are among the most economical ways to build the essential part of the scalable communication infrastructure for the smart grid. In particular, wireless networks will be deployed widely in the smart grid for automatic meter reading, remote system and customer site monitoring, as well as equipment fault diagnosis. With an increasing interest from both the academic and industrial communities, this chapter systematically investigates recent advances in wireless communication technology for the smart grid.

  3. USA National Phenology Network gridded products documentation

    USGS Publications Warehouse

    Crimmins, Theresa M.; Marsh, R. Lee; Switzer, Jeff R.; Crimmins, Michael A.; Gerst, Katharine L.; Rosemartin, Alyssa H.; Weltzin, Jake F.

    2017-02-23

    The goals of the USA National Phenology Network (USA-NPN, www.usanpn.org) are to advance science, inform decisions, and communicate and connect with the public regarding phenology and species’ responses to environmental variation and climate change. The USA-NPN seeks to facilitate informed ecosystem stewardship and management by providing phenological information freely and openly. One way the USA-NPN is endeavoring to accomplish these goals is by providing data and data products in a wide range of formats, including gridded real-time, short-term forecasted, and historical maps of phenological events, patterns and trends. This document describes the suite of gridded phenologically relevant data products produced and provided by the USA National Phenology Network, which can be accessed at www.usanpn.org/data/phenology_maps and also through web services at geoserver.usanpn.org/geoserver/wms?request=GetCapabilities.

  4. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides low-level implementation details of the Grid and hence enables users to focus on higher levels of the application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflows is important in order to support large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.

  5. TRMM Gridded Text Products

    NASA Technical Reports Server (NTRS)

    Stocker, Erich Franz

    2007-01-01

    NASA's Tropical Rainfall Measuring Mission (TRMM) has many products that contain instantaneous or gridded rain rates, often among many other parameters. However, because of their completeness these products can seem intimidating to users who simply want surface rain rates. For example, one of the gridded monthly products contains well over 200 parameters. It is clear that if only rain rates are desired, this many parameters might prove intimidating. In addition, for many good reasons these products are archived and currently distributed in HDF format. This also can be an inhibiting factor in using TRMM rain rates. To provide a simple format and isolate just the rain rates from the many other parameters, the TRMM project created a series of gridded products in ASCII text format. This paper describes the various text rain rate products produced. It provides detailed information about parameters and how they are calculated. It also gives detailed format information. These products are used in a number of applications within the TRMM processing system. The products are produced from the swath instantaneous rain rates and contain information from the three major TRMM instruments: radar, radiometer, and combined. They are simple to use, human readable, and small for downloading.
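
    As a rough sketch of how such human-readable gridded text can be consumed, the snippet below reads a hypothetical whitespace-delimited rain-rate grid into an array. The file name, grid origin, and 0.5-degree spacing are assumptions made only for illustration; the real TRMM text products define their own headers, grids, and conventions, which the paper documents.

      import numpy as np

      def read_gridded_rain(path, lat0=-40.0, lon0=-180.0, dlat=0.5, dlon=0.5):
          """Load a plain-text grid whose rows are latitude bands and whose
          whitespace-separated columns are longitude bins of rain rate (mm/h)."""
          rates = np.loadtxt(path)                       # shape: (n_lat, n_lon)
          lats = lat0 + dlat * np.arange(rates.shape[0])
          lons = lon0 + dlon * np.arange(rates.shape[1])
          return lats, lons, rates

      # Usage sketch (file name is hypothetical):
      # lats, lons, rain = read_gridded_rain("trmm_monthly_rain.txt")
      # print(rain[np.searchsorted(lats, 10.0), np.searchsorted(lons, -60.0)])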

  6. 3D Structured Grid Adaptation

    NASA Technical Reports Server (NTRS)

    Banks, D. W.; Hafez, M. M.

    1996-01-01

    Grid adaptation for structured meshes is the art of using information from an existing, but poorly resolved, solution to automatically redistribute the grid points in such a way as to improve the resolution in regions of high error, and thus the quality of the solution. This involves: (1) generating a grid via some standard algorithm, (2) calculating a solution on this grid, (3) adapting the grid to this solution, (4) recalculating the solution on the adapted grid, and (5) repeating steps 3 and 4 to satisfaction. Steps 3 and 4 can be repeated until some 'optimal' grid is converged to, but typically this is not worth the effort and just two or three repeat calculations are necessary. They also may be repeated every 5-10 time steps for unsteady calculations.
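
    A minimal one-dimensional illustration of step (3), redistributing points by equidistributing a gradient-based weight, is sketched below. It is not the 3D structured method of the report; the monitor function, the tanh "solution", and the parameter values are illustrative assumptions.

      import numpy as np

      def adapt_1d(x, u, alpha=1.0):
          """Redistribute grid points x by equidistributing a gradient-based weight.

          x, u  : current grid and solution values on it (both length n)
          alpha : controls how strongly points cluster in high-gradient regions
          Returns a new grid with the same endpoints and number of points.
          """
          w = 1.0 + alpha * np.abs(np.gradient(u, x))        # monitor (weight) function
          cdf = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
          cdf /= cdf[-1]                                     # normalised weight measure
          targets = np.linspace(0.0, 1.0, x.size)            # equal weight per cell
          return np.interp(targets, cdf, x)

      # Cluster points around the steep front of a tanh profile, mimicking a few
      # passes of the adapt/recompute cycle described above.
      x = np.linspace(0.0, 1.0, 41)
      for _ in range(3):
          u = np.tanh(20.0 * (x - 0.5))                      # stand-in for a computed solution
          x = adapt_1d(x, u, alpha=5.0)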

  7. Constructing the ASCI computational grid

    SciTech Connect

    BEIRIGER,JUDY I.; BIVENS,HUGH P.; HUMPHREYS,STEVEN L.; JOHNSON,WILBUR R.; RHEA,RONALD E.

    2000-06-01

    The Accelerated Strategic Computing Initiative (ASCI) computational grid is being constructed to interconnect the high performance computing resources of the nuclear weapons complex. The grid will simplify access to the diverse computing, storage, network, and visualization resources, and will enable the coordinated use of shared resources regardless of location. To match existing hardware platforms, required security services, and current simulation practices, the Globus MetaComputing Toolkit was selected to provide core grid services. The ASCI grid extends Globus functionality by operating as an independent grid, incorporating Kerberos-based security, interfacing to Sandia's Cplant (TM), and extending job monitoring services. To fully meet ASCI's needs, the architecture layers distributed work management and criteria-driven resource selection services on top of Globus. These services simplify the grid interface by allowing users to simply request "run code X anywhere". This paper describes the initial design and prototype of the ASCI grid.

  8. On transferring the grid technology to the biomedical community.

    PubMed

    Mohammed, Yassene; Sax, Ulrich; Dickmann, Frank; Lippert, Joerg; Solodenko, Juri; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which resulted in the Grid. The inter-domain transfer process of this technology has been an intuitive process. Some difficulties facing the life science community can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies that have achieved a certain stability. Grid and Cloud solutions are technologies that are still in flux. We illustrate how Grid computing creates new difficulties for the technology transfer process that are not considered in Bozeman's model. We show why the success of health Grids should be measured by the qualified scientific human capital and opportunities created, and not primarily by the market impact. With two examples we show how the Grid technology transfer theory corresponds to reality. We conclude with recommendations that can help improve the adoption of Grid solutions in the biomedical community. These results give a more concise explanation of the difficulties most life science IT projects are facing in the late funding periods, and show some leveraging steps which can help to overcome the "vale of tears".

  9. Progress in Grid Generation: From Chimera to DRAGON Grids

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Kao, Kai-Hsiung

    1994-01-01

    Hybrid grids, composed of structured and unstructured grids, combine the best features of both. The chimera method is a major stepping stone toward a hybrid grid, from which the present approach has evolved. A chimera grid comprises a set of overlapped structured grids which are independently generated and body-fitted, yielding a high quality grid readily accessible for efficient solution schemes. The chimera method has been shown to be efficient for generating grids about complex geometries and has been demonstrated to deliver accurate aerodynamic prediction of complex flows. While its geometrical flexibility is attractive, the interpolation of data in the overlapped regions, which in today's 3D practice is done in a nonconservative fashion, is not. In the present paper we propose a hybrid grid scheme that maximizes the advantages of the chimera scheme and adopts the strengths of the unstructured grid while at the same time keeping its weaknesses minimal. Like the chimera method, we first divide up the physical domain by a set of structured body-fitted grids which are separately generated and overlaid throughout a complex configuration. To eliminate any pure data manipulation which does not necessarily follow the governing equations, we use unstructured grids only to directly replace the region of the arbitrarily overlapped grids. This new adaptation of the chimera thinking is coined the DRAGON grid. The unstructured grid region sandwiched between the structured grids is limited in size, resulting in only a small increase in memory and computational effort. The DRAGON method has three important advantages: (1) preserving the strengths of the chimera grid; (2) eliminating difficulties sometimes encountered in the chimera scheme, such as orphan points and poor quality of interpolation stencils; and (3) making grid communication in a fully conservative and consistent manner insofar as the governing equations are concerned. To demonstrate its use, the governing equations are

  10. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, the memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, which is discussed later.

  11. Smart Grid Risk Management

    NASA Astrophysics Data System (ADS)

    Abad Lopez, Carlos Adrian

    Current electricity infrastructure is being stressed from several directions: high demand, unreliable supply, extreme weather conditions, and accidents, among others. Infrastructure planners have traditionally focused only on the cost of the system; today, resilience and sustainability are becoming increasingly important. In this dissertation, we develop computational tools for efficiently managing electricity resources to help create a more reliable and sustainable electrical grid. The tools we present in this work will help electric utilities coordinate demand to allow the smooth and large-scale integration of renewable sources of energy into traditional grids, as well as provide infrastructure planners and operators in developing countries a framework for making informed planning and control decisions in the presence of uncertainty. Demand-side management is considered the most viable solution for maintaining grid stability as generation from intermittent renewable sources increases. Demand-side management, particularly demand response (DR) programs that attempt to alter the energy consumption of customers either by using price-based incentives or up-front power interruption contracts, is more cost-effective and sustainable in addressing short-term supply-demand imbalances when compared with the alternative that involves increasing fossil fuel-based fast spinning reserves. An essential step in compensating participating customers and benchmarking the effectiveness of DR programs is to be able to independently detect the load reduction from observed meter data. Electric utilities implementing automated DR programs through direct load control switches are also interested in detecting the reduction in demand to efficiently pinpoint non-functioning devices and reduce maintenance costs. We develop sparse optimization methods for detecting a small change in the demand for electricity of a customer in response to a price change or signal from the utility
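
    As a toy illustration of the sparse detection idea mentioned above (not the dissertation's actual formulation, which is not given here), the snippet below estimates a single post-signal demand shift with an L1 penalty; the closed-form solution is a soft-thresholded mean of the post-signal residuals, so negligible fluctuations are reported as exactly zero. All data, names, and parameter values are synthetic assumptions.

      import numpy as np

      def soft_threshold(z, lam):
          """Shrink z toward zero by lam; returns 0 when |z| <= lam."""
          return np.sign(z) * max(abs(z) - lam, 0.0)

      def detect_dr_shift(meter, baseline, signal_idx, lam=2.0):
          """L1-penalised estimate of the demand shift after a DR signal.

          Fits y_t = baseline_t + delta for t >= signal_idx with penalty lam*|delta|.
          """
          resid = meter[signal_idx:] - baseline[signal_idx:]
          return soft_threshold(resid.sum(), lam) / resid.size

      # Synthetic customer: roughly 0.8 kW shed after the DR signal at slot 48.
      rng = np.random.default_rng(0)
      baseline = 3.0 + 0.5 * np.sin(np.arange(96) * 2 * np.pi / 96)   # kW, 15-min slots
      meter = baseline + rng.normal(0.0, 0.2, 96)
      meter[48:] -= 0.8
      print(detect_dr_shift(meter, baseline, signal_idx=48))  # negative: load reduction detected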

  12. The BioGRID interaction database: 2015 update.

    PubMed

    Chatr-Aryamontri, Andrew; Breitkreutz, Bobby-Joe; Oughtred, Rose; Boucher, Lorrie; Heinicke, Sven; Chen, Daici; Stark, Chris; Breitkreutz, Ashton; Kolas, Nadine; O'Donnell, Lara; Reguly, Teresa; Nixon, Julie; Ramage, Lindsay; Winter, Andrew; Sellam, Adnane; Chang, Christie; Hirschman, Jodi; Theesfeld, Chandra; Rust, Jennifer; Livstone, Michael S; Dolinski, Kara; Tyers, Mike

    2015-01-01

    The Biological General Repository for Interaction Datasets (BioGRID: http://thebiogrid.org) is an open access database that houses genetic and protein interactions curated from the primary biomedical literature for all major model organism species and humans. As of September 2014, the BioGRID contains 749,912 interactions as drawn from 43,149 publications that represent 30 model organisms. This interaction count represents a 50% increase compared to our previous 2013 BioGRID update. BioGRID data are freely distributed through partner model organism databases and meta-databases and are directly downloadable in a variety of formats. In addition to general curation of the published literature for the major model species, BioGRID undertakes themed curation projects in areas of particular relevance for biomedical sciences, such as the ubiquitin-proteasome system and various human disease-associated interaction networks. BioGRID curation is coordinated through an Interaction Management System (IMS) that facilitates the compilation of interaction records through structured evidence codes, phenotype ontologies, and gene annotation. The BioGRID architecture has been improved in order to support a broader range of interaction and post-translational modification types, to allow the representation of more complex multi-gene/protein interactions, to account for cellular phenotypes through structured ontologies, to expedite curation through semi-automated text-mining approaches, and to enhance curation quality control.

  13. The pilot way to Grid resources using glideinWMS

    SciTech Connect

    Sfiligoi, Igor; Bradley, Daniel C.; Holzman, Burt; Mhashilkar, Parag; Padhi, Sanjay; Wurthwein, Frank; /UC, San Diego

    2010-09-01

    Grid computing has become very popular in big and widespread scientific communities with high computing demands, like high energy physics. Computing resources are being distributed over many independent sites with only a thin layer of Grid middleware shared between them. This deployment model has proven to be very convenient for computing resource providers, but has introduced several problems for the users of the system, the three major ones being the complexity of job scheduling, the nonuniformity of compute resources, and the lack of good job monitoring. Pilot jobs address all of the above problems by creating a virtual private computing pool on top of Grid resources. This paper presents both the general pilot concept and a concrete implementation, called glideinWMS, deployed in the Open Science Grid.
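
    The sketch below is only a conceptual stand-in for the pilot idea described above: a pilot lands on a worker node, validates the local environment, and then pulls real user jobs from a central queue, so users see one uniform virtual pool. glideinWMS itself builds this pool with HTCondor glideins rather than anything like the toy queue, function names, and job commands used here.

      import queue
      import subprocess
      import threading

      # Toy central queue of user jobs; in a real pilot system this is a
      # batch-system pool, not an in-process queue.
      user_jobs = queue.Queue()
      for cmd in (["echo", "analysis-1"], ["echo", "analysis-2"], ["echo", "analysis-3"]):
          user_jobs.put(cmd)

      def environment_ok():
          # Placeholder validation (disk space, software stack, outbound network, ...)
          return True

      def pilot(site):
          """A pilot: lands on a worker node at `site`, validates the environment,
          then pulls and runs real user jobs until the queue drains."""
          if not environment_ok():
              return                          # unsuitable node: release the slot
          while True:
              try:
                  cmd = user_jobs.get_nowait()
              except queue.Empty:
                  break                       # no work left; pilot exits
              print(f"[{site}] starting user job: {' '.join(cmd)}")
              subprocess.run(cmd, check=False)
              user_jobs.task_done()

      # Two pilots on two (simulated) sites drain the same virtual job pool.
      threads = [threading.Thread(target=pilot, args=(s,)) for s in ("site_A", "site_B")]
      for t in threads: t.start()
      for t in threads: t.join()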

  14. Integrating Grid Services into the Cray XT4 Environment

    SciTech Connect

    NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy

    2009-05-01

    The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped down compute-node operating system without dynamic library support, a shared-root environment and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.

  15. Grist : grid-based data mining for astronomy

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S.; Miller, Craig D.; Walia, Harshpreet; Williams, Roy; Djorgovski, S. George; Graham, Matthew J.; Mahabal, Ashish; Babu, Jogesh; Berk, Daniel E. Vanden; Nichol, Robert

    2004-01-01

    The Grist project is developing a grid-technology based system as a research environment for astronomy with massive and complex datasets. This knowledge extraction system will consist of a library of distributed grid services controlled by a workflow system, compliant with standards emerging from the grid computing, web services, and virtual observatory communities. This new technology is being used to find high redshift quasars, study peculiar variable objects, search for transients in real time, and fit SDSS QSO spectra to measure black hole masses. Grist services are also a component of the 'hyperatlas' project to serve high-resolution multi-wavelength imagery over the Internet. In support of these science and outreach objectives, the Grist framework will provide the enabling fabric to tie together distributed grid services in the areas of data access, federation, mining, subsetting, source extraction, image mosaicking, statistics, and visualization.

  16. Thundercloud: Domain specific information security training for the smart grid

    NASA Astrophysics Data System (ADS)

    Stites, Joseph

    In this paper, we describe a cloud-based virtual smart grid test bed: ThunderCloud, which is intended to be used for domain-specific security training applicable to the smart grid environment. The test bed consists of virtual machines connected using a virtual internal network. ThunderCloud is remotely accessible, allowing students to undergo educational exercises online. We also describe a series of practical exercises that we have developed for providing the domain-specific training using ThunderCloud. The training exercises and attacks are designed to be realistic and to reflect known vulnerabilities and attacks reported in the smart grid environment. We were able to use ThunderCloud to offer practical domain-specific security training for smart grid environment to computer science students at little or no cost to the department and no risk to any real networks or systems.

  17. Science Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1985

    1985-01-01

    Presents 23 experiments, activities, field projects and computer programs in the biological and physical sciences. Instructional procedures, experimental designs, materials, and background information are suggested. Topics include fluid mechanics, electricity, crystals, arthropods, limpets, acid neutralization, and software evaluation. (ML)

  18. Grid crusher apparatus and method

    SciTech Connect

    McDaniels, J.D. Jr.

    1994-01-11

    A grid crusher apparatus and method are provided for a nuclear fuel rod consolidation system. Spacer grids are crushed within a basket which is then placed in a storage canister. The grid crusher apparatus has a ram assembly and a basket driving mechanism. The ram assembly has a sleeve ram and a central ram. The sleeve ram surrounds the central ram which is longitudinally movable within the sleeve ram. The central ram protrudes from the sleeve ram at a ram contact end and is retractable upon application of a preselected force to the central ram so that the central ram is flush with the sleeve ram at the ram contact end. The basket driving mechanism is configured to move the basket containing a spacer grid towards the ram contact end so that the spacer grid is crushed within the basket. The spacer grid is crushed by the combination of successive forces from the central ram and the sleeve ram, respectively. Essentially, the central portion of the spacer grid is crushed first, and then the remaining outer portion of the spacer grid is crushed to complete the crushing action of the spacer grid. The foregoing process is repeated for other spacer grids until the basket reaches a predetermined allowable capacity, and then the basket is stored in a storage canister. 11 figs.

  19. The International Symposium on Grids and Clouds

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 is the decennium anniversary of the ISGC which over the last decade has tracked the convergence, collaboration and innovation of individual researchers across the Asia Pacific region to a coherent community. With the continuous support and dedication from the delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. The emergence of large-scale research infrastructures and instruments that has produced a torrent of electronic data is forcing a generational change in the scientific process and the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure that connects the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics, Biomedicine & Life Sciences, Earth Science, Environmental Changes and Natural Disaster Mitigation, Humanities & Social Sciences, Operations & Management, Middleware & Interoperability, Security and Networking, Infrastructure Clouds & Virtualisation, Business Models & Sustainability, Data Management, Distributed Volunteer & Desktop Grid Computing, High Throughput Computing, and High Performance, Manycore & GPU Computing.

  20. AstroGrid: the UK's Virtual Observatory Initiative

    NASA Astrophysics Data System (ADS)

    Mann, Robert G.; Astrogrid Consortium; Lawrence, Andy; Davenhall, Clive; Mann, Bob; McMahon, Richard; Irwin, Mike; Walton, Nic; Rixon, Guy; Watson, Mike; Osborne, Julian; Page, Clive; Allan, Peter; Giaretta, David; Perry, Chris; Pike, Dave; Sherman, John; Murtagh, Fionn; Harra, Louise; Bentley, Bob; Mason, Keith; Garrington, Simon

    AstroGrid is the UK's Virtual Observatory (VO) initiative. It brings together the principal astronomical data centres in the UK, and has been funded to the tune of ~£5M over the next three years, via PPARC, as part of the UK e-science programme. Its twin goals are the provision of the infrastructure and tools for the federation and exploitation of large astronomical (X-ray to radio), solar and space plasma physics datasets, and the delivery of federations of current datasets for its user communities to exploit using those tools. Whilst AstroGrid's work will be centred on existing and future (e.g. VISTA) UK datasets, it will seek solutions to generic VO problems and will contribute to the developing international virtual observatory framework: AstroGrid is a member of the EU-funded Astrophysical Virtual Observatory project, has close links to a second EU Grid initiative, the European Grid of Solar Observations (EGSO), and will seek an active role in the development of the common standards on which the international virtual observatory will rely. In this paper we shall primarily describe the concrete plans for AstroGrid's one-year Phase A study, which will centre on: (i) the definition of detailed science requirements through community consultation; (ii) the undertaking of a "functionality market survey" to test the utility of existing technologies for the VO; and (iii) a pilot programme of database federations, each addressing different aspects of the general database federation problem. Further information can be found at the AstroGrid website.

  1. Evaluating the Information Power Grid using the NAS Grid Benchmarks

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Frumkin, Michael A.

    2004-01-01

    The NAS Grid Benchmarks (NGB) are a collection of synthetic distributed applications designed to rate the performance and functionality of computational grids. We compare several implementations of the NGB to determine the programmability and efficiency of NASA's Information Power Grid (IPG), whose services are mostly based on the Globus Toolkit. We report on the overheads involved in porting existing NGB reference implementations to the IPG. No changes were made to the component tasks of the NGB, and the efficiency of the ported implementations can still be improved.

  2. A laboratory course for teaching laboratory techniques, experimental design, statistical analysis, and peer review process to undergraduate science students.

    PubMed

    Gliddon, C M; Rosengren, R J

    2012-01-01

    This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. This course used a guided-inquiry-based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and written discussion of results. The laboratory work was a guided inquiry based around retinol's ability to potentiate acetaminophen-mediated hepatotoxicity. To induce critical thinking, students were given a choice as to which assay they could use to determine how retinol affected acetaminophen hepatotoxicity. Short summaries were handed in following each assay and formed the basis of the formative assessment. To complete the feedback loop, the graphs and concepts from the short summaries were combined into a manuscript as a summative assessment. To give the students exposure to science communication, the manuscript had to be written in accordance with the submission guidelines for Toxicological Sciences. The course was evaluated by a student questionnaire using a Likert scale, and students' responses were very favorable. While the subject matter was centered on toxicology, the content could easily be modified to suit another subject matter in biochemistry and molecular biology.

  3. The Volume Grid Manipulator (VGM): A Grid Reusability Tool

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    1997-01-01

    This document is a manual describing how to use the Volume Grid Manipulation (VGM) software. The code is specifically designed to alter or manipulate existing surface and volume structured grids to improve grid quality through the reduction of grid line skewness, removal of negative volumes, and adaption of surface and volume grids to flow field gradients. The software uses a command language to perform all manipulations, thereby offering the capability of executing multiple manipulations on a single grid during an execution of the code. The command language can be input to the VGM code by a UNIX-style redirected file, or interactively while the code is executing. The manual consists of 14 sections. The first is an introduction to grid manipulation, describing where it is most applicable and where the strengths of such software can be utilized. The next two sections describe the memory management and the manipulation command language. The following 8 sections describe simple and complex manipulations that can be used in conjunction with one another to smooth, adapt, and reuse existing grids for various computations. These are accompanied by a tutorial section that describes how to use the commands and manipulations to solve actual grid generation problems. The last two sections are a command reference guide and a troubleshooting section that aid in the use of the code and describe problems associated with generated scripts for manipulation control.

  4. Experience in grid optimization

    NASA Technical Reports Server (NTRS)

    Mastin, C. W.; Soni, B. K.; Mcclure, M. D.

    1987-01-01

    Two optimization methods for solving a variational problem in grid generation are described and evaluated. Variational integrals measuring smoothness, cell volume, and orthogonality are examined. The Jacobi-Newton iterative method is compared to the Fletcher-Reeves conjugate gradient method. It is observed that a combination of the Jacobi-Newton iteration and the direct solution of the variational problem produces an algorithm which is easy to program and requires less storage and computer time per iteration than the conjugate gradient method.

  5. TASMANIAN Sparse Grids Module

    SciTech Connect

    Stoyanov, Miroslav; Munster, Drayton

    2013-09-20

    Sparse Grids are the family of methods of choice for multidimensional integration and interpolation in low to moderate numbers of dimensions. The method extends a one-dimensional set of abscissas, weights and basis functions to multiple dimensions by taking a subset of all possible tensor products. The module provides the ability to create global and local approximations based on polynomials and wavelets. The software has three components: a library, a wrapper for the library that provides a command line interface via text files, and a MATLAB interface via the command line tool.
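
    As a rough illustration of the "subset of all possible tensor products" idea, the sketch below builds a minimal isotropic Smolyak point set with nested Clenshaw-Curtis abscissas and compares its size with the full tensor grid at the same resolution. It is not TASMANIAN's API or algorithm; the function names and the level/dimension choices are assumptions for illustration.

      import itertools
      import numpy as np

      def cc_points(level):
          """Nested Clenshaw-Curtis abscissas on [-1, 1] for a given 1D level."""
          if level == 0:
              return np.array([0.0])
          m = 2 ** level + 1
          return -np.cos(np.pi * np.arange(m) / (m - 1))

      def sparse_grid(dim, level):
          """Point set of an isotropic Smolyak sparse grid (Clenshaw-Curtis rule).

          Takes the union of tensor products of 1D point sets whose levels satisfy
          l_1 + ... + l_dim <= level, instead of one full tensor grid.
          """
          points = set()
          for levels in itertools.product(range(level + 1), repeat=dim):
              if sum(levels) <= level:
                  for combo in itertools.product(*(cc_points(l) for l in levels)):
                      points.add(tuple(round(c, 12) for c in combo))
          return np.array(sorted(points))

      grid = sparse_grid(dim=2, level=4)
      full = (2 ** 4 + 1) ** 2                  # full tensor grid at the same 1D resolution
      print(len(grid), "sparse points vs", full, "full-grid points")   # 65 vs 289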

  6. Adventures in Computational Grids

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Sometimes one supercomputer is not enough. Or your local supercomputers are busy, or not configured for your job. Or you don't have any supercomputers. You might be trying to simulate worldwide weather changes in real time, requiring more compute power than you could get from any one machine. Or you might be collecting microbiological samples on an island, and need to examine them with a special microscope located on the other side of the continent. These are the times when you need a computational grid.

  7. Petascale Flow Simulations Using Particles and Grids

    NASA Astrophysics Data System (ADS)

    Koumoutsakos, Petros

    2014-11-01

    How should one choose the discretization of flow models in order to harness the power of available computer architectures? Our group explores this question for particle (vortex methods, molecular and dissipative particle dynamics) and grid based (finite difference, finite volume) discretisations for flow simulations across scales. I will discuss methodologies to transition between these methods and their implementation in massively parallel computer architectures. I will present simulations ranging from flows of cells in microfluidic channels to cloud cavitation collapse at 14.5 PFLOP/s. This research was supported by the European Research Council, the Swiss National Science Foundation and the Swiss National Supercomputing Center.

  8. Experimental capabilities of 0.4 PW, 1 shot/min Scarlet laser facility for high energy density science.

    PubMed

    Poole, P L; Willis, C; Daskalova, R L; George, K M; Feister, S; Jiang, S; Snyder, J; Marketon, J; Schumacher, D W; Akli, K U; Van Woerkom, L; Freeman, R R; Chowdhury, E A

    2016-06-10

    We report on the recently completed 400 TW upgrade to the Scarlet laser at The Ohio State University. Scarlet is a Ti:sapphire-based ultrashort pulse system that delivers >10 J in 30 fs pulses to a 2 μm full width at half-maximum focal spot, resulting in intensities exceeding 5×10²¹ W/cm². The laser fires at a repetition rate of once per minute and is equipped with a suite of on-demand and on-shot diagnostics detailed here, allowing for rapid collection of experimental statistics. As part of the upgrade, the entire laser system has been redesigned to facilitate consistent, characterized high intensity data collection at high repetition rates. The design and functionality of the laser and target chambers are described along with initial data from commissioning experimental shots.

  9. Teachers' personal didactical models and obstacles to professional development: Case-studies with secondary experimental science teachers

    NASA Astrophysics Data System (ADS)

    Wamba Aguado, Ana Maria

    The aim of this thesis has been to elaborate criteria which characterise how teachers teach, as a curriculum component of their professional knowledge, and to infer the obstacles which hinder their desired professional development, in such a way that they can be considered in the design of proposals for teacher training in secondary education. In addition, a further objective was to elaborate and validate data analysis instruments. Case studies were carried out on three natural science secondary teachers with more than ten years' experience, enabling the characterisation of the teachers' conceptions of science and science teaching as well as the description of their classroom practice. Finally, with the help of these data together with the material used by the teachers, the teachers' personal didactical models and the obstacles to their professional development could be inferred. The data collection instruments included a questionnaire used to guide a semi-structured interview, video recordings of each teacher's classroom intervention corresponding to a teaching unit taught over a two-week period, and all the written material produced for the unit. For the data analysis, a taxonomy of classroom intervention patterns and a progression hypothesis towards desirable professional knowledge were elaborated, from the perspective of a research-in-the-classroom model and according to a system of categories and subcategories referring to conceptions of scientific knowledge, school knowledge, how to teach, and evaluation. From the interview and the questionnaire, a profile of stated conceptions was obtained. The intervention profile was obtained from the classroom recordings, according to the patterns identified and their sequencing, both of which determine the characteristic structures and routines of these teachers. An outcome of these results was the validation of the previously mentioned taxonomy as an instrument of

  10. Adaptive Grid Techniques for Elliptic Fluid-Flow Problems,

    DTIC Science & Technology

    1985-12-01

    Thompson, J. F. (1984), "Grid Generation Techniques in Computational Fluid Dynamics," AIAA Jnl., Vol. 22, No. 11, pp. 1505-1523. Thompson, J. F. (1983) ... Procedure," Ph.D. Thesis, Dept. of Computer Science, Stanford University, Calif. Tang, W. P., W. Skamarock, and J. Oliger (1985). To appear. Thompson

  11. GRID[subscript C] Renewable Energy Data Streaming into Classrooms

    ERIC Educational Resources Information Center

    DeLuca, V. William; Carpenter, Pam; Lari, Nasim

    2010-01-01

    For years, researchers have shown the value of using real-world data to enhance instruction in mathematics, science, and social studies. In an effort to help develop students' higher-order thinking skills in a data-rich learning environment, Green Research for Incorporating Data in the Classroom (GRID[subscript C]), a National Science…

  12. National Grid Deep Energy Retrofit Pilot Program—Clark Residence

    SciTech Connect

    2010-03-30

    In this case study, Building Science Corporation partnered with the local utility company, National Grid, on a deep energy retrofit pilot program for Massachusetts homes. This project involved the renovation of an 18th-century Cape-style building and achieved a super-insulated enclosure (R-35 walls, R-50+ roof, R-20+ foundation), extensive water management improvements, a high-efficiency water heater, and state-of-the-art ventilation.

  13. Reading to learn experimental practice: The role of text and firsthand experience in the acquisition of an abstract science principle

    NASA Astrophysics Data System (ADS)

    Richmond, Erica Kesin

    2008-10-01

    From the onset of schooling, texts are used as important educational tools. In the primary years, they are integral to learning how to decode and develop fluency. In the later elementary years, they are often essential to the acquisition of academic content. Unfortunately, many children experience difficulties with this process, which is due in large part to their unfamiliarity with the genre of academic texts. The articles presented in this dissertation share an underlying theme of how to develop children's ability to comprehend and learn from academic, and specifically, non-narrative texts. The first article reviews research on the development of non-narrative discourse to elucidate the linguistic precursors to non-narrative text comprehension. The second and third articles draw from an empirical study that investigated the best way to integrate text, manipulation, and first-hand experience for children's acquisition and application of an abstract scientific principle. The scientific principle introduced in the study was the Control of Variables Strategy (CVS), a fundamental idea underlying scientific reasoning and a strategy for designing unconfounded experiments. Eight grade 4 classes participated in the study (N = 129), in one of three conditions: (a) read procedural text and manipulate experimental materials, (b) listen to procedural text and manipulate experimental materials, or (c) read procedural text with no opportunity to manipulate experimental materials. Findings from the study indicate that children who had the opportunity to read and manipulate materials were most effective at applying the strategy to designing and justifying unconfounded experiments, and evaluating written and physical experimental designs; however, there was no effect of instructional condition on a written assessment of evaluating familiar and unfamiliar experimental designs one week after the intervention. These results suggest that the acquisition and application of an abstract

  14. Empower your Smart Grid Transformation

    DTIC Science & Technology

    2016-06-13

    Webinar: "Empower your Smart Grid Transformation," David White, SGMM Project Manager, Software Engineering Institute, Carnegie Mellon University, 10 March 2011. The presenter is a core development team member for the SEI Smart Grid Maturity Model (SGMM), a business tool to assist utilities with planning and tracking

  15. The contributions of computer-assisted experimentation (ExAO) to project-based pedagogy in college-level Natural Sciences

    NASA Astrophysics Data System (ADS)

    Marcotte, Alice

    The goals of this research were to conceptualize and to produce a test synthesis model for the Sciences program, in which the student had to demonstrate his or her competency using the approach Considering New Situations from Acquired Knowledge. The test took the form of a student-structured project utilizing the experimental process: the student's scientific investigation was supported and facilitated by computer-assisted experimentation (CAEx). The model of action was elaborated through developmental research within the school setting, tested in biology, and continued in an interdisciplinary context. Our study focused on the advantages and the constraints of this new learning environment, which modifies laboratories using traditional instrumentation. The final aim of the research was not to evaluate a type of test synthesis, but to propose and improve this model of test synthesis based on the experimental process and supported by CAEx. In order to implement the competency approach within an integration activity, we chose a cooperative learning environment contained within the pedagogical project. This didactic environment was inspired by socio-constructivism, which involves students in open scientific problem-solving. Computer-assisted experimentation turned out to be a valuable tool for this environment, facilitating the implementation of the scientific process through increased induction. The resistance of a reality that must be confronted and cannot be circumvented changes students' perception of scientific knowledge. They learn to integrate the building of this knowledge, and then to realize the extent of their learning and their training. Students' opinions, which were gathered from questionnaires, reveal that they perceive this type of environment favorably, in interaction with their peers and the experimentation. In addition to this new knowledge about CAEx within the pedagogical project, the products of this research included a teaching guide for the test synthesis, a booklet featuring the projects carried out

  16. Smart Grid Enabled EVSE

    SciTech Connect

    None, None

    2015-01-12

    The combined team of GE Global Research, Federal Express, National Renewable Energy Laboratory, and Consolidated Edison has successfully achieved the established goals contained within the Department of Energy’s Smart Grid Capable Electric Vehicle Supply Equipment funding opportunity. The final program product, shown charging two vehicles in Figure 1, reduces by nearly 50% the total installed system cost of the electric vehicle supply equipment (EVSE) as well as enabling a host of new Smart Grid enabled features. These include bi-directional communications, load control, utility message exchange and transaction management information. Using the new charging system, Utilities or energy service providers will now be able to monitor transportation related electrical loads on their distribution networks, send load control commands or preferences to individual systems, and then see measured responses. Installation owners will be able to authorize usage of the stations, monitor operations, and optimally control their electricity consumption. These features and cost reductions have been developed through a total system design solution.

  17. Colorado Electrical Transmission Grid

    DOE Data Explorer

    Zehner, Richard E.

    2012-02-01

    Citation Information: Originator: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder. Originator: Xcel Energy. Publication Date: 2012. Title: Colorado XcelEnergy NonXcel Transmission Network. Edition: First. Publication Information: Publication Place: Earth Science & Observation Center, Cooperative Institute for Research in Environmental Science (CIRES), University of Colorado, Boulder. Publisher: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder. Description: This layer contains the transmission network of Colorado. Spatial Domain: Extent: Top: 4540689.017558 m; Left: 160606.141934 m; Right: 758715.946645 m; Bottom: 4098910.893397 m. Contact Information: Contact Organization: Earth Science & Observation Center (ESOC), CIRES, University of Colorado at Boulder. Contact Person: Khalid Hussein. Address: CIRES, Ekeley Building, Earth Science & Observation Center (ESOC), 216 UCB. City: Boulder. State: CO. Postal Code: 80309-0216. Country: USA. Contact Telephone: 303-492-6782. Spatial Reference Information: Coordinate System: Universal Transverse Mercator (UTM), WGS 1984, Zone 13N. False Easting: 500000.00000000. False Northing: 0.00000000. Central Meridian: -105.00000000. Scale Factor: 0.99960000. Latitude of Origin: 0.00000000. Linear Unit: Meter. Datum: World Geodetic System 1984 (WGS 1984). Prime Meridian: Greenwich. Angular Unit: Degree. Digital Form: Format Name: Shapefile

  18. Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale

    NASA Astrophysics Data System (ADS)

    Barrios, M. I.

    2013-12-01

    Hydrological science requires the emergence of a consistent theoretical corpus driving the relationships between dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make the development of multiscale conceptualizations difficult. Therefore, understanding scaling is a key issue in advancing this science. This work is focused on the use of virtual experiments to address the scaling of vertical infiltration from a physically based model at point scale to a simplified physically meaningful modeling approach at grid-cell scale. Numerical simulations have the advantage of dealing with a wide range of boundary and initial conditions, in contrast to field experimentation. The aim of the work was to show the utility of numerical simulations in discovering relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium to teach the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at the point scale, and a conceptual storage model was employed to simulate the infiltration process at the grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at the point scale. The linkages between point-scale parameters and grid-cell-scale parameters were established by inverse simulations based on the mass balance equation and the averaging of the flow at the point scale. Results have shown numerical stability issues for particular conditions, have revealed the complex nature of the non-linear relationships between the models' parameters at both scales, and indicate that the parameterization of point-scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by the students to identify potential research questions on scale issues
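
    Since the point-scale model above is Green-Ampt, the sketch below shows the standard ponded Green-Ampt solution obtained by fixed-point iteration. It is only an illustrative stand-in: the parameter values are arbitrary, and the authors' grid-cell storage model and inverse procedure are not reproduced.

      import numpy as np

      def green_ampt_F(t, Ks, psi, dtheta, tol=1e-8, max_iter=100):
          """Cumulative Green-Ampt infiltration F(t) under continuous ponding.

          Solves F - psi*dtheta*ln(1 + F/(psi*dtheta)) = Ks*t by fixed-point
          iteration (Ks: saturated conductivity, psi: wetting-front suction head,
          dtheta: soil moisture deficit).  Units must be consistent (e.g. cm, h).
          """
          S = psi * dtheta
          F = Ks * t if Ks * t > 0 else 1e-9            # initial guess
          for _ in range(max_iter):
              F_new = Ks * t + S * np.log(1.0 + F / S)
              if abs(F_new - F) < tol:
                  break
              F = F_new
          return F

      # Point-scale example with illustrative soil parameters.
      Ks, psi, dtheta = 1.04, 8.89, 0.35                # cm/h, cm, (-)
      for t in (0.5, 1.0, 2.0):                         # hours
          F = green_ampt_F(t, Ks, psi, dtheta)
          rate = Ks * (1.0 + psi * dtheta / F)          # instantaneous infiltration rate
          print(f"t={t:.1f} h  F={F:.2f} cm  f={rate:.2f} cm/h")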

  19. From the grid to the smart grid, topologically

    NASA Astrophysics Data System (ADS)

    Pagani, Giuliano Andrea; Aiello, Marco

    2016-05-01

    In its more visionary acceptation, the smart grid is a model of energy management in which the users are engaged in producing energy as well as consuming it, while having information systems fully aware of the energy demand-response of the network and of dynamically varying prices. A natural question is then: to make the smart grid a reality will the distribution grid have to be upgraded? We assume a positive answer to the question and we consider the lower layers of medium and low voltage to be the most affected by the change. In our previous work, we analyzed samples of the Dutch distribution grid (Pagani and Aiello, 2011) and we considered possible evolutions of these using synthetic topologies modeled after studies of complex systems in other technological domains (Pagani and Aiello, 2014). In this paper, we take an extra important step by defining a methodology for evolving any existing physical power grid to a good smart grid model, thus laying the foundations for a decision support system for utilities and governmental organizations. In doing so, we consider several possible evolution strategies and apply them to the Dutch distribution grid. We show how increasing connectivity is beneficial in realizing more efficient and reliable networks. Our proposal is topological in nature, enhanced with economic considerations of the costs of such evolutions in terms of cabling expenses and economic benefits of evolving the grid.

  20. Density separation of solids in ferrofluids with magnetic grids

    SciTech Connect

    Fay, H.; Quets, J.M.

    1980-04-01

    Nonmagnetic solids in a superparamagnetic ferrofluid are subjected to body forces proportional to the intensity of magnetization of the fluid and the gradient of the magnetic field. An apparent density of the fluid can be defined from the force equations, and since the apparent density can be much larger than the true density, it is possible to levitate or float dense objects. Mixtures of solids with a density greater than the apparent density sink while lower density solids float. In practice it is difficult to create a uniform gradient over a large volume and single gap magnetic separators require very large magnets or have a limited throughput. To overcome that problem, multiple gap magnetic grids have been designed. Such grids consist of planar arrays of parallel bars of alternating polarity, driven by permanent magnets. When immersed in ferrofluid, magnetic grids create nonuniform field gradients and apparent densities in the fluid. However, both analysis and experimental measurements show that the grid acts as a barrier to particles below a critical density, while permitting more dense particles to fall through the grid. Thus, a magnetic grid filter can be used as a high throughput binary separator of solids according to their densities. Such filters can be cascaded for more complex separations. Several magnetic grid filters have been designed, built, and tested. Magnetic measurements qualitatively agree with the theoretical predictions. Experiments with synthetic mixtures have demonstrated that good binary separations can be made.
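
    For readers who want the "apparent density" made explicit, the standard magneto-Archimedes relation for a fluid magnetized along a vertical field whose magnitude increases downward can be written as below. This is a hedged, SI-notation summary; the paper's own force equations may differ in form and notation.

      \rho_{\mathrm{app}} \;=\; \rho_f \;+\; \frac{\mu_0 M_f}{g}\,\left|\frac{\partial H}{\partial z}\right|,
      \qquad
      \rho_p > \rho_{\mathrm{app}} \;\Rightarrow\; \text{particle sinks},
      \qquad
      \rho_p < \rho_{\mathrm{app}} \;\Rightarrow\; \text{particle floats}.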

  1. Topology and grid adaption for high-speed flow computations

    NASA Astrophysics Data System (ADS)

    Abolhassani, Jamshid S.; Tiwari, Surendra N.

    1989-03-01

    This study investigates the effects of grid topology and grid adaptation on numerical solutions of the Navier-Stokes equations. In the first part of this study, a general procedure is presented for computation of high-speed flow over complex three-dimensional configurations. The flow field is simulated on the surface of a Butler wing in a uniform stream. Results are presented for Mach number 3.5 and a Reynolds number of 2,000,000. The O-type and H-type grids have been used for this study, and the results are compared together and with other theoretical and experimental results. The results demonstrate that while the H-type grid is suitable for the leading and trailing edges, a more accurate solution can be obtained for the middle part of the wing with an O-type grid. In the second part of this study, methods of grid adaption are reviewed and a method is developed with the capability of adapting to several variables. This method is based on a variational approach and is an algebraic method. Also, the method has been formulated in such a way that there is no need for any matrix inversion. This method is used in conjunction with the calculation of hypersonic flow over a blunt-nose body. A movie has been produced which shows simultaneously the transient behavior of the solution and the grid adaption.

  2. Topology and grid adaption for high-speed flow computations

    NASA Technical Reports Server (NTRS)

    Abolhassani, Jamshid S.; Tiwari, Surendra N.

    1989-01-01

    This study investigates the effects of grid topology and grid adaptation on numerical solutions of the Navier-Stokes equations. In the first part of this study, a general procedure is presented for computation of high-speed flow over complex three-dimensional configurations. The flow field is simulated on the surface of a Butler wing in a uniform stream. Results are presented for Mach number 3.5 and a Reynolds number of 2,000,000. The O-type and H-type grids have been used for this study, and the results are compared together and with other theoretical and experimental results. The results demonstrate that while the H-type grid is suitable for the leading and trailing edges, a more accurate solution can be obtained for the middle part of the wing with an O-type grid. In the second part of this study, methods of grid adaption are reviewed and a method is developed with the capability of adapting to several variables. This method is based on a variational approach and is an algebraic method. Also, the method has been formulated in such a way that there is no need for any matrix inversion. This method is used in conjunction with the calculation of hypersonic flow over a blunt-nose body. A movie has been produced which shows simultaneously the transient behavior of the solution and the grid adaption.

  3. Research of the grid computing system applied in optical simulation

    NASA Astrophysics Data System (ADS)

    Jin, Wei-wei; Wang, Yu-dong; Liu, Qiangsheng; Cen, Zhao-feng; Li, Xiao-tong; Lin, Yi-qun

    2008-03-01

A grid computing system for the field of optics is presented in this paper. First, the basic principles and research background of grid computing are outlined, along with an overview of its applications and current state of development. The paper also discusses several typical task scheduling algorithms. Second, it focuses on a task scheduling system for grid computing applied to optical computation, giving details of the task partitioning, granularity selection, and task allocation, and especially the structure of the system. In addition, some details of communication in the grid are illustrated. In this system, the "makespan" and "load balancing" are comprehensively considered. Finally, we build a grid model to test the task scheduling strategy, and the results are analyzed in detail. Compared to one isolated computer, a grid composed of one server and four processors can shorten the "makespan" to one quarter. The simulation results also show that the proposed scheduling system balances the loads of all processors. In short, the system schedules well in the grid environment.
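
    The abstract does not spell out the scheduling algorithm, so the sketch below uses a generic makespan/load-balancing heuristic (longest task first onto the least-loaded processor) purely to illustrate why four processors can approach a four-fold reduction in makespan; the task costs and processor count are made-up assumptions.

```python
import heapq

def schedule(tasks, n_workers):
    """Greedy LPT heuristic: assign the longest remaining task to the
    currently least-loaded worker.  Returns per-worker loads; the makespan
    is the maximum load."""
    loads = [(0.0, w) for w in range(n_workers)]   # (load, worker id) min-heap
    heapq.heapify(loads)
    assignment = {w: [] for w in range(n_workers)}
    for t in sorted(tasks, reverse=True):
        load, w = heapq.heappop(loads)
        assignment[w].append(t)
        heapq.heappush(loads, (load + t, w))
    per_worker = {w: sum(ts) for w, ts in assignment.items()}
    return per_worker, max(per_worker.values())

if __name__ == "__main__":
    tasks = [3, 7, 2, 8, 5, 4, 6, 1, 5, 4]          # illustrative task costs
    serial_makespan = sum(tasks)
    _, parallel_makespan = schedule(tasks, n_workers=4)
    print("serial makespan  :", serial_makespan)
    print("parallel makespan:", parallel_makespan)
    print("speedup          :", serial_makespan / parallel_makespan)
```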

  4. NAS Grid Benchmarks: A Tool for Grid Space Exploration

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; VanderWijngaart, Rob F.; Biegel, Bryan (Technical Monitor)

    2001-01-01

We present an approach for benchmarking services provided by computational Grids. It is based on the NAS Parallel Benchmarks (NPB) and is called NAS Grid Benchmark (NGB) in this paper. We present NGB as a data flow graph encapsulating an instance of an NPB code in each graph node, which communicates with other nodes by sending/receiving initialization data. These nodes may be mapped to the same or different Grid machines. Like NPB, NGB will specify several different classes (problem sizes). NGB also specifies the generic Grid services sufficient for running the benchmark. The implementor has the freedom to choose any specific Grid environment. However, we describe a reference implementation in Java, and present some scenarios for using NGB.
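
    A minimal sketch of the data-flow-graph idea: each node wraps a task and forwards its output as the next node's initialization data, and nodes could equally be dispatched to different Grid machines. The toy tasks and graph below are placeholders, not actual NPB kernels or the NGB reference implementation.

```python
from collections import deque

def run_dataflow(nodes, edges):
    """Execute a DAG of tasks in topological order.  `nodes` maps a name to a
    callable taking a list of upstream results; `edges` lists (src, dst)."""
    succ = {n: [] for n in nodes}
    indeg = {n: 0 for n in nodes}
    for src, dst in edges:
        succ[src].append(dst)
        indeg[dst] += 1
    inputs = {n: [] for n in nodes}
    ready = deque(n for n, d in indeg.items() if d == 0)
    results = {}
    while ready:
        n = ready.popleft()
        results[n] = nodes[n](inputs[n])          # run the node's task
        for m in succ[n]:
            inputs[m].append(results[n])          # pass "initialization data"
            indeg[m] -= 1
            if indeg[m] == 0:
                ready.append(m)
    return results

if __name__ == "__main__":
    # Placeholder tasks standing in for benchmark kernels.
    nodes = {
        "generate": lambda ins: list(range(8)),
        "smooth":   lambda ins: [x * 0.5 for x in ins[0]],
        "solve":    lambda ins: sum(ins[0]),
    }
    edges = [("generate", "smooth"), ("smooth", "solve")]
    print(run_dataflow(nodes, edges)["solve"])
```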

  5. Are Clinical Trials With Mesenchymal Stem/Progenitor Cells too Far Ahead of the Science? Lessons From Experimental Hematology

    PubMed Central

    Prockop, Darwin J; Prockop, Susan E; Bertoncello, Ivan

    2014-01-01

    The cells referred to as mesenchymal stem/progenitor cells (MSCs) are currently being used to treat thousands of patients with diseases of essentially all the organs and tissues of the body. Strikingly positive results have been reported in some patients, but there have been few prospective controlled studies. Also, the reasons for the beneficial effects are frequently unclear. As a result there has been a heated debate as to whether the clinical trials with these new cell therapies are too far ahead of the science. The debate is not easily resolved, but important insights are provided by the 60-year history that was required to develop the first successful stem cell therapy, the transplantation of hematopoietic stem cells. The history indicates that development of a dramatically new therapy usually requires patience and a constant dialogue between basic scientists and physicians carrying out carefully designed clinical trials. It also suggests that the field can be moved forward by establishing better records of how MSCs are prepared, by establishing a large supply of reference MSCs that can be used to validate assays and compare MSCs prepared in different laboratories, and by continuing efforts to establish in vivo assays for the efficacy of MSCs. Stem Cells 2014;32:3055–3061 PMID:25100155

  6. Dynamic virtual AliEn Grid sites on Nimbus with CernVM

    NASA Astrophysics Data System (ADS)

    Harutyunyan, A.; Buncic, P.; Freeman, T.; Keahey, K.

    2010-04-01

We describe work on enabling one-click deployment of Grid sites of the AliEn Grid framework on the Nimbus 'science cloud' at the University of Chicago. The integration of the cloud's computing resources with the resource pool of the AliEn Grid is achieved by leveraging two mechanisms: the Nimbus Context Broker, developed at Argonne National Laboratory and the University of Chicago, and CernVM, a baseline virtual software appliance for LHC experiments developed at CERN. Two approaches to dynamic virtual AliEn Grid site deployment are presented.

  7. At the source of western science: the organization of experimentalism at the Accademia del Cimento (1657-1667).

    PubMed

    Beretta, M

    2000-05-01

    The Accademia del Cimento, founded by the Medici princes, Ferdinando II, Grand Duke of Tuscany, and his brother, Leopoldo, later Cardinal, had members and programmes of research very different from earlier academies in Italy. The Cimento foreshadowed later European academies and institutions specifically devoted to research and improvement of natural knowledge. It issued only one publication, the Saggi di naturali esperienze, and most of the observations and experimental results from its brief life remain unpublished. The Roman Accademia fisica-matematica, associated with Queen Christina of Sweden, continued to some extent its emphasis on experiment, while The Royal Society, with which it maintained links, placed even greater reliance on experiment and its validation through unvarnished publication. Comparisons between the Cimento and its contemporaries, The Royal Society and the French academy, illuminate the origin of scientific institutions in the early modern period.

  8. Experimental Investigation of Space Radiation Processing in Lunar Soil Ilmenite: Combining Perspectives from Surface Science and Transmission Electron Microscopy

    NASA Technical Reports Server (NTRS)

    Christoffersen, R.; Keller, L. P.; Rahman, Z.; Baragiola, R.

    2010-01-01

    Energetic ions mostly from the solar wind play a major role in lunar space weathering because they contribute structural and chemical changes to the space-exposed surfaces of lunar regolith grains. In mature mare soils, ilmenite (FeTiO3) grains in the finest size fraction have been shown in transmission electron microscope (TEM) studies to exhibit key differences in their response to space radiation processing relative to silicates [1,2,3]. In ilmenite, solar ion radiation alters host grain outer margins to produce 10-100 nm thick layers that are microstructurally complex, but dominantly crystalline compared to the amorphous radiation-processed rims on silicates [1,2,3]. Spatially well-resolved analytical TEM measurements also show nm-scale compositional and chemical state changes in these layers [1,3]. These include shifts in Fe/Ti ratio from strong surface Fe-enrichment (Fe/Ti >> 1), to Fe depletion (Fe/Ti < 1) at 40-50 nm below the grain surface [1,3]. These compositional changes are not observed in the radiation-processed rims on silicates [4]. Several mechanism(s) to explain the overall relations in the ilmenite grain rims by radiation processing and/or additional space weathering processes were proposed by [1], and remain under current consideration [3]. A key issue has concerned the ability of ion radiation processing alone to produce some of the deeper- penetrating compositional changes. In order to provide some experimental constraints on these questions, we have performed a combined X-ray photoelectron spectroscopy (XPS) and field-emission scanning transmission electron (FE-STEM) study of experimentally ion-irradiated ilmenite. A key feature of this work is the combination of analytical techniques sensitive to changes in the irradiated samples at depth scales going from the immediate surface (approx.5 nm; XPS), to deeper in the grain interior (5-100 nm; FE-STEM).

  9. The impact of the topology on cascading failures in a power grid model

    NASA Astrophysics Data System (ADS)

    Koç, Yakup; Warnier, Martijn; Mieghem, Piet Van; Kooij, Robert E.; Brazier, Frances M. T.

    2014-05-01

Cascading failures are one of the main reasons for large-scale blackouts in power transmission grids. A secure electrical power supply requires, together with careful operation, a robust design of the electrical power grid topology. Currently, the impact of the topology on grid robustness is mainly assessed by purely topological approaches that fail to capture the essence of electric power flow. This paper proposes a metric, the effective graph resistance, to relate the topology of a power grid to its robustness against cascading failures caused by deliberate attacks, while also taking fundamental characteristics of the electric power grid into account, such as power flow allocation according to Kirchhoff's laws. Experimental verification on synthetic power systems shows that the proposed metric reflects grid robustness accurately. The proposed metric is used to optimize a grid topology for a higher level of robustness. To demonstrate its applicability, the metric is applied to the IEEE 118-bus power system to improve its robustness against cascading failures.
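
    For reference, the effective graph resistance of a connected graph can be computed from the nonzero eigenvalues of its Laplacian as R_G = N * sum_i (1/mu_i); the small test graphs below are illustrative, not the synthetic power systems or the IEEE 118-bus system used in the paper.

```python
import numpy as np

def effective_graph_resistance(adjacency):
    """R_G = N * sum of 1/mu_i over the nonzero Laplacian eigenvalues
    (for a connected, undirected graph)."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A                 # graph Laplacian
    mu = np.linalg.eigvalsh(L)
    nonzero = mu[mu > 1e-9]
    if nonzero.size != A.shape[0] - 1:
        raise ValueError("graph must be connected")
    return A.shape[0] * np.sum(1.0 / nonzero)

if __name__ == "__main__":
    # Illustrative 4-node ring vs. ring plus one chord: the extra line
    # lowers the effective graph resistance, i.e. improves robustness.
    ring = [[0, 1, 0, 1],
            [1, 0, 1, 0],
            [0, 1, 0, 1],
            [1, 0, 1, 0]]
    ring_chord = [[0, 1, 1, 1],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [1, 0, 1, 0]]
    print("ring        :", effective_graph_resistance(ring))
    print("ring + chord:", effective_graph_resistance(ring_chord))
```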

  10. Grid Erosion Modeling of the NEXT Ion Thruster Optics

    NASA Technical Reports Server (NTRS)

    Ernhoff, Jerold W.; Boyd, Iain D.; Soulas, George (Technical Monitor)

    2003-01-01

    Results from several different computational studies of the NEXT ion thruster optics are presented. A study of the effect of beam voltage on accelerator grid aperture wall erosion shows a non-monotonic, complex behavior. Comparison to experimental performance data indicates improvements in simulation of the accelerator grid current, as well as very good agreement with other quantities. Also examined is the effect of ion optics choice on the thruster life, showing that TAG optics provide better margin against electron backstreaming than NSTAR optics. The model is used to predict the change in performance with increasing accelerator grid voltage, showing that although the current collected on the accel grid downstream face increases, the erosion rate decreases. A study is presented for varying doubly-ionized Xenon current fraction. The results show that performance data is not extremely sensitive to the current fraction.

  11. 15 MW HArdware-in-the-loop Grid Simulation Project

    SciTech Connect

    Rigas, Nikolaos; Fox, John Curtiss; Collins, Randy; Tuten, James; Salem, Thomas; McKinney, Mark; Hadidi, Ramtin; Gislason, Benjamin; Boessneck, Eric; Leonard, Jesse

    2014-10-31

    The 15MW Hardware-in-the-loop (HIL) Grid Simulator project was to (1) design, (2) construct and (3) commission a state-of-the-art grid integration testing facility for testing of multi-megawatt devices through a ‘shared facility’ model open to all innovators to promote the rapid introduction of new technology in the energy market to lower the cost of energy delivered. The 15 MW HIL Grid Simulator project now serves as the cornerstone of the Duke Energy Electric Grid Research, Innovation and Development (eGRID) Center. This project leveraged the 24 kV utility interconnection and electrical infrastructure of the US DOE EERE funded WTDTF project at the Clemson University Restoration Institute in North Charleston, SC. Additionally, the project has spurred interest from other technology sectors, including large PV inverter and energy storage testing and several leading edge research proposals dealing with smart grid technologies, grid modernization and grid cyber security. The key components of the project are the power amplifier units capable of providing up to 20MW of defined power to the research grid. The project has also developed a one of a kind solution to performing fault ride-through testing by combining a reactive divider network and a large power converter into a hybrid method. This unique hybrid method of performing fault ride-through analysis will allow for the research team at the eGRID Center to investigate the complex differences between the alternative methods of performing fault ride-through evaluations and will ultimately further the science behind this testing. With the final goal of being able to perform HIL experiments and demonstration projects, the eGRID team undertook a significant challenge with respect to developing a control system that is capable of communicating with several different pieces of equipment with different communication protocols in real-time. The eGRID team developed a custom fiber optical network that is based upon FPGA

  12. Using a Non-Equivalent Groups Quasi Experimental Design to Reduce Internal Validity Threats to Claims Made by Math and Science K-12 Teacher Recruitment Programs

    NASA Astrophysics Data System (ADS)

    Moin, Laura

    2009-10-01

The American Recovery and Reinvestment Act national policy established in 2009 calls for "meaningful data" that demonstrate educational improvements, including the recruitment of high-quality teachers. The scant data available and the low credibility of many K-12 math/science teacher recruitment program evaluations remain the major barriers for the identification of effective recruitment strategies. Our study presents a methodology to better evaluate the impact of recruitment programs on increasing participants' interest in teaching careers. The research capitalizes on the use of several control groups and presents a non-equivalent groups quasi-experimental evaluation design that produces program effect claims with higher internal validity than claims generated by current program evaluations. With this method, which compares responses to a teaching career interest question from undergraduates all along a continuum from just attending an information session to participating (or not) in the recruitment program, we were able to compare the effect of the program in increasing participants' interest in teaching careers versus the evolution of the same interest in the absence of the program. We were also able to make suggestions for program improvement and further research. While our findings may not apply to other K-12 math/science teacher recruitment programs, we believe that our evaluation methodology does and will contribute to conducting stronger program evaluations. In so doing, our evaluation procedure may inform recruitment program designers and policy makers.

  13. Experimental evidence shows no fractionation of strontium isotopes ((87)Sr/(86)Sr) among soil, plants, and herbivores: implications for tracking wildlife and forensic science.

    PubMed

    Flockhart, D T Tyler; Kyser, T Kurt; Chipley, Don; Miller, Nathan G; Norris, D Ryan

    2015-01-01

    Strontium isotopes ((87)Sr/(86)Sr) can be useful biological markers for a wide range of forensic science applications, including wildlife tracking. However, one of the main advantages of using (87)Sr/(86)Sr values, that there is no fractionation from geological bedrock sources through the food web, also happens to be a critical assumption that has never been tested experimentally. We test this assumption by measuring (87)Sr/(86)Sr values across three trophic levels in a controlled greenhouse experiment. Adult monarch butterflies were raised on obligate larval host milkweed plants that were, in turn, grown on seven different soil types collected across Canada. We found no significant differences between (87)Sr/(86)Sr values in leachable Sr from soil minerals, organic soil, milkweed leaves, and monarch butterfly wings. Our results suggest that strontium isoscapes developed from (87)Sr/(86)Sr values in bedrock or soil may serve as a reliable biological marker in forensic science for a range of taxa and across large geographic areas.

  14. Grid generation using classical techniques

    NASA Technical Reports Server (NTRS)

    Moretti, G.

    1980-01-01

    A brief historical review of conformal mapping and its applications to problems in fluid mechanics and electromagnetism is presented. The use of conformal mapping as a grid generator is described. The philosophy of the 'closed form' approach and its application to a Neumann problem is discussed. Karman-Trefftz mappings and grids for ablated, three dimensional bodies are also discussed.
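
    As a minimal illustration of conformal mapping as a grid generator, the sketch below maps a polar grid around an offset circle through the Joukowski transform (the lambda = 2 member of the Karman-Trefftz family); the circle offset and grid dimensions are illustrative assumptions.

```python
import numpy as np

def joukowski_grid(n_radial=20, n_circum=73, center=-0.1 + 0.1j):
    """Generate a body-fitted O-type grid by conformally mapping concentric
    circles around an offset circle (zeta-plane) with z = zeta + 1/zeta."""
    radius = abs(1.0 - center)                    # circle passes through zeta = 1
    r = radius * np.linspace(1.0, 4.0, n_radial)[:, None]     # radial lines
    theta = np.linspace(0.0, 2.0 * np.pi, n_circum)[None, :]  # circumferential lines
    zeta = center + r * np.exp(1j * theta)        # circles in the zeta-plane
    z = zeta + 1.0 / zeta                         # Joukowski map to the physical plane
    return z.real, z.imag

if __name__ == "__main__":
    x, y = joukowski_grid()
    # The innermost ring maps to an airfoil-like contour; report its extent.
    print("grid shape:", x.shape)
    print("inner contour x-range:", x[0].min(), x[0].max())
```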

  15. Some Observations on Grid Convergence

    NASA Technical Reports Server (NTRS)

Salas, Manuel D.

    2006-01-01

    It is claimed that current practices in grid convergence studies, particularly in the field of external aerodynamics, are flawed. The necessary conditions to properly establish grid convergence are presented. A theoretical model and a numerical example are used to demonstrate these ideas.
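
    For reference, one accepted way to establish grid convergence is to estimate the observed order of accuracy from solutions on three systematically refined grids and then Richardson-extrapolate; the sample values below are illustrative, not taken from the paper.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from three solutions on grids with a
    constant refinement ratio r:
        p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r)
    """
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Estimate of the grid-converged value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r**p - 1.0)

if __name__ == "__main__":
    # Illustrative force-coefficient-like values on coarse, medium, fine grids.
    f3, f2, f1 = 0.02820, 0.02705, 0.02676   # coarse, medium, fine
    r = 2.0                                  # refinement ratio
    p = observed_order(f3, f2, f1, r)
    print("observed order p   ~", round(p, 2))
    print("extrapolated value ~", round(richardson_extrapolate(f2, f1, r, p), 5))
```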

  16. Intelligent automated surface grid generation

    NASA Technical Reports Server (NTRS)

    Yao, Ke-Thia; Gelsey, Andrew

    1995-01-01

The goal of our research is to produce a flexible, general grid generator for automated use by other programs, such as numerical optimizers. The current trend in the gridding field is toward interactive gridding. Interactive gridding more readily taps into the spatial reasoning abilities of the human user through the use of a graphical interface with a mouse. However, a sometimes fruitful approach to generating new designs is to apply an optimizer with shape modification operators to improve an initial design. In order for this approach to be useful, the optimizer must be able to automatically grid and evaluate the candidate designs. This paper describes an intelligent gridder that is capable of analyzing the topology of the spatial domain and predicting approximate physical behaviors based on the geometry of the spatial domain to automatically generate grids for computational fluid dynamics simulators. Typically, gridding programs are given a partitioning of the spatial domain to assist the gridder. Our gridder is capable of performing this partitioning. This enables the gridder to automatically grid spatial domains with a wide range of configurations.

  17. In silico drug discovery approaches on grid computing infrastructures.

    PubMed

    Wolf, Antje; Shahid, Mohammad; Kasam, Vinod; Ziegler, Wolfgang; Hofmann-Apitius, Martin

    2010-02-01

The first step in finding a "drug" is screening chemical compound databases against a protein target. In silico approaches such as virtual screening by molecular docking are well established in modern drug discovery. As molecular databases of compounds and target structures become larger and more computational screening approaches become available, there is an increased need for compute power and more complex workflows. In this regard, computational Grids are well suited, offering seamless compute and storage capacity. In recent projects related to pharmaceutical research, the high computational and data storage demands of large-scale in silico drug discovery approaches have been addressed by using Grid computing infrastructures, in both the pharmaceutical industry and academic research. Grid infrastructures are part of the so-called eScience paradigm, in which a digital infrastructure supports collaborative processes by providing relevant resources and tools for data- and compute-intensive applications. Substantial computing resources, large data collections, and services for data analysis are shared on the Grid infrastructure and can be mobilized on demand. This review gives an overview of the use of Grid computing for in silico drug discovery and tries to provide a vision of the future development of more complex and integrated workflows on Grids, spanning from target identification and validation via protein-structure and ligand-dependent screenings to advanced mining of large-scale in silico experiments.

  18. The old faith and the new science: the Nuremberg Code and human experimentation ethics in Britain, 1946-73.

    PubMed

    Hazelgrove, Jenny

    2002-04-01

This article explores the impact of the Nuremberg Code on post-Second World War research ethics in Britain. Against the background of the Nuremberg Medical Trial, the Code received international endorsement, but how much did its ethical precepts influence actual research? This paper shows that, despite British involvement in the formulation of the Code, the experience of wartime and changing career structures were more influential in shaping the approach of investigators to their subjects. Where medical debates ensued, primarily over controversial research practices at the British Postgraduate Medical School, Hammersmith Hospital, they were set in the context of a much older division between 'bedside' and 'scientific' medicine. But whatever differences there may have been between those physicians who advocated research and those who questioned its use and ethical basis, most clung to the paternalist assumption that it was the doctor's place to decide what was best for his patients. Faced with rising public and medical criticism of contemporary research practices, the medical élite of the 1960s and 1970s safeguarded the reputation of the profession and medical control of research by negotiating new voluntary codes. In a similar move, their predecessors had helped to negotiate the Nuremberg Code in anticipation of public criticism of experimentation arising from the Nuremberg Medical Trial.

  19. Framework for Interactive Parallel Dataset Analysis on the Grid

    SciTech Connect

    Alexander, David A.; Ananthan, Balamurali; Johnson, Tony; Serbo, Victor; /SLAC

    2007-01-10

We present a framework for use at a typical Grid site to facilitate custom interactive parallel dataset analysis targeting terabyte-scale datasets of the type typically produced by large multi-institutional science experiments. We summarize the needs for interactive analysis and show a prototype solution that satisfies those needs. The solution consists of a desktop client tool and a set of Web Services that allow scientists to sign onto a Grid site, compose analysis script code to carry out physics analysis on datasets, distribute the code and datasets to worker nodes, collect the results back to the client, and construct professional-quality visualizations of the results.
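
    The framework described above follows a scatter/gather pattern: ship the analysis code and dataset partitions to workers, then merge the partial results at the client. The sketch below shows that pattern in its simplest local form using Python's concurrent.futures as a stand-in for the Grid services; the placeholder analysis and dataset are assumptions, not the SLAC implementation.

```python
from concurrent.futures import ProcessPoolExecutor

def analyze_partition(partition):
    """Placeholder 'analysis script' run on one dataset partition: here it
    just computes a count and a sum, standing in for a physics analysis."""
    return len(partition), sum(partition)

def scatter_gather(dataset, n_workers=4):
    """Split the dataset, run the analysis on each partition in parallel,
    and merge the partial results at the client."""
    chunk = (len(dataset) + n_workers - 1) // n_workers
    partitions = [dataset[i:i + chunk] for i in range(0, len(dataset), chunk)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(analyze_partition, partitions))
    total_n = sum(n for n, _ in partials)
    total_sum = sum(s for _, s in partials)
    return total_n, total_sum / total_n

if __name__ == "__main__":
    data = [float(i % 97) for i in range(100_000)]   # illustrative "dataset"
    n, mean = scatter_gather(data)
    print(f"analyzed {n} events, mean value {mean:.3f}")
```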

  20. Tools and Techniques for Measuring and Improving Grid Performance

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Frumkin, M.; Smith, W.; VanderWijngaart, R.; Wong, P.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on NASA's geographically dispersed computing resources, and the various methods by which the disparate technologies are integrated within a nationwide computational grid. Many large-scale science and engineering projects are accomplished through the interaction of people, heterogeneous computing resources, information systems and instruments at different locations. The overall goal is to facilitate the routine interactions of these resources to reduce the time spent in design cycles, particularly for NASA's mission critical projects. The IPG (Information Power Grid) seeks to implement NASA's diverse computing resources in a fashion similar to the way in which electric power is made available.

  1. Sensing and Measurement Architecture for Grid Modernization

    SciTech Connect

    Taft, Jeffrey D.; De Martini, Paul

    2016-02-01

    This paper addresses architecture for grid sensor networks, with primary emphasis on distribution grids. It describes a forward-looking view of sensor network architecture for advanced distribution grids, and discusses key regulatory, financial, and planning issues.

  2. Single grid accelerator for an ion thrustor

    NASA Technical Reports Server (NTRS)

    Margosian, P. M.; Nakanishi, S. (Inventor)

    1973-01-01

A single-grid accelerator system for an ion thrustor is discussed. A layer of dielectric material is interposed between the metal grid and the chamber containing an ionized propellant to protect the grid against sputtering erosion.

  3. Grid Integration Studies: Data Requirements, Greening the Grid

    SciTech Connect

    Katz, Jessica

    2015-06-01

    A grid integration study is an analytical framework used to evaluate a power system with high penetration levels of variable renewable energy (VRE). A grid integration study simulates the operation of the power system under different VRE scenarios, identifying reliability constraints and evaluating the cost of actions to alleviate those constraints. These VRE scenarios establish where, how much, and over what timeframe to build generation and transmission capacity, ideally capturing the spatial diversity benefits of wind and solar resources. The results help build confidence among policymakers, system operators, and investors to move forward with plans to increase the amount of VRE on the grid.

  4. Decentralized control experiments on NASA's flexible grid

    NASA Technical Reports Server (NTRS)

    Ozguner, U.; Yurkowich, S.; Martin, J., III; Al-Abbass, F.

    1986-01-01

Methods arising from the area of decentralized control are emerging for analysis and control synthesis for large flexible structures. In this paper, the control strategy involves a decentralized model reference adaptive approach using a variable structure control. Local models are formulated based on desired damping and response time in a model-following scheme for various modal configurations. Variable structure controllers are then designed employing co-located angular rate and position feedback. In this scheme, local control forces the system to move on a local sliding mode in some local error space. An important feature of this approach is that the local subsystem is made insensitive to dynamical interactions with other subsystems once the sliding surface is reached. Experiments based on the above have been performed for NASA's flexible grid experimental apparatus. The grid is designed to admit appreciable low-frequency structural dynamics, and allows for implementation of distributed computing components, inertial sensors, and actuation devices. A finite-element analysis of the grid provides the model for control system design and simulation; results of several simulations are reported here, and a discussion of application experiments on the apparatus is presented.
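
    To make the variable-structure idea concrete, the sketch below simulates a single co-located second-order mode driven onto a sliding surface s = de/dt + lambda*e by a switching control law; the modal parameters, gains, and zero reference are illustrative assumptions, not values from the flexible grid apparatus.

```python
import numpy as np

def simulate_sliding_mode(lam=2.0, k=8.0, dt=1e-3, t_end=5.0):
    """Simulate x'' + 0.02 x' + x = u (one lightly damped structural mode)
    tracking a zero reference via variable-structure control:
        e = x - x_ref,  s = e' + lam * e,  u = -k * sign(s)
    Once the sliding surface s = 0 is reached, the error decays roughly like
    exp(-lam * t), insensitive to interactions ignored in the local model."""
    n = int(t_end / dt)
    x, v = 1.0, 0.0                      # initial modal displacement/velocity
    history = np.empty(n)
    for i in range(n):
        e, edot = x, v                   # x_ref = 0
        s = edot + lam * e
        u = -k * np.sign(s)              # switching (variable-structure) control
        a = u - 0.02 * v - x             # modal acceleration
        v += a * dt
        x += v * dt
        history[i] = x
    return history

if __name__ == "__main__":
    x = simulate_sliding_mode()
    print("initial |x| = 1.000, |x| after 5 s =", round(abs(x[-1]), 4))
```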

  5. Hybrid Scheduling Model for Independent Grid Tasks

    PubMed Central

    Shanthini, J.; Kalaikumaran, T.; Karthik, S.

    2015-01-01

Grid computing facilitates resource sharing across geographically distributed administrative domains. Scheduling in a distributed heterogeneous environment is intrinsically hard because of the heterogeneous nature of the resource collection. Makespan and tardiness are two different scheduling measures, and much previous research has concentrated on reducing makespan, which reflects machine utilization. In this paper, we propose a hybrid scheduling algorithm for scheduling independent grid tasks with the objective of reducing their total weighted tardiness. Tardiness measures due-date performance, which has a direct impact on the cost of executing the jobs. We propose the BG_ATC algorithm, a combination of best gap (BG) search and the Apparent Tardiness Cost (ATC) indexing algorithm, and we implement these two algorithms in two different phases of the scheduling process. In addition, the results were compared with various benchmark algorithms, and the experimental results show that our algorithm outperforms the benchmarks. PMID:26543897
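
    For context, the Apparent Tardiness Cost rule referenced above ranks waiting jobs by a priority index, commonly written as I_j(t) = (w_j / p_j) * exp(-max(d_j - p_j - t, 0) / (k * p_bar)). The sketch below applies that index greedily on a single machine; the job data and look-ahead parameter k are illustrative assumptions, and the best-gap (BG) search phase of BG_ATC is not reproduced.

```python
import math

def atc_index(job, t, k, p_bar):
    """Apparent Tardiness Cost priority index of a job at time t.
    job = (weight w, processing time p, due date d)."""
    w, p, d = job
    slack = max(d - p - t, 0.0)
    return (w / p) * math.exp(-slack / (k * p_bar))

def atc_schedule(jobs, k=2.0):
    """Greedy dispatch: at each decision point start the job with the highest
    ATC index.  Returns the sequence and the total weighted tardiness."""
    p_bar = sum(p for _, p, _ in jobs) / len(jobs)
    remaining, t, sequence, twt = list(jobs), 0.0, [], 0.0
    while remaining:
        best = max(remaining, key=lambda j: atc_index(j, t, k, p_bar))
        remaining.remove(best)
        w, p, d = best
        t += p
        twt += w * max(t - d, 0.0)
        sequence.append(best)
    return sequence, twt

if __name__ == "__main__":
    # (weight, processing time, due date) -- illustrative jobs only.
    jobs = [(1.0, 4.0, 10.0), (3.0, 2.0, 6.0), (2.0, 6.0, 9.0), (1.0, 3.0, 20.0)]
    seq, twt = atc_schedule(jobs)
    print("dispatch order          :", seq)
    print("total weighted tardiness:", twt)
```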

  6. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008 to 2013

    SciTech Connect

    Ortega, John; Turnipseed, A.; Guenther, Alex B.; Karl, Thomas G.; Day, D. A.; Gochis, David; Huffman, J. A.; Prenni, Anthony J.; Levin, E. J.; Kreidenweis, Sonia M.; DeMott, Paul J.; Tobo, Y.; Patton, E. G.; Hodzic, Alma; Cui, Y. Y.; Harley, P.; Hornbrook, R. S.; Apel, E. C.; Monson, Russell K.; Eller, A. S.; Greenberg, J. P.; Barth, Mary; Campuzano-Jost, Pedro; Palm, B. B.; Jiminez, J. L.; Aiken, A. C.; Dubey, Manvendra K.; Geron, Chris; Offenberg, J.; Ryan, M. G.; Fornwalt, Paula J.; Pryor, S. C.; Keutsch, Frank N.; DiGangi, J. P.; Chan, A. W.; Goldstein, Allen H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, Chris; Mauldin, R. L.; Smith, James N.

    2014-01-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and interrelationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include – soil property measurements; – hydrological studies; – measurements of high-frequency turbulence parameters; – eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy; – determination of biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry; – aerosol number and mass distributions; – chemical speciation of aerosol particles; – characterization of ice and cloud condensation nuclei; – trace gas measurements; and – model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurements, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these studies are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  7. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008-2013

    NASA Astrophysics Data System (ADS)

    Ortega, J.; Turnipseed, A.; Guenther, A. B.; Karl, T. G.; Day, D. A.; Gochis, D.; Huffman, J. A.; Prenni, A. J.; Levin, E. J. T.; Kreidenweis, S. M.; DeMott, P. J.; Tobo, Y.; Patton, E. G.; Hodzic, A.; Cui, Y.; Harley, P. C.; Hornbrook, R. H.; Apel, E. C.; Monson, R. K.; Eller, A. S. D.; Greenberg, J. P.; Barth, M.; Campuzano-Jost, P.; Palm, B. B.; Jimenez, J. L.; Aiken, A. C.; Dubey, M. K.; Geron, C.; Offenberg, J.; Ryan, M. G.; Fornwalt, P. J.; Pryor, S. C.; Keutsch, F. N.; DiGangi, J. P.; Chan, A. W. H.; Goldstein, A. H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, C. A.; Mauldin, R. L., III; Smith, J. N.

    2014-01-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and inter-relationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air, but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include: - soil property measurements, - hydrological studies, - measurements of high-frequency turbulence parameters, - eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy, - biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry, - aerosol number and mass distributions, - chemical speciation of aerosol particles, - characterization of ice and cloud condensation nuclei, - trace gas measurements, and - model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurement, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  8. Overview of the Manitou Experimental Forest Observatory: site description and selected science results from 2008 to 2013

    NASA Astrophysics Data System (ADS)

    Ortega, J.; Turnipseed, A.; Guenther, A. B.; Karl, T. G.; Day, D. A.; Gochis, D.; Huffman, J. A.; Prenni, A. J.; Levin, E. J. T.; Kreidenweis, S. M.; DeMott, P. J.; Tobo, Y.; Patton, E. G.; Hodzic, A.; Cui, Y. Y.; Harley, P. C.; Hornbrook, R. S.; Apel, E. C.; Monson, R. K.; Eller, A. S. D.; Greenberg, J. P.; Barth, M. C.; Campuzano-Jost, P.; Palm, B. B.; Jimenez, J. L.; Aiken, A. C.; Dubey, M. K.; Geron, C.; Offenberg, J.; Ryan, M. G.; Fornwalt, P. J.; Pryor, S. C.; Keutsch, F. N.; DiGangi, J. P.; Chan, A. W. H.; Goldstein, A. H.; Wolfe, G. M.; Kim, S.; Kaser, L.; Schnitzhofer, R.; Hansel, A.; Cantrell, C. A.; Mauldin, R. L.; Smith, J. N.

    2014-06-01

    The Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics & Nitrogen (BEACHON) project seeks to understand the feedbacks and inter-relationships between hydrology, biogenic emissions, carbon assimilation, aerosol properties, clouds and associated feedbacks within water-limited ecosystems. The Manitou Experimental Forest Observatory (MEFO) was established in 2008 by the National Center for Atmospheric Research to address many of the BEACHON research objectives, and it now provides a fixed field site with significant infrastructure. MEFO is a mountainous, semi-arid ponderosa pine-dominated forest site that is normally dominated by clean continental air but is periodically influenced by anthropogenic sources from Colorado Front Range cities. This article summarizes the past and ongoing research activities at the site, and highlights some of the significant findings that have resulted from these measurements. These activities include - soil property measurements; - hydrological studies; - measurements of high-frequency turbulence parameters; - eddy covariance flux measurements of water, energy, aerosols and carbon dioxide through the canopy; - determination of biogenic and anthropogenic volatile organic compound emissions and their influence on regional atmospheric chemistry; - aerosol number and mass distributions; - chemical speciation of aerosol particles; - characterization of ice and cloud condensation nuclei; - trace gas measurements; and - model simulations using coupled chemistry and meteorology. In addition to various long-term continuous measurements, three focused measurement campaigns with state-of-the-art instrumentation have taken place since the site was established, and two of these studies are the subjects of this special issue: BEACHON-ROCS (Rocky Mountain Organic Carbon Study, 2010) and BEACHON-RoMBAS (Rocky Mountain Biogenic Aerosol Study, 2011).

  9. Aeroacoustic Simulation of Nose Landing Gear on Adaptive Unstructured Grids With FUN3D

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Khorrami, Mehdi R.; Park, Michael A.; Lockhard, David P.

    2013-01-01

Numerical simulations have been performed for a partially-dressed, cavity-closed nose landing gear configuration that was tested in NASA Langley's closed-wall Basic Aerodynamic Research Tunnel (BART) and in the University of Florida's open-jet acoustic facility known as the UFAFF. The unstructured-grid flow solver FUN3D, developed at NASA Langley Research Center, is used to compute the unsteady flow field for this configuration. Starting with a coarse grid, a series of successively finer grids were generated using the adaptive gridding methodology available in the FUN3D code. A hybrid Reynolds-averaged Navier-Stokes/large eddy simulation (RANS/LES) turbulence model is used for these computations. Time-averaged and instantaneous solutions obtained on these grids are compared with the measured data. In general, the correlation with the experimental data improves with grid refinement. A similar trend is observed for sound pressure levels obtained by using these CFD solutions as input to a Ffowcs Williams-Hawkings noise propagation code to compute the farfield noise levels. In general, the numerical solutions obtained on adapted grids compare well with the hand-tuned enriched fine grid solutions and experimental data. In addition, the grid adaption strategy discussed here simplifies the grid generation process, and results in improved computational efficiency of CFD simulations.

  10. National Smart Water Grid

    SciTech Connect

    Beaulieu, R A

    2009-07-13

The United States repeatedly experiences floods along the Midwest's large rivers and droughts in the arid Western States that cause traumatic environmental conditions with huge economic impact. With an integrated approach and solution, these problems can be alleviated. Tapping into the Mississippi River and its tributaries, the world's third largest fresh water river system, during flood events will mitigate the damage of flooding and provide a new source of fresh water to the Western States. The trend of increased flooding on the Midwest's large rivers is supported by a growing body of scientific literature. The Colorado River Basin and the western states are experiencing a protracted multi-year drought. Fresh water can be pumped via pipelines from areas of overabundance/flood to areas of drought or high demand. Calculations document that 10 to 60 million acre-feet (maf) of fresh water per flood event can be captured from the Midwest's rivers and pumped via pipelines to the Colorado River and introduced upstream of Lake Powell, Utah, to destinations near Denver, Colorado, and used in areas along the pipelines. Water users of the Colorado River include the cities in southern Nevada, southern California, northern Arizona, Colorado, Utah, Indian Tribes, and Mexico. The proposed start and end points, and routes of the pipelines are documented, including information on rights-of-way necessary for state and federal permits. A National Smart Water Grid™ (NSWG) Project will create thousands of new jobs for construction, operation, and maintenance and save billions in drought and flood damage reparations tax dollars. The socio-economic benefits of NSWG include decreased flooding in the Midwest; increased agriculture, and recreation and tourism; improved national security, transportation, and fishery and wildlife habitats; mitigated regional climate change and global warming such as increased carbon capture; decreased salinity in Colorado River water crossing the US

  11. Random grid fern for visual tracking

    NASA Astrophysics Data System (ADS)

    Cheng, Fei; Liu, Kai; Zhang, Jin; Li, YunSong

    2014-05-01

Visual tracking is one of the significant research directions in computer vision. Although the standard random-ferns tracking method obtains good performance through the random spatial arrangement of its binary tests, the effect of image locality on the descriptive ability of the ferns is ignored, which prevents them from describing the object more accurately and robustly. This paper proposes a novel spatial arrangement of binary tests that divides the bounding box into grids in order to keep more details of the image for visual tracking. Experimental results show that this method improves tracking accuracy effectively.
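
    A minimal sketch of the grid-constrained idea: the bounding box is divided into a regular grid and each binary test compares two pixels drawn from the same cell, so local image structure is preserved in the fern code; the grid size, number of tests, and random patch are assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def make_grid_tests(box_w, box_h, grid=4, n_tests=8, rng=None):
    """Pick pixel pairs for binary tests, each pair confined to one grid cell
    of the bounding box so the descriptor keeps local detail."""
    rng = np.random.default_rng(rng)
    cw, ch = box_w // grid, box_h // grid
    tests = []
    for _ in range(n_tests):
        gx, gy = rng.integers(grid), rng.integers(grid)
        x0, y0 = gx * cw, gy * ch
        p = (x0 + rng.integers(cw), y0 + rng.integers(ch))
        q = (x0 + rng.integers(cw), y0 + rng.integers(ch))
        tests.append((p, q))
    return tests

def fern_code(patch, tests):
    """Binary fern code of an image patch: one bit per intensity comparison."""
    code = 0
    for (px, py), (qx, qy) in tests:
        code = (code << 1) | int(patch[py, px] > patch[qy, qx])
    return code

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patch = rng.integers(0, 256, size=(64, 64))      # illustrative image patch
    tests = make_grid_tests(64, 64, grid=4, n_tests=8, rng=0)
    print("fern code:", format(fern_code(patch, tests), "08b"))
```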

  12. High energy collimating fine grids

    NASA Astrophysics Data System (ADS)

    Arrieta, Victor M.; Tuffias, Robert H.; Laferla, Raffaele

    1995-02-01

The objective of this project was to demonstrate the fabrication of extremely tight tolerance collimating grids using a high-Z material, specifically tungsten. The approach taken was to fabricate grids by a replication method involving the coating of a silicon grid substrate with tungsten by chemical vapor deposition (CVD). A negative of the desired grid structure was fabricated in silicon using wafering techniques developed for the semiconductor industry and capable of producing the required tolerances. Using diamond wafering blades, a network of accurately spaced slots was machined into a single-crystal silicon surface. These slots were then filled with tungsten by CVD, via the hydrogen reduction of tungsten hexafluoride. Following tungsten deposition, the silicon negative was etched away to leave the tungsten collimating grid structure. The project was divided into five tasks: (1) identify materials of construction for the replica and final collimating grid structures; (2) identify and implement a micromachining technique for manufacturing the negative collimator replicas (performed by NASA/JPL); (3) develop a CVD technique and processing parameters suitable for the complete tungsten densification of the collimator replicas; (4) develop a chemical etching technique for the removal of the collimator replicas after the tungsten deposition process; and (5) fabricate and deliver tungsten collimating grid specimens.

  13. High energy collimating fine grids

    NASA Technical Reports Server (NTRS)

    Arrieta, Victor M.; Tuffias, Robert H.; Laferla, Raffaele

    1995-01-01

The objective of this project was to demonstrate the fabrication of extremely tight tolerance collimating grids using a high-Z material, specifically tungsten. The approach taken was to fabricate grids by a replication method involving the coating of a silicon grid substrate with tungsten by chemical vapor deposition (CVD). A negative of the desired grid structure was fabricated in silicon using wafering techniques developed for the semiconductor industry and capable of producing the required tolerances. Using diamond wafering blades, a network of accurately spaced slots was machined into a single-crystal silicon surface. These slots were then filled with tungsten by CVD, via the hydrogen reduction of tungsten hexafluoride. Following tungsten deposition, the silicon negative was etched away to leave the tungsten collimating grid structure. The project was divided into five tasks: (1) identify materials of construction for the replica and final collimating grid structures; (2) identify and implement a micromachining technique for manufacturing the negative collimator replicas (performed by NASA/JPL); (3) develop a CVD technique and processing parameters suitable for the complete tungsten densification of the collimator replicas; (4) develop a chemical etching technique for the removal of the collimator replicas after the tungsten deposition process; and (5) fabricate and deliver tungsten collimating grid specimens.

  14. A Java commodity grid kit.

    SciTech Connect

    von Laszewski, G.; Foster, I.; Gawor, J.; Lane, P.; Mathematics and Computer Science

    2001-07-01

In this paper we report on the features of the Java Commodity Grid Kit. The Java CoG Kit provides middleware for accessing Grid functionality from the Java framework. Java CoG Kit middleware is general enough to design a variety of advanced Grid applications with quite different user requirements. Access to the Grid is established via Globus protocols, allowing the Java CoG Kit to communicate also with the C Globus reference implementation. Thus, the Java CoG Kit provides Grid developers with the ability to utilize the Grid, as well as numerous additional libraries and frameworks developed by the Java community to enable network, Internet, enterprise, and peer-to-peer computing. A variety of projects have successfully used the client libraries of the Java CoG Kit to access Grids driven by the C Globus software. In this paper we also report on the efforts to develop server-side Java CoG Kit components. As part of this research we have implemented a prototype pure Java resource management system that enables one to run Globus jobs on platforms on which a Java virtual machine is supported, including Windows NT machines.

  15. GridOPTICS Software System

    SciTech Connect

Akyol, Bora A.; Ciraci, Selim; Gibson, Tara; Rice, Mark; Sharma, Poorva; Yin, Jian; Allwardt, Craig; /PNNL

    2014-02-24

    GridOPTICS Software System (GOSS) is a middleware that facilitates creation of new, modular and flexible operational and planning platforms that can meet the challenges of the next generation power grid. GOSS enables Department of Energy, power system utilities, and vendors to build better tools faster. GOSS makes it possible to integrate Future Power Grid Initiative software products/prototypes into existing power grid software systems, including the PNNL PowerNet and EIOC environments. GOSS is designed to allow power grid applications developed for different underlying software platforms installed in different utilities to communicate with ease. This can be done in compliance with existing security and data sharing policies between the utilities. GOSS not only supports one-to-one data transfer between applications, but also publisher/subscriber scheme. To support interoperability requirements of future EMS, GOSS is designed for CIM compliance. In addition to this, it supports authentication and authorization capabilities to protect the system from cyber threats. In summary, the contributions of the GOSS middleware are as follows: • A platform to support future EMS development. • A middleware that promotes interoperability between power grid applications. • A distributed architecture that separates data sources from power grid applications. • Support for data exchange with either one-to-one or publisher/subscriber interfaces. • An authentication and authorization scheme for limiting the access to data between utilities.

  16. Buildings-to-Grid Technical Opportunities: From the Grid Perspective

    SciTech Connect

    Kropski, Ben; Pratt, Rob

    2014-03-28

    This paper outlines the nature of the power grid, lists challenges and barriers to the implementation of a transactive energy ecosystem, and provides concept solutions to current technological impediments.

  17. Running medical image analysis on GridFactory desktop grid.

    PubMed

    Orellana, Frederik; Niinimaki, Marko; Zhou, Xin; Rosendahl, Peter; Müller, Henning; Waananen, Anders

    2009-01-01

At the Geneva University Hospitals, work is in progress to establish a computing facility for medical image analysis, potentially using several hundred desktop computers. Typically, hospitals do not have a computer infrastructure dedicated to research, nor can the data leave the hospital network, for reasons of privacy. For this purpose, a novel batch system called GridFactory has been tested alongside the well-known batch system Condor. GridFactory's main benefits, compared to other batch systems, lie in its virtualization support and firewall friendliness. The tests involved running visual feature extraction on 50,000 anonymized medical images on a small local grid of 20 desktop computers. A comparison with a Condor-based batch system on the same computers is then presented. The performance of GridFactory is found to be satisfactory.

  18. Grid Visualization Tool

    NASA Technical Reports Server (NTRS)

    Chouinard, Caroline; Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steven

    2005-01-01

    The Grid Visualization Tool (GVT) is a computer program for displaying the path of a mobile robotic explorer (rover) on a terrain map. The GVT reads a map-data file in either portable graymap (PGM) or portable pixmap (PPM) format, representing a gray-scale or color map image, respectively. The GVT also accepts input from path-planning and activity-planning software. From these inputs, the GVT generates a map overlaid with one or more rover path(s), waypoints, locations of targets to be explored, and/or target-status information (indicating success or failure in exploring each target). The display can also indicate different types of paths or path segments, such as the path actually traveled versus a planned path or the path traveled to the present position versus planned future movement along a path. The program provides for updating of the display in real time to facilitate visualization of progress. The size of the display and the map scale can be changed as desired by the user. The GVT was written in the C++ language using the Open Graphics Library (OpenGL) software. It has been compiled for both Sun Solaris and Linux operating systems.
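
    The GVT itself is written in C++ with OpenGL; as a lightweight illustration of the same inputs, the sketch below reads a grayscale map in the plain (ASCII) PGM format and overlays rover waypoints. Only plain P2 files are handled, and the marker value and waypoints are assumptions for illustration.

```python
def read_ascii_pgm(path):
    """Read a plain (P2, ASCII) PGM file into a list of rows of gray values."""
    with open(path) as f:
        tokens = [t for line in f
                  for t in line.split('#')[0].split()]   # strip comments
    assert tokens[0] == "P2", "only plain PGM (P2) handled in this sketch"
    width, height, _maxval = int(tokens[1]), int(tokens[2]), int(tokens[3])
    values = list(map(int, tokens[4:4 + width * height]))
    return [values[r * width:(r + 1) * width] for r in range(height)]

def overlay_path(grid, waypoints, marker=255):
    """Mark rover waypoints (col, row) on the map with a bright pixel."""
    for col, row in waypoints:
        grid[row][col] = marker
    return grid

if __name__ == "__main__":
    # Tiny illustrative map and path (not actual GVT inputs).
    with open("map.pgm", "w") as f:
        f.write("P2\n5 4\n255\n" + " ".join(["10"] * 20) + "\n")
    grid = read_ascii_pgm("map.pgm")
    grid = overlay_path(grid, [(0, 0), (1, 1), (2, 2), (3, 3)])
    for row in grid:
        print(" ".join(f"{v:3d}" for v in row))
```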

  19. National transmission grid study

    SciTech Connect

    Abraham, Spencer

    2003-05-31

The National Energy Policy Plan directed the U.S. Department of Energy (DOE) to conduct a study to examine the benefits of establishing a national electricity transmission grid and to identify transmission bottlenecks and measures to address them. DOE began by conducting an independent analysis of U.S. electricity markets and identifying transmission system bottlenecks using DOE’s Policy Office Electricity Modeling System (POEMS). DOE’s analysis, presented in Section 2, confirms the central role of the nation’s transmission system in lowering costs to consumers through increased trade. More importantly, DOE’s analysis also confirms the results of previous studies, which show that transmission bottlenecks and related transmission system market practices are adding hundreds of millions of dollars to consumers’ electricity bills each year. A more detailed technical overview of the use of POEMS is provided in Appendix A. DOE led an extensive, open, public input process and heard a wide range of comments and recommendations that have all been considered. More than 150 participants registered for three public workshops held in Detroit, MI (September 24, 2001); Atlanta, GA (September 26, 2001); and Phoenix, AZ (September 28, 2001).

  20. Grid technologies empowering drug discovery.

    PubMed

    Chien, Andrew; Foster, Ian; Goddette, Dean

    2002-10-15

    Grid technologies enable flexible coupling and sharing of computers, instruments and storage. Grids can provide technical solutions to the volume of data and computational demands associated with drug discovery by delivering larger computing capability (flexible resource sharing), providing coordinated access to large data resources and enabling novel online exploration (coupling computing, data and instruments online). Here, we illustrate this potential by describing two applications: the use of desktop PC grid technologies for virtual screening, and distributed X-ray structure reconstruction and online visualization.

  1. Simulation of an Isolated Tiltrotor in Hover with an Unstructured Overset-Grid RANS Solver

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, Elizabeth M.; Biedron, Robert T.

    2009-01-01

An unstructured overset-grid Reynolds Averaged Navier-Stokes (RANS) solver, FUN3D, is used to simulate an isolated tiltrotor in hover. An overview of the computational method is presented as well as the details of the overset-grid systems. Steady-state computations within a noninertial reference frame define the performance trends of the rotor across a range of the experimental collective settings. Results are presented to show the effects of off-body grid refinement and blade grid refinement. The computed performance and blade loading trends show good agreement with experimental results and previously published structured overset-grid computations. Off-body flow features indicate a significant improvement in the resolution of the first perpendicular blade vortex interaction with background grid refinement across the collective range. Considering experimental data uncertainty and effects of transition, the prediction of figure of merit on the baseline and refined grid is reasonable at the higher collective range, within 3 percent of the measured values. At the lower collective settings, the computed figure of merit is approximately 6 percent lower than the experimental data. A comparison of steady and unsteady results shows that with temporal refinement, the dynamic results closely match the steady-state noninertial results, which gives confidence in the accuracy of the dynamic overset-grid approach.

  2. Charting the collision between a seed coat fragment and newly-designed lint cleaner grid bars

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An experiment was run to determine how a seed coat fragment (SCF) reacts after colliding with newly-designed grid bars mounted on a saw-type lint cleaner simulator. A high-speed video camera recorded the action that took place. Ten experimental grid bars were tested. The included angle of the sha...

  3. Assistive Awareness in Smart Grids

    NASA Astrophysics Data System (ADS)

    Bourazeri, Aikaterini; Almajano, Pablo; Rodriguez, Inmaculada; Lopez-Sanchez, Maite

    The following sections are included: * Introduction * Background * The User-Infrastructure Interface * User Engagement through Assistive Awareness * Research Impact * Serious Games for Smart Grids * Serious Game Technology * Game scenario * Game mechanics * Related Work * Summary and Conclusions

  4. Modal Analysis for Grid Operation

    SciTech Connect

    2011-03-03

The MANGO software provides a solution for improving the small-signal stability of power systems by adjusting operator-controllable variables using PMU measurements. System oscillation problems are one of the major threats to grid stability and reliability in California and the Western Interconnection. These problems result in power fluctuations and lower grid operation efficiency, and may even lead to large-scale grid breakup and outages. The MANGO software addresses this problem by automatically generating recommended operating procedures, termed Modal Analysis for Grid Operation (MANGO), to improve the damping of inter-area oscillation modes. The MANGO procedure includes three steps: recognizing small-signal stability problems, implementing operating-point adjustment using modal sensitivity, and evaluating the effectiveness of the adjustment. The MANGO software package is designed to help implement the MANGO procedure.
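
    The first MANGO step, recognizing a small-signal stability problem, amounts to checking the damping ratio of the critical oscillation modes. The sketch below extracts mode frequency and damping ratio from the eigenvalues of a small linearized state matrix; the example matrix and the 5% damping threshold are illustrative assumptions, not part of the MANGO package.

```python
import numpy as np

def modal_summary(A, damping_threshold=0.05):
    """Report frequency (Hz) and damping ratio of each oscillatory mode of
    dx/dt = A x, flagging modes below the damping threshold."""
    modes = []
    for lam in np.linalg.eigvals(np.asarray(A, dtype=float)):
        sigma, omega = lam.real, lam.imag
        if omega <= 0:                      # keep one of each conjugate pair
            continue
        zeta = -sigma / np.hypot(sigma, omega)
        modes.append({"freq_hz": omega / (2 * np.pi),
                      "damping": zeta,
                      "poorly_damped": zeta < damping_threshold})
    return modes

if __name__ == "__main__":
    # Illustrative 4-state system with one well-damped and one poorly damped
    # inter-area-like mode (not an actual power system model).
    A = [[-0.5,  5.0,  0.0,  0.0],
         [-5.0, -0.5,  0.0,  0.0],
         [ 0.0,  0.0, -0.02, 2.0],
         [ 0.0,  0.0, -2.0, -0.02]]
    for m in modal_summary(A):
        print(f"f = {m['freq_hz']:.2f} Hz, zeta = {m['damping']:.3f}, "
              f"poorly damped: {m['poorly_damped']}")
```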

  5. Uptakes of e-Science in Asia

    NASA Astrophysics Data System (ADS)

    Yen, Eric; Lin, Simon C.

e-Science refers to either computationally intensive science or data-intensive science that is carried out in highly distributed computing environments [1, 2]. The incentives for this new paradigm of science come mainly from the relentless pursuit of new science and from building the capacity to digest scientific data of unprecedented scale into new knowledge. Originating from the requirements of the LHC collaborations, the LHC Computing Grid (LCG) was constructed from early 2002 to provide worldwide data sharing and resource integration [3]. Based on LCG, a global grid-based e-Infrastructure was established very quickly in Europe, America and Asia to support a wider range of scientific disciplines in e-Science. Many international e-Science joint efforts in astronomy, life science, earth science, environmental change, and the humanities and social sciences now take advantage of the same e-Infrastructure to achieve a synergy greater than the sum of the individual efforts.

  6. 3D laser inspection of fuel assembly grid spacers for nuclear reactors based on diffractive optical elements

    NASA Astrophysics Data System (ADS)

    Finogenov, L. V.; Lemeshko, Yu A.; Zav'yalov, P. S.; Chugui, Yu V.

    2007-06-01

Ensuring the safety and high operational reliability of nuclear reactors requires 100% inspection of the geometrical parameters of fuel assemblies, which include grid spacers formed as a cellular structure holding the fuel elements. The required grid spacer geometry in the transverse and longitudinal cross sections is extremely important for maintaining the necessary heat regime. A universal method for 3D grid spacer inspection is investigated that uses a diffractive optical element (DOE) to generate, as structured illumination, a multiple-ring pattern on the inner surface of a grid spacer cell. With a set of such DOEs, the full range of produced grids can be inspected. A special objective has been developed for imaging the inner cell surface. The synthesis of the diffractive elements, the design of the projection optics, alignment methods, and the calibration of the experimental measuring system are considered. Image-processing algorithms for the different structural elements of the grids (cell, channel hole, outer grid spacer rim) and the experimental results are presented.

  7. Revised Extended Grid Library

    SciTech Connect

    Martz, Roger L.

    2016-07-15

    The Revised Eolus Grid Library (REGL) is a mesh-tracking library developed for use with the MCNP6™ computer code so that (radiation) particles can track on an unstructured mesh. The unstructured mesh is a finite element representation of any geometric solid model created with a state-of-the-art CAE/CAD tool. The mesh-tracking library is written using modern Fortran and programming standards; the library is Fortran 2003 compliant. The library was created with a defined application programmer interface (API) so that it can easily integrate with other particle tracking/transport codes. The library does not handle parallel processing via the message passing interface (MPI) itself, but has been used successfully where the host code handles the MPI calls. The library is thread-safe and supports the OpenMP paradigm. As a library, all features are available through the API, and a tight coupling between it and the host code is required. Features of the library are summarized in the following list:
    • can accommodate first- and second-order 4-, 5-, and 6-sided polyhedra
    • any combination of element types may appear in a single geometry model
    • parts may not contain tetrahedra mixed with other element types
    • pentahedra and hexahedra can be together in the same part
    • robust handling of overlaps and gaps
    • tracks element-to-element to produce path-length results at the element level
    • finds element numbers for a given mesh location
    • finds intersection points on element faces for the particle tracks
    • produces a data file for post-processing results analysis
    • reads Abaqus .inp input (ASCII) files to obtain information for the global mesh model
    • supports parallel input processing via MPI
    • supports parallel particle transport by both MPI and OpenMP
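    One listed capability, finding the element that contains a given location, can be illustrated for first-order tetrahedra with a barycentric-coordinate test. This is a generic sketch under simplified assumptions, not REGL code or its API:

```python
# Generic point-location sketch for first-order tetrahedral elements
# (illustrative only; this is not the REGL library or its API).
import numpy as np

def barycentric(tet, p):
    """Barycentric coordinates of point p in tetrahedron tet (4x3 array)."""
    T = np.column_stack((tet[1] - tet[0], tet[2] - tet[0], tet[3] - tet[0]))
    l123 = np.linalg.solve(T, p - tet[0])
    return np.concatenate(([1.0 - l123.sum()], l123))

def find_element(nodes, elements, p, tol=1e-12):
    """Return the index of the element containing p, or None."""
    for i, conn in enumerate(elements):
        lam = barycentric(nodes[conn], p)
        if np.all(lam >= -tol):        # inside (or on a face) when all coords >= 0
            return i
    return None

# Two tetrahedra sharing a face, plus a query point
nodes = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]], float)
elements = [[0, 1, 2, 3], [1, 2, 3, 4]]
print(find_element(nodes, elements, np.array([0.6, 0.6, 0.6])))   # -> 1
```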

  8. Smart Wire Grid: Resisting Expectations

    SciTech Connect

    Ramsay, Stewart; Lowe, DeJim

    2014-03-03

    Smart Wire Grid's DSR technology (Discrete Series Reactor) can be quickly deployed on electrical transmission lines to create intelligent mesh networks capable of quickly rerouting electricity to get power where and when it's needed the most. With their recent ARPA-E funding, Smart Wire Grid has been able to move from prototype and field testing to building out a US manufacturing operation in just under a year.

  9. Reinventing Batteries for Grid Storage

    ScienceCinema

    Banerjee, Sanjoy

    2016-07-12

    The City University of New York's Energy Institute, with the help of ARPA-E funding, is creating safe, low cost, rechargeable, long lifecycle batteries that could be used as modular distributed storage for the electrical grid. The batteries could be used at the building level or the utility level to offer benefits such as capture of renewable energy, peak shaving and microgridding, for a safer, cheaper, and more secure electrical grid.

  10. Smart Wire Grid: Resisting Expectations

    ScienceCinema

    Ramsay, Stewart; Lowe, DeJim

    2016-07-12

    Smart Wire Grid's DSR technology (Discrete Series Reactor) can be quickly deployed on electrical transmission lines to create intelligent mesh networks capable of quickly rerouting electricity to get power where and when it's needed the most. With their recent ARPA-E funding, Smart Wire Grid has been able to move from prototype and field testing to building out a US manufacturing operation in just under a year.

  11. Parallel Power Grid Simulation Toolkit

    SciTech Connect

    Smith, Steve; Kelley, Brian; Banks, Lawrence; Top, Philip; Woodward, Carol

    2015-09-14

    ParGrid is a 'wrapper' that integrates a coupled power grid simulation toolkit, consisting of a library to manage the synchronization and communication of independent simulations. The library included in ParGrid, named FSKIT, is intended to support the coupling of multiple continuous and discrete-event parallel simulations. The code is designed using modern object-oriented C++ methods, utilizing C++11 and current Boost libraries to ensure compatibility with multiple operating systems and environments.
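    The kind of synchronization such a library manages can be sketched as a conservative, fixed-interval exchange between two independently advancing simulators. The two toy simulators below are hypothetical; this is not the ParGrid/FSKIT API:

```python
# Conceptual sketch of synchronizing two coupled simulators at fixed exchange
# points (illustrative only; not the ParGrid/FSKIT API).

class GridSim:
    """Toy continuous simulator: a bus voltage relaxing toward a setpoint."""
    def __init__(self):
        self.t, self.voltage, self.setpoint = 0.0, 1.0, 1.0
    def advance(self, t_stop, dt=0.001):
        while self.t < t_stop:
            self.voltage += dt * (self.setpoint - self.voltage)
            self.t += dt

class ControlSim:
    """Toy discrete-event controller: nudges the setpoint once per exchange."""
    def __init__(self):
        self.t = 0.0
    def advance(self, t_stop, measured_voltage):
        self.t = t_stop
        return 1.05 if measured_voltage < 1.02 else measured_voltage

grid, ctrl = GridSim(), ControlSim()
t_exchange = 0.1                      # synchronization interval
for step in range(1, 11):
    t_next = step * t_exchange
    grid.advance(t_next)              # each simulator advances independently...
    grid.setpoint = ctrl.advance(t_next, grid.voltage)   # ...and state is exchanged at the barrier
print(f"t={grid.t:.1f}s  voltage={grid.voltage:.4f}")
```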

  12. Grid cells without theta oscillations in the entorhinal cortex of bats.

    PubMed

    Yartsev, Michael M; Witter, Menno P; Ulanovsky, Nachum

    2011-11-02

    Grid cells provide a neural representation of space, by discharging when an animal traverses through the vertices of a periodic hexagonal grid spanning the environment. Although grid cells have been characterized in detail in rats, the fundamental question of what neural dynamics give rise to the grid structure remains unresolved. Two competing classes of models were proposed: network models, based on attractor dynamics, and oscillatory interference models, which propose that interference between somatic and dendritic theta-band oscillations (4-10 Hz) in single neurons transforms a temporal oscillation into a spatially periodic grid. So far, these models could not be dissociated experimentally, because rodent grid cells always co-exist with continuous theta oscillations. Here we used a novel animal model, the Egyptian fruit bat, to refute the proposed causal link between grids and theta oscillations. On the basis of our previous finding from bat hippocampus, of spatially tuned place cells in the absence of continuous theta oscillations, we hypothesized that grid cells in bat medial entorhinal cortex might also exist without theta oscillations. Indeed, we found grid cells in bat medial entorhinal cortex that shared remarkable similarities to rodent grid cells. Notably, the grids existed in the absence of continuous theta-band oscillations, and with almost no theta modulation of grid-cell spiking--both of which are essential prerequisites of the oscillatory interference models. Our results provide a direct demonstration of grid cells in a non-rodent species. Furthermore, they strongly argue against a major class of computational models of grid cells.

  13. CDF GlideinWMS usage in grid computing of high energy physics

    SciTech Connect

    Zvada, Marian; Benjamin, Doug; Sfiligoi, Igor; /Fermilab

    2010-01-01

    Many members of large science collaborations already have specialized grids available to advance their research, driven by the need for more computing resources for data analysis. This need has forced the Collider Detector at Fermilab (CDF) collaboration to move beyond the usage of dedicated resources and to start exploiting Grid resources. Nowadays, the CDF experiment relies increasingly on glidein-based computing pools for data reconstruction, Monte Carlo production and user data analysis, serving over 400 users through the central analysis farm middleware (CAF) on top of the Condor batch system and the CDF Grid infrastructure. Condor is designed as a distributed architecture, and its glidein mechanism of pilot jobs is ideal for abstracting the Grid computing by creating a virtual private computing pool. We present the first production use of the generic pilot-based Workload Management System (glideinWMS), an implementation of the pilot mechanism based on the Condor distributed infrastructure. CDF Grid computing uses glideinWMS for its data reconstruction on the FNAL campus Grid, and for user analysis and Monte Carlo production across the Open Science Grid (OSG). We review this computing model and the setup used, including the CDF-specific configuration within the glideinWMS system, which provides powerful scalability and makes Grid computing work like a local batch environment with the ability to handle more than 10000 running jobs at a time.
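    The pilot (glidein) mechanism can be sketched conceptually: a pilot lands on a grid worker node and pulls user jobs from the collaboration's central queue until none remain, so heterogeneous grid sites behave like one local batch pool. The queue and job representation below are illustrative only, not glideinWMS or Condor code:

```python
# Conceptual sketch of the pilot-job (glidein) idea: the pilot pulls real user
# jobs from a central queue only after it has secured a worker node.
# (Illustrative only; not glideinWMS/Condor code.)
import queue
import threading

central_queue = queue.Queue()          # stands in for the VO's central job queue
for i in range(6):
    central_queue.put(f"user-analysis-job-{i}")

def pilot(worker_name):
    """Runs on a grid worker node; turns it into a slot of the virtual pool."""
    # A real pilot would first validate the environment (software, scratch space,
    # outbound connectivity) before advertising itself to the central collector.
    while True:
        try:
            job = central_queue.get_nowait()
        except queue.Empty:
            return                     # no more work: the pilot exits and frees the slot
        print(f"{worker_name}: running {job}")
        central_queue.task_done()

# Pilots submitted to three different grid sites all drain the same queue.
workers = [threading.Thread(target=pilot, args=(f"site{s}-node",)) for s in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()
```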

  14. CDF GlideinWMS usage in Grid computing of high energy physics

    NASA Astrophysics Data System (ADS)

    Zvada, Marian; Benjamin, Doug; Sfiligoi, Igor

    2010-04-01

    Many members of large science collaborations already have specialized grids available to advance their research, driven by the need for more computing resources for data analysis. This need has forced the Collider Detector at Fermilab (CDF) collaboration to move beyond the usage of dedicated resources and to start exploiting Grid resources. Nowadays, the CDF experiment relies increasingly on glidein-based computing pools for data reconstruction, Monte Carlo production and user data analysis, serving over 400 users through the central analysis farm middleware (CAF) on top of the Condor batch system and the CDF Grid infrastructure. Condor is designed as a distributed architecture, and its glidein mechanism of pilot jobs is ideal for abstracting the Grid computing by creating a virtual private computing pool. We present the first production use of the generic pilot-based Workload Management System (glideinWMS), an implementation of the pilot mechanism based on the Condor distributed infrastructure. CDF Grid computing uses glideinWMS for its data reconstruction on the FNAL campus Grid, and for user analysis and Monte Carlo production across the Open Science Grid (OSG). We review this computing model and the setup used, including the CDF-specific configuration within the glideinWMS system, which provides powerful scalability and makes Grid computing work like a local batch environment with the ability to handle more than 10000 running jobs at a time.

  15. Grid-Enabled High Energy Physics Research using a Beowulf Cluster

    NASA Astrophysics Data System (ADS)

    Mahmood, Akhtar

    2005-04-01

    At Edinboro University of Pennsylvania, we have built an 8-node, 25 Gflops Beowulf cluster with 2.5 TB of disk storage space to carry out grid-enabled, data-intensive high energy physics research for the ATLAS experiment via Grid3. We will describe how we built and configured our cluster, which we have named the Sphinx Beowulf Cluster. We will describe the results of our cluster benchmark studies and the run-time plots of several parallel application codes. Once fully functional, the cluster will be part of Grid3 [www.ivdgl.org/grid3]. The current ATLAS grid simulation application models the entire physical process, from the proton anti-proton collisions and the detector's response to the collision debris, through the complete reconstruction of the event from analyses of these responses. The end result is a detailed set of data that simulates the real physical collision event inside a particle detector. The Grid is the new IT infrastructure for 21st-century science: a new computing paradigm that is poised to transform the practice of large-scale data-intensive research in science and engineering. The Grid will allow scientists worldwide to view and analyze the huge amounts of data flowing from large-scale experiments in high energy physics. The Grid is expected to bring together geographically and organizationally dispersed computational resources, such as CPUs, storage systems, communication systems, and data sources.

  16. Sensitivity of 30-cm mercury bombardment ion thruster characteristics to accelerator grid design

    NASA Technical Reports Server (NTRS)

    Rawlin, V. K.

    1978-01-01

    The design of ion optics for bombardment thrusters strongly influences overall performance and lifetime. The operation of a 30 cm thruster with accelerator grid open area fractions ranging from 43 to 24 percent was evaluated and compared with experimental and theoretical results. Ion optics properties measured included the beam current extraction capability, the minimum accelerator grid voltage to prevent backstreaming, the ion beamlet diameter as a function of radial position on the grid and accelerator grid hole diameter, and the location of the high-energy, high-angle ion beam edge. Discharge chamber properties evaluated were propellant utilization efficiency, minimum discharge power per beam amp, and minimum discharge voltage.

  17. Accessing ultra-high pressures and strain rates in the solid state: An experimental path to extreme materials science on the Omega and NIF lasers

    NASA Astrophysics Data System (ADS)

    Lorenz, K. Thomas

    2005-03-01

    A new approach to materials science at extreme pressures and strain rates has been developed on the Omega laser, using a ramped plasma piston drive. The laser drives a shock through a solid plastic reservoir that unloads at the rear free surface, expands across a vacuum gap, and stagnates on the metal sample under study. This produces a gently increasing ram pressure, compressing the sample nearly isentropically. The peak pressure on the sample, diagnosed with VISAR measurements, can be varied by adjusting the laser energy and pulse length, gap size, and reservoir density, and obeys a simple scaling relation.^1 This has been demonstrated at OMEGA at pressures up to 200 GPa in Al foils. In an important application, using in-flight x-ray radiography, the material strength of solid-state samples at high pressure can be inferred by measuring the reductions in the growth rates (stabilization) of Rayleigh-Taylor (RT) unstable interfaces. RT instability measurements of solid Al-6061-T6 and vanadium, at pressures of 20-100 GPa and strain rates of 10^6 to 10^8 s^-1, show clear material strength effects. High-pressure experimental designs based on this drive have been developed for the NIF laser, predicting that solid-state samples can be quasi-isentropically driven to pressures an order of magnitude higher than on Omega, accessing new regimes of dense, high-pressure matter. [1] J. Edwards et al., Phys. Rev. Lett., 92, 075002 (2004).

  18. Grid accounting service: state and future development

    NASA Astrophysics Data System (ADS)

    Levshina, T.; Sehgal, C.; Bockelman, B.; Weitzel, D.; Guru, A.

    2014-06-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and the identity of the users consuming those resources. The accounting service is important for verifying pledged resource allocations for particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and the Holland Computing Center at the University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machine provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the directions of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.
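    At its core, such a service aggregates per-job usage records by user and VO. The record fields below are hypothetical placeholders, not Gratia's actual schema:

```python
# Minimal sketch of what an accounting service aggregates: per-job usage records
# rolled up by VO and user. The record fields are hypothetical, not Gratia's schema.
from collections import defaultdict

job_records = [
    {"user": "alice", "vo": "cms", "site": "FNAL", "wall_hours": 12.0},
    {"user": "bob",   "vo": "osg", "site": "HCC",  "wall_hours": 3.5},
    {"user": "alice", "vo": "cms", "site": "HCC",  "wall_hours": 7.25},
]

usage_by_vo = defaultdict(float)
usage_by_user = defaultdict(float)
for rec in job_records:
    usage_by_vo[rec["vo"]] += rec["wall_hours"]
    usage_by_user[(rec["vo"], rec["user"])] += rec["wall_hours"]

# Reports like these back pledge verification and funding-agency summaries.
for vo, hours in sorted(usage_by_vo.items()):
    print(f"{vo}: {hours:.2f} wall-clock hours")
```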

  19. Grid accounting service: state and future development

    SciTech Connect

    Levshina, T.; Sehgal, C.; Bockelman, B.; Weitzel, D.; Guru, A.

    2014-01-01

    During the last decade, large-scale federated distributed infrastructures have been continually developed and expanded. One of the crucial components of a cyber-infrastructure is an accounting service that collects data related to resource utilization and the identity of the users consuming those resources. The accounting service is important for verifying pledged resource allocations for particular groups and users, providing reports for funding agencies and resource providers, and understanding hardware provisioning requirements. It can also be used for end-to-end troubleshooting as well as billing purposes. In this work we describe Gratia, a federated accounting service jointly developed at Fermilab and the Holland Computing Center at the University of Nebraska-Lincoln. The Open Science Grid, Fermilab, HCC, and several other institutions have used Gratia in production for several years. The current development activities include expanding Virtual Machine provisioning information, XSEDE allocation usage accounting, and Campus Grids resource utilization. We also identify the directions of future work: improvement and expansion of Cloud accounting, persistent and elastic storage space allocation, and the incorporation of WAN and LAN network metrics.

  20. On the application of Chimera/unstructured hybrid grids for conjugate heat transfer

    NASA Technical Reports Server (NTRS)

    Kao, Kai-Hsiung; Liou, Meng-Sing

    1995-01-01

    A hybrid grid system that combines the Chimera overset grid scheme and an unstructured grid method is developed to study fluid flow and heat transfer problems. With the proposed method, the solid structural region, in which only the heat conduction is considered, can be easily represented using an unstructured grid method. As for the fluid flow region external to the solid material, the Chimera overset grid scheme has been shown to be very flexible and efficient in resolving complex configurations. The numerical analyses require the flow field solution and material thermal response to be obtained simultaneously. A continuous transfer of temperature and heat flux is specified at the interface, which connects the solid structure and the fluid flow as an integral system. Numerical results are compared with analytical and experimental data for a flat plate and a C3X cooled turbine cascade. A simplified drum-disk system is also simulated to show the effectiveness of this hybrid grid system.
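    The interface condition described here, continuity of temperature and heat flux between the fluid and solid solvers, can be illustrated with a partitioned 1D steady-conduction example in which the two sides exchange interface data until they agree. The material properties, lengths and relaxation factor are arbitrary illustrative values:

```python
# Partitioned coupling sketch for conjugate heat transfer in 1D steady conduction:
# the "fluid" side receives an interface temperature and returns the heat flux it
# drives; the "solid" side receives that flux and returns an updated interface
# temperature. All values are hypothetical, for illustration only.
k_fluid, L_fluid, T_fluid_end = 0.6, 0.02, 400.0    # W/m-K, m, K
k_solid, L_solid, T_solid_end = 15.0, 0.01, 300.0

def fluid_flux(T_interface):
    """Flux flowing from the fluid side into the interface (Dirichlet step)."""
    return k_fluid * (T_fluid_end - T_interface) / L_fluid

def solid_temperature(q_interface):
    """Interface temperature that carries flux q through the solid (Neumann step)."""
    return T_solid_end + q_interface * L_solid / k_solid

T_int, relax = 350.0, 0.5
for it in range(100):
    q = fluid_flux(T_int)                           # continuity of heat flux ...
    T_new = solid_temperature(q)                    # ... and of temperature
    if abs(T_new - T_int) < 1e-8:
        break
    T_int = T_int + relax * (T_new - T_int)         # under-relaxed update

# Analytic check: series conduction between the two end temperatures.
R_f, R_s = L_fluid / k_fluid, L_solid / k_solid
T_exact = (T_fluid_end / R_f + T_solid_end / R_s) / (1.0 / R_f + 1.0 / R_s)
print(f"coupled interface T = {T_int:.3f} K, analytic = {T_exact:.3f} K")
```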

  1. Navier-Stokes simulation of rotor-body flowfield in hover using overset grids

    NASA Technical Reports Server (NTRS)

    Srinivasan, G. R.; Ahmad, J. U.

    1993-01-01

    A free-wake Navier-Stokes numerical scheme and multiple Chimera overset grids have been utilized for calculating the quasi-steady hovering flowfield of a Boeing-360 rotor mounted on an axisymmetric whirl-tower. The entire geometry of this rotor-body configuration is gridded-up with eleven different overset grids. The composite grid has 1.3 million grid points for the entire flow domain. The numerical results, obtained using coarse grids and a rigid rotor assumption, show a thrust value that is within 5% of the experimental value at a flow condition of M(sub tip) = 0.63, Theta(sub c) = 8 deg, and Re = 2.5 x 10(exp 6). The numerical method thus demonstrates the feasibility of using a multi-block scheme for calculating the flowfields of complex configurations consisting of rotating and non-rotating components.

  2. The functional micro-organization of grid cells revealed by cellular-resolution imaging.

    PubMed

    Heys, James G; Rangarajan, Krsna V; Dombeck, Daniel A

    2014-12-03

    Establishing how grid cells are anatomically arranged, on a microscopic scale, in relation to their firing patterns in the environment would facilitate a greater microcircuit-level understanding of the brain's representation of space. However, all previous grid cell recordings used electrode techniques that provide limited descriptions of fine-scale organization. We therefore developed a technique for cellular-resolution functional imaging of medial entorhinal cortex (MEC) neurons in mice navigating a virtual linear track, enabling a new experimental approach to study MEC. Using these methods, we show that grid cells are physically clustered in MEC compared to nongrid cells. Additionally, we demonstrate that grid cells are functionally micro-organized: the similarity between the environment firing locations of grid cell pairs varies as a function of the distance between them according to a "Mexican hat"-shaped profile. This suggests that, on average, nearby grid cells have more similar spatial firing phases than those further apart.
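    The pairwise analysis described, correlating the spatial firing maps of cell pairs and examining how similarity falls off with anatomical distance, can be sketched on synthetic data. The firing maps and cell positions below are simulated placeholders, not the study's data or code:

```python
# Sketch of the pairwise analysis: correlate spatial firing maps of cell pairs and
# bin the correlations by anatomical distance. All data below are synthetic
# placeholders generated for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_cells, track_bins = 40, 200
positions = rng.uniform(0, 300, size=(n_cells, 2))       # cell locations in the field (um)
x = np.linspace(0, 4 * np.pi, track_bins)                 # linearized track position
phases = rng.uniform(0, 2 * np.pi, n_cells)
rate_maps = (np.cos(x[None, :] - phases[:, None])
             + 0.3 * rng.standard_normal((n_cells, track_bins)))

# Pairwise anatomical distances and firing-map correlations
dists, sims = [], []
for i in range(n_cells):
    for j in range(i + 1, n_cells):
        dists.append(np.linalg.norm(positions[i] - positions[j]))
        sims.append(np.corrcoef(rate_maps[i], rate_maps[j])[0, 1])
dists, sims = np.array(dists), np.array(sims)

# Mean similarity per distance bin; a "Mexican hat" profile would show high
# similarity at short range dipping below baseline at intermediate range.
bins = np.arange(0, 350, 50)
idx = np.digitize(dists, bins)
for b in range(1, len(bins)):
    sel = idx == b
    if sel.any():
        print(f"{bins[b - 1]:3d}-{bins[b]:3d} um: mean similarity {sims[sel].mean():+.3f}")
```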

  3. Complete coverage of space favors modularity of the grid system in the brain

    NASA Astrophysics Data System (ADS)

    Sanzeni, A.; Balasubramanian, V.; Tiana, G.; Vergassola, M.

    2016-12-01

    Grid cells in the entorhinal cortex fire when animals that are exploring a certain region of space occupy the vertices of a triangular grid that spans the environment. Different neurons feature triangular grids that differ in their properties of periodicity, orientation, and ellipticity. Taken together, these grids allow the animal to maintain an internal, mental representation of physical space. Experiments show that grid cells are modular, i.e., there are groups of neurons which have grids with similar periodicity, orientation, and ellipticity. We use statistical physics methods to derive a relation between variability of the properties of the grids within a module and the range of space that can be covered completely (i.e., without gaps) by the grid system with high probability. Larger variability shrinks the range of representation, providing a functional rationale for the experimentally observed comodularity of grid cell periodicity, orientation, and ellipticity. We obtain a scaling relation between the number of neurons and the period of a module, given the variability and coverage range. Specifically, we predict how many more neurons are required at smaller grid scales than at larger ones.

  4. Complete coverage of space favors modularity of the grid system in the brain.

    PubMed

    Sanzeni, A; Balasubramanian, V; Tiana, G; Vergassola, M

    2016-12-01

    Grid cells in the entorhinal cortex fire when animals that are exploring a certain region of space occupy the vertices of a triangular grid that spans the environment. Different neurons feature triangular grids that differ in their properties of periodicity, orientation, and ellipticity. Taken together, these grids allow the animal to maintain an internal, mental representation of physical space. Experiments show that grid cells are modular, i.e., there are groups of neurons which have grids with similar periodicity, orientation, and ellipticity. We use statistical physics methods to derive a relation between variability of the properties of the grids within a module and the range of space that can be covered completely (i.e., without gaps) by the grid system with high probability. Larger variability shrinks the range of representation, providing a functional rationale for the experimentally observed comodularity of grid cell periodicity, orientation, and ellipticity. We obtain a scaling relation between the number of neurons and the period of a module, given the variability and coverage range. Specifically, we predict how many more neurons are required at smaller grid scales than at larger ones.

  5. Algebraic grid generation for complex geometries

    NASA Technical Reports Server (NTRS)

    Shih, T. I.-P.; Bailey, R. T.; Nguyen, H. L.; Roelke, R. J.

    1991-01-01

    An efficient computer program called GRID2D/3D has been developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2D and 3D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation. The distribution of grid points within the spatial domain is controlled by stretching functions and grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For 2D spatial domains the boundary curves are constructed by using either cubic or tension spline interpolation. For 3D spatial domains the boundary surfaces are constructed by using a new technique, developed in this study, referred to as 3D bidirectional Hermite interpolation.
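    The core algebraic idea, transfinite interpolation of an interior grid from its boundary curves, can be sketched for the 2D case as a Coons patch. The boundary curves below are arbitrary examples, not taken from GRID2D/3D:

```python
# 2D transfinite (Coons) interpolation sketch: build an interior grid from four
# boundary curves, the core idea behind algebraic grid generators like GRID2D/3D.
# The boundary curves here are arbitrary examples, not from the GRID2D/3D code.
import numpy as np

def transfinite_grid(bottom, top, left, right, nu, nv):
    """Boundary curves are functions of a parameter in [0, 1] returning (x, y)."""
    u = np.linspace(0.0, 1.0, nu)[:, None, None]      # shape (nu, 1, 1)
    v = np.linspace(0.0, 1.0, nv)[None, :, None]      # shape (1, nv, 1)
    B = lambda f, s: np.array([f(t) for t in np.ravel(s)])
    Cb, Ct = B(bottom, u)[:, None, :], B(top, u)[:, None, :]       # (nu, 1, 2)
    Cl, Cr = B(left, v)[None, :, :], B(right, v)[None, :, :]       # (1, nv, 2)
    P00, P10, P01, P11 = bottom(0.0), bottom(1.0), top(0.0), top(1.0)
    grid = ((1 - v) * Cb + v * Ct + (1 - u) * Cl + u * Cr
            - ((1 - u) * (1 - v) * np.asarray(P00) + u * (1 - v) * np.asarray(P10)
               + (1 - u) * v * np.asarray(P01) + u * v * np.asarray(P11)))
    return grid                                        # shape (nu, nv, 2)

# Example: a unit channel whose lower wall has a sinusoidal bump.
bottom = lambda t: (t, 0.1 * np.sin(np.pi * t))
top    = lambda t: (t, 1.0)
left   = lambda t: (0.0, t)
right  = lambda t: (1.0, t)
g = transfinite_grid(bottom, top, left, right, nu=21, nv=11)
print(g.shape, g[10, 5])        # interior grid point near the channel centre
```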

  6. Ion beamlet vectoring by grid translation

    NASA Technical Reports Server (NTRS)

    Homa, J. M.; Wilbur, P. J.

    1982-01-01

    Ion beamlet vectoring is investigated by collecting deflection and divergence angle data for two-grid systems as a function of the relative displacement of these grids. Results show that at large displacements, accelerator grid impingement becomes a limiting factor, and this determines the useful range of beamlet deflection. Beamlet deflection was shown to vary linearly with grid offset angle over this range. Values of the deflection-to-offset angle ratio and the useful range of deflection are presented as functions of grid-hole geometries, perveance levels, and accelerating voltages. The divergence of the beamlets is found to be unaffected by deflection over the useful range. The grids of a typical dished-grid ion thruster are examined to determine where over the grid surface the grid offsets exceed the useful range, which indicates the regions where high accelerator grid impingement is probably occurring.

  7. ReSS: Resource Selection Service for National and Campus Grid Infrastructure

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Garzoglio, Gabriele; Levshina, Tanya; Timm, Steve

    2010-04-01

    The Open Science Grid (OSG) offers access to around one hundred compute elements (CEs) and storage elements (SEs) via standard Grid interfaces. The Resource Selection Service (ReSS) is a push-based workload management system that is integrated with the OSG information systems and resources. ReSS integrates standard Grid tools such as Condor, as a brokering service, and the gLite CEMon, for gathering and publishing resource information in GLUE Schema format. ReSS is used in OSG by Virtual Organizations (VOs) such as the Dark Energy Survey (DES), DZero and the Engagement VO. ReSS is also used as a resource selection service for campus grids, such as FermiGrid. VOs use ReSS to automate the resource selection in their workload management systems to run jobs over the grid. In the past year, the system has been enhanced to enable the publication and selection of storage resources and of any special software or software libraries (such as MPI libraries) installed at computing resources. In this paper, we discuss the Resource Selection Service and its typical usage at the two scales of a national cyberinfrastructure grid, such as OSG, and of a campus grid, such as FermiGrid.
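    Push-based resource selection can be sketched as matching job requirements against published resource ads and ranking the matches. The attribute names and ranking rule below are hypothetical simplifications, not the GLUE schema or the Condor ClassAd language:

```python
# Conceptual sketch of push-based resource selection: job requirements are
# matched against published resource ads and ranked. The attribute names and
# ranking rule are hypothetical, not the GLUE schema or Condor ClassAds.
resource_ads = [
    {"name": "FNAL_GPGRID",  "free_slots": 120, "has_mpi": False, "storage_gb": 500},
    {"name": "HCC_RED",      "free_slots": 15,  "has_mpi": True,  "storage_gb": 2000},
    {"name": "CAMPUS_FERMI", "free_slots": 60,  "has_mpi": True,  "storage_gb": 50},
]

job = {"needs_mpi": True, "needs_storage_gb": 100, "min_slots": 10}

def matches(ad, job):
    """Requirements check: MPI availability, storage, and free slots."""
    return ((not job["needs_mpi"] or ad["has_mpi"])
            and ad["storage_gb"] >= job["needs_storage_gb"]
            and ad["free_slots"] >= job["min_slots"])

# Rank candidate sites by free slots, as a stand-in for a real Rank expression.
candidates = sorted((ad for ad in resource_ads if matches(ad, job)),
                    key=lambda ad: ad["free_slots"], reverse=True)
print([ad["name"] for ad in candidates])     # -> ['HCC_RED']
```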

  8. ReSS: Resource Selection Service for National and Campus Grid Infrastructure

    SciTech Connect

    Mhashilkar, Parag; Garzoglio, Gabriele; Levshina, Tanya; Timm, Steve; /Fermilab

    2009-05-01

    The Open Science Grid (OSG) offers access to around one hundred compute elements (CEs) and storage elements (SEs) via standard Grid interfaces. The Resource Selection Service (ReSS) is a push-based workload management system that is integrated with the OSG information systems and resources. ReSS integrates standard Grid tools such as Condor, as a brokering service, and the gLite CEMon, for gathering and publishing resource information in GLUE Schema format. ReSS is used in OSG by Virtual Organizations (VOs) such as the Dark Energy Survey (DES), DZero and the Engagement VO. ReSS is also used as a resource selection service for campus grids, such as FermiGrid. VOs use ReSS to automate the resource selection in their workload management systems to run jobs over the grid. In the past year, the system has been enhanced to enable the publication and selection of storage resources and of any special software or software libraries (such as MPI libraries) installed at computing resources. In this paper, we discuss the Resource Selection Service and its typical usage at the two scales of a national cyberinfrastructure grid, such as OSG, and of a campus grid, such as FermiGrid.

  9. Interoperability of GADU in using heterogeneous Grid resources for bioinformatics applications.

    SciTech Connect

    Sulakhe, D.; Rodriguez, A.; Wilde, M.; Foster, I.; Maltsev, N.; Univ. of Chicago

    2008-03-01

    Bioinformatics tools used for efficient and computationally intensive analysis of genetic sequences require large-scale computational resources to accommodate the growing data. Grid computational resources such as the Open Science Grid and TeraGrid have proved useful for scientific discovery. The Genome Analysis and Database Update system (GADU) is a high-throughput computational system developed to automate the steps involved in accessing Grid resources for running bioinformatics applications. This paper describes the requirements for building an automated, scalable system such as GADU that can run jobs on different Grids. It describes the resource-independent configuration of GADU using the Pegasus-based virtual data system, which makes high-throughput computational tools interoperable on heterogeneous Grid resources, and highlights the features implemented to make GADU a gateway to computationally intensive bioinformatics applications on the Grid. The paper does not go into the details of the problems involved or the lessons learned in using individual Grid resources, as these have already been published in our paper on the Genome Analysis Research Environment (GNARE); it focuses primarily on the architecture that makes GADU resource-independent and interoperable across heterogeneous Grid resources.

  10. Privacy protection in HealthGrid: distributing encryption management over the VO.

    PubMed

    Torres, Erik; de Alfonso, Carlos; Blanquer, Ignacio; Hernández, Vicente

    2006-01-01

    Grid technologies have proven very successful in tackling challenging problems in which data access and processing is a bottleneck. Notwithstanding the benefits that Grid technologies could bring to health applications, the privacy leakages of current DataGrid technologies, due to the sharing of data in VOs and the use of remote resources, hinder their widespread adoption. Privacy control for Grid technology has become a key requirement for the adoption of Grids in the healthcare sector. Encrypted storage of confidential data effectively reduces the risk of disclosure. A self-enforcing scheme for encrypted data storage can be achieved by combining Grid security systems with distributed key management and classical cryptography techniques. Virtual Organizations, as the main unit of user management in the Grid, can provide a way to organize key sharing, access control lists and secure encryption management. This paper provides programming models and discusses the value, costs and behavior of such a system implemented on top of one of the latest Grid middlewares. This work is partially funded by the Spanish Ministry of Science and Technology in the frame of the project Investigación y Desarrollo de Servicios GRID: Aplicación a Modelos Cliente-Servidor, Colaborativos y de Alta Productividad, with reference TIC2003-01318.
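    The combination of encrypted storage with key management distributed over the VO can be sketched as follows: data are encrypted with a symmetric key, and the key is split into shares held by different VO members so that no single party can decrypt alone. The sketch uses the 'cryptography' package and a simple n-of-n XOR sharing scheme; the member roles are illustrative, not a specific HealthGrid middleware:

```python
# Sketch of the idea described above: encrypt a record with a symmetric key and
# distribute that key as XOR shares among VO members, so no single party can
# decrypt alone. The share scheme and member roles are illustrative only.
import secrets
from cryptography.fernet import Fernet

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """n-of-n XOR secret sharing: every share is required to rebuild the key."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = xor_bytes(last, s)
    return shares + [last]

def rebuild_key(shares: list[bytes]) -> bytes:
    key = shares[0]
    for s in shares[1:]:
        key = xor_bytes(key, s)
    return key

# Encrypt a record, then hand one share to each VO key server.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"patient-id=123; diagnosis=...")
vo_shares = split_key(data_key, n=3)          # e.g. hospital, registry, VO manager

# Decryption is only possible when the VO members cooperate.
recovered = rebuild_key(vo_shares)
assert recovered == data_key
print(Fernet(recovered).decrypt(ciphertext))
```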

  11. Greening the Grid - Advancing Solar, Wind, and Smart Grid Technologies (Spanish Version)

    SciTech Connect

    2016-04-01

    This is the Spanish version of 'Greening the Grid - Advancing Solar, Wind, and Smart Grid Technologies'. Greening the Grid provides technical assistance to energy system planners, regulators, and grid operators to overcome challenges associated with integrating variable renewable energy into the grid.

  12. Scaling Up Renewable Energy Generation: Aligning Targets and Incentives with Grid Integration Considerations, Greening The Grid

    SciTech Connect

    Katz, Jessica; Cochran, Jaquelin

    2015-05-27

    Greening the Grid provides technical assistance to energy system planners, regulators, and grid operators to overcome challenges associated with integrating variable renewable energy into the grid. This document, part of a Greening the Grid toolkit, provides power system planners with tips to help secure and sustain investment in new renewable energy generation by aligning renewable energy policy targets and incentives with grid integration considerations.

  13. Detecting Extreme Events in Gridded Climate Data

    SciTech Connect

    Ramachandra, Bharathkumar; Gadiraju, Krishna; Vatsavai, Raju; Kaiser, Dale Patrick; Karnowski, Thomas Paul

    2016-01-01

    Detecting and tracking extreme events in gridded climatological data is a challenging problem on several fronts: algorithms, scalability, and I/O. Successful detection of these events will give climate scientists an alternate view of the behavior of different climatological variables, leading to an enhanced scientific understanding of the impacts of events such as heat and cold waves and, on a larger scale, the El Niño Southern Oscillation. Recent advances in computing power and research in the data sciences have enabled us to look at this problem from a different perspective than was previously possible. In this paper we present our computationally efficient algorithms for anomalous cluster detection on climate change big data. We provide results on the detection and tracking of surface temperature and geopotential height anomalies, a trend analysis, and a study of relationships between the variables. We also identify the limitations of our approaches, future directions for research, and alternate approaches.
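    One simple way to detect anomalous clusters in a gridded field, offered here only as an illustrative stand-in for the paper's algorithms, is to threshold standardized anomalies and group contiguous flagged cells with connected-component labelling:

```python
# Illustrative anomaly-cluster detection on a gridded field: flag grid cells that
# exceed a standardized-anomaly threshold and group contiguous flagged cells with
# connected-component labelling. A simple stand-in, not the paper's algorithm.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(42)
field = rng.standard_normal((90, 180))            # e.g. one day of gridded temperature
field[40:46, 60:70] += 4.0                        # implant a synthetic heat-wave blob

# Standardize and threshold (anomalies beyond +2 sigma here)
z = (field - field.mean()) / field.std()
mask = z > 2.0

# Group contiguous anomalous cells into events and keep the sizeable ones
labels, n = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
events = [i + 1 for i, s in enumerate(sizes) if s >= 20]
for ev in events:
    rows, cols = np.where(labels == ev)
    print(f"event {ev}: {len(rows)} cells, centered near ({rows.mean():.0f}, {cols.mean():.0f})")
```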

  14. CPV Vs. PV from a grid-matching perspective

    NASA Astrophysics Data System (ADS)

    Strobach, E.; Bader, S.; Faiman, D.; Solomon, A. A.; Meron, G.

    2012-10-01

    In a recently-published series of papers we studied the general problem of matching large PV systems of various kinds to the electricity grid. In those studies CPV systems were simulated via an artificial ansatz in which they were treated as 2-axis flat-panel PV systems whose light sensitivity is only to the direct normal component (DNI) of solar irradiance. The present study replaces this ansatz with an experimentally validated model of a real CPV system. We compare a genuine 2-axis high-concentration CPV system with a static flat-panel PV system from the standpoint of matching both system types to the Israeli electricity grid.

  15. Mesh Generation via Local Bisection Refinement of Triangulated Grids

    DTIC Science & Technology

    2015-06-01

    Mesh Generation via Local Bisection Refinement of Triangulated Grids, Jason R. Looker, Joint and Operations Analysis Division, Defence Science and Technology Organisation, DSTO–TR–3095. This report provides a comprehensive implementation of an unstructured mesh generation method ... relatively simple to implement, has the capacity to quickly generate a refined mesh with triangles that rapidly change size over a short distance, and does
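    The report itself is only excerpted above; as a generic illustration of the technique named in the title, the sketch below refines marked triangles by longest-edge bisection. Conformity handling (closure of neighbouring triangles to avoid hanging nodes) is omitted, and this is not the report's implementation:

```python
# Generic sketch of local mesh refinement by longest-edge bisection: each marked
# triangle is split by inserting the midpoint of its longest edge. Closure of
# neighbouring triangles (to avoid hanging nodes) is omitted for brevity.
import numpy as np

def bisect(vertices, triangles, marked):
    vertices = list(map(np.asarray, vertices))
    new_tris, midpoints = [], {}
    for t, tri in enumerate(triangles):
        if t not in marked:
            new_tris.append(tri)
            continue
        a, b, c = tri
        edges = [(a, b), (b, c), (c, a)]
        # longest edge (i, j) and the opposite vertex k
        i, j = max(edges, key=lambda e: np.linalg.norm(vertices[e[0]] - vertices[e[1]]))
        k = ({a, b, c} - {i, j}).pop()
        key = tuple(sorted((i, j)))
        if key not in midpoints:                    # reuse midpoint if edge already split
            midpoints[key] = len(vertices)
            vertices.append(0.5 * (vertices[i] + vertices[j]))
        m = midpoints[key]
        new_tris += [(i, m, k), (m, j, k)]          # two children replace the parent
    return vertices, new_tris

verts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
tris = [(0, 1, 2), (1, 3, 2)]
verts, tris = bisect(verts, tris, marked={0})
print(len(verts), tris)     # 5 vertices; triangle 0 split into two children
```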

  16. ASCR Science Network Requirements

    SciTech Connect

    Dart, Eli; Tierney, Brian

    2009-08-24

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In April 2009 ESnet and the Office of Advanced Scientific Computing Research (ASCR), of the DOE Office of Science, organized a workshop to characterize the networking requirements of the programs funded by ASCR. The ASCR facilities anticipate significant increases in wide area bandwidth utilization, driven largely by the increased capabilities of computational resources and the wide scope of collaboration that is a hallmark of modern science. Many scientists move data sets between facilities for analysis, and in some cases (for example the Earth System Grid and the Open Science Grid), data distribution is an essential component of the use of ASCR facilities by scientists. Due to the projected growth in wide area data transfer needs, the ASCR supercomputer centers all expect to deploy and use 100 Gigabit per second networking technology for wide area connectivity as soon as that deployment is financially feasible. In addition to the network connectivity that ESnet provides, the ESnet Collaboration Services (ECS) are critical to several science communities. ESnet identity and trust services, such as the DOEGrids certificate authority, are widely used both by the supercomputer centers and by collaborations such as Open Science Grid (OSG) and the Earth System Grid (ESG). Ease of use is a key determinant of the scientific utility of network-based services. Therefore, a key enabling aspect for scientists beneficial use of high

  17. Smart Grid Information Clearinghouse (SGIC)

    SciTech Connect

    Rahman, Saifur

    2014-08-31

    Since the Energy Independence and Security Act of 2007 was enacted, a large number of websites have discussed the smart grid and related information, including sites from government, academia, industry, the private sector and regulators. These websites collect information independently, so smart grid information has been quite scattered and dispersed. The objective of this work was to develop, populate, manage and maintain the public Smart Grid Information Clearinghouse (SGIC) web portal. The information on the SGIC website is comprehensive, covering smart grid research & development, demonstration projects, technical standards, cost & benefit analyses, business cases, legislation, policy & regulation, and other information on lessons learned and best practices. The content on the SGIC website is logically grouped to allow easy browsing, searching and sorting. In addition to the browse and search features, the SGIC web portal also allows users to share their smart grid information with others through our online content submission platform. The Clearinghouse web portal therefore serves as a first-stop shop for smart grid information, collecting it in a non-biased, non-promotional manner, and can provide a missing link from information sources to end users and better serve users' needs. The web portal is available at www.sgiclearinghouse.org. This report summarizes the work performed during the course of the project (September 2009 – August 2014). Section 2.0 lists the SGIC Advisory Committee and User Group members. Section 3.0 discusses the SGIC information architecture and web-based database application functionalities. Section 4.0 summarizes SGIC features and functionalities, including its search, browse and sort capabilities, web portal social networking, online content submission platform and security measures implemented. Section 5.0 discusses SGIC web portal contents, including smart grid 101, smart grid projects

  18. Power grid reliability and security

    SciTech Connect

    Bose, Anjan; Venkatasubramanian, Vaithianathan; Hauser, Carl; Bakken, David; Anderson, David; Zhao, Chuanlin; Liu, Dong; Yang, Tao; Meng, Ming; Zhang, Lin; Ning, Jiawei; Tashman, Zaid

    2015-01-31

    This project has led to the development of a real-time simulation platform for electric power grids, called Grid Simulator or GridSim, for simulating the dynamic and information network interactions of large-scale power systems. The platform consists of physical models of power system components, including synchronous generators, loads and controls, which are simulated using a modified commercial power simulator, the Transient Stability Analysis Tool (TSAT) [1], together with data cleanup components, as well as emulated substation-level and wide-area power analysis components. The platform also includes realistic representations of communication network middleware that can emulate the real-time information flow back and forth between substations and control centers in wide-area power systems. The platform has been validated on a realistic 6000-bus model of the western American power system. The GridSim simulator developed in this project is the first of its kind in its ability to simulate the real-time response of large-scale power grids, and it serves as a cost-effective real-time stability and control simulation platform for the power industry.

  19. Redirecting science

    SciTech Connect

    Aaserud, F.

    1990-01-01

    This book contains the following chapters. Science policy and fund-raising up to 1934; The Copenhagen spirit at work, late 1920's to mid-1930s; The refugee problem, 1933 to 1935; Experimental biology, late 1920s to 1935; and Consolidation of the transition, 1935 to 1940.

  20. D. Carlos de Bragança, a Pioneer of Experimental Marine Oceanography: Filling the Gap Between Formal and Informal Science Education

    NASA Astrophysics Data System (ADS)

    Faria, Cláudia; Pereira, Gonçalo; Chagas, Isabel

    2012-06-01

    The activities presented in this paper are part of a wider project that investigates the effects of infusing the history of science into science teaching on students' learning and attitudes. Focused on the work of D. Carlos de Bragança, King of Portugal from 1889 to 1908 and a pioneer oceanographer, the activities are aimed at the secondary Biology curriculum (grade 10, ages 15-16). The proposed activities include a pre-visit orientation task, two workshops performed in a science museum and a follow-up learning task. In class, students analyse original historical excerpts of the king's work in order to discuss and reflect on the nature of science. In the museum, students actively participate in two workshops: biological classification and specimen drawing. All students considered the project relevant for science learning, stating that it was important not only for knowledge acquisition but also for understanding the nature of science. As a final remark we stress the importance of creating activities informed by the history of science as a foundation for improving motivation, sustaining effective science teaching and meaningful science learning, and as a vehicle to promote a closer partnership between schools and science museums.